An Amazon chatbot that's supposed to surface useful information from customer reviews of specific products will also recommend a variety of racist books, lie about working conditions at Amazon, and, when asked, write a cover letter for a job application with entirely made-up work experience, 404 Media has found.

  • assassin_aragorn@lemmy.world · 6 points · 9 months ago

    The coffee was hot enough to cause serious burns; it fused her labia together, and she required skin grafts.

    I’ve spilled freshly brewed coffee on myself at home and while it burned, I did not need any medical attention.

    The woman did nothing wrong. The product McDonald's sold was dangerously hot on purpose. Had she known it was that fucking hot, she probably wouldn't have put it between her legs. McDonald's did not properly warn that its coffee was abnormally hot, to the point of posing a severe safety risk.

    It's like if I order my fifth alcoholic drink of the night and the bartender hands me straight Everclear. I know I'm taking a risk by having another drink, but I'm not expecting to be served nearly pure alcohol and get alcohol poisoning from it.