- cross-posted to:
- aicompanions@lemmy.world
For those using ChatGPT: if anything you post leads to a lawsuit against OpenAI, OpenAI can send you the bill for the court case (attorney fees and such), whether OpenAI wins or loses.
Examples:
- A defamation case by an Australian mayor because ChatGPT incorrectly stated that he had served prison time for bribery: https://www.reuters.com/technology/australian-mayor-readies-worlds-first-defamation-lawsuit-over-chatgpt-content-2023-04-05/
- OpenAI sued for defamation after ChatGPT fabricates legal accusations against radio host: https://www.theverge.com/2023/6/9/23755057/openai-chatgpt-false-information-defamation-lawsuit
- Sarah Silverman sues OpenAI for copyright infringement: https://lemmy.ml/post/1905056
Attorney talking about their ToS (same link as post link): https://youtu.be/fOTuIhOWFXU?t=268
https://openai.com/policies/terms-of-use

7. Indemnification; Disclaimer of Warranties; Limitations on Liability

(a) Indemnity. You will defend, indemnify, and hold harmless us, our affiliates, and our personnel, from and against any claims, losses, and expenses (including attorneys’ fees) arising from or relating to your use of the Services, including your Content, products or services you develop or offer in connection with the Services, and your breach of these Terms or violation of applicable law.
Tried to read this post twice. What are you telling me to be aware of / stop doing?
I am not a lawyer and the implications are larger than this.
Do not post, share, trade, or otherwise make public any ChatGPT output from your sessions until you have fact-checked it to the extent that you’re willing to take legal responsibility for it, especially anything that could trigger a lawsuit against OpenAI, because when that happens, you will foot the bill.
I am not a lawyer.
Hi, I’m a lawyer. While I work in a different area of law and therefore can’t speak to this in depth with certainty, if their terms are as enforceable as the linked articles seem to indicate, then yes, this is good advice.
As always with the law, things may vary by jurisdiction. If you have specific questions, contact a lawyer in your area.
Basically just be careful if you like to post images/text taken straight from ChatGPT.
If you post anything that someone gets offended by and decides to sue OpenAI over, they can turn around and bill you for those legal costs (whether they win the lawsuit or not).
Or if you post a screenshot that proves that you can get ChatGPT to write out the entire first chapter of some copyright protected book…
I’ve also seen people who like to “jailbreak” ChatGPT and then post things like tricking it into giving instructions for making certain illegal devices. Again, just be careful: if someone sues the makers of ChatGPT and your social media post ends up in that lawsuit, you have already agreed to pay their legal costs for it.
Is that enforceable? Seems ridiculous.
I agree, it seems ridiculous, but according to the attorney in the video this would be enforceable, at least in the U.S.: https://piped.video/fOTuIhOWFXU?t=330
I’m sure you could try to get your own attorney to try to fight back against OpenAI’s attempt to bill you, but that’s going to cost you as well.
If I’m understanding this correctly, you can be held liable for whatever ChatGPT produces in response to your queries, if any of that content turns out to be damaging.
It makes sense, right?
They produced a language model. It does nothing more than predict the next word. It will lie all the time; that’s part of how it works. It makes stuff up from the input it gets.
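For anyone unsure what “predict the next word” means in practice, here’s a minimal, purely illustrative Python sketch. The toy bigram table and the `generate` helper are made up for this example (real models work with learned probabilities over huge vocabularies), but it shows the core point: the system only picks a statistically plausible continuation of the text so far, with no notion of whether the result is true.

```python
import random

# Toy "language model": for each word, a list of words that often follow it
# in whatever text it was trained on. No facts, no truth-checking, just
# statistical associations between words.
toy_model = {
    "the":    ["mayor", "radio", "court"],
    "mayor":  ["served", "won", "denied"],
    "served": ["prison", "lunch", "notice"],
    "prison": ["time."],
}

def generate(prompt_word, max_words=4):
    """Repeatedly sample a plausible next word, starting from the prompt."""
    words = [prompt_word]
    for _ in range(max_words):
        candidates = toy_model.get(words[-1])
        if not candidates:
            break
        words.append(random.choice(candidates))
    return " ".join(words)

# A fluent-sounding sentence can come out even if nothing in it is true.
print(generate("the"))  # e.g. "the mayor served prison time."
```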
If you post that stuff online and it contains lies about people and you didn’t check it, you absolutely should be liable for that. I don’t see a problem with that.
Right, but what about the case where you post something that doesn’t contain lies at all?
What if ChatGPT outputs something that a certain former president gets offended by and he decides to sue OpenAI?
According to their ToS it doesn’t matter if it’s a “frivolous lawsuit”. If OpenAI had to pay any attorney fees just to respond to some ridiculous lawsuit, they could still bill you for those costs.
I don’t think it makes sense at that point at all.
Of course the vast majority of users would never have to worry about this, but it’s still something to be aware of.
It’s a tool. Can’t sue the manufacturer if you injure someone with it.
This isn’t true in the least. Buy a tool and look through the manual: every section marked “danger”, “warning”, or “caution” was put in there because someone sued a company after a user or a bystander got hurt.
You are right. Seems I confused common sense with reality.
You ever heard of a product recall?
You can if the tool is defective.
That’s got to be more to cover their ass than to come after you. Unless you use its generated text to sue the company, I don’t think they would ever try to sue their users; otherwise everyone would stop using the platform, Microsoft would have a huge PR problem, and their stock price would drop. It just doesn’t logically make sense for them to do that, unless you sued them over content produced by your inputs.