Researchers say that the model behind the chatbot fabricated a convincing bogus database, but a forensic examination shows it doesn’t pass for authentic.
There was someone on the radio the other day talking about doing research with it for their show. They started by asking a simple math question, which it got wrong. The conversation then devolved into ChatGPT inventing anecdotes when asked whether a Nobel medal had ever been brought to space, and it ended up saying it didn't know why it kept inventing anecdotes instead of finding reliable information.
All of which is to say: it doesn't know what is and isn't reliable information, so it builds answers based on what it interprets you might want to read.