The section about the sky would mention it, so then you go to the index in the R book, find the entry for that phenomenon, and read about Rayleigh scattering. The internet is definitely easier for finding random information, though it’s harder now than it was like 10 years ago. ChatGPT is amazing for finding random information, but you have to verify what it tells you, since it will just randomly lie for no reason.
It doesn’t “lie” though, it just generates a plausible sequence of words. The sort-of fortunate thing is that facts are often plausible, and it’s trained on a lot of facts. But facts aren’t the only word sequences that are plausible, and LLMs are trained to be creative, which means sometimes choosing a next word that isn’t the best fit, and that can leave the generated sentence non-factual.
Calling it a “lie” suggests that it knows the truth, or that it is being deceptive. But, that’s giving “spicy autocomplete” too much credit. It simply generates word salads that may or may not contain truths.
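The “sometimes choosing a next word that isn’t the best fit” part is usually temperature sampling. A toy sketch, with made-up scores for illustration (real models work over tokens and huge vocabularies, but the mechanism is the same):

```python
import math
import random

def sample_next_word(scores, temperature=1.0):
    """Pick one next word from model scores via softmax with temperature.

    Higher temperature flattens the distribution, so lower-scoring
    ("less plausible") words get chosen more often. At very low
    temperature this collapses toward always picking the top word.
    """
    # Scale scores by temperature, then softmax (subtract max for stability).
    scaled = {w: s / temperature for w, s in scores.items()}
    m = max(scaled.values())
    exps = {w: math.exp(s - m) for w, s in scaled.items()}
    total = sum(exps.values())
    probs = {w: e / total for w, e in exps.items()}

    # Draw one word according to those probabilities.
    r = random.random()
    cumulative = 0.0
    for w, p in probs.items():
        cumulative += p
        if r < cumulative:
            return w
    return w  # guard against floating-point shortfall

# Hypothetical scores for the word after "The sky is":
scores = {"blue": 5.0, "clear": 3.0, "green": 0.5}
word = sample_next_word(scores, temperature=1.5)
```

At temperature well below 1 this nearly always returns "blue"; crank the temperature up and "green" starts showing up, which is where the non-factual sentences come from. There’s no truth check anywhere in the loop, just probabilities.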
The industry word for it is “hallucination”, but I’m not sure that fits either.
It’s better than lying, but it still implies consciousness. It also implies that it’s doing something different than what it normally does.
In reality, it’s always just generating plausible words.
It’s bullshitting… Faking it till it makes it, if you will.
No, that implies a goal. It’s just spicy autocomplete.
It is certainly more complex than a predictive text machine. It does seem to understand the concept of objective truth and facts vs. interpretation and inaccurate information. It never intentionally provides false information, but sometimes it thinks it is giving factual information when it is really drawing on inaccurate information it was trained with. I’m honestly surprised at how accurate it usually is, considering it was trained on public data from places like Reddit, where common inaccuracies have reached the level of folklore.
No, it literally isn’t. That’s literally all it is.
Because people are easily fooled, but what it seems like isn’t what’s actually happening.
It’s incapable of thinking. All it does is generate a plausible sequence of words.
I see you have spent time researching the old-fashioned way.
The internet wasn’t allowed for school reports until after I was through with college the first time around. The World Wide Web didn’t even exist for the first half of my life.
Edit: it’s kind of crazy that my career revolves around something that didn’t even exist when people were still asking me what I wanted to be when I grew up. Although, “engineer” was a frequent answer to that question, and that’s certainly in my title now, but it’s an entirely different kind of engineering than I meant back then.