They discovered that, in the last glacial period, Earth experienced its highest CO2 increase: 14 parts per million in just 55 years. Not, our planet experiences that increase every five years.
I have been noticing silly typos all over the place in articles for the last few years, but have no memory of those being common in the past. I guess editors proofreading articles isn’t really a thing anymore?
I’m going to hazard a guess it’s a combination of falling budgets and an over-reliance on autocorrect. If it’s like other industries, they’re trying to get more articles out with fewer people.
I know that I often have an atrocious number of typos - but some are entirely the fault of autocorrect, either changing a correct word to something else or “correcting” a typo to a word that makes no sense in the context of the sentence. I’m hoping that the next generation will improve this.
If anything, a “now”-to-“not” typo at least indicates that it was written by a human. LLM errors generally don’t involve that sort of thing.
It’s probably mostly AI-driven now. It sees the word ‘Not’ is spelled correctly, so it’s good to go.
Looks more like relying on spellcheck than AI.
Everyone fired their editors in 2008.
I just read this as a “not” joke. As in, “yeah that was the fastest ever CO2 increase in earth’s history. Not”