- cross-posted to:
- becomeme@sh.itjust.works
If you’re worried about how AI will affect your job, the world of copywriters may offer a glimpse of the future.
Writer Benjamin Miller – not his real name – was thriving in early 2023. He led a team of more than 60 writers and editors, publishing blog posts and articles to promote a tech company that packages and resells data on everything from real estate to used cars. “It was really engaging work,” Miller says, a chance to flex his creativity and collaborate with experts on a variety of subjects. But one day, Miller’s manager told him about a new project. “They wanted to use AI to cut down on costs,” he says. (Miller signed a non-disclosure agreement, and asked the BBC to withhold his and the company’s name.)
A month later, the business introduced an automated system. Miller’s manager would plug a headline for an article into an online form, an AI model would generate an outline based on that title, and Miller would get an alert on his computer. Instead of coming up with their own ideas, his writers would create articles around those outlines, and Miller would do a final edit before the stories were published. Miller only had a few months to adapt before he got news of a second layer of automation. Going forward, ChatGPT would write the articles in their entirety, and most of his team was fired. The few people remaining were left with an even less creative task: editing ChatGPT’s subpar text to make it sound more human.
By 2024, the company laid off the rest of Miller’s team, and he was alone. “All of a sudden I was just doing everyone’s job,” Miller says. Every day, he’d open the AI-written documents to fix the robot’s formulaic mistakes, churning out the work that used to employ dozens of people.
The human serfs will have to proofread increasingly voluminous, numerous and complex output from AI systems. The product has become the master. Until the systems develop a sense of ‘truth’ beyond numerical statistics, generative AI is pretty much a toy.
I’ll start by saying I am pro-worker, pro-99%, pro-human.
Now, I must refute your assertion for specific domains (and specific working styles), e.g. translation, or a preference for editing over drafting/coding from a blank page. If money used to hit your bank account every two weeks because you translated or provided customer service for a company, and now that money doesn’t come in anymore, it wouldn’t feel too playful, or like a toy is involved.
This is today, not “until” any future milestone.
Re-sharing some screenshots I took a month or so back, below.
November 2022: ChatGPT is released
April 2024 survey: 40% of translators have lost income to generative AI - The Guardian
Also of note from the podcast Hard Fork:
There’s a client you would fire… if copywriting jobs weren’t harder to come by these days as well.
Customer service impact, last October:
And this past February - potential 700 employee impact at a single company:
If you’re technical, the tech isn’t as interesting [yet]:
Overall, costs down, capabilities up (neat demos):
Hope everyone reading this keeps up their skillsets and fights for Universal Basic Income for the rest of humanity :)
Air Canada did that too. Only the lack of precision meant it made offers to customers that they weren’t prepared to honor.
Have to wonder how much Klarna invested in their tech, assuming they’re not big ole fibbers
Fibbing for investors and big tech. Name a more iconic duo?
I think retrieval augmentation and fine-tuning are the biggest tools for making the results more refined (or for better referencing a document as a source of truth). The other, ironically, is just regular deterministic programming.
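The retrieval-augmentation idea above can be sketched in a few lines: before asking the model anything, you look up the most relevant document and paste it into the prompt so the model answers from that source of truth rather than its statistical memory. This is a toy illustration, not any specific product's implementation: the word-overlap scoring stands in for a real embedding search, and the documents and function names are made up.

```python
# Minimal retrieval-augmentation sketch. A real system would use vector
# embeddings and an actual LLM call; here word overlap stands in for the
# similarity search, and we only build the grounded prompt.

def tokenize(text: str) -> set[str]:
    """Lowercase the text and split it into a set of bare words."""
    return {w.strip(".,?!").lower() for w in text.split()}

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents sharing the most words with the query."""
    q = tokenize(query)
    ranked = sorted(docs, key=lambda d: len(q & tokenize(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Paste the retrieved context above the question, so the model is
    instructed to answer from the document rather than from memory."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using ONLY this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Refunds are processed within 14 days of the return being received.",
    "Shipping to Canada takes 5 to 7 business days.",
]
print(build_prompt("How long do refunds take?", docs))
```

The "regular deterministic programming" point is visible here too: the retrieval step is ordinary, testable code, and only the final text generation is left to the model.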
If it’s just a “toy” then how is it able to have all this economic impact?
it’s an economic bubble, it will eventually burst but several grifters will walk out with tons of money while the rest of us will have to endure the impact
There have been bubbles around things that promised future profits without anyone knowing exactly how they’d materialize. Like the dotcom bubble.
This one does bring profits now in its core too, but that’s a limited resource. It will be less and less useful the more poisoned with generated output the textual universe is. It’s a fundamental truth, thus I’m certain of it.
Due to that happening slowly, I’m not sure there’ll be a bubble bursting. Rather it’ll slowly become irrelevant.
While I agree with most points you make, I cannot see a machine that is, at a bare minimum, able to translate between arbitrary languages become irrelevant anytime in the foreseeable future.
OK, I agree, translation is useful and is fundamentally something it makes sense for.
Disagree about “arbitrary”, though: you need a big enough dataset for every language, and even then it’s not going to be that much better than the machine translators of the ’90s.
It’s closer, but in practice still requires a human to check the whole text. Which raises the question, why use it at all instead of a machine translator with more modest requirements.
And this may also poison smaller languages, with translation artifacts becoming the norm. Calque is one thing; here one can expect mistakes of the “medieval monks mixing up Armorica and Armenia” kind (I fucking hate those Armenians still perpetuating that single known mistake), only better masqueraded.
While I am sure some job displacement is happening, getting worked up over AI at this point seems odd when we’ve had decades of blue-collar offshoring that is now kinda being reversed and replaced with white-collar offshoring.
Or here’s another one: immigration, at both levels.
Currently AIs biggest impact is providing cover for white collar offshoring.
Tech support and Customer Service tested the waters now they are full force trying professional services.
Bigger impact all around and not much discussion.
Well, while my views on economics are still libertarian, there’s one trait of very complex and interconnected systems like market economies (as opposed to, say, a Soviet-style planned economy, which was still arcanely complex but had fewer connections to track in analysis, even counting black markets, barter, unofficial negotiations, etc.): it’s never clear how centralized such a system really is, and it’s never clear what it’s going to become.
It’s funny how dystopian things from all extremes of that “political compass” thing come into reality. Turns out you don’t need to pick one, it can suck all ways. Matches well what one would expect from learning about world history, of course.
What I’m trying to say is that power is power, no matter what’s written in any laws. The only thing resembling a cure is keeping as much of it as possible distributed at all times. A few decades from now that may be found again, and then after some time forgotten again.
Whatever we got is deff not distributed in any sense of the term lol
That’s my point.
The popular idea that you can avoid giving power to the average person, giving it all to some bureaucracy, or big companies in some industry, or some institutions, and yelling “rule of law”, has come to its logical conclusion.
Bureaucrats become a sort of mafia/aristocracy layer, big companies become oligopolies entangled with everything wrong, and institutions sell their decisions rather cheap.
Speculation. 100% speculation. A tool is precise. A toy is not. Guided AI, e.g. for circuit optimization or fleet optimization, is brilliant. Generative AI is not the same.
Evidently “precision” isn’t needed for the things the AI is being used for here.
Right. And neither is the investment it is attracting.
https://www.statista.com/topics/1108/toy-industry/#topicOverview
Because toy industry huge