Google has plunged the internet into a “spiral of decline”, the co-founder of the company’s artificial intelligence (AI) lab has claimed.
Mustafa Suleyman, the British entrepreneur who co-founded DeepMind, said: “The business model that Google had broke the internet.”
He said search results had become plagued with “clickbait” to keep people “addicted and absorbed on the page as long as possible”.
Information online is “buried at the bottom of a lot of verbiage and guff”, Mr Suleyman argued, so websites can “sell more adverts”, fuelled by Google’s technology.
Okay, but the problem with that is that LLMs don't just lack fidelity; they can't have it. They're analogous to the language-planning centre of your brain, whose output has to be filtered through your conscious mind to check whether it's talking complete crap.
People don't realise this and think the bot is giving them real information, when it's actually just giving them spookily realistic word salad, which is a big problem.
Of course you could fix this by adding some kind of context engine that lets them truly grasp the deeper and wider meaning of your query. The problem is that if you do that, you've basically created an AGI. That is, first, extremely difficult and probably far in the future, and second, it has ethical implications that go well beyond how effective a search engine it makes.
Did you read my last little bit there? I said it depends on the information you're looking for. I can paste error output from my terminal into Google and try to find an answer, or I can paste it into ChatGPT and, at the very least, be pointed in the right direction almost immediately, or even given the answer outright, versus getting a Stack Overflow link and parsing the responses and comments and following secondary and tertiary links.
I absolutely understand the stochastic-parrot conundrum with LLMs. They have significant drawbacks and they're far from perfect, but then neither are Google search results. There's still a level of skepticism you have to apply either way.
One of the biggest mistakes people make is treating LLMs and web search as a zero-sum affair. They don't replace each other; they complement each other. IMO, Google is messing up with its "AI" integration into Google Search: it sets the expectation that the two are equivalent functions.