I don’t think we are disagreeing. All I wanted to say was that this Deepseek phenomenon is not going to burst the LLM bubble. It is just going to challenge the monopoly of western AI firms.
I guess it depends on what Deepseek does next. Western tech has this really bad habit of squandering any hardware advance with unoptimised software. It works out in favour of both software and hardware companies: this kind of software is easier to produce, and hardware companies get to sell expensive hardware. Throughout this whole LLM craze, AI companies have been throwing more and more compute and data at the problem in zombie mode, with zero effort to reduce either. I don’t know why that is, because even a small percentage saving translates to a large absolute number, but they don’t seem to care. It is like they want to be shit. Maybe they will realise that planning for startup-run nuclear fusion reactors powering AGI datacentres is not something they should be doing? I doubt it, though.
Generally this is because of the logic of capitalism. Optimizing matters far less than Adam Smith believed it would; market capture matters more. If I get to 2% market share with 50% margins and you get to 30% market share with 10% margins, you are considered the winner: bigger absolute numbers win with investors. This is partly because of the theory that if investors are going to spend money, they want it spent on whatever will produce the most money, and optimizing small things is worse than growing big things. The theory continues that optimization is the job of specialists who take the big thing and improve it incrementally after it has established market dominance, but establishing that dominance is the first job. The theory finishes with the idea that by the time optimization becomes a viable route to ROI, there is probably another new growth project with greater upside potential. So we end up with a ton of companies that focus entirely on growth until they can’t anymore, and then they get abandoned for the next big thing, picked up by vultures, torn apart and reorganized into oblivion, and everyone makes money except the workers.
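The share-vs-margin arithmetic is easy to check with a quick sketch (the $100M total market size is a made-up illustrative figure, not from the post):

```python
# Sketch of the share-vs-margin comparison: absolute profit, not margin,
# is what "bigger absolute numbers win with investors" refers to.
TOTAL_MARKET = 100_000_000  # hypothetical $100M/year market

def annual_profit(market_share: float, margin: float) -> float:
    """Absolute profit = slice of the market captured, times the margin kept."""
    revenue = TOTAL_MARKET * market_share
    return revenue * margin

niche_optimizer = annual_profit(market_share=0.02, margin=0.50)  # 2% share, 50% margins
growth_chaser = annual_profit(market_share=0.30, margin=0.10)    # 30% share, 10% margins

# The low-margin growth play earns 3x the absolute profit of the
# high-margin niche play, which is why investors call it the winner.
print(f"${niche_optimizer:,.0f} vs ${growth_chaser:,.0f}")
```

Three times the absolute profit at a fifth of the margin, so the growth-first logic holds at least on this back-of-the-envelope level.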
Don’t confuse “bubbles” with “hype”. It will not burst the LLM hype but it could burst the speculation bubble of investments in AI companies.