I wonder what will happen with all the compute once the AI bubble bursts.
It seems like gaming made GPU manufacturing scale enough for them to be used as general-purpose compute; Bitcoin then pumped billions into that market, driving down prices (per FLOP), and AI reaped the benefit once crypto moved to ASICs and later crashed.
But what’s next? We’ve got more compute than we could reasonably use. The factories are already there, the knowledge and techniques exist.
Finally, very detailed climate simulations to know how hard we're screwed.
…made using arguably the most criminally environmentally disastrous tech we've invented in the past few decades. How ironic!
It will probably be used for more AI research.
Most of the GPUs belong to the big tech companies, like OpenAI, Google and Amazon. AI startups rarely buy their own GPUs (often they're just using the OpenAI API). I don't think big tech will have any problem figuring out what to do with all their GPU compute.
Compute gets cheaper and larger undertakings become feasible. LLMs are huge, but there is new tech moving things along. The key component in LLMs, the transformer, is getting new competition that may surpass it, both for LLMs and for other machine learning uses.
Otherwise, cheaper GPUs for us gamers would be great.
I think open source will build actually useful integrations thanks to the available compute.