Microsoft are looking at putting datacenters under the ocean. It sounds like a really good idea for cooling them, but I can't help thinking that a couple of decades from now it's going to start causing us problems.
If regulations were properly designed to correct for externalized costs, the most sustainable option would also always be the cheapest one. We should strive for that.
I can tell you that big datacenters likely run a four-year hardware cycle, during which everything is under warranty and a service contract.
After that, the hardware gets sold to refurbishers, who refurbish and resell it. Or the datacenter may repurpose it for labs or out-of-band (OOB) hardware, or donate it to schools.
A lot of smaller companies don't need the latest and greatest, and are quite happy running old second-hand hardware.
Even after they're done with it, there are plenty of hobbyists who will buy it. I have a couple of 8-year-old servers that run absolutely fine for what I need.
Old servers are also kept around as spare parts for companies that refuse to upgrade old hardware (and will just keep buying spares or like-for-like replacements).
The last step is e-waste, where the good stuff gets boiled in acid to extract the gold, or whatever they do.
The only things that generally get destroyed during a hardware cycle are the storage devices, and that's normally for compliance reasons.
It might be the most sustainable option.
Building a server farm that needs to be maintained by people means building out space for those people to work in, which increases the footprint and requires additional cooling to keep the environment serviceable by humans.
It also means that less efficient hardware stays in service longer, driving up electricity costs.
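To put rough numbers on that, here's a minimal back-of-the-envelope sketch in Python. The wattages, electricity price, and PUE (power usage effectiveness, i.e. cooling overhead) are all assumed for illustration, not real figures:

    # Back-of-the-envelope: electricity cost of an older vs newer server
    # doing the same work. All numbers below are hypothetical assumptions.

    HOURS_PER_YEAR = 24 * 365
    PRICE_PER_KWH = 0.12   # assumed electricity price, USD/kWh
    PUE = 1.4              # assumed cooling/facility overhead factor

    def annual_cost(watts: float) -> float:
        """Yearly electricity cost for a constant draw, incl. overhead."""
        kwh = watts / 1000 * HOURS_PER_YEAR * PUE
        return kwh * PRICE_PER_KWH

    old_server_w = 400     # assumed draw of an older server
    new_server_w = 250     # assumed draw of a newer, more efficient one

    delta = annual_cost(old_server_w) - annual_cost(new_server_w)
    print(f"Extra cost per old server per year: ${delta:,.0f}")
    # -> about $221/year with these numbers; multiply that across
    #    thousands of servers and several extra years of service.

Tweak the assumptions however you like; the point is that the efficiency gap compounds per server, per year, across the whole fleet.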
And as long as the e-waste is recycled (and it's easier to recycle an entire farm than scattered consumer electronics), what is the issue?