Manufacturing CPUs and other hardware components not only requires huge amounts of energy but also produces substantial waste during the extraction and refinement of the raw materials involved.

So it makes sense to hold onto old hardware for as long as possible to minimize your environmental footprint. However, newer hardware is typically far more efficient and thus consumes less electricity.

So my question is basically: at what point does older hardware become more harmful to the environment, because of the electricity it consumes, than the environmental cost of manufacturing newer components?

This obviously depends heavily on how the electricity is sourced, and less so on other factors, but still.
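The break-even point can be estimated with simple arithmetic: divide the embodied emissions of the new part by the emissions saved per year through its lower power draw. Here is a minimal sketch; all numbers (embodied CO2e, wattages, usage hours, grid intensity) are illustrative assumptions, not measured data, and real figures vary widely by product and region.

```python
# Hedged sketch: when do the embodied emissions of a new component pay
# for themselves through electricity savings? All inputs are assumptions.

def breakeven_years(embodied_kg_co2, old_watts, new_watts,
                    hours_per_day, grid_kg_co2_per_kwh):
    """Years until the new part's manufacturing CO2e is offset by the
    electricity it saves compared to the old part."""
    saved_kwh_per_year = (old_watts - new_watts) * hours_per_day * 365 / 1000
    saved_kg_per_year = saved_kwh_per_year * grid_kg_co2_per_kwh
    return embodied_kg_co2 / saved_kg_per_year

# Assumed example: 30 kg CO2e to manufacture the new CPU, 65 W old vs
# 35 W new, used 8 h/day, on a grid emitting 0.4 kg CO2e per kWh.
years = breakeven_years(30, 65, 35, 8, 0.4)
print(round(years, 1))
```

Note how sensitive the result is to the grid factor: on a mostly renewable grid (say 0.05 kg/kWh) the same upgrade takes roughly eight times longer to break even, which supports keeping old hardware longer where electricity is clean.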

  • @big

    Best to upgrade when the efficiency gains expected from the next generation are going to be insignificant. Which technically makes now a great time to upgrade, as they are already talking about future-proof, "forever" PCs. Windows isn't even looking to surpass version 10 and is no longer Microsoft's poster child.

    However, smartphone and laptop manufacturers seem to be in cahoots to ensure their entire suite of products (even top end) leaves you wanting something (next year). Choice fatigue on the consumer side is a real thing, and it's caused by choice-maxxing from makers who all try to frustrate consumers into a more profitable tier, while savvy consumers see through the showstopping feature denial at the lower "fragile" tiers. It would only take one manufacturer to upset the space, but once there, it's too tantalizing not to do the same (see Xiaomi).