One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.
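A quick back-of-envelope check makes the scale of the 33,000-homes figure concrete. The household consumption and per-search energy numbers below are assumed round figures for illustration, not values from the article:

```python
# Rough scale check of the claims above, using assumed round numbers:
#   - ~10,000 kWh/year for an average household (illustrative assumption)
#   - ~0.3 Wh for a conventional web search (illustrative assumption)
#   - 4-5x that for a generative-AI search, per the quoted estimate

HOUSEHOLD_KWH_PER_YEAR = 10_000   # assumed average household use
HOMES = 33_000                    # figure quoted in the article

total_kwh_per_year = HOMES * HOUSEHOLD_KWH_PER_YEAR
print(f"~{total_kwh_per_year / 1e6:.0f} GWh per year")  # ~330 GWh/year

conventional_search_wh = 0.3                  # assumed per-search energy
ai_search_wh = conventional_search_wh * 4.5   # midpoint of the 4-5x range
print(f"AI search: ~{ai_search_wh:.2f} Wh vs {conventional_search_wh} Wh")
```

Under those assumptions the claim works out to roughly 330 GWh per year, which is the scale of a small power plant's annual output.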

  • geissi@feddit.de · 4 months ago

    An ultra-light laptop or a Raspberry Pi consumes WAY less power than a full-on gaming rig, and the same can be said of a data server used for e-commerce versus a server running AI.

    And if external costs are priced into the cost of electricity, then that will be reflected in the cost of operating these devices.
    Also, there are far more data servers than servers running AI, which increases their total effect.