The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT’s reported 2.5 billion daily requests through the model could push energy usage as high as 45 GWh a day.

A daily energy use of 45 GWh is enormous. Spread over 24 hours, it works out to an average draw of roughly 1.9 GW. A typical modern nuclear power plant produces between 1 and 1.6 GW of electricity per reactor, so data centers running OpenAI’s GPT-5 at 18 Wh per query could require the power equivalent of two to three nuclear reactors, an amount that could be enough to power a small country.
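As a rough back-of-the-envelope check, the sketch below simply multiplies the quoted figures together (the 18 Wh per query, 2.5 billion requests per day, and 1–1.6 GW per reactor are taken at face value from the article, not measured here):

```python
# Back-of-the-envelope check of the figures quoted above.
# Assumed inputs (from the article, not measured here):
#   ~18 Wh per GPT-5 query, ~2.5 billion ChatGPT requests per day,
#   1-1.6 GW output per modern nuclear reactor.

WH_PER_QUERY = 18
QUERIES_PER_DAY = 2.5e9

daily_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9   # 1 GWh = 1e9 Wh -> ~45 GWh
avg_draw_gw = daily_gwh / 24                       # spread over 24 h -> ~1.9 GW

print(f"Daily energy: {daily_gwh:.0f} GWh, average draw: {avg_draw_gw:.2f} GW")
for reactor_gw in (1.0, 1.6):
    print(f"Equivalent to {avg_draw_gw / reactor_gw:.1f} reactors of {reactor_gw} GW")
```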

  • themurphy@lemmy.ml · 2 days ago

    Technologies come and go, but when a globally popular one vanishes, it’s usually because it got replaced by something else.

    So let’s say we need LLMs to go away. What would replace them? Impossible to answer, I know, but that’s what it would take.

    We can’t even get rid of Facebook and Twitter.

    BUT that being said, LLMs will be 100x more efficient at some point, like any other new technology. We are just not there yet.