The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so routing all of ChatGPT’s reported 2.5 billion daily requests through the model could push daily energy usage as high as 45 GWh.
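
As a rough check of that headline figure, here is the arithmetic with the article’s own numbers (both of which are themselves estimates):

```python
# Back-of-the-envelope check using the article's estimates:
# 18 Wh per query, 2.5 billion queries per day.
wh_per_query = 18            # Wh, URI lab estimate for GPT-5
queries_per_day = 2.5e9      # OpenAI's reported daily request count

daily_wh = wh_per_query * queries_per_day
daily_gwh = daily_wh / 1e9   # 1 GWh = 1e9 Wh
print(f"{daily_gwh:.0f} GWh per day")  # -> 45 GWh per day
```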

A daily energy use of 45 GWh is enormous. A typical modern nuclear reactor has an electrical output of between 1 and 1.6 GW, and 45 GWh spread over 24 hours works out to an average draw of nearly 1.9 GW, so data centers running OpenAI’s GPT-5 at 18 Wh per query could require the continuous output of roughly two nuclear reactors, an amount that could be enough to power a small country.
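
Spelled out with the same assumed numbers, the reactor comparison looks like this:

```python
# Convert the 45 GWh/day estimate into an average continuous power
# draw and compare it to typical reactor output (assumed 1-1.6 GW).
daily_gwh = 45
avg_gw = daily_gwh / 24        # average continuous draw over a day
reactors_low = avg_gw / 1.6    # if each reactor outputs 1.6 GW
reactors_high = avg_gw / 1.0   # if each reactor outputs 1.0 GW

print(f"average draw: {avg_gw:.2f} GW")                                # ~1.88 GW
print(f"reactor equivalents: {reactors_low:.1f}-{reactors_high:.1f}")  # ~1.2-1.9
```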

  • Steve Dice@sh.itjust.works · 2 days ago

    The team measured GPT-5’s power consumption by combining two key factors: how long the model took to respond to a given request, and the estimated average power draw of the hardware [they believe is] running it.

    • T156@lemmy.world · 1 day ago

      Basically, it’s just a number whipped out the backside, since response delay varies a lot, and I don’t think we know what hardware OpenAI uses exactly.
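
For what it’s worth, the estimation approach described in the thread above (response time multiplied by an assumed average hardware power draw) amounts to something like the following sketch. Every number in it is an assumption for illustration, not a measured value, since OpenAI has not published what GPT-5 actually runs on:

```python
# Hypothetical sketch of a time-x-power estimate. The node power
# figure and the per-request share of the node are assumptions.
def estimate_energy_wh(response_seconds: float,
                       assumed_node_power_w: float = 10_000,
                       assumed_share_of_node: float = 0.5) -> float:
    """Energy (Wh) = power (W) x time (h), scaled by how much of the
    serving node this one request is assumed to occupy."""
    hours = response_seconds / 3600
    return assumed_node_power_w * assumed_share_of_node * hours

# Example: a 13-second response on an assumed 10 kW node at a 50%
# share works out to roughly 18 Wh -- the ballpark the article quotes.
print(round(estimate_energy_wh(13), 1))  # ~18.1 Wh
```

The point of the sketch is how sensitive the result is to those assumptions: halve the assumed power draw or the per-request share and the per-query figure halves with it, which is the uncertainty the comments above are pointing at.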