The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT’s reported 2.5 billion requests a day through the model could push daily energy usage as high as 45 GWh.

A daily energy use of 45 GWh is enormous. Spread over 24 hours, it works out to an average draw of almost 1.9 GW around the clock. A typical modern nuclear power plant produces between 1 and 1.6 GW of electricity per reactor, so data centers running OpenAI’s GPT-5 at 18 Wh per query could require the output of roughly two to three nuclear reactors, an amount that could be enough to power a small country.
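For anyone who wants to sanity-check those figures, here is a minimal back-of-envelope sketch in Python. It uses only the numbers quoted above (the URI per-query estimate, the reported request volume, and the 1 to 1.6 GW per-reactor range); none of the inputs are independent measurements.

```python
# Back-of-envelope check of the figures quoted above.
# All inputs are the article's own numbers, not independent measurements.

WH_PER_QUERY = 18.0        # URI estimate for GPT-5 (Wh per query)
QUERIES_PER_DAY = 2.5e9    # reported daily ChatGPT requests

daily_energy_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9   # Wh -> GWh
avg_power_gw = daily_energy_gwh / 24                      # spread over 24 h

# Compare against a nominal 1.0-1.6 GW reactor running at full output.
reactors_low = avg_power_gw / 1.6
reactors_high = avg_power_gw / 1.0

print(f"Daily energy: {daily_energy_gwh:.0f} GWh")
print(f"Average draw: {avg_power_gw:.2f} GW")
print(f"Equivalent reactors at full output: {reactors_low:.1f}-{reactors_high:.1f}")
```

The raw arithmetic gives about 45 GWh per day and a continuous draw of just under 2 GW, i.e. roughly two 1 GW reactors at full output; cooling and other data-center overhead would push the real requirement higher.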

  • dan@upvote.au · 8 days ago

    OpenAI is not profitable today, and doesn’t estimate it’ll be profitable until 2029, so it’s almost guaranteed that they’re selling their services at a loss. Of course, that’s impossible to verify, since as a private company they don’t have to release financial statements.

    • brucethemoose@lemmy.world · 8 days ago

      That’s not what I’m saying. They’ve all but outright said they’re unprofitable.

      But revenue is increasing. Now, if it stops increasing, as though they’ve “leveled out”, that is a problem.

      Hence it’s a stretch to assume they would decrease costs for a more expensive model since that would basically pop their bubble well before 2029.

      • dan@upvote.au · edited 7 days ago

        Revenue is increasing, but according to their own estimates, it has to increase 10x in order for them to become profitable.