The cost of running OpenAI’s ChatGPT has become the subject of rampant speculation on social media, with some estimates placing the bill at $3 million a day.
The popular chatbot has become an internet sensation thanks to its ability to swiftly answer questions on a range of topics, write and/or debug code, and even create original poetry. The question now is how much that computing power costs.
ChatGPT is very expensive
The cost of running ChatGPT has become the subject of heated debate on social media, with some placing a hefty price tag on the service.
ChatGPT is costing @OpenAI an estimated $3M a day to run 🤯
— Shaun Els (@ShaunEls) December 18, 2022
The OpenAI team, which operates ChatGPT, has fueled that speculation by acknowledging the expense without offering specifics.
On December 5, OpenAI CEO Sam Altman said, “We will have to monetize it somehow at some point; the compute costs are eye watering.”
In a later Twitter exchange with Elon Musk, Altman went on to say the cost of the average conversation “is probably single-digits cents per chat; trying to figure out more precisely and also how we can optimize it.”
Since a chat can comprise any number of queries, and since there is no way to know how many queries make up an average chat, pinning down the costs becomes extremely difficult.
Is $3 million unreasonable?
Earlier this week, Twitter user Shaun Els ignited the debate over exactly how eye-watering those figures are. According to Els, the cost of running ChatGPT is $3 million per day. When asked, Els cited a podcast by media outlet The Verge as his initial source.
The claim was swiftly rebutted by multiple dissenters. As one commentator said, “Thanks for posting this as the source, because now I know it’s a completely unreliable estimate.”
On December 6, Tom Goldstein, an associate professor at the University of Maryland, crunched the numbers and concluded, “I think it’s reasonable to estimate that ChatGPT serves ~10M queries per day. I estimate the cost of running ChatGPT is $100K per day, or $3M per month.”
Goldstein attached a caveat to his own figure, noting, “This is a back-of-the-envelope calculation.”
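For readers who want to retrace the arithmetic, here is a minimal sketch of that back-of-the-envelope math. It assumes Goldstein’s figure of roughly 10 million queries per day and a per-query compute cost of about one cent; the one-cent figure is our assumption, chosen to be consistent with Altman’s “single-digit cents per chat” remark rather than anything OpenAI has confirmed.

```python
# Back-of-the-envelope estimate of ChatGPT's running costs.
# QUERIES_PER_DAY comes from Goldstein's estimate; COST_PER_QUERY_USD
# is an assumed value, not an official OpenAI number.

QUERIES_PER_DAY = 10_000_000   # ~10M queries per day (Goldstein's estimate)
COST_PER_QUERY_USD = 0.01      # assumed compute cost per query, in dollars
DAYS_PER_MONTH = 30

daily_cost = QUERIES_PER_DAY * COST_PER_QUERY_USD
monthly_cost = daily_cost * DAYS_PER_MONTH

print(f"Estimated daily cost:   ${daily_cost:,.0f}")    # ~$100,000 per day
print(f"Estimated monthly cost: ${monthly_cost:,.0f}")  # ~$3,000,000 per month
```

Under those assumptions the arithmetic lands exactly on Goldstein’s numbers: about $100,000 per day, or roughly $3 million per month, not per day.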
Other estimates of the cost of running ChatGPT have ranged from $1.5 million to $8 million per month.
As for the $3 million a day figure, is it possible that this outlier estimate is the product of confusion, a misremembering of Goldstein’s $3 million a month estimate? If so, it would be a very human error.
The source is a TikTok video of a podcast where someone roughly remembers that someone else did the math by guessing values and multiplying them with other guessed values.
— Alice und Bob (@alice_und_bob) December 19, 2022
ChatGPT gets cagey with MetaNews
To find out the true cost of running ChatGPT, MetaNews took the question to the source and asked the chatbot how much it costs to run. The bot was in no mood to disclose its secrets.
“I’m sorry, but I am not able to provide information on the cost of running ChatGPT or any other specific language model,” said ChatGPT.
“The cost of running a machine learning model can depend on various factors, including the hardware and infrastructure used to train and serve the model, the amount of data used to train the model, and the complexity of the model itself,” the AI added.
“It’s also important to note that language models like ChatGPT are typically trained on large amounts of data and can require significant computing resources to train, so the cost of training and running them can vary widely.”