Operating ChatGPT carries a stratospheric daily cost. OpenAI's artificial intelligence platform, powered by its popular chatbot, requires massive amounts of computing power to respond to all the users who interact with it: specifically, $694,444 per day, according to a report published by the research firm SemiAnalysis. That works out to an estimated cost of 0.36 cents per query, the report's authors note. The platform also consumes an inordinate amount of water, which is used to cool its data centers.
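Those two figures imply a rough daily query volume. The following back-of-the-envelope calculation is illustrative only and does not appear in the SemiAnalysis report itself:

```python
# Illustrative sanity check on the reported figures (not from the report).
daily_cost_usd = 694_444        # reported daily operating cost
cost_per_query_usd = 0.0036     # 0.36 cents per query

implied_queries_per_day = daily_cost_usd / cost_per_query_usd
print(f"Implied queries per day: {implied_queries_per_day:,.0f}")
```

At those numbers, the platform would be handling on the order of 190 million queries a day.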
The figures cited in the report are based on the GPT-3 model. The cost could be even higher for the latest versions of the chatbot, as SemiAnalysis chief analyst Dylan Patel explained to The Information. That latest version, however, is a paid product, which may partly offset the cost of maintaining the system.
Why ChatGPT's costs skyrocket
ChatGPT is so costly to run because of the type of AI workload behind the platform's chatbot, which experts call "machine learning inference." Another report published by SemiAnalysis noted that "ChatGPT inference costs exceed training costs on a weekly basis." And that estimate does not account for the fact that the number of users on the platform grows daily.
In addition to the users who access ChatGPT for free to make their queries, many companies use OpenAI's language models. One example is Latitude, a startup developing a video game powered by artificial intelligence. These services carry an additional cost.
As the company's CEO, Nick Walton, told the US network CNBC, "running the model, together with the payments for Amazon Web Services servers, cost the company $200,000." "We spend hundreds of thousands of dollars a month on AI, and we're not a big start-up, so it was a very high cost," he added. That amount paid for the AI to respond to millions of user queries.
Microsoft's Athena, a solution for OpenAI
While ChatGPT continues to generate unsustainable maintenance costs, Microsoft is working on an AI chip called Athena. Its goal is to accelerate progress on OpenAI models such as ChatGPT and GPT-4 and, in addition, to reduce execution costs. That would bring down the nearly $700,000 a day that running the popular OpenAI platform currently costs.
While it may come as a surprise that Microsoft is developing a chip to help OpenAI, the reality is that the AI models of the company founded by Bill Gates have run on Nvidia chips, and Nvidia is, curiously, also a supplier to OpenAI.