Using its own chips could help OpenAI manage the costs of running its products in addition to addressing GPU shortages. Each ChatGPT query costs about 4 cents, according to an analysis by Bernstein Research's Stacy Rasgon. Although the service lost users for the first time in July, it reached 100 million monthly users within its first two months, which translates to millions of queries per day. According to Rasgon, if ChatGPT queries grow to a tenth of the volume Google Search receives, OpenAI would initially need about $48.1 billion worth of GPUs and would eventually spend roughly $16 billion annually on chips.
The Microsoft supercomputer that OpenAI used to develop its technology runs on 10,000 NVIDIA GPUs, a sign of how thoroughly NVIDIA dominates the market for AI chips. That dominance has pushed other businesses, including larger players in the tech sector, to begin creating chips of their own. According to The Information, Microsoft, OpenAI's biggest backer, has been developing its own AI chip, called Athena, since 2019, and OpenAI is reportedly already testing it.
According to Reuters, OpenAI has not yet decided whether to carry out its plans. Even if it chooses to proceed, it could take years before the company can power its products with its own chips.