OpenAI, the company behind ChatGPT, is reportedly taking steps to address the escalating scarcity and cost of artificial intelligence (AI) chips.
Consequently, delving into custom AI chip creation might be on OpenAI’s horizon, a move that mirrors the steps of tech giants like Amazon and Google.
The Dire GPU Landscape
Currently, OpenAI leans heavily on Graphics Processing Units (GPUs), with Nvidia, a market leader holding over 80% of the global market share, as its primary provider. GPUs are particularly suited to AI workloads because they can execute many operations in parallel.
However, with the increased global demand for AI-driven solutions, the availability of these vital chips has dwindled. Consequently, their prices have surged, leaving companies like OpenAI in a predicament.
CEO Sam Altman has not been silent about the strain. OpenAI’s flagship product, ChatGPT, exemplifies the gravity of the situation. Each query processed by ChatGPT costs approximately 4 cents. To put this in perspective, if ChatGPT’s usage grew to even a tenth of Google search’s scale, OpenAI would need an initial investment of around $48.1 billion worth of GPUs, followed by an annual investment of $16 billion for operations.
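A rough sanity check on these cited figures can be sketched in a few lines. The 4-cent per-query cost comes from the article; the assumption that Google handles roughly 8.5 billion searches per day is a commonly cited outside estimate, not a figure from the source:

```python
# Back-of-envelope sketch of the serving-cost figures cited above.
COST_PER_QUERY = 0.04            # dollars per ChatGPT query, as cited
GOOGLE_QUERIES_PER_DAY = 8.5e9   # assumption: commonly cited estimate

# A tenth of Google search's scale
daily_queries = GOOGLE_QUERIES_PER_DAY / 10
annual_cost = daily_queries * COST_PER_QUERY * 365

print(f"Queries/day at 1/10 Google scale: {daily_queries:,.0f}")
print(f"Rough annual serving cost: ${annual_cost / 1e9:.1f} billion")
```

This lands in the low tens of billions of dollars per year, the same order of magnitude as the $16 billion annual operations estimate cited above.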
To Build or Not to Build: OpenAI’s Conundrum
Faced with such costs and the potential threat of GPU shortages, OpenAI is evaluating alternatives. One viable option is to create custom AI chips tailored to their specific needs. Such a strategic shift would put OpenAI on the same path as Amazon and Google, which have explored chip design to better align with their operational demands.
Yet, venturing into chip creation is challenging. Besides the technical complexities, there are significant financial implications. Industry experts estimate that this endeavor could cost OpenAI hundreds of millions annually. And while acquisitions could expedite the process (much as Amazon’s acquisition of Annapurna Labs in 2015 did), they don’t eliminate the inherent risks of the venture.
Beyond Immediate Needs
While the immediate concern is addressing GPU shortages and costs, the implications of this move extend beyond operational efficiency. Microsoft, one of OpenAI's key investors, has supplied a massive supercomputer with 10,000 Nvidia GPUs since 2020. That setup cannot be sustained indefinitely, especially as artificial intelligence advances and ever more computing power is required.
Furthermore, a handful of players dominate the AI chip market, with Nvidia the clear leader. Diversifying chip sources or designing chips in-house would give OpenAI greater flexibility, reduce dependencies, and enable more tailored, efficient AI solutions.
Deciphering the Processing Power Puzzle
To grasp the difficulty, it helps to understand the computing demands of AI models. Training and fine-tuning them consume enormous resources: OpenAI's generative AI technologies run on a supercomputer with some 10,000 GPUs. These models are not only large but also continuously changing, and OpenAI's models have grown enormously in recent years, a trend that may well continue.
More parameters equate to more computational power, and therefore higher costs. With models reaching 175 billion parameters, as in the case of the GPT-3 model underlying ChatGPT, the computational demands are colossal. The expense is not only acquiring these GPUs but also the ongoing electricity, maintenance, and upgrade costs.
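To make the scale concrete, here is an illustrative sketch of the memory footprint of a 175-billion-parameter model. The 2-byte (16-bit) weight size and the 80 GB of memory per accelerator (typical of a high-end data-center GPU) are assumptions for illustration, not figures from the article:

```python
# Illustrative only: how much memory 175B parameters occupy,
# and how many GPUs are needed just to hold the weights.
PARAMS = 175e9          # parameter count, as cited
BYTES_PER_PARAM = 2     # assumption: fp16/bf16 weights
GPU_MEMORY_GB = 80      # assumption: high-end data-center GPU

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9
min_gpus = -(-weights_gb // GPU_MEMORY_GB)   # ceiling division

print(f"Weights alone: {weights_gb:.0f} GB")
print(f"Minimum GPUs just to hold the weights: {min_gpus:.0f}")
```

Even before accounting for activations, optimizer state, or redundancy, the weights alone exceed what any single GPU can hold, which is why inference and training are spread across large clusters.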
OpenAI’s potential foray into custom AI chip creation reflects a proactive approach to an industry-wide challenge. While the initial motivations are grounded in addressing the current GPU scarcity and high costs, the long-term implications could be game-changing.
Custom chips might yield more efficient AI models, lower operating costs, and set a precedent for other AI-driven businesses. However, as with any pioneering effort, the path ahead is unpredictable and full of hurdles. OpenAI's next steps will help define the road for the next phase of AI development.