OpenAI, one of the best-funded AI startups in the business, is exploring the possibility of manufacturing its own AI chips.

According to Reuters, internal discussions about AI chip strategy have been ongoing within the company since at least last year, driven by the worsening shortage of the chips required to train AI models. OpenAI is said to be weighing several paths forward, including acquiring an existing AI chip maker or launching an in-house chip design effort.

CEO Sam Altman has reportedly placed a high priority on securing additional AI chips for OpenAI.

At present, like many of its competitors, OpenAI relies heavily on GPU-based hardware to develop models such as ChatGPT, GPT-4, and DALL-E 3. GPUs are favored for their ability to execute numerous computations in parallel, making them well-suited for training advanced AI models.

However, the surge in demand for generative AI technologies has placed tremendous strain on the GPU supply chain. Microsoft, for instance, has warned of shortages of the server hardware needed to support AI workloads severe enough that they could lead to service disruptions. And Nvidia, the dominant GPU manufacturer, has reported that its best-performing AI chips are sold out until 2024.

GPUs are also essential for running and serving OpenAI’s models, necessitating the use of GPU clusters in the cloud, albeit at considerable expense.

Bernstein analyst Stacy Rasgon estimates that if ChatGPT queries grew to one-tenth the scale of Google Search, OpenAI would need roughly $48.1 billion worth of GPUs up front and about $16 billion worth of chips a year to keep the service running.

OpenAI wouldn’t be the first organization to explore creating its own AI chips. Companies like Google, Amazon, and Microsoft have already developed proprietary processors for AI tasks. Google, for example, employs TPUs (tensor processing units) for training large generative AI systems. Amazon offers its Trainium and Inferentia chips to AWS customers for training and inference, respectively. Microsoft is reportedly collaborating with AMD on an in-house AI chip named Athena, which OpenAI is said to be testing.

OpenAI possesses significant resources for R&D, having raised over $11 billion in venture capital and approaching $1 billion in annual revenue. A recent Wall Street Journal report suggests that the company is also considering a share sale that could potentially value it at $90 billion in the secondary market.

However, the AI chip industry is highly competitive and fraught with challenges. AI chipmaker Graphcore, for instance, saw its valuation cut sharply after a deal with Microsoft fell through, and subsequently announced job cuts, citing the challenging economic environment. Habana Labs, Intel's AI chip subsidiary, laid off roughly 10% of its workforce. And Meta's custom AI chip efforts have been beset with problems, leading the company to abandon some of its experimental hardware projects.

Even if OpenAI commits to developing its own custom AI chip, such an endeavor could span several years and involve annual expenditures in the hundreds of millions of dollars. It remains uncertain whether the startup’s investors, including Microsoft, are willing to embrace such a risky venture.