OpenAI: the chips that will revolutionize AI


Greg Russell


18 September 2023, 12:06PM GMT+00:00

In Brief

OpenAI's Plan for AI Chips: OpenAI is considering creating its own AI chips due to the increasing demand for chips to train AI models, potentially through acquisitions or in-house development.

Challenges with Current GPU Usage: The current reliance on GPU-based hardware for AI training is facing challenges, including shortages in the market and complexities in scaling up operations, leading to increased costs.

Need for Custom AI Chips: Developing custom AI chips could alleviate the challenges faced by OpenAI and other companies relying on GPU clusters, potentially reducing costs and improving performance.

Financial Resources and Industry Trends: OpenAI has substantial venture capital funding and annual revenue, enabling it to invest heavily in research and development despite the challenging nature of the hardware business. Other tech giants like Google, Amazon, and Microsoft are also investing in proprietary AI chips, signaling a broader trend in the industry.

Potential Challenges and Future Outlook: Developing custom AI chips is a complex and costly endeavor, as evidenced by the difficulties faced by other AI chip makers. OpenAI's bold move has the potential to reshape the industry, but success will depend on overcoming technical and financial hurdles in the years ahead.


Revolutionizing AI with OpenAI Chips

In a thrilling move, OpenAI, one of the industry’s leading AI startups, is reportedly planning to create its own AI chips. The company has been discussing this huge undertaking since last year, reflecting the urgency of the demand for chips to train AI models. OpenAI is said to be eyeing various approaches to advance its chip ambitions, from acquiring an AI chipmaker to developing chips in-house.

The company’s CEO, Sam Altman, has cited acquiring more AI chips as a key strategic priority for boosting OpenAI’s technological capabilities. At present, OpenAI, like many other AI players, uses GPU-based hardware to build high-end models such as ChatGPT, GPT-4, and DALL-E 3. GPUs are well suited to training the most powerful AI systems we have today because they can handle many calculations in parallel.

Nevertheless, the recent surge in demand for generative AI has overwhelmed the GPU supply chain, leading to shortages in the market. Tech giants like Microsoft are already experiencing shortages of server hardware that could interfere with their services. Furthermore, Nvidia’s top-performing AI chips are reportedly on back order through 2024.

Why OpenAI Needs Its Own Chips

In addition to training models, GPUs are also required to run and serve OpenAI’s AI applications in the cloud. The drawback for the company is that procuring and operating GPU clusters is a complex and expensive exercise. For example, one analysis argued that, based on Google search volumes, scaling ChatGPT up to search-level query traffic would initially require billions of dollars’ worth of GPUs and cost billions per year to operate.
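The shape of that estimate can be sketched as a back-of-envelope calculation. All of the input figures below (query volume, per-GPU throughput, hardware and operating costs) are illustrative assumptions chosen for the sketch, not numbers from the cited analysis:

```python
# Back-of-envelope estimate of the GPU cost to serve an LLM at search scale.
# Every input here is an illustrative assumption, not a reported figure.

queries_per_day = 8_500_000_000   # assumed search-scale daily query volume
queries_per_gpu_per_sec = 1       # assumed sustained LLM throughput per GPU
gpu_unit_cost = 30_000            # assumed purchase price per GPU (USD)
gpu_yearly_opex = 12_000          # assumed power/hosting cost per GPU/year (USD)

queries_per_sec = queries_per_day / 86_400          # spread over a day
gpus_needed = queries_per_sec / queries_per_gpu_per_sec

capex = gpus_needed * gpu_unit_cost                 # upfront hardware spend
opex_per_year = gpus_needed * gpu_yearly_opex       # recurring running cost

print(f"GPUs needed: {gpus_needed:,.0f}")
print(f"Upfront hardware: ${capex / 1e9:.1f}B")
print(f"Yearly operating cost: ${opex_per_year / 1e9:.1f}B")
```

Even with conservative assumptions, the totals land in the billions for hardware and in the billions per year to run, which is the dynamic that makes custom silicon attractive.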

As groundbreaking as OpenAI’s development of its own AI chips would be, it is not without precedent. Google has developed the tensor processing unit (TPU) for training large generative AI systems, and Amazon offers proprietary AI chips to AWS customers. Microsoft has even been reported to be working with AMD on an in-house AI chip called Athena, which OpenAI is said to be testing.

OpenAI has more than $11 billion in venture capital funding and nearly $1 billion in annual revenue, making it possible for the company to invest heavily in research and development. The startup is in fact contemplating a share sale that could push its valuation to $90 billion. Nevertheless, the hardware business, and especially AI chips, is an unforgiving one.

Difficulties in Production

The difficulties encountered by AI chip makers such as Graphcore, whose recent valuation drop was followed by job cuts, go to emphasize how complex the industry is. Earlier this year, Intel-owned AI chip company Habana Labs also had to lay off part of its team, while Meta has faced multiple problems with its custom AI chip efforts.

It is worth mentioning that even if OpenAI rushed a custom chip to market, the effort would still take years and cost hundreds of millions of dollars a year. Moreover, OpenAI’s investors, who include Microsoft, may balk at such a daring move.

Finally, by developing its own AI chips, OpenAI is moving into uncharted waters that could change the AI field. The company now has impressive resources and a vision with the potential to reshape the industry even further, pushing beyond the existing capabilities of AI. But a hard road lies ahead, and only time will tell whether its bold bet pays off.

