OpenAI Considers Building Its Own AI Chips to Power ChatGPT and Other AI Models

ChatGPT owner OpenAI is exploring ways to secure more of the powerful artificial-intelligence chips needed to run tools like its popular chatbot, including building its own, according to a new report. People familiar with the company’s plans say the Microsoft-backed firm has gone as far as evaluating a potential acquisition target to help make that happen. OpenAI has not yet decided to proceed, but it has discussed various options since last year to address the shortage of the expensive AI chips it relies on to run ChatGPT and other programs.

Developing its own AI chips would be an ambitious and costly endeavor with no guarantee of paying off, which is why most tech giants have opted for the less risky path of buying chips from commercial providers such as Nvidia and Advanced Micro Devices. If OpenAI decides to design its own chips, the process could take several years and leave it reliant on commercial processors in the interim.

However, if it decides to proceed down this path, OpenAI would join the handful of large tech companies that design the critical chips for their businesses in-house. That list includes Google, which builds its own tensor processing units, and Amazon, which took on chip design after acquiring Annapurna Labs in 2015; Meta has also been developing custom AI chips of its own.

The development of an in-house chip is attractive to these companies because it can help them manage costs more effectively. For example, according to an analysis by Bernstein analyst Stacy Rasgon, each ChatGPT query currently costs the company around 4 cents. If the service were to grow to one-tenth the scale of Google search, it would require an initial investment of around $48.1 billion in GPUs and about $16 billion worth of chips per year to keep running.
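As a rough sanity check on those figures, the per-query cost can be scaled up directly. The Google query volume below is an assumption (Google search is commonly estimated at around 8.5 billion queries per day; that number does not come from the report), so this is only an order-of-magnitude sketch:

```python
# Order-of-magnitude sketch of ChatGPT's annual chip costs at
# one-tenth of Google search scale, using the per-query figure
# cited above plus an assumed Google query volume.

COST_PER_QUERY_USD = 0.04        # Rasgon's estimate: ~4 cents per query
GOOGLE_QUERIES_PER_DAY = 8.5e9   # assumption, not from the report

daily_queries = GOOGLE_QUERIES_PER_DAY / 10      # one-tenth of Google scale
daily_cost = daily_queries * COST_PER_QUERY_USD  # ~$34M per day
annual_cost = daily_cost * 365                   # ~$12.4B per year

print(f"~${annual_cost / 1e9:.1f}B per year in compute")
```

The result, roughly $12 billion per year, lands in the same ballpark as the ~$16 billion annual figure cited above, which suggests the per-query and aggregate estimates are at least internally consistent.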

If OpenAI decides to pursue this strategy, it would still need to rely on Taiwan Semiconductor Manufacturing Company (TSMC) to manufacture its chips. The Taiwanese company currently operates the most advanced semiconductor production capacity in the world, according to data from GlobalData.

As more companies scramble to develop generative AI models and integrate them into their systems, demand for the high-powered graphics processing units (GPUs) required to deploy these tools has surged. That has led to a shortage of the costly chips that many startups and large enterprises depend on to build and train these programs, forcing some of them to seek alternative means of securing the chips they need.
