OpenAI turns to Google's AI chips to power its products, source says

Reports indicate that OpenAI has begun renting Google's AI chips, known as Tensor Processing Units (TPUs), to power products like ChatGPT. This marks a significant shift for OpenAI, which has until now relied heavily on Nvidia's GPUs and Microsoft's data centers.

Here's why this move is noteworthy:

 * Diversification of Chip Supply: OpenAI is one of the largest purchasers of Nvidia's GPUs. By using Google's TPUs, OpenAI is diversifying its hardware suppliers, aiming to reduce its dependence on any single provider.

 * Cost Reduction: The primary driver for the move appears to be cost. OpenAI hopes that renting Google's TPUs through Google Cloud will lower the cost of "inference computing" – the stage in which an already-trained model applies its fixed weights to new inputs to make predictions or generate responses (see the sketch after this list).

 * Reduced Reliance on Microsoft: This also signals a shift away from exclusive reliance on its major backer, Microsoft, and Microsoft's Azure data centers.

 * Google's TPU Push: For Google, this is a win for its in-house TPUs. While Google historically reserved its most powerful TPUs for internal use (like its Gemini project), it has been expanding external availability of these chips. Landing a high-profile customer like OpenAI demonstrates the capabilities and competitiveness of Google's TPU technology in the AI market.

 * Competitive Landscape: This collaboration is particularly interesting given that Google and OpenAI are direct competitors in the AI space. However, it highlights the intense demand for AI computing resources and the willingness of companies to partner even with rivals to meet their needs. It's important to note that Google is reportedly not providing OpenAI with its most powerful TPUs, likely to maintain a competitive advantage for its own AI development.
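
To make "inference computing" a bit more concrete, here is a minimal, hypothetical sketch of what running inference on a TPU can look like in JAX, a framework commonly used for TPU workloads. The toy model, layer sizes, and batch below are invented for illustration and do not reflect OpenAI's or Google's actual systems; the point is only that inference applies fixed, already-trained weights to new inputs, and that XLA compilation is how JAX targets TPU hardware.

```python
# Hypothetical sketch: "inference computing" on a TPU with JAX.
# The model and its shapes are placeholders, not any real production stack.
import jax
import jax.numpy as jnp

# On a Cloud TPU VM this lists TpuDevice entries; elsewhere it falls back to CPU/GPU.
print(jax.devices())

def forward(params, x):
    # Toy two-layer network standing in for a real model's forward pass.
    h = jnp.tanh(x @ params["w1"] + params["b1"])
    return h @ params["w2"] + params["b2"]

# jit compiles the forward pass with XLA, which is how JAX targets TPU (or GPU/CPU) backends.
forward_jit = jax.jit(forward)

key = jax.random.PRNGKey(0)
params = {  # pretend these are already-trained weights loaded from a checkpoint
    "w1": jax.random.normal(key, (128, 256)),
    "b1": jnp.zeros((256,)),
    "w2": jax.random.normal(key, (256, 10)),
    "b2": jnp.zeros((10,)),
}
x = jax.random.normal(key, (32, 128))  # a batch of 32 new inputs

logits = forward_jit(params, x)  # inference: fixed weights, new inputs, no training step
print(logits.shape)              # (32, 10)
```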

This move could have broader implications for the AI chip market, potentially boosting TPUs as a viable and more cost-effective alternative to Nvidia's dominant GPUs for certain AI workloads. OpenAI is also reportedly developing its own custom AI chips, further indicating a strategy to gain greater control over its hardware infrastructure and optimize for its specific AI needs.

