In a surprising move, OpenAI announced that it currently has no active plans to deploy Google's in-house AI chips at scale. While early tests of Google's tensor processing units (TPUs) are underway, the lab is sticking with hardware from its trusted partners, Nvidia's GPUs and AMD's AI chips, to drive its innovations. 🤖
Testing new hardware is routine in the AI world, but deploying it at scale requires different architecture and additional software support. For now, OpenAI remains focused on what works best, and it is even on track to develop its very own chip, a milestone expected later this year. 🚀
Adding a twist to the tech tale, OpenAI has also signed on with Google Cloud to support its growing computing needs, mainly through GPU servers run by CoreWeave. The deal shows that in today's fast-paced tech scene, even competitors can join forces behind the scenes. 🔥
Despite earlier speculation about a switch to more Google AI chips, OpenAI's strategy underlines a balanced approach: keep testing new possibilities while relying on proven, robust technology. Stay tuned for more updates as the story unfolds!
Reference(s):
cgtn.com