Locator: 49356GOOGLE.
This needs to be completed, but noting it here so I don't forget:
November 6, 2025: huge announcement from Google -- it has developed its own AI chips in-house -- "Ironwood."
- also reached a deal with Apple for Google's Gemini to power AI features in Apple's Siri
News coverage is everywhere:
- Tom's Hardware, link here.
Today, Google Cloud introduced new AI-oriented instances powered by its own Axion CPUs and Ironwood TPUs. The new instances are aimed at both training and low-latency inference of large-scale AI models. Their key feature is efficient scaling of AI models, enabled by the very large scale-up world size of Google's Ironwood-based systems.
Ironwood is Google's 7th Generation tensor processing unit (TPU), which delivers 4,614 FP8 TFLOPS of performance and is equipped with 192 GB of HBM3E memory, offering a bandwidth of up to 7.37 TB/s.
Ironwood pods scale up to 9,216 AI accelerators, delivering a total of 42.5 FP8 ExaFLOPS for training and inference, far exceeding the FP8 capability of Nvidia's GB300 NVL72 system, which stands at 0.36 ExaFLOPS. The pod is interconnected via a proprietary 9.6 Tb/s Inter-Chip Interconnect network and carries roughly 1.77 PB of HBM3E memory in total, again exceeding what Nvidia's competing platform offers.
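The pod-level figures quoted above follow directly from the per-chip specs. A quick sanity check of that arithmetic (using the numbers as reported, with decimal units, i.e. 1 PB = 1e6 GB):

```python
# Per-chip Ironwood specs as quoted in the article
CHIP_FP8_TFLOPS = 4_614   # FP8 throughput per TPU
CHIP_HBM3E_GB = 192       # HBM3E capacity per TPU
CHIPS_PER_POD = 9_216     # maximum scale-up world size

# Aggregate the pod: TFLOPS -> ExaFLOPS and GB -> PB are both /1e6
pod_exaflops = CHIP_FP8_TFLOPS * CHIPS_PER_POD / 1e6
pod_memory_pb = CHIP_HBM3E_GB * CHIPS_PER_POD / 1e6

print(f"Pod FP8 compute: {pod_exaflops:.1f} ExaFLOPS")  # ≈ 42.5
print(f"Pod HBM3E:       {pod_memory_pb:.2f} PB")       # ≈ 1.77
```

Both quoted totals check out: 9,216 × 4,614 TFLOPS ≈ 42.5 ExaFLOPS and 9,216 × 192 GB ≈ 1.77 PB.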
It's important to note that none of this competes with Apple, and in fact, this is good news for Apple.

