Alphabet's Google Cloud division has unveiled a new generation of tensor processing units and signed a wave of partnership deals that analysts say will cement the company's position as the cheapest provider of AI compute at scale. Shares of Google climbed 1.7% on the news, outperforming the broader market even as oil spiked above $100 a barrel on fresh tensions in the Strait of Hormuz.
The new lineup splits Google's custom silicon into two purpose-built products: the TPU 8T for training AI models and the TPU 8i for running those models in production. The split mirrors moves by Amazon Web Services, but Bloomberg's Ed Ludlow noted that Google has "led the pack" in designing its own chips and vertically integrating them into its cloud.
Alongside the hardware launch, Google announced an expanded partnership with Nvidia to power agentic and physical AI workloads, a deal with Oracle to let joint customers query Oracle data in natural language, and agreements with Salesforce, CrowdStrike, Broadcom, and startups including Mira Murati's Thinking Machines.
Mandeep Singh, senior technology analyst at Bloomberg Intelligence, said the TPU launch is more than an incremental upgrade. "Right now it is about having the capacity to deploy AI workloads. That's what everyone is scrambling for," Singh said. He pointed out that Anthropic has already been forced to alter its models because it lacks the compute capacity to serve its best model at scale.
Google, by contrast, owns the full stack. "Google has that advantage when it comes to owning that TPU supply and also deploying it at scale with their family of apps," Singh said. He expects the gap between Google Cloud and its rivals to widen when Alphabet reports earnings on April 29. Last quarter, the cloud segment grew close to 50%, compared with 40% growth at Microsoft Azure.
The economics, Singh argued, are what set Google apart. "Not only are they growing faster than the other cloud players, their margins are expanding, and that's where people will realize they are the lowest cost token provider when you compare them to everyone else out there," he said. Google is already processing around 1,500 trillion tokens per month, roughly three to four times the volume of OpenAI and Anthropic combined.
That scale advantage is reinforced by the structure of the new deals. Singh highlighted the contrast between Amazon's commercial relationship with Anthropic, in which AWS offered $5 billion of credits to spend on its cloud, and Google's agreement to supply Anthropic with 3.5 gigawatts of TPU capacity without offering credits in return. "That is a sign that illustrates Google probably is in a better position when it comes to negotiating with these frontier LLMs," he said.
The TPU launch also landed on the same day that Anthropic confirmed a small group of unauthorized users had accessed Mythos, its newest model, which the company says could enable dangerous cyberattacks. Bloomberg's coverage noted that Mythos may have been trained entirely on Google TPUs, a detail that would validate the chips' capability for frontier training workloads.
For now, the immediate question is how quickly the partnership announcements translate into revenue. Singh said that as AI workload share shifts, Google's share of the cloud infrastructure market will rise meaningfully above its traditional position. "This will probably be the biggest driver of that, owning the TPUs and then obviously their chips doing far better than other ASIC providers including Amazon," he said.
