US-based AI chip startup Groq has raised $750 million at a $6.9 billion valuation, more than double the $2.8 billion it was valued at just 13 months ago.
The round was led by Disruptive with significant investment from BlackRock, Neuberger Berman, Deutsche Telekom Capital Partners and a large US-based West Coast mutual fund manager. The raise also included continued support from Samsung, Cisco, D1, Altimeter, 1789 Capital and Infinitum.
“Inference is defining this era of AI, and we’re building the American infrastructure that delivers it with high speed and low cost,” said Jonathan Ross, Groq founder and CEO, in the announcement.
The startup powers AI apps for over two million developers and is delivering something that once felt out of reach: AI responses faster than human thought.
The need for speed
While headlines often orbit Nvidia, Groq has been building to break the pattern. Its language processing units (LPUs) are designed to beat traditional chips at inference. Recent tests show Groq hitting 750 tokens per second on ChatGPT-style responses, compared with 30 to 60 tokens per second on conventional GPUs.
The trick sits in what Groq calls deterministic processing. Instead of juggling many tasks at once the way GPUs do, Groq’s LPUs keep memory on-chip in SRAM, which enables sub-millisecond latency and up to 10 times better energy efficiency per token. That is why BlackRock, Samsung, and Deutsche Telekom did not just write checks; they bet on a different way of running AI.
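For a sense of what those throughput figures mean in practice, here is a quick back-of-the-envelope sketch in Python using the numbers cited above; the 500-token response length is an assumed illustration, not a published benchmark.

```python
# Rough comparison of response times at the throughput figures
# cited above. The 500-token response length is an illustrative
# assumption, not a number from Groq or Nvidia.

RESPONSE_TOKENS = 500  # assumed length of a ChatGPT-style answer

rates_tokens_per_sec = {
    "Groq LPU (reported)": 750,
    "Conventional GPU (low end)": 30,
    "Conventional GPU (high end)": 60,
}

for name, rate in rates_tokens_per_sec.items():
    total_seconds = RESPONSE_TOKENS / rate  # time to finish the answer
    ms_per_token = 1000 / rate              # latency per generated token
    print(f"{name}: {total_seconds:5.2f} s total, {ms_per_token:5.1f} ms/token")
```

At 750 tokens per second the full answer lands in well under a second, while the same response takes roughly 8 to 17 seconds at the GPU figures cited, which is the gap behind the “faster than human thought” framing.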
Sweet Saudi Arabian deal
Seven months ago, Groq landed a $1.5 billion commitment from Saudi Arabia to build what could become the world’s largest AI inference hub.
The Saudi deal alone is expected to generate approximately $500 million in revenue this year. Most startups never see that number. At the same time, Groq has expanded to 13 global data centers and is targeting 50 percent of global inference compute capacity by deploying over 100,000 LPUs.
What this means for the $97 billion AI inference market
The timing works in Groq’s favor. The global AI inference market was valued at $97.24 billion in 2024 and is projected to reach $253.75 billion by 2030. Nvidia still controls over 90 percent of the AI chip market, but that dominance is rooted in training, not the inference chips that run real applications.
Groq is not chasing Nvidia in training; it is opening a different front. Its partnerships with Meta for Llama 4 inference and with Bell Canada for large-scale infrastructure make the point with real deployments, not slideware.
When institutions like BlackRock commit nine figures, it reads less like hype and more like a call on where the market is headed.
Designs, disruption, depth
Groq says its name comes from the word “grok,” which means “to understand profoundly and intuitively.” Maybe the name fits.
The draw here is not just the capital; it is the timing. As AI shifts from training to deployment, the constraint moves from raw horsepower to speed and efficiency. Groq is betting that instant feels right, and that in a world of live apps and impatient users, specialized inference chips become non-negotiable.
With an estimated $3 billion-plus raised to date and a valuation far beyond unicorn territory, Groq has the firepower to try.