
6.17.2025

Groq just made Hugging Face way faster — and it’s coming for AWS and Google

Groq, the artificial intelligence inference startup, is making an aggressive play to challenge established cloud providers such as Amazon Web Services and Google.

The company announced Monday that it now supports Alibaba's Qwen3 32B language model with its full 131,000-token context window — a technical capability it claims no other fast inference provider can match. At the same time, Groq became an official inference provider on Hugging Face's platform, potentially exposing its technology to millions of developers worldwide.