
Fortune Business Insights estimates the global AI market at $233.46 billion in 2024, projecting growth from $294.16 billion in 2025 to $1.77 trillion by 2032, a compound annual growth rate (CAGR) of 29.2% over the 2025–2032 forecast period.
Dublin, Ireland, March 23, 2025 – Cerebras Systems, an AI hardware company seeking to rival Nvidia, today announced a significant expansion of its data center infrastructure along with two key enterprise partnerships. The moves are designed to position Cerebras as a leading provider of high-speed AI inference.
The company’s inference capacity will scale to over 40 million tokens per second through the addition of six new AI data centers across North America and Europe. Roughly 85% of the new capacity will be located in the United States, with facilities in Dallas, Minneapolis, Oklahoma City, and New York, complemented by sites in Montreal and France.
This data center expansion underscores the company’s bold expectation that the market for high-speed AI inference—the process of generating outputs for real-world applications using trained AI models—will grow significantly, as businesses seek faster alternatives to GPU-based solutions from Nvidia.
Alongside the infrastructure expansion, Cerebras announced partnerships with AlphaSense, a market intelligence platform widely used in the financial services industry, and Hugging Face, a prominent AI development platform.
The Hugging Face integration will give the platform’s five million developers one-click access to Cerebras Inference, eliminating the need to register with Cerebras individually. This could become a major distribution channel for Cerebras, particularly among developers using open-source models such as Llama 3.3 70B.
The AlphaSense partnership represents a major enterprise customer win, with the financial intelligence platform switching to Cerebras from a “global, top-three closed-source AI model vendor.” Cerebras is helping AlphaSense, which serves over 85% of Fortune 100 companies, accelerate its AI-driven market intelligence search capabilities.
About Cerebras Systems
Cerebras Systems is composed of pioneering computer architects, computer scientists, deep learning researchers, and engineers united to accelerate generative AI by building a new class of AI supercomputer from the ground up. Our flagship product, the CS-3 system, is powered by the Wafer-Scale Engine-3, the world’s largest and fastest commercially available AI processor. CS-3s can be easily clustered to create the largest AI supercomputers in the world, simplifying model placement by eliminating the complexities of distributed computing. Cerebras Inference delivers groundbreaking inference speeds, enabling customers to build cutting-edge AI applications. Leading corporations, research institutions, and governments use Cerebras solutions to develop proprietary models and to train open-source models with millions of downloads. Cerebras solutions are available via the Cerebras Cloud and on-premises. For more information, visit cerebras.ai.


Media Contact
Monument
+353 (0)8 1800 5284
5 Earlsfort Terrace
Source: Monument