Cerebras Systems to Launch Six AI Inference Data Centers by 2025

Cerebras Systems is embarking on a major expansion, launching six new AI inference data centers across North America and Europe in 2025 to extend high-speed AI capabilities globally. The facilities will be equipped with thousands of Cerebras CS-3 systems and are projected to deliver more than 40 million tokens per second on Llama 70B, Meta’s open-source large language model. Announced locations include Minneapolis, Oklahoma City, and Montreal, along with three undisclosed sites in the US Midwest and East and in Europe. The endeavor aims not only to bolster Cerebras’ market presence but also to drive advancements in AI research and business applications.
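To put the headline figure in perspective, a quick back-of-envelope calculation shows the implied per-system throughput. The article states only "thousands of CS-3 systems" and "over 40 million tokens per second," so the fleet size used below is an illustrative assumption, not a reported number.

```python
# Back-of-envelope: implied per-system Llama 70B throughput.
# TOTAL_TOKENS_PER_SEC comes from the article; ASSUMED_SYSTEMS is a
# hypothetical fleet size consistent with "thousands of CS-3 systems".
TOTAL_TOKENS_PER_SEC = 40_000_000
ASSUMED_SYSTEMS = 2_000

per_system = TOTAL_TOKENS_PER_SEC / ASSUMED_SYSTEMS
print(f"Implied throughput: {per_system:,.0f} tokens/s per CS-3 system")
```

Under these assumptions, each CS-3 would need to sustain on the order of 20,000 Llama 70B tokens per second; a larger fleet would lower the per-system requirement proportionally.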

Advanced Infrastructure and High-Speed AI Inference

Deployment will be phased: the Minneapolis center is set to come online in the second quarter of 2025, followed by the Oklahoma City and Montreal centers in June and July, respectively. The Oklahoma City site is poised to be a cornerstone facility, housing more than 300 CS-3 systems in the Scale Datacenter. It will use water-cooling to maintain performance and energy efficiency, and is designed to be among the most robust and advanced data centers in the United States. The Montreal facility, operated by Enovum, a division of Bit Digital, Inc., is expected to significantly expand AI capacity in the region. The remaining three data centers are scheduled for completion by the fourth quarter of 2025, reinforcing Cerebras Systems’ commitment to expanding its infrastructure and services.

Strategic Importance and Future Prospects

The expansion is aimed at cementing Cerebras Systems’ position as a leading provider of high-speed AI inference, supporting continued US leadership in AI technology while meeting growing demand for sophisticated AI solutions. The centers are expected to serve as hubs for critical research and to enable transformative business efficiencies worldwide. Each CS-3 is built around Cerebras’ Wafer Scale Engine 3, a wafer-scale chip with four trillion transistors and 900,000 AI cores, representing a substantial leap in computational power. These advances underpin the firm’s strategic vision of fostering innovation and delivering cutting-edge technology to its users. Additionally, Cerebras’ confidential IPO filing with the SEC in 2024 signals its readiness to engage more broadly with the market, further solidifying its ambitious growth trajectory.
