AMD’s Monster AI Accelerator, MI300, Set to Ramp Up in the AI Market

At CES in January 2023, AMD CEO Dr. Lisa Su pulled a “one more thing” move at the end of her keynote and unveiled a monster AI accelerator, the MI300. This highly anticipated product is now making its way to market, with AMD already shipping MI300A accelerators for the El Capitan Exascale Supercomputer.

Shipping of MI300A Accelerators

During a call with investors and analysts, Dr. Su confirmed that shipments of MI300A accelerators have begun. These accelerators will power the El Capitan Exascale Supercomputer, showcasing the capabilities and performance of AMD’s latest innovation.

Overview of MI300A

The MI300A is a beast of a CPU+GPU accelerator, boasting an impressive 146 billion transistors. With 24 Zen 4 CPU cores and CDNA 3 GPU chiplets combined on a single package, this computing powerhouse is designed to handle the most demanding AI workloads. The tight coupling of high-performance CPU and GPU silicon is expected to deliver exceptional performance and efficiency for AI applications.
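For developers, the practical upshot of packaging CPU cores and GPU chiplets together is that the part still shows up to software much like any other ROCm accelerator. As a rough illustration only, not AMD sample code, the minimal HIP sketch below enumerates the accelerators a ROCm system exposes and prints their architecture name, compute unit count, and memory size; the fields used are part of the standard hipDeviceProp_t structure, but the exact values reported for an MI300A will depend on the installed hardware and driver stack.

```cpp
// Minimal sketch: list HIP-visible accelerators and print basic properties.
// Assumes a ROCm/HIP toolchain (compile with hipcc); output is illustrative
// and depends on the actual hardware and driver stack.
#include <hip/hip_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    if (hipGetDeviceCount(&count) != hipSuccess || count == 0) {
        std::printf("No HIP-capable accelerators detected.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        hipDeviceProp_t prop{};
        if (hipGetDeviceProperties(&prop, i) != hipSuccess) continue;
        std::printf("Device %d: %s (arch %s), %d compute units, %.1f GiB, integrated APU: %s\n",
                    i,
                    prop.name,
                    prop.gcnArchName,
                    prop.multiProcessorCount,
                    prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0),
                    prop.integrated ? "yes" : "no");  // non-zero when the device shares memory with the CPU
    }
    return 0;
}
```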

AMD’s Plans for the MI300 Series

AMD has ambitious plans for its MI300 series of accelerators. The MI300A, featuring both CPU and GPU capabilities, is just the beginning. The company also has a GPU-only product in the pipeline, the MI300X. With these products, AMD aims to cater to different market segments and meet varied AI computing requirements.

Shipping of MI300X to Cloud Providers and OEMs

Dr. Su further revealed that the GPU-only version of the MI300, the MI300X, will ship to cloud providers and original equipment manufacturers (OEMs) in the coming weeks. This move is expected to enable cloud-based AI inference and training workloads, offering flexibility and scalability for AI infrastructure.

Revenue Projections for Data Center GPU

AMD has high expectations for its data center GPU revenue. Dr. Su stated that the company anticipates generating $400 million in data center GPU revenue for the fourth quarter of 2023 alone. Impressively, this number is projected to skyrocket to over $2 billion in 2024. If these projections come to fruition, the MI300 series will be the fastest product to ramp up to $1 billion in sales in AMD’s history.

Focus on the AI Market

Similar to Nvidia, AMD is fully embracing the AI hype train, fueled by surging demand and the higher profit margins associated with AI products compared to gaming products. The company recognizes the tremendous growth potential in the AI market and is strategically positioning itself to capture a significant share of this expanding segment.

Overall Earnings and Growth

While AMD’s quarterly results were a mixed bag, one area that stood out was the significant growth in its Ryzen processor business. This growth showcases AMD’s strong position in the CPU market and bodes well for the MI300 series, which integrates powerful CPUs and GPUs.

With the MI300 series of accelerators, AMD is making a bold statement in the AI market. The company’s focus on delivering high-performance computing solutions tailored for AI workloads aligns with the increasing demand for AI infrastructure. As the MI300A accelerators make their way to the El Capitan Exascale Supercomputer and the MI300X targets cloud providers and OEMs, AMD anticipates substantial revenue growth in the data center GPU segment. The potential success of the MI300 series, combined with impressive Ryzen growth, positions AMD as a key player in the AI market. As this sector continues to expand rapidly, AMD’s innovative solutions are poised to make a significant impact.
