Can Intel’s Arrow Lake iGPU Outperform AMD’s Radeon 890M?

The ongoing battle between Intel and AMD has reached another milestone, with the iGPU of Intel’s latest Arrow Lake series being put to the test against AMD’s trusted Radeon 890M. When Intel unveiled the Xe-LPG+-based Arc 130T, many wondered whether it could rival AMD’s established dominance in integrated graphics. Geekbench’s OpenCL test offers some revealing insight into this contest, suggesting that Intel’s refined architecture is more than an incremental enhancement. Scoring 33,508 points, the Arc 130T posted a significant lead over Intel’s own, more modern Xe2-based Arc 140V, which clocked in around the 27-28K mark. While that score doesn’t outshine AMD’s Radeon 890M, which sits at a commendable 37,804 points, the results speak volumes about the efficiency of Intel’s refined Xe-LPG+ architecture. The scenario raises questions about the trade-off between brand-new architectures and well-optimized older ones.

Intel’s Xe-LPG+ vs. Xe2: Breaking Down the Scores

In the realm of integrated GPUs, architecture plays a pivotal role in determining overall performance and efficiency. The Xe-LPG+ architecture of the Arc 130T is essentially a polished version of the older Xe-LPG and has been shown to excel in standardized benchmarks such as OpenCL. So how does this refined architecture stack up against the newer but seemingly less optimized Xe2 cores found in the Arc 140V? The Intel Core Ultra 5 225H "Arrow Lake-H" CPU, which houses the Arc 130T iGPU, features a 14-core configuration of 4 P-Cores and 10 E-Cores, with a base clock of 1.70 GHz boosting up to 4.9 GHz. Tested in a Samsung laptop equipped with 16 GB of memory, the system harnessed the refined iGPU’s potential effectively: the 7 Xe-LPG+ cores outperformed the 8 Xe2 cores of the Arc 140V, thanks in part to better support for older OpenCL APIs among other enhancements.

The Geekbench CPU results add further perspective: the Intel Core Ultra 5 225H achieved a single-core score of 2,547 points and a multi-core score of 12,448 points. These figures suggest only modest CPU gains over its predecessor, the Core Ultra 5 155H. It is the iGPU performance that steals the limelight, however, showing how architectural refinement and optimization for existing APIs can push an older design into competitive territory against newer but less mature counterparts.

Comparing Arrow Lake with the Competition

AMD’s Radeon 890M continues to set a high bar, achieving 37,804 points in the same OpenCL test. The Arc 130T is a respectable competitor, but it does not take the lead. These scores also reflect only a standardized benchmark, which raises questions about real-world performance, especially in gaming, where driver optimization can substantially affect results. Synthetic metrics alone cannot settle whether the Arc 130T’s Xe-LPG+ architecture holds a practical edge over AMD’s more powerful iGPU.
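As a back-of-the-envelope check on the gaps described above, the OpenCL scores quoted in this article can be compared directly. This is a minimal sketch using only the figures reported here; the Arc 140V value is an assumed midpoint of the article's 27-28K range, not a measured number:

```python
# Geekbench OpenCL scores as quoted in this article.
scores = {
    "Arc 130T (Xe-LPG+)": 33_508,
    "Arc 140V (Xe2)": 27_500,  # article gives a 27-28K range; midpoint assumed
    "Radeon 890M": 37_804,
}

baseline = scores["Radeon 890M"]
for name, score in scores.items():
    # Express each iGPU's score relative to the Radeon 890M.
    print(f"{name}: {score} ({score / baseline:.0%} of Radeon 890M)")
```

On these numbers the Arc 130T lands at roughly 89% of the Radeon 890M's score, while leading the Xe2-based Arc 140V by a clear margin, which is the shape of the gap the benchmarks above describe.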

Intel seems to focus on achieving better support for older APIs with the Xe-LPG+ architecture, aiming for smoother experiences in specific benchmarks. On the other hand, AMD is dedicated to producing high-performing, versatile products that excel beyond synthetic tests. Ongoing optimizations and future driver updates present opportunities for further performance improvements, adding complexity to this competitive tech narrative.

Intel’s Arrow Lake may advance iGPU technologies further through continued refinements. Although CPU improvements seem minimal, significant gains in iGPU performance, as seen in benchmarks, cannot be dismissed. The competition with AMD’s Radeon 890M highlights the dynamic nature of integrated graphics advancements. Future updates and driver enhancements promise more exciting developments, potentially reshaping user experiences and performance perceptions.
