Can Intel’s Arrow Lake iGPU Outperform AMD’s Radeon 890M?

The ongoing battle between Intel and AMD has reached another milestone, with the latest iGPU from Intel’s Arrow Lake series being put to the test against AMD’s well-established Radeon 890M. When Intel unveiled the Xe-LPG+-based Arc 130T, many wondered whether it could challenge AMD’s dominance in integrated graphics. A Geekbench OpenCL result offers some revealing insight, suggesting that Intel’s refreshed architecture is more than an incremental update. Scoring 33,508 points, the Arc 130T posted a clear lead over Intel’s own, more modern Xe2-based Arc 140V, which typically lands in the 27–28K range. The score doesn’t overtake AMD’s Radeon 890M, which sits at a commanding 37,804 points, but it says a great deal about the efficiency of Intel’s refined Xe-LPG+ architecture. The result also raises a broader question: how much does a brand-new architecture matter compared with a well-optimized older one?
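To put those figures in perspective, here is a quick back-of-the-envelope comparison using only the scores quoted above (the Arc 140V value is taken as the midpoint of the reported 27–28K range); treat it as illustrative arithmetic, not a formal analysis.

```python
# Rough relative comparison of the Geekbench OpenCL scores quoted in this article.
scores = {
    "Intel Arc 130T (Xe-LPG+)": 33_508,
    "Intel Arc 140V (Xe2)": 27_500,  # midpoint of the reported 27-28K range
    "AMD Radeon 890M": 37_804,
}

baseline = scores["Intel Arc 130T (Xe-LPG+)"]
for name, score in scores.items():
    delta = (score / baseline - 1) * 100
    print(f"{name}: {score:,} points ({delta:+.1f}% vs. Arc 130T)")
```

By this measure, the Arc 130T lands roughly 18% ahead of the Arc 140V and about 13% behind the Radeon 890M in this particular test.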

Intel’s Xe-LPG+ vs. Xe2: Breaking Down the Scores

In the realm of integrated GPUs, architecture plays a pivotal role in overall performance and efficiency. The Xe-LPG+ architecture in the Arc 130T is essentially a polished version of the older Xe-LPG, and it appears to excel particularly in standardized benchmarks such as OpenCL. So how does this refined architecture stack up against the newer but seemingly less optimized Xe2 cores found in the Arc 140V? The Intel Core Ultra 5 225H "Arrow Lake-H" CPU, which houses the Arc 130T iGPU, features a 14-core configuration of 4 P-Cores and 10 E-Cores, with a 1.70 GHz base clock and a boost of up to 4.9 GHz. Tested in a Samsung laptop with 16 GB of memory, the system got the most out of its iGPU: the Arc 130T’s 7 Xe-LPG+ cores managed to outperform the Arc 140V’s 8 Xe2 cores, helped by more mature support for older OpenCL APIs among other refinements.
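For readers who want to see what OpenCL itself reports for an iGPU like this, the sketch below uses the third-party pyopencl package to enumerate GPU devices and print the properties most relevant to this comparison. This is a generic diagnostic, not Intel’s or Geekbench’s methodology, and the values it prints depend entirely on the installed driver.

```python
# Minimal OpenCL device survey (pip install pyopencl).
# Lists the device-level details that OpenCL benchmarks such as Geekbench build on.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices():
        if not (device.type & cl.device_type.GPU):
            continue  # skip CPU and accelerator devices
        print(f"Platform:         {platform.name}")
        print(f"Device:           {device.name}")
        print(f"OpenCL version:   {device.version}")            # API level exposed by the driver
        print(f"Compute units:    {device.max_compute_units}")  # not a 1:1 mapping to Xe cores
        print(f"Max clock (MHz):  {device.max_clock_frequency}")
        print(f"Global memory:    {device.global_mem_size / 2**30:.1f} GiB")
        print()
```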

The Geekbench CPU scores add another perspective: the Intel Core Ultra 5 225H achieved 2,547 points in the single-core test and 12,448 points in the multi-core test. Those figures suggest only modest gains over its predecessor, the Core Ultra 5 155H. It is the iGPU performance that steals the limelight, showing how architectural refinement and optimization for existing APIs can push an older design into competitive territory against newer but less mature counterparts.
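One rough way to read those CPU numbers, using only the figures quoted here, is the implied multi-core scaling; it says nothing about the Core Ultra 5 155H, whose scores are not listed in this article.

```python
# Multi-core scaling implied by the Geekbench CPU scores quoted above.
single_core = 2_547
multi_core = 12_448
core_count = 14  # 4 P-cores and 10 E-cores, per the configuration described above

scaling = multi_core / single_core
print(f"Multi-core scaling: {scaling:.2f}x the single-core score")
# Crude indicator only: E-cores are slower than P-cores, so ideal scaling is below 14x.
print(f"Fraction of naive linear scaling: {scaling / core_count:.0%}")
```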

Comparing Arrow Lake with the Competition

AMD’s Radeon 890M continues to set a high bar, scoring 37,804 points in the same OpenCL test, which makes Intel’s Arc 130T a respectable competitor but not the leader. These are standardized benchmark numbers, so real-world performance remains an open question, especially in gaming, where driver optimization can substantially change the results. It is therefore not yet clear how much practical advantage the Arc 130T’s Xe-LPG+ architecture holds against AMD’s more powerful iGPU.

With Xe-LPG+, Intel appears focused on better support for older APIs, aiming for smoother results in specific workloads and benchmarks, while AMD continues to target high-performing, versatile products that hold up beyond synthetic tests. Ongoing optimization and future driver updates leave room for further gains on both sides, which adds another layer to this competitive story.

With continued refinement, Intel’s Arrow Lake may push iGPU technology further still. The CPU gains look minimal, but the sizeable iGPU uplift seen in benchmarks is hard to dismiss. The contest with AMD’s Radeon 890M underlines how quickly integrated graphics is evolving, and future driver updates on both sides could well reshape real-world experiences and performance perceptions.
