NVIDIA Unveils Downgraded RTX 5090D for Chinese Market


With US export policies restricting the advanced technology NVIDIA can ship to China, the company has announced a downgraded version of its flagship GeForce RTX 5090 tailored specifically for the Chinese market: the GeForce RTX 5090D. To comply with those restrictions, the RTX 5090D features significant reductions in memory and processing power compared to its global counterpart: VRAM capacity drops from 32 GB to 24 GB, and the CUDA core count falls from 21,760 to 14,080. The decision illustrates NVIDIA's balancing act between adhering to regulatory requirements and continuing to serve an essential segment of its consumer base.
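To put the cuts in perspective, the percentage reductions follow directly from the published figures above; this is simple arithmetic on the stated specs, not additional leaked data:

```python
# Published specs for the global RTX 5090 vs. the China-market RTX 5090D
rtx_5090 = {"vram_gb": 32, "cuda_cores": 21760}
rtx_5090d = {"vram_gb": 24, "cuda_cores": 14080}

def reduction_pct(full: float, cut: float) -> float:
    """Percentage reduction from the full-spec value to the cut-down value."""
    return round((full - cut) / full * 100, 1)

vram_cut = reduction_pct(rtx_5090["vram_gb"], rtx_5090d["vram_gb"])        # 25.0
core_cut = reduction_pct(rtx_5090["cuda_cores"], rtx_5090d["cuda_cores"])  # 35.3
print(f"VRAM reduced by {vram_cut}%, CUDA cores by {core_cut}%")
```

In other words, the China-market card gives up a quarter of its memory and roughly a third of its shader cores.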

Performance and Market Implications

The cuts to VRAM and CUDA core counts will meaningfully affect the RTX 5090D's performance: both specifications are central to handling the heavy demands of modern gaming and productivity workloads, so users should expect lower speed and efficiency than the full RTX 5090 delivers. MANLI, an NVIDIA board partner, confirmed the reduced specifications, which align with the US export rule capping memory bandwidth below 1.4 TB/s for GPUs shipped to China. Separately, rumors suggest NVIDIA may soon release another RTX 50-series Blackwell GPU, tentatively called the RTX 5080 Super or RTX 5080 Ti and projected to carry 24 GB of memory, broadening the lineup while staying within the same export guidelines. These developments underline NVIDIA's strategy of complying with international trade policy while remaining competitive. The RTX 5090D is expected to begin shipping between late July and early August.
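As a rough illustration of how the 1.4 TB/s cap constrains a card's memory configuration, the sketch below computes peak bandwidth as pin speed times bus width. The 28 Gbps GDDR7 pin speed and the 512-bit and 384-bit bus widths are assumed illustrative values, not figures confirmed by NVIDIA or MANLI for these products:

```python
def peak_bandwidth_tbps(pin_speed_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in TB/s: pin speed (Gbps) * bus width (bits),
    divided by 8 bits per byte and by 1000 GB per TB."""
    return pin_speed_gbps * bus_width_bits / 8 / 1000

EXPORT_CAP_TBPS = 1.4  # bandwidth limit for China exports, per the article

# Assumed configurations (illustrative, not official specs):
full_bus = peak_bandwidth_tbps(28, 512)  # 1.792 TB/s -> over the cap
cut_bus = peak_bandwidth_tbps(28, 384)   # 1.344 TB/s -> under the cap
print(full_bus > EXPORT_CAP_TBPS, cut_bus < EXPORT_CAP_TBPS)
```

Under these assumed values, a narrower memory bus is one straightforward way a 24 GB card could land below the export threshold.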
