NIST Deprioritizes Pre-2018 CVEs Amid Backlog and New Threats

The US National Institute of Standards and Technology (NIST) recently made a significant decision affecting the cybersecurity landscape by marking all Common Vulnerabilities and Exposures (CVEs) published before January 1, 2018, as “Deferred” in the National Vulnerability Database (NVD). The move affects over 20,000 entries and potentially up to 100,000, signaling that these CVEs will no longer be prioritized for enrichment updates unless they appear in the Cybersecurity and Infrastructure Security Agency’s (CISA) Known Exploited Vulnerabilities (KEV) catalog. NIST’s decision comes in response to an ongoing struggle with a growing backlog in processing vulnerability data, exacerbated by a 32% surge in submissions over the past year.
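For teams that want to see how the change looks against their own inventories, the new status is visible through NIST’s public CVE API. The short Python sketch below is only an illustration: it assumes the NVD CVE API 2.0 endpoint and the vulnStatus field its JSON responses currently carry, looks up a single CVE ID, and prints the reported status.

```python
# Minimal sketch: check a CVE's NVD status ("Deferred", "Analyzed", etc.).
# Assumes the public NVD CVE API 2.0 endpoint and its current JSON layout;
# adjust if the schema changes. Standard library only.
import json
import urllib.parse
import urllib.request

NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def nvd_status(cve_id: str) -> str:
    """Return the vulnStatus the NVD reports for a CVE ID."""
    url = f"{NVD_API}?{urllib.parse.urlencode({'cveId': cve_id})}"
    with urllib.request.urlopen(url, timeout=30) as resp:
        data = json.load(resp)
    vulns = data.get("vulnerabilities", [])
    if not vulns:
        return "not found"
    return vulns[0]["cve"].get("vulnStatus", "unknown")

if __name__ == "__main__":
    # Example pre-2018 identifier; KEV-listed entries such as this one
    # remain prioritized, so they may not show as "Deferred".
    print(nvd_status("CVE-2017-0144"))
```

Unauthenticated requests to the NVD API are rate-limited, so bulk lookups should be batched or run with an API key.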

An Overwhelming Backlog and Strategic Reprioritization

NIST’s challenges in processing and enriching the incoming flood of vulnerability data have delayed its goal of clearing the backlog by the end of fiscal year 2024, and the agency is developing new systems to handle the workload more efficiently. Industry experts consider the move practical given the complexity of managing vulnerabilities at scale. Ken Dunham of Qualys describes it as an evolution in the face of changing cyber threats, while Jason Soroko of Sectigo reads it as a strategic reprioritization: resources are redirected toward emerging threats on the assumption that legacy issues have already been mitigated through routine patch management.

The responsibility for managing deferred CVEs now shifts more heavily onto organizations. For security teams, this means identifying and monitoring legacy systems, prioritizing the patching of deferred vulnerabilities, and hardening or segmenting outdated infrastructure. Real-time threat intelligence becomes crucial for detecting attempts to exploit these older flaws. The shift highlights a broader trend: with CVE volumes rising and resources limited, organizations must adopt proactive risk management strategies rather than rely on upstream enrichment.
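One practical starting point, sketched below under assumptions about the public CISA KEV feed (the known_exploited_vulnerabilities.json download and its cveID field), is to cross-reference an internal inventory of pre-2018 CVEs against the KEV catalog: KEV-listed entries stay at the top of the patch queue, while everything else needs the organization’s own risk scoring now that NVD enrichment will no longer arrive by default. The inventory list shown is purely illustrative.

```python
# Minimal sketch: split an inventory of legacy (pre-2018) CVE IDs into those
# on CISA's KEV catalog (patch first) and those the organization must
# risk-rank itself now that NVD enrichment is deferred.
# Assumes the public KEV JSON feed and its "cveID" field.
import json
import urllib.request

KEV_FEED = (
    "https://www.cisa.gov/sites/default/files/feeds/"
    "known_exploited_vulnerabilities.json"
)

def load_kev_ids() -> set[str]:
    """Download the KEV catalog and return the set of listed CVE IDs."""
    with urllib.request.urlopen(KEV_FEED, timeout=60) as resp:
        catalog = json.load(resp)
    return {entry["cveID"] for entry in catalog.get("vulnerabilities", [])}

def triage_legacy_cves(inventory: list[str]) -> tuple[list[str], list[str]]:
    """Return (KEV-listed, not-listed) CVE IDs from a legacy inventory."""
    kev_ids = load_kev_ids()
    listed = [c for c in inventory if c in kev_ids]
    unlisted = [c for c in inventory if c not in kev_ids]
    return listed, unlisted

if __name__ == "__main__":
    # Hypothetical inventory, e.g. pulled from a scanner export.
    legacy = ["CVE-2017-0144", "CVE-2016-2183", "CVE-2015-4000"]
    exploited, needs_review = triage_legacy_cves(legacy)
    print("Patch first (KEV-listed):", exploited)
    print("Apply internal risk scoring:", needs_review)
```

In practice the "needs review" bucket would feed an internal scoring process that weighs asset exposure, exploit availability, and compensating controls, since those signals will no longer be refreshed upstream for deferred entries.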

Embracing Advanced Technology for Improved Efficiency

In addressing its backlog, NIST is also exploring the use of artificial intelligence (AI) and machine learning to streamline the processing of vulnerability data, in line with a broader industry trend toward automating vulnerability management. By incorporating these technologies, NIST aims to ensure that both older and newer vulnerabilities receive appropriate attention within the constraints of available resources. The approach underscores the need to balance attention to legacy vulnerabilities against staying ahead of emerging threats. Organizations are encouraged to adopt similar strategies, using automation to extend coverage of potential weaknesses. Beyond easing the immediate backlog, the shift sets the stage for more sustainable and scalable vulnerability management practices.

New Paradigm for Cybersecurity Management

NIST’s designation of pre-2018 CVEs as “Deferred” marks a turning point in how vulnerability data is curated and consumed. With over 20,000 entries affected, potentially up to 100,000, and continued enrichment reserved largely for KEV-listed flaws, the agency is responding to a 32% rise in submissions over the past year by concentrating its limited analyst effort where it matters most. The strategic shift aims to work down the backlog and allocate resources more efficiently, ensuring that newer and more critical vulnerabilities receive the attention they require for robust cybersecurity, while stewardship of older flaws falls increasingly to the organizations that still operate the affected systems.
