NIST Deprioritizes Pre-2018 CVEs Amid Backlog and New Threats

The US National Institute of Standards and Technology (NIST) recently made a significant decision affecting the cybersecurity landscape by marking all Common Vulnerabilities and Exposures (CVEs) published before January 1, 2018, as “Deferred” in the National Vulnerability Database (NVD). This move affects over 20,000 entries and potentially up to 100,000, signaling that these CVEs will no longer be prioritized for enrichment updates unless they appear in the Cybersecurity and Infrastructure Security Agency’s (CISA) Known Exploited Vulnerabilities (KEV) catalog. NIST’s decision comes in response to a growing backlog in processing vulnerability data, exacerbated by a 32% surge in submissions over the past year.

An Overwhelming Backlog and Strategic Reprioritization

NIST’s challenges in processing and enriching the vast amount of incoming data have delayed its goal of clearing the backlog by the end of fiscal year 2024, and the agency is developing new systems to handle submissions more efficiently. Industry experts consider the deferral practical given the complexities of managing vulnerabilities at scale. Ken Dunham of Qualys describes it as an evolution in the face of changing cyber threats, while Jason Soroko of Sectigo interprets it as a strategic reprioritization: resources are being redirected toward emerging threats on the assumption that legacy issues have already been mitigated through routine patch management.

The responsibility for managing deferred CVEs now shifts more heavily onto organizations. For security teams, this means identifying and monitoring legacy systems, prioritizing the patching of deferred vulnerabilities, and hardening or segmenting outdated infrastructure. Using real-time threat intelligence to detect exploitation attempts against these vulnerabilities becomes crucial. This shift highlights a broader trend: with the volume of CVEs rising and resources limited, organizations must adopt proactive risk management strategies rather than relying on NVD enrichment alone.
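The deferral rule described above reduces to a simple triage predicate: a CVE published before January 1, 2018, is deferred unless it appears in CISA's KEV catalog. The sketch below illustrates how a security team might encode that logic; the bucket names and the `triage` helper are illustrative assumptions, not part of NIST's or CISA's tooling, and in practice the KEV ID set would be loaded from CISA's published catalog rather than hard-coded.

```python
from datetime import date

# Cutoff from NIST's announcement: CVEs published before 2018-01-01
# are marked "Deferred" in the NVD unless they appear in CISA's
# Known Exploited Vulnerabilities (KEV) catalog.
DEFERRAL_CUTOFF = date(2018, 1, 1)

def triage(cve_id: str, published: date, kev_ids: set) -> str:
    """Return a coarse priority bucket for a CVE under the new policy.

    Bucket names are illustrative, not official NVD statuses.
    """
    if cve_id in kev_ids:
        return "patch-now"   # known exploited: still prioritized by NIST
    if published < DEFERRAL_CUTOFF:
        return "deferred"    # no further NVD enrichment expected
    return "monitor"         # post-2018: continues to receive enrichment

# Tiny, made-up inventory against a one-entry KEV snapshot
kev = {"CVE-2017-0144"}  # EternalBlue, which does appear in the real KEV catalog
print(triage("CVE-2017-0144", date(2017, 3, 16), kev))  # patch-now
print(triage("CVE-2016-9999", date(2016, 6, 1), kev))   # deferred
print(triage("CVE-2023-1234", date(2023, 2, 2), kev))   # monitor
```

In a real pipeline, the same predicate would run over an asset inventory so that deferred-but-unpatched findings on legacy systems surface for manual review instead of silently aging out.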

Embracing Advanced Technology for Improved Efficiency

To address its backlog, NIST is also exploring artificial intelligence (AI) and machine learning to streamline the processing of vulnerability data, reflecting a broader industry trend toward leveraging advanced technologies for vulnerability management. By incorporating these tools, NIST aims to ensure that both older and newer vulnerabilities receive appropriate attention within the constraints of available resources, balancing legacy cleanup against emerging threats. Organizations are encouraged to adopt similar strategies, using automation to extend coverage of potential vulnerabilities. This approach not only addresses the immediate backlog but also sets the stage for more sustainable and scalable vulnerability management practices.

New Paradigm for Cybersecurity Management

NIST’s decision to defer pre-2018 CVEs marks a turning point in how the NVD allocates its limited resources. Rather than attempting full enrichment of every historical entry, the agency is concentrating effort where exploitation risk is demonstrable, keeping KEV-listed vulnerabilities current while letting dormant legacy records rest. For defenders, the message is clear: the NVD remains a vital resource, but comprehensive coverage of older vulnerabilities is now the responsibility of the organizations that still run the affected systems. This strategic shift aims to clear the backlog more effectively and ensure that newer, more critical vulnerabilities receive the attention required to maintain robust cybersecurity.
