NIST Deprioritizes Pre-2018 CVEs Amid Backlog and New Threats

The US National Institute of Standards and Technology (NIST) recently marked all Common Vulnerabilities and Exposures (CVEs) published before January 1, 2018, as “Deferred” in the National Vulnerability Database (NVD). The change affects more than 20,000 entries, and potentially up to 100,000, signaling that these CVEs will no longer be prioritized for enrichment updates unless they appear in the Cybersecurity and Infrastructure Security Agency’s (CISA) Known Exploited Vulnerabilities (KEV) catalog. The decision responds to a persistent backlog in processing vulnerability data, exacerbated by a 32% surge in submissions over the past year.
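The deferral policy reduces to a simple rule: a CVE is deferred if it was published before the cutoff date and is not KEV-listed. A minimal sketch of that rule (the function name and dates here are illustrative, not part of any NIST tooling):

```python
from datetime import date

# Cutoff from NIST's announcement: CVEs published before 2018-01-01
# are marked "Deferred" unless they appear in CISA's KEV catalog.
DEFERRAL_CUTOFF = date(2018, 1, 1)

def is_deferred(published: date, in_kev: bool) -> bool:
    """Return True if a CVE falls under NIST's deferral policy."""
    return published < DEFERRAL_CUTOFF and not in_kev

# An old CVE outside KEV is deferred; a KEV listing keeps it prioritized.
print(is_deferred(date(2015, 6, 1), in_kev=False))  # True
print(is_deferred(date(2015, 6, 1), in_kev=True))   # False
print(is_deferred(date(2020, 3, 1), in_kev=False))  # False
```

The KEV exception is the key detail: age alone does not demote a vulnerability if attackers are still exploiting it in the wild.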

An Overwhelming Backlog and Strategic Reprioritization

NIST’s challenges in processing and enriching the vast amount of incoming data have delayed its goal of clearing the backlog by the end of fiscal year 2024. In response, NIST is developing new systems to handle these issues more efficiently. Industry experts consider this move practical given the complexities of managing vulnerabilities at scale. Ken Dunham from Qualys describes it as an evolution in the face of changing cyber threats. Meanwhile, Jason Soroko from Sectigo interprets this as a strategic reprioritization, with resources redirected towards addressing emerging threats, assuming that legacy issues have been mitigated through routine patch management practices.

The responsibility for managing deferred CVEs now shifts more heavily onto organizations. For security teams, this means identifying and monitoring legacy systems, prioritizing the patching of deferred vulnerabilities, and hardening or segmenting outdated infrastructure. Using real-time threat intelligence to detect attempts at exploiting these vulnerabilities becomes crucial. This shift highlights a broader trend: organizations must adopt proactive risk management strategies as the volume of CVEs grows and the resources available to handle them remain limited.
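That triage can be sketched as a cross-reference of scan findings against CISA’s KEV catalog, whose published JSON feed carries a `vulnerabilities` list of entries with a `cveID` field. The inventory and two of the CVE IDs below are made up for illustration (CVE-2017-0144 and CVE-2021-44228 are real KEV entries):

```python
# Hypothetical triage sketch: split an inventory's CVE findings into
# "patch first" (listed in CISA's KEV catalog) and "deferred legacy"
# buckets. The kev_catalog dict mirrors the shape of CISA's JSON feed;
# in practice it would be downloaded from CISA rather than hardcoded.

kev_catalog = {
    "vulnerabilities": [
        {"cveID": "CVE-2017-0144"},   # EternalBlue: pre-2018, still exploited
        {"cveID": "CVE-2021-44228"},  # Log4Shell
    ]
}
kev_ids = {v["cveID"] for v in kev_catalog["vulnerabilities"]}

# Findings from a (made-up) scan of legacy systems.
inventory = ["CVE-2016-9999", "CVE-2017-0144", "CVE-2015-1234"]

patch_first = [cve for cve in inventory if cve in kev_ids]
deferred = [cve for cve in inventory if cve not in kev_ids]

print("Patch first:", patch_first)  # KEV-listed, still enriched by NVD
print("Monitor/harden:", deferred)  # deferred; the organization must track these itself
```

KEV-listed findings keep receiving NVD enrichment and should lead the patch queue; everything else in the deferred bucket falls to the organization’s own monitoring and segmentation controls.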

Embracing Advanced Technology for Improved Efficiency

In addressing its backlog, NIST is also exploring the potential use of artificial intelligence (AI) and machine learning to streamline the processing of vulnerability data. This move reflects an ongoing trend in the cybersecurity industry toward leveraging advanced technologies for more efficient management of vulnerabilities. By incorporating AI and machine learning, NIST aims to ensure that both older and newer vulnerabilities receive appropriate attention within the constraints of available resources. This nuanced approach to cybersecurity management underscores the need for a balance between addressing legacy vulnerabilities and staying ahead of emerging threats. Organizations are encouraged to adopt similar strategies, using technology to enhance their cybersecurity efforts and ensure comprehensive coverage of potential vulnerabilities. This shift in focus not only addresses immediate backlog issues but also sets the stage for more sustainable and scalable vulnerability management practices in the future.

New Paradigm for Cybersecurity Management

NIST’s deferral of pre-2018 CVEs marks a turning point in how the NVD allocates its limited analyst resources. Rather than attempting exhaustive enrichment of every historical entry, the agency is concentrating on newer submissions and on vulnerabilities with demonstrated real-world exploitation via the KEV catalog. The aim is to work through the backlog more effectively and ensure that the newest and most critical vulnerabilities receive the attention required to maintain robust cybersecurity.
