NIST Deprioritizes Pre-2018 CVEs Amid Backlog and New Threats


The US National Institute of Standards and Technology (NIST) recently made a significant decision affecting the cybersecurity landscape by marking all Common Vulnerabilities and Exposures (CVEs) published before January 1, 2018, as “Deferred” in the National Vulnerability Database (NVD). This move impacts over 20,000 entries and potentially up to 100,000, signaling that these CVEs will no longer be prioritized for further enrichment data updates unless they appear in the Cybersecurity and Infrastructure Security Agency’s (CISA) Known Exploited Vulnerabilities (KEV) catalog. NIST’s decision comes in response to an ongoing struggle with a growing backlog in processing vulnerability data, exacerbated by a 32% surge in submissions in the past year.

An Overwhelming Backlog and Strategic Reprioritization

NIST’s challenges in processing and enriching the vast amount of incoming data have delayed its goal of clearing the backlog by the end of fiscal year 2024. In response, NIST is developing new systems to handle these issues more efficiently. Industry experts consider this move practical given the complexities of managing vulnerabilities at scale. Ken Dunham from Qualys describes it as an evolution in the face of changing cyber threats. Meanwhile, Jason Soroko from Sectigo interprets this as a strategic reprioritization, with resources redirected towards addressing emerging threats, assuming that legacy issues have been mitigated through routine patch management practices.

The responsibility for managing deferred CVEs now shifts more heavily onto organizations. For security teams, this means identifying and monitoring legacy systems, prioritizing the patching of deferred vulnerabilities, and hardening or segmenting outdated infrastructure. Using real-time threat intelligence to detect attempts at exploiting these vulnerabilities becomes crucial. This shift highlights a broader trend where organizations must adopt proactive risk management strategies due to the increasing volume of CVEs and limited resources available to handle them.
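The deferral rule itself is simple to express: a CVE published before January 1, 2018, loses enrichment priority unless it appears in CISA's KEV catalog. The sketch below is a minimal illustration of that triage logic, not an official NIST or CISA tool; the CVE IDs and dates are illustrative examples, and in practice the KEV ID set would be loaded from CISA's published KEV catalog feed rather than hard-coded.

```python
from datetime import date

# Cutoff from NIST's announcement: CVEs published before this date
# are marked "Deferred" in the NVD unless they appear in CISA's KEV catalog.
DEFERRAL_CUTOFF = date(2018, 1, 1)

def triage(cve_records, kev_ids):
    """Split CVE records into prioritized and deferred buckets.

    cve_records: iterable of (cve_id, published_date) tuples.
    kev_ids: set of CVE IDs listed in CISA's KEV catalog; a pre-2018
    CVE found there remains prioritized despite its age.
    """
    prioritized, deferred = [], []
    for cve_id, published in cve_records:
        if published < DEFERRAL_CUTOFF and cve_id not in kev_ids:
            deferred.append(cve_id)
        else:
            prioritized.append(cve_id)
    return prioritized, deferred

# Illustrative records (dates and IDs for demonstration only).
records = [
    ("CVE-2017-0144", date(2017, 3, 16)),  # pre-2018, assumed KEV-listed
    ("CVE-2016-9999", date(2016, 6, 1)),   # pre-2018, not in KEV -> deferred
    ("CVE-2023-1234", date(2023, 2, 1)),   # post-cutoff -> still prioritized
]
kev = {"CVE-2017-0144"}

prioritized, deferred = triage(records, kev)
print(prioritized)  # ['CVE-2017-0144', 'CVE-2023-1234']
print(deferred)     # ['CVE-2016-9999']
```

A security team could run the same filter over its own asset-linked CVE inventory to surface which of its known exposures have just dropped out of NVD's enrichment pipeline and therefore need in-house tracking.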

Embracing Advanced Technology for Improved Efficiency

In addressing its backlog, NIST is also exploring the potential use of artificial intelligence (AI) and machine learning to streamline the processing of vulnerability data. This move reflects an ongoing trend in the cybersecurity industry toward leveraging advanced technologies for more efficient management of vulnerabilities. By incorporating AI and machine learning, NIST aims to ensure that both older and newer vulnerabilities receive appropriate attention within the constraints of available resources. This nuanced approach to cybersecurity management underscores the need for a balance between addressing legacy vulnerabilities and staying ahead of emerging threats. Organizations are encouraged to adopt similar strategies, using technology to enhance their cybersecurity efforts and ensure comprehensive coverage of potential vulnerabilities. This shift in focus not only addresses immediate backlog issues but also sets the stage for more sustainable and scalable vulnerability management practices in the future.

New Paradigm for Cybersecurity Management

Taken together, the deferral marks a new paradigm in how vulnerability data is curated at the national level. Rather than attempting uniform enrichment across the entire historical corpus, NIST is concentrating its limited analyst capacity where exploitation risk is judged highest: newly published CVEs and any older entries that surface in CISA's KEV catalog. This strategic shift aims to work down the backlog more effectively and allocate resources more efficiently, ensuring that newer and more critical vulnerabilities receive the attention they require for maintaining robust cybersecurity measures.
