Vulnerability management (VM) is not as simple as setting up an automated assessment tool and then remediating every vulnerability in its reports. A regular review of those reports quickly shows that achieving zero vulnerabilities is unrealistic for most companies working within a budget. The volume of existing vulnerabilities, combined with the new ones disclosed each month, creates a mountain of work that can easily overwhelm a company’s systems operations team to the point that it ignores future reports. It therefore becomes necessary to develop a prioritization algorithm or methodology that helps keep risk to a minimum.
Most VM tools return the vulnerabilities found and assign risk based on the CVSS score provided by the various vendors. This score is a good starting point, but it shouldn’t be the only variable used to determine prioritization. A company’s asset management program, if one exists, can be leveraged to a degree that depends on its maturity level. A mature asset management program would allow vulnerability scoring to be refined based on:
• Type of data hosted (personally identifiable information, credit card information, intellectual property, etc.)
• Business criticality or value (Will the asset impact the business if it is taken down for any length of time?)
• Asset role (Do critical applications or processes depend on the asset?)
• Placement within the security infrastructure (existing security controls and architecture protecting an asset can reduce the likelihood of a vulnerability being exploited)
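As an illustration, this kind of refinement can be sketched as a set of adjustments to a CVSS base score. The asset attributes and multiplier values below are hypothetical assumptions for the sake of the example, not a standard formula:

```python
from dataclasses import dataclass

@dataclass
class Asset:
    hosts_sensitive_data: bool    # PII, cardholder data, intellectual property
    business_critical: bool       # downtime directly impacts the business
    supports_critical_apps: bool  # critical applications depend on this asset
    behind_compensating_controls: bool  # e.g., WAF, network segmentation

def refined_score(cvss_base: float, asset: Asset) -> float:
    """Adjust a CVSS base score (0-10) using asset context."""
    score = cvss_base
    if asset.hosts_sensitive_data:
        score *= 1.2
    if asset.business_critical:
        score *= 1.2
    if asset.supports_critical_apps:
        score *= 1.1
    if asset.behind_compensating_controls:
        score *= 0.7  # existing controls lower the likelihood of exploitation
    return min(round(score, 1), 10.0)  # keep the result on the 0-10 scale

# Example: CVSS 7.5 on a business-critical asset holding PII, behind a WAF
print(refined_score(7.5, Asset(True, True, False, True)))  # → 7.6
```

In practice, these attributes would be populated from the asset management program’s classification fields rather than hard-coded.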
In some cases this data can be fed into modern VM assessment tools automatically; otherwise it must be entered manually. Maintaining data accuracy is critical when using it to prioritize vulnerability remediation: inaccurate data could consume resources remediating one vulnerability while a riskier one goes unaddressed.
Prioritization can be further enhanced by using vulnerability threat data such as:
• Known exploits (a proof of concept and exploitation observed in the wild should be weighted differently)
• Age of vulnerability (older vulnerabilities have given potential attackers time to develop and distribute exploit tooling)
• Prevalence (more assets with the same vulnerability increases the possible attack surface)
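One way to fold these threat factors into prioritization is as a multiplier applied on top of the asset-adjusted score. The specific weights below are illustrative assumptions, not values from any particular RBVM product:

```python
from datetime import date

def threat_weight(has_poc: bool, exploited_in_wild: bool,
                  disclosed: date, affected_assets: int,
                  today: date = date(2024, 1, 1)) -> float:
    """Combine threat-intelligence factors into a multiplier (>= 1.0)."""
    weight = 1.0
    if exploited_in_wild:
        weight += 0.5   # active exploitation weighs heaviest
    elif has_poc:
        weight += 0.2   # PoC only: lower, but nonzero, weight
    age_years = (today - disclosed).days / 365
    weight += min(age_years * 0.05, 0.25)  # older vulns: more tooling exists
    if affected_assets > 100:
        weight += 0.15  # widespread: larger attack surface
    return round(weight, 2)

# Exploited in the wild, disclosed two years ago, present on 500 assets
print(threat_weight(False, True, date(2022, 1, 1), 500))  # → 1.75
```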
If a company were to do all of this on its own, it might need a dedicated data analytics team just to monitor all of the data points mentioned and generate reports. This is where risk-based vulnerability management (RBVM) tools, used in combination with vulnerability assessment tools, can justify their cost. By importing the data gathered by the vulnerability assessment tool and combining it with threat data, the RBVM tool can do all the “heavy lifting” and help produce a more accurate vulnerability score.
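The end result of that “heavy lifting” is essentially a ranked worklist. A minimal sketch, using made-up CVE identifiers and scores, where each finding’s asset-adjusted score is multiplied by its threat factor and the list is sorted by final risk:

```python
# Hypothetical findings: (CVE id, asset-adjusted score, threat multiplier)
findings = [
    ("CVE-2023-0001", 6.1, 1.75),  # actively exploited
    ("CVE-2023-0002", 9.8, 1.00),  # no known exploit
    ("CVE-2023-0003", 7.2, 1.20),  # PoC available
]

# Final risk = adjusted score x threat multiplier, capped at 10
prioritized = sorted(
    ((cve, min(score * mult, 10.0)) for cve, score, mult in findings),
    key=lambda item: item[1],
    reverse=True,
)

# Print the remediation worklist, highest risk first
for cve, risk in prioritized:
    print(f"{cve}: {risk:.1f}")
```

Note how the actively exploited CVE outranks the one with the higher raw CVSS score, which is the central point of a risk-based approach.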
This score can now be used to create a report that supports a more directed approach to vulnerability remediation. However, even the most accurate risk scoring and the clearest, most concise reports are of no use if your systems operations team doesn’t act on the data you have created. It is therefore highly recommended that the security and systems operations teams meet regularly to review the results and discuss the prioritized plan for remediation. Collaboration is the key.
Achieving 100% vulnerability remediation is nearly impossible. Prioritizing with a risk-based solution, combined with accurate, real-time asset management, can put a company on a more efficient and effective path to risk remediation.