[Editor’s note: Kenneth van Wyk is credited by the SANS Institute as one of the people who made substantive contributions to compiling the Top Error list.]
Last week the SANS Institute and MITRE together published their CWE/SANS Top-25 Most Dangerous Programming Errors list.
Ta dah! Wow, that “ta dah” didn’t have the public impact I expected. So then, what’s the big deal about the world producing yet another top-N list? I’ll tell you why this one is different.
I can certainly understand if you weren’t bowled over by the announcement, though. It’s a natural (non)reaction. After all, the world already has the SANS Top-20 Vulnerabilities list and the OWASP Top-10. Do we really need another list that points out our problems? Let’s consider a few things here.
The SANS Top-20 list is well established and respected, and it’s widely cited in the information security field. On the other hand, it is quite a general document that addresses network and OS-level problems, along with such issues as so-called zero-day (“0day”) attacks. That’s all well and good, but where are the root causes (no pun intended) being considered? Well, for the most part they aren’t.
So then let’s consider the OWASP Top-10 list. It gets closer to the root causes by addressing software application problems such as cross-site scripting and SQL injection. And, to be fair, the OWASP documentation does a superb job of describing these weaknesses and pointing to effective remediation steps, right down to code examples. Excellent stuff, but it too stops short of pointing a finger directly at the underlying problems.
I should also point out MITRE’s own CVE and CWE efforts. The Common Vulnerabilities and Exposures (CVE) project documents (and makes searchable) a collection of software vulnerabilities, patches and the like. Its counterpart, the Common Weakness Enumeration (CWE) project, is a dictionary of underlying software weakness types. Both are excellent resources for information security as well as software development staff.
Now do you see what’s missing here? Or perhaps you think you could glean the vast majority of the information in the CWE/SANS Top-25 list by poring through the CWE data? That’s true, but only to a point.
From where I sit, I see two things missing. For one, the SANS Top-20 list is heavily publicized and well known. Of course it’s not a comprehensive list of all the security problems on the Net, but it is arguably the best-known list of problems, and that prominence by itself is not a bad thing. (It can become a bad thing if we address only these flaws, but more on that later.) What software security has lacked is a list with that kind of visibility.
The second thing missing is that — until now — we didn’t have a list of the biggest, baddest, nastiest programming security defects. We didn’t have a counterpart, if you will, to the OWASP list that speaks specifically to the programming mistakes and not just the vulnerabilities. After all, an XSS vulnerability can look a lot different in different contexts.
Until we draw attention to the programmatic problems that lead to XSS vulnerabilities, we’re only talking about the symptoms and not the problems. That’s the real big deal here.
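To make that concrete, here is a minimal, hypothetical sketch of the kind of programming error that typically underlies an XSS vulnerability: untrusted input concatenated into HTML output without encoding. The page and function names are mine, purely for illustration; they don’t come from the Top-25 documents.

```python
# A minimal, hypothetical sketch: the programming error (missing output
# encoding) behind a typical XSS vulnerability, and the root-cause fix.
import html

def greeting_page_unsafe(name: str) -> str:
    # The programming error: untrusted input is concatenated straight into
    # the HTML response, so a supplied <script> tag is emitted verbatim.
    return "<html><body><h1>Hello, " + name + "!</h1></body></html>"

def greeting_page_safe(name: str) -> str:
    # The fix at the root-cause level: encode untrusted data for the HTML
    # context it lands in.
    return "<html><body><h1>Hello, " + html.escape(name) + "!</h1></body></html>"

if __name__ == "__main__":
    payload = '<script>alert("xss")</script>'
    print(greeting_page_unsafe(payload))  # script tag survives intact
    print(greeting_page_safe(payload))    # rendered harmlessly as text
```

The vulnerability report would say “XSS in the greeting page”; the programming error is the missing output encoding. That second, root-cause level is the one the Top-25 list speaks to.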
And just like the other Top-N lists, this one isn’t comprehensive. There are many other programming mistakes that can be made, of course. We all understand that, right? Right?
Well, there’s a danger there as well. We’ve seen the SANS list adopted by many a security auditor as a mere checklist of things to look for. If the CWE/SANS list gets adopted the same way, we’ll be guilty of a very bad thing indeed: negative validation.
Negative validation happens when we evaluate something against a set of known bad things and assume it to be safe if we don’t find them. Positive validation, on the other hand, evaluates something against a set of accepted good attributes and presumes it to be dangerous if it doesn’t conform. (Remember me mentioning that in my column here just a few short months back?)
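To illustrate the difference, here is a toy, hypothetical example of validating a username field both ways. The blacklist, the whitelist pattern and the function names are mine, chosen only to show the two mindsets.

```python
import re

# Negative vs. positive validation of a username field (toy example).
BLACKLIST = ["<script", "' or ", "--"]  # inevitably incomplete

def negative_validation(value: str) -> bool:
    # Negative validation: reject only what we already know is bad;
    # anything not on the blacklist is assumed to be safe.
    lowered = value.lower()
    return not any(bad in lowered for bad in BLACKLIST)

def positive_validation(value: str) -> bool:
    # Positive validation: accept only what we know is good
    # (here, 3 to 20 letters, digits or underscores); reject everything else.
    return re.fullmatch(r"[A-Za-z0-9_]{3,20}", value) is not None

if __name__ == "__main__":
    attack = "<img src=x onerror=alert(1)>"
    print(negative_validation(attack))  # True  -- the blacklist misses it
    print(positive_validation(attack))  # False -- the whitelist rejects it
```

The blacklist fails the moment an attacker shows up with something we didn’t think to enumerate; the whitelist doesn’t care what the attacker invents.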
The danger of negative validation here is that we’ll focus solely on these 25 bad things and assume that whatever is left must be safe. That said, raising awareness of these bad things is positive and valuable. The list has already been cited in just about every trade publication I’m aware of. That’s great news for us all.
OK, so this list is different. How should we put it to use in our day-to-day work? For starters, every software developer should be exposed to the CWE/SANS Top-25 list. They should all understand the issues and how to avoid them. But don’t stop there.
We also need to ensure that software developers understand the underlying sound engineering principles that are implicitly referenced in the list: things like the principle of least privilege, compartmentalization, and so on, straight out of Saltzer and Schroeder circa 1975. You know, the things that instantly and irrevocably cure insomnia among software developers. Well, you can use the Top-25 list to draw attention to those principles in an interesting and engaging way.
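As one small, hypothetical illustration of least privilege in everyday code: a program writing a sensitive file can ask the operating system for exactly the access the task needs and nothing more, instead of accepting the default, broader permissions. The file name and scenario below are mine, not anything from the Top-25 list.

```python
import os

def write_secret(path: str, secret: str) -> None:
    # Least privilege at the file level: create the file owner-read/write
    # only (0o600) and refuse to clobber anything that already exists.
    flags = os.O_WRONLY | os.O_CREAT | os.O_EXCL  # fail if the file exists
    fd = os.open(path, flags, 0o600)              # owner-only permissions
    with os.fdopen(fd, "w") as f:
        f.write(secret)

if __name__ == "__main__":
    write_secret("api_token.txt", "not-a-real-token")
    print(oct(os.stat("api_token.txt").st_mode & 0o777))  # 0o600 on POSIX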
So yes, there is real value in the CWE/SANS Top-25 list. Use it to get people’s attention. Use it to drive change. Just don’t get so hung up on the list that you miss the underlying messages. Sound engineering principles are the foundation we need to build a strong and reliable infrastructure, and we mustn’t ever lose sight of that.