Security influencers provide real-world insight and “in-the-trenches” experiences on topics ranging from application security to DevOps and risk management.

Why Static Application Security Scanners Just Can't Cut It Anymore


Static Analysis and Dynamic Analysis Tools Have Their Place
To be clear: I’ve been an advocate of both dynamic vulnerability scanning (DAST) and static analysis (SAST). These technologies can be helpful when used by experts as part of an application security program. But traditional tools don’t stand much of a chance against the onslaught of new code, new frameworks, new protocols, and new data structures that today's applications are built on.

The State of Web Application Vulnerabilities
Many vulnerabilities, including XSS, SQL injection, command injection, LDAP injection, and XML injection, happen when programmers send untrusted data to dangerous calls. Preventing that seems easy enough, if you know which data is untrusted. That's the difficult part.
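To make this concrete, here's a minimal, hypothetical sketch (the function and table names are invented for illustration) of untrusted data reaching a dangerous call, alongside the parameterized version that keeps the same input inert:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def find_user_unsafe(name):
    # Untrusted data concatenated straight into the query: a dangerous call.
    return conn.execute(
        "SELECT role FROM users WHERE name = '%s'" % name).fetchall()

def find_user_safe(name):
    # Parameterized query: the driver treats the input as data, never as SQL.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

attack = "x' OR '1'='1"
print(find_user_unsafe(attack))  # returns every row: injection succeeded
print(find_user_safe(attack))    # returns no rows: input stayed data
```

The two functions are textually almost identical, which is exactly the point: nothing at the call site announces which one is dangerous.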

When you're looking at a million lines of source code, laced with string operations, complex method calls, inheritance, aspect-oriented programming, and reflection, tracing the data flow gets extremely difficult. By the time the data reaches the dangerous call, the developer has no way to know that it still hasn't been validated or encoded in a way that makes it safe. If only there were a way to confidently trace that data through all that code and let the developer know. (Hint: there is. Keep reading.)
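A toy illustration (all names invented) of why this is hard: by the time the value reaches the sink, several layers of helpers stand between it and the request, and nothing at the final call site says whether the data was ever neutralized:

```python
import html

def read_param(request):
    # Source: attacker-controlled input enters here.
    return request["comment"]

def normalize(text):
    # Looks like cleanup, but does not neutralize HTML at all.
    return text.strip().replace("\r\n", "\n")

def build_row(text):
    # Two hops later, the tainted value is buried in a larger string.
    return "<td>%s</td>" % text

def render(request):
    # Sink: was the data ever encoded? The call site gives no clue.
    return build_row(normalize(read_param(request)))

def render_fixed(request):
    # Encoding at the sink makes the data safe regardless of its path.
    return build_row(html.escape(normalize(read_param(request))))

evil = {"comment": "<script>steal()</script>"}
print(render(evil))        # markup survives intact: XSS
print(render_fixed(evil))  # &lt;script&gt;...: rendered as harmless text
```

Real applications route data through dozens of such hops, across classes and libraries, which is what makes the validation status invisible at the sink.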

How Static Analysis Tools Work
Static analysis tools build a model of an application to represent data flow, control flow, and other interesting attributes of software. When used on real world applications, these models can overwhelm the user by including many findings that an actual program can never execute. So, while some security issues can be identified this way, static analysis alone simply can't fix all of your problems.
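A drastically simplified sketch of that idea (not any vendor's actual algorithm): a static analyzer propagates a "tainted" label along every assignment it can see, and because it cannot always prove which branches are feasible, it reports flows that no real execution ever takes:

```python
# Toy program: (target, source) for assignments, or ("SINK", var) for
# a dangerous call. The analyzer ignores branch conditions entirely,
# so it explores arms of conditionals that can never run.
program = [
    ("a", "USER_INPUT"),   # a = attacker input (tainted)
    ("b", "'constant'"),   # b = string literal (clean)
    # if False:            (dead branch the analyzer can't rule out)
    ("c", "a"),            #     c = a   (taint propagates)
    ("SINK", "c"),         #     query(c)  <- never actually executes
    ("SINK", "b"),         # query(b)      <- actually safe
]

def analyze(stmts):
    tainted, findings = set(), []
    for target, source in stmts:
        if target == "SINK":
            if source in tainted:
                findings.append(source)
        elif source == "USER_INPUT" or source in tainted:
            tainted.add(target)
    return findings

print(analyze(program))  # ['c']: a finding on a path that never runs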

Check out the study the NSA's Center for Assured Software conducts each year on static analysis tools. They created a test suite of over 25,000 test cases (called Juliet), and the static analysis tools correctly pass only a tiny fraction of them. And these test cases aren't nearly as complex as real code: the data flows within them are simple and relatively direct. In the real world, data flows can span dozens of steps through stack traces that are 50 frames deep or more.

Here's a slide from the NSA's presentation on the study, showing the performance of the top static analysis tools against the test suite. It's not an encouraging chart, particularly when you realize that the gray part of each bar represents false alarms. The green part (mostly below 10%) is the rate of genuine vulnerabilities correctly identified.


[Chart: static analysis tool performance against the NSA test suite]

Current scanning tools still don’t do a very good job with Ajax and web services, technologies that are nearing the 10-year-old mark. And static tools still struggle with false alarms and need new rules for every new framework and library that comes out. That's why we created Contrast™ in 2012: modern times require modern tools.

How Contrast's Modern Sensors Aid in Application Security Testing
Contrast tracks actual data through the running application itself. This approach means we never make a mistake about where untrusted data travels, which avoids large numbers of false alarms. We correctly pass 73% of the entire NSA test suite, and 99% of the test cases that represent "web application" flaws, such as those in the OWASP Top Ten.

There's another huge advantage to the Contrast approach. Static analysis tools have to identify every method that propagates untrusted data, or else their pseudo-execution can't correctly calculate the data flow. Because we track data through libraries, frameworks, and even the underlying runtime platform, there's no need to model each new framework and component that comes out.

We simply track the actual values at runtime. It's simpler and more accurate, but more importantly, we can do "whole app analysis" by following the data through frameworks and libraries, not just the custom code. So not only do we identify previously unknown vulnerabilities in the way your code uses frameworks and libraries, we also point out known vulnerabilities so that you can update your components. (Click here for our three-part series on libraries and frameworks by Dave Wichers.)
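A rough sketch of the runtime idea (greatly simplified, and not Contrast's actual implementation): tag untrusted values as they enter, let the tag ride along through ordinary string operations, and check it at the sink, with no model of the intervening code required:

```python
class Tainted(str):
    """A string subclass that keeps its taint mark through common operations."""
    def __add__(self, other):
        return Tainted(str(self) + str(other))
    def __radd__(self, other):
        return Tainted(str(other) + str(self))
    def strip(self, *args):
        return Tainted(str.strip(self, *args))

def sink(query):
    # At the dangerous call, just inspect the actual runtime value.
    if isinstance(query, Tainted):
        raise ValueError("untrusted data reached a dangerous call")
    return "executed: " + query

user_input = Tainted("x' OR '1'='1")
query = "SELECT * FROM users WHERE name = '" + user_input.strip() + "'"
try:
    sink(query)
except ValueError as e:
    print(e)  # caught at runtime, with the real value in hand
```

Because the taint mark travels with the concrete value, it survives any number of hops through helpers and libraries without anyone having to model them.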

SQL injection, XSS, and other data-flow type flaws are important, but they're not everything.  So, in addition to data-flow type vulnerabilities, Contrast also analyzes HTTP responses, configuration files, backend connections, control-flow, and more to see if your application has any weaknesses. The wealth of information available to Contrast is literally unfair.  To the bad guys.

Unlike the scanners of yesteryear, Contrast gives you the exact line(s) of code associated with the problem and the associated HTTP request so you can fix the problem, test, and retest easily and quickly. You can even see the actual runtime data values that are part of the vulnerability pattern.  And we do it without the false-positive results that dated scanners often give. Contrast simply gives you real results. Accurate results. And that's why a modern instrumentation-based monitor is simply superior to outdated scanner technology. 

Hackers have better tools than you...

Jeff Williams, Co-Founder, Chief Technology Officer


Jeff brings more than 20 years of security leadership experience as co-founder and Chief Technology Officer of Contrast. Previously, Jeff was co-founder and CEO of Aspect Security, a successful and innovative application security consulting company acquired by Ernst & Young. Jeff is also a founder and major contributor to OWASP, where he served as the Chair of the OWASP Board for 8 years.


Learn how to unify security strategy across development & operations. See how to set up a CAS program with only eight activities!

Download the Handbook