SECURITY INFLUENCERS BLOG

Security influencers provide real-world insight and “in-the-trenches” experience on topics ranging from application security to DevOps and risk management.


60 Minutes & the "Signaling System Seven (SS7) Vulnerability"

Over the weekend, 60 Minutes featured a segment on how cellphones and mobile phone networks are vulnerable to hacking, exploiting a security flaw discovered in Signaling System Seven – or SS7. According to security researcher Karsten Nohl, “the flaw is a significant risk mostly to political leaders and business executives whose private communications could be of high value to hackers. The ability to intercept cellphone calls through the SS7 network is an open secret among the world’s intelligence agencies – including ours – and they don’t necessarily want that hole plugged.”

We’ve seen the SS7 vulnerability discussed from time to time in the media over the past year; in fact, several articles hit back in April on what users can do about the SS7 flaw, as well as why SS7 is still vulnerable more than a year after it was first reported.

Given the purported gravity of the flaw, Jeff Williams, Co-Founder and CTO of Contrast Security, gave his perspective on the vulnerability, as well as how susceptible users and organizations really are to SS7 hacking.

Point-of-View by Jeff Williams:
"My Reaction?"

My first reaction is that I really dislike one-sided stories designed to scare people into taking actions that often don’t make much sense, particularly when there’s no news here. The attacks have prerequisites that make them dangerous only in very limited situations. I’m not an expert in this type of exploit, but my understanding is that:

  1. You have to have access to backbone phone networks. “In the case of 60 Minutes Australia, Luca Melette was given access to SS7 by the German government” -- https://www.engadget.com/2016/04/22/how-60-minutes-played-telephone-with-public-hacking-hysteria/

  2. At least one of the “victims” in this trumped-up demonstration would have to allow software to be installed on their mobile device. It’s like sending someone an email that says “please download and install this virus”… and then claiming that you pwned them.

While I share the opinion that just about everything can be hacked, that doesn’t mean the sky is falling. I have found many software vulnerabilities that looked disastrous, but in practice, software is often part of a larger system that can detect and block attacks, or recover from the impact of a breach.

We need better visibility into the kinds of threats our technology is designed to handle, and the kinds it cannot. We have this in almost everything else. Drugs – “do not use internally”; boxes – “not a toy”; coffee cups – “HOT!” Ironically, if you actually read the EULA that comes with most software, it says that it is unsuitable for almost everything. For example, the license for Java used to say, “You acknowledge that Software is not designed, licensed or intended for use in the design, construction, operation or maintenance of any nuclear facility.” Now that Oracle is in charge, the license has no such warning. But the point is that by disclaiming all liability, vendors also abdicate their duty to make anything secure.

But there’s really no good way to figure out whether a particular piece of software is fit for some use. We get absolutely no insight into how the software was built, how it was tested, or how it is intended to be used. No information about the threat model that the software was designed for. Airplane? Financial system? Mattel Electronic Football? Who knows? If government were to intervene in the software market to do something about this kind of problem, all it would have to do is make software vendors disclose security-relevant details of how their software was made.

The outrage about this one particular vulnerability also strikes me as uninformed. There are serious vulnerabilities all over the place – it’s a haystack full of needles. Creating drama about one of them simply gives people a false sense of security when it eventually gets fixed. The assumption is that things are safe until some security researcher discovers a hole; then we panic until it’s fixed and we are safe again. But that is absolutely WRONG! Over many years of pentesting, we averaged 22.4 serious vulnerabilities per application.

One interesting side-topic mentioned in the article is whether the “Feds” should be fired if they didn’t disclose this problem. I think that’s a strange position to take.  Of course the nations of the world study vulnerabilities so that they will have an advantage in the ever looming cyberwar.  Personally, I think we would be bat-shit crazy to do anything that might provoke a cyberwar for one primary reason: we are more exposed than other nations.


Jeff Williams, Co-Founder, Chief Technology Officer


Jeff brings more than 20 years of security leadership experience as co-founder and Chief Technology Officer of Contrast. Previously, Jeff was co-founder and CEO of Aspect Security, a successful and innovative application security consulting company acquired by Ernst & Young. Jeff is also a founder and major contributor to OWASP, where he served as the Chair of the OWASP Board for 8 years.
