
Interview: Bruce Brody of Cubic Cyber Solutions

In this interview, Jeff Williams talks with Bruce Brody of Cubic Cyber Solutions, a leading provider of specialized systems and services in the rapidly changing world of technology. They examine federal cybersecurity rules and regulations, the move to continuous monitoring, and how organizations can keep their workforces educated about the changing threat landscape.

NOTE: The audio sounds a bit garbled at times. The transcription below will clarify questions about what was said.



 

Jeff Williams: Hello. I'm Jeff Williams, CTO of Contrast Security, and I want to thank you for joining us on our new episode of the Security Influencers channel. Our goal is to provide a series of brief and, we hope, highly informative interviews with informed security professionals. 

Our theme for 2014 is a discussion of the implications of 'continuous.' Today we're talking with Bruce Brody, the new Chief Cyber Security Strategist at Cubic Cyber Solutions, and a true leader in information security. Bruce has held numerous security positions over the years, including serving as the CISO for the US Department of Energy and the CISO for the Department of Veterans Affairs. Bruce, thank you so much for joining us today.

Bruce Brody: My pleasure. Thank you, Jeff. 

Jeff: So let's get started. So the first question I have for you: how is application security different in the government sector versus the commercial sector?

Bruce: Well, in the government sector, there's a tremendous amount of interest in the security of an application when it comes to a variety of different operating environments. The government has classified operating environments, and then they have unclassified operating environments. And application security is dealt with slightly differently depending on the environment in which the application resides. 

If it's going to be in a classified environment, then some very rigorous tests and evaluation need to occur before that application is approved to operate in that environment. In unclassified environments, the application does have to withstand some scrutiny and some testing, but it's not nearly as rigorous as those in the classified environments.


Jeff: So do you think...you're referring to the certification/accreditation process? 

Bruce: Well, to a certain extent, but in the area of software, there are actual tests that have to take place and certain labs that have to be involved in those tests, and those labs have to be approved by either the NSA or the National Institute of Standards and Technology. So yes, I'm talking about the certification and accreditation process, which is now the Authority to Operate process. But more importantly, in the case of applications, the code itself has to be inspected. 

Jeff: Gotcha. So I was under the impression that most applications had to get their code reviewed. Is that true for most applications or just a subset? 

Bruce: Well, it's true for a subset that operates specifically in very sensitive and classified environments. It's also true that a system carrying applications that is used in an unclassified environment has to go through an Authority to Operate process, which is the C&A process, the old certification and accreditation process. And that's a little less scrutiny on the application and more on system-level performance in terms of security. 

Jeff: Right. Okay, thank you. So have you noticed a change in software development in government to more ad-hoc DevOps-style software development? And if so, have you noticed that it's affecting the way the security works for government apps? 

Bruce: Like all programs in government, the intent is there to move in that direction. The speed and pace at which we move in that direction is very [laughter] government, and very bureaucratic. So there are initiatives in that regard. There are some things going on with Department of Homeland Security and across various agencies to put some good processes, some better processes, more agile processes in place. Those are moving along. They're moving along at government's pace, which is not overnight. 

Jeff: Right. It's an interesting contradiction that these super-high speed development processes are moving at a government pace. [laughter]

Bruce: That's a...way to put it. 

Jeff: Okay. So I've seen you've written that -- and this is your quote -- that "there's no longer any reasonable argument regarding whether or not continuous monitoring is the right move for federal departments and agencies." 

Bruce: Right. 

Jeff: Why do you think continuous monitoring is so important? 

Bruce: Well, for one thing, the government has long had an approach where periodic monitoring was okay. It was actually written in policies and procedures. Now, we all know that periodic, especially when it's interpreted as pretty much every three years or whenever a significant change occurs...that doesn't give you the ability to take a look at a system that's constantly changing and say whether it's as secure as when you originally authorized it to operate. 

What was needed, and what never was carried into legislation or into policy and procedure, was to take that periodic look at these systems and turn it into a continuous look, so that you know the controls you have put in place to elevate the security level of the systems are continuously in place and operating the way they should be operating. And the only way to do that, given how much systems change over the course of time and how fast they change, is to do it on a continuous basis, not on a periodic basis. 
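
[Editor's note: To make the periodic-versus-continuous contrast concrete, here is a minimal, hypothetical sketch in Python. The control names, checks, and interval are illustrative assumptions only, not part of any federal framework or product.]

    import time

    # Hypothetical controls; in practice each check would query real infrastructure.
    def firewall_baseline_unchanged():
        # e.g., compare the running firewall rule set against the approved baseline
        return True

    def patch_level_current():
        # e.g., ask a patch-management or vulnerability-scanning service for status
        return True

    CONTROLS = {
        "firewall_baseline": firewall_baseline_unchanged,
        "patch_level": patch_level_current,
    }

    def run_checks():
        # Verify every control and report any that have drifted out of compliance.
        failures = [name for name, check in CONTROLS.items() if not check()]
        if failures:
            print("Controls out of compliance:", ", ".join(failures))
        else:
            print("All monitored controls operating as authorized.")

    if __name__ == "__main__":
        # Periodic model: run these checks once every few years, at reauthorization.
        # Continuous model: run the same checks on a short, repeating cycle.
        while True:
            run_checks()
            time.sleep(3600)  # re-evaluate every hour rather than every three years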

Jeff: Yeah. I really couldn't agree more. And I'm trying to bring that same wisdom to the application security space. I think application security is going to have to go through that same transition. And right now most organizations do an annual pen test or a periodic scan of their application for security vulnerabilities. But I think your sense is right that if you want to actually do it and keep things secure, you've got to be doing it continuously. 

Bruce: Right... 

Jeff: Have you looked at how--

Bruce: It's a 24/7, 365 kind of approach to security. Once every department and agency inside the federal government begins doing it that way, you'll see the overall security posture of the federal government improve, as opposed to it regularly splashing onto the front page of the Washington Post in a negative way. 

Jeff: Now what about the expense of doing things continuously? 

Bruce: Well, some people have argued that it takes a lot more money to do it continuously. But if you do it right, continuous monitoring can actually save you money. Because you're more to the left side of the problem, in programmatic speak, as opposed to the right side of the problem. 

You're fixing things before they happen. You're anticipating. You're being proactive. Over at the Department of Defense, you might call that "left of boom," [laughter] 'boom' being a bad thing, left being the place you want to be because that's the proactive side, that's the preparatory side.

That's the part that prevents things from happening. If you are to the right of 'boom,' you're sweeping up the mess, you're doing incident response, you're doing forensics, you're in the bad part of the problem space, and continuous monitoring takes you to the good side of the problem space. You spend less money and you spend it more efficiently when you're "left of boom."

Jeff: That's fantastic, I love that term. I'm going to start using that, I think. 

Bruce: [laughter] I didn't make it up. It comes from very serious people who try to prevent 'boom' from happening in hostile territory overseas, and they take it very seriously, and so if you apply it to cyber security, it's a good way of looking at it.

Jeff: Yeah, I agree. So you talked at the beginning about workforce, and I'm wondering what do you think the effect of continuous security is on the culture of security within a large organization? 

Bruce: What continuous does is put you on a proper footing when it comes to dealing with the risk management profile of an organization, because the components of risk are threat and vulnerability, as well as the controls you put in place and the likelihood of occurrence. And when you're operating in a continuous kind of mode, you're operating in a mode that keeps everybody on their toes, keeps everybody alert, awake, alive, and very well tuned in to the kinds of problems that need to be thwarted on a fairly regular basis.

And it also tends to tune your workforce to the needs and the specific mission of your own organization, so that you don't necessarily have a whole bunch of people who only do policy. 

You have people spread across the spectrum: those who can manage and configure perimeter defenses, those who can analyze malware and reverse-engineer that code, and those who can anticipate threats and then put preparatory measures in place. Once you've sprinkled the right skill sets across your enterprise, you've effectively added another layer to the defenses you put in place to beat the threat.

Jeff: Right. I think people are an incredibly important part of this problem, so I'm always interested to hear people's take on the cultural impact. 

Bruce: Well, the Department of Defense has actually put some fairly serious directives in place in terms of how to keep the workforce fresh and skilled, and those people who have specific cybersecurity responsibilities must hold certain specific qualifications. On the civilian side of government, NIST and OPM, the Office of Personnel Management, have begun a similar kind of approach, defining various categories of cybersecurity personnel and what those categories should have as their-- This is a long overdue process, by the way. Discussions on this began in the 1990s. We're 20 years in, so we talked previously about a government pace. Well, that's not gonna happen overnight...unless there's an absolute crisis. 

Jeff: I looked at some of your background, and it looked like you worked on multi-level technology back in the early '90s. I cut my teeth on a multi-level secure project for the Navy back in the late '80s and early '90s, and I actually think I--

Bruce: Would that have been Radiant Mercury? 

Jeff: It was close. It was the OSIS baseline upgrade. 

Bruce: Oh, yeah, I remember that. That's right. That was back in my day too, yeah.

Jeff: So what I've found is that people who have experience from back in those days, when security was much more positive and driven from overall goals, have a different approach to security than folks who have been brought up within the last ten years, who, I think, take a more negative approach to security, like we'll pen-test to find holes and then say something's secure. How do you feel assurance has evolved over the last 10, 20...

Bruce: Yeah, you make a good point. There was a time, let's say 20 or 30 years ago, when with high-assurance trusted systems we got into a sort of rigorous methodology based on...back then, the Rainbow Series of documents that helped us define certain categories of security and what needed to be done in each category. It was a pretty clear, straightforward approach to getting things secure. Systems were so much simpler back then. So it was a good way to learn, [laughter] a good way to learn the science, a good way to learn the subject matter. 

Jeff: Yeah. 

Bruce: And you're right. Nowadays it's about...well, it's about overemphasizing problems. [laughter] Sometimes a problem should be overemphasized, such as the cases where hundreds of thousands or millions of credit cards have been compromised. But the fact of the matter is, we have taken more of a serious, danger-driven kind of approach to the problem these days. 

Jeff: Yeah. Do you think we can ever get back to the point when assurance is actually something people care about? Right now, I would say the only confidence we have in our systems and particularly our software, my area of expertise...is that they haven't been hacked yet. Or that we found a few of the vulnerabilities that might be in there, which is a really weak assurance argument. Do you think the world ever sort of flips back? 

Bruce: At the corporate level, you'll find that whether or not the board cares about security is kind of how it's viewed across the corporate world. And that's unfortunate, because very few board members have security in their background unless it's actually a security company. I'm very happy to see that John Thompson from Symantec is the chairman of Microsoft's board. And that could be a good thing for the entire world in terms of the quality of Microsoft's offering in the security space.

But in the government, the only driver for being more secure is the last crisis that you had to deal with, [laughter] and the heads that rolled in that crisis. And the processes and budget that was put in place as a result of that crisis. The unfortunate thing is that's the "right of boom" problem. That's the... We take advantage of a crisis, but we don't anticipate the next one. Fight the last war, in DOD-speak, we don't fight the next war. We're always prepared to fight the war we just fought. We're never prepared to fight the next war. 

Jeff: Yeah. It's very frustrating to me that we can't see what's coming, even in the face of staggering evidence of insecurity. So one last question for you. From your perspective, what are the key metrics that you use to make sure you can sleep at night, particularly about your application security programs, but also about your program as a whole? 

Bruce: Well, putting metrics in place is still a difficult art. It hasn't yet progressed to a science. What I want is the assurance that the business processes I'm responsible for assuring, the mission I'm responsible for delivering, whether it's inside the government or in corporate America, that that mission has not been impeded or obstructed by something I have some amount of control over, which would be IT security and IT resources. 

That's why I go to bed sleeping well, when I know that I have... But I've never had a good night's sleep, so don't get me wrong. [laughter] I perpetually can't sleep well because I never know that I have enough assurance that the processes that I'm responsible for making sure are assured...that those processes are operating without any threat of obstruction or interference, or diminishing or anything like that. So you're talking to someone who never gets any sleep. 

Jeff: I think that's kind of an occupational hazard for our industry. 

Bruce: That's right. 

Jeff: Any final thoughts before we wrap up? 

Bruce: Well, yeah. It used to be when...in days gone by, when I was a chief information security officer in the government, which would've been...a subset...and then some, and all the way back to my multi-level security days, we focused a lot on the vulnerability side of the risk management profile, the thought being that the more you reduce your vulnerabilities, the less of a target you become to the bad guys or to the threat. So a good place to spend your time would be on the vulnerabilities: find them and reduce them, and find them and reduce them. 

Nowadays, that problem has sort of morphed into taking a closer look at the threat and being threat-aware. And being able to put in place threat-specific kinds of things. Because it's the threat that's changing. It's becoming more dangerous and becoming more persistent. 

And as a result of that, the focus of the past decade or so has shifted from vulnerability to the threat side of the risk management equation. I think that's a positive thing, but it also argues for different kinds of technology, different kinds of skill sets, different approaches to professionalization of the workforce, etc., etc. So there's sort of a shift moving here from vulnerability to threat that is kind of the undercurrent of cyber risk management. 

Jeff: And can you make those processes continuous as well? 

Bruce: Absolutely. You've got to. [laughter] No other way to do it. If you don't, you're dead. 

Jeff: All right. Bruce, it has been a pleasure talking to you. Thanks for joining us on the Security Influencers channel. 

Bruce: My pleasure, Jeff. 

Jeff: And thank you, our audience, for joining us as well. This conversation continues online. We'd enjoy hearing your thoughts on today's discussion and ideas for additional security topics you'd like to hear about.

Jeff Williams, Co-Founder, Chief Technology Officer


Jeff brings more than 20 years of security leadership experience as co-founder and Chief Technology Officer of Contrast Security. He recently authored the DZone DevSecOps, IAST, and RASP refcards and speaks frequently at conferences including JavaOne (Java Rockstar), BlackHat, QCon, RSA, OWASP, Velocity, and PivotalOne. Jeff is also a founder and major contributor to OWASP, where he served as Global Chairman for 9 years, and created the OWASP Top 10, OWASP Enterprise Security API, OWASP Application Security Verification Standard, XSS Prevention Cheat Sheet, and many more popular open source projects. Jeff has a BA from Virginia, an MA from George Mason, and a JD from Georgetown.