Agile Infosec

This is a reprint of my comment on Joshua Corman’s posting on The Fudsec Blog. Consider going there to read his article and the discussion that followed.

I can’t link to my comment there and, since I’m going to continue down the rabbit hole on this particular topic, I wanted to be certain that I had a link to reference should internet churn happen.

I see where you’re trying to go here, but I’m not quite with you.

First, the OODA loop can easily turn into the usual Hamster Wheel of Pain, as Jaquith notes in his book Security Metrics: Replacing Fear, Uncertainty, and Doubt. If you shared the link entitled On Sheep, Wolves, and Sheepdogs with non-insiders, I believe most would find it offensive. People don’t like being called sheep because they don’t understand the dizzying details and byzantine processes and pitfalls of an industry that is largely driven by irrationality. I also don’t find it directly relevant or constructive in a discussion of complexity and technology risk management, though it would be if someone objected to carrying a gun in church.

After talking with Mr Gragido, his bringing up this blog entry, my saying that I had already read it, and his encouraging me to join the conversation, I find myself ready to revisit some of the talking points I’ve been raising for the last couple of years:

  • relevance
  • metrics
  • unjustifiable complexity
  • over-specialization
  • mental inflexibility

First, most of what everyone in the industry talks about is entirely irrelevant to business. Completely. If the information security profession wants to be taken seriously, it needs to be relevant and speak in terms that the business will understand. Everything else I bring up follows from this first point.

Second, almost nothing is measurable. There are many workflows, scorecards, risk valuations, and frameworks, but nearly all of the time, they are not put in terms that the consumers of risk information find relevant. Metrics need to be automated (cheap to gather) and meaningful.

  • We rarely measure whether past implementations were effective, or whether the projected ROI survived the unforeseen operational costs. Basing decisions on rich case-study data would be great, and such data is nearly completely unavailable.
  • There is no information sharing between consumers anywhere. There is no Consumer Reports for enterprise technology. Every vendor and analyst has their hand out, and that significantly colors their recommendations, IMHO. Enterprises don’t share the data that matters.
  • A vulnerability scanner provides the worst kind of metric: one that isn’t meaningful to anyone. The risk practitioner knows that it captures only a fraction of appreciable risk, while a non-practitioner looking at a scorecard may draw unjustified conclusions from the score delta.
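To make the score-delta problem concrete, here is a minimal sketch (the finding counts and severity weights are invented for illustration, not drawn from any real scanner): a raw finding count can fall quarter over quarter while severity-weighted risk rises, which is exactly the unjustified conclusion a scorecard invites.

```python
def weighted_risk(findings, weights):
    """Weight findings by severity instead of counting each one equally."""
    return sum(weights[severity] * count for severity, count in findings.items())

# Hypothetical severity weights a practitioner might assign -- an assumption,
# not any standard scoring scheme.
weights = {"low": 1, "medium": 5, "high": 25, "critical": 125}

# Invented scan results for two quarters.
last_quarter = {"low": 400, "medium": 80, "high": 10, "critical": 0}
this_quarter = {"low": 150, "medium": 40, "high": 12, "critical": 5}

# The raw count delta a scorecard typically shows.
raw_delta = sum(this_quarter.values()) - sum(last_quarter.values())

# The severity-weighted delta a practitioner would care about.
risk_delta = weighted_risk(this_quarter, weights) - weighted_risk(last_quarter, weights)

print(raw_delta)   # -283: total findings dropped sharply
print(risk_delta)  # +225: yet severity-weighted risk went up
```

The point is not the particular weights; it is that any single unweighted number hides the distribution that actually matters, so the delta alone tells a non-practitioner nothing useful.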

Third, with all this talk about cloud computing, people seem to forget that cloud computing is nothing new. It’s distributed computing bundled with an API and wrapped in a fluffy concept for marketing. This is not helping anything. If we as an industry are going to add a bunch of additional layers to the old conceptual model, we do not need to evolve, we need to optimize. I’ve asked around. Almost no one knows what we do. We’re the gnomes who fix people’s shoes at night and let them believe their shoes fix themselves.

If we’re going to accept a giant expansion of the threat landscape by embracing massively insecure Web 2.0 applications and, at the same time, accept outsourcing all of our data to complex distributed systems, where it intermingles with everyone else’s data in a way that makes people throw up their hands, declare it too complex, and say “it is in the cloud,” then someone needs to appreciate that they are making a risk decision. It is our responsibility to communicate this. No one else will.

Fourth, people have become so specialized that they no longer understand what effect their actions have on other teams. It may be that literacy across many areas of our practice is hard. As complexity increases, the number of people who are up for it will decrease. The dispassionate, who only came for a day job that pays a lot of money, will not care enough to do what it takes to get their arms around it. We need to be clear that the complexity we’re developing will accelerate the Peter Principle in technology and technology-dependent business management. I find it interesting that Technological Management is a stub here, though I am not surprised. We need to work toward a middle ground so that communication can happen on a level playing field. ASVS may help us do this.

Fifth, and finally, best-laid plans need to be right-sized on the ground. A mechanic’s touch needs to be worked into human-resource valuation. Flexibility and agile organization have to be valued more highly than the ability of bad managers to find someone else to blame for systemic problems they had a part in creating. Complacency is too widespread. Complacent organizations are driven by the minimum standards of compliance. Leaders do not talk much about compliance because it is far in their rear-view mirror.

If we as risk managers cannot put risk in terms that decision makers and shareholders can understand without calling them sheep or cattle, then we are not worth anything. If we can’t make the argument inside the technology discussion, what chance do we have of translating it for those who have no interest in technology?
