Blood is thicker than TCP/IP

Jeffrey Carr predicts 2012 isn’t going to be a pretty way forward for information security professionals, industry, or governments. He makes a couple of points I’d like to dissect a little. I’m not much for alarm bells. We’ve had plenty of wake-up calls, enough snooze alarms, and a lot of oversleeping in information security over the decades. There is enough evidence to support a hypothesis that chronic aberrant network behavior has nearly zero effect on a populace at this point. Catastrophic and/or critical event thresholds have risen to the point that unless millions of people are affected it isn’t even news.

The corollary to “it isn’t news” is that “if it isn’t news it is normal,” and we’ve adjusted to that level of noise. Yes, another building burned down today, but one burns down every day about this time. No news is good news, but the frog is still in the boiling water. The key takeaway: there ain’t no news in 2012. Just the same news we’ve had for the past fifty years, that information systems are subject to breaches. It isn’t good news, but it ain’t even news. So, knowing my bias that information technology is more about plumbing than fighting, let’s look at three of Mr. Carr’s points.

1) We’re entering 2012 more vulnerable than ever.

I fully disagree. The threat and vulnerability envelope is not substantially bigger, nor is it growing; if anything, growth is slowing over time. I’m not basing that on opinion alone but on simple math. There are roughly 52,000 entries in MITRE’s Common Vulnerabilities and Exposures (CVE) database. That is the recorded count of vulnerabilities since 1999. As a simple extrapolation, if things continue as they have been going, the same number of vulnerabilities or fewer will be discovered next year than were discovered this year.
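The extrapolation above can be sketched in a few lines. The yearly counts below are hypothetical placeholders, not the real MITRE figures behind the chart; the point is only the shape of the argument, that a flat or negative average year-over-year change supports a "same or fewer next year" extrapolation.

```python
# Illustrative sketch only: these yearly CVE counts are invented
# placeholders, NOT actual MITRE data.
cve_by_year = {
    2007: 6520, 2008: 5632, 2009: 5736, 2010: 4653, 2011: 4155,
}

years = sorted(cve_by_year)
# Year-over-year change in the number of recorded vulnerabilities.
deltas = [cve_by_year[b] - cve_by_year[a] for a, b in zip(years, years[1:])]

# A flat or negative average delta is the "same or fewer next year"
# extrapolation the post is making.
avg_delta = sum(deltas) / len(deltas)
print(f"average year-over-year change: {avg_delta:+.1f}")
print("trend consistent with 'same or fewer':", avg_delta <= 0)
```

Naive linear extrapolation like this says nothing about causes (reporting practices, CVE assignment policy), but it is the arithmetic the claim rests on.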


[Figure: MITRE’s Common Vulnerabilities and Exposures (CVE) database, entries counted by year]

[Table: annual CVE counts, for those who want a little more detail]

Now, Mr. Carr and his supporters will say that he’s really talking about the overall vulnerability of the infrastructure and society in general. He has made a classic mistake: advancing vulnerability in lieu of risk. Never mind that I just finished teaching this exact point to my graduate students; thank you, Mr. Carr, for providing a great teachable moment for the class. All kidding aside, this is a pretty common way of stating the problem.

Words matter, especially when we’re trying to inform leaders about impending nefarious and possibly catastrophic events. The problem is that vulnerability has absolutely zero impact without an additional vector: threats. Even that isn’t enough for our little discussion to make a guess at impact. Of course there are always mitigations, which we’ve not exactly done very well (see below). We need to know the likelihood of all of these added up together. This is a classic risk formulation.

[Figure: the risk equation suggested by Dan and Julie Ryan]
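One common rendering of this kind of risk formulation combines the factors named above. A minimal sketch, with the functional form and all scores invented for illustration (the Ryans’ actual equation may differ in detail):

```python
def risk(threat: float, vulnerability: float,
         countermeasures: float, impact: float) -> float:
    """One common rendering of a risk formulation:
    risk = (threat * vulnerability / countermeasures) * impact.
    Inputs are illustrative, unitless scores; countermeasures > 0."""
    if countermeasures <= 0:
        raise ValueError("countermeasures must be positive")
    return (threat * vulnerability / countermeasures) * impact

# The point of the post: a severe vulnerability with no credible
# threat still yields low risk.
low  = risk(threat=0.05, vulnerability=0.9, countermeasures=1.0, impact=100)
high = risk(threat=0.90, vulnerability=0.9, countermeasures=1.0, impact=100)
print(low, high)
```

Holding vulnerability constant while the threat term drops an order of magnitude drops the risk with it, which is why quoting vulnerability alone overstates the case.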

There is more to be discussed, but the point remains: vulnerabilities are quite well measured, and they are declining. One could still argue that each individual vulnerability has an increasing impact, but that is an argument for another time. I have another post in the works looking at threats specifically.

2) We’re vulnerable because <insert your name here> has not done their job.

We most assuredly agree. This is where we find the failure in mitigation. Every major national strategy of the last decade has had a couple of thematic elements: 1) whole of government (cross-agency coordination); 2) public-private partnership and coordination. On the first, the task basically gets handed to the Department of Defense because you can make soldiers and sailors do stuff. There are lots of people trying diligently to do great work, but they are resourced on a shoestring, and too many of them leave government service because they are ignored.

You can’t have a partnership, or even coordinate, unless all parties are equal at the table. Government feels that it owns the regulatory and legal functions and can “coordinate” with industry. Industry, though, has the ability to lobby and seek relief from government intrusion through the legislative arena. The result is friction between the legislative and executive branches’ goals, ways, and means of doing business.

In summary, there is no partnership within government agencies because nobody is resourced to take on additional tasks, and only the Department of Defense has the ability to order particular behaviors. In the realm of public-private partnerships, corporations will seek legislative relief, up to and including corporate capture of agencies and commissions, if those bodies attempt to control their actions. Just read through the FCC-versus-cable-company debacle for examples of this in the public eye. When you look at non-regulatory agencies attempting to coordinate, you have a serious disconnect between the functioning entities.

I’ve been saying this for years: we need to send a select number of CERT types to Quantico for the FBI academy and give them badges. It’s time to start putting some providers in jail for the level of malfeasance they’ve been committing. No… I know… that will never happen. Still, you have to rise to some level of information security. Well. Actually, you don’t, currently.

3) Multi-modal attacks will become prevalent (Hoffman would say hybrid).

Nearly all conflict has had a technical communications component for the last several hundred years, if not forever. Command and control warfare has been part of generals’ tactical and operational strategy for a long time, and the evolution of electronic communications has only accelerated that. There is incredible risk in not securing the communications conduits of society, but those conduits have existed for a long time. Luckily, the fusion of a couple of technological threads is going to solve a bunch of problems.

Utility computing is when computing leaves the monolithic-device strategy and the “horsepower” is offloaded to a cluster or <shudder> cloud computing platform. The endpoint becomes an appliance with very few parts, more a display and interface mechanism than a computer. Appliance computing platforms and cloud environments (when actually secured) can secure information much better than endpoint devices. There is some great research being done down this path, and the Google Android/ChromeBook, Apple iPhone/iPad, and Microsoft Metro systems all seem to be converging on it.

One current vulnerability that has not been put into most risk calculus is ignored by government and industry security types alike. Though I’ve been discussing it for years, I haven’t found any traction. The continued development of cyber-physical systems and brain-to-computer interfaces has only been delved into a tiny bit. I thought the insulin-pump exploit announced at a hacker conference earlier this year would give people pause. Past stories about threats to former Vice President Cheney’s heart diagnostic machine might have gotten traction. Yet they are fairly well ignored. This is a classic example of risk calculus breaking down: the vulnerabilities exist, there are specific and articulated threats, and the mitigation strategies are almost non-existent.

There is another element to this hybrid attack and risk vector. As computing becomes more secure, the stakes rise, so adversarial action rises to that new, higher threshold. The tools of exploitation and acquisition keep pace with the rising level of difficulty, which means the threats, vulnerabilities, and associated impacts become larger. Adversary work factor will focus the adversary on only the tastiest targets. The current state of the art is not up to this task, especially when the weak link is the human being. We’ll know our technology efforts have succeeded when the bad guys start snatching system administrators in person.

