His point was not simply that there is an essential level of safety for the acceptance and use of a technology but that there is a necessary level of public trust and confidence that must be sustained. Damage that trust and confidence and the technology will not be used.
Moreover, public trust and confidence is fragile and difficult to sustain. My favorite example is atomic energy. In the late forties and early fifties, proponents of atomic power argued that it would be "too cheap to meter." It did prove to be more expensive than that, but that is not the reason that we do not use more of it. It is not that it is not safe, or even that its safety is difficult to measure. We continue to burn fossil fuels though they are far more dangerous than atomic energy, and we count the bodies in hundreds per year.
Rather, it is that Three Mile Island destroyed the necessary public trust and confidence. Three decades later we have still not succeeded in repairing it. We were getting close when a once-in-a-thousand-year event took place at Fukushima. As a result, Germany, which gets 23% of its electricity from nuclear power, shut down six plants and has announced that it will decommission all of its plants over the next decade. While Germany asserts that it can do this while becoming an energy exporter and reducing its carbon emissions, the result of this decision, this at least arguably disproportionate response, is that its use of toxic fossil fuels will increase.
I have argued that information technology is different. First, the public does not see IT as being as intrinsically dangerous as energy or transportation. Second, just as they "feel" safer in an automobile than on an airplane, they "feel" safer in IT, in part because, as in the automobile, they enjoy some local control. We get a pass that we did not earn.
That said, both the government and the media clearly believe that the public is too complacent about IT. Every breach or compromise is widely reported. Both vulnerability and threat are expressed in hyperbolic, not to say alarming, terms like "Cyber War" and even "Cyber Pearl Harbor." While electronic transactions are demonstrably safer than the same transactions in paper, activity like identity theft, which originates mostly in paper, is blamed on IT. Security and safety are used to resist the efficient, not to say necessary and urgent, automation of paper health-care records. While the US spends more on international intelligence gathering than the rest of the world combined, the activity of others is viewed with alarm, and both capability and motive are inferred.
On the other hand, the financial condition of small businesses, non-profits, and municipalities is being damaged by fraudulent use of their on-line banking credentials. We continue to use mag-stripe and PIN for retail payments, an application for which they were never intended and for which they clearly are not safe. We are spending billions of dollars per year to resist "spam" and malicious code. The software industry continues to ship code with implementation-induced vulnerabilities, doing it over and over rather than doing it right the first time. (Safe software is no more difficult than safe airliners; aviation simply does a better job.) The number of records reported breached exceeds the number of people. There is a consensus that connections between the public networks and other infrastructure make that infrastructure vulnerable to misuse and abuse. All of these things erode public trust and confidence. Since we do not know where the breaking point is, we need to err on the safe side.
I do not expect a total melt-down like "Three Mile Island." (Pun intended.) Rather, I expect that use of IT will be resisted at the margins, used less than might be efficient. Health care is the "poster child" for this concern. While it is true that the organization of this industry makes automating it difficult, security and safety concerns, that is, the public's lack of trust, have made it nigh impossible. Those responsible for automating it know that at least part of the public is fearful and would prefer that they fail.
Let me return to Jim Barksdale's analogy to aviation. Like the computer, the airplane began as an expensive toy for a few. They even had their hackers, amateurs who learned by experimentation, those who pushed the envelope of performance, safety, and ethics. As the computer grew up to be "information technology," the plane grew up to be "aviation." Boy, did it grow up. My colleague, Dr. Peter Tippett, suggests that in the sixty years from 1937 to 1997, aviation safety improved a thousandfold. The planes themselves are ten times safer than in 1937. Even those DC-3s still operating, and yes, there are some, are ten times safer.
We got another ten-fold improvement from better flight procedures and pilot training. Peter says, think check-list. A 1937 pilot would say, "Real pilots do not use check-lists." The 1997 pilot would say that all pilots, professional or amateur, use check-lists. One might call this "professionalization." After only fifty years, our hardware and software now have the controls that we need to resist leakage, preserve integrity, and provide transparency and accountability. Even though our professional associations develop and publish check-lists, we are not as professional in using these controls as their existence would suggest.
Finally, we got a ten-fold improvement from the timely reporting and sharing of intelligence: from weather to navigation to traffic to maintenance to accident reports. Aviation has a formal system in place, not only for sharing but, in some cases, for ensuring receipt and compliance. In information assurance, in part because we do not trust one another, and particularly because we do not trust government, we do not share well, and we question much of what we see.
I confess that I admire the air transport industry; I look to it when people tell me how difficult safe and secure IT is. I hold up its performance as a standard to which we should aspire. We should emulate engineering that produces planes that can be operated safely. In aviation, safety does not yield to schedule or even to profit. Think of the Boeing 787 and its three-year slip.
We should emulate their training, experience, and professionalism, from pilots to mechanics to those who serve our comfort and safety in the cabin. They have earned and conserve our trust.
We should emulate their collection and use of timely intelligence, a use which manages to get safety information to everyone who can use it while not unnecessarily alarming the public.
As in efficiency, as in infrastructure, everything that we do, or fail to do, increases or decreases the necessary public trust and confidence: fragile confidence, maintained at a cost, which, if broken, we may not be able to repair or replace. It is for this that we are called professionals and are paid the big bucks.