At the 2012 Colloquium on Information System Security Education in Orlando I was repeatedly reminded how much computer security education owes to, and has yet to learn from, engineering education.
For example, every engineering student takes a course called strength of materials. In this course, he learns not only the strength of those materials that he is most likely to use but also how to measure the strength of novel materials. The student studies how, and in how many different ways, his materials are likely to fail. He learns how to design in such a way as to compensate for the limitations of his materials.
A computer science or computer security student can get an advanced degree without ever studying the strength of the components that he uses to build his systems. It may be obvious that all encryption algorithms are not of the same strength but how about authentication mechanisms, operating systems, database managers, routers, firewalls, and communication protocols? Is it enough for us to simply know that some are preferred for certain applications?
Courtney's First Law, remember, is the one that says that nothing useful can be said about the security of a mechanism except in the context of a specific environment and application. In this construction, security is analogous to strength, environment to load or stress, and application to the consequence of failure. Said another way, environment equates to threat or load, and application to requirements.
Computer science students are taught that their systems are deterministic, that integrity is binary, that security is either one or zero. On the other hand, William Thomson, Lord Kelvin, cautioned that unless one can measure something, one cannot recognize its presence or its absence. W. Edwards Deming taught us that if we cannot measure it, we cannot improve it.
One way to measure the strength of a material is by destructive testing. The engineer applies work or stress to the material until it breaks and measures the work required to break it. Note that different properties of a material may be measured. The engineer may measure yield, compressive, impact, tensile, fatigue, strain, and deformation strength.
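To make the idea of measuring strength as work concrete, here is a minimal sketch in Python of the work absorbed by a specimen, taken as the area under a stress-strain curve; the data points are invented for illustration, not measurements of any real material.

# Minimal sketch: the work absorbed per unit volume before fracture is the
# area under the stress-strain curve, estimated here with the trapezoidal rule.
# These data points are invented for illustration only.

strain = [0.000, 0.001, 0.002, 0.005, 0.010, 0.020]   # dimensionless
stress = [0.0, 200e6, 350e6, 400e6, 420e6, 430e6]     # pascals

def work_to_fracture(strain, stress):
    """Area under the stress-strain curve (joules per cubic meter)."""
    area = 0.0
    for i in range(1, len(strain)):
        area += 0.5 * (stress[i] + stress[i - 1]) * (strain[i] - strain[i - 1])
    return area

print(f"Approximate work to fracture: {work_to_fracture(strain, stress):.3e} J/m^3")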
The strength of a security mechanism can be expressed in terms of the amount of work required to overcome it. We routinely express the strength of encryption algorithms this way, i.e., as the cost of a brute force or exhaustive attack, but fail to do it for authentication mechanisms, where it is equally applicable. As with engineering materials, security components may be measured for their ability to resist different kinds of attack, for example exhaustive or brute force, denial of service, dictionary, browsing, eavesdropping, spoofing, counterfeiting, and asynchronous attacks. While some of these attacks should be measured in "cover time," the minimum time to complete an attack, most should be measured in cost to the attacker.
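As an illustration, here is a minimal sketch of that yardstick applied both to an encryption key and to a password; the guess rates are assumptions chosen for the sake of the example, not measurements.

# Minimal sketch: the work factor of an exhaustive (brute force) attack is the
# expected number of trials, roughly half the search space. The guess rates
# below are assumed figures for illustration.

def expected_trials(space_size):
    """Expected guesses to succeed against a uniformly chosen secret."""
    return space_size / 2

def seconds_to_break(space_size, guesses_per_second):
    return expected_trials(space_size) / guesses_per_second

# A 128-bit key against a very fast attacker...
print(f"{seconds_to_break(2**128, 1e12):.3e} seconds")   # astronomically long

# ...versus an 8-character lowercase password against a modest one.
print(f"{seconds_to_break(26**8, 1e9):.1f} seconds")     # about a hundred seconds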
The literature now offers a number of ways of measuring the cost of attack. The cost used should consider the value or cost to the attacker of such things as work, access, risk of punishment, special knowledge, and time to success. Since these are fungible, it helps to express them all in dollars. Of course, we will never know these with the precision with which we know how much work it takes to fracture steel, but we can measure them well enough to improve our designs.
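Here is a minimal sketch of that kind of accounting; the categories follow the paragraph above, but the dollar figures and the value of the target are assumptions invented for illustration.

# Minimal sketch of expressing the cost of attack in dollars. The figures are
# invented assumptions, not estimates for any real system or adversary.

attack_cost = {
    "work": 5_000,                  # labor and compute
    "access": 20_000,               # gaining the necessary access
    "risk_of_punishment": 50_000,   # expected value of getting caught
    "special_knowledge": 10_000,    # acquiring tools or expertise
    "time_to_success": 15_000,      # opportunity cost of a long campaign
}

total_cost = sum(attack_cost.values())
value_to_attacker = 60_000          # assumed value of success to the attacker

print(f"Estimated cost of attack: ${total_cost:,}")
print(f"Rational attack? {value_to_attacker > total_cost}")

The design goal, of course, is to keep that total comfortably above the value of success to the attacker.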
The Trusted Computer System Evaluation Criteria, the TCSEC, can be viewed as an attempt at expressing the strength of a component composed of hardware and software. While an evaluation speaks to suitability for a threat environment, with a few exceptions it does not speak to the work required to overcome resistance. One exception is covert channel analysis, where the evaluation is expected to speak to the rate at which data might flow via such a channel.
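A covert channel estimate of that sort reduces to simple arithmetic; the event rate and bits per event below are assumed figures, not results from any evaluation.

# Minimal sketch of a covert channel bandwidth estimate. The event rate and
# bits-per-event figures are assumptions for illustration.

signaling_events_per_second = 10   # observable events the sender can modulate
bits_per_event = 1                 # one bit signaled per event

bandwidth_bps = signaling_events_per_second * bits_per_event
secret_bits = 128 * 8              # a 128-byte secret

print(f"Channel bandwidth: {bandwidth_bps} bits/second")
print(f"Time to leak the secret: {secret_bits / bandwidth_bps:.0f} seconds")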
Because it is often misused, a caution about the TCSEC is necessary. The TCSEC uses "divisions." The division in which a component is evaluated is not a measure of its strength. Many fragile components are evaluated in Division A, while some of our strongest are in D. In order to understand the strength of a component, to understand how to use it, one must read the evaluation.
We have two kinds of vulnerabilities in our components: fundamental limitations and implementation-induced flaws. The former are more easily measured than the latter. On the other hand, it is the implementation-induced flaws on which we are spending our resources. We are not developing software as well as we develop hardware, or even as well as we know how.
The engineers use their knowledge of the strength and limitations of their materials to make design choices. They use safety factor and margin of safety metrics to improve their designs. More recently, engineers at MIT's Draper Laboratory have proposed that "complex systems inhabit a 'gray world' of partial failure." "If you admit ahead of time that the system will spend most of its life in a degraded state, you make different design decisions," says Olivier de Weck, associate professor of aeronautics and astronautics and engineering systems. "You can end up with airplanes that look quite different, because you're really emphasizing robustness over optimality."
Said another way, systems may be optimized for operation over time rather than at a point in time. The more difficult it is to determine the state of a system at a point in time, the more applicable this design philosophy. Thus, we see organizations like NSA designing and operating their systems under the assumption that there are hostile components in them.
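Safety factor and margin of safety, mentioned above, are simple ratios. Here is a minimal sketch of the standard engineering definitions and, as an analogy of my own for illustration only, the same arithmetic applied to attack cost against an assumed attacker budget.

# Safety factor = capacity / load; margin of safety = safety factor - 1.
# The security figures below are an illustrative analogy, not a standard metric.

def safety_factor(capacity, load):
    return capacity / load

def margin_of_safety(capacity, load):
    return safety_factor(capacity, load) - 1

# Engineering: a member rated for 10 kN carrying a 4 kN design load.
print(safety_factor(10_000, 4_000), margin_of_safety(10_000, 4_000))

# Security analogy: an attack costing an estimated $250,000 against an
# adversary assumed to be willing to spend $100,000.
print(safety_factor(250_000, 100_000), margin_of_safety(250_000, 100_000))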
While most of our components are deterministic, none of our systems or applications are; they have multiple people in them, and interact in diverse, complex, and unpredictable ways. Therefore, designing for degraded state may be more efficient over the life of a system than designing for optimum operation at a point in time. We should be designing for fault tolerance and resilience. We should be designing to compensate for the limitations of our materials.
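One way to see the payoff of designing for a degraded state is a simple redundancy calculation; the availability figure below is an assumed number for illustration.

# Minimal sketch: probability that a k-of-n redundant system keeps operating
# when components fail independently. The 99% figure is an assumed value.

from math import comb

def k_of_n_availability(n, k, p_up):
    """Probability that at least k of n independent components are up."""
    return sum(comb(n, i) * p_up**i * (1 - p_up)**(n - i) for i in range(k, n + 1))

print(f"Single component:       {0.99:.6f}")
print(f"Any 2 of 4 components:  {k_of_n_availability(4, 2, 0.99):.6f}")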
Of course, I am aware that my audience of information assurance and law enforcement professionals cannot reform computer security education or practice. I will continue to advance that agenda in other forums. What I hope for is that you will spend some of your professional development hours, effort, and study on the idea of strength and that it will inform and improve your practice. It is in part because our education is a work in progress that we are called professionals and are paid the big bucks.