Thursday, October 22, 2020
Thursday, October 15, 2020
"In anything at all, perfection is finally attained not when there is no longer anything to add, but when there is no longer anything to take away..."
-- Antoine de Saint-Exupéry (Aviator, Mechanic, Poet,
Exemplar), "Wind, Sand and Stars"
“Simplicate and Add Lightness”
-- Design philosophy of Ed Heinemann, Douglas Aircraft
(Also attributed to Igor Sikorsky)
“Make everything as simple as possible, but not simpler.”
-- Albert Einstein
“One should not increase, beyond what is necessary, the number of entities required to explain anything.”
-- William of Occam
Engineers, indeed even philosophers and poets, have a preference for simple designs. They eschew gratuitous or unnecessary complexity. Sadly, the same cannot be said of IT designers and those who secure their systems. Engineers prefer to do nothing rather than the wrong thing. They recognize that the greater the complexity of a mechanism, the greater the potential for failure, the more difficult it is to satisfy oneself that it is performing correctly, and the more difficult it is to demonstrate to others that it is doing so.
Simple designs are easier to implement, demonstrate, and maintain. Engineers promote the KISS principle. They call simple designs “elegant.” They recognize that complexity causes errors and masks malice. Engineers manage “parts count” because they know that quality goes up as parts count goes down.
Friday, September 18, 2020
It was a long time ago. I was doing data security market support for IBM. I thought of my job as helping IBM customers keep their computers safe, use them safely, and use them to protect their contents (from accidental or intentional modification, destruction, or disclosure). There were a manager and three professionals doing this work.
There was another security group in IBM, much larger, responsible for how IBM's intellectual and other property was protected. They were piloting their first "security awareness" program. As a courtesy, they invited me to sit in on one of the pilot sessions. Their intent was that, if the pilot program was successful, they would make it mandatory for all employees.
The program was motivated in part by a major breach of intellectual property. A competitor had called to tell us that someone had just offered them the design of what was to become one of IBM's most profitable products, a new disk drive, code named WINCHESTER, which was to launch an entire industry. The design was offered to the competitor for $50,000 and that price included monthly updates for a year.
Not only was IBM's general management upset, but the competitor was equally so. His position was that he could absorb full development cost and still compete. What he could not do was compete with a mutual competitor who got the design for a pittance. He warned that if IBM could not keep its own secrets, the next time he was offered such a deal, he would take it.
While the perpetrators were caught and the design documents were recovered, the lesson was not lost on IBM management. They were going to ensure that the problem was fixed. The pilot program was to be part of the fix.
I sat through the program not once, but twice. When I was asked if I had any reaction, I said that the course was a great lesson in how to commit fraud and intellectual property theft, that it was a good description of the problem but was likely to make it worse rather than better.
When asked what I would recommend, I suggested that the program be shortened perhaps to as little as fifteen minutes and that the message be shortened to what we expected the managers, professionals, and other employees to do.
IBM actually had a robust system for classifying and handling data. It defined how data was to be classified, labeled, and handled. Most of the documents created were public. Indeed, at the time IBM was one of the largest publishers in the world, second perhaps only to the US Government Printing Office. Other documents were classified and labeled as "For Employee Use Only," along with two levels of "confidential." Confidential meant that use of a document was to be on a "need to know only" basis. Each level was defined by the controls that were intended for that level. The highest level, "Registered Confidential," was intended for only a very small number of documents, those whose disclosure might affect profitability, and where the controls included limiting and numbering every copy, keeping them under lock and key, and logging every use.
What IBM really needed was for every employee to classify and label documents properly, know the procedures for every class that they were likely to handle, and follow the procedures. That meant that most employees needed to be trained only on public and employee use only, a smaller number on confidential and "need to know," and only a tiny number on how to recognize, classify, and handle the "crown jewels."
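A scheme like the one described above amounts to a small lookup from label to required controls. The sketch below is purely illustrative: the enum names, control strings, and `required_controls` helper are my own inventions modeled on the levels and controls the text describes, not IBM's actual procedures.

```python
from enum import Enum

class Classification(Enum):
    """Hypothetical labels modeled on the scheme described above."""
    PUBLIC = 1
    EMPLOYEE_USE_ONLY = 2        # "For Employee Use Only"
    CONFIDENTIAL = 3             # use on a "need to know only" basis
    REGISTERED_CONFIDENTIAL = 4  # the "crown jewels"

# Each level is defined by the controls intended for it.
CONTROLS = {
    Classification.PUBLIC: [],
    Classification.EMPLOYEE_USE_ONLY: [
        "label document", "share with employees only",
    ],
    Classification.CONFIDENTIAL: [
        "label document", "restrict to need to know",
    ],
    Classification.REGISTERED_CONFIDENTIAL: [
        "label document", "restrict to need to know",
        "limit and number every copy", "keep under lock and key",
        "log every use",
    ],
}

def required_controls(level: Classification) -> list[str]:
    """Return the handling controls an employee must apply for a label."""
    return CONTROLS[level]
```

The point of the table shape is the training insight that follows: most employees only ever need the first two rows.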
It was much easier to teach this. Not only did this focus increase the effectiveness of the program, but it greatly reduced its cost. Keep in mind that this was in an era when most information was still stored on cheap paper rather than on expensive computer storage media. The more sensitive the data, the less likely it was to be in a computer. It was very different from today, when we use little of our now expensive paper and store the most sensitive data in cheap computer storage.
The lessons for today are very different but the emphasis of awareness training should be the same. Focus on behavior, what we need for people to do. Focus on roles and applications. Leave descriptions of the environment, the threat sources and rates, the attacks, the vulnerabilities, the problem, to the specialists. Awareness training should be about the solution, not the problem.
Thursday, July 9, 2020
It is a long story, longer than one can tell in a single blog entry. One has a choice of breaking it up into multiple entries or simply providing a link from this blog to the story, the choice I have taken. https://tinyurl.com/Real-Programmers
In response to this, I wrote on NewsBites:
"For the moment and for most enterprises "patching" remains mandatory; failing to do so not only puts one at risk but puts one's neighbors at risk. At what point do we decide that the cost of patching is too high? When do we realize that the attack surface of these widely used products is so big, so homogenous, and so porous, that collectively they weaken the entire infrastructure? When do we realize that the architectures (e.g., von Neumann), languages, and development processes that we are using are fundamentally flawed? That hiding these products behind local firewalls and end-to-end application layer encryption is a more efficient strategy? When do we acknowledge that we must fundamentally reform how we build, buy, pay for, and use both hardware and software? At what point do we admit that we cannot patch our way to security?"
A reader responded in part:
I agree with you that the cost of patching does remain high. I agree with you that our languages and development (and testing) processes are flawed. Those complaints are not new and not interesting.
But our architectures, especially von Neumann? Would you lump IPv6 into that category as well? I'm curious why a man of your obvious accomplishments would think of that. Even more interesting would be if you had a better idea. The paradigm of the stored-program computer with instructions and data in the same memory seems unshakable at this point. Everybody has thought of separating memory into instruction space and data space, but that's just another way of getting more parallelism, to make things faster. It doesn't really change how computers work or how we think about them.
Monday, June 22, 2020
We are now three years since the first "ransomware" attacks. We are still paying. Indeed, a popular strategy is to pay an insurance underwriter to share some of the risk. This is a strategy that only the underwriters and the extortioners can like. While this was an appropriate strategy to follow early, it is no substitute for resisting and mitigating the attacks as time permits. Has three years not been enough to address these attacks? One would be hard pressed to make that case.
The decision to pay is a business decision. However, the decision to accept or assign the risk, rather than resisting or mitigating the attack, that is a security decision. It seems clear that our plans for resisting and mitigating are not adequate and that paying the extortion is simply encouraging more attacks.
One last observation: if there is ransomware on your system, network, or enterprise, you have first been breached. Hiding your data from you to extort money is only one of the bad things that can result from the breach. If one is vulnerable to extortion attacks, one is also vulnerable to industrial espionage, sabotage, credential and identity theft, account takeover, and more. The same measures that resist and mitigate ransomware resist and mitigate all of these other risks.
Ransomware attacks will persist as long as any significant number of victims choose to pay the ransom, as long as the value of a successful attack is greater than its cost. The implication is that to resist attack one must increase its cost, not simply marginally but perhaps by as much as an order of magnitude. Failure to do so is at least negligent, probably reckless. Do, and protect, your job.
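The economics above can be stated as a one-line inequality: attacks persist while the attacker's expected value exceeds the attacker's cost. The numbers in the sketch below are invented for illustration only.

```python
def attacks_are_profitable(p_pay: float, avg_ransom: float,
                           cost_per_attack: float) -> bool:
    """An extortion campaign persists while its expected value
    (probability a victim pays times the average ransom) exceeds
    the cost of mounting an attack."""
    return p_pay * avg_ransom > cost_per_attack

# Invented numbers: if 5% of victims pay an average of $100,000 and an
# attack costs $1,000 to mount, the expected value ($5,000) exceeds the
# cost and attacks continue.
assert attacks_are_profitable(0.05, 100_000, 1_000)

# Raising the attacker's cost by an order of magnitude, to $10,000,
# pushes it past the expected value and removes the incentive.
assert not attacks_are_profitable(0.05, 100_000, 10_000)
```

This is why a marginal increase in the attacker's cost is not enough: the defender must push that cost past the expected value of the attack.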
Wednesday, June 10, 2020
The rate of published "fixes" suggests that there is a reservoir of known and unknown vulnerabilities in these popular products (e.g., operating systems, browsers, readers, content managers). No matter how religiously one patches, the products are never whole.
They present an attack surface much larger than the applications for which they are used and cannot be relied upon to resist those attacks. However, in part because they are standard across enterprises and applications, they are a favored target.
They should not be exposed to the public networks. Hiding them behind firewalls and end-to-end application layer encryption moves from "good" practice to "essential."
Patching may be mandatory but it is expensive, a cost of using the product.
In assessing the "search and seizure" of personal data by law enforcement, modern courts have applied the test of "reasonable expectation of privacy." This test implies that if citizens have used their data in a way that exposes it to others, for example in a business transaction, then law enforcement may use it against them without restriction.
I am not hopeful that this view will be argued before the courts or that, even if argued, it will change much. Nonetheless, I had to argue it.
Friday, May 22, 2020
The market, the collective of buyers, prefers systems that are open, general, flexible, and that have a deceptively low price. The real cost includes the cost of perpetual patching, the unknown cost of accepting the unknown risk of all the vulnerabilities in the reservoir, along with the risk of an unnecessarily large and public attack surface.
We do not even measure the cost of their poor quality.
We should be confronting the vendors with this hidden cost. We should be comparing them on it.