Wednesday, May 19, 2010

Encryption by Default

A recent survey was reported as follows:

IDG News Service - Employees at many U.S. government agencies are using unsecure methods, including personal e-mail accounts, to transfer large files, often in violation of agency policy, according to a survey.


Pasted from <www.computerworld.com/s/article/9176889/Survey_Gov_t_agencies_use_unsafe_methods_to_transfer_files?taxonomyId=17>

Stephen Northcutt, writing as an editor of SANS Newsbites, observes:

I agree that too many people use insecure means to move data; disagree the root cause is no access to encryption.

A lot of people have access to encryption for email at work and yet consistently send data in the clear. We discuss this in the class I author and teach, and I think we as a community are becoming numb to the dangers we face from the Internet. Pretty Good Privacy (PGP) has been around almost 20 years now. In the early days, when you went to conferences, they had PGP signing parties and almost all the security professionals I interacted with had PGP and a key. Now, almost nobody seems to use it outside of FIRST, AV Research and similar enclaves...(stephen@sans.edu).


In another context this week I was reminded of a lesson I learned a long time ago, "One must make the desired behavior at least marginally easier than the wrong behavior." Almost by definition, "harder to do it right" is too hard.

Twenty years ago we were very concerned that user credentials would be compromised in the network. Today, with activity more than a thousand times what it was then, credentials are compromised at the end points, not in the network. The reason is that for data in motion we use encryption. We use SSL. Thanks to Netscape, we use it by default.

When we say our prayers at night we should say, "Thanks for Netscape." Netscape understood that encryption in the World Wide Web was essential, like brakes on a car, not optional. They made it standard, not a separately priced feature. It was included in the function and price of the server. Thinking back on my time at IBM, I have often thought that had IBM invented SSL, they might well have priced it as an option and it would have failed. The way we price things often influences how we think of them and how we use them.

Even though the software is not separately priced, SSL has to be turned on and, at the level of its current default use, it has a significant cost. Nonetheless, we use it pervasively and users have come to expect it. We use it by default. If either party expects it, the other party can hardly avoid it.
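To make the point concrete, here is a minimal sketch, in Python, of what "turned on" amounts to at the server. It is an illustration only; the certificate and key file names are hypothetical, and a production deployment involves rather more.

    # A minimal sketch: serve over TLS instead of in the clear.
    # Assumes a certificate and private key already exist; the names
    # "cert.pem" and "key.pem" are hypothetical.
    import http.server
    import ssl

    httpd = http.server.HTTPServer(
        ("0.0.0.0", 443), http.server.SimpleHTTPRequestHandler
    )
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    context.load_cert_chain(certfile="cert.pem", keyfile="key.pem")
    # Wrap the listening socket; everything after this line is encrypted.
    httpd.socket = context.wrap_socket(httpd.socket, server_side=True)
    httpd.serve_forever()

The encryption itself is a few lines; the cost is in the certificates, the key management, and the cycles.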

Note that the problem addressed by the survey is identified as "file transfer," much of which is not even done in the network but on portable media, on what we used to call the "sneaker net." Much of it is ad hoc, with no standard procedures. Management has not told employees how to transfer data, much less how to do it securely.

The data leaks in dozens of ways. It leaks when users make gratuitous copies and then lose them. It leaks when backup copies fall off the back of the truck. It leaks when hackers compromise servers. It leaks through the user interfaces of ftp servers, and in other ways too numerous to enumerate. The user does not even contemplate most of these leakage modes and believes that the ones he does contemplate are too rare to worry about.

Stephen Northcutt points out that PGP can be used to resist most of these leaks. Even simpler tools like passwords on .doc and .pdf files would resist many of them. PKZip and sftp are powerful tools to help us. However, most of these solutions require user involvement and a high level of user knowledge, not to mention judgment and initiative.
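For illustration, here is a hedged sketch of per-file symmetric encryption, assuming the Python "cryptography" package's Fernet recipe; the file names are hypothetical. Note how much the user must know and do; that burden is exactly the problem.

    # A sketch of manual per-file encryption. The "cryptography" package
    # and the file names are assumptions for illustration only.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # the user must generate, share, and safeguard this key
    cipher = Fernet(key)

    with open("report.doc", "rb") as f:
        ciphertext = cipher.encrypt(f.read())
    with open("report.doc.enc", "wb") as f:
        f.write(ciphertext)

    # The recipient, holding the same key, must reverse the process.
    plaintext = Fernet(key).decrypt(ciphertext)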

The solution to the problem includes making it easier to encrypt all data than not to, making the encryption of data at rest the default, not the exception. It includes providing encryption by default across enterprises. It includes resisting gratuitous copies at the end points, even where the use requires that the data be in the clear. It includes management direction and automated procedures to implement that direction.
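In code, "by default" might look like the following toy wrapper, again assuming the hypothetical Fernet recipe above: the caller cannot write plaintext by accident, because encryption is simply not offered as a choice.

    # A toy illustration of encryption by default: every write is
    # encrypted, so the safe path is the only path. Names hypothetical.
    from cryptography.fernet import Fernet

    class EncryptedStore:
        def __init__(self, key: bytes):
            self._cipher = Fernet(key)

        def save(self, path: str, data: bytes) -> None:
            # The caller never sees, or chooses, the encryption step.
            with open(path, "wb") as f:
                f.write(self._cipher.encrypt(data))

        def load(self, path: str) -> bytes:
            with open(path, "rb") as f:
                return self._cipher.decrypt(f.read())

    store = EncryptedStore(Fernet.generate_key())
    store.save("customers.db", b"names and numbers")

The user does nothing; the default does the work.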

A tall order you say? Suppose I told you that encryption by default is routine, automagic, in many enterprise and government domains and even across domains? True. Just for an example, Lotus Notes protects files and databases at rest, by default, using encryption. Even if one makes a gratuitous copy of the file on one's laptop or thumb-drive, it is encrypted. Notes provides for automatic safe exchange across domains. It provides for automatic key management that is transparent to the users. Obtaining copies of these files and databases in the clear requires both privileges and work. In this environment, it is easier to do it the right way. Indeed, it is so easy that many, not to say most, users do not even know that it is happening.

Though I believe that it is under-sold and under-appreciated, I am not here to sell Lotus Notes. I use it merely as an example of "encryption by default." I believe that encryption by default should be the standard in all government agencies and most private enterprises, and that we have at least one successful model of how to achieve it.

Wednesday, May 12, 2010

Security in "The Cloud"

Plus ça change, plus c'est la même chose. (The more things change, the more they stay the same.)

When T.V. Learson was leading IBM, he was asked by a customer whether his IT should be centralized or decentralized. Learson responded that whatever way he was currently organized he should change it. Said another way, "What goes around, comes around."

In the early days of shared resource computing, the computer and most of the data resources were owned by the enterprise. "Data security," as we called it then, meant that what the enterprise said, it intended; what it intended, it did. We tried to help them think about it by suggesting the properties of the data that the enterprise most wanted to conserve. In some proportion of one to the others, the enterprise wanted the data to exhibit confidentiality, integrity, and availability.

To the extent that Grosch's Law described the economics, i.e., efficiency increased with scale, the economics favored centralization. Similarly, protection and control were also centralized. The risk was information leakage. The control of interest was Data Access Control, usually implemented as an optional process of the operating system.

In some cases use was metered and cost allocated but often cost was simply absorbed by the enterprise. This was in part because the meters and metrics of cost and value were immature. Metering and cost allocation were expensive and often had perverse effects on usage and uses.

At some point, Grosch's Law gave way to Moore's Law. Efficiency began to favor the small. When the scale of computing changed, it was not so much that data in the glass house moved to departmental and personal systems, although copies clearly did, as that data in departmental paper files got sucked into the departmental and personal systems, increasing the number of electronic records. At the same time, all computers were being connected to the Internet, making them and their data more vulnerable to attack by outsiders.

At about the same time as the scale was changing, we went from talking about "data security" to "information assurance," reflecting a shift in priority from confidentiality to integrity. Protection and control moved from centralized to distributed. The risk shifted to system contamination with malicious code. While we still used data access control, we relied more upon control of access to systems and applications. Other controls of interest included anti-virus, firewalls, and cryptography.

At this writing, we are discussing what security means in "cloud" computing. The name, cloud, for this style of computing comes from the cloud symbol that we used in network diagrams to represent that which was not known or beneath the level of abstraction at which we were working.

NIST defines cloud computing as "a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources" (e.g., networks, servers, storage, applications, software, and other services) "that can be rapidly provisioned and released with minimal management effort or service provider interaction."

However useful one may find the definition, some examples may help to appreciate the concept. The earliest emergent examples really define the cloud. Perhaps the most important of these is the Domain Name Service (DNS). E-mail and the World Wide Web are also on the list. Note that these are collaborative services, instantiated by the cooperation of many edge processes. For most users, their cost is included in the cost of their connection.

An early example is Hotmail, an advertising-supported personal e-mail service. A more recent competitor to Hotmail is Gmail from Google. As a personal service, Gmail is ad-supported, but Google also offers a service to "outsource" corporate e-mail. Instead of operating its own e-mail servers, an enterprise contracts with Google.

E-mail is an example of an application-level service. Dropbox is an example of a private file service in the cloud. Carbonite is an example of a backup service. IBM and EMC offer segment-level storage backup. Indeed, they will operate an enterprise's entire storage network for it. Amazon offers a complete web storefront. Think of almost anything that is hidden behind a standard service interface; it is available as a service in the cloud.

Those of us who were around in the days of "shared resource computing" think "what goes around…." In this analysis, the "cloud" is simply another shared resource computer. After all, it looks the same to the end users. At some level or another, cloud services protect what they offer from contamination, leakage, or loss.

However, it is really not quite, indeed nowhere near, that simple. The cloud is not really just another computer or another use of computers. Rather, it is an abstraction, a model for looking at computers and computing. It is on the same list as serial re-use, time-sharing, host-guest, and client-server. However, unlike these, cloud computing was not designed and implemented top-down but has emerged from the bottom up.

The computing resources may include any combination of connectivity, computing capacity, instantiated processes, servers, storage, and services, including software (SaaS) and application services. While the resources are rapidly, and usually automagically, allocated and provisioned, use is metered and cost is allocated.

Security in the cloud turns not only on the axis of centralization versus decentralization but also on one of scale, and on another of organization. Let's think about the last first.

In the cloud, the services are used by multiple users or organizations but owned by none of them. While most of the data may belong to the users, the hardware, software, and many of the controls are owned and operated by another enterprise, the service provider.

Each organization's interest in the security of the data is different. For example, the owner of the data may rank confidentiality, integrity, and availability, in that order, and might prefer that the data disappear before leaking. The service provider, on the other hand, ranks availability, integrity, and confidentiality, and would prefer that the data leak rather than that he be unable to deliver it when it is asked for. One can easily imagine a scenario in which the service provider has so many copies of the data that he cannot erase them all on demand, perhaps not ever.

Users of the Sidekick, a T-Mobile smart phone, were offered a service to back up the names, phone numbers, calendars, to-do lists, and other data that they had stored in their phones. This service was implemented by an enterprise ironically named Danger. However, it was offered to the user by T-Mobile under the T-Mobile brand. That there was a second enterprise involved was not apparent to most users.

Danger had a server crash. The service was clearly down, and users' data was at least temporarily unavailable, perhaps lost altogether. To complicate matters, Danger was in the process of being acquired by Microsoft.

The story had a happy ending. In less than a week, Microsoft/Danger recovered the data and made it available on a new server. However, it illustrates another aspect of the cloud that impacts security; that is, you may not know with whom you are doing business or upon whom they rely.


The abstraction, the cloud, hides the fact that the cloud is a mechanism for combining, composing, and connecting (other cloud) resources to provide services with the properties, i.e., on-demand, easily and rapidly provisioned, described in the NIST definition. A cloud application may reside in a cloud virtual machine, using cloud connectivity, cloud storage, cloud data, and even other cloud applications. Each of these resources may be offered by a different vendor, and components may be added or subtracted on the fly. The service level agreements (SLAs) for these resources are probably "best efforts," the default service level for most information technology.

This is potentially a security nightmare for both buyers and sellers. Of course, a proper understanding of the problem is an essential step to a solution. More on this later.