Thursday, August 22, 2013

Security with Persistent Threat but no Perimeter and no Edge

Whether one focuses on the consumerization of technology, "bring your own device to work," "Advanced Persistent Threat," or merely the exponential growth in use, uses, and users of information technology, we really have reached a tipping point. Our approach to information assurance is no longer working. We cannot discern "the edge." We cannot control the user device. While the network is spreading and flattening, the perimeter is crumbling. As the base of the hierarchy of authority, privilege, capability, and control spreads, its altitude shrinks. The compromise of one system can compromise the entire network. A compromise of a single user can compromise the entire enterprise. We cannot afford to give all of our data the protection indicated for the most sensitive.

Our traditional tools, user identification and authentication, access control, encryption, and firewalls, are not scaling well.

My purpose is not so much to tell you what to do as to change the way you think about what you do. I hope to change your view of your tools and methods, and of the materials to which you apply them.

First and foremost, you must identify and segregate the data you most want to protect. This will include, but may not be limited to, the books of account, intellectual property, and personally identifiable data. You cannot protect all of your data to the level required by these. "Classification" of data is essential to effective and efficient security.
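
By way of illustration only, a simple label can travel with each record so that the stronger measures described below are applied selectively; the class names and the example record here are hypothetical, not a prescription:

    from enum import Enum

    class Sensitivity(Enum):
        # illustrative classes; use whatever scheme your enterprise defines
        PUBLIC = 0
        INTERNAL = 1
        CONFIDENTIAL = 2   # e.g., books of account, intellectual property
        RESTRICTED = 3     # e.g., personally identifiable data

    def requires_strong_controls(label: Sensitivity) -> bool:
        # the dividing line between ordinary data and the most sensitive data
        return label.value >= Sensitivity.CONFIDENTIAL.value

    record = {"payload": "Q3 general ledger extract", "label": Sensitivity.CONFIDENTIAL}
    assert requires_strong_controls(record["label"])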

Prefer closed systems for this sensitive data. Think AS/400 and Lotus Notes, but you can close any system. While most of our systems will continue to be open, open systems will always be vulnerable to contamination and leakage and are not reliable for your most sensitive data. Lotus Notes is both closed and distributed, and trusted clients are available for many popular edge devices.

Consider single application client systems for sensitive network applications.  In an era of cheap hardware, it can be efficient to use different systems for on-line banking on the one hand, and web browsing or e-mail on the other.

Prefer object-oriented formats and databases to flat files for all sensitive data.  This should include Enterprise Content Management or Document Management systems, for example, Lotus Notes, SharePoint, or netdocuments.   The common practice of storing documents as file system objects is not appropriate for intellectual property or other sensitive documents.

Control access as close to the data source as possible, i.e., at the server, not on the edge device. Control access at every layer: edge device, application, network, database, and file. Do not rely upon one layer for exclusive control of access to another. For example, do not rely exclusively upon the application to control all access to the database. The application controls should mediate what data is passed to the user, but database controls should be used to mediate what data is passed to the application.
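
A minimal sketch of the idea, using sqlite3 and a hypothetical entitlement table purely for illustration; the point is that the application check and the query constraint are independent layers, so neither is the exclusive control:

    import sqlite3

    ENTITLEMENTS = {("alice", "invoices")}   # hypothetical application-layer entitlements

    def user_may_read(user: str, table: str) -> bool:
        # layer 1: the application decides whether this user may see the table at all
        return (user, table) in ENTITLEMENTS

    def read_invoices(conn: sqlite3.Connection, user: str):
        if not user_may_read(user, "invoices"):
            raise PermissionError(f"{user} may not read invoices")
        # layer 2: the statement itself is constrained to this user's rows,
        # so a flaw in the application check does not expose everyone's data
        return conn.execute(
            "SELECT id, amount FROM invoices WHERE owner = ?", (user,)
        ).fetchall()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE invoices (id INTEGER, amount REAL, owner TEXT)")
    conn.executemany("INSERT INTO invoices VALUES (?, ?, ?)",
                     [(1, 250.0, "alice"), (2, 975.0, "bob")])
    print(read_invoices(conn, "alice"))   # returns only alice's rows

In a production system the second layer would more often be a database permission, view, or row-level policy granted to a least-privilege account, but the structure is the same.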

Prefer application-only access: not file system, not database management system, not device. Prefer purpose-built application clients; think "apps." Said another way, the application should be the only way for a user to access the data. The user should not be able to bypass the application and access the data by other methods or tools.

Prefer end-to-end encryption, that is, from a known edge client to the application, not to the network, not to an operating system. Said another way, when a user opens a VPN, he should see an application, not an operating system command line, not a desktop. While there are many ways to accomplish this, for existing applications an easy way is to hard-wire an encrypting proxy, a router, in front of the application. The cost of such a router will range from tens of dollars to low tens of thousands, depending upon the load. A limitation of this control is that what appears to be the edge device may be acting as a proxy for some other device. While we can know that data passes to a known device, we cannot know that it stops there.
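
A software sketch of the same idea follows, assuming a hypothetical backend application at 127.0.0.1:8080 and a server certificate in server.pem and server.key; a hardware encrypting router does this in a box, but the structure is the same. The edge device speaks TLS to the proxy, and the application behind it is unchanged:

    import socket, ssl, threading

    BACKEND = ("127.0.0.1", 8080)   # the existing, unmodified application

    def pump(src, dst):
        # copy bytes one way until the connection closes
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
        dst.close()

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain("server.pem", "server.key")   # hypothetical certificate files

    with socket.create_server(("0.0.0.0", 8443)) as listener:
        with ctx.wrap_socket(listener, server_side=True) as tls_listener:
            while True:
                client, _ = tls_listener.accept()             # TLS from the edge device
                backend = socket.create_connection(BACKEND)   # plaintext stays inside
                threading.Thread(target=pump, args=(client, backend), daemon=True).start()
                threading.Thread(target=pump, args=(backend, client), daemon=True).start()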

Prefer strong authentication for sensitive data; consider the edge device identity, for example an EIN or MAC address, as one form of evidence. Consider out-of-band delivery to the user, or token-based one-time passwords, to resist replay. Check out Google Two Factor Authentication as an example. (It takes advantage of the fact that the typical hand-held computer ("smartphone") has addresses in both public networks, the Internet and the telephone network. Thus, when I want to log on to Google, I am prompted for my e-mail address and my password. However, these are not sufficient; knowing them would not enable you to impersonate me. I am then prompted for a six-digit number, a one-time password, that Google has sent, in text or spoken language, to a telephone number of my choice, provided to Google by me at enrollment time.) Consider the user's hand-held or other edge device as the token. Both RSA and Verisign offer complete solutions for this.
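
A minimal sketch of a token-based one-time password of the same family (TOTP, RFC 6238), the kind computed by an authenticator app on the user's hand-held; the out-of-band variant described above simply delivers the code over the telephone network instead. The shared secret below is an illustrative value that would be provisioned to the device at enrollment:

    import hashlib, hmac, struct, time

    SECRET = b"example-shared-secret"   # hypothetical; provisioned at enrollment time

    def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
        counter = int(time.time()) // interval           # time-based moving factor
        msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
        digest = hmac.new(secret, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                       # dynamic truncation, RFC 4226
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # Server and token compute the same six digits from the shared secret and the
    # current time; a captured code expires with the interval, which resists replay.
    print(totp(SECRET))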

Control the data rate at the source; prefer one record or page at a time. One may not be able to prevent the user from "screen scraping" and reconstructing the document or database at the edge, but one can resist it; this is the stick.
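
A minimal sketch of rationing at the source, with illustrative values for the page size and the per-user ceiling:

    import time
    from collections import defaultdict

    PAGE_SIZE = 1            # records delivered per request: one record or page at a time
    PAGES_PER_MINUTE = 10    # illustrative per-user ceiling

    _recent = defaultdict(list)   # user -> timestamps of recent requests

    def fetch_page(user: str, records: list, page: int) -> list:
        now = time.time()
        _recent[user] = [t for t in _recent[user] if now - t < 60]
        if len(_recent[user]) >= PAGES_PER_MINUTE:
            raise PermissionError("rate limit exceeded; try again shortly")
        _recent[user].append(now)
        start = page * PAGE_SIZE
        return records[start:start + PAGE_SIZE]

This does not prevent a determined user from paging through everything, but it slows scraping enough to be noticed and logged.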

Provide a high level of service, the carrot.  You can make any control or restriction at least tolerable provided that you couch it in a sufficiently high level of service.  Remember that most leakage is of gratuitous copies.  These copies are made to trade off cheap local storage against scarce bandwidth and/or high network latency.  The faster you can deliver data from the source, the fewer copies will be made at the edge.  

Involve multiple people in privileged access and control. System administrators and other privileged users have become favored targets and means for compromising the enterprise. Tools and methods are available to exercise control over them and to provide the necessary accountability. These range from sudo at the Unix system level to Cyber-Ark at the network or enterprise level.
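
A minimal sketch of the dual-control idea, independent of any particular product; the names, the action, and the audit record are illustrative:

    audit_log = []   # who requested, who approved, what ran: the accountability record

    def run_privileged(action, requester: str, approver: str):
        # no single person may both request and approve a privileged action
        if approver == requester:
            raise PermissionError("approval must come from a second person")
        audit_log.append((requester, approver, action.__name__))
        return action()

    def reset_all_passwords():
        ...   # some privileged operation

    run_privileged(reset_all_passwords, requester="alice", approver="bob")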

These measures focus on the data rather than on the technology. They address malice, for example state-sponsored or organized-crime-sponsored commercial espionage, as well as errors and omissions, for example leakage or opportunistic contamination at the edge. They address outsiders, the largest source of attack, and insiders, the largest source of risk.

They are not for those who look for or believe in "magic bullets."  They may be too expensive for all the data in the enterprise but efficient, and perhaps even mandatory, for the most sensitive data.  It is for drawing the line between the classes of data and applying the appropriate measures that we are called professionals and are paid the big bucks. 

2 comments:

  1. Well said. Tools for session recording and privileged-user management are also requirements. Removing global authority, and decomposing and scoping admin rights based on role and tying them directly to identity for full accountability, are also steps that need to be "table stakes" in the new enterprise.

  2. This was well written, and I think it makes sense to recall the data from the perimeter.

    Statistically, no matter how hard we try to protect the thousands of floating data-mine-fields out there, it's inevitable that one will get compromised, come back to the mother ship, and blow it up.

    Thanks for your insight.
