SOPA, H.R. 3261, the Stop Online Piracy Act, is a long and complicated piece of legislation. While I am not a lawyer, I flatter myself that I am literate. Even so, I do not pretend to fully appreciate this law.
A cursory reading suggests that it purports to be aimed at "foreign (DMCA) infringing (Internet) sites." It burdens and punishes US enterprises directly with the intent of indirectly punishing the so-called "foreign infringing sites." Since most of the sales of these foreign sites are to parties outside the US, they are not likely to be punished very much.
The television advertising that urges support of this law suggests that it is about resisting "international piracy." It even suggests that on-line piracy is the moral equivalent of piracy on the high seas. One is led to conclude that this is a national, or at least cyber, security issue, justifying a dramatic increase in the police powers of the state.
If you are an Internet service provider (ISP), an Internet search engine provider, a payment network service, or an Internet advertising service, the law requires you to identify agents to accept legal orders on your behalf and to provide both controls and operators to filter access or revenue to "foreign infringing sites." As users of all of these services, American citizens will be forced to bear the cost. If you use a foreign site so censured, for any purpose, you may be unable to access it.
Surprise! This law is sponsored, written, and supported by the publishing industry, the RIAA and the MPAA, Sony, and Nintendo, in a last-ditch, futile attempt to shore up their broken business model.
Throughout history, every time an advance in technology has reduced the cost of copying, the publishers have used it to reduce their own costs while attempting to maintain their prices and control. When the end-users of their content have used the same technology to force a change in prices, the publishers have cried foul. They have "screamed like stuck pigs," which is, I suggest, an apt metaphor.
Of course, that has happened many times throughout history and multiple times in the last century. It happened with movable type, the linotype, and the photo-offset press. It happened with magnetic tape recording. It happened with plain-paper, dry-process photocopiers and scanners. It happened with the general-purpose digital computer. Let us not forget VHS and the MP3 player. The publishers have tried to outlaw all of these. Each time, the publishers have exploited the technology themselves while trying to use the law to resist its use by others.
Every time, their strategy has failed. Every time, the price of their product has fallen to the point where it approximates the marginal cost of using the technology to copy it. That is why one no longer pays $20 for an album but $0.99 for the tracks that one wants. The irony is that the value of their rights actually increases, because their sales increase and the illegal copying decreases.
Said another way, piracy is a service and pricing problem. For example, I stopped looking for free downloads the day I got iTunes. I use BitTorrent and FrostWire because they are fast, as much as thirty times as fast as FTP, not to access illegal content. Quite candidly, I think it is a tragedy that collaborative networking has been so tainted by its abuse and misuse that we do not use it for legitimate purposes.
To the extent that this law and these controls are used to enforce court orders and injunctions, they would be limited in their potential for abuse. However, the Department of Justice, that is, the Attorney General, is authorized to use them as part of his police powers.
Of course, it is fitting that this expansion of power really accrues to the publishers under the DMCA. After all, they wrote the law, and they have paid for it. As citizens, we should be very suspicious when the interests of the moneyed coincide with those of the politicians, the "control freak" politicians, those who believe that legislation can ensure that hammers hit only nails.
Is it reasonable to believe that this increase in power will not be abused by those who have already abused the power that they have? Not only have the publishers used their power to punish arbitrarily and capriciously, but they have used it to "take down" pages over which they have no rights but whose content offended them. Noting that they have abused their power to take down pages, do we really want to empower them to take down entire domains?
Note that the role of domain registrars is merely to create the binding between a name and an address, a role similar to that of the Post Office or the publisher of the phone directory. Why not pass a law that the Post Office cannot accept mail that contains DMCA contraband?
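The binding that registrars maintain is nothing more than a lookup consulted at run time. A minimal sketch in Python, using the standard library's resolver; the name looked up here is just the local loopback, chosen so the example is self-contained:

```python
import socket

def resolve(name: str) -> list[str]:
    """Map a domain name to its IPv4 addresses, which is the whole
    of DNS's job. A SOPA-style order against a name would merely
    make this lookup fail; the machine behind it would remain
    reachable by its numeric address, just as a party with an
    unlisted phone number can still be called directly.
    """
    infos = socket.getaddrinfo(name, None, family=socket.AF_INET)
    return sorted({info[4][0] for info in infos})

print(resolve("localhost"))
```

Blocking at the directory, as the bill proposes, thus punishes the lookup, not the content.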
The proposed law transfers the burden of proof from the state to the citizen: penalize first, decide, if at all, later. The law provides no defenses against, and no remedies for, such abuses by either the state or the copyright holder. According to the MPAA and the RIAA, there is no such right as "fair use," merely a defense of fair use.
To the extent that the costs and burdens of this law were to be borne by the publishers and their customers, it might be defensible. However, the law places those costs and burdens on the providers of unrelated services and their customers.
As with the USA PATRIOT Act, the intended purposes of this law are likely to be dwarfed by its unintended consequences. Law is a blunt instrument, one that we should resort to only when all else fails. Moreover, this specific law is particularly blunt. Without due process, it permits entire sites to be taken down because of any infringing use.
The House committee with jurisdiction has published a list of dozens of industry supporters of the bill. While roughly half are publishers, some would themselves be subjects of the law. There is also a great deal of popular opposition. I went to YouTube to find the ad promoting the bill but found only videos opposing it. However, while the race is not always to the swift, and the legislation is not always to the RIAA and the MPAA, that is how the smart money bets.
I have tried to be measured in my response to this proposed law; I have tried not to "view with alarm." That said, I am much less sanguine now than when I began my research. If you think that I have failed, I invite you to visit YouTube, where the proposal is covered with vitriol. I particularly commend to you the speech by Cory Doctorow at the Chaos Communication Congress.
As a citizen, I find this law obnoxious and its sponsors greedy and corrupting. As an information security professional, I expect some, not to say much, of its burden to fall on us. But, of course, that is why we are called professionals and are paid the big bucks.
Tuesday, January 10, 2012
Tuesday, January 3, 2012
In the nineties I attempted to mediate an on-line dispute between college students and system administrators that was taking place on American campuses. The students felt that system administrators were over-reacting, exceeding their authority, indeed violating their civil and human rights, in response to trivial and innocent behavior.
The students had grown up in a world of cheap single-user computers, a world in which the boundaries of the system were clear, hard, and embraced nothing that did not belong to its user. The primary applications were trivial, mostly games, and the rules of the game were implicit in the game; if the game would do it, then it was legal, even ethical. One could not cheat at Pac-Man. There was no problem that could not be solved by pressing Ctrl-Alt-Del, system reset, a control that would return the system to a known and stable state.
The administrators had grown up in a world of expensive shared-resource computers, a world in which the boundaries of the user's space were obscure, soft, and where most of the addressable resources did not belong to the user. Applications included those that were essential to the health and continuity of the enterprise; their legal and ethical use required judgment, prudence, and care. Misuse or abuse harmed others; it often destabilized the system and took time and other scarce resources to return it to a stable state.
The students believed that the system was there to support their learning, learning by exploring the world, including the system. The administrators saw such exploration as threatening, rude, and dangerous. The students saw their exploration as innocent and, to the extent that ethics involves how we treat others, as a-ethical, outside ethics altogether. The administrators saw the issue as being about the effect on others, and therefore as essentially ethical.
When the administrators observed what they identified as forbidden behavior, they responded, usually by revoking the system privileges of the students. The students saw any attempt by the administrators to impose order and discipline as an abuse of authority; they needed the system to complete their assignments. Restricting their privileges was the ethical equivalent of denying them access to the library, or even the cafeteria.
Needless to say, mediating the conversation between these two groups was neither fruitful nor satisfying. Not only did the two groups have different ideas about how the world works; they had conflicting, not to say irreconcilable, world views.
While I was sympathetic to the administrators, world views simply are. They are neither correct nor incorrect, good nor bad; they just are. They tend to be generational. The little nuns who taught me were certain that if I could write a pretty Palmer Method hand and add long columns of numbers, I would be guaranteed a living for life. While I was guaranteed a living for life, and while it was based in large part on their efforts, it had little to do with what they believed to be important.
The current generation, the one that our colleague Jim Beeson, CISO, GE Capital Americas, calls the "digital natives," comes to us with yet another world view. For them, the purpose of the network of computers is to facilitate sharing and collaboration, what the media like to call "social networking." They will sacrifice not only enterprise security but also their own personal privacy to this view.
According to a report from the Threat Research group at Cisco, "seven out of 10 young employees frequently ignore IT policies and 67 percent feel the IT policies on social media and personal device usage are outdated and need to be modified to 'address real-life demands for more work flexibility.'"
Like the system administrators of the '90s, young security managers project the world view of their generation onto the next. In their view, Facebook, Twitter, BitTorrent, and user-owned devices look threatening, opportunities to leak and to contaminate.
However, there is a difference between the way things appear and the way they really are, between things that look threatening and things that really are. Most of the students in my tale really were benign, even though their behavior matched a threat profile that the administrators recognized. While Facebook and user-owned devices appear threatening, they may not represent a risk. However, their users do have a different and persistent world view, and with it different attitudes and behavior.
Security managers often respond to what they see as threatening by resisting the technology and the world view of the young. What they ought to do is identify the sensitive data and applications and restrict access as close to them as possible. What they ought to do is layer and compartment the network.
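The layer-and-compartment advice can be sketched abstractly. The zone names and the two permitted flows below are illustrative assumptions, not a prescription; the point is the default-deny stance, under which traffic between compartments is forbidden unless a rule explicitly allows it:

```python
# A toy model of a compartmented network. Every flow between zones
# is denied unless the policy explicitly names it (default deny).
ALLOWED_FLOWS = {
    # (source zone, destination zone): reason the flow is permitted
    ("user", "application"): "published service port only",
    ("application", "database"): "only the app tier touches the data",
}

def permitted(src_zone: str, dst_zone: str) -> bool:
    """Return True only for flows the policy explicitly names."""
    return (src_zone, dst_zone) in ALLOWED_FLOWS

# User-owned devices in the user zone never reach the database
# compartment directly, whatever their owners' world view.
print(permitted("user", "application"))  # True
print(permitted("user", "database"))     # False
```

With the sensitive data fenced this closely, Facebook traffic and user-owned devices in the outer zone can be tolerated rather than fought.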
I do not use Facebook or Twitter, not so much because I see them as threatening as because I value my privacy more highly than the young seem to value theirs. I have a less trusting world view, and in its light I make different choices. Whatever their choices, those choices carry responsibilities. For example, one responsibility that the young are learning the hard way is that they must resist cyber-bullying. It is up to us to help them learn how nice people behave in the world that they are creating. To the extent that the past is a guide, their world view will not last a generation.
It is up to us to achieve our enterprise security objectives in spite of the persistence of the new world view. It is for that that we are called professionals and are paid the big bucks.
Posted by William Hugh Murray, CISSP at 2:39 PM
Monday, January 2, 2012
Security is about Trust
At the RSA Conference in 1997, Jim Barksdale, then CEO of Netscape and late of FedEx, pointed out that if airline safety had remained constant at 1937 levels, the year the DC-3 (C-47, Dakota) came on-line, while traffic rose to 1997 levels, we would be killing people at the rate of two 747s per day. He then asked, "Would you fly?"
His point was not simply that there is an essential level of safety for the acceptance and use of a technology but that there is a necessary level of public trust and confidence that must be sustained. Damage that trust and confidence and the technology will not be used.
Moreover, public trust and confidence are fragile and difficult to sustain. My favorite example is atomic energy. In the late forties and early fifties, proponents of atomic power argued that it would be "too cheap to meter." It did prove to be more expensive than that, but that is not the reason we do not use more of it. It is not that it is unsafe, or even that its safety is difficult to measure. We continue to burn fossil fuels though they are far more dangerous than atomic energy, and we count the bodies in the hundreds per year.
Rather, it is that Three-Mile Island destroyed the necessary public trust and confidence. Three decades later, we have still not succeeded in repairing it. We were getting close when a once-in-a-thousand-year event took place at Fukushima. As a result, Germany, which gets 23% of its energy from nuclear power, shut down six plants and has announced that it will decommission all of its plants over the next decade. While Germany asserts that it can do this while becoming an energy exporter and reducing its carbon emissions, the result of this decision, this at least arguably disproportionate response, is that its use of toxic fossil fuels will increase.
I have argued that information technology is different. First, the public does not see IT as being as intrinsically dangerous as energy or transportation. Second, just as they "feel" safer in an automobile than on an airplane, they "feel" safer in IT, in part because, as in the automobile, they enjoy some local control. We get a pass that we did not earn.
That said, both the government and the media clearly believe that the public is too complacent about IT. Every breach or compromise is widely reported. Both vulnerability and threat are expressed in hyperbolic, not to say alarming, terms like "Cyber War" and even "Cyber Pearl Harbor." While electronic transactions are demonstrably safer than the same transactions on paper, activity like identity theft, which originates mostly in paper, is blamed on IT. Security and safety are used to resist the efficient, not to say necessary and urgent, automation of paper health care records. While the US spends more on international intelligence gathering than the rest of the world combined, the same activity by others is viewed with alarm, and both capability and motive are inferred.
On the other hand, the financial condition of small businesses, non-profits, and municipalities is being damaged by fraudulent use of their on-line banking credentials. We continue to use mag-stripe and PIN for retail payments, an application for which they were never intended and for which they are clearly not safe. We are spending billions of dollars per year to resist "spam" and malicious code. The software industry continues to ship code with implementation-induced vulnerabilities, doing it over and over rather than doing it right the first time. (Safe software is no more difficult than safe airliners; aviation simply does a better job.) The number of records reported breached exceeds the number of people. There is a consensus that the connections between the public networks and other infrastructure leave that infrastructure vulnerable to misuse and abuse. All of these things erode public trust and confidence. Since we do not know where the breaking point is, we need to err on the safe side.
I do not expect a total meltdown like Three-Mile Island. (Pun intended.) Rather, I expect that the use of IT will be resisted at the margins, used less than might be efficient. Health care is the "poster child" for this concern. While it is true that the organization of that industry makes automating it difficult, security and safety concerns, the public's lack of trust, have made it nigh impossible. Those responsible for automating it know that at least part of the public is fearful and would prefer that they fail.
Let me return to Jim Barksdale's analogy to aviation. Like the computer, the airplane began as an expensive toy for a few. They even had their hackers, amateurs who learned by experimentation, those who pushed the envelope of performance, safety, and ethics. As the computer grew up to be "information technology," the plane grew up to be "aviation." Boy, did it grow up. My colleague Dr. Peter Tippett suggests that in the sixty years from 1937 to 1997, aviation safety improved a thousand fold. The first ten-fold improvement came from the aircraft themselves: even those DC-3s still operating, and yes there are some, are ten times safer than they were in 1937.
We got another ten-fold improvement from better flight procedures and pilot training. Peter says: think check-list. A 1937 pilot would say, "Real pilots do not use check-lists." The 1997 pilot would say that all pilots, professional or amateur, use check-lists. One might call this "professionalization." After only fifty years, our hardware and software now have the controls that we need to resist leakage, preserve integrity, and provide transparency and accountability. But even though our professional associations develop and publish check-lists, we are not as professional in using these controls as their existence would suggest.
Finally, we got a ten-fold improvement from the timely reporting and sharing of intelligence: weather, navigation, traffic, maintenance, accident reports. Aviation has a formal system in place, not only for sharing but, in some cases, for ensuring receipt and compliance. In information assurance, in part because we do not trust one another, and particularly because we do not trust government, we do not share well, and we question much of what we see.
I confess that I admire the air transport industry; I look to it when people tell me how difficult safe and secure IT is. I hold up its performance as a standard to which we should aspire. We should emulate engineering that produces planes that can be operated safely. In aviation, safety does not yield to schedule or even to profit. Think of the B-787 and its three-year slip.
We should emulate their training, experience, and professionalism, from pilots to mechanics to those who serve our comfort and safety in the cabin. They have earned and conserve our trust.
We should emulate their collection and use of timely intelligence, a use which manages to get safety information to everyone that can use it while not unnecessarily alarming the public.
As with efficiency, as with infrastructure, everything that we do, or fail to do, increases or decreases the necessary public trust and confidence, a fragile confidence, maintained at a cost, and one which, if broken, we may not be able to repair or replace. It is for this that we are called professionals and are paid the big bucks.
Posted by William Hugh Murray, CISSP at 10:38 AM