A recent report quoted the Director of the FBI as complaining that he has more than 7,000 mobiles for which he has established probable cause to believe that they contain evidence of a crime, but whose security is so good that he cannot open them to be sure. Well, perhaps his emphasis was different from mine, but you get the gist.
Of course, a decade ago he did not have any. The modern mobile has given him a rich source of evidence that he never had before. Instead of saying "thank you," he complains that the source is not even richer than it is. He neglects to say how many mobiles he has opened while finding the few that he cannot. He neglects to address what percentage of those contained useful, much less admissible, evidence of crimes, a number that might give us some idea of the probative value of the contents of the 7,000.
What he is really complaining about is that the default security of these devices raises his cost of investigation. He does not even speak to the resistance to crime that that security provides to the tens of millions of legitimate devices, users, applications, data, and information. Therefore, he cannot even get to the idea that in the absence of such security, there would be fewer devices, users, and applications, much less that his rich source of evidence might not even exist.
He argues that, in order to reduce his cost, the default security of the devices should be reduced. In spite of all the testimony against this proposition, and the absence of any in its favor, he argues that the purveyors of the mobiles can reduce his cost while maintaining the security against all others. Without specifying what would satisfy him, he argues that this is simply a small technical problem that the industry can solve any time it wants to.
While the Director talks in terms of "capability" that he does not have, I talk in terms of "cost." I assert that if one has a cryptogram, the method, and the key, all of which are on the mobile device, then, at some price, one can recover the clear text. Depending upon the design of the device, the cost may be high, but it is finite. The Bureau demonstrated this for us in the San Bernardino case. After asserting that Apple could recover the data but that they could not, they turned to the Israelis, who, for a million dollars, did so. Incidentally, the data proved to be worth considerably less; it provided neither evidence nor intelligence. On the other hand, on a wholesale basis, the cost per device would be significantly less.
One problem is that, whatever the cost, the Bureau prefers to transfer it to the purveyor and the user rather than to just pay it. It hopes to do this by sowing enough fear, uncertainty, and doubt that a law-and-order Congress will pass coercive legislation forcing the uninvolved and unwilling to become arms of law enforcement. If the purveyor is coerced into reducing the security, i.e., the value, of his product, he will lose sales and profit. Remaining users will lose security and privacy, experience costly breaches, and incur costs for compensating controls.
The net is that, while the Director may not be able to read every mobile for which he has a warrant, he can read most of them. While he knows what he cannot read, he bears the burden of proof that reading it would yield evidence or intelligence; he has the data, he must share it. We are not talking about cryptography in general but only about the security of mobile devices. We are not talking about capability but cost. Not so much about how much as about who will pay: will we pay by taxation of all or coercion of a few? The Director may have a case, but he has not made it yet.
Wednesday, February 21, 2018
Monday, February 29, 2016
Encryption and National Security versus Liberty
In the 1990s, in what might be called the first battle of the Crypto War, the government classified encryption as a munition and restricted its export. While opposing export in general, the government was licensing the export of implementations that were restricted to a forty-bit key. Of course, 56-bit was then the norm and, at the time, expensive even for the NSA to crack.
IBM had just purchased Lotus Notes and wanted to export it. In order to get a license, they negotiated an agreement under which they would encrypt 16 bits of the 56-bit message key under a public key provided by the government and attach the result to the message or object. This meant that while the work factor for anyone else would be 56 bits, for the government it would be only 40 bits.
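The arithmetic of that arrangement can be sketched as follows. This is an illustration only: the real design encrypted the 16 escrowed bits under a government-held RSA key, whereas here the escrow step is simulated by simply extracting them, and the function names are my own.

```python
import secrets

KEY_BITS = 56     # full message-key length in the export-approved design
ESCROW_BITS = 16  # bits disclosed to the government via the escrow field

def make_message_key():
    """Generate a random 56-bit message key."""
    return secrets.randbits(KEY_BITS)

def escrow_field(key):
    """The top 16 bits of the key; in the real design these were
    encrypted under a government public key and attached to the
    message. Returning them directly stands in for that RSA step."""
    return key >> (KEY_BITS - ESCROW_BITS)

key = make_message_key()

# With the escrow field, the government's residual search space:
gov_work_factor = 2 ** (KEY_BITS - ESCROW_BITS)   # 2**40
# Everyone else must still search the full key space:
other_work_factor = 2 ** KEY_BITS                 # 2**56
```

The point of the design is visible in the two work factors: the same ciphertext presents a 40-bit problem to the key holder's government and a 56-bit problem to everyone else.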
Viewed today, 40 bit encryption is trivial; twenty years ago it was strong enough that, while the government could read any message that it wanted to, it could not read every message that it wanted to. Said another way, it would be able to do intelligence, or even investigation, but it still would not be able to engage in mass surveillance.
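To put rough numbers on "any message but not every message," assume a hypothetical attacker who can test a billion keys per second; the rate is my assumption, chosen only to make the ratio concrete.

```python
KEYS_PER_SECOND = 1e9  # assumed attacker rate; illustrative only

def exhaust_seconds(bits, rate=KEYS_PER_SECOND):
    """Worst-case time to try every key of the given length."""
    return 2 ** bits / rate

# 40-bit: about 18 minutes -- feasible per message, one warrant at a time.
minutes_40 = exhaust_seconds(40) / 60
# 56-bit: over two years per message -- prohibitive for reading everything.
years_56 = exhaust_seconds(56) / (3600 * 24 * 365)
```

Whatever rate one assumes, the 16-bit gap means each 56-bit message costs about 65,000 times as much as a 40-bit one, which is the difference between targeted access and mass surveillance.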
Moreover, we believed that the NSA only collected traffic that crossed our borders, so that it could not be used against citizens. We believed that the government could keep its private key secure. Of course, after "warrant-less surveillance," the routine breaches of government computers, including those of the NSA, and the exponential growth of computing power over a generation, this all seems very naive.
However, I like to think that it illustrates that it is possible to craft solutions that grant authorized access to the government, with a work factor measured in weeks to months per message, file, device, or key, while presenting all others with a cost of attack measured in decades or even centuries.
It also illustrates the fundamental, application, and implementation-induced limitations of any such scheme, limitations that would have to be compensated for. No such scheme will be fool-proof, nor need it be. Like our other institutions and tools, it need only work well enough for each intended application and environment.
Monday, February 22, 2016
US v. Apple
SUNDAY: Comey tries to downplay the dispute, arguing in his new statement that no precedent would be set if Apple would just go along.
"I hope folks will take a deep breath and stop saying the world is ending, but instead use that breath to talk to each other," he said.
"Although this case is about the innocents attacked in San Bernardino, it does highlight that we have awesome new technology that creates a serious tension between two values we all treasure — privacy and safety," he said, adding:
"We simply want the chance, with a search warrant, to try to guess the terrorist's passcode without the phone essentially self-destructing and without it taking a decade to guess correctly."
This sounds like capitulation to me. If this is now about the "victims," then the government made a serious misstep in attacking Apple in the first place. However, the government's current position does not support a charge of "government over-reach."
The issue of how far the government may go in coercing the unwilling and the uninvolved to assist it in recovering evidence that it is otherwise entitled to is important and needs to be litigated. We should be glad that Apple is prepared to fight it. Perhaps not since Runnymede has the King had a more formidable adversary. However, this is not the right case to fight it on.
There is ample precedent for uninvolved citizens to voluntarily assist the government. It would not be precedent-setting for Apple to voluntarily assist with this one mobile in this one case. Apple should "declare victory and go home." It should do here what it can do and fight the government over-reach issue when the government is more certainly guilty of it.