user-inactivated  ·  2932 days ago  ·  link  ·    ·  parent  ·  post: Why Apple is completely screwed in the FBI/San Bernardino case

    Using the "key" analogy, you're arguing that the courts can compel Apple, the safe manufacturer, to make a key to a customer's safe. That's hardly the same thing as providing an existing key. More than that:

I agree with you here: the key difference is that instead of compelling the living owner of the safe/phone, the government is compelling the manufacturer. I just spent about two hours searching case law and couldn't find a single precedent that even brings the safe manufacturer into the courtroom (I am not a lawyer, I didn't search every case in existence, and I'm not the best at it, but I ran many searches and read many cases just now).

That means this case is new territory for the courts. I still think that, at minimum, the effort and expense the FBI would need to perform this task itself will be brought up, and could be a strong swaying factor in the court's discussion. Other, unrelated laws allow the government to pay industry-standard rates for things (I can't think of examples other than eminent domain laws, which are otherwise irrelevant here), so it could be argued that if Apple can do it for $X while the FBI would need $X + $Y to perform the task itself, the undue burden could be resolved simply through the $X industry-standard payment. It is certainly easier for Apple: they just have to use their existing EFI key to sign the code the FBI wants, which amounts to a simple integer change, rather than the FBI disassembling and reading the chip by hand.
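The "simple integer change plus re-sign" step can be sketched with a toy RSA signature. Everything here is made up for illustration (tiny primes, hypothetical firmware settings); the point is only the trust relationship: the device accepts any firmware whose signature verifies, and only the holder of the private key can produce that signature.

```python
import hashlib

# Toy RSA code-signing sketch. The device trusts any firmware whose
# signature verifies against the public key (e, n) baked into it; only
# the holder of the private exponent d (Apple, in this analogy) can sign.
p, q = 61, 53                        # toy primes; real keys are 2048-bit+
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))    # private signing exponent

def _digest(firmware):
    # Hash the firmware and reduce into the toy modulus.
    return int.from_bytes(hashlib.sha256(firmware).digest(), "big") % n

def sign(firmware):
    return pow(_digest(firmware), d, n)

def verify(firmware, sig):
    return pow(sig, e, n) == _digest(firmware)

# Hypothetical security settings; the FBI's ask is essentially
# flipping integers like these and getting the result signed.
patched = b"retry_delay = 0; wipe_after = 0"

sig = sign(patched)
print(verify(patched, sig))          # -> True: device would accept this build
```

A signature over any other byte string fails verification (with overwhelming probability), which is why the FBI needs Apple's key rather than just the patched code.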

Your argument at least gives me some hope for the case, but I'm still skeptical that Apple will win. The FBI has likely wanted to do this sort of thing for years and has probably prepared an extensive case, and laws can always be stretched to mean whatever you like, since there are so many of them and they are so wordy instead of clear and straightforward like they should be.

    Apple is arguing that the "key" that the FBI is requesting renders all future safes useless.

As I understand it (correct me if I'm wrong), this actually doesn't render future safes useless. I was under the impression (though I have no evidence to support this) that the newest iPhone has already mitigated this threat.

That being said, from a security-design perspective I can't imagine how they could have built a new system that fixes this. Because nearly everyone uses 4-digit PIN codes, as I was saying, the entire encryption model is massively weakened: the secret protecting the data has only 10,000 possible values. That isn't likely to change, either, since very few people are willing to use strong enough secrets.

I do find it strange that the FBI has focused so heavily on the fact that the EFI key is buried deep in the chip, because that key is meaningless if they already have the ciphertext and know the algorithms used. Perhaps it was a good thing the ACLU demonstrated an alternate method, since the FBI keeps fixating on this one approach.

--

I hadn't read the document you linked until after I wrote the above, so treat this as separate. I went and dug up the actual full document, which is available here (40-page ruling):

https://epic.org/amicus/crypto/apple/Orenstein-Order-Apple-iPhone-02292016.pdf

On first look this case could be strikingly different, since it involves a defendant who actually pleaded guilty. That case in particular is the government v. an individual, where the individual is on trial; in FBI v. Apple it's much different. I don't know what that means for this case specifically, but it's worth noting that they weren't attempting to determine guilt. In both cases, though, the government is trying to identify potential co-conspirators. Just a note, since the judge's comments on the laws themselves definitely transcend the particular structure of either case.

Most of the judge's argument here is that the requested order fails the All Writs Act's own requirements as the statute is worded (pg 11):

    The plain text of the statute thus confers on all federal courts the authority to issue orders where three requirements are satisfied:

1. issuance of the writ must be "in aid of" the issuing court's jurisdiction;

2. the type of writ requested must be "necessary or appropriate" to provide such aid to the issuing court's jurisdiction; and

3. the issuance of the writ must be "agreeable to the usages and principles of law"

The part he claims it doesn't satisfy is #3; the first two apparently hold. He then brings in CALEA as the law it conflicts with (pg 17):

CALEA:

    Information services; private networks and interconnection services and facilities.  The requirements of subsection (a) of this section do not apply to –

(A) information services; or

(B) equipment, facilities, or services that support the transport or switching of communications for private networks or for the sole purpose of interconnecting telecommunications carriers.

Apple's argument:

    Under CALEA "information services" means the "offering of a capability for generating, acquiring, storing, transforming, processing, retrieving, utilizing, or making available information via telecommunications," and "includes a service that permits a customer to retrieve stored information from, or file information for storage in, information storage facilities; electronic publishing; and electronic messaging services." [47 U.S.C.] § 1001(6).

    Apple is substantially engaged in developing and offering products that provide such capabilities. For example, Apple’s iTunes service allows customers to purchase, store, and access music, movies, television shows, games and apps via an Internet-connected Apple device, such as an iPhone[.] iTunes thus constitutes an "information service" under CALEA by providing "a capability for ... acquiring, storing ... [and] retrieving ... information via telecommunications." Id. Similarly, iMessage allows Apple customers (connected over the Internet) to communicate by messages sent and received via their iPhone[.]

I'm going completely off topic now, because I think this set of arguments is more important than the iPhone case itself. Think about how CALEA has been used in the past: the NSA used it to compel AT&T/Verizon/etc. (although it sounded like they didn't need much convincing), but it was also used to compel information service providers (Google/Facebook/PRISM). If this argument is in fact upheld (I assume the FBI will take the case up to the Supreme Court; I doubt they'd just accept a loss), it's a major development: it means only telecommunications carriers can be compelled through CALEA.

Think about Lavabit. CALEA was used, IIRC, to compel him to hand his private SSL key to the FBI. It's been used as a bullying tactic against information service providers, since none have fought it hard enough to get a proper ruling. If someone actually did, we might be able to shut this tactic down.

Even more off-topic: if you think Ladar Levison is some sort of hero, think again. He screwed the American people in that case by representing himself without knowing the procedures of the circuit court he appealed to, and essentially entered a null argument for his own case, since the circuit court was not legally allowed to revisit his previous ruling. DO NOT EVER REPRESENT YOURSELF IN COURT!

Bringing it back to the iPhone case: I also want to mention that this particular case is a drug-related offense, not a national-security matter. Many laws (4th and 5th Amendments included) end up being interpreted to have "exceptions" for national security, much as the First Amendment has exceptions to free speech like falsely yelling "fire" in a crowded theater. It's strange and really unnerving, since the 4th and 5th Amendments were clearly written with wartime national security in mind: the American Revolution was, in part, a reaction to British soldiers searching houses and setting up shop in them during wartime.

Just some thoughts. I guess I no longer have a clear-cut opinion on this; I just wanted to respond with some information I found. The only opinion I firmly hold is that being a lawyer or judge would suck, because most of this stuff is weirdly chained together, law to law to facet of another law, an opinion here, no opinion there.

kleinbl00  ·  2932 days ago  ·  link  ·  

I think it comes down to this:

The FBI will get into the phone. That's never really been at issue. The question at hand is how easily they will get into the phone, and what that means for device manufacturers moving forward.

The FBI is arguing that it can compel Apple to cripple encryption after the fact. Apple is arguing it can't. It's a hell of a precedent to set, and most manufacturers have been trying hard not to set it. American manufacturers are already hindered by U.S. laws restricting strong encryption; these don't effectively keep Americans from using it, but they do keep American manufacturers from selling it abroad, giving European and Asian vendors a leg up. Aside from the civil-liberties issues (which I in no way mean to discount), giving American law enforcement agencies the precedent of cracking into any American device however they want, whenever they decide they should, is likely to have a chilling global effect on American technology companies.

And the thing of it is, weak encryption is adequate for most people. They're not looking to protect their information from the NSA, they're trying to keep their credit card info safe from their children. They want to know that if they drop their phone on the street, they have a few hours to wipe it remotely before someone hooks up a dongle to it to brute-force the combo. The type of encryption and the methods of decrypting it aren't really at issue.

It comes down to the government wanting the legal right to compel companies to disable their security features whenever the government says so and Apple, in this case, arguing that complying with the government would be bad for their customers and bad for their bottom line.

And I suspect that Apple will win.

user-inactivated  ·  2932 days ago  ·  link  ·  

    And the thing of it is, weak encryption is adequate for most people. They're not looking to protect their information from the NSA, they're trying to keep their credit card info safe from their children. They want to know that if they drop their phone on the street, they have a few hours to wipe it remotely before someone hooks up a dongle to it to brute-force the combo. The type of encryption and the methods of decrypting it aren't really at issue.

True, but only because if you're a criminal, attacking the encryption is the stupid way to get at someone's data. A dictionary attack on weak passwords, phishing, or some other sort of social engineering is much better. Getting at the data through the user is almost always easier than through the machine.
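The "attack the user, not the cipher" point can be sketched in a few lines: rather than brute-forcing AES, try a short list of common passwords against a leaked hash. The wordlist and the unsalted hash are contrived for the demo; real attacks use huge wordlists and target salted, slow hashes, but the asymmetry is the same:

```python
import hashlib

# A handful of perennially common passwords (real wordlists have millions).
wordlist = ["123456", "password", "letmein", "qwerty", "hunter2"]

# Hypothetical leaked hash (unsalted SHA-256, for brevity).
leaked_hash = hashlib.sha256(b"hunter2").hexdigest()

def dictionary_attack(target_hash, words):
    # Hash each candidate and compare; cost scales with the wordlist,
    # not with the cipher's keyspace.
    for word in words:
        if hashlib.sha256(word.encode()).hexdigest() == target_hash:
            return word
    return None

print(dictionary_attack(leaked_hash, wordlist))   # -> hunter2
```

Five hash computations recovered the secret; brute-forcing a 128-bit key would take 2**128 operations. The weak link is the human-chosen password, not the algorithm.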

But strong encryption is still important because it creates trust. "Even the resources of a nation-state would be insufficient to crack this" engenders trust; "only thieves willing to shell out for some FPGAs can crack this" does not.

user-inactivated  ·  2932 days ago  ·  link  ·  

You're right about nearly everything here, and I don't want to respond to all of it; I just want to make clear one thing that I think you're already aware of. I feel weird adding this disclaimer, but when people quote a single part of a statement and comment on just that, it often looks like they're trying to take down the other person's whole factual platform off one thing, which I'm not. It's strange how even civil back-and-forth quotation between two users can look confrontational.

Anyway, I just wanted to point out:

    the FBI is arguing that they can compel Apple to cripple encryption after the fact. Apple is arguing they can't. It's a hell of a precedent to set, and most manufacturers have been trying hard not to set it.

Technically they aren't crippling encryption after the fact; it was poorly designed from the beginning, and Apple instead relied on tamper-proofing the device itself to protect the data on the phone. It isn't actually possible to cripple encryption after the fact without solving a new research problem against the algorithms themselves. That happens from time to time, but it would be remarkable against some of the most well-researched algorithms, like AES and RSA, which even the NSA can't get around. PRISM and many other programs wouldn't need to exist if they could break AES or RSA (note the "or" rather than "and": in hybrid encryption like SSL/TLS, if one breaks, the other naturally falls with it).
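The "break one and the other falls" property of hybrid encryption can be shown with a deliberately insecure toy: a random session key wrapped with tiny RSA, and the payload encrypted under that session key by a hash-based stream cipher standing in for AES. Every parameter here is made up and nothing is real-world safe; the structure, not the numbers, is the point:

```python
import hashlib
import secrets

# Toy RSA key pair (laughably small primes, so "breaking RSA" here is
# just knowing p and q).
p, q = 1009, 1013
n, e = p * q, 65537
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent

def stream(key, data):
    # XOR data with a SHA-256-based keystream (a stand-in for AES).
    out, ctr = b"", 0
    while len(out) < len(data):
        out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return bytes(a ^ b for a, b in zip(data, out))

k = secrets.randbelow(n)                              # random session key
wrapped = pow(k, e, n)                                # "RSA" layer wraps it
ct = stream(k.to_bytes(4, "big"), b"attack at dawn")  # "AES" layer encrypts

# Break the RSA layer (here we already hold d) and the symmetric layer
# falls with it: recover the session key, then decrypt the payload.
recovered = pow(wrapped, d, n)
print(stream(recovered.to_bytes(4, "big"), ct))       # -> b'attack at dawn'
```

The same collapse works in the other direction: break the symmetric cipher and the RSA wrapping protects nothing, since the data was never encrypted under the RSA key directly.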

    American manufacturers are already hindered by laws against strong encryption in the United States

I'm actually unaware of this; could you say which laws you're referring to? I know of no laws against strong encryption on the books, since basically every project that uses SSL (i.e., basically everything) is using strong cryptography. Every phone or other device manufacturer, simply by shipping OpenSSL, has packaged strong crypto and uses it regularly.

kleinbl00  ·  2932 days ago  ·  link  ·  

    Technically they aren't crippling encryption after the fact, it was poorly designed from the beginning and they instead relied on tamper-proofing the device itself to protect the data on the phone.

See, I'm not sure that distinction matters. It's like this:

- the device shipped with mediocre encryption, protected by tamper-proofing

- stronger encryption is currently available

- the FBI wants an end-run around the tamper-proofing

So whether the FBI wants to cripple "encryption" or "security" is a legal point, to be sure, but the precedent set is all about the after-the-fact part. They want to be able to compel a company to crack open something that was secure. That makes everything that is secure potentially insecure whenever the government can shove a writ through.

You're right - strong encryption will protect you. But there's also the pain-in-the-ass factor: if most people are using weak encryption, then using weak encryption is a great way to blend into the crowd. If everyone uses strong encryption, then using strong encryption becomes anonymous. In the NSA/FBI/CIA/TLA's horror world, everyone shifts to strong encryption, meaning that they can't single people out just by what encryption they're using.

And then they're sure to want the ability to compel Apple to crack that strong encryption, rather than just the weak kind.

    I'm actually unaware of this, could you talk about which laws you are referring to? I know of no strong encryption laws on the books, as basically every project that calls SSL (basically everything) is using strong cryptography.

https://en.wikipedia.org/wiki/Export_of_cryptography_from_the_United_States

http://www.cryptolaw.org/cls-sum.htm