What a Waste

Lots of interesting news at the intersection of privacy and security these days. The ongoing Apple/FBI feud is only a tiny piece of it.

Consider, for example, the case of Paytsar Bkhchadzhyan. It seems that not all locking methods are created equal in the eyes of the law.

Things you know, such as a password, are legally protected: you can’t be forced to give them up because that would infringe on your constitutional right not to testify against yourself.

But things you own, like a PIN fob, or things you are, like a fingerprint, are not protected.

Accordingly, a court has ordered Ms. Bkhchadzhyan to give investigators her fingerprint along with her iPhone. It’s unclear whether they’re holding her fingerprint–and presumably her finger–while searching the phone.

Mind you, there’s still some wiggle room in the legal interpretation. Ars also has a report on a man who’s been held in jail for seven months for refusing to supply the password to decrypt a pair of hard drives.

His lawyer has invoked the Fifth Amendment privilege against self-incrimination, but to date the legal system appears to believe that the now-infamous All Writs Act–the same law the FBI was trying to use against Apple–supersedes the Constitution.

So, pending the result of the current appeal, using a passcode doesn’t seem much safer than a fingerprint.

Not all the news is bad, however. In a case that will mostly be of interest to residents of Washington State, a King County judge has ruled that sanitation workers cannot dig through trash while collecting it.

Seattle required workers to inspect trash to ensure that food waste went into compost bins rather than the garbage. However, the judge held that this amounted to a warrantless search, forbidden under the privacy provisions of the Washington State Constitution.

It’s a minor victory for privacy, yes. And sanitation workers can–and will–still check for compostable materials “in plain view.” But at least they won’t be able to open garbage bags and dig through them checking for compliance.

We’ll take our victories where we can find them.

Here We Go Again, Again

More proof that magical thinking knows no political boundaries.

The latest attempt to deny reality comes from Senators Dianne Feinstein and Richard Burr, a Democrat from California and a Republican from North Carolina*, respectively.

* What’s up with lawmakers from North Carolina these days? Are they totally devoid of common sense?

Their joint venture, the delightfully named “Compliance with Court Orders Act of 2016” bill, is intended to force anyone–make that ANYONE–who makes hardware or software that allows communication or data storage to ensure that law enforcement officials have access.

If the bill becomes law and you offer communication, be it of voice or data, you must provide a way for the police, FBI, NSA, and any other federal, state, or local agency to listen in.

Similarly, if you store data or provide equipment for storing data, you’ll have to include a method for all those groups to bypass whatever encryption you offer.

Obligatory disclaimer: I’m not a lawyer.

That said, the bill is written so broadly that I can see it being applied to pencil and paper. Make up your own code to protect the notes you’re taking for a blog post on corruption in your city government? If the local police department gets a local judge to agree, you’ll be forced to decode your notes.

Maybe I’m wrong about that. Another interpretation would be that the companies that made your pencil and paper might be required to break your code. I really doubt that’s a valid interpretation, but I also really doubt that someone wouldn’t try it.

Hopefully I’m wrong about both of those interpretations, but it’s pretty clear that if you store your notes on a computer, the hardware manufacturer, the OS maker, and the company that wrote the software you used would all be required to make your notes available to the cops when they come knocking. For that matter, if you upload the notes from your notebook to a server somewhere, the server owner and the company that made your network software (e.g., your web browser) are also on the hook.

That’s bad enough, but all of those manufacturers would also be required to obey all existing laws protecting your privacy and security. In other words, that access can only be available to law enforcement, and only when they have a court order in hand.

We’ve talked about this before. As I said then, “…there’s no such thing as a backdoor that can only be used by authorized people. If there’s a way to bypass or remove encryption, crackers–independent, criminal-sponsored, and government-sponsored–will find a way to use it.” That’s still true. That will always be true.

Fortunately, President Obama has already said he won’t support the bill, and Senator Ron Wyden–probably the strongest proponent of privacy in the Senate today–has promised to filibuster any attempt to enact it.

But as the Republican leadership keeps reminding us, there’s an election coming up. We can’t count on Senator Wyden remaining on the job forever, nor can we trust that the next president–whichever party he or she belongs to–will be as sensible as President Obama.

I’m not suggesting that anyone should make this fall’s elections a single-issue matter. But if you don’t at least consider your candidates’ positions–all of your candidates at all levels–on privacy in general and encryption in particular, you’re doing yourself a grave injustice.

A Bushel of Apples

Yesterday, of course, was Apple Day. Not only did Apple announce new products, but there’s been an interesting development in the battle over encryption.

Let’s start with the new goodies.

Nothing really new for the Apple Watch–unless you like changing the band. We’ve got a set of woven nylon bands coming in a variety of colors. Forgive me if I find that less than enthralling.

Apple TV gets an OS update to include, among other things, Live Photos support. I guess that explains why Apple has been running iPhone commercials focusing on Live Photos recently. (To refresh the memory of those of you who don’t have an iPhone, Live Photos are short, looped movies: you take a photo, and it moves. Basically, it’s the high-resolution, high-color version of an animated GIF.)

There’s a new iPhone coming, the SE. Hardware is similar to the 6S, but with a four-inch screen. Consider it a 6S in a 5S form-factor. Kudos to Apple for catering to those of us who think holding a six-inch slab of glass and metal up to our ears is pretty darn silly.

And on a similar note, we’re also getting a smaller iPad Pro. I’m a little dubious about that. I’ve tried using my Nexus 9 for serious work (writing, naturally), and found it a bit cramped. I have to think the new iPad Pro would be similarly constrained. And let’s not even think about typing on a keyboard scaled down to be the cover for a 9.7 inch screen. I still remember trying to type on a netbook. It wasn’t fun.

There’s a new iOS, of course. 9.3 brings us “Night Shift”. It knows when local sunset is, and starts removing blue tones from the display. Everyone seems to be going nuts for this idea that limiting blue light in the evening will help you sleep better. If I’m not mistaken, all of the excitement comes from a single study that hasn’t been replicated yet, and I have to wonder just how over-hyped the findings are. But in any case, if my iPad starts removing blue tones from videos after dark, I’m going to lose sleep, because I’ll be too busy swearing at it (hint: removing the blues from the Mariners’ uniforms is going to leave them looking peculiar). (Later note: Yes, it can be turned off or changed to a clock-based schedule instead of following the sun.)

Finally, there’s a new framework for application development, CareKit. It builds on last year’s ResearchKit, which is designed to help create medical research applications. CareKit is for apps to help individuals with medical needs. Examples mentioned at the Apple event include post-surgery recovery and monitoring of Parkinson’s Disease. Although they didn’t say so, I suspect that it’s closely tied into the HealthKit framework for fitness apps.

It sounds like there are some interesting app possibilities in CareKit, but there are some significant privacy implications as well. Which, of course, brings us to Apple’s squabble with the FBI.

During the Apple event, Tim Cook reiterated Apple’s belief that they “have a responsibility to help you protect your data and protect your privacy.” In other words, Apple would not give in and obey the court order to write a crippled version of iOS for law enforcement.

Shortly after that, the FBI asked the judge in the case to cancel today’s hearing, saying that they believe they have a way to break into the phone in question without Apple’s help, and they want time to test their method.

It’s unclear where they got the technique. The NSA, perhaps? In any case, if the idea proves out, I imagine they’ll drop the case against Apple, rather than risk a precedent being set that would prevent them from making similar demands for backdoors in the future. And, no doubt, the next version of iOS will include a fix for whatever bug allows the FBI access to the phone.

Stay tuned for free baseball!

Crack!

A federal court has made it official. We knew it was coming, but I don’t think any of us expected it to arrive this promptly. Now we know: as far as the Federal Government is concerned, your right to “life, liberty, and the purfuit of happineff” doesn’t include privacy.

I’m not going to write about it at length. It’s a rainy day, the turkeys are arguing about something incomprehensible outside my window, and I already said most of what I think last Tuesday. Why should I take out my frustration on you?

Bottom line: it’s still worth the time it takes to encrypt your electronic devices, but not by as much as it was last week. And don’t expect it to do you any good if any police officer anywhere takes an interest in you for any reason.

If you want any detail, go read Ars’ take on the news.

Then you can come back here for something slightly more cheerful.


Back? OK, good.

Baseball is back!

OK, OK, so far it’s just pitchers and catchers reporting to Spring Training, but we’ll take it. Position players will be showing up over the next week, and we can look forward to the usual slew of articles telling us which athletes are in “the best shape of their lives” and which ones let themselves go over the off-season.

More importantly, we’re less than two weeks away from the first Spring Training game–as previously noted, between the Phillies and the University of Tampa Spartans*–and that means it’s time to start warming up your MLB app for the season’s radio and TV broadcasts.

* I’ll skip the jokes about “picking on someone your own size,” mostly because I’m not sure who those jokes should be aimed at.

There’s some good news about MLB.TV, too. According to the renewal reminder I received a couple of days ago, the full-season package is $20 cheaper than last year. Even better, if you’re only interested in one team, you can get a “Single Team Package” for $25 less than the regular package.

A price drop? Customer-friendly features? Is anyone surprised that the changes are the result of a lawsuit?

To nobody’s surprise, the changes are part of a legal settlement. In essence, MLB agreed to lower the price of the “Premium” package and introduce the “Single Team Package” to avoid the risk of going to trial and potentially being forced to modify their obnoxious blackout policy.

The Single Team Package is only available to out-of-market fans–Giants fans in the Bay Area, for example, can’t buy the package to follow their team unless they can prove to MLB that they can’t get satellite or cable TV in their home. That’s “can’t get,” not “don’t want”.

As in years past, out-of-market teams’ games against in-market teams will be blacked out. So if our hypothetical Giants fan moves to LA, he can watch the Giants via either a Single Team or Premium package, except when the Giants are playing the Dodgers or Angels–even if the game is in SF.

Interestingly, MLB.TV is offering a limited exception to the blackout rule*. For $10, our Giants fan can also watch the Giants’ broadcasts when they play the Dodgers and Angels. But he’s out of luck if he’s also an A’s fan: the exemption is only good for a single team.

There are also a couple of significant limitations on who can purchase the add-on. It can’t be added to a Single Team Package, only the full Premium Package, and it’s only available if the fan subscribes to Comcast cable or DIRECTV satellite service with a package that includes the local teams’ broadcasts. If our Giants fan has satellite service from DISH, or if Comcast drops the Dodgers’ games, he’s SOL.

* This is, IMNSHO, the most significant change MLB agreed to in the settlement. It’s the first, faint hint that MLB might be willing to think about considering the possibility of down-scaling the tight relationship with their BigMedia sponsors.
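If keeping all of those conditions straight makes your head hurt, you’re not alone. Here’s a rough Python sketch of the viewing rules as I understand them from the settlement coverage. It’s a hypothetical model I put together to organize the rules above–the field names and data structures are mine–and it bears no resemblance to MLB’s actual systems.

```python
def can_watch(game, fan):
    """Sketch of the MLB.TV viewing rules described above.

    `game` is a (home, away) pair of team names; `fan` is a dict describing
    the subscriber. Hypothetical model for illustration only.
    """
    teams = set(game)
    in_market = set(fan["in_market_teams"])  # e.g. Dodgers and Angels for an LA resident

    # Games involving an in-market team are blacked out...
    if teams & in_market:
        # ...unless the fan bought the $10 add-on, which covers only the one
        # followed team, requires the full Premium package, and requires a
        # qualifying Comcast or DIRECTV subscription.
        return bool(
            fan.get("has_follow_addon")
            and fan["package"] == "premium"
            and fan.get("tv_provider") in {"Comcast", "DIRECTV"}
            and fan["followed_team"] in teams
        )

    # Out-of-market games: Premium shows everything; the Single Team
    # Package shows only the followed team's games.
    if fan["package"] == "premium":
        return True
    return fan["followed_team"] in teams


# Our hypothetical Giants fan in LA, with the Single Team Package:
la_giants_fan = {
    "in_market_teams": ["Dodgers", "Angels"],
    "package": "single_team",
    "followed_team": "Giants",
}
print(can_watch(("Giants", "Padres"), la_giants_fan))   # True
print(can_watch(("Dodgers", "Giants"), la_giants_fan))  # False: blacked out
```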

So, all in all, the good news is limited. But fans are certainly no worse off than they were last year, with faint hints of improvement ahead. In today’s climate of lowered expectations, that has to count as a victory.

Signs of Intelligence

Hooray for Representatives Ted Lieu and Blake Farenthold.

Yesterday they introduced a bill in Congress that would prevent any state (or smaller political unit) from requiring encryption backdoors. And yes, their bill is a direct response to the proposed legislation in California and New York that I complained about on Tuesday.

Note, by the way, that Rep. Farenthold is a Republican. The bill faces an uphill battle, and early bipartisan support certainly won’t hurt its chances.

That said, strictly speaking the “ENCRYPT Act of 2016” wouldn’t actually block the proposed laws in California and New York. Those don’t require smartphone manufacturers to include backdoors, they just ban the sale of phones without backdoors in their respective states. I suppose it’s possible that Apple and Google could pull their phones out of the California and New York markets. It’s also possible they could produce a “vulnerable” OS version for sale in those states. But I suspect that just the threat of suspending sales would bring in enough muscle from the telecom companies to squash the bills.

Seriously, can you imagine Verizon, Sprint, T-Mobile, and the rest quietly accepting laws that would prevent them from selling iPhones? They might–might–let Android go, but not iOS.

Note, by the way, that I didn’t include AT&T on that list. As security guru Bruce Schneier pointed out, AT&T CEO Randall Stephenson says that tech companies shouldn’t be in the position of deciding whether to include encryption, with or without backdoors, in their products.

Schneier suggests that the NSA and FBI are steering policy at AT&T; whether he’s right or not, I do have to wonder if the prospect of losing Apple sales in two enormous markets would change Stephenson’s mind.

Stay tuned–and drop your representative a note asking them to support Lieu and Farenthold’s bill.

Meanwhile, Warner/Chappell is giving up their efforts to hold onto the copyright to “Happy Birthday”. According to The Hollywood Reporter, Warners has agreed to a settlement that would release the song to the public domain. An agreement to request a judicial declaration that a work is in the public domain is unusual, to say the least, so it’s possible that the settlement might fall apart at a hearing in March.

But Warner’s decision–that the potential income from the song over the next fifteen years wasn’t worth the risk of being penalized for improperly collecting licensing fees had they lost the class action lawsuit–shows rare intelligence from a big media company, a group best known for aggressively hoarding copyrights.

Take Two

It seems that the debate over encryption technology is entering a new phase.

The argument that law enforcement needs a magical* backdoor into every piece of encryption software “to fight terrorism” isn’t making enough headway to satisfy our political masters. If an article by Melody Gutierrez in today’s Chron (only available behind the paywall on the website) is any indication, the new argument is our old friend “think of the children”.

* I don’t need to explain this, do I? Oh, all right. For the benefit of anyone who’s been aggressively not paying attention: there’s no such thing as a backdoor that can only be used by authorized people. If there’s a way to bypass or remove encryption, crackers–independent, criminal-sponsored, and government-sponsored–will find a way to use it.

As Ms. Gutierrez puts it, “lawmakers and law enforcement groups are pushing bills [in California and New York] to enable investigators to unscramble data […] in human trafficking, terrorism and child pornography cases.”

Worthy causes, certainly, but does anyone really believe it would stop there? Even leaving aside the impossibility of limiting access to the decryption tools to the “right” people, once the capability is introduced, there would be an immediate push to expand the categories of crimes it could be used for.

And the emotional appeals are so over-the-top the article almost reads like a parody. Consider this quote from Assemblyman Jim Cooper, who wrote the bill being considered in California:

“If your kid goes to meet someone and your kid disappears and we find the phone, right now–today, there is no way for us to find out who they were last texting, who they were talking to unless you have the pass code to get in.”

Really, Jim? You haven’t heard of “metadata”? The records of who you call and who you text that the NSA has been using for years to build profiles on Americans who have no connection to terrorism? You don’t even need to “find the phone” to get that information. You just need the kid’s phone number!

Be honest, Jim. You know full well that what you’re talking about is the content of those texts, which are stored on the phones. You even admit it:

“The biggest thing is the information we want to get is from your phone at rest, not information traveling over the airwaves. This won’t affect 99.9 percent of the public.”

Yes, Jim, it will affect that 99.9%. Again, anything that weakens encryption for some people weakens it for everyone.

Feh! You can’t trust the company that makes your device to keep your data safe, and you can’t trust the government either. My advice is to make sure that your devices are encrypted now, while it’s still legal for companies to sell them without government-mandated backdoors. Turn the encryption on now (instructions below) and be prepared to refuse any future OS update that introduces a backdoor. Oh, and don’t forget to encrypt your backups!


On iOS 8 and above, you enable the device encryption by turning on the passcode feature. Go to “Settings | Touch ID & Passcode” (or “Settings | Passcode” if your phone or tablet doesn’t have Touch ID hardware). Tap “Turn Passcode On”. Tadaa!

Android is a little more time-consuming. Google advises you to set aside at least an hour for the encryption. (These instructions apply primarily to Nexus devices; non-Google devices may differ to a greater or lesser degree.) Once you have the time blocked out, set a lock screen pin, pattern, or password. Plug in the charger (encrypting the entire device will burn through your battery like crazy). Then go to “Settings | Security | Encrypt phone” (or “Encrypt tablet”) and follow the instructions.
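If you’d like to double-check that the encryption actually took, and you don’t mind a little command-line work, here’s a minimal sketch. It assumes you have Google’s adb tool installed and USB debugging enabled on the device, and it reads the ro.crypto.state system property, which reports “encrypted” or “unencrypted” on Nexus-era devices; other devices may behave differently.

```python
import subprocess


def android_encryption_state() -> str:
    """Ask a USB-connected Android device whether its storage is encrypted.

    Assumes the adb tool from the Android SDK is on your PATH and that USB
    debugging is enabled on the device. On Nexus-era devices the
    ro.crypto.state property reads "encrypted" or "unencrypted"; other
    devices may report something else (or nothing at all).
    """
    result = subprocess.run(
        ["adb", "shell", "getprop", "ro.crypto.state"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()


if __name__ == "__main__":
    print("Device reports:", android_encryption_state())
```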

Paranoia

Paranoia is a normal operating mode for the human mind.

Sad, but true. Case in point: I was sick recently. The details aren’t important for this discussion, so I’ll spare you the unpleasantness. What is important is my immediate reaction to the first symptoms: “Why is this happening to me?”

The answer that came to mind was “Because the Universe is out to get me.” Not very helpful, huh? Before you laugh, though, ask yourself if you would have answered any differently the last time you found yourself spewing unpleasant material from one orifice or another. Admit it: that’s the universal answer, isn’t it?

Why do we default to the paranoid response? It’s a survival tactic. It puts us in the proper mind-space to answer the next question: “How did the Universe attack me?”

We examine everything microscopically: “Did someone at the theater sneeze on me?” “Was dinner undercooked?” “Did that glass of elderberry wine those old ladies gave me taste like bitter almond?”

When we look that closely at everything around us and everything we do, we’re certain to arrive at the right answer: “It’s a miracle anyone lives long enough to celebrate their first birthday.”

Pardon me. Still a little paranoid, I guess. Obviously, the correct answer is going to vary. In my case it turned out to be “I don’t have the faintest idea how I got sick.”

Maybe paranoia was a more viable technique a few thousand years ago when there were fewer possible answers to the kinds of questions we try to answer paranoically.

Joking aside, paranoia is a useful technique. When you come right down to it, QA and defensive design are examples of systematic paranoia: you examine your subject minutely and ask yourself “What are all the ways this program could try to kill me?” and then you design tests or write code to handle all of those scenarios.
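To make that concrete, here’s a toy example of the mindset–my own illustration, not code from any real project. It’s a function that parses a TCP port number on the assumption that its input is out to get it, plus the tests that enumerate the same list of dangers.

```python
import unittest


def parse_port(value: str) -> int:
    """Parse a TCP port number, paranoid-style: assume the input is hostile."""
    # Danger 1: the caller handed us something that isn't a string at all.
    if not isinstance(value, str):
        raise TypeError(f"expected a string, got {type(value).__name__}")
    # Danger 2: whitespace, empty strings, or non-numeric junk.
    value = value.strip()
    if not value.isdigit():
        raise ValueError(f"not a number: {value!r}")
    port = int(value)
    # Danger 3: a number, but not a legal port.
    if not 1 <= port <= 65535:
        raise ValueError(f"port out of range: {port}")
    return port


class ParsePortTests(unittest.TestCase):
    """The tests walk the same list of dangers, one by one."""

    def test_valid(self):
        self.assertEqual(parse_port(" 8080 "), 8080)

    def test_junk(self):
        for bad in ["", "abc", "-1", "70000", "80.5"]:
            with self.assertRaises(ValueError):
                parse_port(bad)


if __name__ == "__main__":
    unittest.main()
```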

Paranoia gets you into trouble when it’s the only mode of operation you’ve got, because paranoia doesn’t deal with likelihood. In the paranoid approach, everything is either a risk or it isn’t. That old lady on the bus is just as likely to attack you with her cane as the guy sitting next to her, fondling the AK-47 in his lap*.

* What, people don’t routinely carry automatic weapons on the bus where you live? Are you sure? Have you checked lately?

Once you have your list of dangers identified, you need to turn off the paranoia and figure out how likely they are to bite your ass.

Need some practice? Allow me to suggest several questions that really, really need an application of non-paranoid risk assessment:

  • What to do about Syrian refugees–or any other refugees, for that matter.
  • Whether to require all encryption software to have a government backdoor.
  • Whether to drive across the Bay Bridge.

Have fun!