As promised, here’s my usual cynical rundown of all the exciting things Google announced in the I/O keynote. As usual, thanks to Ars for the live stream.
Looks like a great year ahead, doesn’t it? See you Thursday.
Okay, okay. I just had to get that out of my system.
First up, Sundar admitted to Google’s well-publicized failures with the cheeseburger and beer emojis. It’s great that they’ve been fixed and that Google has apologized publicly. But when are they going to apologize for their role in inflicting emojis on us in the first place?
Google has been testing their AI’s ability to diagnose and predict diabetic retinopathy and other health conditions. I’m hoping this is not being done via smartphone. Or, if it is, it’s fully disclosed and opt-in. I’m quite happy with my medical professional, thanks, and I really don’t want my phone to suddenly pop up a notification, “Hey, I think you should see an ophthalmologist ASAP. Want me to book you an appointment?”
I do like the keyboard that accepts morse code input. That’s a nice accessibility win that doesn’t have any glaring detrimental impact on people who don’t need it.
That said, I’m less enthusiastic about “Smart Compose”. I’m not going to turn over writing duties to any AI. Not even in email.
But I do have to wonder: will it improve the grammar and vocabulary of the typical Internet troll, or will it learn to predict the user's preferences and over time start composing death threats with misspellings, incoherent grammar, and repetitive profanity? Remember what happened with Microsoft's conversational AI.
And I’ve got mixed feelings about the AI-based features coming to Google Photos. I pointed out the privacy concerns about offering to share photos with the people in them when Google mentioned it last year. Now they’re going to offer the ability to colorize black and white photos. Didn’t Ted Turner get into trouble for doing something of the sort?
More to the point, how many smartphones have black and white cameras? Taking a B&W photo is a conscious decision these days. Why would you want Google to colorize it for you?
Fixing the brightness of a dark photo, though, I could totally get behind.
Google Assistant is getting six new voices, including John Legend’s. Anyone remember when adding new voices to your GPS was the Hot Thing?
More usefully, it’ll remain active for a few seconds after you ask a question so you don’t have to say “Hey, Google,” again. Which is great, as long as it doesn’t keep listening too long.
That said, it’ll help with continuing conversations, where you ask a series of questions or give a sequence of commands; for example, looking up flights, narrowing down the list, and booking tickets.
And, of course, they’re rolling out the obligatory “teach little kids manners by forcing them to say please” module. If it starts responding to “Thank you,” with “No problem,” I will make it my life mission to destroy Google and all its works.
Smart displays (basically, Google Home with a screen) will start coming out in July. I can see the utility in some areas, but I’m not going to be getting one. On the other hand, I haven’t gotten a screenless GH, nor have I enabled Google Assistant on my phone. I just don’t want anything with a network connection listening to me all the time. But if you’re okay with that, you probably ought to look into the smart displays. They significantly extend the functionality of the home assistant technology.
Good grief! You thought I was joking about your phone offering to make a medical appointment for you? Google isn’t. They’re going to be rolling out experimental tech to do exactly that: your phone will call the doctor’s office and talk to the receptionist on your behalf.
Not just no. Not just hell no. Fuck no! No piece of AI is going to understand my personal constraints about acceptable days and times, the need to coordinate with Maggie’s schedule, and not blocking my best writing times.
Google is rolling out a “digital wellbeing initiative” to encourage users to get off the phone and spend time with human beings.
Just not, apparently, receptionists and customer service representatives.
It’s a worthy cause, but let’s face it: the people who would benefit most won’t use it, either because they don’t recognize the problem, or because being connected 24/7 is a condition of employment. I’m sure I’m not the first to point out that Google employees are likely to be among the most in need of the technology and the least likely to use it.
The new Google News app will use your evolving profile to show you news stories it predicts will interest you. No word on whether it’ll include any attempts to present multiple viewpoints on hot-button topics, or if it’ll just do its best to keep users in their familiar silos. Yes, they do say it’ll give coverage “from multiple sources,” but how much is that worth if all the sources have the same political biases based on your history of searches? Let’s not forget that Google’s current apps with similar functionality allow you to turn off any news source.
Android P (and, as usual, we won’t find out what the P dessert is until the OS is released) will learn your usage patterns so it can be more aggressive about shutting down apps you don’t use.
It’ll offer “App Actions” so you can go straight from the home screen to the function you want instead of launching the app and navigating through it.
Developers can export some of their content to appear in other apps, including your Google searches.
The AI and machine learning functionality will be accessible to developers. Aren’t you thrilled to know that Uber will be able to learn your preferences and proactively offer you a ride to the theater?
And, of course, the much-ballyhooed navigation designed for a single thumb. The “recent apps” button will go away and the “Back” button will only appear when Android thinks it’s needed. And some functionality will be accessible via swipes starting at the “Home” button. Because the “Back” button wasn’t confusing enough already.
I do like the sound of a “shush” mode that triggers when you put the phone face down. I’m using a third-party app to do that with my phone now. Very handy when you want to be able to check in periodically, but don’t want to be interrupted. Sure, you can set the phone to silent, but putting it face down is faster and you don’t have to remember to turn notifications back on.
On to Google Maps.
It’s going to start letting you know about hot and trending places near you and rate them according to how good a fit they are for you. I’ve got serious questions about how well that’s going to work, given the number of times Google’s guessed wrong about which business I’m visiting. If they start telling me about popular Chinese restaurants because there’s a Panda Express next door to the library, I’m gonna be really peeved.
Oh, and businesses will be able to promote themselves in your personalized recommendations. How delightful. Thanks, Google!
Okay, the new walking navigation sounds useful. Hopefully it will learn how quickly you walk so it can give reasonably accurate travel time estimates, and offer a way to make accommodations for users with disabilities.
Of course, if you don’t want to walk, Google (well, Waymo) will be happy to drive you. Their self-driving program will launch in Phoenix sometime this year. Which seems like a good choice, since they’re unlikely to have to deal with snow this winter.
I guess people in Phoenix will be getting a real preview of Google’s future. Not only will their phones preemptively book their medical appointments, but they’ll also schedule a self-driving car to get them there. Will they also send someone along to help you put on the stylish white jacket with extra-long sleeves and ensure you get into the nice car?