A final thought on Spectre and Meltdown: while you’re updating your systems, don’t forget about your video cards. Modern cards have powerful processors of their own. Even if the card itself isn’t vulnerable, there could be exploitable interactions between the video card and the main CPU. Nvidia is currently releasing new drivers that eliminate at least one such vulnerability.
Moving on.
In the latest sign of the impending Collapse of Civilization, a couple of Apple’s shareholders, the California State Teachers’ Retirement System and Jana Partners, are demanding that Apple modify its products to avoid hurting children.
Let that sink in for a moment.
Okay, ready to continue. Yes, there is evidence that overuse of smartphones (or, I suspect more accurately, apps) can result in feelings of isolation, anxiety, and depression. But the key word there is “overuse”.
The groups say that because the iPhone is so successful, Apple has a responsibility to ensure its devices aren’t abused.
Apparently, less-successful companies don’t have a similar responsibility to their users, but leave that aside.
Apple certainly doesn’t have an unfulfilled legal responsibility here. So I’m assuming the groups believe Apple’s responsibility is moral. The same moral responsibility that forces companies that make alcoholic beverages to make them less attractive to teenagers and to promote them in ways that don’t make them seem cool. Ditto for the companies that make smoking and smokeless tobacco products, automobiles, and guns.
There are bigger, more important targets for Jana Partners and CalSTRS to go after, in other words. But leave that aside too.
What their argument seems to boil down to is that Apple isn’t doing enough to protect the children who use their devices.
Keep in mind that currently parents can set restrictions in iOS to limit which apps kids can use (including locking them into one specific app) and to require parental approval to buy apps or make in-app purchases.
The groups’ letter asks that Apple implement even finer degrees of control, so that parents can lock out specific parts of apps while allowing access to others.
Technically, that could be done, but it would be a programming and testing nightmare, and it would make customer support even more hellish than it already is. Every app would have to be modularized far more completely than apps are now. That often results in apps getting larger and more complicated as critical functionality gets duplicated across the app, because developers can’t count on being able to invoke it from another module.
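To give a rough sense of what that kind of per-feature gating would look like inside an app, here’s a minimal Swift sketch. Everything in it is hypothetical: iOS exposes no ParentalPolicy or Feature API like this today, and the school-hours check is a crude stand-in for whatever scheduling a real system would need. The point is that every module in every app would have to consult something like it before doing anything.

```swift
// Hypothetical sketch only: iOS has no per-feature parental-restriction API like this.
// ParentalPolicy, Feature, and the school-hours window are invented for illustration.
import Foundation

/// Features a parent might want to lock out individually.
enum Feature: String {
    case messaging, camera, inAppPurchase, webBrowsing
}

/// A made-up policy object standing in for the fine-grained controls
/// the shareholder letter imagines Apple shipping.
struct ParentalPolicy {
    var blockedFeatures: Set<Feature> = []
    var schoolHoursBlocked: Set<Feature> = []

    /// Every module would need to call something like this before
    /// exposing its functionality.
    func allows(_ feature: Feature, at date: Date = Date()) -> Bool {
        if blockedFeatures.contains(feature) { return false }
        let hour = Calendar.current.component(.hour, from: date)
        let isSchoolHours = (8...15).contains(hour)   // crude stand-in for a real schedule
        if isSchoolHours && schoolHoursBlocked.contains(feature) { return false }
        return true
    }
}

// Usage: a messaging module checking the policy before sending.
let policy = ParentalPolicy(blockedFeatures: [.inAppPurchase],
                            schoolHoursBlocked: [.messaging])
if policy.allows(.messaging) {
    print("Send the message")
} else {
    print("Messaging is locked by parental controls")
}
```

Even in this toy version, the messaging code can’t just send a message; it has to know about the policy, which is exactly the kind of coupling and duplication the paragraph above describes.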
And just how fine-grained would it have to be? Could a parent prevent their kids from, say, messaging anyone with certain words in their user name, or would the only option be to block messaging entirely? Would Apple have to implement time-based or location-based restrictions so certain parent- or teacher-selected functions couldn’t be used at school?
How about a camera restriction that prevents teenagers from taking pictures of anyone under the age of eighteen? That’ll stop sexting dead in its tracks, right?
The groups’ other suggestion is that Apple implement notifications to remind parents to talk about device usage with their kids.
Sorry, but if the parents aren’t already paying attention to what their offspring are doing on their phones, popups aren’t going to suddenly make them behave responsibly.
And that’s really where the responsibility lies: with parents. Responsible parents don’t buy their underage children booze and smokes, they don’t let their kids get behind the wheel on I-5 before they have a driver’s license, and they don’t leave their guns where their rugrats can get to them.
It’s a good goal, guys, but the wrong approach.