The new top-of-the-range iPhone does away with the home button and its built-in fingerprint reader in favor of a new biometric -- called Face ID -- which uses a 3D scan of the user's face to authenticate and unlock the device. It also replaces Touch ID for Apple Pay.
Apple suggests this is an advancement over a fingerprint reader because it's an easier and more natural action for the user to perform -- you just look at the phone and it unlocks; no need to worry if you have wet fingers and so on. Apple is working the convenience angle hard.
However, gating the smorgasbord of personal content that lives on a smartphone behind a face biometric inevitably raises lots of security questions.
And of course there's already a mountain of high-pitched Twitter chatter on the topic, including speculation about whether the face of someone who is dead or sleeping, or otherwise unwilling to unlock their device in your presence, could be used to do so against their will.
This is exacerbated by existing face unlock systems on smartphones having a dire reputation.
A different facial recognition unlock feature used by Samsung has, for example, been shown to be fooled with just a photo of the face in question -- making it laughably insecure in a digital era where selfies are traded publicly as the standard social communication currency...
Using your face as an authentication factor is a very bad idea, Part 383855 of an endless series. https://t.co/i815zU8dAd
— Eva (@evacide) September 7, 2017
Not to single Samsung out here. Android had a face unlock feature that could be just as easily spoofed way back in 2011. Even a subsequent version of Android Face Unlock, which required users to blink before it would unlock and give up its secrets, was shown to be conquerable with a sly bit of photoshopping.
However it's clear that Apple has poured both a lot more hardcore technology and a lot more thought into trying to put its implementation of facial biometrics on a more solid footing.
The iPhone X's camera is not just looking for a 2D image of a face; the sensor-packed notch at the top of the device includes a dot projector, flood illuminator and infrared camera, as well as a traditional camera lens, so it's able to sense depth and read face-shape (including in the dark).
As we wrote yesterday, it’s essentially an Xbox Kinect miniaturized and put on the front of your phone. Ergo, Face ID would interpret a photo of a face as a flat surface -- and therefore not actually a face.
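As a rough illustration of why depth matters, a liveness check can fit a plane to the captured depth map and reject captures with almost no relief: a printed photo is essentially planar, while a real face is not. This is an invented sketch, not Apple's algorithm -- the plane-fit approach and the threshold are assumptions:

```python
import numpy as np

def looks_flat(depth_map, min_relief_mm=5.0):
    """Reject captures whose depth profile is essentially planar.

    Fits a least-squares plane to the depth samples and measures the
    residual relief. A printed photo held up to the camera barely
    deviates from its best-fit plane, while a real face has
    centimeters of relief (nose, eye sockets, chin).
    """
    h, w = depth_map.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Plane model: depth ~ a*x + b*y + c, solved by least squares
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, depth_map.ravel(), rcond=None)
    residual = depth_map.ravel() - A @ coeffs
    return residual.std() < min_relief_mm
```

The real system presumably does far more -- comparing a full 3D capture against the enrolled scan -- but the depth dimension is what makes a flat photo trivially distinguishable from a face.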
Although the proof of the pudding will be in the eating, as they say.
There was a brief on-stage demo fail when an iPhone X apparently failed to identify Craig Federighi's face and therefore wouldn't unlock -- illustrating the other potential problem here: a tech that's too unyielding in opening up to its owner may be highly secure, but it won't be at all convenient.
The Apple exec's first reaction at being unexpectedly locked out appeared to be to wipe sweat from under his eyes -- suggesting the sensors may be confused by shine. We'll have to wait and see.
Face ID needs your attention
Yesterday, Apple showed how the iPhone X user has to record a 3D scan of their face from multiple angles, with the interface asking them to tilt and turn their head to enroll the biometric.
The biometric is of course stored locally, in the secure enclave, so it does not leave the device.
Apple also revealed that it's created neural networks to mathematically model faces so that the tech can be smart enough to adapt to the changing landscape and aspects of a person's face -- such as if they start wearing glasses, get a new hairstyle, put on a scarf or grow a beard (less clear: whether it works if a user is wearing a fuller face covering) -- saying it trained its model with more than a billion images of faces from around the world.
The risk of bias in the training data here is obvious. But Apple at least sounds confident that it's nailed the technology, claiming the overall risk of another person being able to unlock someone's device is 1 in one million.
It also said Face ID cannot be fooled by photographs of faces, and noted that it tested the system against face masks -- seeming confident that even a photorealistic face mask won't fool it, likely on account of the infrared sensor. (Though one wonders whether a heated silicone face mask might do the trick... )
It did confirm that Face ID does get confused by identical twins, as you'd expect.
More interestingly, Apple said that Face ID needs "your attention" -- specifying that means a user's eyes have to be open and on the device for Face ID to work. So it appears it will require some kind of user interaction to successfully unlock it, not just for the face to be in the sensors' line of sight.
This is one of the most interesting unknowns here.
Demos of Face ID yesterday in Cupertino were locked to Apple staff, so we haven't yet had the chance to freely play and test its parameters. But TechCrunchers who were in Cupertino suggested it was not that easy to trigger Face ID, and that a person would only have to screw up their eyes for it not to work.
Again, though, it's unclear how much and how active a user's ocular attention needs to be for the device's virtual padlock to pop open.
Could someone pry open a sleeping or deceased person's eyeball to pass muster with Face ID? Or do eyes have to be seen to move -- and to move willingly -- towards the phone before it will unlock?
What about if you sweep your eyes intentionally elsewhere to try to avoid looking at the device? Will the phone read that as your attention being willingly averted?
We don't know yet. Testing this phone is going to be fun for sure.
But forcing someone to press a finger to a phone's fingerprint sensor seems at least theoretically easier than compelling a person to open their eyes and look a particular way if they don't want to. So you could argue that Face ID is a slight step up on Apple's Touch ID fingerprint biometric.
Albeit, that might also depend on how much time you have on your hands to try to trick the iPhone X user into looking at their phone. Or how much force you're willing to expend...
— Shane Richmond (@shanerichmond) September 12, 2017
Safe to say, a lot rides on how Apple is interpreting and reading the user's gaze.
But even if Cupertino's engineers have designed this aspect of the tech in a very thoughtful and highly attention-tuned way, there's no getting away from the fact that biometric security tends to make security experts uncomfortable.
Biometrics vs passcodes
And for multiple good reasons. Not least the salient fact that you can't change a biometric if, say, that highly detailed 3D scan of your face happens to leak.
Biometrics are also less secure than using a (strong) passcode. Though of course a poorly chosen passcode is a security nightmare. (Apple offers multiple options for iOS passcodes -- requiring a six-digit passcode by default, but also supporting longer strings of letters and numbers if a user chooses. It also lets users revert to a four-digit passcode if they really want to.)
Security is, as ever, a spectrum. And consumer-grade biometrics sit pretty low down the ladder -- best used in combination with additional, more robust measures in multi-factor authentication scenarios. If you're going to deploy them at all.
Passcodes and passwords have another advantage over biometrics too -- in that they appear to offer more legal safeguards against state agents forcibly unlocking a device against an owner's will.
In early 2016, Forbes found what it described as the first known case of a warrant being used to compel an iPhone owner to unlock their device with their biometric information -- in that case using the Touch ID fingerprint biometric on an iPhone which had been seized by police.
Meanwhile, in a landmark ruling in 2014, a U.S. judge said that while a defendant could not be forced to hand over a passcode, they could be made to provide their biometric information to unlock their device.
Device security at borders has also become a matter of growing concern under the current U.S. administration -- which has shown an appetite to expand Homeland Security's powers to being able to demand passwords from visitors.
And while legislation has been proposed to outlaw such extralegal intrusions, it's not clear whether forcing a person to apply their biometric information to unlock a device might remain a loophole for border agents to go on accessing the contents of devices without a warrant.
So there could be a wider risk attached to Apple encouraging people to adopt facial biometrics if overreaching state agents are able to use the tech as a route for circumventing individuals' rights.
That said, the company has evidently been thinking about ways to mitigate this risk -- adding a feature to iOS 11 that lets users quickly disable Touch ID, via an SOS mode that can be triggered to require the full passcode.
It has been confirmed there will be a similar shortcut to quickly disable Face ID, too.
face unlock will have the same protections and timeouts as those listed for Touch ID https://t.co/tGlDVFLDQe
— Mary Branscombe (@marypcbuk) September 12, 2017
In iOS 11, the passcode will also be specifically required to be entered before any data can be pulled off a device -- limiting searches of unlocked devices at borders to agents being able to manually sift through contents there and then, rather than giving them unfettered access and the ability to easily download all the data.
Looking at how Apple is deploying a facial biometric within a wider security system is key.
If it was pushing Face ID as a complete replacement for a passcode that would indeed be irresponsible.
But, at the end of the day, it's offering the tech as an option for users who want added usability convenience, while also providing a fallback of stronger security safeguards that can be invoked or can step in to gate content at key moments.
For a mainstream consumer player like Apple that looks -- at this untested stage of the Face ID feature -- to be a fairly thoughtful approach to the age-old security vs convenience problem.
There is another, wider concern here too, though.
Always watching me
Human faces inherently contain a wealth of personal information -- from physical identity and features, to gender and ethnicity, mood/emotional state, even an approximation of age. A face could even indicate sexuality, if recent research is to be believed.
So technologies that normalize mass scanning of facial features do inexorably push in an anti-privacy direction -- carrying the uncomfortable risk of misuse.
Good: Design looks surprisingly robust, already has a panic disable.
Bad: Normalizes facial scanning, a tech certain to be abused.
— Edward Snowden (@Snowden) September 12, 2017
And it's clear that for Face ID to function at least some of the iPhone X's sensors will need to be always on, scanning for potential faces.
Which means it could be gathering very sensitive data without users being aware.
Face ID therefore opens a potential conduit for users to be surreptitiously spied on -- say, by scanning their faces to try to determine how happy or otherwise they look when contemplating a particular bit of on-screen content, or even by identifying and counting multiple different faces in the same location to glean insights about the device owner's domestic context, such as estimating family size.
And even if only some of the iPhone X sensors powering Face ID are always on, some of this hardware and software has to be continuously watching, no matter where you are, who you're with or what you're doing...
Remember, people carry smartphones with them, on their person, everywhere they go -- even from room to room within their own home. So while the Amazon Echo Look proposes to view you in your bedroom, the iPhone X has no such restrictions on the places it can watch you.
How third parties with apps on the iOS platform will be allowed to access the iPhone X's camera and sensor hardware is a key consideration. It doesn't take much imagination to consider what a data gathering behemoth like Facebook might like to do with this kind of technology -- even if it can only make use of it when its own app is open and running on the device.
And it's not yet clear whether or what kind of controls Apple might put in place to limit how app makers are able to access the X's face scanning capabilities (yes, we're asking). But the fact the hardware has been created and will soon be pushed out -- doubtless promoted with the help of millions of Apple marketing dollars -- already represents the next wave of tech-fueled privacy erosion.
So while smartphone technology has taught us to be accustomed to being continuously disturbed by digital prods and pings, at any and all times of the day or night -- to the point of mobile OSes including a 'do not disturb' setting to manually switch off intrusions we otherwise now expect -- Apple's championing of facial recognition technology positions face-scanning and face-reading to become the new normal.
And from facial recognition for identity and authentication it's but a small step to ushering in even more personally intrusive technology systems -- like emotion-tracking timestamped against the content you're browsing. As just one off-the-top-of-my-head example.
Perhaps future smartphones will come with a new type of underused control-toggle in the settings menu -- which simply states: 'Stop watching me.'