Face ID is easily the most hot-button topic to come out of Apple’s iPhone event this week, notch be damned. As people have parsed just how serious Apple is about it, questions have rightly been raised about its effectiveness, its security and how it was built.
To get some answers, I hopped on the phone with Apple’s SVP of Software Engineering, Craig Federighi. We went through a bunch of the common concerns in rapid-fire fashion, and I’ve also been asking around and listening to Apple folks who have been using the feature over long periods. Hopefully we can clear up some of the FUD about it.
Making Face ID
On stage during the event, Apple’s Phil Schiller mentioned that Apple had gathered ‘a billion’ images to train Face ID. Federighi says that Apple went even further than that.
“Phil mentioned that we’d gathered a billion images and that we’d done data gathering around the globe to make sure that we had broad geographic and ethnic data sets. Both for testing and validation for great recognition rates,” says Federighi. “That wasn’t just something you could go pull off the internet.”
Especially given that the data needed to include high-fidelity depth maps of faces. So, says Federighi, Apple went out and got consent from subjects to provide scans that were ‘quite exhaustive’. Those scans were taken from many angles and contain a lot of detail that was then used to train the Face ID system.
I asked what Apple did with that training data.
“We do retain a high fidelity depth map of that [training] data that we protect,” says Federighi. “As we train these models and iterate on these algorithms you want raw sensor data to use and develop and optimize them.”
When it comes to customers — users — Apple gathers absolutely nothing itself. Federighi was very explicit on this point.
“We do not gather customer data when you enroll in Face ID, it stays on your device, we do not send it to the cloud for training data,” he notes.
There is an adaptive feature of Face ID that allows it to continue to recognize your changing face as you change hair styles, grow a beard or have plastic surgery. This adaptation is done completely on device by applying re-training and deep learning in the redesigned Secure Enclave. None of that training or re-training is done in Apple’s cloud. And Apple has stated that it will not give access to that data to anyone, for any price.
Which is as good a time as any to talk about another big topic: security.
The security and privacy of Face ID
One of the primary questions about Face ID that has come from many quarters is how Apple is going to handle law enforcement requests for facial data.
The simple answer, which is identical to the answer for Touch ID by the way, is that Apple does not even have a way to give it to law enforcement. Apple never takes possession of the data, anonymized or otherwise. When you enroll your face, the data is immediately stored in the Secure Enclave as a mathematical model that cannot be reverse-engineered back into a ‘model of a face’. Any re-training also happens there. It’s on your device, in your SE, period.
This answers questions about whether Apple is taking stewardship of the data of underage users in the US as well. It isn’t. It stays on device.
I also asked Federighi whether Apple had considered allowing the very security conscious to enable a mode that forced both Face ID and a passcode to be used to unlock a device. A sort of two-factor authentication that combined both numeric and biometric factors into one system.
“We’ve definitely talked about it internally,” says Federighi. “We have people who are interested [in that].”
He points out that there are scenarios you’d need to account for, like shaving your ‘mountain man’ beard and needing access. “The thing we’d need to do is you’d need a backup super long passcode…but it’s certainly something that gets discussed.”
So, not for now, but the topic has come up.
I also quizzed Federighi about the exact way you ‘quick disable’ Face ID in tricky scenarios — like being stopped by police, or being asked by a thief to hand over your device.
“On older phones the sequence was to click 5 times [on the power button] but on newer phones like iPhone 8 and iPhone X, if you grip the side buttons on either side and hold them a little while - we’ll take you to the power down [screen]. But that also has the effect of disabling Face ID,” says Federighi. “So, if you were in a case where the thief was asking to hand over your phone - you can just reach into your pocket, squeeze it, and it will disable Face ID. It will do the same thing on iPhone 8 to disable Touch ID.”
That squeeze can be of either volume button plus the power button. This, in my opinion, is an even better solution than the ‘5 clicks’ because it’s less obtrusive. When you do this, it defaults back to your passcode.
This is a bit of an aside, but I’d also like to point out here that Face ID emits no visible light. I’ve seen some misconceptions on social media that it’s going to be shining a light at your face. Nope. It uses only infrared and existing light which means it will work in darkness without any more light than is coming off of the phone’s screen.
It’s worth noting a few additional details here:
- If you haven’t used Face ID in 48 hours, or if you’ve just rebooted, it will ask for a passcode
- After five failed Face ID attempts, it will default back to the passcode. (Federighi has confirmed that this is what happened in the on-stage demo when he was asked for a passcode — the phone had tried to read the people setting it up on the podium)
- Developers do not have access to raw sensor data from the Face ID array. Instead, they’re given a depth map they can use for applications like the Snap face filters shown off on stage. This can also be used in ARKit applications.
- You’ll also get a passcode request if the passcode hasn’t been used to unlock the phone in 6.5 days and Face ID hasn’t unlocked it in the last 4 hours.
Also be prepared for your phone to immediately lock every time your sleep/wake button is pressed or it goes to sleep on its own. This is just like Touch ID.
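As an aside for developers: third-party apps never reimplement any of this unlock logic themselves. Biometric checks go through Apple’s existing LocalAuthentication framework, which hands back only a success or failure and, as noted above, never exposes any facial data. Here’s a minimal sketch of what that looks like; the ‘private notes’ scenario and function name are hypothetical.

```swift
import Foundation
import LocalAuthentication

// Hypothetical example: gating a "private notes" screen behind Face ID / Touch ID.
// The app only ever receives a success or failure flag, never any biometric data.
func unlockPrivateNotes(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // .deviceOwnerAuthentication lets the system fall back to the device passcode,
    // much like the lock-screen behavior described above.
    guard context.canEvaluatePolicy(.deviceOwnerAuthentication, error: &error) else {
        completion(false)
        return
    }

    // After canEvaluatePolicy, biometryType reports .faceID on iPhone X
    // and .touchID on Touch ID hardware.
    let reason = context.biometryType == .faceID
        ? "Unlock your private notes with Face ID"
        : "Unlock your private notes"

    context.evaluatePolicy(.deviceOwnerAuthentication, localizedReason: reason) { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}
```

(A real app using Face ID this way also needs an NSFaceIDUsageDescription entry in its Info.plist.)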
Federighi also noted on our call that Apple would be releasing a security white paper on Face ID closer to the release of the iPhone X. So if you’re a researcher or security wonk who’s looking for more, he says it will have ‘extreme levels of detail’ about the security of the system.
Using Face ID
Then there’s the topic of how well it works. Everyone I’ve spoken to who has been in a position to be using it for weeks or months says it’s incredibly reliable no matter the light level. The combination of using the RGB camera and the IR emitter plus the dot projector covers a wide array of scenarios that allow it to be very reliable and very fast.
One anecdotal thing: if you lift your phone and swipe up immediately, there’s a good chance that the Face ID system will have performed its authentication fast enough to have unlocked your device by the time you finish your swipe. That’s how fast it is.
But the speed isn’t the only question. Sunglasses, for instance, are fairly commonly worn outdoors. Federighi had mentioned in an email to a user that ‘most’ sunglasses would work fine. I asked whether polarization was the problem. He said no.
“It turns out that polarization isn’t the issue - I have a set of polarized sunglasses that work just fine with Face ID. What it is is that different lenses have a different amount of infrared filtration. Most let enough IR through that your eyes are visible to IR even if they’re not visible to the human eye,” says Federighi. “There are some lenses whose coatings block IR. In those cases the customer can just use a passcode or take them off.”
He notes that there are some people for whom the ‘attention’ feature just won’t work. If you’re blind or vision impaired, for instance, you may not be able to stare directly at the phone to communicate your intent. In those cases, where a face is recognized (even with sunglasses on) but the system can’t see your eyes, you can just turn off the ‘attention detection’ feature. You still get Face ID, but at a lower level of overall security, because it’s not ensuring that your eyes are directly focused on it.
“You can turn off intention detection as a user,” says Federighi. “There’s some compromise to detection there — but if you have a condition where you can’t look at it, that’s the choice you have. And if you don’t want to use the Face ID feature at all, you can turn it off.”
Face ID requires that it be able to see your eyes, nose and mouth. This means that there are scenarios where it just won’t work.
“If you’re a surgeon or someone who wears a garment that covers your face, it’s not going to work,” says Federighi. “But if you’re wearing a helmet or scarf it works quite well.”
This means that Face ID is not going to be a viable option for people who wear a mask for work or wear a niqab, for instance. They would need to use a passcode instead. Federighi notes that this limitation is similar to Touch ID, which simply didn’t work if you wore gloves or had wet fingers.
Another common question is about what kind of angles and distances you can be at in relation to your iPhone to get it to unlock.
“It’s quite similar to the ranges you’d be at if you put your phone in front-facing camera mode [to take a picture],” says Federighi. “Once the space from your eyes to your mouth comes into view, that’s the matching range. It can work at fairly extreme angles — if the phone is down low because it’s in your lap, it can unlock as long as it can see those features. Basically, if you’re using your phone across a natural series of angles, it can unlock it.”
Obviously, those angles are limited, because the attention detection system is checking not just that you’re in your phone’s view while it’s awake, but that you’re actively trying to use it. This is why it won’t work if someone tries to unlock your phone while you’re asleep, or if you’re talking to someone and a third party tries to point the phone at your face to unlock it. You have to be paying attention.
As mentioned above, Apple went to great lengths to gather its own data on facial shapes and angles, and when it did so it made sure to cover a broad range of geographic locations, ages and ethnic backgrounds. Federighi says that Apple has tested it extensively and that it shouldn’t matter what your cultural background is: Face ID will work with your face.
There have been plenty of examples of technology that theoretically works for everyone not living up to the actual diversity of our world. A recent example that went viral was a soap dispenser that didn’t recognize a man’s hand because his skin was dark. Apple has made efforts in both hardware and software to make sure that this doesn’t happen with Face ID. How well it holds up once it gets out to millions of people, we’ll have to wait and see.
As for the TrueDepth camera itself, there are also some side benefits for developers who use ARKit or the depth map to create effects.
“With both the rear facing cameras and front-facing system [on iPhone X] we expose to developers a depth map, so ARKit will take depth from a photo and create a mesh, but it’s not raw sensor data. It’s depth that can be used for photographic effects,” says Federighi when I ask about other uses for the array. “It’s designed to be very good at close range where the rear facing cameras are good at greater distances. It’s different technology with a different purpose - as far as selfie range - the probing dot pattern provides a great solution [for a depth map].”
For those of you who are ARKit developers out there, the system will use a fusion of RGB data with the IR data to give you that mesh. This will allow for enhanced effects and accuracy using the TrueDepth camera array.
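To make that concrete, here’s a minimal sketch of what consuming that mesh looks like using ARKit’s face tracking: it renders a live wireframe of the user’s face and updates it as expressions change. The class name and view setup are illustrative; the ARFaceTrackingConfiguration and ARSCNFaceGeometry APIs are the public ones exposed to developers.

```swift
import UIKit
import SceneKit
import ARKit

// Illustrative view controller that overlays a live wireframe of the TrueDepth face mesh.
final class FaceMeshViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking is only supported on hardware with the TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    // Provide a wireframe node when a face anchor is first detected.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor,
              let device = sceneView.device,
              let faceGeometry = ARSCNFaceGeometry(device: device) else { return nil }
        faceGeometry.firstMaterial?.fillMode = .lines
        return SCNNode(geometry: faceGeometry)
    }

    // Keep the mesh in sync as the face moves and changes expression.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let faceGeometry = node.geometry as? ARSCNFaceGeometry else { return }
        faceGeometry.update(from: faceAnchor.geometry)
    }
}
```

None of this involves raw IR frames or dot-pattern data; the app only ever sees the derived mesh, plus the camera access it has to request via NSCameraUsageDescription.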
The Face ID story so far
Obviously, most people have not had a chance to use Face ID yet and there are a lot more questions. But many of those will likely get answered automatically as people are actually able to put it into practice and start learning the ins and outs of the system.
The fact of the matter is that there is likely an outsized amount of skepticism about Face ID because other manufacturers like Samsung have shipped versions of facial recognition that are, frankly, crap. If it can be fooled by a simple photo, what the hell are you doing shipping it at all?
Face ID is not a simple image recognition system. It looks at a three-dimensional model of your entire face, recognizing features at a level of detail high enough that Apple is confident that masks will not fool it. It’s a different ballgame entirely.
The rewards for making security (a passcode) incredibly easy for people to implement and use on a daily basis are enormous. The vast majority of people still use common passwords and don’t enable two-factor authentication on any of their devices. The amount of work that Touch ID has done and Face ID will do to improve the security of regular users is huge.
This is clearly Apple’s play for the future, and, like Apple commentator John Gruber, I’ve heard that Apple has had no intention of including any Touch ID-based authentication in the iPhone X for a very long time now.
I’ll probably dissect this further down the line, but there is also massive potential for the Face ID system to bloom into a boom of intent-based computing. If our devices know who we are with authority and constancy, what kinds of interactions are we able to have with them securely and automatically?
Contextual computing has been chugging along for quite a while now without a huge breakthrough to allow our devices to become more aware. Face ID just might be opening another enormous door here. But, of course, with any door you have to ask about the consequences of opening it and who holds the keys.
Apple’s Face ID processes closely mirror those it developed for Touch ID. And, even though there are caveats, those have largely stood up to probing from security researchers and nation-states alike.
It’s important to ask the questions and listen carefully to the answers. But so far the answers seem fairly plain.