Is iPhone X’s Face ID the Beginning of an Arms Race for Facial Data?


In this special guest feature, Robin Bordoli, CEO of CrowdFlower, discusses how the new iPhone X facial recognition technology may represent the beginning of an arms race for facial data. Robin has spent the past two decades helping high growth companies launch and scale platforms and products into rapidly transforming markets. Prior to CrowdFlower, Robin was the Vice President & General Manager, Strategic Consumer Industries at Marketo where he launched and led the business unit selling to consumer digital marketing teams. He has also held leadership roles at Jive Software, Worksimple, Yahoo, Excite@Home & Micromuse. Robin holds a Master’s degree in Engineering from Cambridge University and a Master’s degree in Business Administration from Stanford University. Outside of work Robin spends his time trying to keep up with his two young children and enjoying all that the Bay Area has to offer.

We’ve witnessed the much-ballyhooed iPhone X release, and nothing has garnered quite so much attention as its facial recognition feature, Face ID, which replaces Touch ID. We may look back in a couple of years and realize this was the beginning of the widespread adoption of facial recognition. The new top-of-the-line model uses your facial features to unlock the phone, authenticate access to apps, and make purchases through Apple Pay. Face ID is made possible by the TrueDepth camera in the iPhone X, which projects and analyzes more than 30,000 invisible dots to create a depth map of your face. To allay privacy concerns, Apple has promised that this data will never leave the device itself.
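Apple hasn’t published Face ID’s internals, but the general pattern behind systems like this is easy to sketch: at enrollment, the depth map is reduced to a compact numeric template, and each unlock attempt is compared against that locally stored template. The toy Python below is a loose illustration under that assumption only; the `embed` function is a crude stand-in for the neural network a real system would use, and all the data is simulated.

```python
import numpy as np

def embed(depth_map: np.ndarray) -> np.ndarray:
    """Crude stand-in for a learned encoder: flatten the depth map into a
    unit-length vector. A real system would use a neural network here."""
    vec = depth_map.astype(np.float32).ravel()
    return vec / (np.linalg.norm(vec) + 1e-8)

def enroll(depth_map: np.ndarray) -> np.ndarray:
    """Enrollment keeps only the embedding (the template), stored on device."""
    return embed(depth_map)

def unlock(template: np.ndarray, depth_map: np.ndarray,
           threshold: float = 0.95) -> bool:
    """Compare a fresh scan to the stored template via cosine similarity."""
    return float(embed(depth_map) @ template) >= threshold

# Simulated 100x100 depth map, standing in for the dot projector's output.
rng = np.random.default_rng(0)
owner_scan = rng.random((100, 100))
template = enroll(owner_scan)  # in Apple's design, this never leaves the device

print(unlock(template, owner_scan + rng.normal(0, 0.01, (100, 100))))  # True
print(unlock(template, rng.random((100, 100))))                        # False
```

The point of the sketch is the data flow: only the template needs to persist, and it is the template, not the raw scans, that would become valuable if it ever left the device.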

Which raises the question: will that always be the case? Unlike Touch ID, where Apple captured fingerprints to keep users’ phones secure, facial recognition data is far more valuable. After all, beyond law enforcement, there aren’t many legitimate uses for our fingerprints. Facial data, on the other hand, can be used to train myriad recognition models, powering everything from advertisements to security to emotionally aware assistants and so much more.

To be fair, facial recognition AIs are already out in the wild, but they’re merely scratching the surface of what they’ll become. Right now, most of these projects are aimed at security. In China, for example, the government is using the technology to finger jaywalkers and identify toilet paper thieves. Dubai and Emirates are partnering on virtual aquariums that double as airport security scanners. Australia is aiming for more ubiquitous surveillance ahead of the Commonwealth Games.

But not all facial recognition technology is being applied for security purposes. Affectiva uses well-labeled facial data to help machines understand our moods and emotions. Tesco is looking to scan faces at the pump to target advertisements to demographic groups. Listerine, of all companies, created an app that helps blind people tell if people around them are smiling.

The list goes on. The point here is that, while this is still fairly nascent from a commercial standpoint, there’s really no limit on clever deployments. It’s worth remembering that some of these use cases, like Listerine’s aforementioned app, don’t actually require a model to identify a specific person, just that person’s expression. Applications that understand our nuanced, personal moods, or models that can identify an individual from a vast library of unique facial data, will likely be a bit less common but a lot more valuable.

Which brings us back to the iPhone X. If you remember what mobile phones looked like pre-iPhone (chances are yours folded in half), you know that as Apple goes, so goes Samsung and, frankly, so goes everyone else. It’s not hard to imagine a world where we barely even blanch at unlocking a phone with a glance. After that? Facial scans replacing our credit cards or car keys are easy to envision.

Now, if we posit that facial recognition is coming–which I believe is a fairly unremarkable prediction–we need to understand what powers those models. The only answer is troves of well-labeled facial data.

Look no further than Facebook’s famously accurate DeepFace algorithm. There are incredibly bright minds behind the model, but what really drives its confidence and accuracy is us: you and me and everyone we know. We tag our children at their birthdays and our spouses over dinner. We tell Facebook “this person is named Joe Smith.” We label the data for them. And that data drives an algorithm that can identify individual people with human-level accuracy.
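Facebook hasn’t shared its training pipeline beyond the DeepFace paper, but the mechanism the tags enable is straightforward to sketch: every tag contributes one (embedding, name) training pair, and a classifier learns from those pairs. In the hypothetical Python below, a crude nearest-centroid matcher stands in for the deep network; the names, embeddings, and noise are all simulated.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "true" faces: random stand-ins for a face encoder's output.
people = {"Joe Smith": rng.normal(0, 1, 128),
          "Jane Doe":  rng.normal(0, 1, 128)}

def tagged_photo(name, jitter=0.1):
    """One user tag = one labeled sample: a noisy view of that person's face."""
    return people[name] + rng.normal(0, jitter, 128), name

# Every tag users make becomes training data.
train = [tagged_photo(name) for name in people for _ in range(50)]

# Nearest-centroid "recognizer": average each person's tagged embeddings.
sums = {}
for emb, name in train:
    sums.setdefault(name, []).append(emb)
centroids = {name: np.mean(embs, axis=0) for name, embs in sums.items()}

def identify(embedding):
    """Assign a new face to the closest known person."""
    return min(centroids, key=lambda n: np.linalg.norm(centroids[n] - embedding))

probe, truth = tagged_photo("Joe Smith")
print(identify(probe), "| actual:", truth)  # the tags did the teaching
```

Swap the random vectors for real encoder outputs and the centroid matcher for a deep network, and you have the shape of the system: the model is only as good as the labels we volunteer.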

And it’s precisely this sort of data set that facial recognition features like Face ID create almost effortlessly. We’re talking millions of phones scanning their owners’ faces dozens of times a day, after all. That’s an unfathomably valuable data set. As facial scanning technology becomes more ubiquitous and we find it less futuristic and more commonplace, it’s really only a matter of time until businesses decide to stop storing that data on our devices and instead start amassing it in the cloud. History shows most consumers will go along.

Now, if a lot of this facial data is going to be collected by what is, ostensibly, a luxury item, this does bring up a real concern: algorithmic bias. Because a thousand-dollar phone only creates facial recognition data for people who can afford thousand-dollar phones, what you end up with is a very uniform, first-world data set. Models trained on data sets like these can mis-categorize individuals, exhibit racial bias, and produce unequal (or simply bad) outcomes. For this reason, companies creating models will need to take care when training and retraining their recognition algorithms with more diverse data sets. Kiva, the microlending platform focused on the developing world, is actually working on this very problem by creating a data set of almost a million labeled faces from its vast library of borrower images from around the world. This data set will be diverse in terms of ethnicity and skin color as well as environment, such as backdrop and lighting. By combining data sets like these with ones collected from smartphones, it’s easy to imagine incredibly accurate, confident models that know both who we are and how we feel.
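One concrete way to surface this kind of skew, as a rough sketch: break recognition accuracy down by demographic group instead of reporting a single headline number. Everything in the Python below is simulated; the group labels and accuracy rates are hypothetical stand-ins for a real evaluation log.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical evaluation log for a model trained mostly on one group.
groups = ["group_a", "group_b", "group_c"]          # placeholder labels
true_rate = {"group_a": 0.98, "group_b": 0.90, "group_c": 0.81}
log = [(g, rng.random() < true_rate[g]) for g in groups for _ in range(1000)]

# The overall number hides the skew...
print(f"overall: {np.mean([ok for _, ok in log]):.1%}")

# ...so break accuracy down by group before shipping the model.
for g in groups:
    hits = [ok for grp, ok in log if grp == g]
    print(f"{g}: {np.mean(hits):.1%} accuracy on {len(hits)} probes")

# A persistent gap between groups is the cue to retrain on more diverse
# data, e.g. a collection like Kiva's labeled borrower images.
```

The overall figure looks respectable; the per-group breakdown is what reveals who the model is failing.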

When you see companies large and small collecting and analyzing more and more of the same kind of information, you can rest assured you’re seeing the next data arms race. This time, it’s right in front of our face. In fact, it actually is our face. And while right now Apple and others won’t be collecting these images, it’s only a matter of time until the temptation and the value are too much to ignore.

Which is all to say: don’t expect any of this to be just about your personal security for very long.

 
