Apple will let you unlock the iPhone X with your face, in a move likely to bring facial recognition to the masses, along with concerns over how the technology may be used for nefarious purposes.
"Apple has done a number of things well for privacy but it's not always going to be about the iPhone X," said Jay Stanley, a policy analyst with the American Civil Liberties Union.
"There are real reasons to worry that facial recognition will work its way into our culture and become a surveillance technology that is abused."
A study last year by Georgetown University researchers found that nearly half of all Americans are, without their consent, in a law enforcement database that includes facial recognition. Civil liberties groups have sued over the FBI's use of its "Next Generation Identification" biometric database, which includes facial profiles, claiming it has a high error rate and the potential for tracking innocent people.
"We don't want police officers having a watch list embedded in their body cameras scanning faces on the sidewalk," said Stanley. Clare Garvie, the Georgetown University Law School associate who led the 2016 study on facial recognition databases, agreed that Apple is taking a responsible approach but others might not.
"My concern is that the public is going to become inured or complacent about this," Garvie said. Widespread use of facial recognition "could make our lives more trackable by advertisers, by law enforcement and maybe someday by private individuals," she said.
Garvie said her research found significant errors in law enforcement facial recognition databases, opening up the possibility that someone could be wrongly identified as a criminal suspect. Another worry, she said, is that police could track individuals who have committed no crime simply for participating in demonstrations. Shanghai and other Chinese cities have recently started deploying facial recognition to catch those who flout the rules of the road, including jaywalkers.
Facial recognition and related technologies can also be used by retail stores to identify potential shoplifters, and by casinos to pinpoint undesirable gamblers. The technology can even be used to deliver personalized marketing messages, and could have some other potentially unnerving applications. Last year, a Russian photographer figured out how to match the faces of porn stars with their social media profiles to "doxx" them, or reveal their true identities. This type of use "can create huge problems," Garvie said. "We have to consider the worst possible uses of the technology."
Apple's system uses 30,000 infrared dots to create a digital image, which is stored in a "secure enclave," according to a white paper the company issued on its security. It said the chances of a "random" person being able to unlock the device are one in a million, compared with one in 50,000 for its Touch ID fingerprint system.

Face ID is likely to touch off fresh legal battles over whether police can require someone to unlock a device. U.S. courts have generally ruled that forcing a user to give up a passcode violates that user's rights because a passcode is "testimonial," but the situation becomes murkier when biometrics are involved. Apple appears to have anticipated this by allowing a user to press two buttons for two seconds to require a passcode, but Garvie said court battles over compelling the use of Face ID are likely.
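As a rough illustration of the quoted figures (simple arithmetic on the published headline odds, not a model of how either system actually works), the white paper's numbers imply Face ID's random false-match odds are 20 times lower than Touch ID's:

```python
# Headline false-match odds as quoted: Face ID, one in 1,000,000;
# Touch ID, one in 50,000 (chance a random person unlocks the device).
face_id_pool = 1_000_000   # one false match per this many random faces
touch_id_pool = 50_000     # one false match per this many random fingerprints

# How many times lower Face ID's quoted odds are than Touch ID's.
improvement = face_id_pool / touch_id_pool
print(f"Face ID's quoted false-match odds are {improvement:.0f}x lower")
# prints "Face ID's quoted false-match odds are 20x lower"
```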