
Facebook’s facial recognition now looks for you in photos you’re not tagged in

Illustration by Alex Castro / The Verge

Facebook is expanding how it uses facial recognition to find people in photos. From today, the company will notify users when someone uploads a photo with them in it — even if they’re not tagged. The user will then have the option to add their tag to the photo, leave themselves untagged, or report the photo if they think it’s inappropriate. The feature will also work with profile photos, but won’t be available in Canada or the EU, where data laws restrict the use of facial recognition.

According to Facebook, the new tool is designed to empower users by helping them control their image online. “We really thought of this as a privacy feature for a long time. If someone posts a photo of you, you might not know about it,” Rob Sherman, Facebook’s head of privacy, told The Verge. “Now, the users can access the photo, and they can communicate to the person who posted it.”

The new tool will notify you when friends and mutual acquaintances upload photos you appear in, even if you’re not tagged.
Image: Facebook

Sherman also says the tool could be a prompt for nostalgia, alerting people about photos they’ve forgotten. (Although the tool doesn’t work retroactively.) And, of course, it will encourage people to engage with the site, and increase the amount of time they spend on it. It’s like that old thought experiment: if someone uploads a photo of you to Facebook but doesn’t tag you in it, does it show up in engagement metrics? The answer, apparently, is no.

From Facebook’s point of view, this sort of behavioral nudge makes sense. Last week, the company shared scientific research that found that “passively consuming” the News Feed made people unhappy. However, that same research found that being actively involved with the site — messaging people, liking posts, and reminiscing about old times — was “linked to improvements in well-being.” One problem here is that simply “interacting” with Facebook is a crude way to categorize behavior, and not every tool that increases engagement necessarily does so in a way that makes people happy.

The new facial recognition notification tool, for example, could be used to harass or bully people. Users will be notified even when they appear in photos uploaded by someone they’re not friends with — they simply need to have friends in common, and the photo’s “audience” (Facebook’s term for who can or cannot see content) needs to be set to “everyone.” So, a harasser who isn’t able to directly message their target could upload a picture with them in it, perhaps Photoshopped to include a nasty or abusive joke, and Facebook will recognize their face and ping that user, doing the harasser’s work for them. And for profile photos, users don’t even need to have mutual friends to be sent a notification.

The new tool will also be used to improve Facebook’s automatic text descriptions — telling visually impaired users not just what is in a picture, but also who.
Image: Facebook

For Facebook, the challenge is trying to minimize situations like the one above while making these tools open enough to encourage the good sort of engagement — like connecting with a new friend you met at a house party because you appear in their photos. When asked about the harassment scenario above, Facebook pointed out that the targeted user would then have the ability to report the photo, and that they could turn off facial recognition altogether. As the new features roll out, users will be prompted in their News Feed to review their current privacy settings and make a choice about the tech. (Perhaps not unrelatedly, Facebook is currently in the middle of a lawsuit alleging that its use of facial recognition violates user privacy by failing to ask for explicit consent.)

“We heard from people that they wanted it to be easy to choose whether to use these things or not,” says Sherman. “So if you want to turn this off, we think it should be really easy.”