The Privacy Risks of Facial Recognition in Smart Glasses

Originally published at: The Privacy Risks of Facial Recognition in Smart Glasses - TidBITS

In a 404 Media post titled “Someone Put Facial Recognition Tech onto Meta’s Smart Glasses to Instantly Dox Strangers,” Joseph Cox writes:

A pair of students at Harvard have built what big tech companies refused to release publicly due to the overwhelming risks and danger involved: smart glasses with facial recognition technology that automatically looks up someone’s face and identifies them. The students have gone a step further too. Their customized glasses also pull other information about their subject from around the web, including their home address, phone number, and family members.

The project is designed to raise awareness of what is possible with this technology, and the pair are not releasing their code, AnhPhu Nguyen, one of the creators, told 404 Media.

Despite the title, what’s going on here feels more like extreme Google stalking than doxing, which involves the public release of personal information. In 2023, Kashmir Hill wrote in the New York Times about how Google and Facebook had developed but decided not to ship such technology. Now, thanks to the likes of Clearview AI and PimEyes, facial recognition on consumer-grade hardware is within the reach of clever college students.

With luck, the awareness this project raises will highlight the woeful lack of privacy regulation in the US, with fragmented state-level policies leaving large gaps in privacy protections. Although enforcement and compliance mechanisms won’t be fully active for several years, the recently passed EU Artificial Intelligence Act prohibits:

the placing on the market, the putting into service for this specific purpose, or the use of AI systems that create or expand facial recognition databases through the untargeted scraping of facial images from the internet or CCTV footage

Smart glasses with such capabilities aren’t far off. While tech giants will likely block problematic uses of facial recognition—I can’t imagine Apple approving such an app—technology like this could soon become available to anyone who wants it, thanks to fast-moving Asian electronics manufacturers, open source tools, and readily available services. While the technology may appeal to those who wouldn’t use it for malicious purposes, it also opens the door to significant abuses like real-world stalking, identity theft, and kidnapping.

While corporate self-regulation and ethical guidelines are necessary, it’s hard to see anything other than comprehensive regulation effectively curbing privacy abuses enabled by instant facial recognition and real-time digital stalking.

Maybe not a full app, but Apple’s new AI ad features the actress from The Last of Us using Siri to remind her who someone is so she can act like she remembers him:

It’s a weird ad, since she remembers him well enough to know exactly when and where she saw him, just not his name.

It’s also not facial recognition, but wouldn’t that be the easy next step? Or using the Camera to ask Siri, “Who is this guy?” Seems a no-brainer, actually useful, and yet…

(Such an explicit use of AI or facial rec doesn’t bother me, while eyeglasses that automatically retrieve all the info on everyone you pass is definitely creepy.)

Off topic a bit, but I have to say that these two ads really bother me. They basically advertise Apple Intelligence as a way to trick people into thinking you remember who they are, or that you read the document they sent you (which you haven’t actually read).

I don’t think that’s weird. That’s happened to me many times; I remember the face and I remember the circumstances, but I just don’t remember the name. There’s another ad on TV right now that’s similar, with a woman in a supermarket trying to remember someone’s name. I think it’s for a supplement that’s supposed to help with memory (but I don’t remember for sure).

Happens to me constantly, and has for decades. I’ve become quite good at bluffing.

As Groucho Marx is reported to have said, “I remember your name perfectly. I just can’t place your face.”

Me too. I do think the ad raises an interesting ethical question, but I’m not sure the fault (if any) lies with Siri.

You can use various mental tricks to help you remember someone’s name. I don’t think those of us who can remember all kinds of things about a person except their name consider those tricks unethical. We’re not pretending to have met them when we haven’t, we’re just trying to remember those couple of words from our prior conversation that followed “Hi, my name is…”.

So in the ad, the actor uses Siri/Apple Intelligence to remind herself of something she knew at one point, but in the moment has forgotten. But she then goes on to make a point of not only remembering the other person’s name, but also how unforgettable he and his name are. That’s perhaps edging closer to unethical fabrication on her part. Personally, though, I don’t give Apple Intelligence a bad mark in this situation.

Not for me! I am terrible at remembering people out of context. If I see a work client at the grocery store, they are totally familiar, but I can’t place them. There is no way I could have told Siri the date and place where I saw that guy, as in the Siri ad.

Just a few days ago a lady driving by stopped when she saw me walking my dog. She knew me and my dog and we chatted for a few minutes about how things were going. I assume she’s a neighbor on the road as I often meet people in passing. She was familiar to me, but without knowing which house she was from, I had no idea who she was. I could have really used those smart glasses!

(I do have some form of face blindness. I blame it on not having glasses early enough when I was a kid. Faces were fuzzy to me and I just gave up trying to recognize them. To this day I recognize people more by their gait and voice than their features. I’m great at identifying celebrities doing voice overs in commercials – but show me their picture and I’ll have a hard time figuring out who they are!)

Have you seen/read the sci-fi novels Daemon and Freedom™ by Daniel Suarez? They date back to 2010 and are eerily prescient about this stuff.

In the books, the glasses work as a heads-up display, so you can see the info without others even being aware of what you are seeing or doing.

I hadn’t seen them, but I’ll look for them. I do need to reread Vernor Vinge’s “Rainbows End,” which also touches on this topic.

The end of anonymity in public seems pretty scary to me. You walk past a stranger on a quiet street or parking lot and they quickly know a lot about you: where you live, where you work, your family and your friends. You become a target because of the perceived wealth, celebrity, politics or religion of you or your family. Even in a crowd, someone who identifies you, correctly or not, could coordinate with cohorts to follow you to someplace less crowded.

Given that anyone can create the ability to identify strangers from their faces simply by indexing images on public web sites and applying face recognition, I don’t see how any amount of legislation can put this genie back in the bottle.

So much to be concerned about here. The first time I met someone wearing Google Glass, they were exiting the men’s room at work.

Cameras everywhere seem to have become acceptable. People video incidents on the street rather than helping. At a wedding, a sea of cameras is aimed at the couple as they dance. A good friend installed 16 cameras around his property in a nearly crime-free rural community, then noticed a stream of data going to a site in China, which he managed to block. Honestly, as one of my neighbours remarked after we removed a wasps’ nest from our shared hedge, “I can’t see the good in them.”

And that’s speaking as a photographer and filmmaker.

Increasingly we seem to be feeding something, and as Kevin Kelly presciently remarked in 2007, humanity may become the eyes and ears of the web.

Allying this with AI to remove privacy seems like the final step in something long underway. And that alliance can happen anywhere along the chain: not just in the glasses, but perhaps also in a data center somewhere, for some other purpose, and without the knowledge of the glasses’ wearer.
