On a recent Friday afternoon, Kashif Hoda was waiting for a train near Harvard Square when a young man asked him for directions. Mr. Hoda was struck by the man’s nerdy, thick-framed glasses, but he did not realize that they were Ray-Ban Meta smart glasses and that a small white light indicated that they were recording.
A few minutes later, as Mr. Hoda’s train was pulling into the station, the bespectacled man, a Harvard University junior named AnhPhu Nguyen, approached him again.
“Do you happen to be the person working on minority stuff for Muslims in India?” Mr. Nguyen asked.
Mr. Hoda was shocked. He worked in biotechnology, but had previously been a journalist and had written about marginalized communities in India.
“I’ve read your work before,” Mr. Nguyen said. “That’s super cool.”
They shook hands, but Mr. Hoda didn’t have time to continue the conversation because his train was boarding. He posted on social media, reflecting on how strange the encounter had been.
A month later, he found out just how strange. He had been an unwitting guinea pig in an experiment meant to show just how easy it was to rig artificial intelligence tools to identify someone and retrieve the person’s biographical information — potentially including a phone number and home address — without the person’s realizing it.
A friend texted Mr. Hoda, telling him that he was in a video that was going viral. Mr. Nguyen and a fellow Harvard student, Caine Ardayfio, had built glasses that could identify strangers in real time, and had demonstrated them on two “real people” at the subway station, including Mr. Hoda, whose name was incorrectly transcribed in the video captions as “Vishit.”
Mr. Nguyen and Mr. Ardayfio, who are both 21 and studying engineering, said in an interview that their system relied on widely available technologies, including:
Meta glasses, which livestream video to Instagram.
Face detection software, which captures faces that appear on the livestream.
A face search engine called PimEyes, which finds sites on the internet where a person’s face appears.
A ChatGPT-like tool that was able to parse the results from PimEyes to suggest a person’s name and occupation, as well as look up the name on a people search site to find a home address, a phone number and relatives.
“All the tools were there,” Mr. Nguyen said. “We just had the idea to combine them together.”
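To give a rough sense of how those pieces fit together, the pipeline can be sketched as a few lines of glue code. This is a conceptual sketch only: the students did not publish their code, PimEyes offers no public API for this use, and every function name below is a hypothetical placeholder rather than an actual tool.

```python
# Conceptual sketch of the pipeline described above: detect a face in a
# livestream frame, reverse-search it, then have a language model summarize
# the matches. All helper functions are hypothetical placeholders.

from dataclasses import dataclass


@dataclass
class Profile:
    name: str | None
    occupation: str | None
    sources: list[str]


def detect_faces(video_frame: bytes) -> list[bytes]:
    """Placeholder: crop any faces that appear in a frame of the livestream."""
    raise NotImplementedError


def reverse_image_search(face_crop: bytes) -> list[str]:
    """Placeholder: return URLs of web pages where a similar face appears."""
    raise NotImplementedError


def summarize_identity(page_urls: list[str]) -> Profile:
    """Placeholder: ask a language model to infer a likely name and
    occupation from the text of the matched pages."""
    raise NotImplementedError


def identify(video_frame: bytes) -> list[Profile]:
    """Run the full pipeline on one frame: detect, search, summarize."""
    profiles = []
    for face in detect_faces(video_frame):
        urls = reverse_image_search(face)
        if urls:
            profiles.append(summarize_identity(urls))
    return profiles
```

As the students describe it, none of the individual stages is novel; the point of the sketch is simply that chaining widely available services together is a small amount of work.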
The video makes it appear as if the system works instantaneously and consistently on everybody. But the process took a minute and a half, the students said, and worked on about a third of the people they tested it on.
Coding the system took just four days. “We spent most of the time making the video,” Mr. Ardayfio said.
The technology to put a name to a face is now free or cheap to use, so whether to exercise that ability has become largely a question of ethics and propriety.
Mr. Nguyen and Mr. Ardayfio said they enjoyed doing random projects for fun and had recently created a flamethrower. That experiment singed Mr. Ardayfio’s leg, but it was the facial recognition system that blew up, metaphorically. Given how accessible face search engines already are, the students were surprised by how much attention the project garnered around the world. Its main novelties were incorporating the ChatGPT-like assistant and the Meta Ray-Bans.
Meta has discussed creating similar facial recognition glasses — and even developed an early prototype — but has not released the capability publicly because of legal and ethical concerns. When the students’ video was first reported by 404 Media, a Meta spokesman, Andy Stone, dismissed the company’s role via a post on Threads.
“What these students have done would work with any camera, phone or recording device,” Mr. Stone wrote. “And unlike most other devices, Ray-Ban Meta glasses have an LED light that indicates to people that the user is recording.”
Mr. Hoda did not notice the light.
Multiple investors have since reached out to the students, in messages shared with The New York Times, offering to fund further development of the glasses. Mr. Ardayfio said they had no desire to commercialize this particular extracurricular project and had simply wanted to show it was possible.
In a Google Doc accompanying their video, they encouraged people to remove their information from data broker sites that can reveal names, home addresses and contact information.
“We want people to learn to protect themselves,” Mr. Ardayfio said. He and Mr. Nguyen removed information from data broker sites that would expose their home addresses, but did not attempt to make their faces unsearchable.
Jim Waldo, a computer science professor at Harvard, said having these glasses would be useful for him because he had to learn the names of 100 students at the beginning of every semester.
“This is the kind of technology that could be incredibly useful and incredibly destructive,” he said.
Last week, Mr. Waldo, who teaches a class on privacy and technology, invited Mr. Nguyen and Mr. Ardayfio to give a presentation to his students.
“Is this legal?” one student asked.
Mr. Nguyen said they had violated some companies’ terms of service, but not the law. (PimEyes removed the students’ access to its product, according to the company’s chief executive, because they had uploaded photos of people without their consent.)
Massachusetts does not prohibit the identification of people with facial recognition technology, but it does forbid recording conversations without consent, even in public, Woodrow Hartzog, a law professor at Boston University, said.
“This whole incident shows how easy it is to secretly surveil people with these glasses,” he said.
If the students had asked for permission to feature Mr. Hoda in their video, he would have given it, Mr. Hoda said. He thought it was an important demonstration of what new technology made possible.
“When I got on the internet in the ’90s, it was called a global village,” Mr. Hoda said. “And now the world really is becoming one, which is good and bad. Just like a village, everyone is up in your business. Privacy will be impossible.”