In late February, the Swedish publications Svenska Dagbladet and Göteborgs-Posten published an investigation into Meta’s AI training pipeline, revealing that Meta contractors in Kenya help train the artificial intelligence powering the company’s smart glasses (comprising the Ray-Ban Meta Wayfarer (Gen 2), the Ray-Ban Display, and the Oakley Meta HSTN models). What the contractors saw was startling.
“We see everything, from living rooms to naked bodies,” a worker said in the report. “Meta has that type of content in its databases.”
Any user who opts into sharing data for AI training effectively allows all parts of their life to be recorded and then reviewed, either by the AI models the footage is meant to train or by the humans behind them. That includes footage of people in bathrooms, undressing, and watching porn; in at least one documented case, a pair of glasses left on a bedside table captured a partner who had never consented to being recorded.
The report triggered legal action. On March 4, plaintiffs Gina Bartone and Mateo Canu filed a class action lawsuit against Meta Platforms (and glasses-maker Luxottica of America), accusing the companies of violating federal and state laws by failing to disclose that videos captured by the glasses are transmitted to Meta’s servers and then to a Kenyan subcontractor for manual labeling. Citing new privacy bills and regulations prompted by the rise of AI and the surveillance economy, the suit says that “Meta knows this,” referring to the public’s growing concern over privacy and safety, and that “against this backdrop,” Meta released the glasses with a “reassuring promise: The glasses were ‘designed for privacy, controlled by you.’”
(When Google unveiled its prototype Google Glass in 2013, it ignited a fierce public backlash over surveillance, consent, and the death of anonymity. Bars, restaurants, casinos, and strip clubs banned the device outright, and wearers were mockingly dubbed “Glassholes.”)
Hall said his biggest concern isn’t actually the glasses-wearers themselves; it’s everyone else caught in the frame. “The bystanders, the people who are being filmed and identified, they’re the ones that are at risk,” he said. “Sadly, our privacy laws are not designed to protect those people. They’re designed to protect the people who are wearing the glasses and their ability to manage their own data.”
Hall noted existing law is simply not built for what Meta’s glasses make possible. “I don’t know that the existing laws are really sufficient to protect us from the risks of the kind of things that Meta and other social media companies are doing right now,” he said. “It’s sort of getting shoehorned into the privacy laws, but those are rarely enforced as it is, and this is completely upending the whole framework that those were built upon.
“I’m not seeing that people are meaningfully addressing it in any way,” he said, noting that current regulations are piecemeal and fail to fully address privacy concerns. Until privacy is addressed, he said, “everything else is just kind of window dressing.”
Meta did not respond to requests for comment.