Meta, the company behind Facebook and Instagram, is facing a new class-action lawsuit in the United States over how its Ray-Ban Meta AI glasses handle user data. The lawsuit, filed on March 5, 2026, by the Clarkson Law Firm, followed an investigation by Swedish media revealing that workers at a Kenyan subcontractor had reviewed videos recorded by users of the glasses, including explicit content such as footage of nudity, sex, and intimate moments in the bathroom.
The lawsuit was filed by Gina Bartone of New Jersey and Mateo Canu of California, who say they bought the glasses persuaded by Meta’s marketing promises. The company advertised the glasses with phrases like “designed for privacy, under your control” and “built for your privacy,” which in no way suggested that users’ intimate footage would be viewed by foreign workers. The plaintiffs state that they have seen no exception or disclosure that would contradict those promises.
The lawsuit names Meta Platforms and its eyewear partner, Luxottica of America, as defendants, alleging violations of consumer protection laws. Meta claimed to blur faces in videos to protect privacy, but according to the BBC, sources disputed that the blurring works consistently.
The case has also attracted the attention of the UK’s data protection regulator, the Information Commissioner’s Office, which launched its own investigation. The Clarkson Law Firm, which has brought a number of high-profile lawsuits against tech companies over the years, highlights the scale of the problem: in 2025, more than seven million people bought Meta glasses, meaning their footage entered the review system with no way to opt out.
In a press release, Meta clarified that content recorded by users remains on the device until they themselves share it with the Meta AI system, and that review by contract workers is performed solely to improve the user experience, as stated in the terms of use. The company pointed to the Supplemental Meta Platforms Terms of Service without specifying where exactly that information appears. The BBC found the relevant note only in Meta’s UK terms of use, while the US version of the terms mentions only, in more general language, the possibility of manual review of users’ interactions with AI systems.
This lawsuit also matters for understanding the current moment in wearable technology. Meta glasses are among the most successful AI devices on the market, but their rise in popularity has been accompanied by a growing public backlash against what is increasingly called “luxury surveillance.” One developer recently released an app capable of detecting the presence of AI glasses nearby, a sign of how the fear of unannounced filming has entered everyday discourse. The problem this lawsuit brought to light is not technical but fundamental: when millions of people wear devices that constantly record their surroundings, the question of who has access to those recordings is not a detail in the terms of use but a central issue of trust between the company and the user, TechCrunch reports.