Meta Nairobi Ray-Ban Scandal

A shocking new investigation reveals that Meta’s Ray-Ban glasses aren’t just using AI—they’re sending intimate footage from private homes to human reviewers in Nairobi. From bathrooms to bank cards, your privacy might be an open book.

We’ve all seen the futuristic ads. You’re walking through the streets of Accra or Nairobi, wearing sleek Ray-Bans, and with a simple “Hey Meta,” your glasses tell you exactly what you’re looking at or translate a sign on the fly. It feels like living in the future.

But a recent bombshell investigation has revealed that the “AI” behind these glasses isn’t just a bunch of code—it’s actually a room full of people right here in our backyard.

A report published this week (March 2026) by journalists at the Swedish newspapers Svenska Dagbladet and Göteborgs-Posten has exposed a massive privacy breach involving Meta’s Ray-Ban smart glasses. It turns out that when you ask the AI to analyze your world, that footage is being sent to Nairobi, Kenya, where human workers at a company called Sama are manually watching and labeling your private life.

The details are, frankly, disturbing. Workers in the Nairobi office have reported seeing things they never should have seen. Because these glasses are worn on the face, they capture everything the wearer looks at, often without the wearer realizing the camera is still rolling.

Tech analyst and former product manager Aakash Gupta recently broke down the scale of this operation:

“Meta’s Ray-Ban glasses need human data annotators to train the AI… When you say ‘Hey Meta’ and ask the glasses to analyze something, that video gets sent to Meta’s servers, then routed to Sama, a subcontractor in Nairobi, Kenya. Workers there manually label objects in your footage. They see everything you recorded, intentionally or not… 7 million pairs sold in 2025 alone. Every single pair generates training data that flows through human eyes in Kenya. Workers told Swedish journalists they see people undressing, using bathrooms, having sex, and accidentally filming bank card details. One worker said, ‘We see everything, from living rooms to naked bodies.’”

The Privacy “Fail”

Meta claims these glasses were “designed with privacy in mind.” They even put a tiny LED light on the frame to show when they’re recording. But let’s be real: how many of us actually notice a tiny blinking light?

More importantly, the automatic blurring meant to protect identities is reportedly failing in low light. According to the investigation, workers have seen:

  • People undressing or in bathrooms.

  • Sensitive financial details like bank cards.

  • Intimate “sex scenes” filmed accidentally when glasses were left on bedside tables.

One worker told investigators: “If you start asking questions, you are gone.”

Why This Matters for Us

For the “Silicon Savannah” and the growing tech hubs across Africa, this is a wake-up call. We are often celebrated as the “workforce of the future” for AI training, yet the working conditions behind that label are frequently exploitative.

  • Intimate Surveillance: Your “private” moments are being reviewed by a stranger just a few kilometers away.

  • The “Name Tag” Threat: Meta is currently planning to add facial recognition (internally called “Name Tag”) to these glasses in 2026. This would allow wearers to identify strangers in real-time—a feature that could lead to instant doxxing and harassment.

  • Labor Exploitation: This involves the same company, Sama, that was previously called out for paying workers as little as $2/hour to filter graphic content for OpenAI.

Final Thoughts

Technology is great, but “smart” shouldn’t mean “spying.” Before you drop your hard-earned cash on the next big wearable, ask yourself if you’re comfortable with the “manual human review” buried in the terms of service.
