This is beyond the pale.


Your robot vacuums are watching you — and the resulting imagery of your most private moments can, horrifically, get leaked online.

As the MIT Technology Review reports, the aptly-named company iRobot, maker of the uber-popular Roomba vacuums, confirmed that gig workers outside of the US broke a non-disclosure agreement by sharing intimate photos, including one of a woman on the toilet, on social media.

As the report warns, however, those few images posted to private contract worker shop talk groups may be just the tip of the privacy iceberg when it comes to robot vacuums and other smart home products that collect a ton of invasive data.

Annotation Station

The images in question, some of which MIT Tech shared — though thankfully not the bathroom one — were snapped by the vacuums for the purpose of data annotation, the process in which humans confirm or correct the labels an AI has assigned to objects. For Roombas, data annotation is a necessary part of the vacuums' efficiency — these human gig workers, most of whom live abroad, tell the robots' underlying AI system whether the thing in front of it is indeed a chair or if it's, say, a dog.

While the data annotation process is integral to Roomba-style vacuums and other AI-enabled robotics, most people are unaware of it, though iRobot claimed in its responses to MIT Tech that the leaked images came from development robots bearing a bright green label that read "video recording in process."

Still, the fact remains not only that these machines we bring into our homes are recording us, but also that there are humans out there who see the images and videos they capture — a creepy concept even for those who shrug off consumer privacy concerns.

More on creepy AI: There's a Problem With That AI Portrait App: It Can Undress People Without Their Consent
