
Roomba testers feel duped after intimate photos hit Facebook

“Much of this language seems designed to exempt the company from applicable privacy laws, but none of it reflects the reality of how the product works.”

What’s more, test participants had to accept that their data could be used for machine learning and object recognition training. Specifically, the “Use of Research Information” section of the Global Testing Agreement required an acknowledgment that “text, video, images, or audio…may be used by iRobot to analyze usage statistics and data, diagnose technology problems, improve product performance and innovation features, market research, commercial presentations and internal training, including machine learning and object recognition.”

What isn’t explained here is that iRobot does its machine learning training using human data labelers, who, click by click, teach the algorithms to recognize individual elements captured in the raw data. In other words, the agreements shared with us never explicitly mention that personal images would be viewed and analyzed by other people.

Baussmann, a spokesperson for iRobot, said the language we highlighted “covers a variety of test scenarios” and isn’t specific to images sent for data annotation. “For example, testers are sometimes asked to take photos or videos of a robot’s behavior, e.g. if it gets stuck on a certain object or won’t fully dock, and send those photos or videos to iRobot,” he wrote, adding that “for tests that capture images for annotation purposes, certain conditions apply, set out in the agreement accompanying this test.”

He also wrote that “we cannot be certain that the people you spoke to were part of the development work related to your article,” although he did not specifically dispute the validity of the global testing agreement, which is ultimately what allows data collected from test users to be used for machine learning.

What users really understand

When we asked privacy advocates and academics to review the consent agreements and shared the test users’ concerns with them, they saw the documents and the resulting privacy violations as emblematic of a broken consent framework that affects us all, whether we are beta testers or regular consumers.

Experts say companies are well aware that people rarely read privacy policies closely, if they read them at all. But what iRobot’s global testing agreement attests to, says Ben Winters, an attorney at the Electronic Privacy Information Center who focuses on AI and human rights, is that “even if you do read it, you still don’t get clarity.”

Rather, “much of this language seems designed to exempt the company from applicable privacy laws, but none of it reflects the reality of how the product works,” says Cahn, pointing to the mobility of robot vacuums and the impossibility of controlling where potentially sensitive people or objects, especially children, might be in the home at any given moment.

Ultimately, this “place[s] much of the responsibility … on the end user,” notes Jessica Vitak, a computer scientist at the University of Maryland College of Information Studies who studies research best practices and consent policies. Yet the agreement doesn’t give users a detailed accounting of “how things could go wrong,” she says, “which would be very valuable information when deciding whether to participate.”
