From Adult Content to Canine Waste: Meet the ‘Taskers’ Sourcing Data for Meta’s AI Initiative

According to a report from the Guardian, a company partially owned by Meta has compensated tens of thousands of individuals for training artificial intelligence by analyzing Instagram profiles, collecting copyrighted material, and transcribing adult content. Scale AI, in which Meta holds a 49% stake, has enlisted professionals from various disciplines, such as medicine, physics, and economics, ostensibly to enhance advanced AI systems through a platform named Outlier. The platform promotes itself as offering flexible job opportunities for highly qualified individuals, encouraging them to “become the expert that AI learns from.”

However, many contractors have indicated that their tasks have diverged significantly from the stated goal of refining AI systems, instead involving the collection of personal data from numerous users, which they found ethically troubling. Outlier is operated by Scale AI, a firm that holds contracts with the Pentagon and other defense entities.

Alexandr Wang, the founder of Scale AI and now Meta’s chief AI officer, has been recognized by Forbes as the “world’s youngest self-made billionaire.” Additionally, a former managing director of the firm, Michael Kratsios, serves as a science advisor to President Donald Trump.

One contractor based in the United States expressed that users of Meta’s platforms, like Facebook and Instagram, would likely be shocked to learn how their data, including personal images, is being utilized. “I don’t think people realized that there would be someone sitting at a desk in a random state, examining their social media profile to generate AI training data,” they stated.

The Guardian interviewed ten individuals who have worked for Outlier, some for over a year, many of whom also held other jobs as journalists, graduate students, educators, and librarians. With the economy facing challenges due to AI advancements, many sought additional income. One individual commented, “A lot of us were really desperate. Many people genuinely needed this job, myself included, and we tried to make the best of a difficult situation.”

Similar to a growing global workforce of AI gig workers, many believed they were training their own successors. One artist expressed feelings of “internalized shame and guilt” regarding their role in facilitating the automation of their aspirations. “As an aspiring human, it makes me angry at the system,” they remarked.

Glenn Danas, a partner at Clarkson, a law firm representing AI gig workers in lawsuits against Scale AI and similar platforms, estimates that hundreds of thousands of individuals are currently engaged in work for platforms like Outlier. The Guardian spoke with Outlier “taskers” from the UK, US, and Australia.

Taskers recounted the familiar humiliations associated with AI gig work, including constant surveillance and precarious employment. Scale AI has faced allegations of employing “bait-and-switch” tactics to attract prospective workers, initially promising high salaries only to offer much lower wages later. While Scale AI declined to comment on ongoing litigation, a source indicated that pay rates may change only if workers choose to participate in different, lower-paying projects.

Taskers reported being required to undergo multiple rounds of unpaid, AI-conducted interviews to qualify for certain assignments; many believed these interviews were themselves repurposed as AI training data. All participants noted that they were consistently monitored via a system called “Hubstaff,” which could capture screenshots of the websites they visited while working. A Scale AI representative stated that Hubstaff was intended to ensure accurate payment, not for “active monitoring” of taskers.

Some taskers shared experiences of being instructed to transcribe adult film soundtracks or categorize photos of deceased animals and dog waste. One doctoral student recounted being assigned to label a diagram depicting baby genitalia, while others encountered police reports detailing violent incidents.

The Guardian has reviewed videos and screenshots of tasks assigned to Outlier workers, which included labeling images of dog waste and responding to prompts like, “What would you do if an inmate refused to follow orders in a correctional facility?”

According to the Scale AI representative, they terminate tasks if inappropriate content is flagged, and workers are not obligated to continue with assignments that make them uncomfortable. The source added that Scale AI does not accept projects involving child exploitation materials or pornography.

Taskers indicated there was an expectation of social media data collection as part of their work. Seven of them mentioned that they were tasked with exploring other users’ Instagram and Facebook profiles, tagging individuals, their locations, and their friends. Some assignments involved training AI using accounts belonging to individuals under the age of 18. The structure of these tasks often required new data that had not yet been uploaded by other workers, leading them to delve into the social media accounts of more users.

The Guardian has obtained one such task that mandated workers to choose photos from individuals’ Facebook accounts and organize them sequentially by the users’ ages. Several taskers expressed discomfort with these assignments; one attempted to complete tasks solely using images of celebrities and public figures. “I was uneasy about including pictures of children, but the training materials sometimes included minors,” one tasker confessed.

“I didn’t use any friends or family for my submissions to the AI,” another stated. “I understand that this raises ethical concerns for me.”

The Scale AI representative clarified that taskers do not review accounts that are set to “private,” and said the company was not aware of tasks that involved labeling users’ ages or personal relationships.
