Sunday, June 23, 2024

AI Tools Are Being Trained on Real Images of Children


In Short:

Personal details and over 170 images of children from Brazil have been taken without consent and used to train AI, according to a report by Human Rights Watch. The images were scraped from family blogs and YouTube videos, violating the children's privacy. The LAION-5B dataset, which included these images, has been taken down following reports that it linked to illegal content. YouTube says it is taking action against unauthorized scraping of its content.


Over 170 images and personal details of children from Brazil have been scraped into an open-source dataset without their knowledge or consent and used to train AI, according to a new report from Human Rights Watch released Monday.

Details of the Report

The images were scraped from content posted as recently as 2023 and as far back as the mid-1990s, according to the report, long before any internet user could have anticipated that their content might be used to train AI. Human Rights Watch claims that personal details of these children, alongside links to their photographs, were included in LAION-5B, a dataset that has been a popular source of training data for AI startups.

Concerns Raised

“Their privacy is violated in the first instance when their photo is scraped and swept into these datasets. And then these AI tools are trained on this data and therefore can create realistic imagery of children,” says Hye Jung Han, children’s rights and technology researcher at Human Rights Watch and the researcher who found these images. “The technology is developed in such a way that any child who has any photo or video of themselves online is now at risk because any malicious actor could take that photo, and then use these tools to manipulate them however they want.”

About LAION-5B Dataset

LAION-5B is based on Common Crawl—a repository of data that was created by scraping the web and made available to researchers—and has been used to train several AI models, including Stability AI’s Stable Diffusion image generation tool. Created by the German nonprofit organization LAION, the dataset is openly accessible and now includes more than 5.85 billion pairs of images and captions, according to its website.

Source of Images

The images of children that researchers found came from mommy blogs and other personal, maternity, or parenting blogs, as well as stills from YouTube videos with small view counts, seemingly uploaded to be shared with family and friends.

Actions Taken

LAION spokesperson Nate Tyler says the organization has already taken action. “LAION-5B were taken down in response to a Stanford report that found links in the dataset pointing to illegal content on the public web,” he says, adding that the organization is currently working with “Internet Watch Foundation, the Canadian Centre for Child Protection, Stanford, and Human Rights Watch to remove all known references to illegal content.”

YouTube’s Response

YouTube’s terms of service do not allow scraping except under certain circumstances; these instances seem to run afoul of those policies. “We’ve been clear that the unauthorized scraping of YouTube content is a violation of our Terms of Service,” says YouTube spokesperson Jack Maon, “and we continue to take action against this type of abuse.”

Concerns and Additional Findings

In December, researchers at Stanford University found that AI training data collected by LAION-5B contained child sexual abuse material. The problem of explicit deepfakes is on the rise even among students in US schools, where they are being used to bully classmates, especially girls. Han worries that, beyond being used to generate CSAM, children's photos in the database could reveal potentially sensitive information, such as locations or medical data. In 2022, a US-based artist found her own image in the LAION dataset and realized it came from her private medical records.
