Facebook co-founder and CEO Mark Zuckerberg testifies before the House Financial Services Committee in the Rayburn House Office Building on Capitol Hill October 23, 2019 in Washington, DC. Zuckerberg testified about Facebook’s proposed cryptocurrency Libra, how his company will handle false and misleading information by political leaders during the 2020 campaign and how it handles its users’ data and privacy.
Attorneys general from 44 states and territories urged Facebook to abandon its plans to create an Instagram service for kids under the age of 13, citing the detrimental health effects of social media on kids and Facebook’s reportedly spotty record of protecting children on its platform.
Monday’s letter follows questioning from federal lawmakers who have also expressed concern over social media’s impact on children. The topic was a major theme that emerged from lawmakers at a House hearing in March with Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai and Twitter CEO Jack Dorsey. Republican staff for that committee later highlighted online protection for kids as the main principle lawmakers should consider in crafting legislation.
BuzzFeed News reported in March that Facebook had been exploring creating an Instagram service for kids, based on internal documents it obtained.
Protecting children from harm online appears to be one of the rare issues Democrats and Republicans can agree on, which puts additional pressure on any company creating an online service for kids.
In Monday’s letter to Zuckerberg, the bipartisan group of AGs cited news reports and research findings that social media and Instagram, in particular, had a negative effect on kids’ mental well-being, including lower self-esteem and suicidal ideation.
The attorneys general also said young kids “are not equipped to handle the range of challenges that come with having an Instagram account.” Those challenges include online privacy, the permanence of internet posts and navigating what’s appropriate to view and share. They noted that Facebook and Instagram had reported 20 million child sexual abuse images in 2020.
Officials also based their skepticism on Facebook’s history with products aimed at children, saying it “has a record of failing to protect the safety and privacy of children on its platform, despite claims that its products have strict privacy controls.” Citing news reports from 2019, the AGs said that Facebook’s Messenger Kids app for children between six and 12 years old “contained a significant design flaw that allowed children to circumvent restrictions on online interactions and join group chats with strangers that were not previously approved by the children’s parents.” They also referenced a recently reported “mistake” in Instagram’s algorithm that served diet-related content to users with eating disorders.
“It appears that Facebook is not responding to a need, but instead creating one, as this platform appeals primarily to children who otherwise do not or would not have an Instagram account,” the AGs wrote. “In short, an Instagram platform for young children is harmful for myriad reasons. The attorneys general urge Facebook to abandon its plans to launch this new platform.”
In a statement, a Facebook spokesperson said the company has “just started exploring a version of Instagram for kids,” and committed to not show ads “in any Instagram experience we develop for people under the age of 13.”
“We agree that any experience we develop must prioritize their safety and privacy, and we will consult with experts in child development, child safety and mental health, and privacy advocates to inform it. We also look forward to working with legislators and regulators, including the nation’s attorneys general,” the spokesperson said.
Facebook isn’t the only social media platform that’s created services for kids. Google-owned YouTube has a kids service, for example, though as with any internet service, there are usually ways for kids to lie about their age to access the main site. In 2019, YouTube reached a $170 million settlement with the Federal Trade Commission and the New York attorney general over claims it illegally earned money by collecting children’s personal information without parental consent, allegedly violating the Children’s Online Privacy Protection Act (COPPA).
Following the settlement, YouTube said in a blog post that it would limit data collection on videos aimed at children, regardless of the age of the user actually watching. It also said it would stop serving personalized ads on child-focused content and disable comments and notifications on such videos.