In 2021, Zuckerberg refused to expand Meta’s child safety team; now he has been forced to apologize to affected families
During a Senate hearing on online child safety, Meta CEO Mark Zuckerberg apologized to parents who say Instagram played a role in their children’s suicides or sexual exploitation, NBC News reports. Responding to a direct request from Senator Josh Hawley, Zuckerberg expressed sympathy for the families’ suffering and acknowledged the gravity of what they had experienced.
At a hearing held by the Senate Judiciary Committee titled “Big Tech and the Online Child Sexual Exploitation Crisis,” Zuckerberg, along with the CEOs of TikTok, Discord, X, and Snap, fielded pointed questions from lawmakers. Parents in attendance, holding photos of their children and wearing blue ribbons in support of the Kids Online Safety Act (KOSA), made their disapproval clear as Zuckerberg entered the room.
Zuckerberg’s apology, although not spoken directly into the microphone, was audible on the live stream. He emphasized Meta’s commitment to leading industry efforts to prevent such tragedies. The questioning touched on a range of serious issues, including the nonconsensual spread of sexually explicit images of minors on Instagram, harassment through social media, and other concerns.
Meta is currently embroiled in a federal lawsuit filed by several states alleging that Facebook and Instagram deliberately built addictive features for children and withheld data about the platforms’ harmful effects on young users. Senator Richard Blumenthal cited emails from Meta’s president of global affairs, Nick Clegg, in which Clegg expressed concern about the company’s progress on key well-being issues such as problematic use, bullying, harassment, and suicide and self-harm.
According to internal Meta correspondence from 2021, Zuckerberg rejected proposals from the company’s senior leadership to expand the team dedicated to child safety and well-being. Nick Clegg, then vice president of global affairs, requested an additional 45 employees to strengthen efforts against bullying, harassment, and self-harm among children. Despite support from other executives, Zuckerberg did not approve the expansion, even after a scaled-back version of the proposal was submitted later that year.
The refusal came at a time when internal documents, later published by the Wall Street Journal as “The Facebook Files,” revealed that the company was aware of the negative impact of its platforms, including Instagram, on teenagers’ mental health. The Senate Judiciary Committee released emails related to these discussions ahead of the hearing, highlighting internal debates within the company over how to address these issues.
Zuckerberg told the senators that Meta employs 40,000 people in trust and safety roles. However, questions arose about recent layoffs affecting those teams, to which Zuckerberg replied that the layoffs were company-wide and not specifically targeted at trust and safety.
Senator Thom Tillis addressed the CEOs of Meta, TikTok, Discord, X, and Snap, acknowledging the complexity of their roles and the challenge of balancing business goals with safety obligations. He expressed hope that they are working hard to reduce the harmful impact of their platforms.
“At the end of the day, I find it hard to believe that any of you people started this business, some of you in your college dorm rooms, for the purposes of creating the evil that is being perpetrated on your platforms,” said Tillis. “But I hope that every single waking hour, you’re doing everything you can to reduce it.”