Meta leadership is blinding their ears and eyes to their platforms’ dangers, whistleblower says
By Fox Business
Key Concepts
- Unwanted Sexual Advances: This refers to unsolicited sexual propositions or requests made on social media platforms.
- Harmful Content: Content that is detrimental or damaging, particularly to young users, including sexually explicit material or content that facilitates exploitation.
- Recommendation Algorithms: The systems used by social media platforms to suggest content to users based on their past behavior and engagement.
- Whistleblower: An individual who exposes illegal or unethical activity within an organization.
- User Safety Research: Research focused on understanding and mitigating potential harms, particularly exploitation, within digital environments.
- Protect and Care Team: A team within a company responsible for safeguarding users, especially vulnerable ones, from harm.
Testimony Before Congress and Whistleblower Allegations
Arturo Bejar, a former Director of Engineering at Facebook who was responsible for its Protect and Care team, testified before Congress in 2023. He presented alarming statistics and personal accounts regarding the dangers children face on Meta's social media platforms, specifically Instagram.
- Prevalence of Unwanted Sexual Advances: Bejar stated that "one in eight kids, age 13-15 years old, experienced unwanted sexual advances in the last seven days." He emphasized that it is "unacceptable that a 13-year-old girl gets propositioned on social media."
- Meta's Knowledge of Harm: Bejar asserted that Meta "knowingly exposed children to" these dangers. He claims that Instagram is aware when a child spends a significant amount of time viewing "harmful content" that the platform itself is recommending.
- Accountability for Recommendations: Bejar called for Meta to be held accountable for its recommendation algorithms and for the "unwanted sexual advances that Instagram enables."
- Parental Distress: He observed that "just about every parent and grandparent has seen their kid's faces change from happiness to grief to distress the moment that they check social media." He believes the public and parents need to understand the "true level of harm enabled by these products."
Inadequacy of Existing Tools and Discontinuation of Work
Bejar detailed his experience upon returning to Facebook in 2019, finding that crucial work on user protection had been discontinued.
- Lack of Reporting Mechanisms: He stated that when he returned, "the work had been discontinued and decimated." Specifically, if a girl was experiencing unwanted sexual advances, "there was no way for her to let Instagram know that that happened."
- Executive Team's Awareness: Bejar believes the executive team was not fully aware of the extent of the harm. He conducted research involving "hundreds of thousands of people" which revealed "staggering amounts of harm." These findings were brought to the attention of the executive team, but he felt "we're not doing the work to make it better."
Personal Experience and Direct Appeal to Mark Zuckerberg
Bejar shared a personal account of his daughter's experience on Instagram, highlighting the direct impact of these platform vulnerabilities.
- Daughter's Exploitation: Bejar's daughter began receiving "unwanted sexual advances on Instagram shortly after joining when she was just 14 years old." This included "requests for sex, unwanted nude pictures."
- Lack of Recourse: At the time, there was "nothing she could do" to report these advances.
- Direct Communication with Zuckerberg: Bejar personally emailed Mark Zuckerberg, with whom he had worked closely for years, to inform him of the situation four years prior to the interview.
- Continued Inaction: Despite his whistleblowing two years prior to the interview, Bejar stated that "today there's still no way for a 13-year-old girl to let Meta know they received [an] unwanted sexual advance." He accused the company of "blinding their ears and eyes to the harm happening on the platform" while simultaneously assuring parents that their products are safe.
Conclusion and Call for Awareness
Arturo Bejar's testimony underscores a critical disconnect between Meta's public messaging about product safety and the reality experienced by young users. He argues that the platform's design and recommendation systems actively contribute to the exposure of children to harmful content and unwanted sexual advances, with inadequate mechanisms for reporting and addressing these issues. His personal experience and extensive research serve as a stark warning to parents and the public about the dangers lurking on social media platforms.