© Reuters. FILE PHOTO: Meta and Facebook logos are seen in this illustration taken February 15, 2022. REUTERS/Dado Ruvic/Illustration/File Photo

By Katie Paul

NEW YORK (Reuters) – A former Meta employee is testifying before a U.S. Senate subcommittee on Tuesday, alleging that the Facebook and Instagram parent company was aware of harassment and other harms facing teens on its platforms but failed to address them.

The employee, Arturo Bejar, worked on well-being for Instagram from 2019 to 2021 and earlier was a director of engineering for Facebook’s Protect and Care team from 2009 to 2015, he said.

Bejar is testifying before the Senate Judiciary Subcommittee on Privacy, Technology and the Law at a hearing about social media and its impact on teen mental health.

“It’s time that the public and parents understand the true level of harm posed by these ‘products’ and it’s time that young users have the tools to report and suppress online abuse,” he said in written remarks made available before the hearing.

Bejar’s testimony comes amid a bipartisan push in Congress to pass legislation that would require social media platforms to provide parents with tools to protect children online.

The goal of his work at Meta was to influence the design of Facebook and Instagram in ways that would nudge users toward more positive behaviors and provide tools for young people to manage unpleasant experiences, Bejar said at the hearing.

Meta said in a statement that it is committed to protecting young people online, pointing to its backing of the same user surveys Bejar cited in his testimony and its creation of tools like anonymous notifications of potentially hurtful content.

“Every day countless people inside and outside of Meta are working on how to help keep young people safe online,” the Meta statement said. “All of this work continues.”

Bejar told senators he met regularly with senior executives at the company, including Chief Executive Mark Zuckerberg, and considered them supportive of the work at the time. However, he testified that he subsequently concluded the executives had decided “time and time again to not tackle this issue.”

In one 2021 email, Bejar flagged to Zuckerberg and other top executives internal data revealing that 51% of Instagram users had reported having a bad or harmful experience on the platform in the past seven days and that 24.4% of children aged 13-15 had reported receiving unwanted sexual advances.

He also told them that his own 16-year-old daughter had been sent misogynistic comments and obscene photos, and that she lacked adequate tools to report those experiences to the company. The existence of the email was first reported by the Wall Street Journal.

In his testimony, Bejar recounted that in one meeting Meta Chief Product Officer Chris Cox was able to cite precise statistics on teen harms off the top of his head.

“I found it heartbreaking because it meant that they knew and that they were not acting on it,” said Bejar.

(This story has been refiled to fix a typo in the headline)

CFDs are complex instruments and have a high risk of loss due to leverage and are not recommended for the general public. Before trading, consider your level of experience, relevant knowledge, and investment objectives and seek financial advice. Vittaverse does not accept clients from OFAC sanctioned jurisdictions. Also, read our legal documents and make sure you fully understand the risks involved before making any trading decision