Meta redirects inquiries about child harm to the Google and Apple app stores

The company called for rules mandating parental consent for app downloads on the same day the US Senate began investigating its failures to protect minors.

The Senate began looking into Meta's shortcomings in protecting children who use its platforms on the same day that Meta urged US politicians to regulate Google's and Apple's app stores.

In a blog post titled "Parenting in a Digital World Is Hard. Congress Can Simplify It," Meta's global head of safety, Antigone Davis, advocated for federal legislation that would require app stores to notify parents whenever a child between the ages of 13 and 16 downloads an app and to request their consent. It is already against the law for children under 13 to register accounts or download apps without parental permission.

Google and Apple, which are not specifically named in Meta's blog post, run the world's largest smartphone app stores: the Play Store for Android and the App Store for iOS on the iPhone. Both would be the focus of any legislation governing app downloads for kids.

Davis said that regulations requiring parental approval before a child can open a social media account are not the "better way" to regulate smartphone and internet usage. In March, for instance, Utah began requiring parents of minors under 18 to give permission before their children could use TikTok, Instagram, Facebook, and other apps, in an effort to protect "the mental health of our youth," as state governor Spencer Cox put it.

On the same day that Davis's call was made public, the Senate Judiciary Committee wrote to Meta CEO Mark Zuckerberg, asking him to "provide documents related to senior executives' knowledge of the mental and physical health harms associated with its platforms, including Facebook and Instagram." The letter requests the materials by November 30. As of press time, neither Apple nor Google had commented.

After this story was published, Meta released the following statement: "We've always said that we support internet regulation, especially with regard to young people." That said, the company is worried about what it describes as a disorganized patchwork of differing laws across US jurisdictions, arguing that laws holding apps to varying standards in different states will leave teens with uneven online experiences. Since 2021, the company said, it has backed legislation establishing "clear industry standards" for age verification and parental monitoring.

In response, Meta came under fire from the National Society for the Prevention of Cruelty to Children, which said, "It is encouraging to see that social media regulation is already focusing the minds of top tech executives on protecting children." However, the charity said, Meta is trying to shift the blame when it ought to be putting its own house in order, having sat on its hands despite knowing the harm its platforms can do to kids.

The committee decided to launch a preliminary probe a week after a former high-level Meta employee testified before it about the harm Instagram can cause to minors, including his own daughter. He claimed that when he voiced his concerns internally, Meta's leaders disregarded them.

"I appear before you today as a dad with first-hand experience of a child who received unwanted sexual advances on Instagram," Arturo Bejar, a former technical director at Instagram, told the senators.

Another Meta whistleblower, Frances Haugen, focused much of her testimony on the same topic when she appeared before Congress in October 2021, disclosing through internal documents how company leaders had disregarded warnings about the negative effects of social media use on teenage girls.
