Protecting minors should be Meta’s objective

By Angie Hund | Manor Ink


Conditioned to consume, generations of Internet users partake in the thriving enterprise of social media. From traditional platforms like Facebook, Instagram and WhatsApp to more contemporary services like TikTok and Discord, social media continues to grow, becoming a central part of culture, communications and education.

Meta, the company formerly known as Facebook, now encompasses that platform as well as Instagram, Threads and WhatsApp, all led by billionaire co-founder and CEO Mark Zuckerberg. With a personable online image, more than 65,000 employees and a market value of more than $885 billion, Meta is arguably the most important technology company of our time.

But the services Meta provides are not as benign as they may seem.

In November, Judge Yvonne Gonzalez Rogers of the U.S. District Court for the Northern District of California ruled that Meta can be held accountable for harmful content to which the company’s platforms have exposed young users. Beyond that ruling in California, Meta is also contending with a multitude of similar lawsuits from attorneys general across the country. The New York Times reported that Meta’s Instagram platform “routinely documented” children under 13 and collected their personal data. According to a complaint brought by the attorneys general of 33 states, Meta has received more than one million reports of underage users on Instagram since 2019; the company, however, has disabled only a fraction of those accounts.

In an attempt to improve its reputation and protect itself in the legal realm, Meta has begun a campaign to “protect children” by pushing for age restrictions at the app-store level, thus shifting to app stores the job of keeping underage users off Instagram and other Meta apps.

This effort is, however, a way for Meta to dodge responsibility.

Meta-sponsored legislation

I am a daily user of Spotify, the digital music, podcast and video service, and recently I came across an ad sponsored by Meta. The ad called for public support of Meta-backed legislation that would restrict minors from creating social media accounts without their parents’ permission. Since keeping kids from viewing harmful content online would mean work for Meta, the company’s proposal would simply pass that task on to app stores. The ad stated, “As an industry, we should come together with lawmakers to create simple, efficient ways for parents to oversee their teens’ online experiences.” The proposal is clearly just a way for Meta to pass the buck and avoid taking responsibility for its platforms’ inappropriate content reaching minors.


According to Meta’s global head of safety, Antigone Davis, the legislation would hand app stores the responsibility of ensuring that young kids are not on 18-plus apps. “We support federal legislation that requires app stores to get parents’ approval whenever their teens under 16 download apps,” Davis said. “With this solution, when a teen wants to download an app, app stores would be required to notify their parents, much like when parents are notified if their teen attempts to make a purchase.” In practice, the proposal would mean users having to confirm their identity with an ID, being kicked off a platform if they cannot, and minors needing parental permission to download paid apps.

Company shields itself

However Meta’s proposal is interpreted by the public or by lawmakers, the company will not have made amends for its past misconduct. In an attempt to shield itself from further legal jeopardy, Meta is advocating placing responsibility for protecting youth on app stores. But improving age restrictions requires a shift in social platform policy, one that demands Meta participate equally in resolving the issue. Meta is the source of the problem, since its platforms allow the posting of controversial or inappropriate content. The app stores have nothing to do with what is on Meta’s platforms.

So how could Meta take responsibility for what is posted on its platforms? That’s a no-brainer. They could simply do what they are suggesting the app stores do: require an ID proving a user is older than 13 when downloading an app or joining a group, and obtain parental permission for minors who wish to use the service.

How these restrictions would be implemented would be up to Meta, but given the company’s expertise with algorithms and data mining, building age and content safeguards should be a walk in the park. They just have to do it, and not fob the responsibility off on a third party.