Should legislators mandate that all digital products take public health into account?
The upshot of growing public concern is a rush to protect children from social platforms’ algorithms and echo chambers.
Australia banned under-16s from accessing social media last December. Others are now following suit.
Spain is moving to ban under-16s, and its legislation will also track how digital platforms fuel division and amplify hate speech. Slovenia is drafting a ban for under-15s. The UK is also mulling the idea.
The idea is gaining traction here too.
A recent Irish Times/Ipsos poll found that three-quarters of voters favour banning U16s from accessing social media.
Tánaiste Simon Harris, no stranger to social media himself, recently said that age limits for social media should be agreed at European level, while Communications Minister Patrick O’Donovan said that he would bring a proposal to Cabinet next month on age verification for young people on social media.
And while governments struggle with a legislative response, something strange happened last week: one social platform decided to “age-gate” itself.
The platform in question is Discord, which started life as a voice-chat service for gamers who needed reliable audio during multiplayer sessions.
It has since evolved into a host for communities of coders, music fans, political activists, crypto traders, school clubs and more, all living on private servers. Unlike other platforms, there’s no algorithmically driven feed.
So what did they announce?
On Monday, the company said it would roll out age verification, with all accounts automatically set to a “teen-appropriate” experience unless users demonstrate that they’re adults.
Users who aren’t verified as adults will be unable to access age-restricted servers and channels, and will have filters applied to anything Discord detects as graphic or sensitive.
“We design our products with teen safety principles at the core and will continue working with safety experts, policymakers, and Discord users to support meaningful, long-term well-being for teens on the platform”, said Savannah Badalich, head of product policy at Discord.
Discord is hardly a social giant. But it’s no slouch either, boasting almost 700 million registered users and 259 million monthly active users. And its proactive approach certainly makes a change from the usual response to teen safety from tech platforms.
Meta has consistently said that age verification at the app store or operating system level might be a more practical approach. This effectively means it wants age-gating to be Apple or Google’s problem.
Meta has also spent a fortune on traditional media promoting the parental controls on Instagram. Google has advocated robust, platform-designed age assurance and parental controls, and has insisted that YouTube is not a social platform.
Representatives of TikTok, meanwhile, have argued that a social media ban would see children move to more unsafe areas of the internet.
This seems like a shifty line of reasoning, but there is a legitimate point here. The awkward truth is that this sort of legislation leads to migration.
In Australia, teenagers haven’t logged off the internet. Rather, they’ve reportedly moved to gaming platforms and messaging services that sit outside the legislation – which includes Discord.
This is the pattern of every well-intended ban: squeeze one area and another expands.
So are outright bans the solution?
Or should legislators be mandating that all digital products – large and small – be designed with some consideration for users’ mental health, including minors and vulnerable adults?
Legislators are considering this too. But it’s a more nuanced idea, and less compelling to voting parents than a flat ban for kids.
EU regulators recently found that TikTok hasn’t done enough to assess how its addictive design features have led to compulsive use that could harm users.
The European Commission, which enforces the Digital Services Act, wants TikTok to change the basic design of its service, or face hefty fines.
The Digital Services Act also covers better product design, including guidelines on content moderation and transparency around how large platforms’ algorithms work.
It’s a comprehensive framework that aims to create a safer, more accountable digital space. But it’s far less memorable than pearl-clutching advocacy that asks “won’t someone please think of the children?”
I fully expect more countries to opt for outright bans. And maybe that’s not a bad thing in the short term.
A final thought: I can’t help wondering if the real issue here is that the terminally online generation – the children we’re so eager to keep from social media – have a better understanding of its trade-offs than the adults do.
Maybe the people who really need protection from algorithms and echo chambers – the people who struggle with doomscrolling and disinformation – are the adults?
