Thursday, 19 September 2024 13:15

Instagram's new teen accounts: Meta's attempt to protect youth online

[Image: Instagram. Credit: Pixabay]

With increasing scrutiny over the impact of social media on young users, Instagram has introduced new measures to enhance safety for teens on its platform. These changes are aimed at addressing concerns about exposure to harmful content, unwanted interactions, and excessive screen time. As part of these efforts, Meta, Instagram's parent company, is rolling out new account types specifically designed for users under 18, marking a significant step in social media regulation for minors.

Overview of teen account introduction

Starting Tuesday, Instagram is launching dedicated teen accounts in countries including the U.S., U.K., Canada, and Australia. The initiative automatically places users under 18 into these restricted accounts, with existing accounts transitioning over the next 60 days; for teens in the European Union, the changes are expected later this year. These accounts carry stronger privacy and safety restrictions, designed to shield young users from inappropriate content and unwanted interactions.

Meta's president of global affairs, Nick Clegg, emphasized the need for better supervision, acknowledging the growing concerns from parents, educators, and regulators about social media's impact on youth mental health and well-being. As of 2023, more than 95% of U.S. teens aged 13 to 17 use social media, according to the Pew Research Center, with one-third of them reporting near-constant usage. These figures illustrate the magnitude of the issue that Meta seeks to address with its latest updates.

Key features of teen accounts

Teen accounts on Instagram are private by default, meaning only approved followers can view their posts. Additionally, private messages are restricted, allowing teens to receive messages only from users they follow or have connected with. Meta has also implemented limitations on content deemed sensitive, such as videos of physical confrontations or posts promoting cosmetic procedures, to minimize potential harm.

To further encourage healthy usage habits, Meta has introduced time management tools within teen accounts. Notifications will prompt teens when they spend more than 60 minutes on the app, encouraging them to take breaks. A "sleep mode" feature will mute notifications and send auto-replies to direct messages during night hours, specifically from 10 p.m. to 7 a.m. This feature aims to support healthier sleep patterns and minimize late-night screen time, which has been linked to negative mental health outcomes.

While these measures apply to all users under 18, there is some flexibility. Teens aged 16 and 17 can disable certain features, such as time-limit notifications, while those under 16 need parental consent to make such changes. Meta's head of product, Naomi Gleit, said these features were implemented in direct response to three main concerns voiced by parents: exposure to harmful content, unwanted contact from strangers, and excessive time spent on the platform.

Parental control and the evolving role of parents

In recent years, tech companies, including Meta, have shifted more responsibility onto parents to monitor their children's social media usage. U.S. Surgeon General Vivek Murthy criticized this trend in 2023, arguing that parents are often left to manage rapidly evolving technologies that have a profound influence on how their children develop socially and emotionally. He also noted that this is a challenge that prior generations did not face.

In response to such feedback, Meta has introduced more robust tools for parents to oversee their teens' Instagram accounts. Parents or guardians can now enable "parental supervision" features, which allow them to limit their teen's time on the app, down to specific intervals, such as 15 minutes. Meta also requires parents' permission for users under 16 to adjust certain account settings, reinforcing the company’s commitment to giving parents more control over their children's online experiences.

Instagram CEO Adam Mosseri, during an interview with ABC's Good Morning America, reiterated that the company is working hard to balance parental control with user autonomy. According to Mosseri, while the platform's changes aim to address the concerns of parents, they are designed to do so in a way that does not overwhelm parents with responsibility. "We’ve really decided that parents should be our north star," he explained, adding that Instagram's tools aim to shape a safer environment for teens while still allowing parents to get involved if they choose.

Legal challenges Meta is facing

While Meta's new features represent a step forward in improving teen safety on Instagram, they come at a time when the company is facing legal challenges across North America. Several U.S. states have filed lawsuits against Meta, accusing the tech giant of deliberately creating addictive features on Instagram and Facebook, which they allege have exacerbated the youth mental health crisis. These lawsuits assert that Meta has knowingly profited from the excessive use of its platforms by young users, at the expense of their mental and emotional well-being.

Ontario school boards in Canada have also taken legal action against Meta, Snap Inc. (which owns Snapchat), and ByteDance Ltd. (the parent company of TikTok). These lawsuits seek compensation for the alleged damage social media companies have caused to the learning environment. According to the claims, the platforms' design encourages compulsive use by students, thereby disrupting educational processes and impacting overall school climates.

How these lawsuits will play out remains uncertain. Meta has defended its record, pointing to the measures it has implemented in recent years, including the latest updates to teen accounts. Critics, however, argue that the changes may not go far enough to address the more deep-rooted problems associated with social media use among teens.

Instagram's introduction of teen accounts marks a notable attempt to address growing concerns about social media's impact on youth. With private accounts by default, restrictions on sensitive content, and tools for managing screen time, Meta aims to create a safer environment for young users. By giving both teens and parents more control over online interactions and exposure, the company is taking steps to mitigate the negative consequences of excessive social media use. As Meta continues to face legal challenges and public scrutiny, however, the long-term test will be whether these measures actually protect the next generation from the potential harms of social media.

source: CBC
