Instagram is testing a new feature that lets users reset their recommendation algorithm, aimed at promoting a safer and more positive experience, especially for teens.
This update comes after studies revealed the harmful effects of excessive social media use on teenagers’ mental health.
According to research, spending over three hours daily on social media can double the risk of depression and anxiety in teens.
To address this concern, Instagram’s parent company, Meta, wants to ensure users have valuable and age-appropriate experiences.
How the New Feature Works:
The algorithm reset allows users to clear recommended content across the Explore, Reels, and Feed tabs and restart their feed with fresh suggestions. Users can then refine future recommendations by marking posts as “interested” or “not interested”. The update aims to give users more personalised content and better control over their feeds.
Instagram plans to roll out this feature globally, although the exact launch date is unknown. The move follows the US Surgeon General’s call for warning labels on social media, similar to those on cigarette packaging.
Earlier this month, Australian Prime Minister Anthony Albanese announced that the government will legislate a ban on social media for children under 16, a measure his government describes as ‘world-leading’.
The social media platforms that would be affected include Instagram and Facebook, as well as TikTok and X. In September, Instagram rolled out new privacy features and parental controls for users under 18.
This ‘Teen Accounts’ upgrade aims to keep young Instagram users safe. Mark Zuckerberg announced on Threads that these accounts come with built-in protections that limit what teens see and who can message them.
Instagram has placed teen accounts on its most restrictive content filter setting, which is designed to ensure that inappropriate content is systematically blocked or hidden from teenagers.