Instagram is testing a new feature that allows users to completely clear all recommended content from their feeds. The "reset" tool, set to roll out globally soon, will let users refresh their experience by wiping out algorithm-driven suggestions. However, Instagram notes that recommendations will gradually personalize again over time as users interact with content.

This development comes alongside a series of new features aimed at promoting user well-being on the platform. In recent months, Instagram has introduced various safety updates, including enhancements in October and specialized teen accounts in September.

The move has been welcomed by the UK media regulator Ofcom, which commended Instagram for taking steps to improve online safety ahead of the implementation of the UK's new Online Safety Act (OSA). "It's encouraging to see Instagram implement these changes before regulations begin to take effect, and we'll be pushing for businesses to do more to empower and protect their users," Ofcom said.

The OSA, once fully enforced, will require major platforms to provide users with greater control over their online experiences. Ofcom has warned tech companies that significant changes will be necessary to comply with the new law.

Meta, Instagram's parent company, explained in a blog post that the reset feature would be available to all users, including teenagers. Accessible through the "content preferences" menu, the tool allows users to reset their suggested content and unfollow accounts whose posts frequently appear in their feeds. Meta emphasized its commitment to ensuring that Instagram provides safe, positive, and age-appropriate experiences for everyone.

Currently, Instagram users can influence recommendations by indicating whether they like or dislike individual posts. The new reset feature offers a more comprehensive option for starting fresh.
The initiative mirrors a similar feature already available on TikTok, where users can refresh their "For You" feed. Its introduction aligns with the upcoming enforcement of the OSA in December, after which companies like Meta will have three months to assess and mitigate the risk of illegal content appearing on their platforms. Additionally, by April 2025, the regulator is due to finalize its Children's Safety codes, including requirements to give young users more control over their social media feeds.