
Instagram Overhauls Safety Measures for Child-Centric and Teen Accounts

Instagram is introducing strengthened safety measures for accounts managed by adults that predominantly feature children. According to a recent Meta announcement, corroborated by third-party coverage, these accounts will now be placed automatically under the platform’s strictest messaging settings and have offensive comments filtered via the “Hidden Words” tool.

This update responds to mounting concern over individuals attempting to leave sexualized comments or request inappropriate images via direct messages, behavior that Instagram says has led to the removal of approximately 135,000 accounts along with linked networks. By preventing these child-focused accounts from being recommended to users flagged as potentially suspicious, including adults who have previously been blocked by teens, Instagram aims to reduce unwanted attention and ensure that underage users are properly protected in spaces that should be safe for them.

Parallel improvements are being made to Teen Accounts. Messaging safety now includes in-chat tips, a combined block-and-report option in DMs, and a visible timestamp showing when a chatting account was created, giving teenagers easy access to safety tools and helping them assess who they are talking to. These changes follow data showing that teens used safety prompts to block or report accounts over two million times in June alone.

These enhancements build on previous efforts such as Instagram’s Teen Account rollout, which already defaulted to stricter privacy settings and limited messaging capabilities. However, recent scandals, including lawsuits and internal reviews, have highlighted the risks children and teens face on social platforms, prompting Meta to take more aggressive action.