Instagram parent company Meta has introduced new safety features aimed at protecting teens on its platforms, including more information about the accounts that message them and a one-tap option to block and report those accounts.

The company also announced that it has removed thousands of accounts that were leaving sexualized comments on, or requesting sexual images from, adult-run accounts featuring kids under 13.

Meta said teen users blocked more than a million accounts and reported another million after seeing a “safety notice” that reminds people to “be cautious in private messages and to block and report anything that makes them uncomfortable.”

Earlier this year, Meta began testing the use of artificial intelligence to determine whether kids are lying about their ages on Instagram, which technically allows only users 13 and older.

If a user is determined to be misrepresenting their age, the account will automatically be converted to a teen account, which carries more restrictions than an adult account. In 2024, the company made teen accounts private by default.