Instagram has announced a new feature that will notify parents when their teenage children repeatedly search for terms related to suicide or self-harm within a short period. The move comes amid growing pressure on governments to adopt regulations similar to Australia's ban on social media use for those under 16.
The platform, owned by Meta Platforms Inc., said it will send alerts to parents who have enabled its optional supervision setting when their teenagers repeatedly try to access content related to suicide or self-harm. The notifications will begin rolling out next week in Canada, the United States, Britain, and Australia.
Instagram said the alerts complement its existing efforts to shield teenagers from harmful content. "We have strict policies in place against content that promotes or glorifies suicide or self-harm," the company stated. Instagram already blocks such searches and directs users to support resources.
Governments worldwide are increasingly focused on protecting children from online threats, particularly after concerns over the AI chatbot Grok, which has been linked to the creation of non-consensual sexualized images. Countries including Britain and Australia have weighed restrictions to improve online child safety, and Spain, Greece, and Slovenia have also expressed interest in limiting internet access for minors.
In the UK, efforts to restrict children's access to pornography websites have raised privacy concerns for adults and sparked disputes with the US over free speech and regulatory jurisdiction. Instagram's "teen accounts" for users under 16 require parental approval to change settings. Parents can also, with their teenager's agreement, enable additional monitoring features that keep young users from viewing sensitive content, including sexually explicit or violent material.