U.S.-UK Joint Statement on Child Online Safety

Oct 10, 2024

The following joint statement was released by the Government of the United States of America and the Government of the United Kingdom on the sidelines of the G7 Ministerial on Industry, Technology, and Digital, as part of the U.S.-UK Comprehensive Dialogue on Technology & Data.

Begin text:

The United Kingdom and United States share fundamental values and a commitment to democracy and human rights, including privacy and freedom of expression. Both the United Kingdom and United States, alongside our international partners, are taking steps to support children’s online safety.

To make the Internet safer for children, we should aim to ensure all users have the skills and resources they need to make safe and informed choices online, and advance stronger protections for children. The United States and United Kingdom intend to work with our national institutions and organizations to support these goals and shared values. To that end, both countries plan to establish a joint children’s online safety working group to advance the aims and principles of this statement.

There are parallels between the child online safety landscape in the United States and the United Kingdom. Smartphone ownership is nearly universal amongst teenagers in both countries.1,2,3 Children in the United States and the United Kingdom actively engage with social media platforms daily; upwards of six in ten 13- to 17-year-olds in the United States and the United Kingdom report using TikTok, Snapchat, and Instagram,4 whilst nearly nine in ten report using YouTube.5,6 In both countries, children report accessing social media at an early age: nearly 40 percent of 8- to 12-year-olds in the United States7 and 63 percent of 8- to 11-year-olds in the United Kingdom report using social media.8

We recognize the significant educational and social benefits technology can provide children and seek to ensure that they can flourish, online and offline. To ensure these benefits can be maximized, online platforms, including social media companies, have a moral responsibility to respect human rights and put in place additional protections for children’s safety and privacy. Age-appropriate safeguards, including protections from content and interactions that harm children’s health and safety, are vital to achieve this goal. This includes measures to address and prevent sexual exploitation and abuse, harassment, cyberbullying, content that is abusive (including technology-facilitated gender-based violence), and content that encourages or promotes suicide, self-harm, and eating disorders.

The UK Government is committed to the online safety of children. The Online Safety Act places clear duties on online platforms to protect children’s safety and put in place measures to mitigate risks. For example, platforms must use ‘highly effective’, privacy-preserving age assurance technologies to prevent children from encountering the most harmful content, including pornography (including violent pornography) and content which encourages or promotes suicide. Platforms also need to proactively tackle the most harmful illegal content and activity, including child sexual exploitation and abuse and content which disproportionately affects women and girls, such as harassment, intimate image abuse, and controlling or coercive behavior. Companies can elect to voluntarily extend these protections to children across the world to increase the health, safety, and privacy of children.

The U.S. government has taken bold action to advance children’s online health, safety, and privacy. In 2023, the United States Surgeon General issued a new advisory about the effects social media use has on youth mental health. Building on this advisory, the U.S. government launched the Kids Online Health and Safety Task Force to advance the health, safety, and privacy of children online, including preventing and mitigating the adverse health effects that children can experience through the use of online platforms. The Task Force released a report which includes guidance and recommendations for industry, parents and caregivers, researchers, and policymakers on how to promote and enhance youth online health, safety, and privacy.9 Further, the U.S. government has made addressing image-based sexual abuse a core focus of its AI policy, including several actions in the Executive Order on the Safe, Secure, and Trustworthy Development and Use of AI to safeguard AI systems from generating child sexual abuse material, and a call to action inviting industry to make voluntary commitments to reduce the generation, dissemination, and monetization of image-based sexual abuse.

We should continue to advocate for increased transparency from online platforms, including clear and accessible terms of service and reporting on online safety practices, to assist governments, regulators, independent researchers, and the public to develop a better understanding of the technologies that are shaping children’s lives. Further independent, public interest research is needed to evaluate the impact of excessive social media and smartphone use on children’s development and enable researchers and policymakers to work towards a robust framework to assess the risks to children at different stages of their childhood and adolescence. To support such research, we should consider work to increase privacy-preserving access to online platform data for independent researchers. These risks and challenges associated with the digital environment are constantly evolving alongside new and emerging technologies, including generative AI. We should seek to ensure that research on the impacts of these new technologies keeps pace with their development.

Both countries acknowledge that risk-based and safety-, privacy-, and inclusivity-by-design approaches throughout design, development, and deployment are fundamental to children’s safety and wellbeing online, alongside increased transparency and accountability from online platforms. We consider that measures, as appropriate, such as preventing the promotion of harmful content, better reporting on content moderation, strong default privacy settings, and limits on targeted advertising, play an important role in protecting children from excessive data collection and harmful content and interactions, while delivering age-appropriate experiences. We also believe it is imperative that such measures are implemented in a manner that respects human rights, including privacy and freedom of expression. The United Kingdom and United States continue to work together to protect children online through multilateral forums such as the OECD.

We encourage online platforms to go further and faster in their efforts to protect children by taking immediate action and continually using the resources available to them to develop innovative solutions, while ensuring there are appropriate safeguards for user privacy and freedom of expression.

Children’s online safety is an issue of global importance. We also plan to work with our international partners to develop and promote common solutions, shared principles, and global standards that prioritize children’s wellbeing and champion a free, open, and secure Internet.

End text.

Citations

1 – Pew Research Center, Teens, Social Media and Technology 2023. Statistic: 95 percent of U.S. teenagers own their own smartphone (based on a survey of 1,453 teenagers).

2 – The latest NTIA Internet Use Survey of U.S. households estimates that 83 percent of 15- to 24-year-olds use smartphones. https://www.ntia.gov/data/explorer#sel=mobilePhoneUser&demo=age&pc=prop&disp=chart

3 – Ofcom, Children and parents: media use and attitudes report 2024 – interactive data. Statistic: 96 percent of UK teenagers own their own smartphone.

4 – Ofcom, Children and parents: media use and attitudes report 2024 – interactive data. Statistic: In the UK, TikTok (66 percent), Snapchat (63 percent), and Instagram (58 percent) are popular among 12- to 17-year-olds.

5 – Pew Research Center, Teens, Social Media and Technology 2023.

6 – Ofcom, Parents’ and Children’s Online Behaviours and Attitudes Survey 2023, 30th October to 27th November 2023. Statistic: 88 percent of UK children use YouTube (this includes YouTube Kids).

7 – The U.S. Surgeon General’s Advisory, Social Media and Youth Mental Health. 2023

8 – Ofcom, Parents’ and Children’s Online Behaviours and Attitudes Survey 2023, 30th October to 27th November 2023. Table

9 – Online Health and Safety for Children and Youth: Best Practices for Families and Guidance for Industry (July 22, 2024), https://www.ntia.gov/category/kids-online-health-and-safety/online-health-and-safety-for-children-and-youth

Read the full report from the U.S. Department of Commerce.