Rwanda is drafting a law that would stop children under 16 from using social media platforms such as Facebook, Instagram, and YouTube. The country has spent years building digital access into schools and public life, so the draft now forces it to solve two problems at once: protecting children from harmful content while preserving the internet’s value for learning.
Governments across Africa and other regions now push social platforms to do more than post age limits in their terms of service. They want real checks, real accountability, and real consequences when platforms fail to keep children safe. Rwanda’s draft law lands in the middle of that shift.
Rwanda wants firmer rules
Minister of ICT and Innovation Paula Ingabire said the government is preparing a law that would block under-16s from opening accounts or viewing content on major social platforms. Officials are also studying an age verification system that would involve internet providers, platforms, parents, and Rwanda’s national ID system. This is still a draft, not a finished law, so the details now matter as much as the headline.
Rwanda did not pull this plan out of thin air. Ecofin reported that 46 percent of schoolchildren already access digital services through mobile phones, often without parental supervision. The same report said 30 percent to 35 percent of students reported problems linked to social media use, including attention issues and anxiety tied to online content. Those figures explain why Kigali wants stronger guardrails.
Rwanda already teaches children online
Rwanda’s own child online protection policy makes one point very clear. Children need safe internet access, not zero internet access. The policy says digital tools support education, creativity, self-expression, and economic empowerment. It also says Rwanda must help children use the digital world safely and confidently. That makes the coming law more complex than a simple block list.
That same balance shows up in Rwanda’s digital literacy work. In the first year of the country’s digital skills training program, 89,223 participants joined the training platform, including 76,388 students and 11,465 teachers. Those numbers show how deeply digital tools now sit inside Rwanda’s education system. A broad social media rule that ignores classroom use will clash with that reality.
Australia shows the enforcement gap
Australia gives Rwanda a live case study. In November 2024, Australia passed a law that requires social platforms to keep under-16s off their services or face fines of up to A$49.5 million. The ban took effect in December 2025 after a year-long rollout. Reuters also reported that YouTube won an exemption because schools use it widely. That detail matters because it shows lawmakers had to separate social use from educational use.
Enforcement has already exposed the weak spots. Reuters reported in February 2025 that many major platforms still relied on self-declared birth dates at sign-up. The same report cited eSafety data showing that 80 percent of Australian children aged 8 to 12 used social media in 2024, even though most of those services set minimum age rules. That gap shows how little a checkbox age gate can do on its own.
Australia’s own regulator later said a substantial proportion of under-16s still kept accounts on age-restricted platforms after the rules started. eSafety said platform-led deactivation drove most of the decline, but many children still had accounts because platforms had not yet asked them to confirm their age. Rwanda can read that as a warning. A law only works when platforms carry the main enforcement burden and actually act on it.
Gabon and Nigeria add pressure in Africa
Gabon has already gazetted a rule that requires age verification for access to social media and digital content. The rule covers biographical details, address information, and in some cases digital identity records tied to a personal identification number. Nigeria has also opened a public consultation on age limits, age checks, platform accountability, and stronger oversight for child safety online. African governments now treat this issue as a core digital policy question, not a side debate.
Rwanda needs a practical law
The strongest version of Rwanda’s law will draw a clean line between learning tools and social feeds built for endless engagement. A child watching a school lesson does not create the same risk profile as a child scrolling an algorithm-driven feed for hours. Australia dealt with that problem by exempting YouTube for school use, and Rwanda’s child online protection policy already says children need digital access for education and self-expression. Rwanda should keep that distinction front and center.
Rwanda also needs a narrow and trusted age check system. Gabon’s model leans hard on identity verification and audits. Australia added rules that stop platforms from forcing every user to upload a government ID. Rwanda can take the useful parts from both paths. It can require platforms to verify age, audit their systems, and protect privacy without turning child safety into blanket data collection.
Rwanda is asking the right question. Children do not need open access to every social platform. Still, a ban alone does not solve the problem. The law needs clear exemptions for education, strong duties for platforms, simple tools for parents, and rules that children and schools can actually follow. If Rwanda gets that balance right, it will set a useful standard for child safety policy in Africa.