Latest news with #EmmaSadleir


Daily Maverick
7 hours ago
Meta held accountable by SA court in fight against online child sexual abuse
Following a landmark Gauteng Division of the High Court judgment, Meta must permanently remove dozens of accounts spreading child sexual abuse content and provide information on offenders to South African authorities.

In a landmark ruling, South Africa's Gauteng Division of the High Court has ordered Meta, the parent company of WhatsApp, Instagram, and Facebook, to permanently close dozens of anonymous accounts distributing explicit child sexual abuse material involving local schoolchildren. Meta must also provide identifying information on the offenders to aid prosecution, marking a pivotal step in South Africa's fight against online child abuse.

Emma Sadleir from The Digital Law Co, who has been at the forefront of the legal battle, said the issue escalated after a call from a Centurion school about harmful content spreading rapidly across WhatsApp channels and Instagram accounts, with some gaining 20,000 followers within hours. These accounts cross-promoted content, moving users from Instagram to WhatsApp, where more extreme material circulated. When Meta failed to act despite repeated reports, The Digital Law Co sought legal intervention.

On 14 July 2025, the Gauteng Division of the High Court in Johannesburg ordered Meta and three officials to remove six WhatsApp channels and 30 Instagram profiles, to block the offenders from creating new accounts, and to disclose their identities. Though some accounts were deleted, many remained active and new profiles emerged rapidly, prompting a contempt of court application demanding immediate removal, permanent disabling of offender access, and disclosure of identities.

Highly explicit, identifiable and harmful content targeting minors

The explicit nature of the content circulating on the Meta-owned platforms was deeply disturbing, said Sadleir. She said there was highly explicit and illegal content involving South African schoolchildren. This material included graphic child sexual abuse images and videos, such as footage of minors engaging in sexual acts at school, often filmed by other pupils. These posts frequently contained identifying information about the victims, including their names, grades and schools, alongside offensive and defamatory descriptions. The content was not only pornographic but also widely disseminated to large groups of mostly underage followers.

'The most extreme content was obviously the nudes and the sex tapes. Some of them were consensually created, but many were not. They named the kids, linked to their Instagram or TikTok profiles, mentioned their schools, grades — even classes. So anyone viewing could easily identify who they were, and that level of specificity has devastating consequences for the children involved,' said Sadleir.

From a legal perspective, Sadleir said this was clear-cut child pornography. 'We know these kids are under 18, often as young as Grade 7 or 8. Any private sexual photos or videos involving minors fall squarely under child pornography laws,' she said.

The content was not only sexual, but horrifyingly exploitative and degrading. 'There were messages about kids exchanging sex for taxi rides, girls with STIs, reports of sexual abuse by family members, even cases of repeated abortions. It was extreme and deeply upsetting, just completely unacceptable,' said Sadleir.

Landmark court-ordered measures

Sadleir said it became clear through the course of litigation that some of their demands extended beyond Meta's current technical infrastructure.
The company has now agreed, through a court-sanctioned joint consent order, to take the following steps:

- Permanently remove, as far as is technically feasible, all Instagram accounts and WhatsApp channels reported by The Digital Law Co on behalf of victims, thereby cutting off public access to this deeply harmful material.
- Disclose subscriber information for more than 60 offending accounts across both platforms, enabling victims and their families to pursue justice through appropriate legal avenues.
- Establish a direct two-year hotline between The Digital Law Co and Meta to fast-track urgent child protection matters and ensure that future reports do not fall through the cracks.

'This is a powerful affirmation of what can be achieved when the law is used not only as a shield, but as a sword in defence of the most vulnerable among us. We believe this is the first time in South African legal history that a global tech giant has agreed, in writing and in court, to these kinds of terms — and we hope it signals a turning point in how platforms respond to harm within our jurisdiction. Importantly, it also acknowledges that tech companies operating in South Africa must do so in line with South African laws, South African court orders, and South African standards of dignity and child protection,' said Sadleir.

She added that the identifying information was with forensic teams, and once they found out who was behind the accounts, criminal charges would be laid, even if the culprits are found to be underage. 'Children have full criminal capacity in South Africa from 14, arguably from 12, so we'll make criminal charges, definitely crimen injuria, distribution of private sexual images, non-consensual distribution of intimate images, child pornography offences — so the solicitation, distribution and possession of child pornography,' she said.

Beyond criminal prosecution, there was a civil dimension where victims could seek damages. Additionally, schools would be notified to consider disciplinary action, including expulsions, said Sadleir.

What does the law say?

In South Africa, social media content involving children is regulated by laws aimed at protecting minors from exploitation and harm. Key legislation includes the Films and Publications Act and the Criminal Law (Sexual Offences and Related Matters) Amendment Act, which criminalise the creation, possession and distribution of sexually explicit material involving minors. Platforms operating in South Africa must comply with court orders and laws to remove harmful content and provide subscriber information to support prosecutions. Victims can also pursue civil claims for damages, and schools can take safeguarding actions such as expulsions. Despite these measures, enforcement is challenged by technological limits, weak moderation, and complex reporting procedures.

Ephraim Tlhako, acting CEO of the Films and Publications Board, said the prevalence of harmful content was significant. 'Between April 2024 and March 2025, we handled 20 cases involving child sexual abuse material. While 20 cases might seem small over 12 months, the volume is staggering; we analysed approximately 200,000 images and videos. Since April 2025, we've processed five cases containing more than one million images,' he said.

Despite ongoing efforts, the Films and Publications Board struggled with limited financial and human resources, which restricted its ability to deploy advanced technologies like AI for faster detection and removal of harmful content.
Currently, the board relied heavily on complaints and law enforcement referrals, said Tlhako. Collaboration with social media platforms and internet service providers remained crucial, and the Films and Publications Board routinely issued takedown notices under section 77 of the Electronic Communications and Transactions Act to promptly remove illegal content, especially child sexual abuse material. Last year, about 17 such notices were issued and most were complied with. However, compliance could be challenging when platform community guidelines conflicted with the board's classification standards, said Tlhako.

Parental vigilance crucial to protecting children online

On protecting children, Sadleir emphasised parental vigilance. 'Parents need to be actively involved and know what their children are doing online. Phone use at night is a huge risk; a lot of this content was being shared after midnight, when kids should be sleeping or studying,' she said.

Sadleir flagged WhatsApp channels as especially dangerous. 'Unlike traditional WhatsApp chats, channels allow for the mass broadcast of content without followers needing to engage or be visible. They attract tens of thousands of views quickly, with no moderation and no accountability. Worse, if a channel owner deletes the channel, the content remains on all recipients' devices indefinitely,' she said.

Reporting these channels was also complicated, as there was no proper escalation or categorisation of reports. 'You just click "report", and the platform gives no transparency on how seriously they treat things like child sexual abuse material versus less urgent reports,' she said.

For parents, Tlhako advised practical protective measures. 'There are technologies and applications you can install on your child's devices to monitor activity and set age restrictions. For example, a nine-year-old can only download apps or games meant for that age, and if they want something beyond that, it requires parental approval.' He also noted that device manufacturers were incorporating 'safety by design' features, but warned that if parents didn't properly set up the devices, children could have unrestricted access.

The Films and Publications Board facilitates reporting not only for law enforcement but also for the public, through its WhatsApp hotline (083 428 4767), the Films and Publications Board hotline (0800 148 148), and an online complaints portal.

Other suggestions on how parents can protect their children from harmful content on social media include:

- Stay actively involved in your child's online activities.
- Monitor device usage, especially during late hours.
- Set parental controls and age restrictions on devices.
- Use monitoring apps to track screen time and chats.
- Educate children about online safety and risks.
- Encourage reporting of suspicious or harmful content.
- Be cautious with WhatsApp channels and broadcast features.
- Regularly review your child's social media contacts.
- Teach privacy settings, blocking, and reporting tools.
- Maintain communication with schools and the community for vigilance.

Meta's response and safety enhancements

The Department of Communications and Digital Technologies welcomed the recent ruling. Deputy Minister Mondli Gungubele remarked that it confirmed his earlier statements during the department's recent budget vote on the dangers of child sexual abuse material. He further urged parents and guardians to regularly monitor their children's online activities and educate them about harmful and prohibited content.
On 23 July 2025, Meta announced new safety enhancements for Instagram aimed at protecting teens and children, including enhanced controls for teen accounts and stricter settings for adult-managed accounts featuring minors. Teens will now see the account creation date in DMs, receive safety tips, and have a combined block-and-report button for suspicious users, building on existing safety notices that led to more than one million blocks and reports in June. A new location notice alerts teens if their chat partner is in another country, to combat sextortion. The nudity protection feature, which blurs suspected nude images by default, remains enabled by 99% of teens, reducing unwanted exposure and sharing.

For adults managing accounts for children under 13, Meta applies the strictest message settings, enables Hidden Words to block offensive comments, and limits these accounts' visibility to suspicious adults by removing them from search results and hiding their comments. Earlier this year, Meta removed nearly 135,000 Instagram accounts linked to sexualised comments or image solicitation on child-managed accounts, along with 500,000 related Facebook and Instagram accounts. DM


The South African
14 hours ago
Warning: Why parents should check kids' WhatsApp channels
Social media law expert Emma Sadleir has issued an urgent warning to parents across South Africa to monitor their children's WhatsApp channels, following a sharp increase in harmful and illegal content being distributed via the platform, particularly in Gauteng.

Sadleir, founder of The Digital Law Company, said that an 'awful thread of communication' is spreading through WhatsApp Channels – a relatively new feature that allows users to broadcast messages to unlimited followers – with children being directly targeted. 'The type of content we're seeing includes child sexual abuse material, defamation, crimen injuria, and the unauthorised sharing of personal details, including social media handles,' Sadleir said.

While the disturbing trend has been noted across the country, she flagged Johannesburg as a particular hotspot for this illicit digital activity. Unlike traditional WhatsApp groups, WhatsApp channels can host unlimited followers, making it harder to control who receives content – and who's behind it. Sadleir revealed that children are being added or directed to channels containing explicit and criminal material, often without the knowledge of their parents or guardians.

The Digital Law Company is now actively working with legal professionals in preparation for court action aimed at identifying and holding the culprits accountable. Sadleir urged parents to take immediate action:

- Check your child's WhatsApp Channels by going to the 'Updates' tab (bottom left corner on most devices).
- Report any illegal or harmful content through WhatsApp's reporting function.
- Talk to children about the risks of unknown or unsolicited channel invites.

'The only way we can bring these criminals to justice and put pressure on Meta [WhatsApp's parent company] to act is if people report this content,' she said.

This alarming development underscores a broader issue facing South African families: the growing dangers of unsupervised screen time and digital communication platforms. Recent studies have already linked excessive screen time to emotional and behavioural issues in children – but this latest warning adds an urgent criminal safety dimension to the concern. Sadleir's message to parents is clear: 'Be proactive. Be present. Be vigilant.'

IOL News
2 days ago
Landmark agreement with Meta to combat child exploitation in South Africa
Emma Sadleir and the legal team outside the Johannesburg High Court after protecting the rights of children who are being exploited on social media. Image: Facebook

In a groundbreaking legal victory, the Digital Law Co (DLC) has secured an order in which Meta agreed to cooperate in the fight against child pornography on its sites. Over the past two weeks, Emma Sadleir and her team fought a fierce legal battle against Meta, the parent company of Instagram and WhatsApp, in a bid to have disturbing posts of children removed from public sites.

In the latest turn of events, DLC has secured a consent order, issued in the Gauteng High Court, Johannesburg, in which Meta has agreed to work closely with DLC. This case arose in response to the widespread circulation of sexually exploitative material involving South African schoolchildren on Meta-owned platforms.

Meta has agreed to permanently remove, as far as is technically feasible, all Instagram accounts and WhatsApp Channels reported by DLC on behalf of victims, thereby cutting off public access to this deeply harmful material. The digital giant also agreed to disclose subscriber information for over 60 offending accounts across both platforms, enabling victims and their families to pursue justice through appropriate legal avenues. It will further establish a direct two-year hotline between The Digital Law Co and Meta to fast-track urgent child protection matters and ensure that future reports do not fall through the cracks.

Sadleir responded that this is a powerful affirmation of what can be achieved when the law is used not only as a shield, but as a sword in defence of the most vulnerable. 'We believe this is the first time in South African legal history that a global tech giant has agreed, in writing and in court, to these kinds of terms. We hope it signals a turning point in how platforms respond to harm within our jurisdiction,' Sadleir said. 'The work is not done. Technology evolves. Harms migrate. But we have taken a stand - and we believe South Africa is safer for it,' she added.

Rorke Wilson of DLC, meanwhile, said part of the earlier court order has been complied with, as Meta has sent some details of the offending accounts, and more are expected to be sent on Wednesday. The hotline has also been very responsive, as some accounts have been taken down quickly. Wilson said that, from what they have seen, the person or persons behind these offending posts seem to have had the wind taken out of their sails, as the accounts are now taken down before they're able to grow too big. Cape Times


Business Insider
5 days ago
Meta faces legal battle in South Africa over illicit content involving minors
The decision comes as South Africa grapples with rising cybercrime, including WhatsApp scams and the spread of illegal content. This development follows a high-profile legal case initiated by social media law expert Emma Sadleir, who took action against the tech giant after discovering over 30 Instagram accounts and at least six WhatsApp channels distributing illicit content and personal information of South African schoolchildren. Court documents revealed that new accounts were being created "every few minutes" to distribute the material, indicating an organized and persistent campaign that has sparked nationwide concern and urgent legal action.

Sadleir, representing The Digital Law Company, emphasized the importance of protecting vulnerable children: "This is about protecting vulnerable children. Full compliance with the court order is critical to identifying the perpetrators."

In support of the lawsuit, the Pretoria High Court ordered Meta to discontinue the identified accounts and provide subscriber information, including names, email addresses, phone numbers, and IP addresses used at account creation and last login. Despite the urgency, Meta's initial refusal to comply prompted the legal team to file a contempt of court application. The tech giant, however, argued that the filing had misidentified legal entities, thereby delaying the process. Critics claimed that Meta was avoiding accountability despite having the technical means to respond.

Amid growing public and legal pressure, Meta agreed to a settlement on July 18, specifically due to the threat of imprisonment for Meta's Southern Africa representative, Thabiso Makenete. The company has since deactivated over 60 accounts and promised to provide the requested data within three business days under strict confidentiality. Emma Sadleir, founder of The Digital Law Company, described the agreement as unprecedented: "This may be the first time in South Africa that a global tech company has formally agreed in writing to provide such data in compliance with a local court order."

Regulatory compliance mars Meta-SA relations

Notably, this is not the first time the two parties have clashed. South Africa's Information Regulator has previously had disputes with Meta over compliance with the Protection of Personal Information Act (POPIA). In 2024, WhatsApp was criticized for vague privacy terms and unauthorized data-sharing with Meta and third parties. This case adds another layer to recent security concerns, testing Meta's willingness to balance privacy obligations with public safety demands.

While child protection advocates have welcomed the ruling as a major step toward digital accountability, digital rights organizations have warned about the broader implications. "We must ensure that data disclosures do not set a precedent for unchecked surveillance," a spokesperson for the South African Digital Rights Forum noted.