
Instagram wrongly says some users breached child sex abuse rules
BBC News has been contacted by more than 100 people who claim to have been wrongly banned by Meta. Some talk of a loss of earnings after being locked out of their business pages, while others highlight the pain of no longer having access to years of pictures and memories. Many point to the impact it has had on their mental health.

Over 27,000 people have signed a petition that accuses Meta's moderation system, powered by artificial intelligence (AI), of falsely banning accounts and then having an appeal process that is unfit for purpose.

Thousands of people are also in Reddit forums dedicated to the subject, and many users have posted on social media about being banned.

Meta has previously acknowledged a problem with Facebook Groups but denied its platforms were more widely affected.
'Outrageous and vile'
The BBC has changed the names of the people in this piece to protect their identities.

David, from Aberdeen in Scotland, was suspended from Instagram on 4 June. He was told he had not followed Meta's community standards on child sexual exploitation, abuse and nudity.

He appealed that day, and his Instagram account, along with his associated Facebook and Facebook Messenger accounts, was then permanently disabled.

David found a Reddit thread where many others were posting that they had also been wrongly banned over child sexual exploitation.

"We have lost years of memories, in my case over 10 years of messages, photos and posts - due to a completely outrageous and vile accusation," he told BBC News.

He said Meta was "an embarrassment", with AI-generated replies and templated responses to his questions. He still has no idea why his account was banned.

"I've lost endless hours of sleep, extreme stress, felt isolated. It's been horrible, not to mention having an accusation like that over my head.

"Although you can speak to people on Reddit, it is hard to go and speak to a family member or a colleague. They probably don't know the context that there is a ban wave going on."

The BBC raised David's case with Meta on 3 July, as one of a number of people who claimed to have been wrongly banned over child sexual exploitation. Within hours, his account was reinstated.

In a message sent to David, and seen by the BBC, the tech giant said: "We're sorry that we've got this wrong, and that you weren't able to use Instagram for a while. Sometimes, we need to take action to help keep our community safe."

"It is a massive weight off my shoulders," said David.
Faisal was banned from Instagram on 6 June over alleged child sexual exploitation and, like David, found his Facebook account suspended too.

The student from London is embarking on a career in the creative arts, and was starting to earn money via commissions on his Instagram page when it was suspended. He appealed, feeling he had done nothing wrong, and his account was banned a few minutes later.

He told BBC News: "I don't know what to do and I'm really upset.

"[Meta] falsely accuse me of a crime that I have never done, which also damages my mental state and health and it has put me into pure isolation throughout the past month."

His case was also raised with Meta by the BBC on 3 July. About five hours later, his accounts were reinstated. He received the exact same email as David, with the apology from Meta.

He told BBC News he was "quite relieved" after hearing the news. "I am trying to limit my time on Instagram now."

Faisal said he remained upset over the incident, and is now worried the account ban might come up if any background checks are made on him.

A third user, Salim, told BBC News that he also had accounts falsely banned for child sexual exploitation violations.

He highlighted his case to journalists, stating that appeals were "largely ignored", business accounts were being affected, and AI was "labelling ordinary people as criminal abusers".

Almost a week after he was banned, his Instagram and Facebook accounts were reinstated.
What's gone wrong?
When asked by BBC News, Meta declined to comment on the cases of David, Faisal, and Salim, and did not answer questions about whether it had a problem with wrongly accusing users of child abuse offences.

In one part of the world, however, it appears to have acknowledged a wider issue.

The BBC has learned that the chair of the Science, ICT, Broadcasting, and Communications Committee at the National Assembly in South Korea said last month that Meta had acknowledged the possibility of wrongful suspensions for people in her country.

Dr Carolina Are, a blogger and researcher into social media moderation at Northumbria University, said it was hard to know the root of the problem because Meta was not being open about it.

However, she suggested it could be due to recent changes to the wording of some of its community guidelines and an ongoing lack of a workable appeal process.

"Meta often don't explain what it is that triggered the deletion. We are not privy to what went wrong with the algorithm," she told BBC News.

In a previous statement, Meta said: "We take action on accounts that violate our policies, and people can appeal if they think we've made a mistake."

Meta, in common with all big technology firms, has come under increased pressure in recent years from regulators and authorities to make its platforms safe spaces.

Meta told the BBC it used a combination of people and technology to find and remove accounts that broke its rules, and was not aware of a spike in erroneous account suspensions.

Meta says its child sexual exploitation policy relates to children and "non-real depictions with a human likeness", such as art, content generated by AI or fictional characters.

Meta also told the BBC a few weeks ago it uses technology to identify potentially suspicious behaviours, such as adult accounts being reported by teen accounts, or adults repeatedly searching for "harmful" terms.

Meta states that when it becomes aware of "apparent child exploitation", it reports it to the National Center for Missing and Exploited Children (NCMEC) in the US. NCMEC told BBC News it makes all of those reports available to law enforcement around the world.
