
Latest news with #NicoleBell

Remains found in B.C. identified as woman who went missing in 2017

CBC

18-06-2025

  • CBC


Mounties in British Columbia's Interior say human remains found last month have been confirmed to be those of a woman who went missing near Sicamous nearly eight years ago.

They say the remains were discovered on a rural property in Salmon Arm on May 5, and the B.C. Coroners Service has since linked them to Nicole Bell. She was 31 when she disappeared in September 2017, and major crime investigators who took over the case quickly confirmed that her disappearance "was the result of foul play."

She was one of five women who went missing in the North Okanagan over a 20-month period in 2016 and 2017, but police say it has been established that the disappearances were not all linked to the same person.

The RCMP news release says investigators believe the person responsible for Bell's death is the primary suspect in the killing of at least one other victim, but that man is dead. Eighteen-year-old Traci Genereaux was reported missing on June 9, 2017, and her remains were found when police executed a search warrant on Curtis Sagmoen's family farm in Salmon Arm in October that year.

Police do not name Sagmoen in the news release, but say they do not believe there is an ongoing threat to the public because the primary suspect is dead. Sagmoen was reported dead in a Vernon hotel in April this year.

The release says the Southeast District Major Crimes Unit continues to investigate the deaths of both women, as "additional parties to the offence have not yet been ruled out."

Remains found in Salmon Arm identified as woman who went missing in 2017

The Province

18-06-2025

  • The Province


The remains were discovered on a rural property in Salmon Arm on May 5, and the B.C. Coroners Service has since linked them to Nicole Bell.

Published Jun 18, 2025

Missing woman Nicole Bell, last seen on Sept. 2, 2017, in Sicamous. Photo dates from 2013. Handout/PNG

Mounties say human remains found last month have been confirmed to be those of a woman who went missing near Sicamous nearly eight years ago.
They say the remains were discovered on a rural property in Salmon Arm on May 5, and the B.C. Coroners Service has linked them to Nicole Bell. She was 31 when she disappeared in September 2017, and major crime officers who investigated quickly confirmed that her disappearance "was the result of foul play."

She was one of five women who went missing in the North Okanagan over a 20-month period in 2016 and 2017, but police say it has been established that the disappearances were not all linked to the same person.

Along the 75-kilometre stretch between Vernon and Sicamous, at least five women – Ashley Simpson, Deanna Wertz, Caitlin Potts, Nicole Bell and Traci Genereaux – have vanished. Photo: Postmedia files/PNG

But the RCMP news release says investigators believe the person responsible for Bell's death is the primary suspect in the killing of at least one other victim, but that man is dead. Eighteen-year-old Traci Genereaux was reported missing on June 9, 2017, and her remains were found when police searched Curtis Sagmoen's family farm in Salmon Arm in October that year.

Police do not name Sagmoen in the news release, but say they do not believe there is a continuing threat to the public because the primary suspect is dead. Sagmoen was reported dead in a Vernon hotel in April this year.

Curtis Wayne Sagmoen. Photo: RCMP Handout

The release says the RCMP's Southeast District major crimes unit continues to investigate the deaths of both women, as "additional parties to the offence have not yet been ruled out."
It is asking anyone with information about their disappearances or murders to contact the unit.

As disinformation and hate thrive online, YouTube quietly changed how it moderates content

CBC

14-06-2025

  • Business
  • CBC


YouTube, the world's largest video platform, appears to have changed its moderation policies to allow more content that violates its own rules to remain online.

The change happened quietly in December, according to The New York Times, which reviewed training documents for moderators indicating that a video could stay online if the offending material did not account for more than 50 per cent of the video's duration — double the threshold under the previous guidelines.

YouTube, which sees 20 million videos uploaded a day, says it updates its guidance regularly and that it has a "long-standing practice of applying exceptions" when it suits the public interest or when something is presented in an educational, documentary, scientific or artistic context.

"These exceptions apply to a small fraction of the videos on YouTube, but are vital for ensuring important content remains available," YouTube spokesperson Nicole Bell said in a statement to CBC News this week.

But at a time when social media platforms are awash with misinformation and conspiracy theories, there are concerns that YouTube is only opening the door for more people to spread problematic or harmful content — and to make a profit doing so.

YouTube isn't alone. Meta, which owns Facebook and Instagram, dialled back its content moderation earlier this year, and Elon Musk sacked Twitter's moderators when he purchased the platform in 2022 and rebranded it as X.

"We're seeing a race to the bottom now," Imran Ahmed, CEO of the U.S.-based Center for Countering Digital Hate, told CBC News. "What we're going to see is a growth in the economy around hate and disinformation."
Public interest vs. public harm

YouTube's goal is "to protect free expression," Bell said in her statement, explaining that easing its community guidelines "reflects the new types of content" on the platform. For example, she said, a long-form podcast containing one short clip of violence may no longer need to be removed.

The Times reported Monday that examples presented to YouTube staff included a video in which someone used a derogatory term for transgender people during a discussion about hearings for U.S. President Donald Trump's cabinet appointees, and another that shared false information about COVID-19 vaccines but did not outright tell people not to get vaccinated.

A platform like YouTube does have to make some "genuinely very difficult decisions" when moderating content, says Matt Hatfield, executive director of the Canadian digital rights group OpenMedia. He believes platforms do take the issue seriously, but he says there's a balance between removing harmful or illegal content, such as child abuse material or clear incitements to violence, and allowing content to stay online, even if it's offensive to many or contains some false information.

The problem, he says, is that social media platforms also "create environments that encourage some bad behaviour" among creators, who like to walk the line of what's acceptable.

"The core model of these platforms is to keep you clicking, keep you watching, get you to try a video from someone you've never experienced before and then stick with that person."

And that's what concerns Ahmed.
He says these companies put profits over online safety, and that they don't face consequences because there are no regulations forcing them to limit what can be posted on their platforms. He believes YouTube's relaxed policies will only encourage more people to exploit them.

How well YouTube is moderating

In a recent transparency report, YouTube said it had removed nearly 2.9 million channels containing more than 47 million videos for community guideline violations in the first quarter — after the reported policy change. The overwhelming majority of those, 81.8 per cent, were considered spam, but other reasons included violence, hateful or abusive material and child safety.

Hatfield says there is a public interest in having harmful content like that removed, but that doesn't mean all controversial or offensive content must go. However, he says YouTube does make mistakes in content moderation, explaining that it judges individual videos in a sort of "vacuum," without considering how each piece of content fits into a broader context. "Some content can't really be fairly interpreted in that way."

Regulations not a perfect solution

Ahmed says companies should be held accountable for the content on their platforms through government regulation. He pointed to Canada's controversial but now-scuttled Online Harms Act, also known as Bill C-63, as an example. It proposed heavier sentences, new regulatory bodies and changes to a number of laws to tackle online abuse. The bill died when former prime minister Justin Trudeau announced his resignation and prorogued Parliament back in January. Ahmed says he hopes the new government under Prime Minister Mark Carney will enact similar legislation.

Hatfield says he liked parts of that act, but his group ultimately opposed it after it tacked on other changes to the Criminal Code and the Human Rights Act that he says were unrelated to the platforms.
He says groups like OpenMedia would have liked to see a strategy addressing business models that encourage users to post and profit off of "lawful but awful" content. "We're not going to have a hate-free internet," he said. "We can have an internet that makes it less profitable to spread certain types of hate and misinformation."

YouTube Makes Adjustments to Its Moderation Guidelines

Yahoo

09-06-2025

  • Politics
  • Yahoo


YouTube quietly made changes to its moderation policies last December, ahead of President Donald Trump's second term. According to The New York Times, which reviewed internal documents, YouTube is allowing content on political, social and cultural issues that would have been subject to removal under previous guidelines to remain on the platform, as long as it is considered to be in the public interest. The threshold for these videos has been raised from one-quarter of a video to one-half.

In a statement to the Times, Nicole Bell, a spokesperson for the Google-owned platform, said, "Recognizing that the definition of 'public interest' is always evolving, we update our guidance for these exceptions to reflect the new types of discussion we see on the platform today." She added, "Our goal remains the same: to protect free expression on YouTube while mitigating egregious harm."

For years, conservative circles decried the moderation techniques employed by the various social media platforms, complaining that the takedown of their content was agenda-driven and a form of censorship. With the transition to the Trump administration, the rigid stances employed by various platforms have been jettisoned in favor of a looser approach. YouTube joins Meta's Instagram and Facebook and X, formerly Twitter, in relaxing its moderation guidelines. Those platforms shifted from employing fact-checkers to having community members vet the veracity of content posted on their sites.

YouTube has loosened its content moderation policies

The Verge

09-06-2025

  • Politics
  • The Verge


YouTube has relaxed its moderation policies and is now instructing reviewers not to remove content that might violate its rules if it's in the 'public interest,' according to a report from The New York Times. The platform reportedly adjusted its policies internally in December, offering examples that included medical misinformation and hate speech.

In training material viewed by the Times, YouTube says reviewers should now leave up videos in the public interest — which includes discussions of elections, ideologies, movements, race, gender, sexuality, abortion, immigration and censorship — if no more than half of their content breaks its rules, up from one-quarter. The platform said in the material that the move expands on a change made before the 2024 US election, which allows content from political candidates to stay up even if it violates its community guidelines. Additionally, the platform told moderators that they should leave up content if 'freedom of expression value may outweigh harm risk,' and take borderline videos to a manager instead of removing them, the Times reports.

'Recognizing that the definition of 'public interest' is always evolving, we update our guidance for these exceptions to reflect the new types of discussion we see on the platform today,' YouTube spokesperson Nicole Bell said in a statement to the Times. 'Our goal remains the same: to protect free expression on YouTube while mitigating egregious harm.' YouTube didn't immediately respond to The Verge's request for comment.

YouTube tightened its policies against misinformation during Donald Trump's first term as US president and the covid pandemic, as it began removing videos containing false information about covid vaccines and US elections. The platform stepped back from removing election fraud lies in 2023, but this recent change goes a step further and reflects a broader trend of online platforms taking a more lax approach to moderation following Trump's reelection.
Earlier this year, Meta similarly changed its policies surrounding hate speech and ended third-party fact-checking in favor of X-style community notes. The changes follow years of attacks on tech companies from Trump, and Google in particular is in a vulnerable legal situation, facing two Department of Justice antitrust lawsuits that could see its Chrome browser and other services broken off. Trump has previously taken credit for Meta's moderation changes.

As noted by the Times, YouTube showed reviewers real examples of how it has implemented the new policy. One video contained coverage of Health and Human Services Secretary Robert F. Kennedy Jr.'s covid vaccine policy changes — under the title 'RFK Jr. Delivers SLEDGEHAMMER Blows to Gene-Altering JABS' — and was allowed to violate policies surrounding medical misinformation because public interest 'outweighs the harm risk,' according to the Times. (The video has since been taken off the platform, but the Times says the reasoning behind this is 'unclear.')

Another example was a 43-minute video about Trump's cabinet appointees that violated YouTube's harassment rules with a slur targeting a transgender person, but was left up because it had only a single violation, the Times reports. YouTube also reportedly told reviewers to leave up a video from South Korea that mentioned putting former president Yoon Suk Yeol in a guillotine, saying that the 'wish for execution by guillotine is not feasible.'
