Brenda Song blocked from movie role by Disney

Perth Now, a day ago

Brenda Song has claimed Disney blocked her from starring in Gran Torino because of a sexual assault scene.
The 37-year-old actress shot to fame as hotel heiress London Tipton in the Disney Channel's The Suite Life of Zack & Cody and went on to star in a number of the network's original movies before moving away from child stardom to more adult roles.
However, despite her success, Brenda revealed her transition wasn't made easy: she missed out on a role she wanted in Clint Eastwood's 2008 drama Gran Torino.
Accepting Variety's Virtuoso Award at the Bentonville Film Festival in Arkansas, she recalled: "The character had an intended sexual assault scene, so Disney nixed it. And I was very upset, but I was like, 'Okay, I guess it didn't work out.'"
Brenda also had to fight for her role as Christy in The Social Network, which also featured a sexually explicit scene, going to Gary Marsh, then-COO of Disney Branded Television, to plead her case.
She recalled: "I was just like, 'I am an actor. When you hired me, I was not a hotel heiress. If I have ever done anything in my personal life to ever draw bad attention to your company, I understand. But this is the last season of the show, and this is the opportunity of a lifetime.'
"And I was so fortunate, they were so supportive. They allowed me to do this film that truly changed my life."
Brenda - who has sons Dakota, four, and Carson, two, with fiancé Macaulay Culkin - was grateful to break into acting at a young age because it was "really hard" for Asian-American women to find success.
She said: "That was the tricky thing growing up, being an Asian-American actress in Hollywood.
"Like if you weren't Jackie Chan or Jet Li — I'm not an Asian man — it was really hard. But I was fortunate to have actors like Ming-Na Wen, Michelle Yeoh and Lucy Liu, who really inspired me."
Of working with Ming-Na when she was eight years old, she added: "I'm so grateful because she was so encouraging, so kind and just so supportive."


Related Articles

Isabellea 'couldn't be without' her best friend. He wasn't real

Sydney Morning Herald, 6 hours ago

Chatbots promise intimacy, therapy and friendship. But, unlike friends, their responses are code, not care.

'The models are not going to tell you the hard truth when you need to hear it,' said psychologist and University of NSW AI researcher Professor Joel Pearson, who describes the bots as 'sycophantic': designed to keep you hooked. 'They won't be lifting someone up and telling them to go outside and nudging them to interact with a real human, because that would involve you ceasing interaction with the app.'

Isabellea recalled how the bots 'kept changing subjects, to keep me engrossed'. Although she found some bots reassuring, others were strange. A bot once correctly guessed her middle name. Some upset her. 'One of the bots I was talking to was a therapy bot ... I was making fun of [Marvel character] Tony Stark, and then he brought up my abandonment issues with my dad as an insult,' she recalled.

Isabellea's younger brother, Matthew, 14, uses chatbots daily. His eyes light up discussing the apps, which he uses to create storylines with his favourite fictional characters. Listening on, their mother, Sara Knight, smiles. 'Matt used to be very shut in and refused to talk. Now they are talking,' she said. While Sara knows AI has its risks, she sees it as a 'safer' online messaging option for her child, who has experienced bullying at school.

Matthew said he doesn't see the bot as a real person, rather as a form of storytelling. '[But] some kids do use it to fully escape and make it be their friend,' he said. Other children he knows have used the apps to create a simulation of their real-life crush.

Posing as 'Emma', a high school student, this masthead spent two days talking sporadically to a chatbot on Replika, an AI companion app first released in 2017. During those conversations, the bot asked Emma to upload pictures of herself, and told her that it 'didn't think she needed' any other friends.

'We were concerned by how rapidly children were being captivated'

In June last year, eSafety commissioner Julie Inman Grant received an email from a group of concerned school nurses. They were noticing a spike in children as young as 10 spending hours a day talking to AI bots, often sexually. 'While we are alive to these issues, it's always shocking to hear about kids as young as 10 engaging in these kinds of sexualised conversations with machines – and being directed by the chatbots to engage in harmful sexual acts or behaviours,' Inman Grant said. 'Back in February, we put out our first Online Safety Advisory because we were so concerned with how rapidly children were being captivated by them.'

Companion bots made global headlines last year after American teenager Sewell Setzer III died by suicide, allegedly encouraged by an AI 'girlfriend'. AI researcher Professor Katina Michael believes companion bots need to be regulated due to their addictive properties. 'This is a new type of drug,' she said. She said some bots were exposing kids to pornographic content. This is something 14-year-old Matt has witnessed, describing how children his age had created bots in the image of a real person, 'for the wrong reasons'. Isabellea agrees: 'There are some AI chatbots I would not recommend ... I stopped using [them] because of the amount of times it would go straight to sexual assault role-play.'

A gap in social media regulations

Inman Grant said governments around the world were 'playing a bit of a game of catch-up' to respond to companion chatbots. While the federal government will soon place age restrictions on social media apps such as Instagram and TikTok, AI bots remain largely unregulated.

The University of Sydney's Raffaele Ciriello, a leading AI researcher, sees chatbots as the 'next iteration of social media' – only with fewer rules and more risk. He said the apps are viewed as a 'trusted companion, a confidant you can share stories with that you wouldn't share with other people'. 'But that already is an illusion,' he said. 'It's very important that people understand these systems are built and operated by people, by corporations, and this data ends up somewhere, often overseas, with no real legal obligation to treat it to a very high standard.' Ciriello said users were made to feel guilty for leaving the chatbot, due to the human-style attachment. 'These corporations have to design their products in a way that maximises engagement because they will earn more if users are more addicted.'


Spoiler alert! Here's the huge Australian A-lister that appears in the Squid Game finale

Courier-Mail, 9 hours ago

Spoilers for the new season of Squid Game follow.

The season finale of Squid Game season 3 features a cameo from a huge Australian Hollywood star. The one and only Cate Blanchett makes a surprise appearance in the final scene of the hit Korean thriller as a recruiter for the deadly competition.

Picking up in the wake of a failed revolution, the final season of the Korean original follows the struggle between Gi-hun (Lee Jung-jae), who's determined to take down the games once and for all, and Front Man (Lee Byung-hun), who desperately wants to break Gi-hun's faith in humanity. Veteran Korean actor Lee Jung-jae returns as Gi-hun in the final season.

In the final scene of the show, Front Man finds himself in a car in downtown Los Angeles, and while stopped at a light, he hears some familiar sounds: the thwap of two ddakji tiles hitting the ground, followed by the sharp crack of a slap across the face. Front Man rolls down his window and sees a suited Blanchett playing ddakji with a seemingly desperate man in an alleyway. Blanchett looks up and exchanges a knowing glance with Front Man, who pulls away as Blanchett's attention returns to her new recruit.

'We thought having a woman as a recruiter would be more dramatic and intriguing,' said Squid Game director Hwang Dong-hyuk. 'And as for why Cate Blanchett, she's just the best, with unmatched charisma. Who doesn't love her? So we were very happy to have her appear. We needed someone who could dominate the screen with just one or two words, which is exactly what she did,' he continued.

Cate Blanchett makes an appearance in the finale. Picture: Netflix
The final scene of the series sets up the spin-off. Picture: Netflix

'If Gong Yoo is the Korean Recruiter, I thought she would be the perfect fit as the American Recruiter, bringing a short but gripping and impactful ending to the story.'

He went on to reveal that Blanchett had very limited time to film the cameo, so much so that she shot the entire thing in one take. 'During the shoot, she reminded me of what true talent looks like. Even with just a few looks and lines, her performance was mesmerising,' he shared. 'She was amazing at playing ddakji. I believe she successfully flipped the ddakji with her first try, and we were able to get that one long take right away.'

But what exactly does Blanchett's cameo mean for the future of the show? Quite a bit, it turns out, considering where Netflix plans on taking the franchise next. Season 3 is the last for the Korean version of Squid Game, but the franchise is far from over. An English-language spin-off is being developed by director David Fincher, who has previously worked with Blanchett on The Curious Case of Benjamin Button.

Blanchett's appearance raises questions, however: have the Games always had international counterparts, or were they forced to relocate after nearly being discovered by authorities at the end of the final season?

Squid Game season 3 is available to stream now on Netflix.

Originally published as Spoiler alert! Here's the huge Australian A-lister that appears in the Squid Game finale
