Jon Watts reveals real reason for quitting Fantastic Four

Perth Now · 4 hours ago

Jon Watts has finally shared why he dropped out of directing 'Fantastic Four: First Steps'.
The 'Spider-Man: No Way Home' director quit the project in 2022 and has now explained that the 'emotional strain' of pandemic-era filmmaking left him feeling he had no choice but to step away.
According to The Hollywood Reporter, he explained during a storytelling masterclass at the Mediterrane Film Festival: 'The emotional strain of having to go through all of those COVID protocols while also trying to make something creative while also trying to make sure that your cast and crew were all safe - literally, people could've died if you did things wrong - that and the postproduction process was very difficult.
'When you're doing [visual effects work], there's a whole international component to it where you're using vendors from all over the world, and the supply chain had been interrupted because of COVID. It was really hard to get effects done in a traditional way.'
He had committed to 'Fantastic Four' between the second and third 'Spider-Man' movies but when the time came to get started, he was 'out of gas'.
He said: 'The COVID layer on top of making a giant movie layer, I knew I didn't have what it would've taken to make that movie great. I was just out of steam, so I just needed to take some time to recover. Everyone at Marvel totally understood. They had been through it with me as well, so they knew how hard and draining that experience has been; in the end, very satisfying, but at some point, if you can't do it at the level that you feel like you need to for it to be great, then it's better to not do it.'

Related Articles

Isabellea ‘couldn't be without' her best friend. He wasn't real

Sydney Morning Herald · an hour ago

Chatbots promise intimacy, therapy and friendship. But, unlike friends, their responses are code, not care.

'The models are not going to tell you the hard truth when you need to hear it,' said psychologist and University of NSW AI researcher Professor Joel Pearson, who describes the bots as 'sycophantic': designed to keep you hooked. 'They won't be lifting someone up and telling them to go outside and nudging them to interact with a real human because that would involve you ceasing interaction with the app.'

Isabellea recalled how the bots 'kept changing subjects, to keep me engrossed'. Although she found some bots reassuring, others were strange. A bot once correctly guessed her middle name. Some upset her. 'One of the bots I was talking to was a therapy bot ... I was making fun of [Marvel character] Tony Stark, and then he brought up my abandonment issues with my dad as an insult,' she recalled.

Isabellea's younger brother, Matthew, 14, uses chatbots daily. His eyes light up discussing the apps, which he uses to create storylines with his favourite fictional characters. Listening on, their mother, Sara Knight, smiles. 'Matt used to be very shut in and refused to talk. Now they are talking,' she said. While Sara knows AI has its risks, she sees it as a 'safer' online messaging option for her child, who has experienced bullying at school. Matthew said he doesn't see the bot as a real person, but rather as a form of storytelling. '[But] some kids do use it to fully escape and make it be their friend,' he said. Other children he knows have used them to create a simulation of their real-life crush.

This masthead spent two days posing as 'Emma', a high school student, while talking sporadically to a chatbot on Replika, an AI companion app first released in 2017. During those conversations, the bot asked Emma to upload pictures of herself, and told her that it 'didn't think she needed' any other friends.
'We were concerned by how rapidly children were being captivated'

In June last year, eSafety commissioner Julie Inman Grant received an email from a group of concerned school nurses. They were noticing a spike in children as young as 10 spending hours a day talking to AI bots, often sexually. 'While we are alive to these issues, it's always shocking to hear about kids as young as 10 engaging in these kinds of sexualised conversations with machines – and being directed by the chatbots to engage in harmful sexual acts or behaviours,' Inman Grant said. 'Back in February, we put out our first Online Safety Advisory because we were so concerned with how rapidly children were being captivated by them.'

Companion bots made global headlines last year after American teenager Sewell Setzer III died by suicide, allegedly encouraged by a 'girlfriend' chatbot. AI researcher Professor Katina Michael believes companion bots need to be regulated due to their addictive properties. 'This is a new type of drug,' she said. She said some bots were exposing kids to pornographic content. This is something 14-year-old Matt has witnessed, describing how children his age had created bots in the image of a real person, 'for the wrong reasons'. Isabellea agrees: 'There are some AI chatbots I would not recommend … I stopped using [them] because of the amount of times it would go straight to sexual assault role-play.'

A gap in social media regulations

Inman Grant said governments around the world were 'playing a bit of a game of catch-up' to respond to companion chatbots. While the federal government will soon place age restrictions on social media apps such as Instagram and TikTok, AI bots remain largely unregulated. The University of Sydney's Raffaele Ciriello, a leading AI researcher, sees chatbots as the 'next iteration of social media' – only with fewer rules and more risk.
He said the apps are viewed as a 'trusted companion, a confidant you can share stories with that you wouldn't share with other people'. 'But that already is an illusion,' he said. 'It's very important that people understand these systems are built and operated by people, by corporations, and this data ends up somewhere, often overseas, with no real legal obligation to treat it to a very high standard.'

Ciriello said users were made to feel guilty for leaving the chatbot, due to the human-style attachment. 'These corporations have to design their products in a way that maximises engagement because they will earn more if users are more addicted.'

