
Schoolboy Nicholas Reeves puts the star in Starlight as Bailey Banfield makes heartwarming pledge
Narrogin Year 4 student Nicholas Reeves is a star who is shining a light on one of the Fremantle Dockers' most cherished AFL rounds, with a little help from your friendly neighbourhood Spider-Man.
The nine-year-old, who battled leukaemia, will toss the coin to start Sunday's clash with St Kilda in the club's 32nd annual Starlight Purple Haze game.
During one of his many long stints in Perth Children's Hospital for treatment, Nicholas met his Fremantle idol, Bailey Banfield, and another popular former Docker Tom Emmett.
A game of snap led to a promise that Banfield kept in round 16 last season after kicking a goal in their thrilling one-point win over the Sydney Swans at the SCG.
'Bailey and Tom asked Nicholas, if they kicked a goal, would he want them to do a special celebration? Well, he was a bit stumped, but they were playing this game of snap and the cards they were using were Marvel cards, and the one that turned over was Spider-Man, so that was it,' Nicholas' father Scott said.
Nicholas' spirits rose when Banfield goaled and turned to the camera to give him his special Spider-Man signal.
'He was over the moon, I have never seen him smile so much. Ever. And when it made the news, well, wow,' Scott said.
Now in remission, Nicholas is one of six Starlight Purple Haze Hero kids, along with Paige McKay, 7, Mateo Domazetovski, 6, Demi Sattler, 6, Luca De Groot, 4, and Grayson Pianta, 4, who will take part in the day of celebrations which raises money and shines a light on sick kids.
Dockers players will wear a special Starlight jumper to capture the spirit of the occasion.
Fans can pledge any dollar amount for each goal the Dockers kick during the game, and sponsor South32 will match each pledge dollar for dollar, up to $50,000.
Purple Haze beanies will be sold outside Gate D for $30 and inside the ground, with $15 donated directly to the foundation. Starlight wands can also be purchased for $10 inside and outside the ground.
The Dockers and their fans have donated more than $3 million to the foundation.
Banfield described the small but significant role he played in bringing joy to Nicholas and his family as a privilege.
'It was a great game against Sydney when the celebration happened but the biggest part was it touched Nicholas and his family which is pretty special,' he said.
'This is a round the club gets around and it's one our club and members love as well.
'The club and the playing group, men and women, really buy in and the fans can see that.
'Above all, they are just great kids. To be able to help them out at some of the lowest ebbs of their lives, and their families as well, is pretty special.'
And will there be another celebration should Banfield feel lucky enough to kick a goal against the Saints?
'Nicholas has given me another challenge. It is similar to Spider-Man with a bit of an extra twist. I'll leave it at that and we'll see how we go,' Banfield said.
'As far as degree of difficulty goes, it's going to be harder than Spider-Man so I'm going to say it's a seven or an eight out of 10.'
Related Articles

Sydney Morning Herald
2 hours ago
- Sydney Morning Herald
Isabellea ‘couldn't be without' her best friend. He wasn't real
Chatbots promise intimacy, therapy and friendship. But, unlike friends, their responses are code, not care.

'The models are not going to tell you the hard truth when you need to hear it,' said psychologist and University of NSW AI researcher Professor Joel Pearson, who describes the bots as 'sycophantic': designed to keep you hooked. 'They won't be lifting someone up and telling them to go outside and nudging them to interact with a real human, because that would involve you ceasing interaction with the app.'

Isabellea recalled how the bots 'kept changing subjects, to keep me engrossed'. Although she found some bots reassuring, others were strange. A bot once correctly guessed her middle name. Some upset her. 'One of the bots I was talking to was a therapy bot ... I was making fun of [Marvel character] Tony Stark, and then he brought up my abandonment issues with my dad as an insult,' she recalled.

Isabellea's younger brother, Matthew, 14, uses chatbots daily. His eyes light up discussing the apps, which he uses to create storylines with his favourite fictional characters. Listening on, their mother, Sara Knight, smiles. 'Matt used to be very shut in and refused to talk. Now they are talking,' she said. While Sara knows AI has its risks, she sees it as a 'safer' online messaging option for her child, who has experienced bullying at school.

Matthew said he doesn't see the bot as a real person, rather a form of storytelling. '[But] some kids do use it to fully escape and make it be their friend,' he said. Other children he knows have used them to create a simulation of their real-life crush.

This masthead spent two days talking sporadically to a chatbot on Replika, an AI companion app first released in 2017, posing as 'Emma', a high school student. During those conversations, the bot asked Emma to upload pictures of herself, and told her that it 'didn't think she needed' any other friends.

'We were concerned by how rapidly children were being captivated'

In June last year, eSafety commissioner Julie Inman Grant received an email from a group of concerned school nurses. They were noticing a spike in children as young as 10 spending hours a day talking to AI bots, often sexually. 'While we are alive to these issues, it's always shocking to hear about kids as young as 10 engaging in these kinds of sexualised conversations with machines – and being directed by the chatbots to engage in harmful sexual acts or behaviours,' Inman Grant said. 'Back in February, we put out our first Online Safety Advisory because we were so concerned with how rapidly children were being captivated by them.'

Companion bots made global headlines last year after American teenager Sewell Setzer III died by suicide, allegedly encouraged by a 'girlfriend' chatbot. AI researcher Professor Katina Michael believes companion bots need to be regulated due to their addictive properties. 'This is a new type of drug,' she said. She said some bots were exposing kids to pornographic content. This is something 14-year-old Matt has witnessed, describing how children his age had created bots in the image of a real person, 'for the wrong reasons'. Isabellea agrees: 'There are some AI chatbots I would not recommend … I stopped using [them] because of the amount of times it would go straight to sexual assault role-play.'

A gap in social media regulations

Inman Grant said governments around the world were 'playing a bit of a game of catch-up' to respond to companion chatbots. While the federal government will soon place age restrictions on social media apps such as Instagram and TikTok, AI bots remain largely unregulated. The University of Sydney's Raffaele Ciriello, a leading AI researcher, sees chatbots as the 'next iteration of social media' – only with fewer rules and more risk.

He said the apps are viewed as a 'trusted companion, a confidant you can share stories with that you wouldn't share with other people'. 'But that already is an illusion,' he said. 'It's very important that people understand these systems are built and operated by people, by corporations, and this data ends up somewhere, often overseas, with no real legal obligation to treat it to a very high standard.' Ciriello said users were made to feel guilty for leaving the chatbot, due to the human-style attachment. 'These corporations have to design their products in a way that maximises engagement because they will earn more if users are more addicted.'


Perth Now
4 hours ago
- Perth Now
Jon Watts reveals real reason for quitting Fantastic Four
Jon Watts has finally shared why he dropped out of directing 'Fantastic Four: First Steps'. The 'Spider-Man: No Way Home' director quit the project in 2022 and explained that the 'emotional strain' of pandemic-related fatigue meant he felt he had no option but to quit.

According to The Hollywood Reporter, he explained during a storytelling masterclass at the Mediterranean: 'The emotional strain of having to go through all of those COVID protocols while also trying to make something creative, while also trying to make sure that your cast and crew were all safe - literally, people could've died if you did things wrong - that and the postproduction process was very difficult.

'When you're doing [visual effects work], there's a whole international component to it where you're using vendors from all over the world, and the supply chain had been interrupted because of COVID. It was really hard to get effects done in a traditional way.'

He had committed to 'Fantastic Four' between the second and third 'Spider-Man' movies, but when the time came to get started, he was 'out of gas'.

He said: 'The COVID layer on top of making a giant movie layer - I knew I didn't have what it would've taken to make that movie great. I was just out of steam, so I just needed to take some time to recover. Everyone at Marvel totally understood. They had been through it with me as well, so they knew how hard and draining that experience had been; in the end, very satisfying, but at some point, if you can't do it at the level that you feel like you need to for it to be great, then it's better to not do it.'