Latest news with #JohnAbel


Web Release
7 days ago
- Automotive
- Web Release
FORMULA E AND GOOGLE CLOUD ANNOUNCE INCLUSIVITY PODCAST FOR VISUALLY IMPAIRED MOTORSPORT FANS
Blind and visually impaired Middle Eastern motorsport fans will soon become more immersed in Formula E through an innovative, AI-powered audio race report being created and rolled out by the electric racing organisation and its Official Cloud Partner, Google Cloud. The news comes after it was announced that the 2025 Jeddah E-Prix double-header became the most-watched Formula E weekend in history, with an unprecedented 65 million global cumulative viewers.

Unveiled at the Google Cloud Summit in London by Formula E CEO Jeff Dodds, the project uses Google Cloud's generative AI technology to create rich, descriptive audio summaries of every E-Prix race. The content will be made available globally on Spotify and other popular audio platforms in more than 15 languages, with Arabic featured prominently alongside English, Spanish, French, German, and Mandarin – ensuring Middle Eastern fans can experience the thrill of Formula E racing in their native language. The reports will give fans a dynamic recap that captures the excitement and key moments of the race, available on demand within minutes of the chequered flag.

The initiative was born from a Google Cloud Hackathon held at the 2024 London E-Prix and is being developed in close partnership with the Royal National Institute of Blind People (RNIB) to ensure the final product meets the needs of visually impaired users. Formula E and Google Cloud will work with the RNIB to conduct focus groups and user testing during the upcoming race weekends in Berlin and London, with a full rollout planned for Season 12.

Jeff Dodds, CEO, Formula E, said: 'At Formula E, we believe the thrill of electric racing should be accessible to everyone. This innovative collaboration with Google Cloud is a fantastic example of how technology can be used for good, creating a brand-new way for blind and visually impaired fans to experience the drama and emotion of our sport. By working closely with the RNIB, we are ensuring this innovation is truly inclusive and fit for purpose, so that no fan is left behind.'

John Abel, Managing Director, Specialised Software, Google Cloud, said: 'For too long, the visual nature of racing has been a barrier for fans who are blind or visually impaired. Google Cloud's AI technology will act as a digital storyteller, creating a vivid audio narrative that brings the speed, strategy, and excitement of Formula E to life. We are proud to work alongside a partner like Formula E that shares our passion for using innovation to break down barriers and connect people through shared experiences.'

Sonali Rai, RNIB's Media, Culture and Immersive Technology Lead, said: 'Audio description transforms how blind and partially sighted motorsport fans can fully engage in enjoying the full racing spectacle – taking in the visceral sounds of cars on the track while feeling the passion of the crowd.

'RNIB has been working with Formula E and Google Cloud on this AI-powered podcast, which promises to give a full picture of the race in an accessible and engaging way for blind and partially sighted racing fans. Formula E's commitment to working directly with the blind and partially sighted community to develop this technology is exactly the right approach and sets a fantastic standard in inclusivity for other sports to follow and stay on track with new advances in innovation.'

How The Technology Works:

The audio report is created through a multi-stage process powered by Google Cloud's AI platform, Vertex AI:

- Transcription: Google's Chirp model accurately transcribes the live race commentary.
- Analysis and generation: Google's Gemini models then analyse the transcribed commentary alongside live timing data and other official race information, identify key events – such as overtakes, incidents, and strategic pit stops – and generate a fact-based, engaging race summary.
- Audio production: Finally, the text is converted into natural, expressive speech using advanced text-to-speech technology, creating a polished audio report ready for distribution.
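The three stages described above could be orchestrated roughly as follows. This is a minimal sketch only: each stage function is a local stand-in for the corresponding Google Cloud call (Chirp speech-to-text, Gemini on Vertex AI, and Text-to-Speech), and every name and data shape here is an illustrative assumption, not Formula E's actual implementation.

```python
from dataclasses import dataclass

def transcribe_commentary(audio_chunks: list[str]) -> str:
    """Stage 1 stand-in for Chirp: fold commentary chunks into one transcript."""
    return " ".join(audio_chunks)

def generate_summary(transcript: str, timing_data: dict) -> str:
    """Stage 2 stand-in for Gemini: combine the transcript with official
    timing data into a fact-based race summary."""
    winner = timing_data["winner"]
    laps = timing_data["laps"]
    return f"{winner} won after {laps} laps. Highlights: {transcript}"

def synthesise_audio(text: str, language: str) -> bytes:
    """Stage 3 stand-in for text-to-speech: return placeholder audio bytes."""
    return f"[{language} audio] {text}".encode("utf-8")

@dataclass
class RaceReport:
    language: str
    summary: str
    audio: bytes

def build_race_reports(audio_chunks: list[str], timing_data: dict,
                       languages: list[str]) -> list[RaceReport]:
    """Run transcription and analysis once, then produce one report per language."""
    transcript = transcribe_commentary(audio_chunks)
    summary = generate_summary(transcript, timing_data)
    return [RaceReport(lang, summary, synthesise_audio(summary, lang))
            for lang in languages]
```

In a real deployment each stand-in would be replaced by the corresponding Google Cloud client call, and the per-language loop would drive translation and voice selection for the 15-plus target languages.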


Time of India
09-07-2025
- Automotive
- Time of India
Formula E and Google Cloud bring races to visually impaired fans, here's how it works
Image credit: Formula E

Can the visually impaired also enjoy the fun and thrill of a Formula E race? Yes, it's now possible – thanks to a new initiative by Formula E and Google Cloud: an AI-powered audio race report. The initiative promotes inclusivity and makes motorsport more accessible for blind and visually impaired fans. Google Cloud will work with the Royal National Institute of Blind People (RNIB) to conduct focus groups and user testing during upcoming race weekends in Berlin and London, with a full rollout planned for Formula E Season 12.

All about the Formula E and Google Cloud project for visually impaired fans

The project was unveiled by Formula E CEO Jeff Dodds at the Google Cloud Summit in London. It uses Google Cloud's generative AI technology to create rich, descriptive audio summaries of every E-Prix race. These reports will offer fans a dynamic recap that captures the excitement and key moments of each race, available on demand shortly after the chequered flag. The initiative originated from a Google Cloud Hackathon held during the 2024 London E-Prix and is being developed in close partnership with RNIB to ensure the final product meets the needs of visually impaired users.

Sharing his enthusiasm about the collaboration, Formula E CEO Jeff Dodds said, 'At Formula E, we believe the thrill of electric racing should be accessible to everyone. This innovative collaboration with Google Cloud is a fantastic example of how technology can be used for good – creating a brand-new way for blind and visually impaired fans to experience the drama and emotion of our sport. By working closely with the RNIB, we are ensuring this innovation is truly inclusive and fit for purpose, so that no fan is left behind.'

John Abel, Managing Director of Specialised Software at Google Cloud, added, 'For too long, the visual nature of racing has been a barrier for fans who are blind or visually impaired.
Google Cloud's AI technology will act as a digital storyteller, creating a vivid audio narrative that brings the speed, strategy, and excitement of Formula E to life. We are proud to work alongside a partner like Formula E that shares our passion for using innovation to break down barriers and connect people through shared experiences.'

Image credit: Formula E

Sonali Rai, RNIB's Media, Culture and Immersive Technology Lead, also said, 'Audio description transforms how blind and partially sighted motorsport fans can fully engage in enjoying the full racing spectacle – taking in the visceral sounds of cars on the track while feeling the passion of the crowd.' She continued, 'RNIB has been working with Formula E and Google Cloud on this AI-powered podcast, which promises to give a full picture of the race in an accessible and engaging way for blind and partially sighted fans. Formula E's commitment to working directly with the community to develop this technology is exactly the right approach. It sets a fantastic standard in inclusivity for other sports to follow and stay on track with new advances in innovation.'

How the technology works

The audio report is generated through a multi-stage process powered by Google Cloud's AI platform, Vertex AI. Google's Chirp model accurately transcribes live race commentary. Google's Gemini models then analyze the transcribed commentary along with live timing data and other official race information, identify key events such as overtakes, incidents, and strategic pit stops, and generate an engaging, fact-based race summary. The text is then converted into natural, expressive speech using advanced text-to-speech technology, resulting in a polished audio report ready for distribution. The entire process is completed within minutes of the race's conclusion.
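The analysis stage above hinges on spotting key events in the live timing feed. As one illustrative, hypothetical rule (an assumption for the sketch, not Formula E's actual logic), an overtake could be flagged whenever a driver's position number improves between consecutive laps:

```python
def detect_overtakes(lap_positions: dict[str, list[int]]) -> list[str]:
    """Flag an overtake whenever a driver's position improves (the number
    drops) from one lap to the next. lap_positions maps driver name to
    their classified position after each lap; purely illustrative."""
    events = []
    for driver, positions in lap_positions.items():
        for lap in range(1, len(positions)):
            if positions[lap] < positions[lap - 1]:
                events.append(f"Lap {lap + 1}: {driver} moves up to P{positions[lap]}")
    return events
```

Event strings like these, fed to the language model alongside the commentary transcript, would give the summary its fact-based grounding.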
The reports will be available globally on Spotify and other popular audio platforms in more than 15 languages, including English, Spanish, French, German, Mandarin, and Arabic.