
Google rolls out AI Mode search in India, powered by Gemini 2.5
Google has officially launched its AI Mode search experience in India, bringing advanced, natural search capabilities to users through voice, visuals, and text. Initially piloted in the U.S., this experimental feature is now accessible in English via Search Labs—a platform that allows users to try early Google Search features and provide feedback.
Once activated, AI Mode introduces a dedicated tab in the Google app and search interface. It leverages the powerful Gemini 2.5 model to enhance reasoning and allow for more complex, multi-step queries. Users can input questions through typing, voice commands, or even images, enabling deeper and more intuitive interactions with Search.
This rollout is especially significant in India, where voice and visual search see widespread use. Hema Budaraju, Vice President of Product Management for Search at Google, noted that India leads globally in monthly Google Lens usage. AI Mode is part of Google's mission to make information universally accessible—regardless of how users choose to ask.
The feature also enables follow-up questions and provides links to diverse sources, offering multiple perspectives on any topic. This aligns with the growing trend of users seeking not just answers, but richer understanding.
Google highlighted that more than 1.5 billion people globally now interact with AI Overviews each month. These AI-generated summaries—featured at the top of search results—have led to a 10% increase in engagement for certain query types in both India and the U.S.

Related Articles


India.com
Google CEO Sundar Pichai studied at this IIT, did his engineering in this branch; his net worth is...
The Joint Seat Allocation Authority (JoSAA) Counselling 2025 has started for students following the declaration of the JEE Advanced 2025 result. Through this counselling, students get an opportunity to gain admission to prestigious institutions such as the IITs and NITs according to their rank. Most students want to get into a top IIT, and as always, their favourite branch is Computer Science. Google CEO Sundar Pichai is a role model and a source of inspiration for many of these aspirants, and he too began his journey at an IIT. On that note, here is a look at the Google CEO's educational qualifications and the branch he chose that launched his career.

Who is Sundar Pichai?
Sundar Pichai was born on June 10, 1972, in Madurai, Tamil Nadu, into a typical Tamil family. His father, Raghunath Pichai, was an electrical engineer at a British company and also owned a manufacturing plant; his mother worked as a stenographer. A bright student from childhood, Sundar studied at Jawahar Navodaya Vidyalaya in Chennai until Class 10 and completed his Class 12 board exams at Vana Vani School in Chennai.

Did Sundar Pichai get the branch he wanted?
To land a job at tech companies like Google, most students opt for engineering in the Computer Science branch. Interestingly, Pichai does not hold a computer science degree. According to several media reports, after passing Class 12 Sundar wanted to study Computer Science but missed out because of his JEE rank, which was between 1100 and 1200; he was allotted Metallurgical Engineering instead. He nevertheless studied hard and received the BC Roy Silver Medal in 1993.

From which IIT did Sundar Pichai graduate?
Pichai, a Chennai native, pursued engineering at IIT Kharagpur on the strength of his JEE rank, despite initially aiming for IIT Madras's computer science program. Following his undergraduate studies, he earned an MS in Material Science from Stanford University and an MBA from the Wharton School at the University of Pennsylvania.


Hans India
OpenAI Taps Google's AI Chips in Strategic Shift Away from Nvidia Dependency
In a significant move within the AI landscape, OpenAI, the Microsoft-backed creator of ChatGPT, has reportedly begun utilizing Google's artificial intelligence chips. According to a recent report by Reuters, this development points to OpenAI's efforts to diversify its chip suppliers and reduce its dependency on Nvidia, which currently dominates the AI hardware market.

OpenAI has historically been one of the largest buyers of Nvidia's graphics processing units (GPUs), using them extensively for both training its AI models and performing inference tasks, where the model applies learned data to generate outputs. However, as demand for computing power surges, OpenAI is now exploring alternatives. The Reuters report, citing a source familiar with the matter, claims that OpenAI has started using Google's Tensor Processing Units (TPUs), marking a notable shift not only in its hardware strategy but also in its reliance on cloud services. Earlier this month, Reuters had already suggested that OpenAI was planning to leverage Google Cloud to help meet its growing computational needs.

What makes this collaboration remarkable is the competitive context. Google and OpenAI are direct rivals in the AI field, both vying for leadership in generative AI and large language model development. Yet this partnership demonstrates how shared interests in infrastructure efficiency and cost management can bridge even the most competitive divides.

According to The Information, this is OpenAI's first major deployment of non-Nvidia chips, indicating a deliberate effort to explore alternative computing platforms. By leasing Google's TPUs through Google Cloud, OpenAI is reportedly looking to reduce inference costs, a crucial factor as AI services like ChatGPT continue to scale.

The move is also part of a broader trend at Google. Historically, the tech giant has reserved its proprietary TPUs mainly for internal projects. However, it appears Google is now actively expanding external access to these chips in a bid to grow its cloud business. This strategy has reportedly attracted several high-profile clients, including Apple and AI startups like Anthropic and Safe Superintelligence, both founded by former OpenAI employees and seen as emerging competitors.

A Google Cloud employee told The Information that OpenAI is not being offered Google's latest-generation TPUs, suggesting the company is balancing business expansion with competitive caution. Still, the fact that OpenAI is now a customer illustrates Google's ambition to grow its end-to-end AI ecosystem, spanning hardware, software, and cloud services, even if that means partnering with direct rivals.

Neither Google nor OpenAI has issued official statements confirming the deal. Yet the development signals an evolving AI infrastructure market where flexibility, cost-efficiency, and compute availability are becoming more strategic than ever. As the race to power the future of AI intensifies, such cross-competitive collaborations could become more commonplace, redefining how major players navigate both cooperation and competition in the era of intelligent computing.


Times of India
OpenAI turns to Google's AI chips to power its products: The Information
OpenAI has recently begun renting Google's artificial intelligence chips to power ChatGPT and other products, The Information reported on Friday, citing a person involved in the arrangement. The move, which marks the first time OpenAI has used non-Nvidia chips in a meaningful way, shows the Sam Altman-led company's shift away from relying on backer Microsoft's data centres, potentially boosting Google's tensor processing units (TPUs) as a cheaper alternative to Nvidia's graphics processing units (GPUs), the report said.

As one of the largest purchasers of Nvidia's GPUs, OpenAI uses AI chips to train models and also for inference computing, a process in which an AI model uses its trained knowledge to make predictions or decisions based on new information. OpenAI hopes the TPUs, which it rents through Google Cloud, will help lower the cost of inference, according to the report. However, Google, an OpenAI competitor in the AI race, is not renting its most powerful TPUs to its rival, The Information said, citing a Google Cloud employee.

Neither OpenAI nor Google immediately responded to Reuters' requests for comment. OpenAI planned to add Google Cloud service to meet its growing needs for computing capacity, Reuters had exclusively reported earlier this month, marking a surprising collaboration between two prominent competitors in the AI sector.

For Google, the deal comes as it is expanding external availability of its in-house TPUs, which were historically reserved for internal use. That has helped Google win customers including Big Tech player Apple as well as startups like Anthropic and Safe Superintelligence, two OpenAI competitors launched by former OpenAI leaders.