
How to run AI models on your smartphone without an Internet connection
Google AI Edge Gallery enables users to download and run AI models locally on their smartphone, without the need for an Internet connection. — Google
Google has quietly launched a mobile application that lets you run artificial intelligence models locally on your smartphone, without needing to be connected to the Internet, whether via a cellular or Wi-Fi network.
The app, called Google AI Edge Gallery, enables users to download and run artificial intelligence (AI) models locally on their smartphone, without the need for an Internet connection. For the time being, the application is only compatible with Android; a version for iOS is planned for a later date.
Available only as an alpha (experimental) release via GitHub, it offers access to a variety of open-source models from the Hugging Face platform, including Google's Gemma 3n. The application is entirely separate from Google's Gemini AI and its various offshoots. It is designed to be highly intuitive, accessible to all and free of charge on any smartphone running Android 10 or later. Keep an eye on storage, however: each model requires between 500 MB and 4 GB, depending on its capabilities.
Like the most popular generative AI tools, the application uses its various models to answer user queries, generate images from text descriptions, and rewrite, summarise or translate text. It can also assist with programming by generating or modifying source code.
As the application is not available on the Google Play Store, you need to download the Google AI Edge Gallery APK file manually from GitHub. Once it's installed, you can run the AI models of your choice locally.
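For readers comfortable with command-line tools, the install step can be sketched as follows. This is a minimal example assuming adb (from Android's platform-tools) is on your PATH and USB debugging is enabled on the phone; the APK filename below is hypothetical, so check the project's GitHub releases page for the actual file name. Alternatively, you can simply download the APK in the phone's browser and open it, after allowing installs from unknown sources.

```shell
# Sideloading sketch -- the APK filename below is hypothetical;
# download the current release from the project's GitHub page first.
APK="ai-edge-gallery.apk"

if command -v adb >/dev/null 2>&1; then
  # Installs the app on the connected Android device (USB debugging required)
  adb install "$APK"
else
  echo "adb not found: install Android platform-tools first"
fi
```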
The app is being launched quietly, outside Google Play, because it is currently an experimental (and therefore, by definition, unstable) version that needs extensive feedback from users and developers before it can be improved and, one day, offered to everyday users in a more accessible way. – AFP Relaxnews
Related Articles


The Star – 19 hours ago
Opinion: Wanna help save the planet? Stop asking AI dumb questions
It takes huge amounts of energy to power artificial intelligence – so much energy that it's looking less and less likely that the US will meet its goals for reducing greenhouse gas emissions. (If we still have any such goals under President Donald Trump.) What's less known is that AI also consumes copious amounts of water needed to cool all that IT equipment. To generate a 100-word email, a chatbot using GPT-4 requires 519 millilitres of water – roughly equivalent to an 18-ounce bottle of water. That doesn't sound like much, but when you multiply that by the number of users, it's significant.

Also, it requires far more than 100 words for AI to respond to our most pressing questions, such as:

– What are three excuses for skipping dinner at my (fill in the blank's) house tonight?
– Can you rewrite this email to make me sound smarter?
– How do you make a mojito?
– Does this outfit look good on me?

If you are wondering about that last query, yes, there are folks who rely on ChatGPT for wardrobe advice. Some check in with Chat on a daily basis by uploading a photo of themselves before they leave the house, just to make sure they look presentable. These superusers often spring for a US$20-per-month (RM84) subscription to ChatGPT Plus, which provides priority access, among other perks. Chat can also help you write a dating profile, plan a trip to Mexico City, manage your finances, give you relationship advice, tell you what shampoo to use and what color to paint your living room. Another plus: ChatGPT never talks down to you. Even the most outlandish queries get a polite, ego-boosting response like this: 'That's a thoughtful and important question. Here's a grounded response.'

Google vs ChatGPT

But again, it's hard to get around the fact that AI is hard on the planet. Example: The New York Times reports that Amazon is building an enormous AI data centre in Indiana that will use 2.2 gigawatts of electricity, which is enough to power a million homes.
And according to a report from Goldman Sachs, 'a ChatGPT query needs nearly 10 times as much electricity to process as a Google search.' So we could save energy by opting for Google search, except Google is getting into the AI business, too. Have you noticed those 'AI overviews' at the top of search results? Those come at an environmental cost. 'Embedding generative AI in such a widely used application is likely to deepen the tech sector's hunger for fossil fuels and water,' writes Scientific American staffer Allison Parshall.

The good news is there is a way to block those pesky AI overviews; YouTube has tutorials that will walk you through it. In further good news, there are smart people looking for ways to make AI more environmentally friendly, but that could take a while.

In the meantime, should we conserve water and energy by letting AI focus on important tasks like diagnosing breast cancer, predicting floods and tracking icebergs? Maybe stop running to ChatGPT every time we have a personal problem? Should I feel guilty, for example, if I ask Chat how to stop my cats from scratching the couch?

Not according to Chat. 'No, guilt isn't productive unless it's leading you to positive action,' Chat told me. 'Instead, awareness is more productive.' But if you do worry about the planet, Chat recommends using AI 'with purpose' rather than as entertainment. No need to swear it off entirely. 'The focus should be on conscious consumption rather than abstinence,' Chat says.

Lower 'brain engagement'

That sounds reasonable, except a recent MIT study offers evidence that the longer we use AI, the less conscious we become. Using an EEG to measure the brain activity of 54 subjects, researchers found that those who used ChatGPT to write SAT essays had lower 'brain engagement' than two other groups – one was allowed to use Google search and the other relied solely on brain power to complete the essays.
'Over the course of several months, ChatGPT users got lazier with each subsequent essay, often resorting to copy-and-paste by the end of the study,' Time magazine reported. Granted, this is only one small study. But to be on the safe side, I'm going to lay off Chat for a while. Maybe I'll hit Google with that cat question. There is, however, one thing Google can't tell me: Does that dress I ordered online look OK on me or should I send it back? Tell me what you think, Chat. And please, be brutally honest. – The Sacramento Bee/Tribune News Service


The Sun – 19 hours ago
OpenAI turns to Google's AI chips to power its products, source says
OPENAI has recently begun renting Google's artificial intelligence chips to power ChatGPT and its other products, a source close to the matter told Reuters on Friday.

The ChatGPT maker is one of the largest purchasers of Nvidia's graphics processing units (GPUs), using the AI chips to train models and also for inference computing, a process in which an AI model uses its trained knowledge to make predictions or decisions based on new information.

OpenAI planned to add Google Cloud service to meet its growing needs for computing capacity, Reuters had exclusively reported earlier this month, marking a surprising collaboration between two prominent competitors in the AI sector.

For Google, the deal comes as it is expanding the external availability of its in-house tensor processing units (TPUs), which were historically reserved for internal use. That helped Google win customers including Big Tech player Apple as well as startups like Anthropic and Safe Superintelligence, two ChatGPT-maker competitors launched by former OpenAI leaders.

The move to rent Google's TPUs signals the first time OpenAI has used non-Nvidia chips meaningfully and shows the Sam Altman-led company's shift away from relying on backer Microsoft's data centers. It could potentially boost TPUs as a cheaper alternative to Nvidia's GPUs, according to The Information, which reported the development earlier.

OpenAI hopes the TPUs, which it rents through Google Cloud, will help lower the cost of inference, according to the report. However, Google, an OpenAI competitor in the AI race, is not renting its most powerful TPUs to its rival, The Information said, citing a Google Cloud employee.

Google declined to comment, while OpenAI did not immediately respond to Reuters when contacted. Google's addition of OpenAI to its customer list shows how the tech giant has capitalized on its in-house AI technology, from hardware to software, to accelerate the growth of its cloud business.

