Your Phone Won't Be Enough to Power the First Real Pair of Android XR Glasses

Gizmodo · 10-06-2025
The first pair of commercial augmented reality glasses to sport Google's newfangled Android XR won't be a self-contained wearable computer you can simply slip on and off. Xreal's Project Aura glasses still aren't available, but the bare details of the device's specs offer a better idea of what the first pair of true AR glasses may look like, when and if you can actually buy them. While we all hoped we could simply tether our AR glasses to an Android phone, it's now clear that today's phones won't be enough to handle all the 3D and AI goodness packed inside.
At Augmented World Expo (AWE) on Tuesday, Xreal revealed a few scant details about a device that was already short on specifics. The company isn't showing off any new images of the device, nor is it offering regular folks the chance to try it. Instead, the AR glasses maker finally confirmed that Aura won't process data on the glasses themselves. It will instead rely on a separate, tethered, puck-like device that sits in your pocket. Like Xreal's other products, such as the recent One Pro, the glasses have a built-in wire, though you'll be able to disconnect it if you want to store the glasses and puck separately.
Additionally, Xreal finally offered some insight into what's powering the new device. Aura uses two chipsets, one in the glasses and another in the compute device. The glasses frame will house the X1S, a modified version of the X1 chip found in the Xreal One glasses. Xreal claims it's a more powerful processor than the one in its streaming glasses, but even that isn't enough for an 'optical see-through' device. The puck will house a Qualcomm Snapdragon chip, though the company didn't specify whether it's the same Snapdragon XR2 Plus Gen 2 that powers the only other fully revealed Android XR device, Samsung's Project Moohan.
Simply put, your phone isn't built to handle the same tasks, according to Xreal. Meta's Project Orion was the first announced AR prototype to require a similar 'compute puck.' The need for an extra processing unit leaves me questioning whether these devices will have the battery capacity necessary for more than a few hours of heavy use. Xreal didn't offer details about battery life, though it did confirm the glasses will have front-facing sensors with hand tracking, akin to the Meta Quest 3 and Apple Vision Pro.
Xreal offered more detail about the displays it's using in Aura. The glasses sport a new flat prism lens rather than the larger 'birdbath' lenses of older devices. This 'flat prism' is a triangular pane of glass that's around 44% smaller than before, yet it should offer a larger field of view (FOV) of up to 70 degrees. That's relatively wide for a pair of AR glasses, far more than the 57-degree FOV of the Xreal One Pro, and it should get much closer to the experience of wearing a full VR headset, at least on the horizontal axis.
These glasses are built with many of the same apps as Project Moohan in mind. We've only seen a few short demos of Samsung's XR device and how it works in concert with Google's Gemini AI. Moohan should be able to run most Android apps, but it can also add 3D functionality to apps like Google Maps. Gizmodo had the opportunity to try a different pair of Android XR glasses at last month's Google I/O, though that truncated demo paled in comparison to the navigation and memory capabilities Google has previously shown to the public.
Xreal hasn't offered any indication of a release date or price, but we can only assume it will be a while, and that it will be very expensive. That processor will be pricey, and reports suggest Moohan will also be a costly device. Aura's wired connection means you'll never be truly inconspicuous wearing the glasses unless you somehow finagle the wire to run down the inside of your shirt. A pair of $300 Ray-Ban Meta or similar smart glasses will still be far slimmer, with no wired connection needed. Those devices aren't exactly cheap, but we'll have to see if consumers are truly ready for a device that costs closer to $1,000 just for the sake of having a tiny heads-up display on their spectacles.

Related Articles

Galaxy Z Flip 7 FE tipped for Unpacked release thanks to a new case leak

Tom's Guide · 42 minutes ago

We know the next Galaxy Unpacked is happening next week, with Samsung set to unveil at least two new foldables: the Samsung Galaxy Z Fold 7 and Galaxy Z Flip 7. But if details spotted by a major casemaker are to be believed, we may be seeing the Samsung Galaxy Z Flip 7 FE at the same time.

As spotted by Android Central, case maker Spigen published listings for the Galaxy Z Flip 7 and the Z Flip 7 FE a little too early. The listings were removed pretty quickly and didn't show any images of the phones in question, but that doesn't mean much on the modern internet. The most important thing here is that Spigen seemingly confirmed Samsung will be calling the cheaper foldable the Z Flip 7 FE, rather than just the Z Flip FE. It doesn't make a huge difference either way, but because the Galaxy Z Fold SE was released last year without any numbering, the names given to the rumored phone were always a little contentious. It also suggests that the Z Flip 7 FE will be arriving at Galaxy Unpacked alongside the flagship foldables, not "several months later," as some early rumors first claimed.

Rumors about the Z Flip 7 FE have been few and far between, so there isn't a whole lot we really know about the phone just yet. What is clear, though, is that this should be a cheaper version of the Galaxy Z Flip 7, complete with some compromises and hardware nerfs to justify the lower price. How much cheaper? We don't know, but the Korean model is rumored to cost 1 million won, which is about a third less than the Galaxy Z Flip 6.

Leaked CAD renders also suggest we'll see a design similar to the Z Flip 6, complete with a 6.7-inch display and a 3.4-inch cover screen paired with two camera lenses. The screen itself is said to be FHD+ resolution with a 120Hz refresh rate. Internally, we've heard that the phone could run on an Exynos 2400e chipset, alongside 8GB of RAM and base storage of 128GB.

It sounds like we don't have long to wait before we find out for sure. In the meantime, be sure to check out our Samsung Galaxy Z Fold 7, Samsung Galaxy Z Flip 7 and Samsung Galaxy Z Flip 7 FE hubs for all the latest news and rumors about the three new foldables.

Winning AI Search: How To Optimize For Discovery In A Generative World

Forbes · an hour ago

Dani Nadel, President and COO, Feedvisor.

Gone are the days of traditional SEO, when ranking on page one guaranteed success. Search is undergoing its most radical transformation since Google became a verb. That once-simple search box is now a generative system: anticipating needs, synthesizing answers and reshaping how consumers discover, evaluate and buy. Today's AI search agents (think ChatGPT, Perplexity and Google's Search Generative Experience, or SGE) go beyond serving links. They interpret intent, scan sources and deliver personalized, conversational answers that often replace the need to browse or click.

How Search Is Fragmenting Across Platforms

Google still drives nearly 90% of global search traffic, but signs of fragmentation are emerging:

• Younger audiences shift: Gen Z and millennials are turning to TikTok, Instagram and YouTube, visual, conversational spaces where search feels more organic and less link-driven. Google usage among Gen Z declined 25% compared to Gen X, signaling a generational pivot.

• AI accelerates the zero-click era: AI overviews now appear in over 13% of Google queries, pushing users toward instant answers. In 2024, up to 60% of Google searches ended without a single click, over 75% on mobile. This is no accident. Tools like Gemini 2.5, powering 'AI Mode,' are keeping users on-platform with synthesized, multimodal responses.

• AI-native platforms gain ground: By early 2025, 40% of U.S. internet users had tried tools like Perplexity or Google's SGE. ChatGPT commands an 80% share and over 500 million users. Gen Z leads, with 82% saying they use AI search tools at least occasionally.

• Search becomes mid-funnel: Only 13% of shoppers begin with Google when they know what they want. Most use it to compare, validate or explore options, not to discover.

The Rise Of Agentic Commerce

By 2026, Gartner predicts that search engine volume will drop by 25%, losing market share to AI chatbots and agents: tools that answer questions and take action. Early efforts like Google's Project Mariner preview this world: bots that browse, compare and purchase on behalf of users. This shift fundamentally changes visibility dynamics. Instead of crawling and ranking pages, AI curates and selects what it deems the best answer. The implications are clear:

• Fewer clicks: Instant answers reduce site traffic. Visibility must happen upstream.

• Higher trust threshold: Structured, credible and current content is prioritized.

• Intent over keywords: AI rewards clear, purposeful language aligned with user goals, not keyword stuffing.

• Performance signals count: Metrics like dwell time, conversions and reviews influence how AI weighs content.

Ranking isn't the goal anymore; being chosen is. My company's internal testing shows that selection hinges on clarity, structure and completeness, traits that help AI systems quickly understand and trust your content. These systems favor content designed for rapid comprehension: well-organized pages with clear headings, concise bullets and repeatable phrasing that mirrors real user queries like 'What does this do?' That trust deepens when your brand appears consistently across marketplaces and third-party sources. The more visible you are, the more credible you become. To earn visibility, you need to earn AI trust. These systems are the new gatekeepers to discovery. While you can't game them, you can design for them.

How AI Is Rewriting Marketplace Discovery

This disruption isn't limited to search engines. AI is reshaping discovery and shopping across Amazon, Walmart and TikTok, often before shoppers reach a search bar. AI agents now surface marketplace listings directly in response to queries. Whether a consumer starts on ChatGPT, Google SGE or TikTok, they see curated product options, bypassing traditional in-platform search. And with tools like Google's AI Mode, agents can monitor availability, track prices, send alerts and guide users toward purchase decisions. Marketplaces are not just destinations; they are sources fueling AI-driven shopping journeys.

Amazon vs. Walmart: Two Paths To AI Discovery

Amazon, one of the many marketplaces my company supports, and Walmart both recognize AI product discovery as the future, but their approaches differ:

• Amazon seems to be operating a mostly closed AI ecosystem, controlling discovery within its platform. From Rufus to product-ranking algorithms, Amazon's AI prioritizes first-party signals like pricing, availability and ratings, and auto-summarizes key product detail page information such as specs and usage scenarios. Success will require strong optimization across product pages, competitive pricing and campaign performance.

• Walmart appears to be betting on the open web. Walmart is making its catalog AI-readable and bot-accessible, preparing for agents like OpenAI's Operator to shop across platforms. Its strategy centers on cross-platform visibility through structured data, review syndication and influencer content that ensures listings are indexed and recommended by AI.

Both reward clarity, credibility and consistency, but they draw from different signals: Amazon prioritizes internal data, while Walmart scans across the web.

How To Optimize For AI-Driven Marketplaces

AI ranks what it understands, what performs and what it trusts. Whether you're trying to rank on ChatGPT, Google SGE, Amazon or Walmart, the principles of AI-native content apply:

• Prioritize relevance and clarity. Use benefit-led, conversational language. Prioritize titles, bullets and descriptions that highlight key features and differentiators to influence both AI and buyer decisions.

• Answer AI-driven prompts. Frame content around real questions like "What problem does this solve?" or "How does it compare?" Mirror how users phrase queries.

• Design for AI comprehension. Use structured formatting, semantic tags, concise sentences and complete fields like specs, features and use cases to support parsing and analysis.

• Elevate visual content. Apply metadata, alt text, lifestyle imagery, charts and short, benefit-driven video above the fold to boost engagement signals. Well-structured visuals aid shopper decision-making and help AI extract information.

• Strengthen engagement signals. Use post-purchase prompts and sampling to drive reviews. Keep Q&A active.

• Distribute influence off-platform. Support mobile, voice and off-site journeys that drive traffic to product detail pages (PDPs).

What Winning In AI Discovery Requires

AI is now the first impression. Whether through a chatbot, generative engine or voice assistant, your brand is being evaluated by algorithms before shoppers even begin to browse. Winning requires evolving from traditional SEO to an AI-native discovery strategy, one designed for people and the AI agents guiding their choices. Amazon and Walmart are charting different paths, but both demand smarter content and sharper signals. What you surface, and how clearly you communicate it, will determine what gets seen, what gets clicked and what ultimately gets bought.
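The "structured data" discussed above usually means schema.org markup embedded in a product page so that crawlers and AI agents can parse listings without scraping prose. As an illustrative sketch only (the product name, SKU, URLs and values below are hypothetical, not from any real catalog), a JSON-LD block for a product listing might look like this:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Wireless Headphones",
  "description": "Noise-cancelling over-ear headphones with 30-hour battery life.",
  "image": "https://example.com/images/headphones.jpg",
  "sku": "EX-HP-001",
  "brand": { "@type": "Brand", "name": "ExampleBrand" },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "214"
  },
  "offers": {
    "@type": "Offer",
    "price": "129.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock",
    "url": "https://example.com/products/headphones"
  }
}
```

A fragment like this, served in a `<script type="application/ld+json">` tag, gives an AI agent the pricing, availability and rating signals the article says these systems weigh, in a form it can read without inference.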

The AI tsunami is sweeping Florida colleges: Will we ride it or drown?

Miami Herald · an hour ago

Throughout human history, groundbreaking technologies have reshaped civilization and marked pivotal moments in human progress. The wheel revolutionized transportation, the radio redefined communication, and antibiotics reimagined medicine. Artificial intelligence (AI) is the next great leap, poised to surpass them all. Not merely an innovation, AI is a transformative force, a virtual tsunami reshaping the foundations of society.

Unlike past technological revolutions, which unfolded over decades or generations, AI's development is accelerating at breathtaking speed. Many of the contemporary uses of AI may well become obsolete before this article is published. Experts predict that capabilities once considered science fiction will soon become reality. The implications will ripple through industries, education, healthcare and human relationships.

Perhaps the greatest impact can be seen in classrooms across Florida and worldwide, where AI is, even now, fundamentally altering the learning process. Students can leverage AI as a powerful research assistant, locating sources, identifying critical information and refining writing. Gone are the days of browsing through endless, time-consuming Google searches. AI provides instant explanations, personalized learning experiences and curated reading suggestions tailored to individual needs.

Beyond research, AI enhances writing by suggesting synonyms, improving sentence flow and identifying grammatical errors. It ensures clarity and coherence, allowing students to focus on argumentation rather than formatting. When it comes to citations, AI compiles bibliographies in the required style, eliminating the tedium of manual reference-checking. AI empowers students to achieve more with their time.

Educators in Florida and across the nation have already recognized AI's immense potential. Some institutions now incorporate AI into coursework, not merely as a tool but as a subject of exploration. Assignments may ask students to evaluate AI's output critically. As one professor noted, 'AI will not replace you, but someone who can take full advantage of AI probably will.'

Beyond education, AI-driven robotics may soon transform daily life. Imagine a world where humanoid robots are as common as cars, handling chores, ordering groceries and managing calendars. Futurists even predict AI companions that engage in meaningful conversations, tutor children or offer emotional support. These possibilities are exciting but raise profound ethical questions. What will the world be like when virtually everyone has access to an assistant with an IQ of over 200 and a vast memory bank? Can machines emulate human relationships? Should they? Can AI offer emotional comfort without the complexities of real human interaction? These questions, once speculative, now demand serious thought.

Like all transformative technologies, AI carries risks. What happens when machines reach intelligence levels beyond human comprehension? What safeguards exist to prevent unintended consequences? AI already generates complex programs in minutes, work that might take human developers weeks. If left unchecked, AI could evolve its own decision-making processes, hidden from human oversight.

Still, AI remains a tool. Its impact depends on how we choose to use it. Fear should not overshadow its promise. Major corporations have embraced AI's productivity-boosting capabilities; at Google and Microsoft, more than 25% of their software code is AI-generated. Governments and industries worldwide are adopting AI, preparing for a future where intelligent automation is not just a benefit but a necessity.

Instead of resisting AI, society must engage with it thoughtfully. By harnessing its power responsibly, AI can elevate human potential. The question isn't simply whether AI will replace us; it's how we can survive this mega-tsunami. One thing is certain: the mega-tsunami has arrived. We can either ride its wave or be overwhelmed by its tide.

Modesto A. Maidique is president emeritus of FIU. His current focus is on the art of leadership. Ronald G. Clark is professor emeritus at the University of Miami's School of Medicine. His current focus is on the neuroscience that underpins artificial intelligence applications. Edwin Luu is a junior majoring in computer engineering at FIU.
