Latest news with #AppleGlasses


Geeky Gadgets
01-07-2025
- Geeky Gadgets
Apple Glasses Release Date and Price
Apple is preparing to make a significant impact in the smart glasses market with the much-anticipated Apple Glasses. Expected to launch in late 2026, these innovative glasses aim to combine advanced technology with everyday practicality. With features like heads-up displays, environmental awareness, and seamless iPhone integration, Apple Glasses are designed to transform how you interact with both the digital and physical worlds. Positioned as a more affordable alternative to the Vision Pro, they are expected to appeal to a wide range of users while maintaining Apple's hallmark of premium functionality. The video from Matt Talks Tech gives us more details on the rumored Apple Glasses.

Advanced Features of Apple Glasses

Apple Glasses are rumored to include a variety of innovative features designed to deliver a seamless augmented reality (AR) experience. These features are expected to enhance usability and provide practical applications for everyday life:

- Heads-Up Display: The glasses will project information directly onto the lenses, allowing you to view notifications, navigation directions, or contextual data without needing to glance at your phone. This feature could make multitasking more efficient and less distracting.
- Environmental Awareness: Equipped with built-in cameras, the glasses will analyze your surroundings in real time. This capability could enable object recognition, helping you identify items, landmarks, or even people in your environment with ease.
- AI Integration: Apple's advanced AI technology is expected to power tasks such as answering questions, identifying objects, and providing turn-by-turn navigation. For example, you could ask the glasses for details about a nearby restaurant or directions to your destination.
- Prescription Lens Compatibility: To ensure accessibility for users with vision needs, Apple may offer prescription lens options, making the glasses a practical choice for a broader audience.

These features highlight Apple's focus on combining functionality with convenience, aiming to create a product that integrates seamlessly into your daily life.

Seamless Integration with iPhones

Apple Glasses are expected to pair effortlessly with iPhones via Bluetooth, using the smartphone's processing power and connectivity. This integration could unlock a range of functionalities, such as accessing apps, responding to messages, or controlling smart home devices. While the glasses may include a dedicated chip for specific tasks, the reliance on iPhone processing is likely to ensure smooth performance and minimal latency. By using the iPhone's ecosystem, Apple Glasses could deliver a cohesive and intuitive user experience. For instance, you might receive notifications directly on the lenses or control your smart home devices with simple voice commands. This seamless integration could make the glasses an indispensable tool for users already invested in Apple's ecosystem.
Navigation and Object Recognition: Practical Everyday Applications

One of the standout features of Apple Glasses is expected to be their enhanced navigation capabilities. By projecting turn-by-turn directions onto the lenses, the glasses could guide you without requiring you to look at a separate device. This hands-free navigation could be particularly useful for activities like walking, cycling, or driving. Additionally, the object recognition feature, powered by AI, could identify items in your surroundings and provide instant, relevant information. Whether it's recognizing a product in a store, identifying a historical landmark, or even offering details about a person, this feature could enhance your understanding of the world around you. These practical applications highlight the potential of Apple Glasses to simplify and enrich everyday experiences.

Competitive Landscape and Pricing

The smart glasses market is becoming increasingly competitive, with major players like Google and Meta actively developing their own products. Google's Android XR and Meta's smart glasses are among the key competitors. However, Apple aims to differentiate itself by using its ecosystem integration and advanced AI capabilities, offering a more cohesive and user-friendly experience. Pricing for Apple Glasses is speculated to start at $600 or higher, depending on the final feature set and market conditions. This pricing strategy positions them as a more affordable alternative to the Vision Pro while still maintaining Apple's reputation for premium design and functionality. By offering a balance between cost and features, Apple could attract a diverse audience, from tech enthusiasts to casual users.

The Future of Wearable Technology

If the rumored features and release timeline hold true, Apple Glasses could represent a significant step forward in wearable technology.
By combining heads-up displays, environmental awareness, and AI-driven features, these glasses aim to enhance how you interact with the world around you. Seamless iPhone integration and competitive pricing further strengthen their appeal, positioning Apple to establish a strong foothold in the emerging smart glasses market.

As the late 2026 release date approaches, the anticipation surrounding Apple Glasses reflects the growing interest in wearable AR devices. Whether you're a tech enthusiast or someone curious about the future of smart wearables, Apple Glasses could mark a pivotal moment in the evolution of wearable technology. With their potential to blend practicality with innovation, these glasses are set to redefine how we engage with both digital and physical environments.

Source & Image Credit: Matt Talks Tech


Tom's Guide
11-06-2025
- Tom's Guide
I just tried visionOS 26 — and the most exciting thing is actually not the Vision Pro
Apple famously does not talk about future products (delayed Siri notwithstanding), but I couldn't help but look ahead as I tried out all of the new features in visionOS 26 for the first time, which is in developer beta now and arriving this fall. Yes, the new visionOS will launch on the Vision Pro, and as far as we know there's no new spatial computing hardware coming from Apple this year, although there are rumors of a lighter Vision Air headset on the way. But as I stared at a panoramic photo widget of Japan and Mt Fuji on the wall in front of me, just like a window, I was thinking a lot more about the implications of visionOS 26 for the rumored Apple Glasses.

During one visionOS 26 demo, I was able to play around with the new spatial widgets, which I think have huge implications for a pair of Apple smart glasses. I opened the new Widgets app in visionOS and then placed a clock on the wall I was staring at, and I could adjust the color and width of the frame. But with another tap I could then make it appear like that widget was literally sunken into the wall, adding an almost freaky sense of depth. I also walked from one room to another to demonstrate the fact that you can pin widgets and make their locations persistent. So, for example, your music widget could always be in the same spot. And as I walked up to that widget I could see more info on the Lady Gaga album and start playing my music.

All of this is cool if you happen to have $3,500 to burn on a Vision Pro and don't mind wearing a 1.3-pound headset all day. But I think use cases like this get much more interesting when you can shrink the technology down to work on a pair of smart glasses.

The Vision Pro could turn 2D photos into 3D before, but it was doing so by displaying information differently to your left eye and right eye. The new Spatial Scenes feature works differently and quickly turns your flat pics into something much more immersive, thanks to generative AI.
In one image I could literally peek behind a rocky outcropping and see more of a body of water that wasn't even there in the original photo. Apple is using a new AI algorithm that leverages computational depth to create multiple perspectives from your 2D photos. The result is that it feels like you can get various perspectives of the images just by leaning into the shot and tilting your head. Again, I can see slipping on a pair of glasses to get this effect, but I don't know if the payoff is worthwhile if we're talking about a bulky headset.

Easily the most jaw-dropping moment of my visionOS 26 demo was being able to see someone paraskiing, thanks to 8K footage captured by an Insta360 3D video camera. The point of this demo was to show that visionOS 26 supports native playback of 180-degree, 360-degree and wide-FOV content from 3D cameras. Apple's new Apple Projected Media Profile takes these shots and remaps them into a sphere around you. As the paraskier essentially floated down a mountain and screamed his head off, I was both excited to live vicariously through him and relieved I was not him. Honestly, I think this format is fine for the Vision Pro and might be tough to pull off in Apple Glasses, as they would have to give you a very wide field of view. But the demo was still impressive.

There was only one moment I laughed out loud during WWDC 25, other than seeing Craig Federighi's CGI-enhanced windblown hair after emerging from an F1 race car. And that was Apple showing two people wearing a visionOS headset together watching a movie on a couch. First, who is going to do that when you're in the same room? And, more important, who the heck can afford $7,000 worth of hardware for that sort of experience?

But there was a more compelling shared-experiences demo for visionOS 26. An Apple rep loaded up a 3D version of Neil Armstrong's space suit, and I could then zoom in on it and walk around it.
This could be a great learning tool, for example, for parents trying to explain concepts to kids. But I had to remember to take the Vision Pro's battery with me before I got up and walked around the space suit to inspect it, which puts a damper on the experience. This would be much more compelling with smart glasses.

Last but not least, I wanted to mention that I tried the new Persona in Vision Pro with visionOS 26. The virtual me definitely looks more realistic now, especially when you turn your head. Before, the side view was a real challenge: you'd turn your head, and it almost looked like you turned into a ghost with the missing detail. My hair and skin both looked more realistic, and Apple paid closer attention to little details like eyelashes. Frankly, I still don't really love how my Persona looks. I wish I could smooth out my skin a bit and maybe whiten my teeth slightly. But you can enhance your Persona by changing the portrait effect, as well as accessorize with glasses. I could see myself perhaps dialing into a video call in the future if Apple could pull this off with smart glasses.

At the risk of beating a dead horse, I like a lot of the features in visionOS 26, but until I see a lighter, more affordable spatial computer from Apple, I think the Vision Pro will continue to be a tough sell. I believe Apple's ultimate goal is to create a pair of smart glasses that can deliver all of the above experiences and then some. Earlier this year, Bloomberg's Mark Gurman reported that Apple's Tim Cook was keenly focused on 'lightweight spectacles that a customer could wear all day,' offering AR elements that 'will overlay data and images onto real-world views.' In fact, Apple is reportedly 'hell-bent on creating an industry-leading product before Meta can.' For me, visionOS 26 provides a very good starting blueprint for what Apple Glasses could offer.


Tom's Guide
22-05-2025
- Business
- Tom's Guide
Apple's ‘AI push' could mean smart glasses arrive as soon as 2026
According to a new report from Bloomberg's Mark Gurman, Apple is seeking to release a set of smart glasses by the end of 2026 as part of a "push into AI-enhanced gadgets." The Apple Glasses, meant to take on the Meta Ray-Ban glasses and any upcoming products built on the Android XR platform that Google showed off this week, have entered ramped-up development to meet the target date. Prototypes should be produced by the end of this year, the Bloomberg report claims.

With OpenAI buying former Apple Chief Design Officer Jony Ive's company (which he started with OpenAI's Sam Altman) to build the 'iPhone of AI,' it seems the Cupertino giant is feeling the pressure. In April, it was reported that Apple CEO Tim Cook is "obsessed" with launching a pair of Apple Glasses.

Like other smart glasses, the Apple version is supposed to feature cameras, microphones and speakers. Coupled with Apple Intelligence and Siri, they could potentially analyze the external world and take on tasks like music playback, live translations, and phone calls. Gurman claims that Apple wants its glasses to use augmented reality (AR), relying on displays and other tech to show digital content on the lens, but that feature might not come any time soon.

Allegedly, Apple's Vision Products Group, makers of the Vision Pro headset, will develop this product. And while they are working on a new version of Apple's spatial computing headset, apparently, the Glasses are getting the bulk of the focus. The group is also supposed to be helping design a chip meant for smart glasses, which might launch next year.

Much of Apple's future plans depend on the company bolstering Apple Intelligence, something the company has struggled with since its take on AI was announced in June of 2024. A number of reports have come out in the last few months claiming that Apple couldn't get its priorities in order, especially when it comes to Siri.
Recently, Apple has started to open up its walled garden by allowing third-party LLMs to help bolster Apple Intelligence alongside ChatGPT, which is already integrated with Siri. For the rumored glasses to succeed in the way Apple wants, the company will have to offer a more robust version of its AI tools, including a smarter version of its personal assistant. That could happen with the iOS 19 update likely to arrive later this year. Not only will iOS 19 offer a redesign of Apple's iPhone software, it's supposed to give Apple Intelligence a boost.

There have been rumors that Apple was working on an Apple Watch or Apple Watch Ultra that would feature a camera; however, Gurman claims those plans have been quashed. A rumored AirPods update that would feature built-in cameras is still in the works. Reportedly, those earbuds would launch next year as well. Next year could be big for Apple, with new products and overhauled classics: Apple's first foldable phone should also launch late next year.


Tom's Guide
08-05-2025
- Business
- Tom's Guide
Forget the Vision Pro: Apple is reportedly going all in on smart glasses with custom chips
Amid reports that Apple CEO Tim Cook is "obsessed" with developing a pair of Apple Glasses, the rumor mill has been rife with claims that Apple's version of smart glasses would feature Apple Intelligence. More grist for that rumor emerged today (May 8) when Bloomberg's Mark Gurman reported that the Cupertino company is develop new chips to power future smart glasses, Macs and AI servers. What's more, the chip meant for Apple Glasses is apparently well on the way. Since Apple ditched Intel for its own homegrown processors, more and more internal components have been developed in house. Most recently, Apple replaced Broadcom modems with its own C1 modems in the iPhone 16e. The alleged glasses SoC is based on the S chip used in the Apple Watch, which requires less energy than the A and M processors used in iPhones and Macs, respectively. Gurman claims this new chip has been customized to be more power efficient while also controlling multiple cameras that the glasses would feature. While AR has made strides especially with options like the Xreal One and Spacetop glasses, Apple is apparently pursuing the Meta strategy of non-AR glasses. The Apple version would use cameras to scan the world and AI to assist you. That said, Gurman asserts that Apple hasn't settled on an approach yet. For those interested in a set of Apple Glasses, you'll be waiting awhile. The chips might not go into production until late 2026 or 2027, meaning the earliest we could see them in a pair of glasses is maybe fall 2027. According to Gurman, the other chips in development will power future Macs as well as AI servers that will be used to run Apple Intelligence. Get instant access to breaking news, the hottest reviews, great deals and helpful tips. These new chips include the M6 and M7 as wells an advanced Mac chip code-named Sutra. The AI chips would be exclusively made for AI servers to improve Apple Intelligence. Currently, the company reportedly uses M2 Ultra chips to power its AI servers.