Latest news with #SnapOS


Tom's Guide
17-06-2025
- Business
- Tom's Guide
Exclusive: I asked Snap's hardware chief about the company's next-gen Specs — here's what I found out
So as we found out last week, Snap is finally launching Specs to the public in 2026 — after an exhaustive developer program spanning the four years since its first AR Spectacles. It's been a helluva journey, and with Meta's Project Orion on the horizon and Apple being 'hellbent' on delivering smart glasses, this is becoming a very competitive space. So far, Snap CEO Evan Spiegel has said they will be smaller, lighter, and fully standalone with 'no puck' required. But there's a lot we don't know yet. What's the story that led to this point, where Snap is ready for a full public release? What tech can we expect inside these future contenders for best smart glasses? What's the price? And is society ready for true AR glasses like this? I had a chance to sit down with Snap's VP of Hardware, Scott Myers, and put these questions to him.

I have to be a little careful with what I say, because we haven't fully announced everything yet. But what he said was that it's substantially smaller. We have been in this area for 11 years, and we have been building these things for a very long time. It's public information that we have made some acquisitions; our entire optical engine is our own custom thing. We build it ourselves. We design it ourselves, which gives us a pretty unique position where we know exactly how these things are going. We have roadmaps for development, and I really like where we're going. And because we're not just a bunch of companies strung together but one group working toward the same goal, I can have the team designing, say, the waveguide talking to the same team that's working on the rendering engine and Snap OS. That synthesis is why we're still confident about where we're at.

We've been getting feedback in a lot of different forms about the hardware. We've gotten some phenomenal feedback from the community, but also feedback like 'we wish the field of view was bigger' or 'we wish the device was lighter.' There's a joke with the team that this is what I want [points towards his reading glasses]. It's not a question of what I want; it's how we get there. It's the trade-offs we make to get to the dream of true augmented reality: something people can wear and walk around in. The social acceptability element is so critically important.

Context: Since 2020, Snap has been on a bit of a shopping spree — acquiring AI Factory in 2020 (a computer vision AI company), WaveOptics in 2021 (the company designing its waveguide displays), Fit Analytics in 2022 (shopping and ecommerce), a 3D-scanning startup called Th3rd in 2023, and GrAI Matter Labs in 2024 (an edge AI chip company), alongside many more.

Well, I think this is one of the reasons we're standalone. I don't want to see people wearing a wire coming out of the back of their head. It makes people look like U.S. government officials, and that's not how I want to see the world. The form factor obviously matters, but the fit and finish of these things also matters when you make that jump. They need to be robust, but all of those requirements pull the product in different directions. I think one of our strengths is the balance of all of these things. You can make a giant field of view, as some companies have, but you also need a really high pixel count, or pixels per degree, because that's important for text legibility.
You need the ability to make it work indoors and outdoors. Why? Because I don't want to spend all my time inside. As I'm moving through my day, some of that's inside and some of it's outside, and it needs to work in both. So you can't just have a static tint like sunglasses, nor can you just make them clear, because neither works in both environments. Because we've been building these things for so long, we've learned how to solve those problems — what works and what doesn't — but it's all in that trade-off and exactly how you balance all those things. Obviously, I'd want the battery to last for days, but then you end up with this giant battery pack, which is directionally incorrect, too.

This has been a multi-year, multi-generation arc. We launched a pair of 26-degree field of view augmented reality glasses to developers in 2021. We learned a ton from that, and it drove the way our development tool, Lens Studio, is constructed. So we've been just iterating and iterating and iterating. And what we learned is that the breadth and depth of feedback matters; it's not like you release the product once, get a long written document, and that's it. It's an active conversation with the community. We even iterate in public, in collaboration with our Spectacles subreddit. We want to learn. And what we find is that as the community grows, as people get better and better at building Lenses, they start answering each other's questions. It's a back and forth; I personally know developers. That's what a successful community looks like, and we're building this together. And that's very, very intentional. It's in the way our pricing is structured. It's in the way our community is growing. We don't just sell it to anybody, because we want the people who are really going to move the platform forward. It's all very, very intentional, and we're very happy with the results. As the product has been out a little bit longer, the Lenses have been getting more and more engaging, and we're learning together how different UI elements work. I think we are really here to build this with the community, because it's an entirely new paradigm. There are a lot of things that will take time for people to understand and figure out. It's not just going to be, 'oh, here you go, developers — come build for this!' That's not going to work, in my opinion. It is very, very much a community-led discussion. And I couldn't be happier with that.

I think what Evan shared was more than the Ray-Ban Metas, and less than an Apple Vision Pro. I recognize that's a huge range! Obviously we want to make it as low-cost as possible, but it's also, as you pointed out, pretty advanced technology. So there's a balance there. One of the things that may not be super intuitive is that there's a lot of technology here that there isn't a ton of world capacity for. We have to go off and work with our suppliers to create these new technologies, and then we have to build the capacity to actually produce them. It's a fun challenge, but there's certainly a ton of work to do. This isn't a Snap-specific problem; it's industry-wide.

This is an area where Snap is in a very good spot. Trust matters, privacy matters. And the way we're constructing all of this is in a privacy-centric way. I want to personalize it. But this is the most personal possible device. It is literally seeing exactly what I'm seeing.
And so, of course, we're going to bring in all the personalization that AI kind of already has, like memory. That's an element here, but I'm actually more worried about how we do it in a privacy-centric way. Back to your previous question: I'm very happy with our direction there. We've shared a little bit about it, but having built these for a while and having lived with them, it's one thing to ask, 'what is the use case?', which I personally don't think is that valuable a question. It's more about that responsiveness: when I want it, I can go as deep as I want on any topic with it, but do so in a way that maintains my privacy for the times when I don't really need it. I think that's maybe an undervalued, underappreciated problem. You don't want to just share camera images of your entire day!

I like that you said battery life, and not just battery capacity. It's all about using it smartly. I used to work on smartphones for a very long time, and yes, battery capacity has grown pretty consistently, to be honest — X percent per year. But really, software has gotten much better in how that capacity is used. This is one of the reasons we built Snap OS: so that we have complete control of exactly how every little bit of energy is consumed across the entire device. It also goes to the way we design the displays: how we make them super duper efficient, how we do the processing, and how we distribute the heat. All of these things have to be balanced, and that's why it's so important to build these, again, where engineers can talk to engineers and really look at everything as precisely as we can.

The other thing I would say is that, in the limit, if you had a full display covering everything in your world all the time, that would probably be visually overwhelming. I don't personally want a world where I'm walking around and everything's an ad all the time. That would be terrible. So I think it'll be about what is shown and when, how it's used, and then just technology generally progressing. If you look at the initial talk times of very early phones, we're not that far off with our developer models. But I think we have some good strategies to increase battery life now, and it'll just get better and better over time.


GSM Arena
11-06-2025
- Business
- GSM Arena
Snap will launch its AR glasses called Specs next year, and these will be commercially available
Snap Inc. unveiled its fifth-generation AR glasses, called Spectacles 5, last September, but they were only available to developers who signed up for the $99/month developer program. The good news is that this won't be the case with Snap's next AR glasses, which are set to launch in 2026. This revelation came from Snap during the ongoing Augmented World Expo (AWE) 2025 in Long Beach, California, in the US, where the company also announced that its next AR glasses, which will be available to the public, will be called "Specs."

'We believe the time is right for a revolution in computing that naturally integrates our digital experiences with the physical world, and we can't wait to publicly launch our new Specs next year,' said Evan Spiegel, co-founder and CEO of Snap Inc. Spiegel also claimed that Specs, which will be Snap's first commercially available AR glasses, "are the most advanced personal computer in the world." While Snap didn't delve into the details of the Specs, it said these AR glasses feature "see-through Lenses that enhance the physical world with digital experience." Furthermore, the company revealed that it has spent more than $3 billion on AR glasses over the past 11 years.

In addition to revealing the launch timeframe of Specs, Snap announced updates to Snap OS, including deep integrations with OpenAI and Gemini on Google Cloud. Moreover, the company launched new tools for developers building location-based experiences, including the Fleet Management app, Guided Mode, and Guided Navigation. That's not all: Snap also announced that WebXR support in the browser is coming soon, and that it is partnering with Niantic Spatial to bring its Visual Positioning System to Lens Studio and Specs to build an AI-powered map of the world.
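For context, WebXR is the W3C standard that lets a web page request an immersive session directly from the browser. Snap hasn't detailed how the Specs browser will expose it, so the following is only a minimal sketch of the standard entry point such support would likely use; the API calls are standard WebXR Device API, and anything Specs-specific remains an assumption.

```typescript
// Minimal WebXR bootstrap for an AR session (standard WebXR Device API).
// How the Specs browser will surface this is not yet public; this sketch
// only shows the generic pattern. Proper typings come from @types/webxr.
async function startARSession(): Promise<void> {
  const xr = (navigator as any).xr; // cast in case WebXR typings aren't installed
  if (!xr) {
    console.log("WebXR is not available in this browser.");
    return;
  }
  // Ask whether immersive AR (see-through / passthrough) sessions are supported.
  const supported: boolean = await xr.isSessionSupported("immersive-ar");
  if (!supported) {
    console.log("immersive-ar sessions are not supported here.");
    return;
  }
  const session = await xr.requestSession("immersive-ar", {
    requiredFeatures: ["local"],    // a local reference space for anchoring content
    optionalFeatures: ["hit-test"], // surface detection, where the runtime offers it
  });
  session.addEventListener("end", () => console.log("AR session ended."));
  // From here, a real app would attach a WebGL layer and start the frame loop.
}
```

Whatever shape Snap's implementation takes, supporting this standard would let existing browser-based AR content run on Specs without a Lens Studio port.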


India Today
11-06-2025
- Business
- India Today
Snap to launch smaller and lighter Specs in 2026, will compete with Meta Ray-Ban
Snap, the parent company of Snapchat, has revealed its plans to launch the sixth generation of its augmented reality (AR) glasses in 2026, as competition intensifies in the rapidly evolving smart glasses sector. In a departure from its previous branding, Snap announced that its forthcoming AR glasses will be called Specs, moving away from the familiar Spectacles name used for earlier iterations. According to the company, the Specs will utilise advanced AR technology, enabling wearers to interact with digital content seamlessly overlaid onto their real-world surroundings.

While Snap has yet to confirm either a price point or an exact release date for the new device, the company has stated that Specs will be both smaller and lighter than earlier models. The previous generation of Spectacles, launched in September 2024, was made available exclusively to developers through a leasing model that required a monthly payment of $99 (around Rs 8,500) over the course of a year. 'We couldn't be more excited about the extraordinary progress in artificial intelligence and augmented reality that is enabling new, human-centered computing experiences,' Snap CEO Evan Spiegel said in a statement announcing the Specs.

The upcoming Specs are aimed squarely at consumers and will run on Snap's own Snap OS operating system. In a significant development, Snap has opened up the platform to allow developers to integrate Google's Gemini AI models into applications designed for the smart glasses. This addition offers coders greater flexibility when creating software for the device, expanding beyond the previously limited access to OpenAI's GPT language models. The integration of multiple AI platforms is expected to fuel innovation within Snap's AR ecosystem, offering a wider range of capabilities for developers and users alike.

Snap first entered the wearables market in 2016 with the launch of its original Spectacles, priced at $130 (around Rs 11,120). Those early models were relatively basic, allowing users to record short videos for upload to Snapchat. In 2021, Snap advanced its product line by introducing AR displays to its Spectacles, offering users the ability to see virtual objects projected onto their real-world view. These AR-enabled glasses represented a major step forward for the company as it sought to establish itself in the emerging market for head-mounted displays.

Since Snap's initial foray, competition within the AR and VR space has grown considerably. Apple entered the market with its high-end Vision Pro headset, which went on sale in February 2024 for $3,500 (around Rs 3,00,000). Meanwhile, Meta has broadened its range of hardware offerings, which now include the popular Quest VR headsets, the Ray-Ban Meta smart glasses, and the experimental Project Orion AR glasses, which were unveiled last year.

With a growing number of major tech players investing heavily in AR and mixed reality technologies, Snap's upcoming Specs will enter an increasingly crowded and highly competitive market. However, the company hopes that its unique combination of lightweight design, expanded AI integration, and deep roots in augmented reality development will allow it to carve out a meaningful share of the smart glasses market as consumers continue to embrace wearable technology.

Business Standard
11-06-2025
- Business Standard
Snap introduces next-gen 'Specs' AR Glasses, Snap OS platform: What's new
Snap, the parent company of Snapchat, has announced its next-generation consumer-focused augmented reality (AR) glasses at the Augmented World Expo (AWE) 2025. Named Specs, the new wearable device is scheduled for launch in 2026 and marks Snap's most ambitious attempt yet at integrating AR into everyday life. Alongside the hardware reveal, Snap also introduced major upgrades to its AR operating system, Snap OS. The new Specs aim to understand users' surroundings, enable shared AR experiences such as multiplayer games, and support tasks like browsing, streaming, and productivity — all through a sleek, self-contained design.

Snap Specs: Details

Unlike previous Snap Spectacles — which were available only to developers — the upcoming Specs will be publicly released and are designed as an 'ultra-powerful wearable computer' with see-through lenses that overlay digital content onto the real world. Snap described the new Specs as a device built to 'seamlessly integrate digital content into everyday life,' positioning the glasses as part of a broader shift in computing where physical and digital environments converge. The company said it believes 'the time is right for a revolution in computing.'

Snap OS: What's new

Snap is also rolling out key updates to Snap OS, the platform powering its AR glasses. These upgrades are designed to support multimodal AI, spatial awareness, and real-time content generation. Highlights include:
- Deep integration with OpenAI and Gemini (Google Cloud): Developers can now create multimodal, AI-powered Lenses and publish them for the Specs user base.
- Depth Module API: Allows Snap OS to anchor AR visuals in 3D space using translated 2D data from language models, enhancing spatial intelligence.
- Automated Speech Recognition API: Supports real-time transcription in over 40 languages, including non-native accents, with high accuracy (see the hypothetical sketch after this section).
- Snap3D API: Enables on-the-fly generation of 3D objects directly within Lenses.

Tools for developers

Snap is also introducing fleet management tools and features focused on location-based and guided experiences, designed for venues such as museums, parks, and public exhibitions:
- Fleet Management App: Allows institutions or developers to monitor and manage multiple Specs units remotely.
- Guided Mode: Lets developers pre-configure Specs to launch directly into a multiplayer or single-player Lens for instant interaction.
- Guided Navigation: Designed for AR-based tours, this feature provides turn-by-turn guidance through points of interest like landmarks or exhibits.
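Snap hasn't published the exact developer surface for these Specs-era APIs here, but to make the Automated Speech Recognition item concrete, below is a hypothetical Lens Studio-style TypeScript sketch. The module path ("LensStudio:AsrModule"), the startTranscribing() signature, and the callback shape are illustrative assumptions, not Snap's documented API; only the general component structure follows Lens Studio's TypeScript conventions.

```typescript
// Hypothetical sketch: live captions in a Lens via a speech-recognition module.
// ASSUMPTIONS: the module path, startTranscribing() options, and callback shape
// are invented for illustration and are not Snap's documented API.
@component
export class LiveCaptions extends BaseScriptComponent {
  // Text component in the scene that will display the running transcript.
  @input captionText: Text;

  onAwake(): void {
    const asr = require("LensStudio:AsrModule"); // assumed module name
    asr.startTranscribing({
      language: "en-US", // the article cites support for 40+ languages
      onUpdate: (transcript: string, isFinal: boolean) => {
        this.captionText.text = transcript; // update captions in real time
        if (isFinal) {
          print(`Final transcript: ${transcript}`);
        }
      },
    });
  }
}
```

In a real project, a transcript stream like this could just as plausibly drive Guided Navigation prompts or feed a multimodal Lens, but those wirings remain speculative until Snap publishes the APIs.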


Hindustan Times
11-06-2025
- Business
- Hindustan Times
Snap to launch easy-to-wear "Specs" AR glasses for everyday users by 2026
After years of development behind the scenes, Snap is finally ready to bring its AR glasses to everyday users. Speaking at the Augmented World Expo in Long Beach, CEO Evan Spiegel announced that the company's new Specs glasses will ship to consumers in 2026. Unlike earlier Spectacles, designed mainly for developers, these new Specs are built for real-world use. You can wear them on your daily commute, at home, or while doing everyday tasks like cooking or fixing things around the house.

Snap has invested more than $3 billion (approximately ₹25,000 crore) in AR over the past 11 years. That long-term focus has led to a design that's lighter, easier to wear, and more transparent than anything Snap has produced before. These Specs are meant to be a complete AR device: they don't need wires, phones, or external processors. They operate entirely on their own, which puts them a step ahead of many existing headsets.

Snap OS is at the heart of this push. The latest version comes with AI capabilities that let users see real-time translations, follow recipe steps with visual cues, and use contextual overlays. Think of being able to change a tire with on-screen instructions, or line up a pool shot with visual guidance.

Snap is also bringing developers into the fold. Its updated spatial tools and voice input APIs will let creators build AR apps that respond to both location and natural speech. This kind of interactivity aims to make AR more useful and intuitive.

What gives Snap a strong position is the ecosystem it has already built. With over 400,000 AR developers, four million Lenses, and roughly 8 billion AR interactions daily, Snap has the community and content to support its hardware. Its partnership with Niantic Spatial is expected to tie AR features to real places like museums, public parks, or city walks.

The glasses are equipped with waveguide lenses, stereo speakers, and a six-microphone array to deliver both spatial visuals and audio. With a 46-degree diagonal field of view, the Specs aim to offer an immersive visual experience that still feels comfortable and natural.

Snap has not yet revealed pricing or final hardware specifications. However, Spiegel has said the new Specs will be smaller, lighter, and more advanced than earlier versions. That sounds promising, but the real test will come from how they hold up in everyday use. Battery life, comfort, and ease of use will ultimately determine their success. Will these glasses genuinely make day-to-day tasks easier? Can they offer value without forcing users to adjust their routines? These are the questions Snap still needs to answer.