Latest news with #SnapchatLenses


Axios
10-06-2025
- Business
- Axios
Snap to launch AI-powered AR glasses in 2026
Snap on Tuesday announced that it plans to launch a new version of its augmented reality glasses, called Specs, in 2026. The new wearable computer brings the power of AI assistant tools to the user's 3D AR experience.

Why it matters: Snapchat still makes most of its money from advertising on its mobile app, but CEO Evan Spiegel believes the future of connection, and of its business, will live beyond the smartphone and in the real world. "The tiny smartphone limited our imagination. It forced us to look down at a screen instead of up at the world. It required our fingers to hold, swipe and tap when we really wanted to live hands-free," he said Tuesday at the Augmented World Expo 2025 in Long Beach, California. "It kept content confined to a small 2D rectangle, when we really wanted to experience life in all of its three-dimensional splendor."

Catch up quick: The new AR glasses are leaps and bounds more sophisticated than Snapchat's first iteration of consumer wearables, called Spectacles. Spectacles, first launched in 2016, let users take pictures and video with their glasses and add AR overlays onto that content. But those features weren't powered by AI tools, which can now deliver more complex and engaging experiences. Snap released its fifth generation of Spectacles for developers in 2024. It has since leveraged the creativity and brainpower of thousands of developers to help create experiences that will eventually power the version of Specs launching next year.

How it works: Specs will leverage AI to enhance the consumer experience far beyond creating and capturing content. The new glasses let users share games and experiences with friends, stream content and set up virtual workstations. They will also use AI recommendations and tools to help users with tasks such as figuring out how to change a tire or position a pool cue.

Zoom out: Snap has long prioritized its relationship with the developer community as a way of expanding its creative tools, especially around AR. Using Snap's existing AR tools, over 400,000 developers have built more than 4 million Snapchat Lenses, or AR overlays, that can be positioned on top of photos or videos taken by users. Snapchat users engage with AR Lenses in the Snapchat camera 8 billion times per day, according to the company. With the launch of Specs, Snap said it will also roll out new tools specifically for developers building location-based experiences.

The big picture: Snap's smartphone experience has paved the way for its 3D spatial computing efforts. Its Snap Map, which now has more than 400 million monthly active users, will eventually become a critical part of its wearables strategy. In its announcement Tuesday, Snap said it's partnering with Niantic Spatial to bring that company's visual positioning system to Lens Studio and Specs, building a shared, AI-powered map of the world.

The bottom line: Spiegel believes the hardware space has been slow to keep pace with the advancements of sophisticated AI software, and he wants Snap to be the company that bridges that gap.

Yahoo
05-06-2025
- Yahoo
Snapchat Launches Standalone Lens Studio Mobile App
This story was originally published on Social Media Today.

As it continues to advance its AR projects, including AR-enabled Spectacles, Snapchat has launched a new standalone Lens Studio mobile app that aims to make it easier for people to build their own AR experiences.

Snap's Lens Studio app provides a simplified effects-creation flow, including templates and tools that will enable more people, even those without developer experience, to build custom in-app experiences. As explained by Snap: 'We're excited to introduce the Lens Studio iOS app and web tool. These are experimental new tools that make it easier than ever to create, publish, and play with Snapchat Lenses made by you. Now, you can generate your own AI effects, add your dancing Bitmoji to the fun, and express yourself with Lenses that reflect your mood or an inside joke–whether you're on the go or near your computer.'

So it's less about professional AR creation and more about democratizing the AR creation experience. Indeed, the App Store description outlines exactly that: 'Make your own Snapchat Lenses from your phone and in minutes. No code. No installs. Create AI effects, add your Bitmoji, design natural makeup looks, and more. Create something real and immediately share with your friends.'

It's another step towards enabling broader creative expression with advanced digital tools, a shift that has also been accelerated by generative AI apps and functions that can generate code from plain-language text prompts. Meta's even using conversational AI prompts to generate VR worlds, and with the complexity gap closing, you can see how this will help usher in an era of expanded digital creativity, in all new ways.

Snap has seen big success with user-generated Lenses, with these outside contributions often leading to new engagement trends in the app. Rather than limiting its creative output to its own internal team, enabling anyone to create their own AR experience has expanded its opportunity, and more than 400,000 professional developers and teams now use Lens Studio for their creative visions. Now Snap is setting its sights on non-technical users as well, opening up even more opportunity for original AR experiences. It could be a fun add-on that helps drive more engagement in the main Snapchat app.

You can download the Lens Studio mobile app from the App Store.

Engadget
04-06-2025
- Business
- Engadget
Snapchat now has a standalone app for making gen AI augmented reality effects
Snapchat has been experimenting with generative AI-powered augmented reality lenses in its app for the last couple of years. Now, the company is letting users make their own with a new standalone app for creating AR effects. Snap is introducing a new version of its Lens Studio software that allows anyone to create AR lenses through text prompts and other simple editing tools, and publish them directly to Snapchat.

Up to now, Lens Studio has only been available as a desktop app meant for developers and AR professionals. And while the new iOS app and web version aren't nearly as powerful, they offer a wide range of face-altering and body-morphing effects thanks to generative AI. "These are experimental new tools that make it easier than ever to create, publish, and play with Snapchat Lenses made by you," the company explains in a blog post. "Now, you can generate your own AI effects, add your dancing Bitmoji to the fun, and express yourself with Lenses that reflect your mood or an inside joke–whether you're on the go or near your computer."

Snap gave me an early look at the Lens Studio iOS app, and I was pleasantly surprised by how much flexibility it offered. There are AI-powered tools for transforming your face, body and background via detailed text prompts (the app also offers suggestions of the kinds of prompts that work well, like "detailed zombie head with big eyes and nose, lots of details"). There's a bit of a learning curve to figuring out what works well for each type of effect, and some of the generative AI prompts can take up to 20 minutes to render. But the app also offers dozens of templates that you can use as a starting point and remix with your own ideas. You can also make simpler face-altering filters that don't rely as heavily on AI but take advantage of popular Snapchat effects like face cutouts or Bitmoji animations. (A few of my own creations used AI to generate a background that I then overlaid other effects onto.)

Snap already has hundreds of thousands of lens creators, some of whom have been making effects for the app for years. But I can easily see this new, simpler version of Lens Studio opening the door for many more. There could also be some upside for creators hoping to take advantage of Snapchat's monetization programs: the company confirmed that users who publish lenses from the new app will be eligible to participate in its Lens Creator Rewards program, which pays creators who make popular AR effects. A more accessible version of Lens Studio could also help Snap compete with Meta for AR talent. (Meta shut down Spark AR, its platform that allowed creators to make AR effects for Instagram, last year.)

In addition to Snapchat's in-app effects, the company is now on its second generation of standalone AR glasses. More recently, Snap has focused on big-name developers to make glasses-ready effects, but the company has previously leaned on Lens Creators to come up with interesting use cases for AR glasses. Those types of integrations will likely require much more than what's currently available in the new pared-down version of Lens Studio, but making AR creation more accessible (with the help of AI) raises some interesting possibilities for what might one day be possible for the company.

Jim Lanzone, the CEO of Engadget's parent company Yahoo, joined the board of directors at Snap on September 12, 2024. No one outside of Engadget's editorial team has any say in our coverage of the company.