Oakley Meta glasses review: A familiar formula with some upgrades
The Oakley Meta glasses are also the social media company's first collaboration with a non-Ray-Ban brand (though both brands share a parent company in EssilorLuxottica). And while Meta stays pretty close to the strategy it's used for the last four years, its latest frames offer some hints about its long-term ambitions in the space.
Meta has described its Oakley-branded frames as "performance glasses," which isn't entirely surprising given Oakley's longtime association with athletes. But there are only a few actual upgrades compared to the Ray-Ban lineup. The Oakley Meta glasses have a notably longer battery life, both for the glasses themselves and the charging case. They are also able to capture higher quality video than previous versions.
With a starting price of nearly $400, though, I'm not sure those upgrades are worth an extra $100 to $200.
There are some solid upgrades that will appeal to serious athletes and power users, but they don't quite justify the higher price.
Meta's debut pair of Oakley-branded glasses is based on the brand's HSTN (pronounced how-stuhn) frames, and there's really nothing subtle about the design. The first model in the line is a limited edition version with shiny gold lenses and bright white frames (which Meta inexplicably calls "warm grey").
Like previous Ray-Ban models, they don't look overtly techy, but I still wasn't a big fan of the design. The glasses felt just a little oversized for my face and something about the bright white paired with gold lenses reminded me a little too much of a bug. The color combo also accentuates just how thick the frames are, particularly around the awkwardly wide nosepiece. Karissa Bell for Engadget
I posted a selfie on my Instagram Story and polled my friends on what they thought. And while a few politely said they thought I was "pulling them off," the majority said they looked too big for my face. A few told me they looked straight-up weird, and one summed up my feelings pretty well with "something looks off about them." Style is subjective, of course. And depending on your face shape and tolerance for contrasting colors, I could see others enjoying the design. I'm looking forward to seeing the rest of the HSTN collection, which is coming later this summer, and will hopefully have some more flattering color variations.
Looks aside, the glasses function almost identically to the Ray-Ban glasses Meta introduced in 2023. There's a 12-megapixel POV camera over the left eye, and an indicator light over the right that lights up when you snap a photo or start recording a video via the capture button. There are open-ear speakers in the arms so you can listen to music and hear notifications. Much like the Ray-Ban glasses, the speakers here are pretty good at containing the sound so others can't hear when you're listening at lower volumes, but it's definitely noticeable at higher levels. You can control music playback and volume pretty easily, though, with a touchpad on the right side of the glasses.
The most important upgrade that comes with the Oakley glasses is the battery. Meta claims the glasses can last up to eight hours with "typical" (non-constant) use and up to 19 on standby. I was able to squeeze a little over five hours of continuous music playback out of the battery in one sitting, which is about an hour better than the Ray-Ban frames. The charging case can provide up to 48 hours of additional runtime, according to Meta. It's been well over a week and I haven't yet had to plug in the case.
The charging case is, however, noticeably bigger and heavier than the Ray-Ban case. It's not a dealbreaker, but the case is too big for any of my pockets and just barely fits into my small sling bag. My other gripe with the charging case is the same complaint I had about the Ray-Ban case: there's no way to see the charge level of the case itself. There's a small LED in the front that will change from green to yellow to red based on the battery level, but it's hardly a precise indicator.
The other major upgrade is the 12MP camera, which can now shoot in 3K compared to 1080p on previous models. The higher resolution video is, notably, not the default setting, but I appreciated having the option. I could see it being especially useful for creators looking to shoot POV footage, but I mostly use the glasses for still shots rather than video.
San Francisco is currently having a record-breaking cold summer, so most of my testing has been in fairly overcast conditions. It might be a product of the gray weather, but the photos I've shot with the glasses look a bit oversaturated for my taste. They looked fine on an Instagram Story, though. The camera has a fairly wide 100-degree field of view, so there's a bit of a learning curve in figuring out how best to frame your shots.
Another issue is that it's very easy for a hat or a piece of hair to make it into your photos without you realizing. My previous experience with the Ray-Ban Meta glasses meant I was careful to pull my hair back before snapping a picture, but I was bummed to realize after a long bike ride that the visor on my helmet was visible in the frame of every photo and video. It seems like Meta has a plan to address this: I noticed a setting called "media quality" that's meant to alert you when something is partially obstructing the camera. The feature is apparently still in testing, and it wasn't functional during my review. A Meta spokesperson confirmed it will be added in a future update. "Media Quality Check is a feature we're working to bring to our AI glasses collection in the future that will alert users when photos are blurry or if something like your hair or a hat blocks what you capture," Meta said.
The Meta AI app (formerly known as Meta View) can help fix other issues, though. It has a "smart crop" feature that can automatically straighten your pics to correct for any head tilt. It also has built-in AI-powered edits for photos and video, so you can restyle your clips directly in the app. And while the functionality isn't limited to clips shot with the glasses, the possibility of adding AI edits after the fact makes shooting otherwise mundane clips a bit more appealing. The ability to restyle video, however, is only "free for a limited time," according to the Meta AI app.
While the core features of Meta's smart glasses have largely stayed the same since it first introduced the Ray-Ban Stories in 2021, one of the more interesting changes is how Mark Zuckerberg and other execs have shifted from calling them "smart glasses" to "AI glasses." As the company has shifted away from the metaverse and made AI a central focus, it's not surprising those themes would play out in its wearables too.
And while none of the Meta AI features are unique to the Oakley frames, Meta has added a couple of abilities since my last review that are worth mentioning. The first is live translation. The feature, which you have to enable in the Meta AI app, allows the onboard assistant to translate speech as you hear it. If both sides of a conversation have a pair of Meta glasses, then you can carry on a full conversation even if you don't speak the same language. The feature currently supports Spanish, French, Italian and English.
I tried it out with my husband — a native Spanish speaker who was also wearing a pair of Meta glasses — and we were both fairly impressed. I would say something in English and Meta AI on his glasses would relay it to him in Spanish. He would then respond in Spanish and Meta AI would translate the words into English.
It's not the most natural way to speak because you have to pause and wait for a translation, but it was mostly effective. There were a few bugs, though. Because we were sitting close to each other, sometimes Meta AI would overhear the translated audio from the other person's glasses and translate it back, which made the whole thing feel like a bizarre game of telephone.
And over the course of a several-minute conversation, there were a handful of times when Meta AI wouldn't pick up on what was said at all, or would only begin translating halfway through a statement. We also ran into some trouble with Meta AI's translations of slang and regional variations of certain words. While it wasn't perfect, I could see the feature being useful while traveling, since it's much smoother than using Google Translate. There was also something endlessly amusing about hearing my husband's words relayed back to me in the voice of AI Judi Dench (Meta tapped a bunch of celebrities last year to help voice its assistant).

Stills from a video of a walk through a parking lot (left), and the same image after applying the "desert rave" effect in the Meta AI app.
The other major AI addition is something called "Live AI," which is essentially a real-time version of the glasses' multimodal powers. Once you start a Live AI session, Meta's assistant is able to "see" everything you're looking at and you can ask it questions without having to repeatedly say "hey Meta." For example, you can look at plants and ask it to identify them, or ask about landmarks or your surroundings.
The feature can feel a bit gimmicky and it doesn't always work the way you want it to. For example, Meta AI can identify landmarks but it can't help you find them. While on a bike ride, I asked if it could help me navigate somewhere based on the intersection I was at and Meta AI responded that it was unable to help with navigation. It also didn't correctly identify some (admittedly exotic) plants during a walk through San Francisco's botanical gardens. But it did helpfully let me know that I may want to keep my distance from a pack of geese on the path.
I'm still not entirely sure what problems these types of multimodal features are meant to solve, but they offer an interesting window into how Meta is positioning its smart glasses as an AI-first product. They also open up some intriguing possibilities for whenever we get a version of Meta glasses with an actual display, which the rumor mill suggests could come as soon as this year.
While I don't love the style of the Oakley Meta HSTN frames, Meta has shown it can consistently improve its glasses. The upgrades that come with the new Oakley frames aren't major leaps, but they deliver meaningful improvements to core features. Whether those upgrades justify the price, though, depends a lot on how you plan to use the glasses.
The special edition HSTN frames I tested are $499 and the other versions coming later this year will start at $399. Considering you can get several models of Meta's Ray-Ban glasses for just $299, I'm not sure the upgrades justify the added cost for most people. That's probably why Meta has positioned these as a "performance" model better suited to athletes and Oakley loyalists.
But the glasses do offer a clearer picture of where Meta is going with its smart glasses. We know the company is planning to add displays and, eventually, full augmented reality capabilities — both of which will benefit from better battery life and cameras. Both are also likely to cost a whole lot more than any of the frames we've seen so far. But, if you don't want to wait, the Oakley Meta glasses are the closest you can get to that right now.
