
Don't own a smartwatch? Strava's app update just made phone tracking way better
Now, Strava has committed to making tracking with your phone a better, more streamlined experience.
The Record feature is the part of the Strava app you use to start and track a workout, whether you're running, cycling, or hiking. This update gives it a fresh look and makes it much easier to use when recording activities on your phone.
With the update, you can see your route on a clearer, more detailed map that updates as you move. You'll also get real-time stats and splits, so you can check your pace and distance without tapping through different screens.
For subscribers, it will also be simpler to chase segments and track laps — perfect if you're motivated by beating your personal bests.
If you're on Android, you can try the new Record update today, which includes clearer maps, real-time stats, and live splits.
For iOS users, the update will arrive in the next few weeks. Later this summer, both Android and iOS users will get an additional feature that lets you follow your activity on a live map, with stats updating in real time as you move.

Related Articles


Tom's Guide
Amazon buys wearable AI startup Bee, maker of a wrist device that hears and transcribes every word you speak
As Amazon spends this year attempting to bolster its AI bona fides, the internet retail giant is reportedly acquiring Bee, a startup that makes an AI wearable for your wrist. The announcement was made by Bee CEO Maria de Lourdes Zollo on LinkedIn, who wrote that she "couldn't think of better partners to help us bring truly personal, agentic AI to even more customers." Amazon confirmed the acquisition in an email to Tom's Guide, though it noted that the deal "isn't closed yet" and the two are still separate companies.

What is Bee?

Bee is a young startup that makes a $49.99 Fitbit-esque wrist device, dubbed the Bio Pioneer edition, which is still in preorder and slated to launch in September. The device is meant to listen to your conversations throughout your day and then use AI to transcribe everything said by and around you. From there, the AI agent generates personalized summaries of your day, plus reminders and suggestions, in the Bee app. You can also let Bee access your calendar, contacts, emails, location, photos and reminders to inform the AI's insights and surface information.

With Amazon on board, we were told that Bee is working on a number of new features to "provide even greater control over" its devices. "We're excited to partner with the Bee team to continue inventing in this space," Amazon spokesperson Alexandra Miller said in an email.

Is it any good?

Some have already tried the device. The Verge's Victoria Song went hands-on with the Bee and described it as a "glimmer of a good idea." She was skeptical that we need to record our conversations all the time and found that the device confused real-life conversations with media playing in the background. There are also privacy concerns around Bee, though Zollo has previously said that Bee doesn't store any recordings, and before preorders opened the company emphasized that it wanted to make money through device sales and subscriptions. Still, compared to the Humane AI Pin, Rabbit R1 and Friend pendant, Song did say the Bee is the "most successful AI wearable" she's tried, with the caveat that it's a very low bar.

We reached out to Bee for comment on how it plans to work with Amazon and what it hopes to get out of the partnership. The company had not responded as of publication, but we will update this story if it does.


Android Authority
YouTube's latest experiment makes comments feel more like Reddit, if that sounds like an improvement
TL;DR
- YouTube is rolling out Reddit-style comment threading to Premium subscribers on Android and iOS.
- The experiment has been updated so that the main comment is now visually threaded to subsequent replies.
- Threaded comments will remain available until August 14.

YouTube's comments section is going to look a little different for Premium subscribers. The company is rolling out an experiment inspired by Reddit.

Earlier this year, YouTube began testing a new threaded comment UI for Android and iOS. At the time, only a small group of users got the chance to use the threaded comment system. YouTube has now announced that it is expanding the experiment to Premium users.

When the test first rolled out, it was a bit half-baked compared to the way threading is implemented on Reddit. However, YouTube has since improved the feature. Previously, it only connected a user's profile picture to the 'X replies' button; now the main comment is threaded to subsequent replies, and replies to a reply are threaded as well. Overall, it's a change that some will appreciate and others less so.

As it's an experiment, it won't be around forever. The company notes that comment threading will stick around for Premium subscribers until August 14. Just like the initial test, this experiment will only appear on Android and iOS.


Tom's Guide
Sam Altman's trillion-dollar AI vision starts with 100 million GPUs. Here's what that means for the future of ChatGPT (and you)
OpenAI CEO Sam Altman has a bold vision for the future of AI, one rival tech giants will struggle to match: a future powered by 100 million GPUs.

That jaw-dropping number, casually mentioned on X just days after ChatGPT Agent launched and as we await GPT-5, is a glimpse into the scale of AI infrastructure that could transform everything from the speed of your chatbot to the stability of the global energy grid. Altman admitted the 100 million GPU goal might be a bit of a stretch (he punctuated the comment with "lol"), but make no mistake, OpenAI is already on track to surpass 1 million GPUs by the end of 2025. And the implications are enormous.

"we will cross well over 1 million GPUs brought online by the end of this year! very proud of the team but now they better get to work figuring out how to 100x that lol," Altman wrote on X on July 20, 2025.

What does 100 million GPUs even mean?

For those unfamiliar, I'll start by explaining the GPU, or graphics processing unit. This is a specialized chip originally designed to render images and video. But in the world of AI, GPUs have become the powerhouse behind large language models (LLMs) like ChatGPT.

Unlike CPUs (central processing units), which handle a small number of tasks at a time very efficiently, GPUs are built to perform thousands of simple calculations simultaneously. That parallel processing ability makes them perfect for training and running AI models, which rely on massive amounts of data and mathematical operations.

So, when OpenAI says it's using over a million GPUs, it's essentially saying it has a vast digital brain made up of high-performance processors, working together to generate text, analyze images, simulate voices and much more.

To put it into perspective, 1 million GPUs already require enough energy to power a small city. Scaling that to 100 million could demand more than 75 gigawatts of power, around three-quarters of the entire UK power grid. It would also cost an estimated $3 trillion in hardware alone, not counting maintenance, cooling and data center expansion.

This level of infrastructure would dwarf the current capacity of tech giants like Google, Amazon and Microsoft, and would likely reshape chip supply chains and energy markets in the process.
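For a sense of how those estimates scale, here is a rough back-of-envelope sketch in Python. The per-GPU wattage and unit cost are illustrative assumptions, not OpenAI figures; they are chosen simply so the totals line up with the estimates above.

```python
# Back-of-envelope math for the GPU scale discussed above.
# WATTS_PER_GPU and COST_PER_GPU_USD are illustrative assumptions,
# not OpenAI figures; they are picked so the totals roughly match
# the ~75 GW and ~$3 trillion estimates quoted in this article.

GPUS_2025 = 1_000_000        # GPUs OpenAI expects online by the end of 2025
GPUS_GOAL = 100_000_000      # Altman's aspirational "100x" target

WATTS_PER_GPU = 750          # assumed draw per GPU, including cooling overhead
COST_PER_GPU_USD = 30_000    # assumed all-in hardware cost per GPU

def power_gw(gpu_count: int) -> float:
    """Total power draw in gigawatts for a fleet of this size."""
    return gpu_count * WATTS_PER_GPU / 1e9

def hardware_cost_trillions(gpu_count: int) -> float:
    """Total hardware spend in trillions of US dollars."""
    return gpu_count * COST_PER_GPU_USD / 1e12

for label, count in (("1M GPUs", GPUS_2025), ("100M GPUs", GPUS_GOAL)):
    print(f"{label}: ~{power_gw(count):g} GW, ~${hardware_cost_trillions(count):g}T in hardware")

# Prints roughly 0.75 GW and $0.03T at today's scale versus 75 GW and $3T at the goal,
# which is why a 100x jump reshapes energy and chip markets, not just budgets.
```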
Why does it matter to you?

While a trillion-dollar silicon empire might sound like insider industry talk, it has very real consequences for consumers. OpenAI's aggressive scaling could unlock:
- Faster response times in ChatGPT and future assistants
- More powerful AI agents that can complete complex, multi-step tasks
- Smarter voice assistants with richer, real-time conversations
- The ability to run larger models with deeper reasoning, creativity and memory

In short, the more GPUs OpenAI adds, the more capable ChatGPT (and similar tools) can become. But there's a tradeoff: all this compute comes at a cost. Subscription prices could rise. Feature rollouts may stall if GPU supply can't keep pace. And environmental concerns around energy use and emissions will only grow louder.

The race for silicon dominance

Altman's tweets arrive amid growing competition between OpenAI and rivals like Google DeepMind, Meta and Anthropic. All are vying for dominance in AI model performance, and all rely heavily on access to high-performance GPUs, mostly from Nvidia. OpenAI is reportedly exploring alternatives, including Google's TPUs, Oracle's cloud and potentially even custom chips. More than speed, this growth is about independence, control and the ability to scale models that could one day rival human reasoning.

Looking ahead at what's next

Whether OpenAI actually hits 100 million GPUs or not, it's clear the AI arms race is accelerating. For everyday users, that means smarter AI tools are on the horizon, but so are bigger questions about power, privacy, cost and sustainability.

So the next time ChatGPT completes a task in seconds or holds a surprisingly humanlike conversation, remember: somewhere behind the scenes, thousands (maybe millions) of GPUs are firing up to make that possible, and Sam Altman is already thinking about multiplying that by 100.