
The Galaxy Watch 8 may ruin my favorite thing about Samsung smartwatches
Leaks don't usually take the wind out of my sails, but the latest details about Samsung's upcoming Galaxy Watch lineup came close. According to renders posted by trusted leaker Evan Blass, the Galaxy Watch 8, the Galaxy Watch 8 Classic, and the Galaxy Watch Ultra 2025 will each feature the same squircle design that debuted with the original Galaxy Watch Ultra, and, to put it mildly, that's a massive disappointment.
I wasn't a fan of the Ultra's shape at launch (or months later on my wrist), but I held out hope that it would stay confined to just one SKU for a certain type of buyer. To my dismay, it looks like Samsung is going all-in on the squircle.
Do you want a fully squircle Samsung Galaxy Watch lineup? (14 votes)
- Yes, I like the design. (14%)
- No, I hate the squircle. (71%)
- I'd like variety within the lineup. (14%)
Let's start with the obvious: the squircle just doesn't look good. At least, not to me. It's a strange hybrid of the Apple Watch's rounded rectangle and Samsung's usual circular style, a compromise that lands firmly in no-man's land. A circular smartwatch has always had a classic charm. It feels like an actual watch and doesn't look out of place with a dress shirt or a leather strap. Square-faced wearables like the Apple Watch lean heavily into their tech-first identity, offering more screen real estate and sharper information layouts, while still leaving a refined impression.
[Images: leaked renders of the Galaxy Watch 8, Galaxy Watch 8 Classic, and Galaxy Watch Ultra 2025]
Samsung's squircle tries to split the difference and ends up capturing neither. It doesn't have the elegance of a traditional watch shape or the unapologetic utility of a square one. At best, it feels indecisive, and at worst, it looks like a child's toy. Wearables aren't just gadgets anymore; they're accessories we wear 24/7, and looks matter. Plenty of people may like the squircle shape, but I'm not one of them.
As the saying goes, form follows function, but here, too, the squircle stumbles. The Galaxy Watch Ultra's shape adds noticeable bulk to the lineup, especially for smaller-wristed folks like me. I typically love a large watch, but only if its size is packaged with refinement and its increased real estate delivers added usefulness.
Simply put, if I'm going to wear something oversized, I want that bulk to buy me screen: bigger menus, richer complications, and more room to interact. The squircle is chunky without the payoff of a proportionally larger display; it sacrifices comfort and utility for a visual distinction I can't get behind.
What bums me out most is how unnecessary this feels. Samsung has historically done a good job of offering variety. When the revamped Galaxy Watch 4 series arrived to usher in a new era of Wear OS, shoppers got the choice of a sporty or classic build. The Galaxy Watch 5 series echoed that two-pronged approach with a Galaxy Watch 5 Pro and a regular Galaxy Watch 5 model. Even last year's lineup gave us the circular Galaxy Watch 7 alongside the squircle Galaxy Watch Ultra.
Now, it seems Samsung is consolidating its lineup into a single design identity. Yes, there are three models, but all feature the same cartoonish profile.
I get the desire to unify the product line: it streamlines manufacturing and creates a signature look. But making all your watches look the same doesn't mean they'll appeal to everyone. I love the domed shape of the Pixel Watch line, and I love the square-ish Apple Watch Series. I loved the historically elegant Galaxy Watch line and its elevated Classic models.
The rotating bezel might be making its way back to wrists with the Galaxy Watch 8 Classic, but it doesn't give the same analog watch vibes on a chubby rectangle. I realize the squircle may appeal to some people, but I just wish Samsung hadn't painted the entire lineup with the same broad stroke.
TL;DR: I hate the squircle. I want options. Samsung trimming every watch into the same questionable shape isn't forward-thinking; it's cutting corners. Literally.