The Google store at The Domain is opening this week
Similar to an Apple store, the storefront will feature Google products and tech for Austinites to try out, such as Gemini Live, the company's AI assistant, on a Pixel phone, or its Nest cameras and Fitbit trackers. The store will also offer pickup for online orders and have experts on hand to help visitors get the most out of their devices, from troubleshooting issues to providing Pixel phone repairs.
The Domain storefront will open at 10 a.m. Friday, May 30, at 11701 Domain Blvd., Suite 164, in the same building as The Shade Store, EVEREVE and Tecovas, near Nordstrom.
The store will be open from 10 a.m. to 8 p.m. Monday through Saturday and from 11 a.m. to 7 p.m. on Sundays.
This article originally appeared on Austin American-Statesman: Google's Domain storefront to open this week

Related Articles


Business of Fashion
AI Shopping Is Here. Will Retailers Get Left Behind?
AI doesn't care about your beautiful website. Visit any fashion brand's homepage and you'll see all sorts of dynamic or interactive elements, from image carousels to dropdown menus, that are designed to catch shoppers' eyes and ease navigation. To the large language models that underlie ChatGPT and other generative AI, many of these features might as well not exist. They're often written in the programming language JavaScript, which, for the moment at least, most AI struggles to read.

This giant blind spot didn't matter when generative AI was mostly used to write emails and cheat on homework. But a growing number of startups and tech giants are deploying this technology to help users shop — or even make the purchase themselves.

'A lot of your site might actually be invisible to an LLM from the jump,' said A.J. Ghergich, global vice president of Botify, an AI optimisation company that helps brands from Christian Louboutin to Levi's make sure their products are visible to and shoppable by AI.

The vast majority of visitors to brands' websites are still human, but that's changing fast. US retailers saw a 1,200 percent jump in visits from generative AI sources between July 2024 and February 2025, according to Adobe Analytics. Salesforce predicts AI platforms and AI agents will drive $260 billion in global online sales this holiday season.

Those agents, launched by AI players such as OpenAI and Perplexity, are capable of performing tasks on their own, including navigating to a retailer's site, adding an item to cart and completing the checkout process on behalf of a shopper. Google's recently introduced agent will automatically buy a product when it drops to a price the user sets.

This form of shopping is very much in its infancy; the AI shopping agents available still tend to be clumsy. Long term, however, many technologists envision a future where much of the activity online is driven by AI, whether that's consumers discovering products or agents completing transactions. To prepare, businesses from retail behemoth Walmart to luxury fashion labels are reconsidering everything from how they design their websites to how they handle payments and advertise online as they try to catch the eye of AI and not just humans.

'It's in every single conversation I'm having right now,' said Caila Schwartz, director of consumer insights and strategy at Salesforce, which powers the e-commerce of a number of retailers, during a roundtable for press in June. 'It is what everyone wants to talk about, and everyone's trying to figure out and ask [about] and understand and build for.'

From SEO to GEO and AEO

As AI joins humans in shopping online, businesses are pivoting from SEO — search engine optimisation, or ensuring products show up at the top of a Google query — to generative engine optimisation (GEO) or answer engine optimisation (AEO), where catching the attention of an AI responding to a user's request is the goal.

That's easier said than done, particularly since it's not always clear even to the AI companies themselves how their tools rank products, as Perplexity's chief executive, Aravind Srinivas, admitted to Fortune last year. AI platforms ingest vast amounts of data from across the internet to produce their results. There are indications, though, of what attracts their notice: products with rich, well-structured content attached tend to have an advantage, as do those that are the frequent subject of conversation and reviews online.
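To make the 'invisible to an LLM' point above concrete, here is a minimal sketch in Python (the URL and product name are hypothetical, and this is an illustration, not how any particular AI crawler works). It fetches only the raw HTML a server returns, without executing JavaScript, so content that scripts inject at runtime is simply absent from what such a client sees.

import requests

# Hypothetical page and product copy, used purely for illustration.
URL = "https://example.com/collections/new-arrivals"
PRODUCT_COPY = "Leather Tote Bag"

# requests returns the server's raw HTML; no JavaScript is executed,
# which roughly approximates the view of a crawler that skips rendering.
raw_html = requests.get(URL, timeout=10).text

if PRODUCT_COPY in raw_html:
    print("Found in the initial HTML: visible without running any scripts.")
else:
    print("Not in the initial HTML: likely rendered client-side by JavaScript, "
          "so a non-rendering crawler never sees it.")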
'Brands might want to invest more in developing robust customer-review programmes and using influencer marketing — even at the micro-influencer level — to generate more content and discussion that will then be picked up by the LLMs,' said Sky Canaves, a principal analyst at Emarketer focusing on fashion, beauty and luxury.

Ghergich pointed out that brands should be diligent with their product feeds into programmes such as Google's Merchant Center, where retailers upload product data to ensure their items appear in Google's search and shopping results. These types of feeds are full of structured data, including product names and descriptions, meant to be picked up by machines so they can direct shoppers to the right items. One example from Google reads: Stride & Conquer: Original Google Men's Blue & Orange Power Shoes (Size 8). Ghergich said AI will often read this data before other sources such as the HTML on a brand's website. These feeds can also be vital for making sure the AI is pulling pricing data that's up to date, or as close as possible (a minimal sketch of such a feed entry appears after this article).

As more consumers turn to AI and agents, however, it could change the very nature of online marketing, a scenario that would shake even Google's advertising empire. Tactics that work on humans, like promoted posts with flashy visuals, could be ineffective for catching AI's notice. It would force a redistribution of how retailers spend their ad budgets. Emarketer forecasts that spending on traditional search ads in the US will see slower growth in the years ahead, while a larger share of ad budgets will go towards AI search. OpenAI, whose CEO, Sam Altman, has voiced his distaste for ads in the past, has also acknowledged exploring ads on its platform as it looks for new revenue streams.

'The big challenge for brands with advertising is then how to show up in front of consumers when traditional ad formats are being circumvented by AI agents, when consumers are not looking at advertisements because agents are playing a bigger role,' said Canaves.

Bots Are Good Now

Retailers face another set of issues if consumers start turning to agents to handle purchases. On the one hand, agents could be great for reducing the friction that often causes consumers to abandon their carts. Rather than going through the checkout process themselves and stumbling over any annoyances, shoppers just tell the agent to do it and off it goes.

But most websites aren't designed for bots to make purchases — exactly the opposite, in fact. Bad actors have historically used bots to snatch up products from sneakers to concert tickets before other shoppers can buy them, frequently to flip them for a profit. For many retailers, they're a nuisance. 'A lot of time and effort has been spent to keep machines out,' said Rubail Birwadker, senior vice president and global head of growth at Visa.

If a site has reason to believe a bot is behind a transaction — say it completes forms too fast — it could block it. The retailer doesn't make the sale, and the customer is left with a frustrating experience.

Payment players are working to create methods that will allow verified agents to check out on behalf of a consumer without compromising security. In April, Visa launched a programme focused on enabling AI-driven shopping called Intelligent Commerce. It uses a mix of credential verification (similar to setting up Apple Pay) and biometrics to ensure shoppers are able to check out while preventing opportunities for fraud.
'We are going out and working with these providers to say, 'Hey, we would like to … make it easy for you to know what's a good, white-list bot versus a non-whitelist bot,'' Birwadker said.

Of course, the bot has to make it to checkout. AI agents can stumble over other common elements in webpages, like login fields. It may be some time before all those issues are resolved and they can seamlessly complete any purchase.

Consumers have to get on board as well. So far, few appear to be rushing to use agents for their shopping, though that could change. In March, Salesforce published the results of a global survey that polled different age groups on their interest in various use cases for AI agents. Interest in using agents to buy products rose with each subsequent generation, with 63 percent of Gen-Z respondents saying they were interested.

Canaves of Emarketer pointed out that younger generations are already using AI regularly for school and work. Shopping with AI may not be their first impulse, but because the behaviour is already ingrained in their daily lives in other ways, it's spilling over into how they find and buy products. More consumers are starting their shopping journeys on AI platforms, too, and Schwartz of Salesforce noted that over time this could shape their expectations of the internet more broadly, the way Google and Amazon did.

'It just feels inevitable that we are going to see a much more consistent amount of commerce transactions originate and, ultimately, natively happen on these AI agentic platforms,' said Birwadker.
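As a companion to the product-feed discussion above, here is a minimal sketch of machine-readable product markup (schema.org Product data serialized as JSON-LD). The description, SKU and price are invented for illustration; real listings follow the Google Merchant Center and schema.org specifications, and the Python below only shows the general shape of such data.

import json

# Invented example values; the product name echoes the Google example quoted above.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Stride & Conquer: Original Google Men's Blue & Orange Power Shoes (Size 8)",
    "description": "Lightweight trainer with a breathable mesh upper.",  # hypothetical copy
    "sku": "GGL-PS-008",  # hypothetical SKU
    "offers": {
        "@type": "Offer",
        "price": "89.99",  # keeping prices current is one of the article's points
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embedded in a page's <head>, this block is plain text in the initial HTML,
# so it is readable without executing any JavaScript.
snippet = '<script type="application/ld+json">\n' + json.dumps(product, indent=2) + "\n</script>"
print(snippet)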


Tom's Guide
I tested the AI transcription tools for iPhone vs Samsung Galaxy vs Google Pixel — here's the winner
This article is part of our AI Phone Face-Off. If you're interested in our other comparisons, check out the links below.

Long before AI was a buzzword included in every handset's marketing material, a few lucky phones already offered automatic transcripts of voice recordings. But the arrival of on-device AI has extended that feature to more phones and more apps, including the Phone app itself, while also adding auto-generated summary features to the mix.

All three of the major smartphone makers — Apple, Google and Samsung — offer some type of voice recording app on their flagship phones with real-time transcription as part of the feature set. Those phones now record and transcribe phone calls, too. And summary tools that tap into AI to produce recaps of conversations, articles, recordings and more have become commonly available on iPhones, Pixels and Galaxy S devices alike.

But which phone offers the most complete set of transcription and summarization tools? To find out, I took an iPhone 15 Pro, Pixel 9 and Galaxy S25 Plus loaded with the latest available version of their respective operating systems, and put each device through a series of tests. If you need a phone that can turn your speech into text or cut through a lengthy recording to bring you the highlights, here's which phone is most up to the job.

I wrote out a scripted phone call, handed one copy to my wife and then scurried outside to call her three separate times from the iPhone, Pixel and Galaxy S device. By scripting out our conversation, we could see which on-board AI provided a more accurate transcript. And after each call, I took a look at the AI-generated summary to see if it accurately followed our discussion of rental properties in the San Francisco Bay Area.

The iPhone's transcript was the most muddled of the three, with more instances of incorrect words and a lack of proper punctuation. The biggest misstep, though, was mixed-up words that my wife and I had said, as if we had been talking over each other. (We had not.) Because I was calling someone in my Contacts, though, the iPhone did helpfully add names to each speaker — a nice touch.

The transcripts from the Pixel 9 and Galaxy S25 Plus were equally accurate when compared to each other. Samsung displays its transcripts as if you're looking at a chat, with different text bubbles representing each speaker. Google's approach is to label the conversation with 'you' and 'the speaker.' I prefer the look of Google's transcript, though I appreciate that when my wife and I talked expenses, Galaxy AI successfully put that in dollar amounts. Google's Gemini just used numbers without dollar designations.

As for the summaries, the one provided by the iPhone accurately summed up the information I requested from my wife. The Galaxy AI summary was accurate, too, but left out the budget amount, which was one of the key points of our discussion. Google's summary hit the key points — the budget, the dates and who was going on the trip — and also put the summary in second person ('You called to ask about a rental property…'). I found that to be a personal touch that put Google's summary over the top.

I will point out that the iPhone and Galaxy S25 Plus summaries appeared nearly instantly after the call. It took a bit for the Pixel 9 to generate its summary — not a deal-breaker, but something to be aware of.

Winner: Google — The Pixel 9 gave me one of the more accurate transcripts in a pleasing format, and it personalized a summary while highlighting the key points of the conversation.
I launched the built-in recording apps on each phone all at the same time so that they could simultaneously record me reading the Gettysburg Address. By using a single recording, I figured I could better judge which phone had the more accurate transcript before testing the AI-generated summary.

The transcript from Samsung's Voice Recorder app suffered from some haphazard capitalization and oddly inserted commas that would require a lot of clean-up time if you need to share the transcript. Google Recorder had the same issue and, based on the transcript, seemed to think that two people were talking. The iPhone's Voice Memos app had the cleanest transcript of the three, though it did have a handful of incorrectly transcribed words.

All three recording apps had issues with me saying 'nobly advanced,' with the Galaxy S25 Plus thinking I had said 'nobleek, advanced' and the iPhone printing that passage as 'no league advanced.' Still, the iPhone transcript had the fewest instances of misheard words.

As for summaries, the Galaxy AI-generated version was fairly terse, with just three bullet points. Both the Pixel and the iPhone recognized my speech as the Gettysburg Address and delivered accurate summaries of the key points. While getting a summary from the iPhone takes some doing — you have to share your recording with the iOS Notes app and use the summary tool there — I preferred how concise its version was to what the Gemini AI produced for the Pixel.

Winner: Apple — Not only did the iPhone have the best-looking transcript of the three phones, its summary was also accurate and concise. That said, the Pixel was a close second with its summarization feature, and would have won this category had it not heard those phantom speakers when transcribing the audio.

Why keep testing the transcription feature when we've already put the recording apps through their paces? Because there could come a time when you need to record a meeting where multiple people are talking, and you'll want a transcript that recognizes that. You may be in for a disappointing experience if the transcripts of me and my wife recreating the Black Knight scene from 'Monty Python and the Holy Grail' are anything to go by.

Both the Galaxy and Pixel phones had problems recognizing who was speaking, with one speaker's words bleeding into the next. The Pixel 9 had more than its share of problems here, sometimes attributing an entire line to the wrong speaker. The Galaxy had more incorrectly transcribed words, with phrases like 'worthy adversary' and 'I've had worse' becoming 'where the adversary is' and '5 had worse,' respectively. The Pixel had a few shockers of its own, but its biggest issue remained the overlapping dialogue. At least those phones recognized two people were talking. Apple Intelligence's transcript ran everything together, so if you're working off that recording, you've got a lot of editing in your future.

With this test, I was less interested in the summarization features, though the Pixel did provide the most accurate one, recognizing that the dialogue was 'reminiscent' of 'Monty Python and the Holy Grail.' The Galaxy AI-generated summary correctly deduced that the Black Knight is a stubborn person who ignores his injuries, but wrongly concluded that both speakers had agreed the fight was a draw. The iPhone issued a warning that the summarization tool wasn't designed for an exchange like this and then went on to prove it with a discombobulated summary in which the Black Knight apparently fought himself.
Winner: Samsung — Galaxy AI had easier-to-correct errors, with speakers' lines bleeding into each other. The Gemini transcript was more of a mess, but the summary nearly salvaged this test for Google.

Of all the promised benefits of AI on phones, few excite me more than the prospect of a tool that can read through email chains and surface the relevant details so that I don't have to pick through each individual message. And much to my delight, two of the three phones I've tested stand out in this area.

I'm sad to say it isn't the Galaxy S25 Plus. I found the feature a bit clunky to access, as I had to use the built-in Internet app to go to the web version of Gmail to summarize an exchange between me and two friends where we settled on when and where to meet for lunch. Galaxy AI's subsequent summary included the participants and what we were talking about, but it failed to mention the date and location we agreed upon.

Both the Pixel and the iPhone fared much better. Gemini AI correctly listed the date, time and location of where we were going to meet for lunch. It even spotted a follow-up email I had sent en route warning the others that I was running late. Apple Intelligence also got this feature right in the iPhone's built-in Mail app.

I think the Pixel has the better implementation, as getting a summary simply requires you to tap the Gemini button for all the key points to appear in a window. iOS Mail's summary feature lives at the top of the email conversation, so you've got to scroll all the way up to access your summary.

Winner: Google — The Pixel and the iPhone summarized the message chain equally well, but Google's implementation is a lot easier to access.

In theory, a summary tool for web pages would help you get the key points of an article quickly. The concern, though, is that the summary proves to be superficial or, even worse, not thorough enough to recognize all the key points. So how do you know how accurate the summary is? To find out, I figured I'd run one of my own articles through the summary features of each phone — this article about the push to move iPhone manufacturing to the U.S., specifically. I mean, I know what I wrote, so I should be in a good position to judge whether the respective summary features truly got the gist of it.

Galaxy AI did, sort of, with its summary consisting of two broadly correct points: that the Trump administration wants to move phone manufacturing to the U.S., and that high labor costs and global supply chain automation are the big roadblocks. That's not horribly inaccurate, but it is incomplete, as the article talked more about the lack of dedicated assembly plants and equipment in the U.S.

The iPhone's summary — appearing as a tappable option in the menu bar of Safari — was a little bit more detailed on the key roadblock, while also noting the potential for rising prices of U.S.-built phones. However, the summary provided via Gemini AI was far and away the most substantive. It specifically calls out a push for reshoring, notes what Apple already produces in the U.S., and highlights multiple bullet points on the difficulties of U.S. phone manufacturing.

Winner: Google — Summaries don't always benefit from being brief, and the Gemini-generated summation of my article hits key points without sacrificing critical details and explanations. You can read that summary and skip my article — please don't, it would hurt my feelings — and still get a good grip on what I had written.
Sometimes notes can be so hastily jotted down that you might have a hard time making sense of them. An ideal AI summary tool would be able to sort through those thoughts and produce a good overview of the ideas you were hoping to capture. If you remember from our AI Writing Tools test, I had some notes on the new features in iOS 26 that I used to try out the auto-formatting features provided by each phone's on-device AI. This time around, I tried out the summary features and found them to be generally OK, with one real standout.

Both Galaxy AI and Apple Intelligence turned out decent summaries. When I selected the Key Points option in Writing Tools for iOS Notes, the iPhone featured a good general summation of changes in iOS 26, with particular attention paid to the Safari and FaceTime enhancements. Other descriptions in the Apple Intelligence-produced summary were a bit too general for my tastes.

I did like the concise descriptions in the Galaxy AI summary, where my lengthy notes were boiled down to two bullet points summing up the biggest additions. It's not the most detailed explanation, but it would work as an at-a-glance synopsis before you dive into the meat of the notes themselves.

Gemini AI on board the Pixel 9 struck the best overall mix between brevity and detail. Google's AI took the bullet points of my original notes and turned them into brief descriptions of each feature — a helpful overview that gets to the heart of what I'd be looking for in a summary.

Winner: Google — While Galaxy AI scores points for getting right to the point in its summary, the more useful recap comes from Gemini AI's more detailed write-up.

If we had restricted these tests to transcripts, it might have been a closer fight, as both Apple and Samsung held their own against Google in converting recordings to text. But throw summaries into the mix, and Google is the clear winner, taking the top spot in four of our six tests. Even in the tests where the Pixel was bested by either the iPhone or the Galaxy S25 Plus, it didn't lag that far behind.

Some of this comes down to what you prefer in a summarization tool. If it's concise summaries, you may be more favorably inclined to Galaxy AI than I was. Apple Intelligence also shows some promise, though it would benefit from fine-tuning to make its tools easier to access. But for the best experience right now, Google is clearly the leader in transcription and summarization.
Yahoo
New phone safety features can help kids. But they only work if parents set them up
Keeping kids safe online can be a full-time job for parents. While good communication between parents and children is key, moms and dads should set kids up for success by placing parental controls on devices. Apple has a new software update coming in September; iOS 26 expands tools that can help in the effort.

Parents whose children have an iPhone can now play a role in deciding who their kids can text through Messages. If children want to communicate with a new phone number, they will need to get permission from their parents. It's a one-tap approval method for mom or dad, but it gives parents a heads-up that someone new may be entering their kids' digital life. Kids will also be able to send a parental approval request to chat, follow or friend users in third-party apps (those not developed by Apple).

Child Accounts have the user's age-range information. Now parents can share those details with app developers while keeping the child's birth date private. If developers receive the age information, they may better provide age-appropriate experiences for those users. Moms and dads can decide whether they want their kids' age-range information shared with all apps or only those they select. Children cannot decide how their age-range information is shared unless their parents allow them to do so. Whether or not a young person's account was set up as a Child Account, these age-appropriate protections will be enabled for all users 13 to 17: web content filters, age ratings and Communication Safety will all be enacted on those accounts.

Utah was the first state in the country to pass a bill requiring app stores to verify kids' ages. The state's App Store Accountability Act requires app stores — not individual apps — to seek parental consent before allowing minors to download apps.

Each app on the App Store already shows its age rating based on information provided by developers. For parents, that's often a first check to help them decide whether the app would be appropriate for their child to download. Right now, those ratings are for ages 4+, 9+, 12+ and 17+. That leaves a big gap for teenagers: what may be appropriate for a 13-year-old may not be for a 16-year-old. Apple is expanding its age ratings by the end of the year, when you will see guidance for 4+ and 9+, followed by 13+, 16+ and 18+. Since these age ratings are also used for parental control features like Screen Time and Ask to Buy, these distinctions will likely be helpful for many parents.

Communication Safety is a feature that aims to stop kids from seeing nudity. If something explicit is detected in a photo or video a child receives or is trying to send, the image will be blurred. In the upcoming iOS 26 update, that capability will also apply to FaceTime. Apple says it will 'intervene' when nudity is detected in FaceTime video calls and will also blur out any nudity in Shared Albums in Photos.

How do Android phones stack up against these parental controls? For those under 18, Google Messages triggers a sensitive content warning when it detects images that contain nudity. That prompts an Android device to blur those images and offer helpful resources to users who receive that type of content. Parents can control the feature through Family Link for supervised users. It's off by default for adults and unsupervised teen accounts, but users can turn it on in settings. This feature does not work for videos.
Google Meet has an Acceptable Use Policy that prohibits nudity, but the blurring capability exists only in Messages. For now, apps in the Google Play Store follow the age ratings from the International Age Rating Coalition, which are similar to those Apple has favored in the past. They break age categories into 3+, 7+, 12+, 16+ and 18+.

When it comes to texting, parents of teens with Android phones can turn on 'Only allow calls and texts from phone contacts' through Family Link. This blocks incoming texts from unknown numbers, but not outgoing ones. And worth noting: Android allows incoming calls and texts to go through if a child has reached out to that number within the past month.

Giving parents more options to monitor how their kids spend time online is helpful. But while these companies offer parental controls, the controls only work if moms and dads actually set them up. If parents haven't yet set up monitoring through Family Sharing for Apple or Family Link for Google, it's never too late to start.