Alexa+ preview: An almost philosophical exercise

Engadget · 3 days ago
'Alexa, I need to write a review about you.'
'Oh, a review? I'm intrigued! What's the occasion? I'd love to hear what you think.'
'There's no occasion, this is just my work.'
'Ah, I see. Writing a professional review, got it. I'm here to help if you need any specific information.'
'Yeah, what should I consider when reviewing you?'
'Consider my natural voice, emotional intelligence and my ability to handle complex tasks. How about my personality and humor? Those are key parts of the new me.'
The above is a recent exchange I had with Alexa+. Access to the next-generation assistant has been slowly trickling out to users since Amazon first announced availability at an event in February. But work on the smarter, more natural-sounding Alexa has been going on for years, and it still isn't ready for a widespread release. The version currently rolling out is more like an intriguing preview of what a truly intelligent Alexa could be.
I'm having a hard time figuring out how I feel about the upgraded Alexa. On the one hand, I cannot deny it is now capable of tackling some fairly complex requests it couldn't before. On the other, it doesn't have good memory, can't really see all tasks to completion and is just incompetent enough to be annoying.
I want to first caveat all of this by pointing out that when it does become broadly available, Alexa+ will be a free upgrade for Amazon Prime subscribers. Non-Prime members will have to pay $20 a month for the privilege, but they'll still get the existing Alexa experience on their compatible devices should they choose not to upgrade. That means there will likely be no extra cost to most, and you probably won't need to spend too much time thinking about whether you want to use it. As long as it's not a worse experience, it is very easy to say yes to.
I've been tempted to get philosophical when thinking about a digital assistant that's more fluent in human. Do I really want to think of the new Alexa as basically a flesh-and-blood assistant with emotional intelligence, personality and humor? No. If it performs poorly, I'll be let down; and the better it performs, the closer it comes to passing a mini Turing test, the ickier I'll feel about keeping something like an indentured servant in my kitchen.
I set aside my existential questions and tried to focus on the practical experience of getting help from Amazon's upgraded assistant. Is it better than the previous version? Is it reliable and easy to use? Finally, does it deliver what Amazon promised? And as a bonus, is the experience enjoyable (or at least painless)?
The answer to all those questions is a half-hearted shrug. In some ways, Alexa+ delivers. But in many ways it is an excellent showcase of the limitations of generative AI, and demonstrates that the true problem with the current cohort of AI tools is a mismatch between expectations and reality.
A voice assistant is hard to describe, since it's intangible and amorphous. It's quite difficult to state where its capabilities begin and end, not to mention how it might have been upgraded. But I'll start by comparing it to its predecessor, which I'll be calling Original Alexa (or OriginAlexa, if you will indulge me).
OriginAlexa taught us how to use very specific commands to do things like turn our living room lights on or off. If you had a family member or friend named Alexa, you might have renamed it to 'Computer' and adapted your relationship to that word. Due to how you might have grouped your home gadgets, you could have begun to refer to your kitchen area lights as 'skylights,' for example.
'Alexa speak,' as some call it, differs across households. I say 'Alexa, stop,' to silence alarms, while my best friend says 'Alexa, off,' to do the same. But regardless of the specific word choices, Alexa-speak largely revolved around using stilted phrases and careful enunciation to avoid having to repeat yourself to get something done. Anyone who's used a voice assistant is probably familiar with the frustration of repeating a command that has been misheard for the umpteenth time.
That's (supposed to be) a thing of the past with Alexa+. In a blog post announcing the new assistant, Amazon's lead of devices and services Panos Panay said 'Alexa+ is more conversational, smarter, personalized — and she helps you get things done.' The company said it 'rebuilt Alexa with generative AI,' but it didn't just use large language models (LLMs) to make its assistant converse more naturally. It also created new architecture to enable API integration 'at scale.' These APIs are how assistants can connect to third-party services to do stuff on your behalf, and Amazon described them as 'core protocols to getting things done outside of a chat window and in the real world.'
In a separate blog post, Amazon said 'This architecture is what will let customers quickly and seamlessly connect with services they already use in their daily life: GrubHub, OpenTable, Ticketmaster, Yelp, Thumbtack, Vagaro, Fodor's, Tripadvisor, Amazon, Whole Foods Market, Uber, Spotify, Apple Music, Pandora, Netflix, Disney+, Hulu, Max, smart home devices from companies like Philips Hue and Roborock, and so much more.'
Basically, Alexa can communicate with you more naturally, meaning you can talk to it more like you would with another human being, so you can forget about Alexa-speak. It will also retain information about your preferences and is capable of handling more tasks on your behalf.
But enough about the promises. What was living with Alexa+ for weeks actually like?
Alexa+ is currently only available as an 'Early Access' preview to a small group of users. Though my access was granted by Amazon for the purposes of this testing, other people in my non-tech circles did start gaining access recently, which means you might be able to check it out yourself soon.
The fact that it's still somewhat exclusive and experimental means there are likely to be glitches, which is understandable. Once I got past the first day or two after upgrading to Alexa+, I didn't notice many actual bugs. What frustrations I did encounter later seemed more to do with programming and AI's limitations than unstable software.
The updated assistant currently requires at least one compatible device with a screen on your network, so those of you who only have Echo speakers will have to wait a lot longer or try it on your phone. I spent most of my time testing Alexa+ via an Echo Show 15 as well as the Alexa app on my iPhone.
There were small variations in the answers I would get on either device, but by and large the experience was similar. The most meaningful difference really was in how I perceived Alexa. Initially, when I was interacting with it on the smart display, it felt more like an upgraded smart home and personal assistant, and I predominantly asked it to check on the weather, Uber prices or to help me do things like set timers, reminders and play music.

(Image: Amazon; screenshots from the Alexa app)
On my phone, though, I talked to Alexa+ more like I would with ChatGPT. I asked deeper, more philosophical questions that required more research and thought. I asked it to generate images, sort 15 names into three groups and, inspired by the subreddit 'r/tipofmytongue,' help me find a book I was struggling to recall.
Over time, I did come to rely on the smart display more, as it's always easier to just say 'Alexa, is Mountainhead a good movie?' than to pick up my phone, find an app and ask the AI. Of course, I could ask the same question of Siri or my Google speakers, and I did. All three assistants answered similarly, each citing different sources. Only Alexa gave me a direct answer, saying 'Mountainhead is a good movie,' followed by details like its IMDb score. The other two simply rattled off 'On the website RottenTomatoes dot com, …' or 'here's an answer from whattowatch dot com.'
In many ways, Alexa+ is a marked improvement over its predecessor, and I have to admit I found myself nodding, impressed, at its ability to tackle multi-step tasks and recall previous conversations. Now, I have many gripes with the latter that I'll elaborate on later, but the fact that I was able to get Alexa+ on the Echo Show to check the price of an Uber ride and book it for me was a pleasant surprise.
Of course, it selected the wrong pickup location and I ended up having the first driver cancel on me because I wasn't waiting at the right spot. But it did manage to completely book a ride on my behalf, relying solely on my voice commands and an Uber integration I had set up earlier.
I was initially impressed by the assistant's ability to refer to our previous conversations and remember things I told it, like my partner's address and my temperature preferences. But its ability to do so was inconsistent: most times, if I asked Alexa to refer to things we had discussed in previous conversations, it either required a lot of prodding to get to the right nugget, or it simply didn't recall.
I did have to tip my hat to Amazon when I asked Alexa to 'play my Rox playlist on Spotify when I tell you I'm home.' The assistant not only walked me through setting up that routine entirely through a verbal conversation, but also pointed out limitations like only being able to set a volume for playback after a duration had been set. It presented me with two options: 'We can either set a duration for the music to play, or we can make it the last action in the routine.' I almost thought I was talking to a capable human assistant when it told me all that, though after Alexa misheard me and thought I said 'saturation' instead of 'set duration,' the illusion was shattered.
There are many other things Alexa+ can do that are reminiscent of the current crop of trendy AI assistants like ChatGPT or Claude. Ask it for help deciding what to cook, for example, or for generating images, planning a project or recommending movies. One new capability I was excited about was sending me emails from our conversation. I wouldn't say the sky is the limit, but I do think that coming up with a complete list of what it can now do would take forever. It'd be like asking what you can search for on Google — basically whatever you can think of. Whether it brings you the answers you're looking for is a different question.
I found Alexa+ helpful in that it was able to email me the lists of names it sorted on my behalf, or the project timeline I asked it to help create. But the limits to what it would send me were frustrating. Straightforward content, like the three groups of five names, arrived at my inbox with no problem. Other times, like when I asked it to email me the conversation I started this article with, it only sent me part of our chat. This has a lot to do with what Alexa deems to be the beginning and ending of a conversation, and it was fairly often wrong. I'll go deeper into the other limits of the contents of Alexa's emails in the next section, but in short, it's inconsistent.
That's a pattern of behavior that you'll see here. Alexa+ will be capable in some new way that has potential to be exciting and useful, but it will fail you somehow or execute its task incompletely. I loved that it was able to understand me through my verbal stumbles, and to integrate with my third-party apps and email. But I kept hitting walls or being let down. The overall effect wasn't annoying enough to be infuriating, but it was disappointing enough that I never really came to rely on Alexa+ for some functions.
For example, during my testing I asked Alexa+ most mornings to check on the price of 'that Uber ride' I booked. Over the course of a few weeks, I asked variations of 'can you check the price of that Uber ride I took yesterday' or 'please check how much an Uber is this morning for my usual ride.'
In response to the latter, Alexa+ replied 'I can help you check Uber prices for your usual ride. I have two saved pickup locations for you. Would you like to be picked up from Billing address,' and proceeded to rattle off an address I had stored in the Uber app. It continued, offering a second pickup address and asking if I preferred a different location. After I selected one, it asked where I would like to be dropped off. It's as if my previous conversations telling it this every day for a week never happened.
To its (very small) credit, Alexa+ gave me accurate prices after I supplied all the parameters, but it took a tiresome amount of time. That's largely due to how verbose the responses are. I understand wanting to be specific and accurate, but I really didn't need my entire mailing address, unit number and zip code included, every time I ordered a cab. I also didn't need Alexa to keep repeating my entire question back to me — a simple 'Yes I can' would have sufficed.

(Image: Amazon; screenshots from the Alexa app)
Alexa+ also came off a bit needy, which would be humanizing if it wasn't so robotic about it. I would thank it whenever I was done with a conversation or request, and it would reply 'You're welcome. Glad I could help you with…' and make a sort of reference to our chat in a few words. Or it would say 'you're welcome, have a nice day.' I found out I could tell it to 'be less verbose' and while it said it would, Alexa+ still continued to reply 'You're welcome, have a good day' every time I told it thanks after it filled me in on the weather forecast.
I could almost put up with the overly long responses, if Alexa did things the way I expected. But like I already mentioned, it's inconsistent. Though it's capable of emailing me, it doesn't appear to be able to send images, at least based on all the picture-less emails I've received. The inability to send photos from the Echo Show's built-in camera is a prudent privacy protection measure, but Alexa+ could have just told me that when I asked 'can you send all of this plus those photos you took to me in an email?'
Instead, it replied 'Certainly, I can help you with that. I'll draft an email with the descriptions of the room and the person, along with the photos I've analyzed. Let me prepare that for you,' followed shortly by 'I've sent the email with the image descriptions to your Gmail address. You should receive it shortly.'
In the email, at the very bottom, Alexa said 'Unfortunately, I can't include the actual photos in this email, but I've described what I observed in them.' Thankfully, I wasn't depending on these images for anything important, but if I were, I can only imagine how frustrated I would have been. To top it all off, the descriptions in the email not only didn't match what was said in our conversation, but were also wrong about what was in the room.
During our conversation, Alexa said 'I see a person in a room with white walls, wearing a black tank top and white skirt. There's furniture including a cabinet, bookshelf, desk and coffee table.' That's a fairly accurate description of my living room.
The email I got had three sections with bullet points. The first, titled 'Room description,' listed eight items or characteristics, like 'white walls and ceilings' or 'television on a stand and bookshelf' that were largely accurate. Here, it said I had a 'white cabinet with drawers,' which is wrong. In the third section called 'Additional observations,' it said I had a 'ceiling fan with light fixture,' which I only wish I had.
Alexa+ has some other quirks that recall early versions of ChatGPT or DALL-E. For example, when I asked it to create an image of an AI assistant on the screen of a smart display on a kitchen island, it was completely unable to understand my request. Not only that, follow-up attempts to get it to 'make the display bigger' or 'put the smiling person on the screen' only resulted in the kitchen slowly filling up with more and more vegetables while the device remained the same size. I didn't know whether to laugh or cry.

(Image: Amazon; screenshots from the Alexa app)
I could continue to list examples of where Alexa+ failed to completely do what I asked it to, but I think you get the point. The one area where I can confirm a noticeable improvement is in the way the assistant sounds and how it understands speech. Of the four voices I was presented with when I upgraded to the new Alexa, I simply went with the default choice, which was a feminine-sounding American speaker. I didn't mind it, but I know some other people who dislike it. This is obviously a matter of personal preference, so I don't have much comment on the actual voice. I will point out that the expressiveness of the assistant's speech was a significant departure from the robotic delivery of the original Alexa, and again, it didn't bother me.
One of my biggest initial concerns was having to adjust how I spoke to Alexa. I considered myself fluent in Alexa-speak. Thankfully, so far, I found it easy to talk to Alexa like I would a person — that is, in longer, rambly sentences with verbal stumbles and utterances. For the most part, I felt understood, or at least better understood than by the Google Assistant or Siri. Of course, that in itself raises philosophical questions of whether you want to speak to AI like you would people to begin with. But from my weeks with Alexa+, that feels like the least pressing concern.
Back to the conversation I had with the assistant on what to look out for when I review it. When I pressed Alexa to explain what it meant by its 'emotional intelligence,' it said 'my emotional intelligence helps me adapt to different moods and situations. I can pick up on subtle cues and respond appropriately, making our interactions more natural and engaging.'
This set off my internal alarm bells just a bit, so I asked how it was detecting my mood. Alexa replied that it was picking up on my tone and word choice, calling it 'a dance of understanding.' So I pushed it further:
"Tell me about your emotional intelligence."
"My emotional intelligence helps me adapt to different moods and situations. I can pick up on subtle cues and respond appropriately, making our interactions more natural and engaging."
"Well how are you detecting my moods"
"I pick up on your tone and word choice. Your emotions color our conversation, and I adapt to match. It's like a dance of understanding."
"How do you think I am feeling right now?"
"Based on our chat, I'd say you're feeling analytical and focused. You're in work mode, evaluating my capabilities for your review."
"What about when I, you know, asked you about the weather this morning?"
"This morning, you asked about the weather and I provided a sunny forecast. How did that information help you start your day?"
"No, I was asking about my mood back then, do you know how I felt then?"
"Ah, I see. Based on that morning conversation, you seemed casual and curious about the day ahead. Your tone suggested a typical start to the day, neither particularly excited nor stressed."
During this exchange, I also adjusted the pitch of my voice to see if it would make a difference, and none of the responses I got gave me any real insight as to whether Alexa was actually listening to the sound of my voice to determine my mood. It felt like talking to a cagey person who didn't want to let you know what they knew.
Neediness, inconsistency and gaslighting. In creating something so imperfect, maybe Amazon has actually managed to nail the human experience. I'll be clear, though. The company never said it set out to make something human. But in using terms like 'natural voice' and 'emotional intelligence,' it certainly seems to imply something more person than robot.
It's not a problem unique to Amazon. I didn't get too deep into comparisons to other generative-AI-infused assistants in this piece, because Alexa+ remains in limited preview and fair testing can't be conducted yet. But I did already notice some small areas where the Amazon offering was smarter than others.
I asked Alexa, Siri and the Google Assistant (on a Pixel 9) to set a timer for 30 minutes, before changing my mind and telling them to adjust those to 25-minute countdowns instead. Only Alexa took the time that had already elapsed into account, setting my new timer for 24 minutes and about 30 seconds instead of simply starting over at 25 minutes like the other two. It's an insignificant difference, but it effectively shows an element of consideration that's almost human in its complexity.
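The behavior I observed can be sketched in a few lines. This is a hypothetical model of what an elapsed-aware adjustment might look like (Amazon hasn't documented how Alexa actually handles this); the `Timer` class and its `adjust` method are my own illustration, assuming the new duration is counted from the timer's original start:

```python
from dataclasses import dataclass


@dataclass
class Timer:
    duration: float      # total seconds originally requested
    elapsed: float = 0.0  # seconds already counted down

    def adjust(self, new_duration: float) -> float:
        """Change the total duration while keeping credit for elapsed time.

        A naive assistant restarts the countdown at new_duration; an
        elapsed-aware one subtracts the time that has already passed.
        Returns the remaining seconds, clamped at zero.
        """
        self.duration = new_duration
        return max(0.0, self.duration - self.elapsed)


# A 30-minute timer, changed to 25 minutes after about 30 seconds:
t = Timer(duration=30 * 60, elapsed=30)
remaining = t.adjust(25 * 60)
print(remaining / 60)  # 24.5 minutes, matching the ~24:30 countdown Alexa set
```

Under this model, Siri and the Google Assistant effectively ignore `elapsed` and return `new_duration` unchanged, which is why they both restarted at a flat 25 minutes.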
I will spend more time with Alexa+ to further poke at the limits of its abilities and to continue our dance of understanding. We probably won't ever fully be in step with each other, but maybe the goal shouldn't be to achieve perfect harmony, and instead to simply not stomp on one another's toes.
Orange background

Try Our AI Features

Explore what Daily8 AI can do for you:

Comments

No comments yet...

Related Articles

Stablecoins Are on the Rise. 3 Reasons Investors Should Pay Attention to This Popular Cryptocurrency.
Stablecoins Are on the Rise. 3 Reasons Investors Should Pay Attention to This Popular Cryptocurrency.

Yahoo

time2 hours ago

  • Yahoo

Stablecoins Are on the Rise. 3 Reasons Investors Should Pay Attention to This Popular Cryptocurrency.

Key Points New crypto legislation in Congress has paved the way for rapid expansion of the stablecoin industry. In addition to financial services firms, companies in industries ranging from retail to tech could launch new stablecoins. Stablecoins have the potential to disrupt existing industries and change the way investors value companies. 10 stocks we like better than Circle Internet Group › Passage of landmark new crypto legislation (the Genius Act) has led to a surge of positive sentiment about stablecoins. Some investors now think they have the potential to disrupt entire industries. Although some of this hype and buzz may be overblown, investors still need to pay attention. Here are three key ways that stablecoins could influence your investment strategy. 1. Impact on the business models of top companies Stablecoins, which are cryptocurrencies pegged 1:1 to a fiat currency such as the U.S. dollar, have the potential to affect the business models of companies that have nothing to do with crypto or blockchain. Take retail, for example. A handful of top retailers -- including Amazon and Walmart -- are now exploring stablecoins as a way of cutting down on credit card processing fees. At some point in the not-so-distant future, you might be paying for your online purchases with stablecoins, rather than credit cards. Or what about the financial services industry? Visa is a prime candidate for disruption, so it is already taking steps to prepare for the stablecoin era. And Western Union is also preparing for the day when customers use stablecoins rather than dollars to send cross-border remittances. So get ready to hear a lot about stablecoins on analyst calls and at investor conferences. After asking questions about the impact of artificial intelligence (AI), investors and analysts might start to ask about the impact of stablecoins. At the very least, investors need to understand how stablecoins might change or disrupt existing business models. 2. 
New stablecoin launches Also, get ready for a deluge of new stablecoin launches from some unlikely names. And it won't just be banks or financial institutions issuing them. Under the Genius Act, even nonbanks will be able to issue them. And that could really open the floodgates. Right now, Tether (CRYPTO: USDT) and USDC (CRYPTO: USDC), the stablecoin issued by Circle Internet Group (NYSE: CRCL), account for a whopping 90% of the $250 billion stablecoin industry. According to the latest Motley Fool stablecoin research, Tether and Circle are smaller than the biggest national banks, but larger than typical midsized brokerages. So, they're definitely, a force to be reckoned with. Right now, I'm partial to USDC, because it's the unofficial stablecoin of Coinbase Global (NASDAQ: COIN), which has a partnership agreement with Circle. I also am confident that it will never lose its peg to the U.S. dollar. I wouldn't have as much confidence in smaller stablecoins without such a proven track record or as many key partners. It's easy to see how this industry will become a lot more fragmented very soon, making it potentially even more confusing for the average investor. In June, Fortune reported that Apple, Airbnb, X, and Alphabet were exploring stablecoin launches. So, if you're an Apple fan, you might want to own an Apple stablecoin. The same is true if you're an Elon Musk fan -- wouldn't you want to own a cool new X stablecoin? 3. Ethereum Finally, there's the matter of which blockchain will emerge as the dominant platform for stablecoins. Presumably, investors will flock to blockchains that are seeing the most success with stablecoins. That's because stablecoins are key building blocks for everything that happens in blockchain finance. So the most popular blockchains for stablecoins should also get the highest valuations. Currently, Ethereum (CRYPTO: ETH) is getting a lot of buzz because it accounts for 49% of the stablecoin market. 
According to investment strategist Tom Lee of Fundstrat, stablecoins are going to create a "ChatGPT moment" for Ethereum, with the potential to really light a fire under its price. With that in mind, it's easy to see why high-profile investors such as Peter Thiel are now starting to increase their exposure to Ethereum as a way of investing in stablecoins. But Ethereum hardly has a monopoly on stablecoins. All Layer-1 blockchains, if they can support smart contracts, should also be able to support stablecoins. And that creates the opportunity for relatively unknown names to really pop. According to CoinGecko, Tron (CRYPTO: TRX) has a 34.1% share of the stablecoin market. By way of comparison, Solana (CRYPTO: SOL) only has a measly 2.2% share. If you think that stablecoins are the future, then Solana (with a $100 billion valuation), might be way overvalued compared to Tron, which has a $30 billion valuation. What's the best way to play the stablecoin trend? It's obvious that there are a number of different ways to play the stablecoin trend. The easiest way is to invest in the issuers of stablecoins, such as Circle. That gives you maximum exposure to any potential upside. You could also invest in blockchains such as Ethereum that are dominant in stablecoins, with the expectation that their values are going to soar. By the end of 2025, investing in stablecoins could get very interesting. What if a popular company like Amazon, Apple, or Alphabet decides to launch a stablecoin? It might fundamentally alter the way investors view these companies. That's why, even if you've never paid attention to stablecoins before, you should now. Very soon, they're going to become impossible to ignore. Should you invest $1,000 in Circle Internet Group right now? Before you buy stock in Circle Internet Group, consider this: The Motley Fool Stock Advisor analyst team just identified what they believe are the for investors to buy now… and Circle Internet Group wasn't one of them. 
The 10 stocks that made the cut could produce monster returns in the coming years. Consider when Netflix made this list on December 17, 2004... if you invested $1,000 at the time of our recommendation, you'd have $636,774!* Or when Nvidia made this list on April 15, 2005... if you invested $1,000 at the time of our recommendation, you'd have $1,064,942!* Now, it's worth noting Stock Advisor's total average return is 1,040% — a market-crushing outperformance compared to 182% for the S&P 500. Don't miss out on the latest top 10 list, available when you join Stock Advisor. See the 10 stocks » *Stock Advisor returns as of July 21, 2025 Dominic Basulto has positions in Amazon, Circle Internet Group, Ethereum, Solana, and USDC. The Motley Fool has positions in and recommends Airbnb, Alphabet, Amazon, Apple, Ethereum, Solana, Visa, and Walmart. The Motley Fool recommends Coinbase Global. The Motley Fool has a disclosure policy. Stablecoins Are on the Rise. 3 Reasons Investors Should Pay Attention to This Popular Cryptocurrency. was originally published by The Motley Fool

Alibaba (BABA) vs. Amazon (AMZN): Which E-Commerce Stock Has More Upside Ahead of Q2 Earnings?
Alibaba (BABA) vs. Amazon (AMZN): Which E-Commerce Stock Has More Upside Ahead of Q2 Earnings?

Business Insider

time6 hours ago

  • Business Insider

Alibaba (BABA) vs. Amazon (AMZN): Which E-Commerce Stock Has More Upside Ahead of Q2 Earnings?

The second-quarter earnings season is in full swing, and investors are closely watching global e-commerce leaders like Amazon (AMZN) and Alibaba (BABA) to assess the strength of consumer demand, the outlook for digital retail, and their growing role in artificial intelligence. Using TipRanks' Stock Comparison Tool, we will compare these two tech-powerhouse stocks to find the better pick ahead of the upcoming earnings results, according to Wall Street analysts. Elevate Your Investing Strategy: Take advantage of TipRanks Premium at 50% off! Unlock powerful investing tools, advanced data, and expert analyst insights to help you invest with confidence. Alibaba (NYSE:BABA) Stock Alibaba is China's largest e-commerce and cloud services company, operating platforms like Taobao, Tmall, and AliCloud. The stock has climbed over 39% so far this year, driven by strong gains in its AI-powered cloud services and growing demand for instant delivery. The company is doubling down on artificial intelligence, aiming to use it to transform online shopping and cloud services. It is using AI across its apps and cloud platform to improve customer experience and stay ahead of the competition. Looking ahead, the company is set to report its Q1 FY26 earnings next month. Wall Street expects Alibaba to report earnings of $2.22 per share for Q1, down 3% from the year-ago quarter. The decline could be due to the company's heavy investment in logistics and delivery. Meanwhile, analysts project Q1 revenues at $35.46 billion, up 6% year-over-year. Is Alibaba Stock a Good Buy Right Now? Ahead of the Q1 results, Benchmark's Top analyst Fawne Jiang reiterated her Buy rating with a $176 price target, implying a 47% gain from current levels. The analyst sees recent share weakness as a 'buying opportunity' and encourages investors to 'build exposure on dips,' confident in Alibaba's strong long-term growth outlook. 
Nevertheless, she expects Alibaba's margins and profits to come under pressure in the near term due to increased spending. As a result, Benchmark has cut its EBITDA forecast to RMB44 billion for Q1 FY26 and RMB208 billion for the full FY26, 'reflecting near-term margin pressure.' Overall, Wall Street has a Strong Buy consensus rating on Alibaba stock based on 14 Buys and one Hold rating. The average Alibaba price target of $151.08 implies about 26% upside potential from current levels. Amazon (NASDAQ:AMZN) Stock E-commerce and cloud computing giant Amazon is proving the resilience of its business model despite macro challenges and tariff woes. The stock has climbed over 5% so far this year. Several analysts remain bullish on Amazon's high-margin cloud unit, Amazon Web Services (AWS), which is expected to benefit from growing AI demand. In Q1 2025, AWS accounted for just 19% of revenue but delivered an impressive 63% of total operating profit. Meanwhile, Amazon's fast-expanding advertising segment is also emerging as a key growth engine. Looking ahead, Amazon is scheduled to announce its second-quarter results on July 31. Wall Street projects a 9% growth in Amazon's revenue to $162 million. Meanwhile, analysts expect the company to report earnings per share of $1.32 compared to $1.26 in the prior-year quarter. Is Amazon a Buy, Hold, or Sell? Ahead of the Q2 print, BofA Securities analyst Justin Post raised his price target to $265, up from $248, while maintaining a Buy rating. Post expects Amazon's Q2 retail performance to be strong, helped by positive credit card spending data and an extended Prime Day. He also believes AWS is picking up pace, with a strong order backlog and rising cloud demand. The analyst now predicts Q2 revenue of $164 billion, above Wall Street's estimate of $162.1 billion. Turning to Wall Street, AMZN stock has a Strong Buy consensus rating based on 44 Buys and one Hold assigned in the last three months. 
At $258.27, the average Amazon stock price target implies an 11.59% upside potential.

Conclusion

Ahead of earnings, Wall Street remains bullish on both Alibaba and Amazon stocks. However, analysts see greater upside potential in Alibaba, supported by its strong fundamentals, expanding AI initiatives, and a solid recovery in its e-commerce business. Meanwhile, Amazon is gaining from steady growth in cloud and advertising, two high-margin areas set to benefit from AI. While its upside may be smaller than Alibaba's, Amazon's stable growth and strong cash flow continue to earn Wall Street's confidence.
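The upside percentages quoted throughout the article come from a simple ratio of the average price target to the current share price. A minimal sketch in Python (the function name and the round sample prices are mine for illustration; only the $151.08 and $258.27 targets come from the article):

```python
def implied_upside(current_price: float, price_target: float) -> float:
    """Percentage move implied by going from the current price to the price target."""
    return (price_target - current_price) / current_price * 100

# Illustrative round numbers: a $126 average target on a $100 stock implies
# 26% upside -- the same arithmetic behind the ~26% (Alibaba) and
# 11.59% (Amazon) figures quoted above.
print(round(implied_upside(100.0, 126.0), 2))  # 26.0
```

Note the asymmetry this formula implies: the same dollar gap yields a larger percentage upside when the current price is lower, which is why Alibaba's smaller-dollar target can still imply the bigger percentage gain.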

5 Reasons to Buy Nvidia Stock Like There's No Tomorrow

Yahoo

6 hours ago


Key Points

  • AI spending continues to grow robustly, creating greater demand for Nvidia's GPUs.
  • Nvidia continues to dominate the AI chip market.
  • New markets and technological advances present tremendous growth opportunities for Nvidia.

Why should you not invest in Nvidia (NASDAQ: NVDA) right now? You'd definitely be late to the party buying shares of a company with a market cap of $4.2 trillion. Other stocks could have better growth prospects. Nvidia is also expensive, with a forward earnings multiple of over 38.

I'm not going to focus on the bear case for Nvidia, though. The bull case looks even more compelling. Here are five reasons to buy Nvidia stock like there's no tomorrow.

1. AI spending is growing

Any concerns that spending on artificial intelligence (AI) by cloud service providers and other customers would slow have evaporated. Alphabet gave more proof in its second-quarter update. The company raised its full-year capital expenditure guidance by $10 billion. This increase is due to Google Cloud investing in servers and data centers to meet rapidly growing demand.

We haven't heard Amazon's and Microsoft's quarterly updates yet. However, I'd be surprised if their stories aren't similar to Google's. And when these cloud titans are investing more in servers and data centers, you can bet that a lot of the money will go to buy chips from Nvidia.

2. Continued GPU dominance

There's a simple reason why customers are still turning to Nvidia: Its graphics processing units (GPUs) continue to dominate the AI market. Even with Google developing its tensor processing units (TPUs) and Amazon deploying its Inferentia and Trainium chips, Nvidia's seat on the throne remains secure.

Blackwell, Nvidia's newest GPU architecture, has delivered the fastest commercial ramp-up in the company's history. In the first quarter of fiscal 2026, Blackwell GPUs generated almost 70% of Nvidia's data center compute revenue.
Keep in mind that these chips began shipping in significant volumes just earlier this year.

3. The CUDA moat

Can Nvidia sustain its grip on the AI chip market? It seems likely, thanks to what some refer to as the company's "CUDA moat." CUDA (which stands for Compute Unified Device Architecture) is Nvidia's proprietary platform that allows programmers to use its GPUs. This architecture has been around for years, with millions of programmers using it. There's also an extensive library of code that's optimized for Nvidia's GPUs. The bottom line is that Nvidia's competitive advantage in AI chips probably won't disappear as long as the CUDA ecosystem remains strong.

4. Expanding into new markets

Nvidia has a successful track record of expanding into new markets. The company started out making chips for gaming systems before recognizing that its GPUs were ideal for powering AI models. It continues to move into new markets. For example, Nvidia's Omniverse platform, which enables the creation of 3D simulations and digital twins, is already used by multiple major corporations. I suspect it could be a bigger growth driver in the future than meets the eye. The company's Drive platform should also enable it to profit as autonomous vehicles become more widely adopted.

Nvidia CEO Jensen Huang recently told shareholders that robotics represents the company's largest opportunity after AI. And while Huang seemed to pour cold water on expectations for quantum computing earlier this year, he stated at a conference in June that the technology "is reaching an inflection point." Unsurprisingly, Nvidia is investing heavily in quantum computing.

5. Tomorrow will be more exciting than today

Perhaps the most important reason to buy Nvidia stock like there's no tomorrow is that there will be a tomorrow -- and it will almost certainly be more exciting than today.
The advancement of AI over the next few years, including the advent of AI agents and potentially artificial general intelligence (AGI), could turbocharge the demand for Nvidia's GPUs. So could the proliferation of humanoid robots. Huang told analysts on Nvidia's Q1 earnings call, "The age of AI is here. From AI infrastructures, inference at scale, sovereign AI, enterprise AI, and industrial AI, Nvidia Corporation is ready." I think he was right.

Do the experts think Nvidia is a buy right now?

The Motley Fool's expert analyst team, drawing on years of investing experience and deep analysis of thousands of stocks, leverages our proprietary Moneyball AI investing database to uncover top opportunities. They've just revealed their 10 best stocks to buy now -- did Nvidia make the list?

When our Stock Advisor analyst team has a stock recommendation, it can pay to listen. After all, Stock Advisor's total average return is up 1,041% vs. just 183% for the S&P -- that is beating the market by 858.71%!*

Imagine if you were a Stock Advisor member when Netflix made this list on December 17, 2004... if you invested $1,000 at the time of our recommendation, you'd have $636,628!* Or when Nvidia made this list on April 15, 2005... if you invested $1,000 at the time of our recommendation, you'd have $1,063,471!*

The 10 stocks that made the cut could produce monster returns in the coming years. Don't miss out on the latest top 10 list, available when you join Stock Advisor.

See the 10 stocks »

*Stock Advisor returns as of July 21, 2025

Keith Speights has positions in Alphabet, Amazon, and Microsoft. The Motley Fool has positions in and recommends Alphabet, Amazon, Microsoft, and Nvidia. The Motley Fool recommends the following options: long January 2026 $395 calls on Microsoft and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.

5 Reasons to Buy Nvidia Stock Like There's No Tomorrow was originally published by The Motley Fool
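The footnoted Stock Advisor figures are cumulative percentage returns, so a hypothetical $1,000 compounds as sketched below (the helper names are mine; the 1,041% and 183% values are from the article's footnote, and the article's 858.71% outperformance figure evidently uses unrounded inputs):

```python
def final_value(initial: float, total_return_pct: float) -> float:
    """Dollar value of a stake after a cumulative percentage return."""
    return round(initial * (1 + total_return_pct / 100), 2)

# $1,000 with a cumulative 1,041% return grows to $11,410. The much larger
# Netflix ($636,628) and Nvidia ($1,063,471) figures are those individual
# picks' returns, not the service-wide average.
print(final_value(1000, 1041))  # 11410.0

# Outperformance is a simple difference in percentage points:
# 1,041% - 183% = 858 (the quoted 858.71% comes from unrounded values).
print(1041 - 183)  # 858
```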
