Apple Music users are loving the new AutoMix feature in iOS 26, but one big limitation is dividing opinion

Yahoo | 14 hours ago
When you buy through links on our articles, Future and its syndication partners may earn a commission.
• Apple Music's AutoMix feature has rolled out in the iOS 26 public beta
• Despite going viral, AutoMix has a habit of cutting songs short
• Reviews have been largely positive, but plenty of users have noticed the issue
Apple is gearing up for the launch of its huge iOS redesign, and now that iOS 26 public beta is live, we're one step closer to its full rollout, which is expected to arrive in September. One of the star features of iOS 26 is AutoMix in Apple Music, which has garnered a lot of attention.
Apple Music's AutoMix function (which uses AI to mix between songs on the spot) was announced at WWDC and immediately caught people's attention. It was a hit in the developer beta, spawning a wave of viral videos showcasing its clever beat-matching abilities, and it's now even more accessible since rolling out to the public beta.
During these early rollout stages, users have praised AutoMix for enhancing the Apple Music experience, adding that it could make Apple Music even more competitive with rival streaming platforms, most notably Spotify. But despite the positive reception, there's one common limitation that testers have picked up on.
Apple Music's flagship DJ sacrifices song runtime
There's no denying that AutoMix can do almost everything a DJ can during a live set, and it's a huge step up from the standard crossfade setting. However, its uniqueness has been overshadowed by its song-shortening habit, which hasn't gone unnoticed by Apple Music fans on Reddit and TikTok.
Many users have spotted that when a song comes to an end and AutoMix starts doing its thing, it has the habit of starting the mixing process far too early, chopping the last 30 seconds off a song. In some cases, AutoMix will even sacrifice the beginning of the next song in the queue, starting playback 30 seconds into a track.
One user shared a video on TikTok mixing two Taylor Swift songs, showing the first song ending with 25 seconds remaining and the second song starting 49 seconds in – completely skipping over the first verse.
While this isn't the case for every mix, it's been common enough that some users are questioning whether AutoMix needs a bit more polish before its wider rollout. As with any AI feature, it isn't guaranteed to get it right every single time – but when it does, it sounds pretty good. Just take a look at the video below, which shows AutoMix at its best, without cutting either song short.
There's still some time before iOS 26 rolls out to everyone, which gives Apple room to iron out the minor issues that have surfaced with AutoMix. The feature has real potential to set Apple Music apart from other streaming platforms, and though I'm mainly a Spotify user, I might stick with Apple Music beyond my free trial once AutoMix arrives for everyone.
You might also like
How to download the iOS 26 public beta
I've been using iOS 26 for a month – here are 3 things I love and 1 I don't
iOS 26 public beta reveals Apple is reviving a controversial feature it was forced to abandon
