
How to tell if that song is AI-generated? Here are some things to check
The fictitious rock group The Velvet Sundown, complete with AI-generated music, lyrics and album art, is stoking debate about how the new technology is blurring the line between the real and the synthetic in the music industry, and whether creators should be transparent with their audience.
Computer software is widely used in music production, and artificial intelligence is just the latest tool that disc jockeys, music producers and others have added to their production pipeline. But the rise of AI song generators such as Suno and Udio is set to transform the industry because they allow anyone to create songs with just a few prompts.
While some people do not care whether they're listening to AI-generated music, others might be curious to know.
If you encounter a new song that leaves you wondering whether it's 100% made with AI, there are some methods that could reveal how it was created.
Do a background check
If you're wondering who's behind a song, try some old-fashioned detective work.
The 'most obvious cues' come from 'external factors,' said Manuel Mousallam, head of research and development at streaming service Deezer.
Does the band or artist have social media accounts? Lack of a social presence might indicate there's no one there. If they do exist online, examine the kind of content they post, and how long it goes back.
Is there any sign that the artist or band exists in real life? Are there any upcoming concerts and can you buy a ticket for a gig? Is there footage of past concerts on YouTube? Has an established record label released their singles or albums?
Try going to the source. Song creators often — but not always — publish their generated tunes on the Suno or Udio platforms, where they can be found by other users.
The catch is that you'll have to sign up for an account to get access. Users can look up songs by track name or the creator's handle, and browse genres and playlists. But it can still be difficult to spot a song, especially if you don't know the name of the song or creator.
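For readers comfortable with a little scripting, one quick, automatable background-check signal is whether an artist has any footprint in a public music database. The sketch below queries MusicBrainz, which is not mentioned in the article and is only one possible starting point; a missing entry proves nothing on its own, since plenty of legitimate new acts aren't catalogued yet.

```python
# A minimal sketch, assuming the public MusicBrainz search API as one
# background-check signal. MusicBrainz is our example, not something the
# article recommends; absence of an entry is a hint, not proof.
import requests

def musicbrainz_artist_hits(name: str) -> int:
    """Return how many artist entries MusicBrainz finds for a given name."""
    resp = requests.get(
        "https://musicbrainz.org/ws/2/artist",
        params={"query": f'artist:"{name}"', "fmt": "json"},
        # MusicBrainz asks clients to identify themselves in the User-Agent.
        headers={"User-Agent": "ai-song-background-check/0.1 (example@example.com)"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("count", 0)

if __name__ == "__main__":
    for band in ["The Velvet Sundown", "Some Unknown Act 12345"]:
        print(band, "->", musicbrainz_artist_hits(band), "artist matches")
```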
Song tags
Deezer has been flagging albums containing AI-generated songs, as part of its efforts to be more transparent as it battles streaming fraudsters looking to make quick money through royalty payments.
The Deezer app and website will notify listeners with an on-screen label — 'AI-generated content' — to point out that some tracks on an album were created with song generators.
The company's CEO says the system relies on in-house technology to detect subtle but recognizable patterns found in all audio created by AI song generators. The company hasn't specified how many songs it has tagged since it rolled out the feature in June, but says up to 18% of songs uploaded to its platform each day are AI-generated.
Song scanners
There are a few third-party services available online that promise to determine whether a song is human-made or generated by AI.
I uploaded a few songs I generated to the online detector from IRCAM Amplify, a subsidiary of French music and sound research institute IRCAM. It said the probability that they were AI-generated ranged from 81.8% to 98% and accurately deduced that they were made with Suno.
As a cross-check, I also uploaded some old MP3s from my song library, which got a very low AI probability score.
The drawback with IRCAM's tool is that you can't paste links to songs, so you can't check tunes that you can only hear on a streaming service.
There are a few other websites that let you both upload song files and paste Spotify links for analysis, but they have their own limitations. When I tried them out for this story, the results were either inconclusive or flagged some AI songs as human-made and vice versa.
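In practice, most of these scanners work the same way: you upload an audio file and get back a probability score. The sketch below shows the general shape of such a call; the endpoint, field names and response format are hypothetical placeholders, not IRCAM Amplify's actual interface, which requires an account and has its own documentation.

```python
# A minimal sketch of calling an AI-music detector on a local file.
# DETECTOR_URL, API_KEY, and the "ai_probability" response field are
# hypothetical -- this is NOT the API of IRCAM Amplify or any real service.
import requests

DETECTOR_URL = "https://detector.example.com/v1/analyze"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"                                   # hypothetical credential

def ai_probability(path: str) -> float:
    """Upload a local audio file and return the detector's AI-likelihood score."""
    with open(path, "rb") as audio:
        resp = requests.post(
            DETECTOR_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"file": audio},
            timeout=120,
        )
    resp.raise_for_status()
    return resp.json()["ai_probability"]  # hypothetical response field

if __name__ == "__main__":
    score = ai_probability("my_song.mp3")
    print(f"Estimated probability the track is AI-generated: {score:.1%}")
```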
Check the lyrics
AI song tools can churn out both music and lyrics. Many serious users like to write their own words and plug them in because they've discovered that AI-generated lyrics tend to be bad.
Casual users, though, might prefer to just let the machine write them. So bad rhyming schemes or repetitive lyrical structures might be a clue that a song is not human-made. But it's subjective.
Some users report that Suno tends to use certain words in its lyrics like 'neon,' 'shadows' or 'whispers.'
If a song includes these words, it's 'a dead giveaway' that it's AI, said Lukas Rams, a Philadelphia-area resident. He has used Suno to create three albums for his AI band Sleeping with Wolves but writes his own lyrics. 'I don't know why, it loves to put neon in everything.'
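Those lyric tells are easy to check by hand, but they can also be scripted. The rough illustration below counts the giveaway words users mention and measures how often lines repeat verbatim; the word list and thresholds are just examples, and a human lyricist can trip the same flags.

```python
# A rough illustration of the lyric heuristics described above: count
# words users say generators overuse and measure exact line repetition.
# The word list and metric are illustrative, not a reliable detector.
from collections import Counter

GIVEAWAY_WORDS = {"neon", "shadows", "whispers"}  # examples cited by Suno users

def lyric_red_flags(lyrics: str) -> dict:
    words = [w.strip(".,!?'\"").lower() for w in lyrics.split()]
    lines = [ln.strip().lower() for ln in lyrics.splitlines() if ln.strip()]
    giveaway_hits = {w: c for w, c in Counter(words).items() if w in GIVEAWAY_WORDS}
    # Share of lines that are exact repeats of an earlier line.
    repeated = sum(c - 1 for c in Counter(lines).values() if c > 1)
    repetition_ratio = repeated / len(lines) if lines else 0.0
    return {"giveaway_hits": giveaway_hits, "repetition_ratio": round(repetition_ratio, 2)}

if __name__ == "__main__":
    sample = "Neon lights in the shadows\nWhispers in the night\nNeon lights in the shadows"
    print(lyric_red_flags(sample))
```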
No easy answers
AI technology is improving so quickly that there's no foolproof way to determine whether content is real, and experts say you can't just rely on your ear.
'In general, it can be difficult to tell if a track is AI-generated just from listening, and it's only becoming more challenging as the technology gets increasingly advanced,' said Mousallam of Deezer. 'Generative models such as Suno and Udio are constantly changing, meaning that old identifiers – such as vocals having a distinctive reverb – are not necessarily valid anymore.'
Once, in a more innocent era for the two Boston Celtics superstar forwards, Jaylen Brown and Jayson Tatum had a different nickname than one often hears for them now. Today's Fire and Ice was once known as "7/11," a nod to the jerseys worn by the two Celtics swingmen in that era (Brown wore and still wears jersey No. 7, and Tatum wore 11 back then before switching to his current jersey No. 0). Dating from their debut in Las Vegas Summer League back in 2017 -- ancient history by now, but a fun window into their past -- the folks behind the "NBC Sports Boston" YouTube channel put together a clip looking back at the nickname earlier this summer. Narrated by NBC Sports Boston reporter Chris Forsberg, the clip dives into those halcyon days when the dup cribbed the name of a popular convenience store for their collective identity (we think they landed with the right nickname later on, though). Check it out below! Listen to "Havlicek Stole the Pod" on: Spotify: iTunes: YouTube: