‘Sorry, I didn't get that': AI misunderstands some people's words more than others


Yahoo · January 27, 2025
The idea of a humanlike artificial intelligence assistant that you can speak with has been alive in many people's imaginations since the release of 'Her,' Spike Jonze's 2013 film about a man who falls in love with a Siri-like AI named Samantha. Over the course of the film, the protagonist grapples with the ways in which Samantha, real as she may seem, is not and never will be human.
Twelve years on, this is no longer the stuff of science fiction. Generative AI tools like ChatGPT and digital assistants like Apple's Siri and Amazon's Alexa help people get driving directions, make grocery lists, and plenty else. But just like Samantha, automatic speech recognition systems still cannot do everything that a human listener can.
You have probably had the frustrating experience of calling your bank or utility company and needing to repeat yourself so that the digital customer service bot on the other end of the line can understand you. Maybe you've dictated a note on your phone, only to spend time editing garbled words.
Linguistics and computer science researchers have shown that these systems work worse for some people than for others. They tend to make more errors if you have a non-native or regional accent, are Black, speak African American Vernacular English, code-switch, are a woman, are elderly, are very young, or have a speech impediment.
Unlike you or me, automatic speech recognition systems are not what researchers call 'sympathetic listeners.' Instead of trying to understand you by taking in other useful clues like intonation or facial gestures, they simply give up. Or they take a probabilistic guess, a move that can sometimes result in an error.
As companies and public agencies increasingly adopt automatic speech recognition tools in order to cut costs, people have little choice but to interact with them. But the more that these systems come into use in critical fields, ranging from emergency first responders and health care to education and law enforcement, the more likely there will be grave consequences when they fail to recognize what people say.
Imagine sometime in the near future you've been hurt in a car crash. You dial 911 to call for help, but instead of being connected to a human dispatcher, you get a bot that's designed to weed out nonemergency calls. It takes you several rounds to be understood, wasting time and raising your anxiety level at the worst moment.
What causes this kind of error to occur? Some of the inequalities that result from these systems are baked into the reams of linguistic data that developers use to build large language models. Developers train artificial intelligence systems to understand and mimic human language by feeding them vast quantities of text and audio files containing real human speech. But whose speech are they feeding them?
If a system scores high accuracy rates when speaking with affluent white Americans in their mid-30s, it is reasonable to guess that it was trained using plenty of audio recordings of people who fit this profile.
With rigorous data collection from a diverse range of sources, AI developers could reduce these errors. But building AI systems that can understand the infinite variations in human speech arising from things like gender, age, race, first versus second language, socioeconomic status, ability and plenty else requires significant resources and time.
For people who do not speak English – which is to say, most people around the world – the challenges are even greater. Most of the world's largest generative AI systems were built in English, and they work far better in English than in any other language. On paper, AI has lots of civic potential for translation and increasing people's access to information in different languages, but for now, most languages have a smaller digital footprint, making it difficult for them to power large language models.
Even within languages well-served by large language models, like English and Spanish, your experience varies depending on which dialect of the language you speak.
Right now, most speech recognition systems and generative AI chatbots reflect the linguistic biases of the datasets they are trained on. They echo prescriptive, sometimes prejudiced notions of 'correctness' in speech.
In fact, AI has been shown to 'flatten' linguistic diversity. There are now AI startup companies that offer to erase the accents of their users, drawing on the assumption that their primary clientele would be customer service providers with call centers in countries such as India and the Philippines. The offering perpetuates the notion that some accents are less valid than others.
AI will presumably get better at processing language, accounting for variables like accents, code-switching and the like. In the U.S., public services are obligated under federal law to guarantee equitable access to services regardless of what language a person speaks. But it is not clear whether that alone will be enough incentive for the tech industry to move toward eliminating linguistic inequities.
Many people might prefer to talk to a real person when asking questions about a bill or medical issue, or at least to have the ability to opt out of interacting with automated systems when seeking key services. That is not to say that miscommunication never happens in interpersonal communication, but when you speak to a real person, they are primed to be a sympathetic listener.
With AI, at least for now, it either works or it doesn't. If the system can process what you say, you are good to go. If it cannot, the onus is on you to make yourself understood.
This article is republished from The Conversation, a nonprofit, independent news organization bringing you facts and trustworthy analysis to help you make sense of our complex world. It was written by: Roberto Rey Agudo, Dartmouth College
Read more:
Eliminating bias in AI may be impossible – a computer scientist explains how to tame it instead
I unintentionally created a biased AI algorithm 25 years ago – tech companies are still making the same mistake
Building machines that work for everyone – how diversity of test subjects is a technology blind spot, and what to do about it
Roberto Rey Agudo does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Related Articles

Amazon Fire Stick 4K is just $25, its lowest price of the year ahead of Prime Day

Yahoo · an hour ago
Has your TV remote been letting you down lately? Is it slow, laggy and just not as intuitive or responsive as you'd like it to be? Do you get the spinning wheel of content death? Have time to grab a drink and go to the bathroom after you launch a streaming service before it actually loads? We feel you — same here. It's really a bummer, especially with so much excellent content available these days. But there's good news: Right now, Amazon's Fourth of July sale is going on, and the mega-retailer is offering the newest model of its Fire Stick for 50% off. And believe us when we tell you it's a serious upgrade. Let's get into it!

Well, for starters, you're getting Amazon's newest Fire Stick for half off. Have you ever seen any new tech discounted to half its price shortly after release? That alone is enough to make it add-to-cart-worthy. But also, consider this: For about the price of a movie ticket and small popcorn, the Fire Stick 4K offers quick and easy access to more content than you'll ever be able to watch in a lifetime. We're talking 1.5 million TV shows and movies at your fingertips. If you like chilling on your couch and catching the newest season of The Bear or want to be among the first to see Sinners when it's released this month, this deal is for you.

There's a lot to love about the new stick, including that "4K" right in the name. If you have a 4K TV, this device is optimized for your existing setup. It's powered by a quad-core 1.7 GHz processor, which means quicker app starts and more fluid navigation — or, as Amazon says, "It makes getting to the good stuff even easier." And Wi-Fi 6 support means you won't experience lags and other hiccups, even if others in your house are connected to the same router. You can also expect more intuitive, AI-powered search functionality — just hold down the mic button and tell Alexa exactly what you want. Want to see rom-coms with a love triangle? Ask for that. Eager to watch a horror movie that involves the paranormal? Yep, that too.

And if you're a gamer, the 4K stick lets you play Xbox titles, no console required. You can stream popular games like Starfield, Forza Motorsport, Palworld and "hundreds" of other high-quality games directly from the doodad with Xbox Game Pass Ultimate via cloud gaming. These are just a few of the more tangible features. The stick also offers access to Dolby Vision and even lets you control other smart-home tech in your house, like a thermostat or lights.

Over 60,000 five-star fans say Amazon's new streaming stick offers stellar picture quality, intuitive programming and is easy to set up. "I bought this Fire TV Stick thinking it would help me stream The Office for the 47th time in peace. What I didn't expect was for it to basically become my new life coach," said one self-actualized shopper. "The 4K quality? So sharp I started cleaning my living room just because the dust on my TV stand looked extra high-def. The AI-powered search is wild — it's like it knows what I want before I do. My social life may suffer, but my binge game is elite." "I love being able to stream Xbox games without a console," remarked a gratified gamer. "And Wi-Fi 6 keeps everything smooth even when the network's busy. It's the ultimate all-in-one device for streaming and smart-home control." "Setup? Ridiculously simple," shared a final succinct shopper. "Navigation? Smooth like jazz. Lag? Nonexistent. This thing moves faster than I do when someone says 'free snacks in the break room.'"

Cons 👎

While some buyers feel this little device is magic in a stick, others reported some issues. "Amazing amount of apps available," remarked one user before adding that, unfortunately, "Limited memory restricts the amount of apps you can download unless you add an outside memory stick." "I cannot tell you how annoying it is when I pause a show, come back and my TV is off and I have to reopen the app," said another.

If you have Amazon Prime, you'll get free shipping, of course. Not yet a member? No problem. You can sign up for your free 30-day trial here. (And by the way, those without Prime still get free shipping on orders of $35 or more.) The reviews quoted above reflect the most recent versions at the time of publication.

Companies are relying on aptitude and personality tests more to combat AI-powered job hunters

Business Insider · an hour ago

Are you happy? Do you sleep well? Do you have many friends? Are you a workaholic? Those are some of the questions Katelin Eagan, 27, said she had to answer recently when she was applying for a job. She agreed to take a cognitive and personality assessment as part of the hiring process, but was a bit bewildered. Many of the questions had nothing to do with the engineering position, which, after completing the tests and going through several months of silence, she was eventually rejected for.

Eagan says she's been applying for jobs full-time since the start of the year. Her efforts haven't panned out yet, which she attributes partly to how competitive her field has become and employers having room to be picky. "I think there's definitely a lower amount than I thought there would be," she said of available roles.

But that may be only part of the story. Employers are growing increasingly selective, partly because many are seeing a flood of seemingly perfect candidates, many of whom are suspected of using AI to finesse their applications, according to recruiters and hiring assessment providers who spoke to BI. The solution many companies have come to? Make everyone take a test — and see who candidates really are, irrespective of what ChatGPT suggested they put on their résumés.

According to surveys conducted by TestGorilla, one firm that administers talent assessments for employers, 76% of companies that had hired in the 12 months leading up to April said they were using skills tests to determine if a candidate was a right fit, up from 55% who said they were using role-specific skills tests in 2022. Employers seem most interested in testing for soft skills — amorphous qualities like communicativeness and leadership — as well as administering general aptitude and personality tests, Wouter Durville, the CEO of TestGorilla, told Business Insider.

TestGorilla's Critical Thinking test was completed more than 100,000 times in the first quarter of this year, a 61% increase compared to the same quarter in 2024. The firm also offers a Big 5 personality assessment, which was completed more than 127,000 times in the first quarter — a 69% increase compared to last year. Demand among US employers in particular has been "massive," Durville said, adding that many firms have turned to tests as a result of being overwhelmed with job applications. The US is the largest market for the firm, which is based in the Netherlands. "The biggest thing is people just want to hire the best people. It's very selfish and it's fine," Durville said.

Canditech, another firm that offers hiring assessments, says it's also seen rapid growth in the last year. In 2024, assessment usage grew 135% compared to the prior year, CEO Guy Barel told BI. He estimates that assessment usage is on track to soar 242% year-over-year. Barel says the surge is partly due to the job market tipping more in favor of employers. In many cases, companies he works with are flooded with "tons of candidates" and looking to "move forward as fast as possible," he said.

Criteria, another skills-based assessment provider, says test usage has more than doubled in recent years. "AI is kind of creating this authenticity crisis in talent acquisition, because everyone can and is putting their résumé into ChatGPT," Criteria CEO Josh Millet told BI. "It's all about demonstrating your ability or your skill or your personality in an objective way that's a little bit harder to fake."

The AI job market

Jeff Hyman, a veteran recruiter and the CEO of Recruit Rockstars, estimates that demand for testing among his clients has increased by around 50% over the last 18 months. That's due to a handful of different reasons, he said — but companies being inundated by job applications is near the top, thanks to candidates leaning more on AI to gain an edge and send out résumés en masse. Hyman says a typical job he tries to fill for a client has around 300 to 500 applicants, though he's spoken to companies trying to fill roles with more than 1,000 candidates within several days of being posted online. The number of job applications in the US grew at more than four times the pace of job requisitions in the first half of 2024, according to a report from Workday.

Companies also want to test candidates' soft skills as remote work grows more common, Hyman adds — and they want to be sure they're getting the right person. Depending on the size of the organization, a bad hire can cost a company anywhere from $11,000 to $24,000, a survey conducted by CareerBuilder in 2016 found.

According to TestGorilla, 69% of employers who issued tests this year said they were interested in assessing soft skills, while 50% said they were interested in assessing a candidate's cognitive ability. A separate survey by Criteria ranked emotional intelligence as the most sought-after skill among employers, followed by analytical thinking. "It's about their personality and to see if they are a good fit to the organization, if they share the same DNA," Durville said, though he noted that, in many cases, companies find the results of the tests to be shaky as a sole evaluation metric.

TestGorilla, Canditech, and Criteria told BI that employers say they're enjoying the time and cost savings of administering tests. According to TestGorilla, 82% of employers who said they used skills-based hiring — a catch-all term for hiring based on proven skills — said they were satisfied with new hires, compared to 73% of US employers on average. Canditech, meanwhile, claims its assessments can help employers cut down on hiring time by as much as 50% and reduce "unnecessary interviews" by as much as 80%, according to its website.

But Hyman thinks there are some issues with hiring tests. For one, he says employers turn down candidates who don't score well "all the time," despite them being otherwise qualified for the job. The trend also appears to be turning off job candidates. Hyman estimates around 10%-20% of applicants will outright refuse to take a test if employers introduce it as a first step in the hiring process, though that's a practice Canditech's Barel says is becoming increasingly common. Hyman says he frequently has conversations with employers urging them not to put so much weight on test results, due to the potential for a mis-hire. "That's lazy hiring, to be honest. I think that's not the right way to go about it," he said.

Be careful with this feature in iMessage. It almost ruined my life.

Business Insider · 2 hours ago

My iPhone threatens to ruin my reputation, career, marriage, friendships, or entire life. Several times a week. Sometimes, I look down to discover it's been — unbeknownst to me — recording an audio message. With one wrong move, I could send that accidental audio message to, well, anyone.

What might have been in those few minutes of surreptitiously recorded audio? Most likely, just ambient white noise coming from inside my purse or pocket. But it could be terrible! Maybe I was singing along (badly) to the radio. Maybe I was loudly discussing some scandalous social gossip or confidential work information. Maybe I was complaining about my editor. (Brad, I know you're reading this — I would never.) Maybe I was having a particularly cacophonic bathroom experience.

Accidental iMessage recordings happen on other people's iPhones, too

I'm not alone — this is happening to lots of people. When I grumbled about this on Threads, I got dozens of replies from people who were also constantly accidentally recording. There are several Reddit posts about the problem, too. One of those posts contains a pure nightmare: "My phone sent a recording of me peeing to my boss." They said they quickly sent a follow-up text telling their boss the recording was accidental and not to listen. "I have no idea if he heard it. I can only assume he did and, out of respect, never brought it up," the redditor told me over direct message. Another person said they accidentally sent a recording of sexy talk with their spouse to their sister. Yikes!

Of course, sending voice memos and audio recordings can be great! Sometimes, they come in handy when you want to tell a longer story — especially in group chats. The other day, I sent a four-minute audio recording to my friend detailing some gossip about our social circle. But I want to use audio recordings to gossip — not accidentally be the cause of it. ("Did you hear Katie sent a recording of herself in the bathroom to the group?!")

What was driving me nuts was that I couldn't understand why this kept happening. In fact, when I actually want to send an audio recording, I fumble around with how to do it. Hint: It's not the microphone in the text box — that's for speech-to-text. The audio message is buried in the list of options when you hit the "+" sign, sandwiched between Stickers, Apple Cash, Send Later, and Memoji. (Tim Cook, I am looking you dead in the eyes and telling you I will never use Memojis. Stop trying to make Memojis happen.)

I love my iPhone because it usually just works. It's intuitive, and after years of using one, I know how its features work. But here I was, unable to figure out why this kept happening. Was it a bug or user error?

If this is happening on your iPhone, there's a fix

It turns out, the "Raise to Listen" feature is ON by default in iMessage. The feature lets you listen to audio messages by putting the phone up to your ear, but it also works the other way: when you have iMessage open and raise the phone to your ear (or close to it — the phone gets confused sometimes!), it can trigger an audio recording.

Here's how to find it: Go to Settings > Apps > Messages, scroll all the way down until you see "Raise to Listen," and toggle it OFF if you don't want to use it. It might make it slightly more difficult to listen to audio messages, but it will stop the accidental ones. (When I reached out to Apple for comment on my potential life-ruining, they suggested turning off Raise to Listen if it was an issue for me.)

The Raise to Listen feature has been causing weird accidental audio messages since at least 2015, but it seems (in my experience) that it's happened much more often in the last year or so. Now that I've turned the feature off, I can breathe (and poop) easily, knowing I won't accidentally send someone a recording. You should do it, too.
