Latest news with #jobcandidates


Harvard Business Review
5 days ago
- Business
- Harvard Business Review
How AI Assessment Tools Affect Job Candidates' Behavior
According to the World Economic Forum, more than 90% of employers use automated systems to filter or rank job applications, and 88% of companies already employ some form of AI for initial candidate screening. Take Unilever, for example. The consumer goods giant uses AI-driven tools from HireVue to assess early-career applicants, saving 50,000 hours and more than $1 million in the process. Most companies, when considering AI assessment tools, focus on the gains the tools bring in terms of efficiency and quality. But what they don't factor in is how AI assessment may change candidates' behavior during the assessment. Our new research, examining over 13,000 participants across 12 studies, reveals that this is a crucial blind spot. We simulated a variety of assessment situations in both the laboratory and the field, and we collaborated with Equalture, a startup platform offering game-based hiring solutions. The results show that job candidates consistently emphasized analytical traits when they believed AI was evaluating them, while downplaying the very human qualities—empathy, creativity, intuition—that often distinguish outstanding employees from merely competent ones. This drove candidates to present a different and potentially more homogeneous version of themselves, in turn affecting who was likely to succeed in an AI-enabled hiring process, with implications for organizations using AI in hiring, promotions, or admission decisions.

Why This Matters for Your Organization

The implications of our findings extend beyond individual hiring decisions.
When candidates systematically misrepresent themselves, organizations face several critical challenges:

- Talent pool distortion: While AI is sometimes blamed for making biased hiring decisions (for example, discriminating against women in the selection process), our research suggests that knowing one is assessed by AI also biases candidates, leading them to believe they should prioritize their analytical capabilities. As a result, companies may be screening out exactly the candidates they need simply by using AI: that innovative thinker or emotionally intelligent leader you're looking for might present themselves as a rule-following analyst because they believe that is what the AI wants to see.
- Validity compromise: Assessment tools are only as good as the data they collect. When candidates strategically alter their responses, the fundamental validity of the assessment process might be undermined. Organizations may no longer measure authentic capabilities—instead, they may measure what candidates think AI will value the most.
- Unintended homogenization: If most candidates believe AI favors analytical traits, the talent pipeline may become increasingly uniform, potentially undermining diversity initiatives and limiting the range of perspectives in organizations.

Companies like IBM and Hilton, which integrate AI into both hiring and internal promotion systems, must now contend with whether such tools nudge employees toward formulaic self-presentation. New transparency regulations like the EU's AI Act, which require organizations to disclose AI use in high-stakes decisions, make these outcomes all the more likely: when candidates are aware that an AI is assessing them, they are more likely to change their behavior.

What Leaders Can Do

Based on our findings, organizations can take several concrete steps to address the AI assessment effect:

- Radical transparency: Do not just disclose AI assessment—be explicit about what it actually evaluates. Clearly communicate that your AI can and does value diverse traits, including creativity, emotional intelligence, and intuitive problem-solving. This might include providing examples of successful candidates who demonstrated strong intuitive or creative capabilities. Currently, few companies seem to be transparent about what exactly their AI assesses—at least, this information is not easily accessible when clicking through the career pages of many major companies' websites. Meanwhile, applicants discuss and share their intuitions on blogs and in videos, which may be counterproductive because those intuitions may or may not align with actual practices. We advise companies not to leave their candidates to speculate.
- Regular behavioral audits: Implement systematic reviews of your AI assessment outcomes. For instance, New York City has enacted Local Law 144, requiring employers to conduct annual bias audits of AI-based hiring. In response, HireVue, one of the market leaders in AI-based hiring, reports its recent audits for race and gender bias across jobs and use cases. In addition to examining demographic biases, we suggest using these audits to look for patterns indicating behavioral adaptation: Are candidates' responses becoming more homogeneous over time? Are you seeing a shift toward analytical presentations at the expense of other valuable traits?
- Hybrid assessment: Some organizations combine human and AI assessments. Salesforce, for example, notes that besides technology, a human will review applications, while Nvidia and Philip Morris International guarantee that final assessment and decision-making rest with a human. One of our studies shows that while this hybrid human assessment does reduce candidates' tendency to highlight analytical capabilities, it does not eliminate it. To close the gap, you need to train your human hirers to compensate for the AI effect.
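The homogenization check that a behavioral audit looks for can be made concrete. The sketch below is a minimal illustration, not part of the research or any vendor's tooling: the cohort labels, scores, and the use of population variance as the dispersion measure are all invented for the example. It computes how spread out candidates' analytical-trait scores are in each hiring period, so an auditor can spot responses converging over time.

```python
# Hypothetical behavioral-audit check: are candidate responses growing
# more homogeneous across hiring cohorts? All data here is illustrative.
from statistics import pvariance

def homogenization_trend(cohorts):
    """cohorts: ordered list of (label, scores), where scores holds each
    candidate's analytical-trait score for that hiring period.
    Returns per-cohort variance so auditors can spot a downward drift."""
    return [(label, round(pvariance(scores), 3)) for label, scores in cohorts]

cohorts = [
    ("2023-H1", [0.42, 0.61, 0.35, 0.77, 0.50]),
    ("2023-H2", [0.55, 0.60, 0.48, 0.63, 0.52]),
    ("2024-H1", [0.57, 0.59, 0.56, 0.60, 0.58]),
]

trend = homogenization_trend(cohorts)
variances = [v for _, v in trend]
# Strictly shrinking variance across cohorts is a warning sign that
# candidates may be converging on what they think the AI wants to see.
shrinking = all(a > b for a, b in zip(variances, variances[1:]))
print(trend, shrinking)
```

A real audit would use the traits your assessment actually scores and test the trend statistically rather than eyeballing a strict decrease, but the shape of the check is the same: track dispersion over time, not just averages.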
The Path Forward

As AI becomes increasingly embedded in organizational decision-making, we must recognize that these tools do not just change processes—they change people. The efficiency gains from AI assessment may come at the cost of authentic candidate presentation and, ultimately, the human diversity that makes organizations innovative and resilient. The irony is striking: In our quest to remove human bias from hiring, we may have created a system where AI introduces a new form of bias. The solution is not to abandon AI, but to design assessment systems that account for and counteract these behavioral shifts. Only by keeping humans—not just metrics—at the heart of our assessment strategies can we build hiring systems that truly identify and nurture the diverse talent our organizations need.


Entrepreneur
30-06-2025
- Business
- Entrepreneur
Power Your Job Hunt With This $40 AI Platform
Disclosure: Our goal is to feature products and services that we think you'll find interesting and useful. If you purchase them, Entrepreneur may get a small share of the revenue from the sale from our commerce partners.

Did you know entrepreneurs have a harder time getting a job? A study at Rutgers discovered that 35% of recruiters are less likely to interview candidates with entrepreneurial experience. With that hurdle to overcome, these candidates need all the help they can get, and Canyon is ready to provide an assist. Canyon helps users create resumes and land their dream jobs. Right now, a lifetime subscription to the Pro Plan can be yours for only $39.99 (reg. $684) with code CANYON20 until July 20.

Save time and boost your job search with Canyon's AI features

In today's job market, standing out is crucial. Canyon uses AI to help you craft the perfect resume and cover letter, optimizing each for the specific application. This can help you stand out and secure more interviews. It even assigns your resume a Canyon score, providing you with actionable feedback to make it better. Once your resume is perfected, it's time to work on the cover letter. Canyon has an AI cover letter generator, which tailors your letter to both the job description and your personal background in seconds. And you can add a professional-quality headshot too, as Canyon can generate realistic AI headshots to attach to your resume or post on your LinkedIn profile. Unlike other AI resume builders, Canyon is also ready to help you autofill applications. This saves you tons of time in the job application process, instantly personalizing fields on your application. It also tracks your applications, storing them all in one place so you always know where you've applied. Canyon doesn't stop at applications; it also prepares you for interviews with AI-powered mock sessions featuring tailored questions and actionable feedback.

Get a lifetime subscription to the Canyon Pro Plan for just $39.99 (reg. $684) with code CANYON20 until July 20. StackSocial prices subject to change.


The Independent
23-05-2025
- Business
- The Independent
Lawsuit claiming discrimination by the Workday HR program could have huge impacts on how AI is used in hiring
Workday has been sued by multiple job candidates who claim that the human resources software firm's screening technology is discriminatory. The collective action lawsuit could have a huge impact on how artificial intelligence is used in the hiring process. Workday is used by thousands of organizations around the world to recruit new employees. Many companies use it as a first step in the hiring process; it's often the online portal where applications for jobs are submitted. One program that the HR company offers is called HiredScore AI. This service grades candidates through 'unbiased, AI-driven analysis,' according to the firm, and is designed to give recruiters a quick way to sort through candidates. But several job candidates who have submitted applications through Workday have come forward, claiming the company's algorithms are discriminatory, based on age and other factors. Derek Mobley, a Morehouse College graduate with almost a decade of experience in financial, IT and customer service work, sued Workday last year, claiming that its algorithms led to more than 100 job application rejections over seven years because of his age, race and disabilities. Four other job candidates made similar discrimination claims. On Friday, a California district judge ruled that the case can move forward as a collective action lawsuit. All of the plaintiffs are over the age of 40, and they claim that they have collectively submitted hundreds of job applications through Workday, only to be rejected each time. One of the plaintiffs, Jill Hughes, claimed that she received automated rejections for hundreds of submitted job applications 'often received within a few hours of applying or at odd times outside of business hours,' according to court documents. She claimed this indicated 'a human did not review the applications,' the documents read. Workday said in a statement to The Independent: 'We continue to believe this case is without merit.
This is a preliminary, procedural ruling at an early stage of this case that relies on allegations, not evidence. 'The Court has not made any substantive findings against Workday, and has not ruled this case can go forward as a class action. We're confident that once Workday is permitted to defend itself with the facts, the plaintiff's claims will be dismissed.' In a collective action lawsuit, plaintiffs have to opt into it, whereas in a class action lawsuit, a large group of people is included in the case unless they opt out. Workday still has the option to ask for the claims to be handled by the court individually. Mobley's original complaint said that 'too often' algorithmic decision-making and data analysis tools 'reinforce and even exacerbate historical and existing discrimination.' If Mobley and the other plaintiffs are successful in their collective action suit, Workday may have to change its practices. This could have a dramatic effect on how companies use AI in future hiring processes to weed out candidates before a manager can even see their applications.

News.com.au
09-05-2025
- Business
- News.com.au
'Scary' job interview moment caught on video
Picture this: You've polished off your resume, triple-checked your outfit and you're ready to charm your way into a dream job. Then, you log into the virtual meeting room only to be met with a robotic voice instead of the typical friendly manager. This is the disturbing new reality of job interviews in 2025.

In clips posted to social media, job applicants have shared their experiences with the new technology. 'This was so scary guys,' one post was captioned. The clips have gathered over 2.5 million views, and many viewers were quick to brand the new practice 'disrespectful' and 'dystopian'. A woman who appears to be interviewing for a job at international gym chain Club Pilates can be seen dressed in a blazer with her hair and makeup done. She is then met with the voice of an AI assistant who says, 'Hello, I'm Alex the recruiter at Club Pilates.' Before she can even reply, the bot continues to speak, saying, 'Thank you for taking the time to interview today,' before going on to explain the role. 'I just wanted to interview in real life,' the woman told her viewers.

'AI interviews are so disrespectful and dehumanising. You don't want anything to do with this company if this is how they are treating their candidates,' said one viewer. 'If a company doesn't have the decency to use a real human to interview you that shows exactly who the company is,' agreed another. 'Oh wow. This is so unacceptable,' said a third.

AI software is becoming increasingly popular with employers, who are using it as an efficient, cheaper and quicker way to sift through large volumes of applicants. Sometimes, the technology is used to filter likely candidates through to a second, human-led interview. Other times, the AI tool may decide whether a candidate moves forward in the hiring process without any human review. AI interview bots are digital systems powered by artificial intelligence that can ask questions, listen to your answers, and analyse your tone and even your facial expressions.
Big-name companies such as L'Oreal are already implementing this technology to screen thousands of candidates.

Why AI?

Unlike human recruiters, AI bots have the ability to interview 500 people before lunch, making them a time- and cost-effective way of getting through the interview process. In theory, AI's supposed bias reduction gives applicants a more level playing field, where their looks, outfits or even voices are not considered. Data-driven decisions take the hard work out of crunching the numbers, patterns and keywords to determine who's ready for the job. However, the human touch and personal connection are missing.

What's the big problem with AI?

Recruitment expert and workplace consultant Tammie Ballis said the use of AI in interviews can be 'irresponsible and dangerous'. 'When it comes to human factor you still need instinct, you still need to hear the motivation of the candidate and assess their body language. All the things AI can't do,' she said. Ms Ballis has been in the industry for 10 years and believes Australians aren't willing to put up with AI interviewing. She believes that candidates who have their first interview in person are 'more likely to stay for the duration of the recruitment process'. Applicants are not only missing out on human connection, they are also being met with a lack of transparency. Having no idea what criteria the AI bot is using to interview you can mean that key points and creativity are easily lost in translation. 'Because you're not speaking to a real person you can't ask questions. You can't ask for feedback or for them to rephrase the question,' Ms Ballis said. On top of this, using AI presents the obvious issue of malfunctions. In clips shared to social media, people have captured the moment their AI interviews took a turn for the worse. In one video captioned 'I was expecting a real human.
They didn't tell me ahead of time they'd use AI,' a man in a shirt and tie can be seen on a video call with an AI bot. The bot can be heard repeatedly saying 'let's touch base' as it appears to malfunction over and over before becoming incoherent. Ms Ballis believes recruitment is a strictly 'human job', but that agencies can benefit from AI in other ways, such as writing job ads, completing tedious manual tasks or screening resumes.