
AI Bias In Hiring Is An HR Problem, Not A Tech Issue

Forbes



Stacey Tara leads People & Ops at Exa, an AI search startup in San Francisco. She writes about hiring, ops, culture and decision frameworks.

AI tools have quickly changed how companies hire. They speed up decisions, reduce admin work and promise objectivity. But that promise doesn't always hold. AI systems rely heavily on past hiring data, so if a company historically favored certain demographics, consciously or unconsciously, those biases get encoded into the algorithms. A prominent example: Amazon discontinued its AI recruiting tool after discovering it consistently favored male candidates.

Too often, HR teams assume AI bias is a technical problem. It's not. Engineers might build the tools, but AI learns from existing data and replicates its patterns. That means it's up to HR to set the standards and clarify what success looks like. Otherwise, history will repeat itself.

HR Is Responsible For Addressing AI Bias

As a hiring leader, you won't see AI bias emerge as a massive system failure. It's often as simple as a qualified candidate getting filtered out because they went to a nontraditional school, took a career break or used different phrasing on a resume. These aren't red flags; they're signals of diverse experience. But without clear guidance, AI tools often penalize them. Over time, this narrows the talent pool and can reinforce homogeneity across teams.

Bias in hiring doesn't just raise ethical issues; it directly impacts business performance. Diverse, inclusive companies are more likely to outperform their competitors, and teams with diverse perspectives have fewer knowledge gaps and make better decisions. Biased hiring can also erode trust internally. Employees notice when diversity efforts are performative or ineffective, which can damage culture, morale and, eventually, the company's external reputation.

It's clear that unbiased hiring processes are a strategic necessity. And only you and your hiring team can teach AI tools what that means.

4 Steps HR Leaders Can Take To Mitigate AI Bias

Unfortunately, there's a critical gap in companies' AI governance. According to McKinsey's recent Global Survey on the State of AI, only 13% of respondents said their company employs AI compliance specialists. Even fewer have dedicated AI ethics teams. When it comes to AI-powered hiring, HR can fill the leadership vacuum by taking these actions.

1. Audit the tools regularly. Checking on your AI tools' effectiveness should be routine. Don't just ask if they're working properly; check who they work for. For example, are certain groups being screened out more often? (For a concrete illustration, see the sketch at the end of this article.)

2. Get the data right. Work with the technical team to ensure training data includes diverse, representative profiles. If the data is biased, the AI will be too.

3. Be transparent with candidates. To build trust during the hiring process, let candidates know if AI is part of decision making. Provide context for how applications are evaluated, and make it clear that the company is thinking critically about fairness.

4. Keep humans in the loop. Don't let AI make the final call. Human reviewers are vital because they can spot nuance and context that algorithms miss.

Final Thought

AI bias in hiring isn't a bug for engineers to patch. It's a systemic risk that HR must own. Hiring teams are in the best position to drive change by making the hiring process fair, accountable and human-first, even when machines are involved.
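To make the audit in step 1 concrete, here is a minimal sketch of one common screening check, the EEOC's four-fifths rule: if any group's selection rate falls below 80% of the highest group's rate, that stage of the funnel deserves a closer look. This example is not from the article; the data, group labels and function names are hypothetical, and a real audit would run on your applicant-tracking exports with legal and technical review.

```python
# Hypothetical four-fifths-rule audit of an AI screening stage.
# All groups, counts and names below are illustrative assumptions.
from collections import Counter

def selection_rates(outcomes):
    """outcomes: iterable of (group, passed_screen) -> pass rate per group."""
    applied = Counter(group for group, _ in outcomes)
    passed = Counter(group for group, ok in outcomes if ok)
    return {g: passed[g] / applied[g] for g in applied}

def four_fifths_flags(rates, threshold=0.8):
    """Return groups whose rate is below `threshold` times the best group's rate."""
    best = max(rates.values())
    return {g: round(r / best, 2) for g, r in rates.items() if r / best < threshold}

if __name__ == "__main__":
    # Fabricated screening outcomes: (self-reported group, passed AI screen)
    outcomes = ([("group_a", True)] * 60 + [("group_a", False)] * 40
                + [("group_b", True)] * 35 + [("group_b", False)] * 65)
    rates = selection_rates(outcomes)
    print("Selection rates:", rates)                              # a: 0.60, b: 0.35
    print("Flagged (ratio to best):", four_fifths_flags(rates))   # group_b: 0.58
```

Run on a regular cadence, a check like this turns "audit the tools" into a number a hiring team can track. A flagged ratio is a prompt for human review, not an automatic verdict of bias.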
Forbes Human Resources Council is an invitation-only organization for HR executives across all industries.
