Latest news with #DerekMobley


Indian Express
26-06-2025
- Business
Rejected 100 times: IT Worker sues Workday, alleges AI hiring bias over age, race and mental health
Derek Mobley, 50, an IT professional from North Carolina, has filed a lawsuit against recruiting software company Workday, alleging that its algorithm unfairly rejected him from over 100 job applications between 2017 and 2019 due to his age, race, and disabilities.

According to a Wall Street Journal report, Mobley noticed a pattern: despite being qualified, he was either quickly turned down – sometimes within an hour – or never received a response. What stood out to him was that most of the companies he applied to used Workday's hiring platform.

Mobley, an African-American who lives with anxiety and depression, said the repeated rejections didn't add up. 'There's a standard bell curve in statistics. It didn't make sense that my failure rate was 100%,' he told reporters. 'It dawned on me that this must be some kind of server reviewing these applications and turning me down.'

In 2023, Mobley filed a lawsuit claiming that Workday's algorithmic screening system discriminated against him, flagging his age, race, and mental health as undesirable traits. He also pointed to personality tests built into the application process that may have detected his conditions.

Last month, a California court ruled that Mobley's case could move forward – a potentially landmark decision that could open the door to lawsuits from millions of job seekers over the age of 40.

Workday has pushed back, stating that its system simply matches CV keywords with job requirements and assigns scores based on those matches. While employers can include screening questions that trigger automatic rejections, the company emphasised that the final hiring decisions rest with the employers themselves. 'There's no evidence that the technology results in harm to protected groups,' the company said in a statement.

However, hiring experts warn that automated scoring systems can inadvertently filter out qualified candidates, for reasons ranging from resume gaps to subtle mismatches with job criteria.
Mobley's case brings renewed attention to the hidden biases within AI-driven recruitment and the challenges faced by marginalised job seekers in the age of algorithmic hiring.


Time of India
25-06-2025
- Business
Hiring software Workday's AI may have an 'ageist' problem, company claims 'they are not trained to…'
Hiring software provider Workday is facing a class action lawsuit that alleges its AI-powered job applicant screening system has an "ageist" problem. The lawsuit claims that Workday's AI-powered system discriminates against candidates aged 40 and over. The lawsuit builds on an employment discrimination complaint filed last year by Derek Mobley against the company. Mobley's initial suit alleged that the company's algorithm-based system discriminated against applicants based on race, age, and disability. According to a report by Forbes, four more plaintiffs have now joined the lawsuit, specifically accusing Workday of age discrimination.

What Workday said about the lawsuit

In an email sent to Forbes, a Workday spokesperson denied allegations that their technology contains bias. The company said, 'This lawsuit is without merit. Workday's AI recruiting tools do not make hiring decisions, and our customers maintain full control and human oversight of their hiring process. Our AI capabilities look only at the qualifications listed in a candidate's job application and compare them with the qualifications the employer has identified as needed for the job. They are not trained to use—or even identify—protected characteristics like race, age, or disability. The court has already dismissed all claims of intentional discrimination, and there's no evidence that the technology results in harm to protected groups.'

The Workday spokesperson also noted that the company has recently implemented measures to ensure that the software they use adheres to ethical standards.

Hidden bias in AI hiring tools and why automation may not always be fair

As per data compiled by DemandSage, an estimated 87% of companies are reportedly using AI for recruitment in 2025.
The report notes that these companies rely on tools like Workable, Bamboo HR, and Rippling for recruitment. While these systems help automate hiring, a study by the University of Washington from last year revealed that they are often biased. AI tools can show racial, gender, and socioeconomic biases from the data they're trained on or from the algorithms themselves. One example is Amazon's scrapped AI tool that discriminated against women; another is resume filters that favour elite education or specific language patterns, often excluding underrepresented groups.


Hindustan Times
24-06-2025
- Business
Rejected from 100 jobs, IT worker sues Workday over hiring software discrimination
An IT worker in North Carolina has sued software firm Workday, alleging its recruiting platform rejected over 100 of his job applications over several years based on his age and race, a report by the Wall Street Journal said.

Derek Mobley said that he applied for more than 100 jobs from 2017 to 2019 and later, but was met with rejection or silence each time, with emails turning him down arriving late in the night or just an hour after he submitted his application. The 50-year-old noticed that most of the companies he applied to used an online recruiting platform created by Workday, which helps track and screen job candidates.

'It did not make sense'

In 2023, Mobley sued Workday, claiming its algorithm screened him out based on his age, race and disabilities. Mobley, an African-American man who suffers from anxiety and depression, said that he applied only for jobs he knew he was qualified for, so the multiple rejections just did not make sense. 'There's a standard bell curve in statistics. It didn't make sense that my failure rate was 100%,' he said.

Last month, a federal court in California ruled that Mobley's claim could proceed, opening the door to millions of potential claims from job seekers over 40.

Workday has argued that its software matches keywords on CVs with job qualifications and scores applicants. It said that while employer clients can set up questions that lead to automatic rejections, the software lets the employer make the final decision on candidates. 'There's no evidence that the technology results in harm to protected groups,' the company said.

Did not get one interview

However, Mobley said that he applied for over two years and didn't get a single interview. 'It dawned on me that this must be some kind of server reviewing these applications and turning me down,' he said.
He alleged that the software rejected him because it picked up on his age, race and detected his anxiety and depression through personality tests. Experts have said scoring systems can reject qualified workers for various reasons, such as gaps in their resumes or a lack of matching qualifications.


Mint
23-06-2025
- Business
Millions of résumés never make it past the bots. One man is trying to find out why.
U.S. job hunters submit millions of online applications every year. Often they get an automatic rejection or no response at all, never knowing if they got a fair shake from the algorithms that gatekeep today's job market. One worker, Derek Mobley, is trying to discover why.

Mobley, an IT professional in North Carolina, applied for more than 100 jobs during a stretch of unemployment from 2017 to 2019 and for a few years after. He was met with rejection or silence each time. Sometimes the rejection emails arrived in the middle of the night or within an hour of submitting his application. Mobley, now 50 years old, noticed that many of the companies he applied to used an online recruiting platform created by software firm Workday. The platforms, called applicant tracking systems, help employers track and screen job candidates.

In 2023 Mobley sued Workday, one of the largest purveyors of recruiting software, for discrimination, claiming its algorithm screened him out based on his age, race and disabilities. Mobley, a Black graduate of Morehouse College who suffers from anxiety and depression, said the math didn't add up. He says he applied only for jobs he believed he was qualified for. 'There's a standard bell curve in statistics. It didn't make sense that my failure rate was 100%,' said Mobley, who has since been hired and twice promoted at Allstate.

His suit is now emerging as the most significant challenge yet to the software behind nearly every hiring decision these days. Last month, after several failed challenges by Workday, a federal judge in California said Mobley's age-discrimination claim could proceed, for now, as a collective action. The ruling opens the door to millions of potential claims from job seekers over the age of 40. While the judge has ruled that Workday didn't intentionally discriminate against Mobley, she left open the door for him to prove that Workday's technology still had the effect of penalizing him because of his age.
She hasn't addressed the race and disability claims. Mobley still has a tough case to prove, and the suit may go through years of legal wrangling. Yet the case could force Workday to part the curtains on how its algorithm scores applications, a process that has remained a black box since job searches began moving online decades ago.

'Hiring intermediaries have pretty much been excused from regulation and they've escaped any legal scrutiny. I think this case will change that,' said Ifeoma Ajunwa, a professor at Emory University School of Law and author of 'The Quantified Worker.'

Workday says Mobley's claims have no merit. It said its software matches keywords on résumés with the job qualifications that its employer-customers load for each role, then scores applicants as a strong, good, fair or low match. While employer clients can set up 'knockout questions' that lead to automatic rejections, for example, asking if a person has legal authorization to work in the U.S. or is available for weekend shifts, the software is designed so employers make the final decisions on candidates who make it through the initial screen, Workday argued in court filings. 'There's no evidence that the technology results in harm to protected groups,' the company said.

Before his job search, Mobley's career path hadn't been smooth. He was laid off in the recession that followed the Sept. 11, 2001, terrorist attacks and again after the housing meltdown in 2008. After that, he left finance and transitioned to what he viewed as a more recession-proof career in technology, earning an associate degree in network system administration. Still, steady jobs were hard to come by. He spent a year as a contractor at IT firm HPE, hoping the stint would turn into a permanent position. Mobley said he was let go, and he later joined a lawsuit against HPE alleging age and race discrimination. The case was settled in 2020. HPE declined to comment.
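The screening flow Workday describes in its filings, knockout questions first, then keyword matching against employer-loaded qualifications with a strong/good/fair/low match score, can be sketched in a few lines. Everything below (function and field names, tier cutoffs) is an illustrative assumption, not Workday's actual implementation:

```python
# Illustrative sketch of the screening flow described in Workday's court
# filings: employer-defined knockout questions first, then keyword matching
# against the qualifications loaded for the role. All names and thresholds
# here are hypothetical, not Workday's actual code.

def screen_applicant(resume_keywords, job_qualifications, knockout_answers):
    """Return ('rejected', None) or ('passed', match_tier)."""
    # Any failed knockout question (e.g. "Are you authorized to work
    # in the U.S.?") triggers an automatic rejection.
    if not all(knockout_answers.values()):
        return ("rejected", None)

    # Score = fraction of required qualifications found on the resume.
    required = {q.lower() for q in job_qualifications}
    found = required & {k.lower() for k in resume_keywords}
    score = len(found) / len(required) if required else 0.0

    # Bucket the score into the strong/good/fair/low tiers the company
    # says it reports; the cutoffs are invented for illustration.
    if score >= 0.8:
        tier = "strong"
    elif score >= 0.6:
        tier = "good"
    elif score >= 0.4:
        tier = "fair"
    else:
        tier = "low"
    return ("passed", tier)
```

Note that nothing in this sketch reads age, race or disability directly; the dispute is over whether such facially neutral matching can still produce disparate outcomes through what the keywords correlate with.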
That job loss led to two years of unemployment, starting in 2017. He applied to more than 100 jobs and found himself on Workday's recruiting platform over and over. He didn't get a single interview, let alone a job. Soon, Mobley felt he discerned a pattern. 'It dawned on me that this must be some kind of server reviewing these applications and turning me down.'

He worried that hiring software screened him out because it picked up on his age and race through details on his résumé, or that it detected his anxiety and depression through personality tests he took as part of some job applications. The frustrations of the job search weighed on his emotional health, credit and retirement savings, he said. He stayed afloat by driving for Uber and working short-term jobs.

Mobley eventually did find a job, the old-fashioned way. In 2019, he said, a recruiter for Allstate called him. A phone screen led to an interview with a hiring manager and then an offer. He is now a catastrophe controller, managing the workflow of customers' property and auto damage claims.

Mobley said he suspects Workday's software flagged his profile, essentially blackballing him across its entire system, regardless of which company he applied to. Workday disputes that idea, and HR technology experts are skeptical of the theory. Employers customize recruiting software with their own criteria, they say, creating closed systems that shouldn't, in theory, speak to each other.

But there is evidence that underlying scoring algorithms can shut out certain job seekers, said Kathleen Creel, a computer scientist at Northeastern University who has been following the Workday case. That might happen, she said, through mechanical errors such as misclassifying a previous job title, or through more complicated algorithmic mistakes that penalize members of a single group or people with certain combinations of characteristics.
Such scoring systems can disadvantage qualified workers, according to researchers at Harvard Business School, who have found that the systems effectively screen out millions of workers by scoring them low for all kinds of reasons, such as having gaps in their résumés or not matching every qualification listed on a lengthy job description. The researchers didn't test for illegal discrimination, such as discrimination based on age, gender or race.

Since 2022, Workday has built a team focused on ensuring its products meet ethical artificial-intelligence standards. 'Our customers want to know, can I trust these technologies? How were they developed?' Kelly Trindel, who leads the ethical AI team, said at a conference this month at New York University Law School.

Still, the company has fought some efforts to regulate automated hiring tools. In 2023, a New York City law went into effect requiring employers that use technology like chatbot interviewing tools and resume scanners to audit them annually for potential race and gender bias, and then publish the results on their websites. When the bill was proposed, Workday argued to loosen some of the rules.

If Mobley succeeds, software companies and their customers may be required to do more due diligence and disclosure to ensure they don't enshrine bias. Employment lawyers say any finding of liability could open the door to job seekers also suing employers who use such tools.

'This isn't a personal vendetta,' Mobley said. 'I'm an honest, law-abiding person trying to just get a job in an honest way.'


Forbes
23-06-2025
- Business
What The Workday Lawsuit Reveals About AI Bias—And How To Prevent It
HR and finance software company Workday, Inc is facing a collective-action lawsuit based on claims that the artificial intelligence used by the company to screen job applicants discriminated against candidates 40 years old and over. In 2023, Derek Mobley filed an employment discrimination lawsuit against Workday, alleging that its algorithm-based job applicant screening system discriminated against him and other applicants based on race, age and disability. Four additional plaintiffs have now accused the company of age discrimination.

A Workday spokesperson refuted the claims to HR Dive, stating: 'This is a preliminary ruling at an early stage of this case, and before the facts have been established. We're confident that once those facts are presented to the court, the plaintiff's claims will be dismissed.'

Data compiled by DemandSage estimates that in 2025, 87% of companies use AI for recruitment. Applicant tracking systems like Workable, Bamboo HR, Pinpoint ATS, and Rippling, which employers use to help manage recruitment and hiring, rely on AI to streamline and automate the process. Companies are leaning heavily on AI to make crucial recruitment and hiring decisions, but these tools are often laden with bias. One example was an AI recruiting tool used by Amazon's machine-learning specialists, which was found to discriminate against women; the company scrapped the tool in 2018. A 2024 study from the University of Washington revealed racial and gender bias in AI tools used to screen resumes.
There can be data bias, which occurs when AI systems are trained on biased data containing an overrepresentation of some groups (white people, for example) and an underrepresentation of other groups (non-white people, for example). This can result in an AI tool that rejects qualified job candidates because it was trained on biased data.

There is also algorithmic bias, which can include developer coding mistakes, where a developer's biases become embedded in an algorithm. An example is an AI system designed to flag job applicants whose resumes include certain terms meant to signal leadership skills, like 'debate team,' 'captain' or 'president.' These key terms could end up filtering out job candidates from less affluent backgrounds or underrepresented racial groups, whose leadership potential might show up in non-traditional ways.

Two other types of bias, proxy data bias and evaluation bias, can also show up in recruitment and hiring tools. Proxy data bias is the bias that appears when proxies, or substitutes, are used for attributes like race and gender. An example is an algorithm that prioritizes job candidates who attended Ivy League or other elite institutions, which may filter out candidates who went to historically Black colleges and universities (HBCUs), community colleges or state schools.

Evaluation bias is the bias that results when evaluating the data. An example is an organization assessing candidates for culture fit (which is notoriously biased) and training an AI tool to prioritize job candidates who have particular hobbies listed on their resumes or who communicate in particular ways, which can disadvantage candidates from cultures outside the dominant norm.
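The proxy-data problem described above can be made concrete with a toy example. Nothing here comes from any real vendor; the school list, weights, and candidates are invented to show how a facially neutral feature can act as a stand-in for race and class:

```python
# Toy illustration of proxy bias: a scorer that never sees race but
# rewards "elite" schools, a feature correlated with race and class.
# School lists, weights, and candidates are invented for this example.

ELITE_SCHOOLS = {"harvard", "yale", "princeton", "stanford"}

def school_score(school):
    # +1 bonus for an "elite" school, 0 otherwise. Candidates from HBCUs,
    # community colleges, or state schools never get the bonus, so the
    # proxy systematically ranks them lower.
    return 1.0 if school.lower() in ELITE_SCHOOLS else 0.0

candidates = [
    {"name": "A", "school": "Harvard", "skills": 3},
    {"name": "B", "school": "Morehouse College", "skills": 3},  # HBCU
]

# Equal skills, but the proxy feature splits the ranking:
# candidate A is ranked above candidate B despite identical skills.
ranked = sorted(candidates,
                key=lambda c: c["skills"] + school_score(c["school"]),
                reverse=True)
```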
As more organizations use AI to help with employment decisions, several steps should be taken to mitigate the bias often baked into these tools.

First, workplaces that utilize AI tools for hiring, selection and recruitment decisions should demand transparency from vendors to gain a deeper understanding of how the data was trained and what is being done to ensure it has been audited for bias related to factors like race, gender, age and disability. In addition, companies should request frequent audits from vendors to assess AI tools for bias.

It's important for organizations to partner with experts in ethical AI usage in the workplace to ensure that when AI is integrated into workplace systems, there are safeguards in place. For example, an expert may assess whether job candidates from HBCUs are being filtered out of the talent pool.

When using AI in any capacity in the workplace, it's helpful to seek guidance from your legal counsel or legal team to ensure AI tools are compliant with local and state laws. Transparency also applies to workplaces: organizations should be candid about AI usage during the employment process and should always consider alternative evaluation methods.

AI has, in many ways, made our lives easier, more convenient and more accessible, but there are valid concerns when it comes to AI usage and fairness. If equity is the goal and your workplace uses AI for recruitment and hiring decisions, it's good to trust the AI (to a reasonable extent) but always verify. AI is a powerful way to complement the employment process but should never replace human oversight.
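Bias audits like those recommended above often begin with a simple disparate-impact check such as the "four-fifths rule" from U.S. employment-selection guidance: if any group's selection rate falls below 80% of the highest group's rate, the tool warrants scrutiny. A minimal sketch, with invented group names and counts:

```python
# Minimal disparate-impact check based on the four-fifths rule: flag any
# group whose selection rate is below 80% of the highest group's rate.
# Group names and counts below are invented for illustration.

def four_fifths_check(selected, applied):
    """selected / applied: dicts mapping group name -> counts."""
    rates = {g: selected[g] / applied[g] for g in applied}
    best = max(rates.values())
    # Impact ratio for each group relative to the best-performing group.
    return {g: {"rate": r, "ratio": r / best, "flag": r / best < 0.8}
            for g, r in rates.items()}

result = four_fifths_check(
    selected={"under_40": 50, "over_40": 20},
    applied={"under_40": 100, "over_40": 100},
)
# over_40 rate 0.20 vs under_40 rate 0.50: ratio 0.4, so over_40 is flagged.
```

Real audits (for instance, those required annually under New York City's law) go further, with statistical significance tests and intersectional breakdowns, but the impact-ratio calculation above is the usual starting point.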