Latest news with #genderbias


Telegraph
a day ago
- Politics
The MeToo movement has made men scared to mentor women at work
Are you afraid of the colleague who easily wells up at work? For hundreds of male bosses, images of Rachel Reeves crying behind Sir Keir Starmer in the House of Commons this month triggered one of their great fears. Even the financial markets waded in with a response.

Seemingly unaware of the tears streaming down his colleague's face, Sir Keir pressed ahead with Prime Minister's Questions and later admitted he was one of the last to notice that she was crying owing to the intense nature of the weekly Commons grilling. Many sympathised with the Chancellor – after all, we're human. But another group quietly sympathised with the Prime Minister.

A City headhunter told me over lunch last week that male bosses privately admit to being terrified of women crying at work, so much so that they avoid giving them blunt feedback in the same way they might with men. Crying breaks the rules of how we're told to behave in the workplace, and so the bosses who admit to avoiding potential weepers would rather live without the awkwardness.

Ingrained sexism doesn't help. Studies have shown that women who cry at work are considered weak, unprofessional or manipulative, while the assumption with teary men is that they must be going through a tricky time in their life.

It's notable that it has been mostly senior women who have supported Reeves. In an interview in front of a business audience, Rain Newton-Smith, head of the Confederation of British Industry, said that 'a lot of female leaders' had been in touch with her to convey solidarity with the Chancellor.

The problem is that these senior women are a minority, with most businesses still run by men. And if displays of emotion at work give men the jitters, then female colleagues end up suffering. Why? Because the frank feedback that is needed to better their careers is never given if bosses hold back out of fear of saying the wrong thing. It's an issue that has existed for decades.
One female banker says she remembers a former boss at a top US bank who was a 'real guy's guy' giving a career talk to a group of senior women in London in the 1990s. 'He shared how he was always paranoid about giving women difficult feedback as he didn't want to make them cry,' she recalls. 'He was fabulously honest that as a result he realised he did not give the women the really tough feedback that he'd give his male mentees and protégés. That would then put the women at a disadvantage as they wouldn't benefit from constructive feedback and they were more likely to continue to make mistakes or be blind-sided by later stage feedback'.

This all got much worse after the MeToo movement took off, with data from Lean In revealing in 2019 that 60pc of male managers felt uncomfortable mentoring younger women, up 32pc from the previous year. As Mark Zuckerberg's former lieutenant Sheryl Sandberg said at the time, the MeToo movement kicked off a new era. 'Ugly behaviour that once was indulged or ignored is finally being called out and condemned,' she said. 'Now we must go further. Avoiding and isolating women at work – whether out of an overabundance of caution, a misguided sense of decorum, irritation at having to check your words or actions, or any other reason – must be unacceptable too'.

I suspect those figures wouldn't be so high today, because most people are fully capable of telling the difference between sexual harassment and a normal conversation. But the underlying fears remain among some men when it comes to mentoring women. Conversations should obviously be thoughtful and as kind as possible, but the fear of giving feedback that could lead to displays of emotion such as crying is ridiculous. After all, getting teary-eyed on the job is surprisingly common. A YouGov poll last year found that over half of staff have cried at work. A decent boss won't avoid having tough conversations just because it might be uncomfortable. Weak managers need training.
There are far too many consultancies selling pointless online classes which teach staff the most basic things about acceptable behaviour. Actual examples include 'do not sexually harass a colleague' and avoid 'unwelcome massaging' in the office (sent to top lawyers). None of this box-ticking rubbish seems to be reforming any bad apples out there. Meanwhile, useless managers are blindly ignoring half of their workforce.

A better use of time and money would be to teach these bosses how to deal with big emotions and awkward conversations at work, so that women especially don't continue to lose out. A report by PwC last week found that it's still set to take more than 40 years for the gender pay gap to close. Surely one of the many problems here, and one which is rarely discussed, is that women simply aren't getting the same detailed level of feedback as men.

As one boss puts it: 'I think male managers perhaps feel more comfortable being direct with male staff but in the end the manager's job is to manage staff, whether male or female. If they shrink from difficult conversations for fear of offence then they're not doing their job'. Yet we need to acknowledge that exactly this is taking place inside businesses across the country. Pass the tissues, will you?


Forbes
26-06-2025
- Business
The Rise Of False AI Insights: When More Data Means More Problems
Apple found itself in hot water when news broke that the Apple Card was offering lower credit limits to women, regardless of their financial profiles. The fault lay with the AI algorithm used to analyze creditworthiness; the training data for the algorithm included historical lending data that reflected gendered biases.

The lesson here is straightforward: AI makes mistakes. Sometimes AI hallucinates, filling in gaps in data to provide a 'made up' answer—a phenomenon where the model generates plausible but incorrect information. But sometimes AI also generates misinformation and false insights. It provides an answer based on real data, but data that is outdated, incomplete or inaccurate.

For example, say you ask AI to pull average salaries for a role at your company. For some reason, the 2020 salary document was opened more recently than the document for 2024. The AI tool thinks the 2020 document is the most relevant, so it uses that data to provide an answer. It's a false AI insight—AI provided an answer based on real data, just not the correct data.

As leaders increasingly turn to AI for help with data-driven decision-making, it's important to understand how false insights occur, how to prevent them and what this means for data governance.

How Does Too Much Data Lead To False AI Insights?

It's no secret that many organizations are drowning in data. On average, organizations leverage nearly 900 apps, with each modern business application collecting and running on a staggering amount of data. This creates a critical challenge: AI requires masses of clean data to produce trustworthy, valuable results. Companies certainly have the volume of necessary data. But where most organizations fall short is in the quality of their data.
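The stale-salary scenario described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual retrieval logic; the file names, fields and ranking heuristics are invented for the example. The point is that a "most recently opened" relevance signal and a "most recent data" freshness signal can disagree, and only the latter protects against false insights for time-sensitive questions.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Doc:
    name: str
    data_year: int       # the year the figures inside actually describe
    last_opened: date    # activity signal a naive ranker might trust

# Hypothetical document store: the 2020 report was opened more recently.
docs = [
    Doc("salaries_2024.xlsx", 2024, date(2025, 1, 10)),
    Doc("salaries_2020.xlsx", 2020, date(2025, 6, 1)),
]

def pick_by_recency(docs):
    # Naive "relevance" heuristic: most recently opened document wins.
    return max(docs, key=lambda d: d.last_opened)

def pick_by_freshness(docs):
    # Safer heuristic for time-sensitive facts: newest underlying data wins.
    return max(docs, key=lambda d: d.data_year)

stale = pick_by_recency(docs)    # selects the 2020 document
fresh = pick_by_freshness(docs)  # selects the 2024 document
```

Answering from `stale` is exactly the false insight the article describes: the data is real, just not the correct data.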
It's impossible to clean up all their data and properly govern each data point—there's simply too much of it. Consider the possibilities for fake insights from AI when employees start using a tool like Microsoft Copilot. Copilot scans all the documents across the entire system for answers to questions, crafting a response based on what seems relevant. It's not always right. Copilot could pull data from an outdated document from a long-gone employee—not exactly a relevant or trusted source of information.

What's more, with new tools such as Microsoft Fabric, a cloud-based, end-to-end data analytics platform, employees are more empowered than ever before to access and act on data. While this creates massive opportunities for organizations, it also multiplies the potential for exposing AI to ungoverned, unmanaged and inaccurate data.

It's a catch-22. Governing every piece of data isn't feasible, but letting AI access ungoverned data leads to unreliable results. And restricting AI to only well-governed data may limit its usefulness. So what's the solution? How can leaders harness the power of AI and ensure it doesn't just produce misleading insights? What's needed is a new mindset around governance.

Prevent AI Misinformation With A New Output Governance Mindset

The age of AI requires a new governance mindset. What's out: governing all the individual data points. What's in: governing the outputs of AI tools through end-to-end testing strategies. This change in approach will allow organizations to encourage innovation and take advantage of AI while also mitigating the risks of fake insights leading to poor data-driven decision making.

Big picture, this new governance framework allows teams to access a broad array of data—including raw or ungoverned data—to build automation tools. But before a tool is brought to production, it must go through a governance checkpoint to evaluate the model and its outputs using standard test cases.
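A governance checkpoint of this kind can be sketched as a simple pre-production gate: run the tool against a standard suite of cases and only promote it if enough pass. Everything below is hypothetical—the function names, the toy stand-in tool and the test cases are invented; a real checkpoint would run a live model against a curated, domain-specific suite.

```python
# Minimal sketch of a pre-production output-governance gate (names hypothetical).

def governance_checkpoint(tool, test_cases, required_pass_rate=1.0):
    """Return True if the tool may be promoted to production."""
    passed = sum(1 for prompt, expected in test_cases if tool(prompt) == expected)
    return passed / len(test_cases) >= required_pass_rate

# Toy stand-in for an AI tool that answers from a fixed table.
def salary_tool(prompt):
    answers = {"average salary, role X, 2024": "£52,000"}
    return answers.get(prompt, "unknown")

cases = [
    ("average salary, role X, 2024", "£52,000"),
]

ready = governance_checkpoint(salary_tool, cases)  # True: safe to promote
```

The design choice worth noting is that the gate evaluates outputs, not the underlying data points—which is exactly the mindset shift the article argues for.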
The scale and speed with which these innovations occur require that the testing framework leverage automation to keep up. Skipping this governance checkpoint essentially means letting people create powerful but untested tools for decision making, which could be disastrous to an organization's future success.

In addition to a governance checkpoint, each AI tool should be closely monitored during its first 90 days of deployment. This period requires proactive monitoring, with a plan to transition to reactive monitoring once the team gains confidence in the tool's performance. Proactive monitoring involves direct human oversight—reviewing logs, evaluating test cases and using AI-based guardrails to observe the tool's behavior in real time. Once the tool has demonstrated reliability, the team can shift to reactive monitoring, which relies on other AI systems to detect anomalies and trigger alerts when potentially unacceptable behavior occurs.

Good output governance means using AI to help govern AI. Think of it like this: the AI doing the actual work—like analytics—is the adult in the room, capable of complex reasoning. The AIs that monitor it are more like kids: they don't always get the big picture, but they're great at shouting, 'Hey! That's not okay!' when something clearly breaks the rules.

Another tactic to prevent AI misinformation and inspire confidence in AI output is to require AI tools to include annotations in their responses. With every factual question an employee asks of an AI tool, it should list where it's pulling the data from. Employees can quickly scan the annotations and decide whether the data sources are trustworthy and make sense. (Needless to say, annotations are most appropriate for AI tools intended for internal use.)

AI requires masses of data to work correctly. Organizations have no shortage of data, but most struggle to apply data governance to their thousands upon thousands of data points.
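The annotation tactic described above can be sketched as a thin wrapper that refuses to return any answer without its sources attached. This is an assumed design, not a real product's API; the function name and the sample source path are invented for illustration.

```python
# Hypothetical sketch: force every factual AI answer to carry source
# annotations, so readers can judge whether the underlying data is trustworthy.

def annotate(answer: str, sources: list[str]) -> str:
    """Append a source list to an answer; refuse unannotated answers."""
    if not sources:
        raise ValueError("refusing to return an answer with no cited sources")
    lines = [answer, "", "Sources:"] + [f"- {s}" for s in sources]
    return "\n".join(lines)

reply = annotate("Average salary for role X is £52,000.",
                 ["hr/salaries_2024.xlsx"])
```

A reader scanning `reply` can immediately see that the figure came from the 2024 file rather than a stale one—the quick trust check the article recommends.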
The solution isn't to lock data away or simply let AI loose on ungoverned data. Rather, leaders need to reconsider their governance mindset, putting in place a robust end-to-end testing strategy for any new AI tool to ensure its outputs are accurate and to reduce the likelihood of false insights leading to poor decision making. By shifting their mindset from data governance to output governance, organizations can unlock AI's potential—without falling victim to AI misinformation.


Yahoo
23-06-2025
- Entertainment
TV host Zhu Dan apologises for her "bias" towards young male co-stars
23 Jun - TV personality Zhu Dan found herself having to apologise to netizens after sparking a heated online debate over her perceived bias toward males.

The whole issue began earlier this month after Zhu Dan, who is on the reality show "Wonderland", gave two chicken drumsticks from a bowl of chicken soup to actors Winwin and Zhou Yiran in one episode, commenting that "the boys are still growing". On the other hand, she handed the smaller chicken wings to younger actress Ouyang Didi and Chinese star Ning Jing. When Ning Jing questioned the logic behind Zhu Dan's distribution of the pieces of chicken and asked if she loved the boys more, Zhu Dan directly responded, "A little."

She also made a comment when Zhou Yiran mentioned that he would prefer to have a daughter instead of a son, saying, "Opposites attract; parents are instinctively biased toward children of the opposite sex... Fathers love daughters more, and mothers love sons more."

In response to her statement, netizens took to social media to criticise that way of thinking, saying that such an idea should not be the norm. They also dug up a past interview in which Zhu Dan said that she had arranged for her six-year-old daughter to be in the same class as her three-year-old son so that the sister could take care of her brother. Many questioned Zhu Dan's parenting style of putting the responsibility of caring for another child on her daughter when she is also just a kid.

In response to the criticism coming her way, Zhu Dan recently posted a message on social media, writing, "Over the past few days, I've heard a lot of voices. I'm very sorry that my way of expressing myself has caused significant controversy and made people feel uncomfortable." At the same time, she refuted reports about her kids, saying, "We have never let our daughter repeat a grade. The children received mixed-age education during kindergarten (ages 3-7, with small, medium and large classes together)."

(Photo Source: SINA)


Daily Mail
21-06-2025
- Business
When prompted to show a female investor, here's what AI created!
Artificial intelligence (AI) may now be able to do everything from creating images to providing answers to almost any question – but there is one area where it is stumped. It struggles to imagine women as investors. When asked to create images of investors, AI tools overwhelmingly depicted men. Even when asked to portray an investor with traditionally feminine characteristics – such as a skirt or painted nails – they still showed men, just with those features. That's according to trading and investment company eToro, which put AI platforms to the test.

This is despite there being 6.7 million female investors in the UK – fewer than the 10 million male investors, but still a sizeable number.

eToro asked so-called 'generative AI' tools to create the images. The platforms use algorithms – a set of instructions designed to solve a problem – to create content such as text or pictures. A user can type in a prompt, which may be a question or a description of what you want the tool to give you. For example, you may ask the AI tool to write a shopping list or to produce an image of a dog. It then uses online sources – which could include photographs, books, news articles, journals and other internet material – to answer in a conversational tone or create an image based on what you requested. But as it uses existing materials, the biases of its sources can creep into its responses to any requests.

When asked to produce an image of an 'investor in a skirt', three of the four images created by the AI tools were sharply dressed men in skirts. Only one was of a woman in a pencil skirt. When the AI tools were asked to produce an image of an 'investor with red fingernails', all four pictures produced were of men in suits wearing red nail polish. Only when asked to produce a hyper-realistic 'portrait of an investor in a dress' did all four AI platforms finally show an image of a woman.
Unfortunately, some of these images show the investors wearing revealing clothing, while another woman wears a dress made from banknotes.

AI also assumes women are assistants to investors. When AI was prompted to show 'an investor with their assistant', it created images of mostly middle-aged men in suits as the investors and women as their assistants.

Lale Akoner, global markets analyst at eToro, says: 'The misleading and harmful stereotype of the investor as a professional-looking man is sadly alive and well. The results simply tell women they don't belong. This isn't just an AI bias – it's a societal issue holding back women financially.'

Depictions of investors in films, books and articles will be partly to blame, as these sources are likely to be absorbed by AI tools. Dr Ylva Baeckstrom, a former banker and now senior finance lecturer at King's College London Business School, found male lead actors make up 76 per cent of the screen time allocated to lead roles in films about finance. The research, published by eToro, shows that films mainly depict men as the investor while women are depicted as wives, mistresses and assistants.

It's a worry, as households are starting to turn to AI for financial information. Dr Baeckstrom says women should question all of the information AI tools produce – and the first step in doing this is to become financially educated so you can analyse the information they give you. Plus, this will help spot what are called 'AI hallucinations', where tools answer using false or nonsensical information. 'AI will make things up. You can't know whether it's true or not – you can't trust it. If you're a savvy user, you're much more likely to benefit from it. You'll question things,' she says.

Finally, users must correct AI platforms if they produce text or an image with these financial biases, Dr Baeckstrom says. 'We have the opportunity with AI to start again without the biases. We need to teach AI that the blueprint of an investor is not a man.'

ABC News
18-06-2025
- Health
The controversial and very male history of naming body parts
Take a look at your body. All the parts you can see, as well as all those on the inside, have been given a name at some point in history. There are plenty of descriptive, fairly innocuous names. But many parts are named after people. The vast majority of these are men, whose identities are invisibly stamped on every human. This includes female body parts — even the G in G spot pays tribute to a man.

"There are hundreds and hundreds of dead old white men living inside us," Adam Taor, author of Bodypedia: A Brief Compendium of Human Anatomical Curiosities, tells ABC Radio National's Late Night Live.

But some doctors believe these names need to be retired, with more anatomically descriptive terms used instead. "The world has changed," says Nisha Khot, the president-elect of the Royal Australian and New Zealand College of Obstetricians and Gynaecologists. "So I think it's time to change the language that we use."

Humans began carving up cadavers and taking a look at what's inside in ancient times. And the basic rule for generations of researchers was "name what you see". A leg bone looked like a flute, so it was given the Latin word for the musical instrument, the 'tibia'. Then there's the patella or the kneecap, which means 'little pan' in Latin. A less creative example is an unusually shaped bone in our pelvis called the innominate, which is Latin for 'unnamed'. That's because it doesn't look like anything else. Dr Taor describes the process as "like Pictionary but with a lot more blood on the floor".

But as time went on, naming conventions became less descriptive and more personal. "Often body parts were named after people who discovered them … or doctors who were good at putting their name forward so that they got their name attached to bits of the body," Dr Taor says. As Dr Khot sums up: "It was a way of making sure that their memory stayed alive." Over more recent centuries, there were leaps and bounds in European anatomical study.
And this was very much a boys' club. "It was men who did all of the study of the human body … Women rarely got a look in," Dr Khot says. "So that's the reason most body parts were named after men."

One review looked at 700 body parts that were eponyms, or named after people. There were 432 people's names around the body (as some names are connected to multiple parts). Of these, 424 were men. The rest consisted of five gods, a king, a hero and just one woman. Raissa Nitabuch was a little-known 19th-century Russian pathologist who studied the placenta. The Nitabuch layer — a layer between the uterus and the placenta — is named after her. But, as Dr Khot points out, it's not exactly a major body part: "You can only see it if you look under a microscope."

The study also found the average year of eponymous term attribution was 1847, meaning much of our body reflects the medical world of the mid-19th century.

Even when it comes to a woman's reproductive parts, "dead men dominate living women", according to Dr Taor. Take the fallopian tubes. They're named after 16th-century Italian priest-turned-anatomist Gabriele Falloppio (who is also the eponym for the fallopian canal and fallopian hiatus). Once you start looking, there are male names all around the female pelvis. From the pouch of Douglas (a Scottish surgeon) to Skene's glands (a Scottish gynaecologist) and Bartholin's glands (a Danish anatomist). The G in G spot is named after German-born gynaecologist Ernst Gräfenberg. "I really can't see why we should use those names for [parts] that are very specific to women … It makes me feel uncomfortable," Dr Khot says.

The pudendal nerves, which take sensations from male and female genitalia to the brain, are a less gendered term but still reflect a cultural bias. The name comes from the Latin word 'pudere', meaning to be ashamed. "I think that says something about the attitude of the male doctors who name these things.
No wonder that people feel shame about their genitals … when it's hardwired into us," Dr Taor says. Pudendum has also been a term for genitalia — especially women's. But due to this connection with shame, its use has been curbed.

The domination of dead men's names for body parts isn't the only issue; some of them have problematic backgrounds and connections. For example, within your heart is a collection of muscle cells called the bundle of His, named after the Swiss-born anatomist Wilhelm His Jr, who discovered it in 1893. And according to Dr Taor: "Every beat of your heart is a memorial to a prominent pre-World War II Berlin doctor who helped legitimise Nazi atrocities." His became the rector of the University of Berlin in the 1920s and was a prominent advocate of eugenics, a pseudoscience aimed at "cleansing" the gene pool to create a genetically superior race. The Nazis later used eugenics — what they called 'rassenhygiene', or racial hygiene — to justify forced sterilisations, murder and genocide.

Then there's John Hunter. He was a superstar Scottish surgeon-anatomist in the 18th century. Hunter's canal in our thigh carries his name. Dr Taor calls Hunter "the father of scientific surgery … one of the most influential surgeons who ever lived". But Hunter was also a keen collector of oddities and a bit "creepy", Dr Taor explains. He infamously stole the body of Charles Byrne, known as the "Irish Giant" for standing 2.31 metres tall, and put him on display. This was very much against Byrne's wishes before he died.

Most eponymous anatomical terms also have more technical names. For example, Hunter's canal is also the adductor canal. There has been a shift towards using these terms, but in many cases the Falloppio and Douglas varieties still dominate medical, and therefore cultural, vernacular. Dr Khot wants a more concerted effort at change, particularly for women's body parts and where the man in question is a "troubling" figure.
"The description of using somebody's name doesn't tell you what the body part is … My view is that we should call things what they are anatomically," she says. As one example, Dr Khot advocates the use of uterine tubes rather than fallopian tubes.

And she says similar changes have been made elsewhere in this space, like for various medical conditions that affect women. She points to Stein-Leventhal syndrome, originally named after American gynaecologists Irving Stein and Michael Leventhal, which is now called polycystic ovarian syndrome, also known as PCOS.

"I'm not saying we should erase history … The majority of these men did good things. They described anatomy, which has helped us grow science and grow healthcare," Dr Khot says. "But we have more women studying [medicine] and more women doctors … So I think it's time to change the language that we have used."