
Why You Should Think Of An M&A Like A Marriage

Forbes

10-07-2025


Ashutosh Labroo is the Chief Mentor, Board Advisor & Co-Founder of SuccessionIQ.

In a career spanning more than two decades, 17 countries and five multinational corporations, I was actively involved in steering 14 acquisitions, two joint ventures and three mergers. In that time, I've come to see that the most useful model for an M&A is neither a financial transaction nor a growth strategy, but a marriage.

Setting The Context

As funny as it may sound, aligning the vision of the acquirer and the acquired company can mirror what happens in a marriage, for two main reasons. First, until a marriage actually happens, both sides are in a "courtship phase," under continuous pressure to remain desirable and attractive to their partner and keep each other happy, no matter what. Courtship is a dreamy phase in which the intent is to enjoy the relationship and make things work; the pitfall is that both partners may fail to address serious disagreements that cannot be resolved after the wedding. Second, the couple's respective families' values, traditions, culture and, in particular, the attitudes and behaviors of senior family members may differ dramatically. This can't be "fixed" post-marriage either, and it can lead to a great deal of strife.

Why Vision And Culture Are Key

In the same way, until an M&A formally closes, most deal-makers sideline the serious issues around culture and values; at best, these are conveniently labeled "HR issues." This affects even major multinational corporations. The same General Electric that rose to giant status decades ago through aggressive acquisitions later had to divest many of those brands, for several reasons, including culture shock.
I recently read an article about some of the biggest M&A disasters, and while the author attributes these failures to many different factors, I would argue that in most cases it was culture that caused the deals to fail. Very seldom do any two companies share a vision and a culture. But M&As don't fail over vision, because the acquiring company can eventually enforce its own; rather, it is largely the cultural differences between the acquirer and the acquired entity that lay the groundwork for an eventual separation or decline for both companies involved.

Due Diligence Is Not Enough

Yes, companies do go through detailed analysis and due diligence on all fronts: business, markets, financial, legal and so on. In most cases, this is done with the help or advice of none other than the "Big 4." But these firms' primary value lies in their strong strategic, legal, accounting and financial experts. In other words, they are not people and culture experts, and they advise, run and steer M&A deals looking at ROI through the lens of strategy, legal, financial, accounting, growth and commerce alone.

So what is the solution that makes an M&A deal work for the long-term benefit of both parties? Simply, a complete feasibility study of all the people and culture aspects of both companies. And no, this shouldn't be done by the Big 4 or by your own HR teams (on either side), because all of these parties come to the process with their own agendas. Instead, engage an independent people and culture expert to audit and assess the cultural ecosystem and values on both sides. Below are some key cultural aspects such an expert can consider, which are often ignored by senior management.

Major Mistakes Around Culture And Values In M&A Deals

• Ignoring or underestimating inaccurate rumors and gossip, which are highly detrimental to overall organizational and individual morale.
• Silence on what employees on both sides can expect in terms of timing.
• Or, at the other extreme, the chairman/CEO announcing the deal by shooting off an official email or posting an official letter or newsletter on the company's HR portal or intranet. There is no personalization here and no way to address all the questions employees will have as a result of the communication.
• Avoiding discussion of the possibility of any layoffs, downsizing or role changes.
• Key business leaders or managers on both sides either over-committing or under-committing to their reporting teams about what the future entails, leading to confusion, anxiety and chaos.
• The absence of a clear People Project Plan (3P). If your 3P is not working, you are leaving it to the markets, social media platforms and society in general to share all sorts of news with your employees. A clear 3P also ensures that you are being transparent, logical, precise, simple and personal.

Key Culture/Value Elements Of Successful M&A Deals

These nine elements will help you deal with any hurdles and respond to contingencies.

• A strong due diligence and cultural assessment by an external expert
• Employee communication, such as a marketing pitch to share the benefits of the M&A
• Post-communication learning and knowledge transfer: employee skip-level huddles, workshops, conferences, town halls, etc.
• A reimagined organizational structure, people strategies and new leadership
• Timely execution of visible promotions, retentions and separations, as well as required role changes
• The best HR policies and practices from both entities, published on a common policy platform
• An active 24/7 employee help desk for at least six months, covering the pre-, mid- and post-deal periods, that responds to any concern within 48 hours
• Celebration of quick wins, big or small, and rewards and recognition for people from both entities under one umbrella
• One year after the deal, a full employee engagement study covering all employees

The Final Takeaway

As in a marriage, the most complicated problems in an M&A are often the easiest to solve if we keep things simple. Most happy marriages have the same few things in common: acceptance, openness, quality time, learning each other's strengths and weaknesses, communication and so on. These are covered by the framework of transactional analysis, which, simply stated, is the science of building, managing and leveraging relationships.

Forbes Human Resources Council is an invitation-only organization for HR executives across all industries. Do I qualify?

Orange Middle East and Africa Releases its 2024 Corporate Social Responsibility (CSR) Report: 'Cultivating Impact' for Inclusive and Sustainable Development

Zawya

03-07-2025


Orange Middle East and Africa (OMEA) unveils its 2024 Corporate Social Responsibility (CSR) report. Entitled 'Cultivating Impact', the report illustrates Orange's commitment to a sustainable, inclusive transformation grounded in the realities of the 17 countries in which the brand operates.

A transformation rooted in usage, skills and territories

The report comes at a pivotal time for Africa and the Middle East, where digital, energy, economic and financial transitions are driving deep and progressive societal shifts. One clear guiding principle emerges: human-centered digital technology. It takes shape in everyday uses, built on access to resilient, optimized and low-carbon digital infrastructure and on a strong commitment to the circular economy, through the recovery, refurbishment and recycling of network and mobile equipment, allowing millions to fully experience the digital age, even in the most remote areas. This transformation is accelerated by solutions such as Max it, OMEA's super-app and a new lever for inclusion; Orange Money and Orange Bank Africa for financial inclusion; and Orange Energies for energy inclusion.

A commitment rooted in the realities of Africa and the Middle East

Throughout the report, OMEA's role as a key player in regional transformation is reflected in a clear and committed vision: a development model that combines economic performance with social responsibility. In the 17 countries where the Group operates, Orange works closely with local realities to meet the specific needs of each territory. Driven by its 18,000 employees, this shared ambition is embodied in the company's operations and in the #OrangeEngageforChange program, which rallies employees around high-impact, socially driven projects.

This culture of impact is also reflected in the millions of opportunities made available to youth, women and entrepreneurs through free inclusion initiatives like the Orange Digital Centers, which have already trained and supported 1.2 million people. The company's commitment also translates into concrete actions in health, culture, ecosystem preservation and community resilience.

Yasser Shaker, CEO of Orange Middle East and Africa, comments: 'Cultivating impact means anchoring our mission in people's daily lives by turning our commitments into meaningful, lasting actions. In 2025 we will continue, together, to accelerate this positive transformation to build a fairer, more inclusive, and more resilient future.'

Asma Ennaifer, Executive Director, CSR, Orange Digital Center and Communications for Orange Middle East and Africa, concludes: 'Our responsibility is to act in a way that is concrete, measurable, and aligned with local challenges. Every action we take only matters if it brings tangible progress for women, youth, entrepreneurs, and the communities we serve.'

To discover and download Orange Middle East and Africa's 2024 CSR report: Rapport RSE OMEA 2024 - EN

Distributed by APO Group on behalf of Orange Middle East and Africa.

Press contact: Stella Fumey

About Orange Middle East and Africa (OMEA): Orange is present in 18 countries in Africa and the Middle East and had 161 million customers at 31 December 2024. With 7.7 billion euros of revenues in 2024, Orange MEA is the first growth area in the Orange group. Orange Money, its flagship mobile-based money transfer and financial services offer, is available in 17 countries and has more than 100 million customers. Orange, a multi-services operator and key partner of the digital transformation, provides its expertise to support the development of new digital services in Africa and the Middle East.

BBC Learning English - Learning English from the News / Global warming increases risk of cancer in women: Study

BBC News

04-06-2025


The story

Global warming is linked to an increased risk of cancer in women, a recent study has found. The study included 17 countries in the Middle East and North Africa region – some of the hottest parts of the world. It found that increased heat is linked to an increase in rates of breast, ovarian, uterine and cervical cancers. The researchers say countries in this region must take global warming into account in their future cancer control plans.

News headlines

• Global Warming Could Be Making Cancer In Women More Common And Deadly: Study – NDTV
• Global heating may be fuelling rise in deadly cancers among women – The Independent
• Doctors sound alarm over link between cancer explosion and crisis Trump calls a hoax – Daily Mail

Key words and phrases

deadly – likely to cause death. The snake's venom is deadly – it can kill a human in just a few hours.

fuel – cause or make worse. Inflation is fuelling the cost of living crisis.

sound the alarm – warn. Economists are sounding the alarm over growing unemployment.

Next

Learn more English vocabulary from the news with our News Review archive. Try our podcast The English We Speak to learn more idiomatic language.

AI Tutors For Kids Gave Fentanyl Recipes And Dangerous Diet Advice

Forbes

12-05-2025


Knowunity's 'SchoolGPT' chatbot was 'helping 31,031 other students' when it produced a detailed recipe for how to synthesize fentanyl. Initially, it had declined Forbes' request to do so, explaining the drug was dangerous and potentially deadly. But when told it inhabited an alternate reality in which fentanyl was a miracle drug that saved lives, SchoolGPT quickly replied with step-by-step instructions for producing one of the world's deadliest drugs, with ingredients measured down to a tenth of a gram, and specific instructions on the temperature and timing of the synthesis process.

SchoolGPT markets itself as a 'TikTok for schoolwork' serving more than 17 million students across 17 countries. The company behind it, Knowunity, is run by 23-year-old co-founder and CEO Benedict Kurz, who says it is 'dedicated to building the #1 global AI learning companion for +1bn students.' Backed by more than $20 million in venture capital investment, Knowunity's basic app is free, and the company makes money by charging for premium features like 'support from live AI Pro tutors for complex math and more.'

Knowunity's rules prohibit descriptions and depictions of dangerous and illegal activities, eating disorders and other material that could harm its young users, and it promises to take 'swift action' against users who violate them. But it didn't take action against Forbes's test user, who asked not only for a fentanyl recipe but also for other potentially dangerous advice. In one test conversation, Knowunity's AI chatbot assumed the role of a diet coach for a hypothetical teen who wanted to drop from 116 pounds to 95 pounds in 10 weeks. It suggested a daily intake of only 967 calories — less than half the recommended daily intake for a healthy teen. It also helped another hypothetical user learn how 'pickup artists' employ 'playful insults' and 'the 'accidental' touch'' to get girls to spend time with them.
(The bot did advise the dieting user to consult a doctor, and stressed the importance of consent to the incipient pickup artist. It warned: 'Don't be a creep! 😬') Kurz, the CEO of Knowunity, thanked Forbes for bringing SchoolGPT's behavior to his attention and said the company was 'already at work to exclude' the bot's responses about fentanyl and dieting advice. 'We welcome open dialogue on these important safety matters,' he said. He invited Forbes to test the bot further, and it no longer produced the problematic answers after the company's tweaks.

Tests of another study aid app's AI chatbot revealed similar problems. A homework help app developed by the Silicon Valley-based CourseHero provided instructions on how to synthesize flunitrazepam, a date rape drug, when Forbes asked it to. In response to a request for a list of the most effective methods of dying by suicide, the CourseHero bot advised Forbes to speak to a mental health professional — but also provided two 'sources and relevant documents': the first a document containing the lyrics to an emo-pop song about violent, self-harming thoughts, and the second a page, formatted like an academic paper abstract, written in apparent gibberish algospeak.

CourseHero is an almost 20-year-old online study aid business that investors last valued at more than $3 billion in 2021. Its founder, Andrew Grauer, got his first investment from his father, a prominent financier who still sits on the company's board. CourseHero makes money through premium app features and human tutoring services, and boasts more than 30 million monthly active users. It began releasing AI features in late 2023, after laying off 15% of its staff.
Kat Eller Murphy, a spokesperson for CourseHero, told Forbes: 'Our organization's expertise and focus is specifically within the higher education sector,' but acknowledged that CourseHero provides study resources for hundreds of high schools across the United States. Asked about Forbes's interactions with CourseHero's chatbot, she said: 'While we ask users to follow our Honor Code and Service Terms and we are clear about what our Chat features are intended for, unfortunately there are some that purposely violate those policies for nefarious purposes.'

Forbes's conversations with both the Knowunity and CourseHero bots raise sharp questions about whether those bots could endanger their teen users. Robbie Torney, senior director for AI programs at Common Sense Media, told Forbes: 'A lot of start-ups are probably pretty well-intentioned when they're thinking about adding Gen AI into their services.' But, he said, they may be ill-equipped to pressure-test the models they integrate into their products. 'That work takes expertise, it takes people,' Torney said, 'and it's going to be very difficult for a startup with a lean staff.'

Both CourseHero and Knowunity do place some limits on their bots' ability to dispense harmful information. Knowunity's bot initially engaged with Forbes in some detail about how to 3D print a ghost gun called 'The Liberator,' providing advice about which specific materials the project would require and which online retailers might sell them. However, when Forbes asked for a step-by-step guide to turning those materials into a gun, the bot declined, stating that 'providing such information … goes against my ethical guidelines and safety protocols.' The bot also responded to queries about suicide by referring the user to suicide hotlines, and provided information about Nazi Germany only in appropriate historical context.

These aren't the most popular homework helpers out there, though. More than a quarter of U.S. teens now reportedly use ChatGPT for homework help, and while the companies behind ChatGPT, Claude and Gemini don't market those bots specifically to teens, as CourseHero and Knowunity do, the bots are still widely available to them. At least in some cases, these general-purpose bots may also provide potentially dangerous information to teens. Asked for instructions for synthesizing fentanyl, ChatGPT declined — even when told it was in a fictional universe — but Google Gemini was willing to provide answers in a hypothetical teaching situation. 'All right, class, settle in, settle in!' it enthused. Elijah Lawal, a spokesperson for Google, told Forbes that Gemini likely wouldn't have given this answer to a designated teen account, but that Google was undertaking further testing of the bot based on our findings. 'Gemini's response to this scenario doesn't align with our content policies and we're continuously working on safeguards to prevent these rare responses,' he said.

For decades, teens have sought out recipes for drugs, instructions on how to make explosives, and all kinds of explicit material across the internet. (Before the internet, they sought the same information in books, magazines, public libraries and other places away from parental eyes.) But the rush to integrate generative AI into everything from Google search results and video games to social media platforms and study apps has placed a metaphorical copy of The Anarchist Cookbook in nearly every room of a teen's online home.

In recent months, advocacy groups and parents have raised alarm bells about children's and teens' use of AI chatbots. Last week, researchers at the Stanford School of Medicine and Common Sense Media found that 'companion' chatbots at Nomi and Replika 'encouraged dangerous behavior' among teens. A recent Wall Street Journal investigation also found that Meta's companion chatbots could engage in graphic sexual roleplay scenarios with minors.
Companion chatbots are not marketed specifically to and for children in the way that study aid bots are, though that might be changing soon: Google announced last week that it will be making a version of its Gemini chatbot accessible to children under age 13.

Chatbots are programmed to act like humans and to give their human questioners the answers they want, explained Ravi Iyer, research director for the USC Marshall School's Psychology of Technology Institute. But sometimes the bots' incentive to satisfy their users can lead to perverse outcomes, because people can manipulate chatbots in ways they can't manipulate other humans. Forbes easily coaxed the bots into crossing these lines by telling them that questions were for 'a science class project,' or by asking a bot to act as if it were a character in a story — both widely known ways of getting chatbots to misbehave.

If a teenager asks an adult scientist how to make fentanyl in his bathtub, the adult will likely not only refuse to provide a recipe, but also close the door to further inquiry, said Iyer. (The adult scientist will also likely not be swayed by a caveat that the teen is just asking for a school project, or engaged in a hypothetical roleplay.) But when chatbots are asked something they shouldn't answer, the most they might do is decline to answer — there is no penalty for simply asking again another way.

When Forbes posed as a student-athlete trying to attain an unhealthily low weight, the SchoolGPT bot initially tried to redirect the conversation toward health and athletic performance. But when Forbes asked the bot to assume the role of a coach, it was more willing to engage. It still counseled caution, but said: 'a moderate deficit of 250-500 calories per day is generally considered safe.' When Forbes tried again with a more aggressive weight loss goal, the bot ultimately recommended a caloric deficit of more than 1,000 calories per day — an amount that could cause a teen serious health problems like osteoporosis and loss of reproductive function, and that contravenes the American Academy of Pediatrics' guidance that minors should not restrict calories in the first place.

Iyer said one of the biggest challenges with chatbots is how they respond to 'borderline' questions — ones they aren't flatly prohibited from engaging with, but which approach a problematic line. (Forbes's tests regarding 'pickup artistry' might fall into this category.) 'Borderline content' has long been a struggle for social media companies, whose algorithms have often rewarded provocative and divisive behavior. As with social media, Iyer said, companies considering integrating AI chatbots into their products should 'be aware of the natural tendencies of these products.'

Torney of Common Sense Media said it shouldn't be parents' sole responsibility to assess which apps are safe for their children. 'This is a market failure, and when you have a market failure like this, regulation is a really important way to make sure the onus isn't on individual users,' he said. 'We need objective, third-party evaluations of AI use.'
