
Latest news with #LLM-fueled

A MAGA bot network on X is divided over the Trump-Epstein backlash

NBC News

2 days ago



A previously unreported network of hundreds of accounts on X is using artificial intelligence to automatically reply to conservatives with positive messages about people in the Trump administration, researchers say. But with the MAGA movement split over the administration's handling of files involving deceased sex offender Jeffrey Epstein, the accounts' messaging has broken, offering contradictory statements on the issue and revealing the LLM-fueled nature of the accounts.

The network, tracked for NBC News by both the social media analytics company Alethea and researchers at Clemson University, consists of more than 400 identified bot accounts, though the number could be far larger, the researchers say. Its accounts offer consistent praise for key Trump figures, particularly support for Health Secretary Robert F. Kennedy Jr. and White House press secretary Karoline Leavitt.

As is often the case with bot accounts, those viewed by NBC News tended to have only a few dozen followers, and their posts rarely get many views. But a large audience does not appear to be the point. Their effectiveness, if they have any, lies in the hope that they contribute to a partisan echo chamber and that en masse they can 'massage perceptions,' said Darren Linvill, the director of Clemson University's Media Forensics Hub, which studies online disinformation campaigns. 'They're not really there to get engagement. They're there to just be occasionally seen in those replies,' Linvill told NBC News.

The researchers declined to share specifics on how they identified the accounts, but noted that the accounts shared a number of distinct traits. All were created, seemingly in batches, around three specific days last year. They frequently punctuate their posts with hashtags, often ones that are irrelevant to the conversation. They post almost exclusively by replying to other users, often to people who pay X for verification, and they repeat similarly worded sentiments over and over in short succession.
At times, they will respond to someone's post by repeating it back to them verbatim. It's unclear who is behind the network, or which of the AI chatbots widely accessible to the public was used to power it.

The bots have posted support for conservative figures since 2024, including backing Trump and other Republicans on the ballot in the lead-up to the election, and then posting afterward that they were excited for Trump to take office. Though they would occasionally mix their messages (some have professed affection for MSNBC host Rachel Maddow, for instance), their messaging was consistently in favor of MAGA figures until the recent Epstein files controversy. A core constituency of Trump supporters voted for him on the belief that Trump, a former friend of Epstein's, would expose a list of supposed rich and powerful clients and bring justice to Epstein's victims.

It's only since earlier this month, when Attorney General Pam Bondi announced she would not release additional Epstein files, that the accounts' messaging has become so split, with some accounts telling different users opposite opinions almost concurrently. During the same minute last Saturday morning, for example, one account in the network both cautioned a MAGA supporter against judging Bondi too harshly and told another that Bondi, FBI Director Kash Patel and Deputy Director Dan Bongino should resign over the scandal.
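The signals the researchers describe, such as accounts created in batches on a few specific days and posting almost exclusively as replies, lend themselves to simple heuristics. The sketch below illustrates the idea in Python; the field names, dates, and thresholds are illustrative assumptions, not the researchers' actual methodology, which they declined to share.

```python
from datetime import date

# Toy records standing in for scraped account metadata. The schema is
# hypothetical; the article does not describe the researchers' data.
accounts = [
    {"id": "a1", "created": date(2024, 3, 14), "posts": 200, "replies": 196},
    {"id": "a2", "created": date(2024, 3, 14), "posts": 150, "replies": 149},
    {"id": "a3", "created": date(2023, 7, 2),  "posts": 500, "replies": 120},
]

# Placeholder batch-creation dates; the article says the real network was
# created around three specific days last year, but does not name them.
BATCH_DAYS = {date(2024, 3, 14)}

def looks_like_network_bot(acct, reply_ratio_threshold=0.95):
    """Flag an account that matches both heuristic signals:
    created on a known batch day AND posting almost only replies."""
    in_batch = acct["created"] in BATCH_DAYS
    reply_heavy = (
        acct["posts"] > 0
        and acct["replies"] / acct["posts"] >= reply_ratio_threshold
    )
    return in_batch and reply_heavy

flagged = [a["id"] for a in accounts if looks_like_network_bot(a)]
print(flagged)  # a1 and a2 match both signals; a3 matches neither
```

In practice, researchers combine many such weak signals (posting cadence, near-duplicate reply text, hashtag patterns) rather than relying on any one, since each heuristic alone produces false positives.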

Get Ready For AI On Everything, Everywhere All At Once

Forbes

08-04-2025



As AI proliferates across machines, organizations must carefully choose operating environments. Trusted advisers can help.

The prevailing theme at this year's NVIDIA GTC conference was that AI will run virtually everywhere. If NVIDIA CEO Jensen Huang's latest epic keynote proves prophetic, every machine is a potential AI node possessing ever-evolving intelligence. The future is here; it's just distributed across machines. Many, many machines, from computers large and small to cars and robots. AI also informs digital twins, in which software representations of complex physical systems dot our organizational matrices.

We have the technology to make AI better than it was. Fueled by data, AI will be better, faster and more intelligent. AI nodes will continue to run courtesy of GPUs, high-speed networking and connective software tissue, with the help of beefy servers and vats of digital storage. These technologies are meticulously governed by various command-and-control constructs across public and private clouds and on-premises environments, extending tendrils out to the edge on PCs.

Most organizations pursuing an AI strategy today are targeting the deployment of generative AI powered by LLMs, whose applications generate content or ferret out information. These organizations constitute a growing enterprise AI market. At its core, enterprise AI is about applying AI technology to the most critical processes in your business, driving productivity where it matters most. This could range from boosting employee productivity to augmenting customer experiences to growing revenues. When used strategically, targeted at the right areas in the right way, enterprise AI empowers organizations to refine what sets them apart and enhances their competitive edge.

Imagine a bank crafting an LLM-fueled digital assistant that helps retrieve critical information for customers, potentially helping them decide how best to allocate their money.
Or a healthcare organization that uses a prescriptive GenAI solution to help draft notes on patient exams or provide helpful context to physicians during exams. Seventy-eight percent of organizations surveyed by Deloitte expect to increase their AI spending in the next fiscal year, with GenAI expanding its share of the overall AI budget.

When it comes to executing their AI strategies, organizations will make technology architecture decisions based on what they are trying to do with their AI use cases, as well as their experience and comfort level. While some may run GenAI models from public cloud providers, others will prefer running GenAI workloads on-premises due to concerns about curbing operational costs, which can spiral if not managed properly. Organizations embarking on AI journeys for the first time may feel more comfortable running GenAI workloads on-premises, where they can control and manage their own data, or more specifically, the secret sauce also known as IP. For organizations governed by data sovereignty mandates, on-premises may be the only option. Others requiring near real-time performance will look to the edge of networks, where latency is lower.

Today, many of these solutions will be powered by servers in corporate datacenters, or even somewhere along the edge of the network. Yet even those network boundaries are expanding as more developers run LLMs locally on AI PCs and workstations. This would have been impossible even two years ago; soon it will be standard practice.

Ultimately, technology decisions must align with the desired outcomes, and each organization must make its own deployment decisions based on its goals. With AI permeating every machine with silicon and circuits, organizations must choose the platform (or platforms) that provides the best scalability, security and business value for each use case. Deploying GenAI for the first time can be fraught with complexities; even the most robust organizations fear the unknown.
But don't fall prey to inertia. There's no better time to embrace enterprise AI to operate critical AI applications and services in your datacenter or at the edge, where you can control and monitor performance, security and other factors that help you best protect and serve your business.

Wherever organizations choose to operate their GenAI solutions, they should lean on trusted advisers for help. Those advisers can guide your AI strategy, identify use cases and right-size infrastructure components to run your solutions optimally. And remember, in a world where AI is running in everything, everywhere and all at once, data remains your most precious fuel. Organizations must shore up their data estates to properly take advantage of GenAI. The right adviser will help you prepare your data to be consumed, from understanding how to clean and classify data to understanding how to best bring it to bear on targeted use cases.

Is your organization ready to harness AI to boost productivity? Learn more about the Dell AI Factory.
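The "clean and classify" preparation step mentioned above can be as simple as normalizing text, dropping empty records and tagging documents before they are indexed for a GenAI application. The Python sketch below is a minimal illustration under assumed inputs; the categories and keyword rules are hypothetical, and real pipelines would use far more robust classification.

```python
import re

# Illustrative raw records; messy whitespace and an empty entry
# stand in for the kind of unprepared data estates described above.
raw_docs = [
    "  Quarterly revenue rose 12%  \n across retail banking. ",
    "",
    "Patient exam notes: follow-up in two weeks.",
]

def clean(text):
    """Collapse runs of whitespace and trim the ends."""
    return re.sub(r"\s+", " ", text).strip()

def classify(text):
    """Tag a document with a coarse, keyword-based category.
    The labels and keywords here are assumptions for the sketch."""
    lowered = text.lower()
    if "patient" in lowered or "exam" in lowered:
        return "healthcare"
    if "revenue" in lowered or "banking" in lowered:
        return "finance"
    return "other"

# Keep only non-empty documents, cleaned and labeled.
prepared = [
    {"text": clean(d), "label": classify(clean(d))}
    for d in raw_docs
    if clean(d)
]
print(prepared)  # two labeled records; the empty entry is dropped
```

The point is not the specific rules but the shape of the work: data must be normalized, filtered and organized before a GenAI use case can consume it reliably.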
