Latest news with #LLMSiri

Ammon
01-07-2025
- Business
- Ammon
Apple's AI Siri might be powered by OpenAI
Ammon News - Apple is considering enlisting the help of OpenAI or Anthropic to power its AI-upgraded Siri, according to a report from Bloomberg's Mark Gurman. As Apple continues to struggle with the development of an upgraded 'LLM Siri,' it reportedly asked OpenAI and Anthropic to create versions of their large language models to test on the company's private cloud infrastructure.

For months, Apple has been working to get its AI-enhanced Siri back on track after delaying the overhauled assistant's launch in March. Apple later appointed Vision Pro head Mike Rockwell as the leader of AI and Siri after CEO Tim Cook 'lost confidence' in the team's former chief, John Giannandrea. As reported by Bloomberg, Rockwell asked his team to test whether Anthropic's Claude, OpenAI's ChatGPT, or Google's Gemini handles basic requests better than Apple's own models, with Anthropic's apparently seen as the most promising.

While Google offers Gemini AI features for Android and its Pixel lineup, Samsung licenses Google's AI model for its phones; Samsung is also reportedly close to cutting a deal with Perplexity, which already has a tie-up with Motorola. Earlier this month, Bloomberg reported Apple executives had considered acquiring Perplexity to help boost its AI ambitions.

LLM Siri was largely absent from Apple's Worldwide Developers Conference earlier this month, where SVP of worldwide marketing Greg Joswiak admitted that the technology 'didn't hit our quality standard.' Bloomberg notes that Apple's plans to incorporate a third-party AI model into Siri are still at an 'early stage' and that it's still considering using in-house models. The Verge


Tom's Guide
01-07-2025
- Business
- Tom's Guide
Apple could hand Siri's AI upgrades to OpenAI or Anthropic — what we know
Apple's foray into AI has not been going so well, especially where Siri is concerned. The company has struggled to get the promised "LLM Siri" upgrades off the ground, which has led to significant delays in its release. Now, it sounds like the company may be considering taking drastic steps to get Siri back on track.

According to Mark Gurman at Bloomberg, Apple has asked both OpenAI and Anthropic to train custom versions of their own large language models, with the aim of getting them to run on Apple's Private Cloud Compute infrastructure. While Apple Intelligence does connect with ChatGPT to let the chatbot answer questions Siri can't handle, that has always been marketed as a distinct feature. Outsourcing the model that runs LLM Siri to another company would be a first for Apple, but it could help the company catch up in the generative AI race, after jumping on that bandwagon rather late.

Gurman claims that Apple's "investigation" into this change is still at an early stage, and it hasn't made a final decision yet. This also means that the in-house version of LLM Siri is still in active development for the time being. The change, if it happens, would likely arrive sometime next year.

According to reports, Apple didn't anticipate that generative AI would be such a big deal until it was too late. So over the past few years, it's been trying to catch up with the likes of Google, Meta and OpenAI in an attempt to get its own AI models up and running. That hasn't worked out so well, and it means Apple has fallen back on partnering with existing AI companies to get back on track. We've already seen that with the ChatGPT partnership in iOS 18, and utilizing third-party help to create LLM Siri wouldn't be out of the ordinary.

But the more these delays happen, the more Apple falls behind and deepens its reliance on third parties, something that could prove expensive. Gurman claims this has already proved problematic: discussions with Anthropic have led the AI startup to push for a "multibillion-dollar annual fee that increases sharply each year." That would not be good for Apple's profit margins.

We're going to have to wait and see how this all plays out, but the one thing we do know is that LLM Siri will not be coming this year. We had the opportunity to ask Apple about that at WWDC 2025 last month, so be sure to check out the full interview for all the details.


India Today
01-07-2025
- Business
- India Today
Apple eyes OpenAI and Anthropic for Siri revamp, wants their models to run on its own private cloud
Apple is reportedly in talks with OpenAI and Anthropic to potentially power a major update to Siri, according to a Bloomberg report. The company is apparently exploring the use of external large language models (LLMs) to replace or support its own AI efforts. Reportedly, Apple has asked both OpenAI and Anthropic to train versions of their models that could be tested on Apple's own cloud infrastructure. This setup would allow the company to keep user data private while leveraging the strengths of more advanced third-party AI models.

The move is reportedly part of an internal push to revive Apple's underwhelming progress in AI. In a post on X, Bloomberg's Mark Gurman said, 'Apple is considering using AI technology from Anthropic or OpenAI to power Siri, sidelining its own in-house models in a potentially blockbuster move aimed at turning around its AI effort.' Gurman added that the shift towards external models has had a noticeable impact internally. Tom Gunter, one of Apple's senior AI engineers, left the company last week. Additionally, the team behind MLX, Apple's open-source AI framework, has allegedly threatened to quit over frustrations with the current direction.

Despite these developments, Apple has yet to make a final decision to completely abandon its in-house models. Gurman noted that the company is still running a project named 'LLM Siri,' which could power a redesigned version of Siri in 2026 using its own AI models. However, Gurman says that top executives at Apple reportedly believe that relying on third-party models could be the key to catching up with rivals in the AI race.

Earlier this year, Apple abruptly cancelled plans to build its own in-house coding models, which had been announced as part of Swift Assist at last year's developer conference. Instead, the company now allows developers to use tools like ChatGPT or Anthropic's Claude directly through Xcode, its integrated development environment. Apple is also using Claude internally for coding work.

The shift comes as Apple faces increasing pressure to retain top AI talent. According to Gurman, companies like Meta and OpenAI are offering salaries that can more than double what Apple pays, making it harder to attract and keep skilled engineers. Meta CEO Mark Zuckerberg on Monday announced a new AI division called Meta Superintelligence Labs, which is aimed at developing artificial general intelligence (AGI) – an AI system that can perform most tasks as well as or better than humans can. For this new AI unit, Meta has poached some of the best AI researchers from its rivals like OpenAI, Google, and Anthropic.

Apple is reportedly working on a long-awaited revamp of Siri alongside the launch of iOS 26.4 in 2026. The upgraded virtual assistant is expected to offer more sophisticated capabilities, tapping into on-screen content and user data to execute complex, multi-step commands with greater context-awareness.

Business Standard
01-07-2025
- Business
- Business Standard
Apple considers using Anthropic or OpenAI to power Siri in major shift
Apple Inc. is considering using artificial intelligence technology from Anthropic PBC or OpenAI to power a new version of Siri, sidelining its own in-house models in a potentially blockbuster move aimed at turning around its flailing AI effort.

The iPhone maker has talked with both companies about using their large language models for Siri, according to people familiar with the discussions. It has asked them to train versions of their models that could run on Apple's cloud infrastructure for testing, said the people, who asked not to be identified discussing private deliberations.

If Apple ultimately moves forward, it would represent a monumental reversal. The company currently powers most of its AI features with homegrown technology that it calls Apple Foundation Models and had been planning a new version of its voice assistant that runs on that technology for 2026.

Apple's investigation into third-party models is at an early stage, and the company hasn't made a final decision on using them, the people said. A competing project internally dubbed LLM Siri that uses in-house models remains in active development. Making a change — which is under discussion for next year — could allow Cupertino, California-based Apple to offer Siri features on par with AI assistants on Android phones, helping the company shed its reputation as an AI laggard.

Representatives for Apple, Anthropic and OpenAI declined to comment. Shares of Apple closed up over 2 per cent after Bloomberg reported on the deliberations.

Siri Struggles

The project to evaluate external models was started by Siri chief Mike Rockwell and software engineering head Craig Federighi. They were given oversight of Siri after the duties were removed from the command of John Giannandrea, the company's AI chief. He was sidelined in the wake of a tepid response to Apple Intelligence and Siri feature delays.

Rockwell, who previously launched the Vision Pro headset, assumed the Siri engineering role in March. After taking over, he instructed his new group to assess whether Siri would do a better job handling queries using Apple's AI models or third-party technology, including Claude, ChatGPT and Alphabet Inc.'s Google Gemini. After multiple rounds of testing, Rockwell and other executives concluded that Anthropic's technology is most promising for Siri's needs, the people said. That led Adrian Perica, the company's vice president of corporate development, to start discussions with Anthropic about using Claude, the people said.

The Siri assistant — originally released in 2011 — has fallen behind popular AI chatbots, and Apple's attempts to upgrade the software have been stymied by engineering snags and delays. A year ago, Apple unveiled new Siri capabilities, including ones that would let it tap into users' personal data and analyze on-screen content to better fulfill queries. The company also demonstrated technology that would let Siri more precisely control apps and features across Apple devices. The enhancements were far from ready. Apple initially announced plans for an early 2025 release but ultimately delayed the launch indefinitely. They are now planned for next spring, Bloomberg News has reported.

AI Uncertainty

People with knowledge of Apple's AI team say it is operating with a high degree of uncertainty and a lack of clarity, with executives still poring over a number of possible directions. Apple has already approved a multibillion dollar budget for 2026 for running its own models via the cloud but its plans for beyond that remain murky.
Still, Federighi, Rockwell and other executives have grown increasingly open to the idea that embracing outside technology is the key to a near-term turnaround. They don't see the need for Apple to rely on its own models — which they currently consider inferior — when it can partner with third parties instead, according to the people.

Licensing third-party AI would mirror an approach taken by Samsung Electronics Co. While the company brands its features under the Galaxy AI umbrella, many of its features are actually based on Gemini. Anthropic, for its part, is already used by Amazon.com Inc. to help power the new Alexa+.

In the future, if its own technology improves, the executives believe Apple should have ownership of AI models given their increasing importance to how products operate. The company is working on a series of projects, including a tabletop robot and glasses that will make heavy use of AI. Apple has also recently considered acquiring Perplexity in order to help bolster its AI work, Bloomberg has reported. It also briefly held discussions with Thinking Machines Lab, the AI startup founded by former OpenAI Chief Technology Officer Mira Murati.

Souring Morale

Apple's models are developed by a roughly 100-person team run by Ruoming Pang, an Apple distinguished engineer who joined from Google in 2021 to lead this work. He reports to Daphne Luong, a senior director in charge of AI research. Luong is one of Giannandrea's top lieutenants, and the foundation models team is one of the few significant AI groups still reporting to Giannandrea. Even in that area, Federighi and Rockwell have taken a larger role.

Regardless of the path it takes, the proposed shift has weighed on the team, which has some of the AI industry's most in-demand talent. Some members have signaled internally that they are unhappy that the company is considering technology from a third party, creating the perception that they are to blame, at least partially, for the company's AI shortcomings. They've said that they could leave for multimillion-dollar packages being floated by Meta Platforms Inc. and OpenAI.

Meta, the owner of Facebook and Instagram, has been offering some engineers annual pay packages between $10 million and $40 million — or even more — to join its new Superintelligence Labs group, according to people with knowledge of the matter. Apple is known, in many cases, to pay its AI engineers half — or even less — than what they can get on the open market. One of Apple's most senior large language model researchers, Tom Gunter, left last week. He had worked at Apple for about eight years, and some colleagues see him as difficult to replace given his unique skillset and the willingness of Apple's competitors to pay exponentially more for talent. Apple this month also nearly lost the team behind MLX, its key open-source system for developing machine learning models on the latest Apple chips. After the engineers threatened to leave, Apple made counteroffers to retain them — and they're staying for now.

Anthropic and OpenAI Discussions

In its discussions with both Anthropic and OpenAI, the iPhone maker requested a custom version of Claude and ChatGPT that could run on Apple's Private Cloud Compute servers — infrastructure based on high-end Mac chips that the company currently uses to operate its more sophisticated in-house models. Apple believes that running the models on its own chips housed in Apple-controlled cloud servers — rather than relying on third-party infrastructure — will better safeguard user privacy.
The company has already internally tested the feasibility of the idea.

Other Apple Intelligence features are powered by AI models that reside on consumers' devices. These models — slower and less powerful than cloud-based versions — are used for tasks like summarizing short emails and creating Genmojis. Apple is opening up the on-device models to third-party developers later this year, letting app makers create AI features based on its technology. The company hasn't announced plans to give apps access to the cloud models. One reason for that is the cloud servers don't yet have the capacity to handle a flood of new third-party features.

The company isn't currently working on moving away from its in-house models for on-device or developer use cases. Still, there are fears among engineers on the foundation models team that moving to a third party for Siri could portend a move for other features as well in the future. Last year, OpenAI offered to train on-device models for Apple, but the iPhone maker was not interested.

Since December 2024, Apple has been using OpenAI to handle some features. In addition to responding to world knowledge queries in Siri, ChatGPT can write blocks of text in the Writing Tools feature. Later this year, in iOS 26, there will be a ChatGPT option for image generation and on-screen image analysis.

While discussing a potential arrangement, Apple and Anthropic have disagreed over preliminary financial terms, according to the people. The AI startup is seeking a multibillion-dollar annual fee that increases sharply each year. The struggle to reach a deal has left Apple contemplating working with OpenAI or others if it moves forward with the third-party plan, they said.

Management Shifts

If Apple does strike an agreement, the influence of Giannandrea, who joined Apple from Google in 2018 and is a proponent of in-house large language model development, would continue to shrink. In addition to losing Siri, Giannandrea was stripped of responsibility over Apple's robotics unit. And, in previously unreported moves, the company's Core ML and App Intents teams — groups responsible for frameworks that let developers integrate AI into their apps — were shifted to Federighi's software engineering organization.

Apple's foundation models team had also been building large language models to help employees and external developers write code in Xcode, its programming software. The company killed the project — announced last year as Swift Assist — about a month ago. Instead, Apple later this year is rolling out a new Xcode that can tap into third-party programming models. App developers can choose from ChatGPT or Claude.
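For readers wondering what "letting app makers create AI features based on its technology" might look like in practice, here is a minimal sketch in Swift. It assumes the on-device Foundation Models framework and the LanguageModelSession API that Apple previewed for developers at WWDC 2025; the exact type names, signatures, and availability are assumptions and may differ from what ultimately ships.

```swift
import FoundationModels

// Hypothetical sketch of a third-party app calling Apple's on-device foundation
// model, based on the FoundationModels framework previewed at WWDC 2025.
// Type and method names here are illustrative, not confirmed shipping API.
func summarize(_ email: String) async throws -> String {
    // A session wraps one exchange with the on-device model; the prompt and
    // response stay on the device rather than going to a cloud service.
    let session = LanguageModelSession(
        instructions: "Summarize the user's email in one short sentence."
    )
    let response = try await session.respond(to: email)
    return response.content
}
```

The design point the sketch illustrates is the same one the article draws: this developer-facing path runs entirely on the device, while the larger Claude or ChatGPT models under discussion for Siri would instead run on Apple's Private Cloud Compute servers.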


Tom's Guide
19-05-2025
- Business
- Tom's Guide
Apple's AI rollout has not gone very smoothly — and this report details what's happened
The subject of Siri, and the upgrades Apple promised back at WWDC 2024, has been pretty hot the past few months, ever since Apple had to delay the rollout of Siri's AI-infused upgrades on account of it taking "longer than [Apple] thought." Well, it sounds like this might be a learning experience for Apple.

Bloomberg's Mark Gurman and Drake Bennett have a mammoth report on Apple's Siri fiasco, and the rollout of what is apparently internally known as "LLM Siri." In fact, due to all the high-profile delays, both reporters say that Apple isn't going to be announcing new features so far in advance from now on. It sounds like this is the same lesson Apple should have learned with the AirPower charger, which was announced back in 2017 and then never got released, all because Apple announced the charger too early, before it realized it wasn't actually able to make it.

The report goes into a lot of detail, but I will try to explain the situation behind Apple's AI blunders as simply as possible. One key problem is that Apple started off late and, as previous leaks have claimed, the sudden popularity of services like ChatGPT caught the company by surprise. In fact, despite having had an AI department for many years, Apple hadn't even considered the concept of Apple Intelligence before the release of ChatGPT in 2022. Following that, Apple had to scramble to catch up, all while the rest of the tech industry was doing the same.

Before the launch of ChatGPT, Apple's software head Craig Federighi was reluctant to invest in what was needed to improve Apple's AI capabilities, especially since there was no clear end goal. According to sources, it wasn't until after ChatGPT was released and Federighi used generative AI in one of his projects that the benefits became clear to him. That led to a sudden pivot towards generative-AI features for the then-upcoming iOS 18. Despite the pivot to LLMs, it became clear that Apple wasn't going to be able to catch up, and Apple's chatbot was lagging behind the likes of ChatGPT and Gemini.

One way Apple attempted to catch up was to bolt the new LLM Siri onto the old Siri, which is the biggest problem with rolling out the new feature to iOS. Sources described the process as "whack-a-mole," with three bugs popping up every time an old problem was fixed. Apparently, individual features look good, but integrating them into a whole "Siri" assistant causes everything to fall apart. So it's no surprise that the new LLM Siri has been delayed as much as it has.

Apple's AI chief, John Giannandrea, has taken much of the blame for Apple's AI faults since he isn't a "forceful" personality like other executives. Not only is he alleged to have not fought hard enough for funding for the AI department, but employees also claim that he isn't pushing the team hard enough. This is partly because he doesn't see rival chatbot makers as serious threats to Apple, but also potentially because he doesn't believe chatbots are the kind of feature consumers actually want. However, Giannandrea has claimed that Siri's failure is not on him and should instead be pinned on Apple's marketing teams for overhyping and focusing on features that weren't finished. Apparently, finalizing those features is the responsibility of product managers, which in this case would mean Federighi.
And the final insult is that Apple was a little too conservative in buying the GPUs necessary for AI processing. Apparently, this led to Apple's rivals buying up all the supply, and the lack of GPUs meant Apple's models were trained a lot more slowly as a result.

The one thing the report makes clear is that Apple is "unlikely" to spend much time talking about Siri at WWDC 2025. Even the features that have already been announced, but have yet to materialize, are still "months away" from shipping. If there's anything Apple's good at, it's brushing its defeats under the rug and ploughing forward. So expect WWDC to focus on iOS 19, which is expected to get a major redesign, and other features ready to go when the update arrives this fall. We may even hear more about changes coming to Apple Intelligence, but if this report is accurate, we shouldn't expect a repeat of last year, which we can all agree is a good thing. The promise of Apple Intelligence is all well and good, but people don't really like buying promises, especially when those promises can be broken.

You can check out our WWDC 2025 hub for all the latest news and predictions about the upcoming show.