
After DeepSeek, China's Baidu to open source its Ernie AI chatbot
Baidu is set to open source its Ernie generative AI model, a major development in the ongoing global AI competition. The company has confirmed that the open-sourcing of its large language model will begin with a gradual rollout starting Monday. While it may not be as disruptive as the emergence of DeepSeek, Baidu's move is already sparking debate within the AI community and is being closely watched by industry leaders across the globe.

Baidu's decision comes as a surprise to many, especially given its long-standing preference for a proprietary approach to AI development. The company had previously opposed the open-source model, favouring internal control over its tools and infrastructure.

'Baidu has always been very supportive of its proprietary business model and was vocal against open-source, but disruptors like DeepSeek have proven that open-source models can be as competitive and reliable as proprietary ones,' Lian Jye Su, chief analyst with technology research and advisory group Omdia, previously told CNBC.
While some experts believe Baidu's move may not have the same dramatic effect as DeepSeek's launch, others argue that it is an important milestone in the broader evolution of artificial intelligence.

'This isn't just a China story. Every time a major lab open-sources a powerful model, it raises the bar for the entire industry,' said Sean Ren, associate professor of computer science at the University of Southern California and Samsung's AI Researcher of the Year.

Ren added that open-source models put pressure on companies like OpenAI and Anthropic to justify their closed platforms, premium APIs, and subscription-based pricing models. 'While most consumers don't care whether a model's code is open-sourced, they do care about lower costs, better performance, and support for their language or region. Those benefits often come from open models, which give developers and researchers more freedom to iterate, customize, and deploy faster,' he said.

Industry insiders are also pointing to the broader impact Baidu's move could have on pricing. Alec Strasmore, founder of AI advisory Epic Loot, likened the development to a direct challenge to the commercial dominance of current AI leaders.

'Baidu just threw a Molotov into the AI world,' Strasmore said. 'OpenAI, Anthropic, DeepSeek, all these guys who thought they were selling top-notch champagne are about to realise that Baidu will be giving away something just as powerful,' he added, comparing Baidu's move to budget retail giant Costco creating its own high-quality alternative.

'This isn't a competition; it's a declaration of war on pricing,' he said, adding that the open-source release of Ernie could encourage startups and developers to stop paying top dollar for AI access.

Baidu's ambitions are clear. In March, the company claimed its latest ERNIE X1 model could match the performance of DeepSeek's R1 at half the price.
CEO Robin Li also hinted earlier this year that Baidu's open-source strategy aims to support developers across the globe.

'Our releases aim to empower developers to build the best applications, without having to worry about model capability, costs, or development tools,' Li said during a developer event in April.

However, not everyone is convinced the news will immediately disrupt the global AI landscape. Cliff Jurkiewicz, vice president of global strategy at applied AI firm Phenom, said Baidu's announcement may not generate much reaction in markets like the US.

'The news of Baidu going open source probably lands with a big thud,' Jurkiewicz said. 'Most people in the United States don't even know it's a Chinese tech company.'

He also drew comparisons between Baidu's move and the early days of Android. 'When Android first emerged, its standout feature was that it was configurable and customisable. But it was almost too much work in the sense that people just wanted the thing to function correctly,' he said. 'Android, out of the box, is plain and vanilla, so it has to be customised, and that's a real challenge,' Jurkiewicz added.
Related Articles

Hindustan Times
OpenAI says it has no plan to use Google's in-house chip
Jul 01, 2025 04:11 AM IST

OpenAI said it has no active plans to use Google's in-house chip to power its products, two days after Reuters and other news outlets reported on the AI lab's move to turn to its competitor's artificial intelligence chips to meet growing demand.

A spokesperson for OpenAI said on Sunday that while the AI lab is in early testing with some of Google's tensor processing units (TPUs), it has no plans to deploy them at scale right now. Google declined to comment.

While it is common for AI labs to test out different chips, using new hardware at scale could take much longer and would require different architecture and software support. OpenAI is actively using Nvidia's graphics processing units (GPUs) and AMD's AI chips to power its growing demand.

OpenAI is also developing its own chip, an effort that is on track to meet the "tape-out" milestone this year, when the chip's design is finalized and sent for manufacturing.

OpenAI has signed up for Google Cloud service to meet its growing need for computing capacity, Reuters had exclusively reported earlier this month, marking a surprising collaboration between two prominent competitors in the AI sector. Most of the computing power used by OpenAI would come from GPU servers run by the so-called neocloud company CoreWeave.

Google has been expanding the external availability of its in-house AI chips, or TPUs, which were historically reserved for internal use. That has helped Google win customers including Big Tech player Apple, as well as startups such as Anthropic and Safe Superintelligence, two ChatGPT-maker competitors launched by former OpenAI leaders.


Time of India
As Apple's AI struggles continue, it again looks at these technology companies including one backed by Amazon
Apple Inc is considering a major shift in its artificial intelligence strategy, exploring partnerships with Anthropic PBC or OpenAI to power a new version of Siri, potentially sidelining its in-house AI models, Bloomberg reported. This move aims to bolster Apple's struggling AI efforts and keep pace with competitors in the rapidly evolving generative AI landscape.

According to the report, citing sources familiar with the matter, Apple has engaged in discussions with both Anthropic and OpenAI about integrating their large language models, such as Claude or ChatGPT, into Siri. The company has requested these firms train custom versions of their models to run on Apple's cloud infrastructure, specifically its Private Cloud Compute servers powered by high-end Mac chips, to prioritize user privacy. These talks remain in early stages, and Apple has not finalized its decision, with an internal project, dubbed LLM Siri, still developing in-house models.

The initiative, led by Siri chief Mike Rockwell and software engineering head Craig Federighi, follows a reassignment of Siri oversight from Apple's AI chief, John Giannandrea, amid delays and underwhelming responses to Apple Intelligence features. The report noted that Rockwell, who assumed the Siri engineering role in March, instructed his team to evaluate whether third-party models like Claude, ChatGPT, or Google's Gemini outperform Apple's own technology. Testing reportedly showed Anthropic's Claude as the most promising fit for Siri's needs, prompting talks led by Apple's vice president of corporate development, Adrian Perica.

Siri, launched in 2011, has lagged behind modern AI chatbots, with promised upgrades like enhanced app control and personal data integration delayed from early 2025 to next spring, per Bloomberg.
Apple's broader AI strategy remains uncertain, with a multibillion-dollar budget approved for 2026 to run in-house models, but executives are increasingly open to third-party solutions for a quicker turnaround.

What is Apple's current partnership with OpenAI

Apple's current AI features rely on its proprietary Apple Foundation Models, with plans for a 2026 Siri overhaul initially based on this technology. However, the report said that adopting third-party models could help Apple match the capabilities of AI assistants on Android devices, shedding its reputation as an AI laggard. This approach mirrors Samsung's use of Google's Gemini for its Galaxy AI features and Amazon's integration of Anthropic's technology for Alexa+.

What Apple's change in AI strategy means

The shift has sparked concerns within Apple's 100-person AI team, led by Ruoming Pang, with morale reportedly souring over the potential reliance on external technology. The report highlighted that some engineers feel their work is being undervalued, with competitors like Meta and OpenAI offering lucrative packages, ranging from $10 million to $40 million annually, to poach talent. Apple recently lost senior researcher Tom Gunter and narrowly retained its MLX team after counteroffers, underscoring the competitive pressure.

While Apple already uses ChatGPT for certain Siri queries and text generation, discussions with Anthropic have hit snags over financial terms, with the startup seeking a multibillion-dollar annual fee. If no deal is reached, Apple may pivot to OpenAI or other providers, signaling a potential transformation in its AI approach.


Time of India
HCLTech and OpenAI collaborate to drive enterprise-scale AI adoption
HCLTech, a leading global technology company, today announced a multi-year strategic collaboration with OpenAI, a leading AI research and deployment company, to drive large-scale enterprise AI transformation as one of the first strategic services partners to OpenAI.

This collaboration will enable HCLTech's clients to leverage OpenAI's industry-leading AI product portfolio alongside HCLTech's foundational and applied AI offerings for rapid and scaled GenAI deployment. Additionally, HCLTech will embed OpenAI's models and solutions across its industry-focused offerings, capabilities and proprietary platforms, including AI Force, AI Foundry, AI Engineering and industry-specific AI accelerators. This deep integration will help its clients modernize business processes, enhance customer and employee experiences and unlock growth opportunities, covering the full AI lifecycle, from AI readiness assessments and integration to enterprise-scale adoption, governance and change management.

HCLTech will also roll out ChatGPT Enterprise and OpenAI APIs internally, empowering its employees with secure, enterprise-grade generative AI tools.

Vijay Guntur, Global Chief Technology Officer (CTO) and Head of Ecosystems at HCLTech, said, 'We are honored to work with OpenAI, the global leader in generative AI foundation models. This collaboration underscores our commitment to empowering Global 2000 enterprises with transformative AI solutions. It reaffirms HCLTech's robust engineering heritage and aligns with OpenAI's spirit of innovation. Together, we are driving a new era of AI-powered transformation across our offerings and operations at a global scale.'

Giancarlo 'GC' Lionetti, Chief Commercial Officer at OpenAI, said, 'HCLTech's deep industry knowledge and AI engineering expertise sets the stage for scalable AI innovation. As one of the first system integration companies to integrate OpenAI's technology to improve efficiency and enhance customer experiences, they're accelerating productivity and setting a new standard for how industries can transform using generative AI.'