Latest news with #Cloud


Time of India
an hour ago
- Business
- Time of India
Google may be helping ChatGPT-maker OpenAI to reduce its dependency on Nvidia for AI chips
OpenAI has started using Google's artificial intelligence chips to power ChatGPT and its other products, a report claims. A report by the news agency Reuters cited a source familiar with the development, and the move suggests that the Microsoft-backed AI startup wants to diversify its chip suppliers beyond Nvidia. The ChatGPT maker is recognised as one of the largest purchasers of Nvidia's graphics processing units (GPUs). OpenAI uses these AI chips for both model training and inference computing, in which an AI model applies its knowledge to new information to make predictions or decisions. Earlier this month, Reuters reported that OpenAI intended to add Google's Cloud service to meet its increasing demand for computing capacity. This collaboration represents a notable partnership between two prominent competitors within the AI sector.

What does this move by OpenAI mean for Google?

As per the Reuters report, Google may have agreed to this OpenAI deal as it seeks to expand external access to its proprietary tensor processing units (TPUs), which were previously used mostly for internal operations. This shift has attracted clients such as Apple, along with startups like Anthropic and Safe Superintelligence, both founded by former OpenAI executives and seen as competitors to ChatGPT, the report adds. OpenAI's decision to lease Google's TPUs marks its first significant use of non-Nvidia chips and reflects a move away from depending entirely on the data centres of Microsoft, its main backer, the report adds. The Information, which first reported the move, claims that this development could position TPUs as a more cost-effective alternative to Nvidia's GPUs. The report also noted that OpenAI is using the TPUs via Google Cloud to help reduce inference costs. However, Google, which is a rival in the AI space, is not offering its most advanced TPU models to OpenAI, a Google Cloud employee told The Information. Adding OpenAI as a customer underscores how Google is leveraging its AI ecosystem, from hardware to software, to expand its cloud business. Both companies are yet to make official announcements about this reported deal.


New York Post
a day ago
- Sport
- New York Post
Liberty's Natasha Cloud returns to the site of WNBA betrayal
SAN FRANCISCO — Natasha Cloud's infectious personality was stuffed away like a winter coat when June hits. The woman who often has so many words to say had nothing. This was on a lowly day in early February, when, according to Cloud, she learned on Instagram that she had been traded from Phoenix to Connecticut.

The Mercury, who Cloud said promised her she'd retire with the team, included her at the last minute to get a four-team trade to acquire Satou Sabally and Alyssa Thomas across the finish line. Even worse, Cloud was now going to play for one of the worst teams in the W.


Business Standard
2 days ago
- Business
- Business Standard
INTELLECT launches Open Business Impact AI Platform on PF Cloud at GIFT City
Intellect Design Arena announced the launch of Purple Fabric on PF Cloud, the world's first Open Business Impact AI Platform, at GIFT City, India's flagship International Financial Services Centre (IFSC). Anchored in GIFT City's globally competitive, regulation-compliant ecosystem of fintechs, global banks, regulators, and infrastructure providers, this strategic launch marks a significant step in advancing India's vision of becoming a Global AI Services Hub. Intellect's Open Business Impact AI Platform on PF Cloud is the culmination of over a decade of research and 20 million engineering hours, designed to help enterprises move from experimentation to enterprise-grade, accountable AI adoption.


Time Business News
2 days ago
- Business
- Time Business News
Simplifying Complex Cloud Operations: The Business Case for Managed Kubernetes in Private Cloud Environments
Every IT leader knows how it starts. A few containerized workloads run successfully. Teams grow comfortable with microservices. Soon, new applications, more clusters, and expanded environments appear. Before long, what began as a promising modernization project turns into an intricate web of dependencies, configurations, and management burdens that stretch your teams thin. Kubernetes has become the de facto standard for orchestrating containerized applications, but managing Kubernetes at scale is anything but simple. This growing operational complexity is exactly why enterprises are increasingly turning to Managed Kubernetes as a Service for Private Cloud to regain control, simplify operations, and unlock real business value.

Kubernetes offers extraordinary power, but it introduces challenges that directly affect business performance:
- Manual cluster management drains valuable engineering resources.
- Upgrades, patching, and version compatibility become time-consuming.
- Security configurations across multiple clusters grow harder to maintain.
- Downtime risks increase as complexity expands.

These operational pressures shift focus away from core innovation and product delivery. The result is slower time-to-market, rising operational costs, and frustrated teams. This is where Managed Kubernetes as a Service with Gardener offers a different path. Providers like Cloudification deliver fully managed, GitOps-driven Kubernetes environments that help businesses regain operational clarity and scale confidently without sacrificing control.

Enterprises often believe that to simplify Kubernetes operations, they must give up control to public cloud providers. This is a false choice. With Managed Kubernetes as a Service for Private Cloud, businesses retain full data ownership and governance while outsourcing the day-to-day operational burdens of Kubernetes management. By partnering with experts like Cloudification, you benefit from:
- Fully automated cluster provisioning, upgrades, and maintenance
- Consistent security policies applied uniformly across environments
- Immediate response to incidents without draining internal teams
- Freedom from vendor lock-in with open-source technology foundations

Your engineers stay focused on building and delivering value, not maintaining complex infrastructure behind the scenes.

Every hour spent troubleshooting Kubernetes clusters is an hour not spent delivering customer value. Internal Kubernetes management often leads to hidden operational costs that quietly accumulate:
- Increased staffing requirements for specialized skills
- Delays caused by troubleshooting complex deployment issues
- Long-term expenses tied to poorly optimized resource usage

Managed services convert unpredictable operational overhead into transparent service costs. This allows for:
- Lower total operational expenses over time
- More predictable financial planning
- Better resource utilization and cluster optimization

Cloudification's GitOps-driven automation ensures that your clusters remain consistent, efficient, and fully aligned with best practices, minimizing waste and maximizing performance.

Security remains one of the most challenging aspects of Kubernetes management, especially in regulated industries. Each new cluster introduces potential configuration drift and access control inconsistencies. By choosing managed Kubernetes services, you gain:
- Consistent role-based access controls across all environments
- Automated patching and vulnerability management
- Centralized audit logging for compliance reporting
- Full visibility into cluster health and security posture

Instead of constant firefighting, your security and compliance teams operate from a position of confidence, knowing that policies are enforced uniformly at every level.

In competitive markets, the speed at which you can bring new features and services to market directly impacts your business growth. Complex Kubernetes operations often become bottlenecks to this agility. Managed Kubernetes simplifies deployment pipelines, reduces downtime during upgrades, and eliminates many of the manual steps that slow release cycles. This allows your development teams to:
- Deploy new features more frequently and safely
- Experiment with new services without infrastructure concerns
- Recover faster from failures or performance issues

Not every organization needs a massive Kubernetes footprint on day one. Managed Kubernetes supports gradual adoption. Begin with a few key applications or business-critical workloads. Gain confidence as you see operational stability improve. As needs grow, easily scale clusters horizontally without adding complexity to your internal operations. Cloudification's consulting and workshop services help teams build internal Kubernetes skills while maintaining operational stability throughout growth phases.

Even with careful planning, Kubernetes containerization projects can encounter unexpected challenges. In-house teams may struggle with:
- Complex multi-cluster networking
- Storage integration for stateful workloads
- Performance tuning under heavy load

When these issues arise, having experienced Kubernetes experts readily available makes a significant difference. Cloudification's managed service model provides immediate access to certified professionals who help resolve problems quickly while empowering your team to learn and grow.

Keeping Kubernetes environments healthy long-term requires consistent operational discipline. Fortunately, managed services simplify much of this by design. To keep your private cloud Kubernetes environment optimized:
- Review cluster resource utilization periodically
- Conduct security audits on role-based access configurations
- Validate disaster recovery processes regularly
- Encourage cross-functional feedback between development and operations

These lightweight habits ensure that your managed Kubernetes deployment continues delivering value sustainably over time.

At its core, adopting Managed Kubernetes as a Service with Gardener is not just a technology decision. It is a business strategy to reduce operational burdens, control costs, strengthen security, and empower faster innovation. By simplifying the most complex aspects of Kubernetes management, enterprises regain focus on what matters most: delivering exceptional products and services to their customers. With open-source foundations, GitOps automation, and expert guidance, Cloudification provides businesses with a Kubernetes platform that balances power and simplicity. You maintain full control over your data and systems while eliminating the daily operational headaches that slow progress. If you are ready to simplify your cloud operations and turn Kubernetes into a true business enabler, Cloudification is here to help you design and operate a private cloud environment that finally works for your business, not against it.
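The maintenance checklist above mentions reviewing cluster resource utilization periodically. As an editorial illustration only (not from the article), here is a minimal Python sketch of such a check using the official kubernetes client library: it compares the CPU and memory requests declared by running pods against the total allocatable capacity of the nodes. The kubeconfig context, the simplified quantity parsing, and how you would act on the numbers are all assumptions.

```python
# Hypothetical utilization check: sum pod CPU/memory requests and compare them
# against node allocatable capacity. Assumes a reachable cluster via kubeconfig
# and the official "kubernetes" Python client (pip install kubernetes).
from kubernetes import client, config


def parse_cpu(value: str) -> float:
    """Convert a Kubernetes CPU quantity ('250m', '2') to cores."""
    return float(value[:-1]) / 1000 if value.endswith("m") else float(value)


def parse_memory(value: str) -> float:
    """Convert a Kubernetes memory quantity ('512Mi', '2Gi', '16374336Ki') to bytes."""
    units = {"Ki": 2**10, "Mi": 2**20, "Gi": 2**30, "Ti": 2**40,
             "K": 10**3, "M": 10**6, "G": 10**9, "T": 10**12}
    for suffix, factor in units.items():
        if value.endswith(suffix):
            return float(value[: -len(suffix)]) * factor
    return float(value)


def main() -> None:
    config.load_kube_config()  # uses the current kubeconfig context
    core = client.CoreV1Api()

    # Total allocatable capacity across all nodes.
    alloc_cpu = alloc_mem = 0.0
    for node in core.list_node().items:
        alloc_cpu += parse_cpu(node.status.allocatable["cpu"])
        alloc_mem += parse_memory(node.status.allocatable["memory"])

    # Sum of declared requests for all running pods.
    req_cpu = req_mem = 0.0
    for pod in core.list_pod_for_all_namespaces().items:
        if pod.status.phase != "Running":
            continue
        for c in pod.spec.containers:
            requests = (c.resources.requests or {}) if c.resources else {}
            req_cpu += parse_cpu(requests.get("cpu", "0"))
            req_mem += parse_memory(requests.get("memory", "0"))

    cpu_pct = 100 * req_cpu / alloc_cpu if alloc_cpu else 0.0
    mem_pct = 100 * req_mem / alloc_mem if alloc_mem else 0.0
    print(f"CPU requested: {req_cpu:.1f} / {alloc_cpu:.1f} cores ({cpu_pct:.0f}%)")
    print(f"Memory requested: {req_mem / 2**30:.1f} / {alloc_mem / 2**30:.1f} GiB ({mem_pct:.0f}%)")


if __name__ == "__main__":
    main()
```

Run against any cluster reachable from your kubeconfig; a request-to-allocatable ratio that stays very low, or that sits near 100%, is a hint that workload requests or cluster sizing deserve a second look during the periodic review.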

Engadget
2 days ago
- Engadget
AI-powered chat summaries are coming to WhatsApp
Meta is adding a new Message Summaries feature to WhatsApp that uses AI to summarize unread messages in a few bullet points. The feature is built on the Private Processing technique Meta announced at Llamacon in April, and claims to let AI work with content in WhatsApp without exposing any of it to Meta itself. Once the feature appears in your app, you just tap on the onscreen banner over your unread messages with that says "Summarize privately" to receive a summary from Meta AI. The Message Summaries feature is rolling out to WhatsApp users in the US chatting in English first, but Meta says it hopes to "bring it to other languages and countries later this year." The company pitches summaries as an easier way to catch-up on what you missed if you haven't checked your phone or you're just in too many chats. AI is by no means foolproof at even simple tasks like this — Apple's trouble with notification summaries was only a few months ago — but the tool could be appealing to people in particularly large and active chats. The real novelty of the summaries is how Meta claims to be deploying them without walking back the private nature of WhatsApp chats. The company has a blog post and whitepaper digging into the details of how Private Processing works, but on first blush it sounds similar to Private Cloud Compute, the method Apple uses to call on more demanding AI features without exposing its users' data. Using end-to-end encryption and a secure cloud environment, WhatsApp messages can be processed without data being accessed while its happening, or saved after the fact. Importantly, all of this is still optional. Summaries won't be provided without you asking for them first, and the feature is disabled by default. Meta also says you can exclude chats from being shared with the company's AI via the Advanced Chat Privacy feature.