
The Apple AirPods Pro 3 release date may have been delayed: here's what we know
The AirPods Pro 3 were initially rumoured to launch alongside the iPhone 17 later this year, but that no longer appears to be the case. Two respected analysts now claim the AirPods Pro 3 won't arrive until early 2026, and it'll be an even longer wait for the AirPods Max 2, which could follow in 2027.
Arguably Apple's most popular product after the iPhone, the AirPods Pro have long set the standard for wireless earbuds. Last year, Apple introduced the AirPods 4 and a new variant with active noise cancellation, bringing ANC to its more affordable earbuds for the very first time.
But with rumours pointing to a redesign for both the earbuds and the charging case, plus new health-tracking features and even infrared sensors, the AirPods Pro 3 could be Apple's most advanced earbuds yet. Here's everything we know so far, including the expected release date, price and potential new features.
Apple AirPods Pro 3: Release date
There's no official release date for the AirPods Pro 3 yet. Apple hasn't even confirmed they exist, but there's no shortage of speculation.
Back in February, Bloomberg's Mark Gurman claimed the AirPods Pro 3 were just 'months away' from launch, fuelling expectations of a September 2025 release alongside the iPhone 17. Apple has historically unveiled new earbuds during its autumn keynotes, so this seemed plausible at the time.
However, more recent reports paint a very different (and rather disappointing) picture. In May, reliable supply chain analyst Ming-Chi Kuo said that 'AirPods may not see significant updates until 2026', suggesting that the AirPods Pro 3 won't launch until next year. He added that Apple's next-generation earbuds are expected to feature infrared cameras, which could be contributing to the delay.
In June, a leaked investor note from analyst Jeff Pu also pointed to a 2026 launch for the AirPods Pro 3. While Pu didn't publicly comment, the internal report, which was later shared on X, claimed that no new AirPods will launch in 2025, with the AirPods Pro 3 now expected in 2026 and a lighter version of the AirPods Max following in 2027.
Apple AirPods Pro 3: Price
There aren't any pricing rumours surrounding the AirPods Pro 3 just yet, but earbuds and headphones don't tend to fluctuate in price very much. In 2022, the AirPods Pro 2 launched at £249, and Apple lowered the price in 2023 to £229 when it launched a new charging case with USB-C.
I don't think the AirPods Pro 3 will cost significantly more than this, though it's likely Apple will bump it back up to £249 rather than stick with the £229 price tag of the AirPods Pro 2.
The base AirPods 4 cost £129 and the AirPods 4 with active noise cancellation cost £179, so the AirPods Pro 3 are certain to cost more than that. For comparison, the AirPods 3 cost £169, so Apple raised that line's price by just £10 last time around. A return to a £249 price tag for the Pro model isn't out of the realm of possibility.
Apple AirPods Pro 3: Design and specs
If the rumours are to be believed, the AirPods Pro 3 have been in the works for a number of years, and they're set to get a big design upgrade.
In October 2023, Gurman claimed that the AirPods Pro would be updated with a new design and a faster H3 chip, giving them even better active noise cancellation.
Earlier this February, Apple filed a patent that described a change to the way the earbuds' touch controls work. According to the patent, users will be able to perform vertical control movements on a non-capacitive surface, meaning you could theoretically increase the volume while wearing gloves.
I think they'll look broadly the same, however: white AirPods with short stems. The tech giant doesn't seem to deviate very much when it comes to design, and the earbuds are so distinctive that I can't see them shifting in colour, either.
The charging case will almost certainly boast USB-C connectivity, seeing as Apple has pretty much shifted all its devices over to the new standard, and it's very likely the AirPods Pro 3 will borrow design features from the fourth-generation charging case.
Apple got rid of the physical 'setup button' on the AirPods 4's rear, replacing it with a capacitive sensor on the front of the case, just under the LED indicator, which all but disappears when it's not in use. The case is also slimmer and lighter, and I expect the AirPods Pro 3 to gain a similar charging case when they're released.
Apple AirPods Pro 3: Features
Now, here's the most exciting part about the AirPods Pro 3 – the features. While sound and active noise cancellation are expected to get better, it's the features that could really make the earbuds shine.
At WWDC 2025, Apple announced several iOS 26 features for currently-released AirPods, many of which are likely to carry over to the AirPods Pro 3. These include the ability to control your iPhone or iPad camera remotely by pressing the stem and a new sleep detection feature that pauses playback when you nod off. Apple has also announced that it's adding more head gesture controls and will roll out live translation during phone calls.
But what about new features specific to the AirPods Pro 3? Firstly, they could be crammed full of health and fitness features. The all-new Beats Powerbeats Pro 2 can track your heart rate, and according to Mark Gurman, this feature could also come to the AirPods Pro 2. He said in December 2024 that Apple was working on in-ear heart rate monitoring for fitness tracking, meaning you could track your heart rate while exercising without having to wear an Apple Watch. He also suggested that Apple was researching in-ear temperature sensors.
Lastly, Apple is reportedly testing infrared cameras in future AirPods models, which may be the reason for the delay. According to Bloomberg's Mark Gurman, the cameras could enable new AI-powered features and spatial computing experiences, while analyst Ming-Chi Kuo suggests they might support spatial audio enhancements and even enable gesture-based controls.
Kuo previously said the IR-camera-equipped AirPods were unlikely to enter mass production until 2026, which could coincide with the expected release of the Pro 3, though it's unclear whether this tech will make it into the AirPods Pro 3 or a later model.
With impressive sound and noticeably better noise cancellation than the first-generation Pro earbuds, the AirPods Pro 2 also feature touch controls on the buds themselves. In his review, tech critic David Phelan said of the buds: 'Noise-cancelling was good enough to reduce continuous sounds such as engine noise on a train'.

Related Articles


Reuters
Ingram Micro says it identified ransomware on certain of its internal systems
July 5 (Reuters) - Ingram Micro (INGM.N) said on Saturday it recently identified ransomware on certain of its internal systems. The information technology company took steps to secure the relevant environment, including taking certain systems offline, it said in a statement. The Irvine, California-based company also launched an investigation with the assistance of leading cybersecurity experts and notified law enforcement, it added.

Finextra
Context Engineering for Financial Services: By Steve Wilcockson
The hottest discussion in AI right now, at least the one not about agentic AI, is about how "context engineering" is more important than prompt engineering: how you give AI the data and information it needs to make decisions, and why it cannot (and must not) be a solely technical function. "'Context' is actually how your company operates; the ideal versions of your reports, documents & processes that the AI can use as a model; the tone & voice of your organization. It is a cross-functional problem." So says renowned tech influencer and Associate Professor at the Wharton School, Ethan Mollick. He in turn cites fellow tech influencer Andrej Karpathy on X, who in turn cites Tobi Lutke, CEO of Shopify: "It describes the core skill better: the art of providing all the context for the task to be plausibly solvable by the LLM." The three together - Mollick, Karpathy and Lutke - make for a powerful triumvirate of tech influencers.

Karpathy consolidates the subject nicely. He emphasizes that in real-world, industrial-strength LLM applications, the challenge entails filling the model's context window with just the right mix of information. He thinks about context engineering as both a science, because it involves structured systems, system-level thinking, data pipelines, and optimization, and an art, because it requires intuition about how LLMs interpret and prioritize information. His analysis reflects two of my predictions for 2025: one highlighting the increasing impact of uncertainty, the other a growing appreciation of knowledge.

Tech mortals offered further useful comments on the threads, two of my favorites being: 'Owning knowledge no longer sets anyone apart; what matters is pattern literacy—the ability to frame a goal, spot exactly what you don't know, and pull in just the right strands of information while an AI loom weaves those strands into coherent solutions.' And: 'It also feels like 'leadership' Tobi. How to give enough information, goal and then empower.'

I love the AI loom analogy, in part because it corresponds with one of my favorite data descriptors, the "Contextual Fabric". I like the leadership positivity too, because the AI looms and contextual fabrics are led by and empowered by humanity.

Here's my spin, to take or leave. Knowledge, based on data, isn't singular; it's contingent, contextual. Knowledge, and thus the contextual fabric of data in which it is embedded, is ever changing, constantly shifting, dependent on situations and needs. I believe knowledge is shaped by who speaks, who listens, and what about. That is, to a large extent, led by power and the powerful. Whether in Latin, science, religious education, finance and now AI, what counts as 'truth' is often a function of who gets to tell the story. It's not just about what you know, but how, why, and where you know it, and who told you it. But of course it's not that simple; agency matters: the peasant can become an abbot, the council house schoolgirl can become a Nobel prize-winning scientist, a frontier barbarian can become a Roman emperor. For AI, on one hand truth is held by the big tech firms and grounded in their biases; on the other, it's democratizing, in that all of us and our experiences help train and ground AI, in theory at least.

I digress. For AI-informed decision intelligence, context will likely be the new computation, making GenAI tooling more useful than an oft-hallucinating stochastic parrot while enhancing traditional AI - predictive machine learning, for example - to be increasingly relevant and affordable for the enterprise.

Context Engineering for FinTech

Context engineering—the art of shaping the data, metadata, and relationships that feed AI—may become the most critical discipline in tech. This is like gold for those of us in the FinTech data engineering space, because we're the dudes helping you create your own context.
I'll explore how five different contextual approaches, all representing data engineering-relevant vendors I have worked for—technical computing, vector-based, time-series, graph and geospatial platforms—can support context engineering.

Parameterizing with Technical Computing

Technical computing tools - think R, Julia, MATLAB and Python's SciPy stack - can integrate domain-specific data directly into the model's environment through structured inputs, simulations, and real-time sensor data, normally as vectors, tables or matrices. For example, in engineering or robotics applications, an AI model can be fed with contextual information such as system dynamics, environmental parameters, or control constraints. Thus the model can make decisions that are not just statistically sound but also physically meaningful within the modeled system. These tools can dynamically update the context window of an AI model, for example in scenarios like predictive maintenance or adaptive control, where AI must continuously adapt to new data. By embedding contextual cues, like historical trends, operational thresholds, or user-defined rules, such tools help ground the model's outputs in the specific realities of the task or domain.

Financial Services Use Cases

Quantitative Strategy Simulation: Simulate trading strategies and feed results into an LLM for interpretation or optimization.

Stress Testing Financial Models: Run Monte Carlo simulations or scenario analyses and use the outputs to inform LLMs about potential systemic risks.

Vectors and the Semantics of Similarity

Vector embeddings are closely related to the linear algebra of technical computing, but they bring semantic context to the table. Typically stored in so-called vector databases, they encode meaning into high-dimensional space, allowing AI to retrieve through search not just exact matches, but conceptual neighbors. They thus allow for multiple stochastically arranged answers, not just one.
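To make "conceptual neighbors" concrete, here is a toy similarity search: rank a small corpus against a query by cosine similarity and keep the closest matches as context for an LLM. Everything here is invented for illustration, and the bag-of-words "embedding" is a deliberate simplification; a real system would use learned dense embeddings from an embedding model, stored in a vector database.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a sparse bag-of-words count vector. Real systems
    # use dense vectors produced by a trained embedding model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    # Rank stored documents by similarity to the query and return the
    # top-k conceptual neighbors to place in the LLM's context window.
    q = embed(query)
    return sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

# Invented mini-corpus of support/fraud notes.
corpus = [
    "card transaction flagged as suspicious by fraud team",
    "customer asked how to reset online banking password",
    "unusual wire transfer pattern resembling known fraud case",
]
context = retrieve("suspicious transfer fraud", corpus)
```

Note that the two fraud-related notes surface ahead of the password query even though neither matches the query word-for-word: that is the retrieval-by-meaning behaviour the prose describes, in miniature.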
Until recently, vector embeddings and vector databases have been the primary providers of enterprise context to LLMs, shoehorning all types of data into searchable mathematical vectors. Their downside is their brute-force, compute-intensive approach to storing and searching data. That said, they use similar transfer learning approaches - and deep neural nets - to those that drive LLMs. As expensive, powerful brute-force vehicles of Retrieval-Augmented Generation (RAG), vector databases don't simply store documents but understand them, and they have an increasingly proven place in enabling LLMs to ground their outputs in relevant, contextualized knowledge.

Financial Services Use Cases

Customer Support Automation: Retrieve similar past queries, regulatory documents, or product FAQs to inform LLM responses in real time.

Fraud Pattern Matching: Embed transaction descriptions and retrieve similar fraud cases to help the model assess risk or flag suspicious behavior.

Time-Series, Temporal and Streaming Context

Time-series database and analytics providers, along with in-memory and columnar databases that can organize their data structures by time, specialize in knowing about the when. They can ensure temporal context—the heartbeat of many use cases in financial markets as well as IoT and edge computing—grounds AI at the right time with time-denominated sequential accuracy. Streaming systems such as Kafka and Flink can also act as the real-time central nervous systems of financial event-based architectures. It's not just about having access to time-stamped data, but about analyzing it in motion, enabling AI to detect patterns, anomalies, and causality as close as possible to real time. In context engineering, this is gold. Whether it's fraud that happens in milliseconds or sensor data populating insurance telematics, temporal granularity can be the difference between insight and noise, with context stored and delivered by what some might see as a data timehouse.
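As a minimal sketch of temporal grounding, the following flags anomalous ticks in a price series with a rolling z-score and serializes the result into a sentence an LLM could receive as context. The price series, window, and threshold are all invented; a production pipeline would compute this inside a streaming engine or time-series store rather than in a Python list.

```python
from statistics import mean, stdev

def anomaly_flags(prices: list[float], window: int = 5,
                  threshold: float = 3.0) -> list[int]:
    # Flag indices where a price deviates from the rolling mean of the
    # preceding `window` ticks by more than `threshold` standard deviations.
    flags = []
    for i in range(window, len(prices)):
        hist = prices[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma and abs(prices[i] - mu) / sigma > threshold:
            flags.append(i)
    return flags

def to_context(prices: list[float], flags: list[int]) -> str:
    # Serialize the anomalies into plain language for the context window.
    if not flags:
        return "No anomalous ticks detected in the latest window."
    points = ", ".join(f"t={i} (price {prices[i]})" for i in flags)
    return f"Anomalous ticks detected at: {points}."

# Invented tick series with one obvious spike at index 6.
prices = [100.0, 100.2, 99.9, 100.1, 100.0, 100.1, 112.5, 100.2]
flags = anomaly_flags(prices)
```

The point is the hand-off: the statistics run where the data lives, and only the compact, time-stamped summary is injected into the prompt, rather than the raw tick stream.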
Financial Services Use Cases

Market Anomaly Detection: Injecting real-time price, volume, and volatility data into an LLM's context allows it to detect and explain unusual market behavior.

High-Frequency Trading Insights: Feed LLMs with microsecond-level trade data to analyze execution quality or latency arbitrage.

Graphs That Know Who's Who

Graph and relationship-focused providers play a powerful role in context engineering by structuring and surfacing relationships between entities that are otherwise hidden in raw data. For large language models (LLMs), graph platforms can dynamically populate the model's context window with relevant, interconnected knowledge—such as relationships between people, organizations, events, or transactions. They enable the model to reason more effectively, disambiguate entities, and generate responses that are grounded in a rich, structured understanding of the domain. Graphs can act as a contextual memory layer through GraphRAG and Contextual RAG, ensuring that the LLM operates with awareness of the most relevant and trustworthy information. For example, graph databases - or other environments, such as Spark, that can store graph data types in accessible file formats like Parquet on HDFS - can be used to retrieve a subgraph of relevant nodes and edges based on a user query, which can then be serialized into natural language or structured prompts for the LLM. Platforms that focus graph context on entity resolution and contextual decision intelligence can enrich the model's context with high-confidence, real-world connections—especially useful in domains like fraud detection, anti-money laundering, or customer intelligence. Think of them as Shakespeare's Comedy of Errors meets Netflix's Department Q: two Antipholuses and two Dromios rather than one of each in Comedy of Errors? Only one Jennings brother to investigate in Department Q's case, and where does Kelly Macdonald fit into anything?
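The subgraph-retrieval pattern described above can be sketched with a toy triple store: gather everything within a couple of hops of a starting entity, then serialize those edges into plain sentences for the prompt. All entities and relations below are invented; a graph platform would supply resolved, high-confidence triples instead.

```python
from collections import deque

# Invented knowledge graph as (subject, relation, object) triples,
# the kind of output a graph platform exposes after entity resolution.
edges = [
    ("Acme Ltd", "director", "J. Smith"),
    ("J. Smith", "owns_account", "ACC-001"),
    ("ACC-001", "transfers_to", "ACC-002"),
    ("ACC-002", "owned_by", "Shell Co"),
]

def neighbourhood(start: str, hops: int = 2) -> list[tuple[str, str, str]]:
    # Breadth-first collection of all triples within `hops` of `start`,
    # i.e. a GraphRAG-style subgraph for the LLM's context window.
    adj: dict[str, list[tuple[str, str, str]]] = {}
    for s, r, o in edges:
        adj.setdefault(s, []).append((s, r, o))
        adj.setdefault(o, []).append((s, r, o))
    seen, result, frontier = {start}, [], deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == hops:
            continue
        for s, r, o in adj.get(node, []):
            if (s, r, o) not in result:
                result.append((s, r, o))
            for nxt in (s, o):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, depth + 1))
    return result

def serialize(triples: list[tuple[str, str, str]]) -> str:
    # Flatten the subgraph into prompt-ready text.
    return " ".join(f"{s} --{r}--> {o}." for s, r, o in triples)

subgraph = neighbourhood("Acme Ltd")
```

Only the two edges reachable within two hops of "Acme Ltd" make it into the prompt; the rest of the graph stays out of the context window, which is exactly the curation the prose is arguing for.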
Entity resolution and graph context can help resolve and connect them in a way that more standard data repositories and analytics tools struggle with. LLMs cannot function without correct and contingent knowledge of people, places, things and the relationships between them, though, to be sure, many types of AI can also help discover the connections and resolve entities in the first place.

Financial Services Use Cases

AML and KYC Investigations: Surface hidden connections between accounts, transactions, and entities to inform LLMs during risk assessments.

Credit Risk Analysis: Use relationship graphs to understand borrower affiliations, guarantors, and exposure networks.

Seeing the World in Geospatial Layers

Geospatial platforms support context engineering by embedding spatial awareness into AI systems, enabling them to reason about location, proximity, movement, and environmental context. They provide rich, structured data layers (e.g., terrain, infrastructure, demographics, weather) that can be dynamically retrieved and injected into an LLM's context window. This allows the model to generate responses that are not only linguistically coherent but also geographically grounded. For example, in disaster response, a geospatial platform can provide real-time satellite imagery, flood zones, and population density maps; this data can be translated into structured prompts or visual inputs for an AI model tasked with coordinating relief efforts or summarizing risk. Similarly, in urban planning or logistics, geospatial context helps the model understand constraints like traffic patterns, zoning laws, or accessibility. In essence, geospatial platforms act as a spatial memory layer, enriching the model's understanding of the physical world and enabling more accurate, context-aware decision-making.

Financial Services Use Cases

Branch Network Optimization: Combine demographic, economic, and competitor data to help LLMs recommend new branch locations.
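A minimal sketch of that kind of spatial grounding: tag mortgaged properties against flood-zone layers, then summarize the exposure as a sentence of prompt context. The zone bounding box and property coordinates below are invented stand-ins; a real pipeline would query a geospatial platform holding official flood maps, with proper polygon geometry rather than rectangles.

```python
# Invented flood-zone bounding boxes (lat_min, lat_max, lon_min, lon_max),
# a crude stand-in for real polygon layers from a geospatial platform.
FLOOD_ZONES = {
    "Riverside-A": (51.40, 51.45, -0.35, -0.25),
}

def in_zone(lat: float, lon: float, box: tuple) -> bool:
    # Point-in-rectangle test; real systems do point-in-polygon.
    lat_min, lat_max, lon_min, lon_max = box
    return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max

def exposure_context(portfolio: list[tuple[str, float, float]]) -> str:
    # Tag each property with any flood zone containing it, then
    # summarize the exposure as a sentence for the LLM's context window.
    exposed = [
        (prop_id, name)
        for prop_id, lat, lon in portfolio
        for name, box in FLOOD_ZONES.items()
        if in_zone(lat, lon, box)
    ]
    if not exposed:
        return "No mortgaged properties fall inside a mapped flood zone."
    listing = ", ".join(f"{p} ({z})" for p, z in exposed)
    return f"Flood-exposed properties: {listing}."

# Invented two-property mortgage book: one inside the zone, one outside.
portfolio = [("MORT-17", 51.42, -0.30), ("MORT-18", 51.60, -0.10)]
ctx = exposure_context(portfolio)
```

As with the other sketches, the heavy spatial computation happens in the platform's layers; the LLM only ever sees the short, geographically grounded summary.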
Climate Risk Assessment: Integrate flood zones, wildfire risk, or urban heat maps to evaluate the environmental exposure of mortgage and insurance portfolios.

Context Engineering Beyond the Limits of Data, Knowledge & Truths

Context engineering, I believe, recognizes that data is partial, and that knowledge, and perhaps truth or truths, needs to be situated, connected, and interpreted. Whether through graphs, time-series, vectors, technical computing platforms, or geospatial layering, AI depends on weaving the right contextual strands together. Where AI represents the loom, the five types of platforms I describe are like the spindles, needles, and dyes, drawing on their respective contextual fabrics of ever-changing data and driving threads of knowledge—contingent, contextual, and ready for action.


Auto Blog
No End in Sight? US BMW XM Sales Fall 24% in Q2 2025
Turns out the XM isn't just hard to look at; it's hard to sell, too. You'd no doubt recognize the XM if you happened to see one (of the apparently few) on the street. A front-end design like no other, complete with attention-grabbing, glowing grilles, makes the SUV almost impossible to ignore. This was probably, at least partially, the point when BMW launched the halo M product back at the end of 2022. But sales figures show that BMW may have totally missed the mark with the XM, a rare coincidence with what most brand enthusiasts have been saying since the SUV's inception.

Sales of the BMW XM peaked early on and never picked back up

Things started off okay, if not amazing, for the XM. It launched in late Q1 2023, and in Q2 of the same year, 762 models made it into customer garages. That's not bad when you consider the BMW XM is a somewhat low-production model commanding a price deep in the six figures. Q1 2024 even saw growth for the model, likely a result of the SUV coming to market in the US late in the quarter, but hey, we'll give credit where it's due. Moving 541 units, an increase of 32.3% over the previous year, the XM looked to hold its own. Unfortunately, the good news for XM sales ended there, a year and a half ago. XM sales fell 29.5% and 30.7% in Q2 and Q3 of 2024, respectively, and most recently, in Q2 2025, they fell another 23.8%, with just 409 SUVs finding new homes. While the Bavarian automaker obviously isn't producing as many XMs as, say, the BMW 3 Series, low production isn't the only reason the model isn't moving: a quick search online reveals over 200 XM models sitting on dealer lots, including 2023 and 2024 models.
Changes for 2026 seem unlikely to cure what ails the model, but time will tell

It's unlikely to be reactionary, since it's about the time in the XM's product lifecycle for an update, but BMW is making changes to the 2026 BMW XM. The newest model year drops the base model, leaving only the XM Label. The 2026 XM Label enjoys 738 horsepower and 738 pound-feet of torque, nearly 100 more horsepower than the standard car, plus some special red badging details. For 2026, BMW will also introduce new exterior paints and interior upholstery, wheels, and a welcome light animation. That also means the price of entry rises to over $186,000, when previously you could get a base model for around $160,000. Unless, of course, BMW does something wacky and debuts the 2026 XM Label at a lower MSRP than the 2025 model. Which, while unlikely, isn't impossible. Despite announcing and showcasing the model's changes for 2026, the automaker is mum on pricing, saying it will be announced closer to the model's August start of production. If the brand really wants to move some metal, a lower MSRP would be a great way to do it. After all, it's hard to claim the XM is at all close to the aspirational model BMW M hoped it would be. Either way, the elimination of the base model will effectively doom the XM to a continued downward sales trend for the remainder of its potentially short life.

Final thoughts

The XM is a victim of marketing. I've had a decent chunk of time behind the wheel of the XM, and the saddest thing about seeing it fail is that it's actually a pretty competent car.
If you pulled all the silly badging and repositioned it as an X8, priced alongside the top-tier X7 M60i or something similar, it would likely be a pretty good seller. Yes, it's still ugly, but beauty is in the eye of the beholder, and I was never the target audience. It will be interesting to see how BMW pivots from the XM to the next halo car and what lessons the brand might have learned along the way.

About the Author: Steven Paul