
Apple iPhone 17 Air: Did Apple Just Accidentally Leak A Key Design Detail?
Apple recently released its second developer beta of the next iPhone software, iOS 26. It's what the iPhone 17 series will run on when the handsets are released. But it looks like Apple has given something away about one of the new phones, perhaps by mistake.
Has Apple just revealed what the display on the iPhone 17 "Air" will look like?
As spotted by Macworld, deep in the code for the latest developer beta is a new version of the popular clownfish iPhone wallpaper. So what, you might say, that's nothing new.
Except this one is. The new wallpaper has 'a resolution of 420×912@3x, or 1260×2736 pixels. There's no iPhone model with such a resolution, at least not yet,' the report pointed out. That makes it pretty clear that one of the iPhones in development will have this all-new screen size, which ties in with the leaked reports about the super-slim handset Apple is working on, nicknamed the iPhone 17 Air.
'The new resolution wallpaper we found matches a report by analyst Ming-Chi Kuo last year, which claimed that the iPhone 17 Air would have a screen of around 6.6 inches with a resolution of approximately 1260×2740. In other words, the wallpaper all but confirms the existence of a new iPhone model with that exact display resolution. Given that iOS 26 will almost certainly be released in September along with the new iPhones, it's clear that Apple is already making the final adjustments for the next-generation iPhone,' Macworld explained.
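For the arithmetic-minded, the numbers line up: iOS wallpapers are sized in points, and an @3x scale factor means three pixels per point in each dimension, so a 420×912-point canvas works out to exactly 1260×2736 pixels. A quick back-of-envelope sketch in Swift, using Kuo's claimed 6.6-inch diagonal (his figure, not a confirmed Apple specification), also puts the implied pixel density at roughly 456 ppi:

```swift
// Back-of-envelope check of the reported figures; the 6.6-inch diagonal is
// Ming-Chi Kuo's claim, not a confirmed Apple specification.
import Foundation

let pointSize = (width: 420.0, height: 912.0)   // wallpaper canvas in points
let scale = 3.0                                  // the "@3x" Retina scale factor

let pixelWidth = pointSize.width * scale         // 1260
let pixelHeight = pointSize.height * scale       // 2736

let diagonalPixels = (pixelWidth * pixelWidth + pixelHeight * pixelHeight).squareRoot()
let impliedPPI = diagonalPixels / 6.6            // ≈ 456 pixels per inch

print(pixelWidth, pixelHeight, impliedPPI)
```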
The publication even suggests the new wallpaper was left in the beta entirely by accident: 'Since other iOS 26 wallpapers don't have that resolution, most likely the wallpaper was forgotten by Apple within the build released today for developers and shouldn't be seen by the public,' it said.
Well, the truth is you never know with Apple. Could they have planted it in the beta to get everyone talking? Personally, I adhere to the cock-up theory of history rather than the conspiracy one, but who knows?
If the wallpaper vanishes when the third beta is released, that will suggest it really was a mistake. Either way, it reinforces the belief that the iPhone 17 Air will have a 6.6-inch display, which is good to know.

Related Articles


CNET
30 minutes ago
Apple's $96 Million Siri Settlement Closes In Days. Chances Are Good You Could Be Eligible
If you're eligible for a settlement payout from Apple, make sure you sign up by July 2. As useful as they can sometimes be, virtual assistants can often be just as annoying, especially if you've ever called one up by mistake. If you're an Apple user who's had that sort of issue with Siri in the last decade, I've got a settlement you should know about.

Apple customers may be eligible for a payout from a $96 million class-action settlement if the Siri virtual assistant was accidentally activated during a private conversation. However, if you want your payout for this privacy invasion, you'll need to sign up soon. The deadline to file a claim is now less than a week away, and after that you'll be out of luck.

Apple agreed to the settlement after being sued for allegedly allowing Siri to listen in on private conversations without consent. Now a claims website is live, and if you meet the criteria, you could get a piece of the payout. Whether you're a longtime iPhone user or just want to see if you're eligible, here's everything you need to know before the window closes.

The settlement period covers a full decade, and given the ubiquity of Apple products, there's a good chance you'll be eligible for a piece of the payout. If you meet the eligibility standards, you can claim a payment for up to five Siri-enabled devices, with a cap on how much you can receive per device. We'll get into the specific amount a little bit later.

The impact of this settlement has the potential to be wide-ranging, given the reach of Apple's product ecosystem. According to a Business of Apps report from November, citing company and market research data, there were roughly 155 million active iPhones in the US as of 2024, a number that has been steadily increasing since the product's debut. Similarly, active Apple TV streaming boxes in the US have also been increasing year to year, with more than 32 million active as of 2023. To find out if you're eligible for this settlement, read on. For more, find out what's up with the recent delay of T-Mobile data breach settlement checks.

Who sued Apple and why?
This class-action lawsuit, Lopez et al v. Apple Inc., was first brought against Apple in 2019, with plaintiffs alleging that they were routinely recorded by their Apple devices after unintentionally activating the Siri virtual assistant, violating their privacy in the process. They further alleged that these recordings were then sold to advertisers and used to target them with ads online. Specific incidents mentioned in the suit include plaintiffs seeing ads online for brands like Air Jordan and Olive Garden after Apple device users discussed them out loud. In some instances, plaintiffs claimed that their devices began listening to them without them having said anything at all. At least one plaintiff involved in the case was a minor when it was first filed.

Though it agreed to the settlement, Apple hasn't admitted any wrongdoing. "Siri has been engineered to protect user privacy from the beginning," Apple said in a statement sent to CNET. "Siri data has never been used to build marketing profiles and it has never been sold to anyone for any purpose. Apple settled this case to avoid additional litigation so we can move forward from concerns about third-party grading that we already addressed in 2019. We use Siri data to improve Siri and we are constantly developing technologies to make Siri even more private."

Who is eligible for this class-action settlement?
The eligibility requirements for this settlement are fairly broad, as it's open to anyone who owned a Siri-enabled Apple device between Sept. 17, 2014, and Dec. 31, 2024. In order to opt in, you'll have to swear under oath that at some point during that period, you accidentally activated Siri on each device you want to get a payment for, and that these activations occurred during a conversation meant to be private. Siri-enabled devices include iPhones, iPads, Apple Watches, MacBooks, iMacs, Apple TV streaming boxes, HomePod speakers and iPod Touches.

How can I opt in to this Apple settlement?
As of Thursday, May 8, a website has been launched where Apple customers can claim a portion of the settlement if they believe they qualify. If you're looking to submit a claim, you have until July 2, 2025, to do so. It's not clear at this time when payments will be disbursed to approved claimants, but it will surely be sometime after Aug. 1, 2025, when a final approval hearing is scheduled.

How much can I get from the class-action settlement?
Payments per device are capped at $20, although depending on how many people opt in to the settlement, claimants could receive less than that. Each individual can only claim payments for up to five devices, meaning the maximum possible payment you could receive from the settlement is $100. For more on Apple, see why a majority of users don't care for Apple Intelligence and find out which iOS setting can stop apps from tracking you.


Tom's Guide
3 hours ago
I tried new AirPods features with the iOS 26 beta — and Apple missed an opportunity to add this killer feature
For the past couple of weeks now, I've been trying out as many new features as I can find in the iOS 26 developer beta. For example, I've already explained the first thing I did after checking out the software's new Liquid Glass design. Soon after that, I checked out the new CarPlay experience to see what's new and different with Apple's car infotainment system.

The next thing on my list was to try out the new AirPods features included with the iOS 26 update. Considering that Apple's AirPods are some of the best wireless earbuds on the market, it comes as no surprise that the company would look to add capabilities that make them even more helpful across different situations. I got to test out some of these new features with the iOS 26 beta release, but I'm still confused by one thing that Apple hasn't added: the ability to use the AirPods as a wireless microphone system. Honestly, I'm baffled, because it seems like a no-brainer addition given the other things Apple has introduced with the beta software.

For example, you can now use AirPods to control how you take photos and videos on your iPhone by simply squeezing their stems. I tested out this exact feature on my iPhone 16 Pro Max with firmware version 8A293c on my AirPods Pro 2. While recording video, I can use these controls to start and stop recordings remotely. When you shoot as many videos as I do, this feature is helpful because I can put my iPhone on a tripod and then shoot a video of myself at a distance using its rear cameras as opposed to the front-facing one. I hear the usual start-recording tone on my AirPods that indicates a recording has started, then the stop-recording tone when I'm done.

Again, this is a handy feature, but Apple is dropping the ball by not turning the AirPods into a wireless microphone that could deliver better audio quality because it's on me, and therefore better at picking up my voice. When I start a video recording using these new gestures, the audio is still captured through my iPhone, so it sounds distant because I'm farther away. I don't understand why Apple doesn't make this addition, because it would only make the AirPods better for shooters and creators.

With the iOS 26 beta software, Apple says that the AirPods are getting 'studio-quality' audio recording. I can definitely tell that the firmware update enhances my audio quality because I tried it out during a phone call to a friend while outdoors in the city. They couldn't hear the background noise around me, like the loud honking from cabs and other traffic disturbances. Knowing that the microphones are delivering better audio results, you would think this could be applied to video recording as well, but it isn't, and I think it's a missed opportunity.

I'm hopeful the AirPods gain this feature in subsequent iOS 26 updates, because it could replace the DJI Mic system I currently use. Apple could certainly make a big impact if the AirPods could effectively act as wireless microphones when recording video with an iPhone. In fact, it would be even better if Apple came up with a lanyard attachment that could clip an AirPod to your shirt. It could also improve audio for interviews by splitting the recording into two separate tracks, since each AirPod could serve as a separate audio source. Given the popularity of the AirPods for music listening, they could also become game-changing for video recording.
What's interesting is that you can technically use AirPods as a wireless mic with an iPhone already, but only through third-party camera apps like FiLMiC Pro and BlackMagic Camera. At the same time, I suspect these camera apps wouldn't have access to the same 'studio-quality' audio recording. Hopefully Apple sees this big opportunity and adds it in time for iOS 26's final release, presumably this fall alongside the iPhone 17 launch.
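For context on how those third-party apps pull this off today: iOS already lets an app route its audio capture through a connected pair of AirPods via AVAudioSession. The sketch below is a generic illustration of that routing, not how FiLMiC Pro or BlackMagic Camera actually implement it, and the Bluetooth hands-free (HFP) route it relies on has historically limited microphone quality well below the 'studio-quality' recording Apple is now promising.

```swift
import AVFoundation

// Minimal sketch: ask iOS to capture audio from a connected pair of AirPods
// instead of the iPhone's built-in microphones while recording video.
func preferAirPodsMicrophone() throws {
    let session = AVAudioSession.sharedInstance()

    // .allowBluetooth permits Bluetooth hands-free devices (such as AirPods)
    // to be used as an input route alongside playback.
    try session.setCategory(.playAndRecord,
                            mode: .videoRecording,
                            options: [.allowBluetooth])
    try session.setActive(true)

    // If the AirPods' HFP input is available, make it the preferred input.
    if let airPodsInput = session.availableInputs?
        .first(where: { $0.portType == .bluetoothHFP }) {
        try session.setPreferredInput(airPodsInput)
    }
}
```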


Business Upturn
6 hours ago
Sage Healthspan Launches AI-Powered, Privacy-First Wellness App to Bridge Critical Gaps in Preventative Healthcare
Orange County, California, June 27, 2025 (GLOBE NEWSWIRE) — Sage Healthspan, a California-based digital health company, announced its precision health app, now available for free on Apple's App Store. Designed to address notable shortcomings in modern healthcare delivery, Sage introduces a privacy-first, on-device artificial intelligence (AI) platform aimed at helping users interpret their blood work, monitor health trends, and optimize wellness without sacrificing data security.

A Glimpse into Sage Healthspan Analytic Features
In an era where preventative healthcare is increasingly prioritized, many individuals still lack access to timely, comprehensible, and actionable insights from their routine lab results. Sage Healthspan identified a persistent issue: although patients regularly undergo blood tests, most are only contacted if major abnormalities are flagged. This approach overlooks nuanced health patterns and early indicators of disease, leading to missed opportunities for early intervention. 'Instead of waiting for symptoms to appear, Sage empowers users to engage proactively with their health data,' said Megan Haas, Media and Communications Lead at Sage Healthspan. 'The AI platform helps transform routine lab work into structured insights, encouraging users to take control of their health trajectory in a secure, comprehensible, and private way.'

Closing the Loop Between Data and Action
Sage allows users to upload or photograph existing lab results directly from their Apple devices. The platform then interprets the data using local AI algorithms, without uploading personal information to the cloud, and provides summaries, visualizations, and tailored recommendations. These include suggestions for lifestyle adjustments, supplement considerations, or follow-up testing to support long-term wellness goals. Importantly, Sage differentiates itself by emphasizing a 'privacy-first' model: all health data remains on the user's device, enabling secure, HIPAA-compliant analysis. In a climate where 78% of healthcare executives name cybersecurity as a top priority, Sage's on-device processing offers a practical alternative to cloud-based health apps.

A Comprehensive Health Insight Engine
Sage's AI generates insights across an extensive array of health and biomarker categories, including cardiometabolic health, inflammation, blood sugar, autoimmunity, nutrient status, thyroid health, kidney and liver function, and more. As users contribute additional data over time, Sage builds a personalized timeline, allowing for trend detection and wellness optimization rooted in longitudinal analysis. A unique feature of Sage's platform is its lab test ordering capability. Users can independently request advanced biomarkers that are often excluded from standard panels. These include cardiovascular indicators such as ApoB and Lp(a), fasting insulin and HOMA-IR for metabolic health, thyroid and sex hormone assessments, and nutritional markers like vitamin D and omega-3 fatty acid levels.

The Growing Importance of Intelligent Health Analytics
Recent studies show that 67% of patients report confusion over their lab results, with 61% struggling to understand medical terminology. Additionally, with primary care visits averaging only 15-18 minutes, providers have limited time to address complex, individualized wellness concerns. Sage Healthspan aims to complement, not replace, medical professionals by offering supportive tools that clarify and contextualize lab data for users.
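Sage hasn't published implementation details, but the "photograph your lab results, process them locally" flow the release describes is something Apple's on-device frameworks already support in principle. As a rough, hypothetical illustration (not Sage's actual code), an iOS app could extract the text of a photographed lab report entirely on-device with the Vision framework, with no network call involved:

```swift
import UIKit
import Vision

// Hypothetical sketch: read the text of a photographed lab report entirely
// on-device. No data leaves the phone at any point in this flow.
func extractLabReportText(from image: UIImage,
                          completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the highest-confidence transcription of each detected line.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```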
From a broader healthcare systems perspective, early detection is increasingly recognized as a critical cost-saver. Nearly 90% of the United States' annual $4.5 trillion healthcare expenditure is tied to chronic conditions. Sage's focus on biomarker-based early detection aligns with evolving industry goals: identifying risk before symptoms arise and enabling targeted, timely action.

Positioning in the Era of Medicine 3.0
The emergence of AI in health represents a shift toward what experts call 'Medicine 3.0,' a paradigm that emphasizes prevention, personalization, and patient empowerment. Sage Healthspan embodies this transition. By integrating intelligent health analytics into everyday devices and eliminating the need for cloud computing, the company delivers accessible wellness insights while maintaining robust privacy protections. Sage Healthspan is currently available exclusively for iOS on Apple's App Store. Users are encouraged to begin by uploading pre-existing lab results to generate immediate health insights. For more information or to learn how Sage can support a wellness journey, visit the Sage Healthspan website.

About Sage Healthspan
Sage Healthspan is a health technology company based in California focused on closing critical gaps in preventative healthcare. Through its privacy-first AI app, Sage empowers users to understand and act on their blood work, providing structured insights across a range of biomarkers to support health optimization and early detection. Sage's core mission is to make personalized wellness both accessible and secure.

Disclaimer: The above press release comes to you under an arrangement with GlobeNewswire. Business Upturn takes no editorial responsibility for the same.