
iOS 26 beta 2 is live — here are the biggest changes for your iPhone
Apple has released the second developer beta for its iOS 26 update, following the original release that arrived during WWDC 2025 earlier this month.
Apple's release notes for the second developer beta list several fixes for known issues found in the first beta, as well as a few new additions that aim to improve the user experience. Even though this is an updated release, we're still very early in the beta process. As such, many of the features and additions could look different when the update officially launches, which Apple says will happen in the fall.
Still, many of these changes give us a good idea of what we'll find when the public beta arrives, which is set to happen at some point in July.
Some of the biggest changes appear to be focused on Apple's new Liquid Glass design, which makes many of the menus and drop-downs appear more transparent. There have been some complaints that certain menus are much harder to read as a result of Apple's new design.
The new update looks to counter this in two ways. First, Apple has reduced the amount of blur in the background of the iPhone Control Center, resulting in better contrast between on-screen elements. Second, the update allows the Reduce Transparency option in the accessibility menu to reduce the effect even more than before.
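For context, Reduce Transparency is the same long-standing system setting that third-party apps can read and respond to. The sketch below is a minimal, generic illustration (not anything specific to the iOS 26 beta) of how an app might swap a blurred background for a solid one when the setting is enabled, using UIKit's UIAccessibility API; the GlassBackgroundView name is made up for the example.

```swift
import UIKit

// Hypothetical example view; shows one common way to respect Reduce Transparency.
final class GlassBackgroundView: UIView {
    private let blurView = UIVisualEffectView(effect: UIBlurEffect(style: .systemThinMaterial))

    override init(frame: CGRect) {
        super.init(frame: frame)
        blurView.frame = bounds
        blurView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        addSubview(blurView)
        applyTransparencyPreference()

        // Re-check whenever the user toggles Reduce Transparency in Settings.
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(applyTransparencyPreference),
            name: UIAccessibility.reduceTransparencyStatusDidChangeNotification,
            object: nil
        )
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func applyTransparencyPreference() {
        // Fall back to an opaque background when Reduce Transparency is on.
        let reduce = UIAccessibility.isReduceTransparencyEnabled
        blurView.isHidden = reduce
        backgroundColor = reduce ? .secondarySystemBackground : .clear
    }
}
```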
One of the changes spotted in the first iOS 26 beta was that Apple had moved the '+' icon that opens a new tab in the Safari app to the upper left of the screen.
However, in the most recent developer beta, Apple has returned the button to the bottom left corner as had been the case in iOS 18.
The second iOS 26 beta also makes it much easier to find out which accessibility options an app supports before downloading it from the App Store. While looking at an app's product page, users can access a new Accessibility section that developers will be required to fill out with all the features their products support.
The addition to the App Store is among the many accessibility features Apple promised to bring to the iPhone and other devices back in May.
Apple is also introducing full support for Apple Wallet's order tracking feature, which allows Siri to scan your Mail app for orders and emails from merchants, even if the products and services weren't paid for with Apple Pay.
Apple's second developer beta also includes several smaller additions that look to improve how certain apps and features work. For instance, the beta introduces a new Live Radio widget for Apple Music, and there is a new 'Alt 1' option for the "Reflection" ringtone. Apple is also improving the description for Low Power Mode to better explain what it does.
Apple has also renamed the Transcribe Calls feature in the Live Captions settings to Save Call Transcripts, and the feature's description now states more clearly that call participants will be alerted when a call is being transcribed.
A feature introduced in the first developer beta is the new Recovery Assistant, which appears to allow users to find and solve issues that stop their devices from booting properly. Now, in the second beta, Apple has officially mentioned the existence of this tool. Currently, restoring an iPhone or iPad requires a connection to a separate device. This can be an issue for people who don't have easy access to one, likely forcing them to head to an Apple repair center to sort things out.
While Apple aims the developer beta at people who design software for the iPhone, anyone with an Apple ID can download the iOS 26 developer beta if they wish. Still, given that this is unfinished software, you may be better off waiting until the more stable public beta arrives in July.
If you do want to try out the beta now, we'd recommend putting it on a device that you don't rely on for your everyday use. Any iPhone released since the iPhone 11 in 2019 will support the iOS 26 developer beta, though some new features require a phone that supports Apple Intelligence.

Related Articles


CNET
23 minutes ago
Apple's $96 Million Siri Settlement Closes In Days. Chances Are Good You Could Be Eligible
If you're eligible for a settlement payout from Apple, make sure you sign up by July 2.
As useful as they can sometimes be, virtual assistants can often be just as annoying, especially if you've ever called one up by mistake. If you're an Apple user who's had that sort of issue with Siri in the last decade, I've got a settlement you should know about.
Apple customers may be eligible for a payout from a $96 million class-action settlement if the Siri virtual assistant was accidentally activated during a private conversation. However, if you want your payout for this privacy invasion, you'll need to make sure you sign up soon. The deadline to file a claim is now less than a week away, and after that you'll be out of luck.
Apple agreed to the settlement after being sued for allegedly allowing Siri to listen in on private conversations without consent. Now, a claims website is live, and if you meet the criteria, you could get a piece of the payout. Whether you're a longtime iPhone user or just want to see if you're eligible, here's everything you need to know before the window closes.
The settlement period covers a full decade, and given the ubiquity of Apple products, there's a good chance you'll be eligible for a piece of the payout. If you meet the eligibility standards, you can claim a payment for up to five Siri-enabled devices, with a cap on how much you can receive per device. We'll get into the specific amount a little bit later.
The impact of this settlement has the potential to be wide-ranging, given the reach of Apple's product ecosystem. According to a Business of Apps report from November, citing company and market research data, there were roughly 155 million active iPhones in the US as of 2024, a number that's been steadily increasing since the product's debut. Similarly, active Apple TV streaming boxes in the US have also been increasing year to year, with more than 32 million active in the US as of 2023.
To find out if you're eligible for this settlement, read on. For more, find out what's up with the recent delay of T-Mobile data breach settlement checks.
Who sued Apple and why?
This class-action lawsuit, Lopez et al v. Apple Inc., was first brought against Apple in 2019, with plaintiffs alleging that they were routinely recorded by their Apple devices after unintentionally activating the Siri virtual assistant, violating their privacy in the process. They further alleged that these recordings were then sold to advertisers and used to target them with ads online.
Specific incidents mentioned in the suit include plaintiffs seeing ads online for brands like Air Jordan and Olive Garden after Apple device users discussed them out loud. In some instances, plaintiffs claimed that their devices began listening to them without them having said anything at all. At least one plaintiff involved in the case was a minor when it was first filed.
Though it agreed to the settlement, Apple hasn't admitted any wrongdoing. "Siri has been engineered to protect user privacy from the beginning," Apple said in a statement sent to CNET. "Siri data has never been used to build marketing profiles and it has never been sold to anyone for any purpose. Apple settled this case to avoid additional litigation so we can move forward from concerns about third-party grading that we already addressed in 2019. We use Siri data to improve Siri and we are constantly developing technologies to make Siri even more private."
Who is eligible for this class-action settlement?
The eligibility requirements for this settlement are fairly broad, as it's open to anyone who owned a Siri-enabled Apple device between Sept. 17, 2014, and Dec. 31, 2024. In order to opt in, you'll have to swear under oath that at some point during that period, you accidentally activated Siri on each device you want to get a payment for, and that these activations occurred during a conversation meant to be private. Siri-enabled devices include iPhones, iPads, Apple Watches, MacBooks, iMacs, Apple TV streaming boxes, HomePod speakers and iPod Touches.
How can I opt in to this Apple settlement?
As of Thursday, May 8, a website has been launched where Apple customers can claim a portion of the settlement if they believe they qualify. If you're looking to submit a claim, you have until July 2, 2025, to do so. It's not clear at this time when payments will be disbursed to approved claimants, but it will surely be sometime after Aug. 1, 2025, when a final approval hearing is scheduled.
How much can I get from the class-action settlement?
Payments per device are to be capped at $20, although depending on how many people opt in to the settlement, claimants could receive less than that. Each individual can only claim payments for up to five devices, meaning the maximum possible payment you could receive from the settlement is $100.
For more on Apple, see why a majority of users don't care for Apple Intelligence and find out which iOS setting can stop apps from tracking you.


Tom's Guide
3 hours ago
I tried new AirPods features with the iOS 26 beta — and Apple missed an opportunity to add this killer feature
For the past couple of weeks now, I've been trying out as many new features as I can find with the iOS 26 developer beta. For example, I've already explained the first thing I did after checking out the software's new Liquid Glass design. Soon after that, I checked out the new CarPlay experience to see what's new and different with Apple's car infotainment system. The next thing on my list was to try out the new AirPods features included with the iOS 26 update.
Considering that Apple's AirPods are some of the best wireless earbuds on the market, it comes as no surprise that the company would look to add capabilities that make them even more helpful across different situations. I got to test some of these new features with the iOS 26 beta release, but I'm still confused by one thing that Apple hasn't added — the ability to use the AirPods as a wireless microphone system. Honestly, I'm baffled, because it seems like a no-brainer addition given the other things Apple has introduced with the beta software.
For example, you can now use AirPods to control how you take photos and videos on your iPhone by simply squeezing their stems. I tested this exact feature on my iPhone 16 Pro Max with firmware version 8A293c on my AirPods Pro 2. While recording video, I can use these controls to start and stop recordings remotely. When you shoot as many videos as I do, this feature is helpful because I can put my iPhone on a tripod and shoot a video of myself at a distance using its rear cameras instead of the front-facing one. I hear the usual start recording tone on my AirPods that indicates a recording has started, then the stop recording tone when I'm done.
Again, this is a handy feature, but Apple is dropping the ball by not turning the AirPods into a wireless microphone that could be used for better audio quality, because it's on me — and therefore better at picking up my voice. When I start a video recording using these new gestures, the audio is still recorded through my iPhone, so it sounds distant because I'm farther away. I don't understand why Apple doesn't make this addition, because it would only make the AirPods better for shooters and creators.
With the iOS 26 beta software, Apple says that the AirPods are getting 'studio-quality' audio recording. I can definitely tell that the firmware update enhances my audio quality, because I tried this out on a phone call to a friend while outdoors in the city. They couldn't hear the background noise around me, like the loud honking from cabs and other traffic disturbances.
Knowing that the microphones are delivering better audio results, you would think this could be applied to video recording as well — but it isn't, and I think it's a missed opportunity. I'm hopeful the AirPods gain this feature in subsequent iOS 26 updates, just because it could replace the existing DJI Mic system that I use. Apple could have a big impact if the AirPods could effectively act as wireless microphones when recording video with an iPhone. In fact, it would be even better if Apple came up with a lanyard attachment that could clip an AirPod to your shirt. It could also improve audio for interviews by splitting the recording into two separate tracks, since the two AirPods would serve as separate audio sources. Given the popularity of the AirPods for music listening, they could also become game-changing for video recording.
What's interesting is that you can technically use AirPods as a wireless mic with an iPhone, but only through third-party camera apps like FiLMiC Pro and Blackmagic Camera. At the same time, I suspect that these camera apps wouldn't have access to the same 'studio-quality' audio recording. Hopefully Apple sees this big opportunity and takes the time to add it for iOS 26's final release, presumably this fall alongside the iPhone 17 launch.
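For readers curious how those third-party apps pull this off today, the usual route is AVAudioSession: an app can opt in to Bluetooth hands-free input so the AirPods' microphones feed the recording. The snippet below is a minimal, generic sketch of that setup, not FiLMiC Pro's or Blackmagic's actual code, and it helps explain the quality concern: the hands-free profile has historically limited microphone fidelity, so the new 'studio-quality' pipeline may not carry over to apps using this path.

```swift
import AVFoundation

/// Configure the shared audio session so a connected Bluetooth headset
/// (such as AirPods) can be used as the recording input.
func enableBluetoothMicForRecording() throws {
    let session = AVAudioSession.sharedInstance()

    // .allowBluetooth opts in to the hands-free profile, which is what
    // exposes the AirPods microphone as an input route.
    try session.setCategory(.playAndRecord,
                            mode: .videoRecording,
                            options: [.allowBluetooth])
    try session.setActive(true)

    // Prefer the Bluetooth HFP input if one is currently available.
    if let bluetoothInput = session.availableInputs?.first(where: { $0.portType == .bluetoothHFP }) {
        try session.setPreferredInput(bluetoothInput)
    }
}
```

A camera app would typically call something like this before starting its capture session, then add the default audio device as an input; whether Apple exposes a higher-quality AirPods recording route to third parties in iOS 26 isn't confirmed by the beta notes.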


Business Upturn
6 hours ago
Sage Healthspan Launches AI-Powered, Privacy-First Wellness App to Bridge Critical Gaps in Preventative Healthcare
Orange County, California, June 27, 2025 (GLOBE NEWSWIRE) — Sage Healthspan, a California-based digital health company, announced its precision health app, now available for free on Apple's App Store. Designed to address notable shortcomings in modern healthcare delivery, Sage introduces a privacy-first, on-device artificial intelligence (AI) platform aimed at helping users interpret their blood work, monitor health trends, and optimize wellness without sacrificing data security.
A Glimpse into Sage Healthspan Analytic Features
In an era where preventative healthcare is increasingly prioritized, many individuals still lack access to timely, comprehensible, and actionable insights from their routine lab results. Sage Healthspan identified a persistent issue: although patients regularly undergo blood tests, most are only contacted if major abnormalities are flagged. This approach overlooks nuanced health patterns and early indicators of disease, leading to missed opportunities for early intervention.
'Instead of waiting for symptoms to appear, Sage empowers users to engage proactively with their health data,' said Megan Haas, Media and Communications Lead at Sage Healthspan. 'The AI platform helps transform routine lab work into structured insights, encouraging users to take control of their health trajectory in a secure, comprehensible, and private way.'
Closing the Loop Between Data and Action
Sage allows users to upload or photograph existing lab results directly from their Apple devices. The platform then interprets the data using local AI algorithms, never uploading personal information to the cloud, and provides summaries, visualizations, and tailored recommendations. This includes suggestions for lifestyle adjustments, supplement considerations, or follow-up testing to support long-term wellness goals.
Importantly, Sage differentiates itself by emphasizing a 'privacy-first' model. All health data remains on the user's device, enabling secure, HIPAA-compliant analysis. In a climate where 78% of healthcare executives name cybersecurity as a top priority, Sage's on-device processing offers a practical alternative to cloud-based health apps.
A Comprehensive Health Insight Engine
Sage's AI generates insights across an extensive array of health and biomarker categories, including cardiometabolic health, inflammation, blood sugar, autoimmunity, nutrient status, thyroid health, kidney and liver function, and more. As users contribute additional data over time, Sage builds a personalized timeline, allowing for trend detection and wellness optimization rooted in longitudinal analysis.
A unique feature of Sage's platform is its lab test ordering capability. Users can independently request advanced biomarkers that are often excluded from standard panels. These include cardiovascular indicators such as ApoB and Lp(a), fasting insulin and HOMA-IR for metabolic health, thyroid and sex hormone assessments, and nutritional markers like vitamin D and omega-3 fatty acid levels.
The Growing Importance of Intelligent Health Analytics
Recent studies show that 67% of patients report confusion over their lab results, with 61% struggling to understand medical terminology. Additionally, with primary care visits averaging only 15-18 minutes, providers have limited time to address complex, individualized wellness concerns. Sage Healthspan aims to complement, not replace, medical professionals by offering supportive tools that clarify and contextualize lab data for users.
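The release doesn't describe Sage's implementation, but the 'privacy-first, on-device' pattern it refers to is simple to sketch: lab values are parsed and evaluated locally, so nothing leaves the device. The Swift sketch below is a purely hypothetical illustration of that pattern; the BiomarkerResult and Insight types and the reference ranges are invented for the example and are not Sage's actual logic or data.

```swift
import Foundation

// Hypothetical types for illustration only; not Sage Healthspan's actual model.
struct BiomarkerResult {
    let name: String
    let value: Double
}

struct Insight {
    let biomarker: String
    let message: String
}

/// Evaluate lab values entirely on device against simple reference ranges,
/// so no health data needs to be sent to a server.
func analyzeLocally(_ results: [BiomarkerResult]) -> [Insight] {
    // Example reference ranges, invented for the sketch.
    let ranges: [String: ClosedRange<Double>] = [
        "Vitamin D (ng/mL)": 30...100,
        "Fasting insulin (µIU/mL)": 2...20
    ]

    return results.compactMap { result in
        guard let range = ranges[result.name] else { return nil }
        let message = range.contains(result.value)
            ? "Within the example reference range."
            : "Outside the example reference range; worth discussing with a clinician."
        return Insight(biomarker: result.name, message: message)
    }
}
```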
From a broader healthcare systems perspective, early detection is increasingly recognized as a critical cost-saver. Nearly 90% of the United States' annual $4.5 trillion healthcare expenditure is tied to chronic conditions. Sage's focus on biomarker-based early detection aligns with evolving industry goals: identifying risk before symptoms arise and enabling targeted, timely action.
Positioning in the Era of Medicine 3.0
The emergence of AI in health represents a shift toward what experts call 'Medicine 3.0,' a paradigm that emphasizes prevention, personalization, and patient empowerment. Sage Healthspan embodies this transition. By integrating intelligent health analytics into everyday devices and eliminating the need for cloud computing, the company delivers accessible wellness insights while maintaining robust privacy protections.
Sage Healthspan is currently available exclusively for iOS on Apple's App Store. Users are encouraged to begin by uploading pre-existing lab results to generate immediate health insights. For more information or to learn how Sage can support a wellness journey, visit the Sage Healthspan website.
About Sage Healthspan
Sage Healthspan is a health technology company based in California focused on closing critical gaps in preventative healthcare. Through its privacy-first AI app, Sage empowers users to understand and act on their blood work, providing structured insights across a range of biomarkers to support health optimization and early detection. Sage's core mission is to make personalized wellness both accessible and secure.
Disclaimer: The above press release comes to you under an arrangement with GlobeNewswire. Business Upturn takes no editorial responsibility for the same.