
Latest news with #WWDC

What You Should Know About T9 Dialing on Your iPhone Before iOS 26

CNET

14 hours ago



Apple announced on June 9 at its Worldwide Developers Conference that the next version of the iPhone's operating system will be called iOS 26. The tech giant said the update will bring a transparent glass design to icons and menus, and much more, to your iPhone this fall. But when Apple released iOS 18 in 2024, it brought T9 dialing to all iPhones. With T9 dialing, you can call someone just by typing their name into your iPhone's Phone app. That means with iOS 18, you don't have to search for a person's name in your contacts, or use Messages, to call them.

Read more: Everything Apple Announced at WWDC

Here's everything you need to know about T9 dialing and how to use it on your iPhone.

What exactly is T9 dialing?

T9 stands for "text on nine keys." Before cellphones had full keyboards, many phones had 12 keys: the numbers zero through nine, the star and the pound sign, or hashtag. I felt old just writing that. To text someone, you mostly used the nine numbered keys. Each numbered key had three or four corresponding letters attached to it, with the zero key serving as the spacebar and the one key left blank. If you go into your Phone app and tap Keypad at the bottom of your screen, you can see the corresponding numbers and letters there.

At first, if you wanted to type "hello," you had to press 44(H)-33(E)-555(L)-555(L)-666(O). That's a lot of typing for one fairly short word. T9 texting was introduced as an early form of predictive typing. It lets you press fewer keys and send messages faster, so for "hello" you'd type 4-3-5-5-6. Much easier.

How to use T9 dialing on iPhone

With iOS 18, you can now use T9 dialing to make calls. Here's how.

1. Open the Phone app on your iPhone.
2. Tap Keypad at the bottom of the screen.

From here, start typing the name of the person you want to call using the principles of T9.
So if you want to call Dad, you'd type 3-2-3 into your keypad; for Mom, you'd type 6-6-6 (nothing to be freaked out about, promise). Their name should appear across the top of the screen. As you type, a backspace button will appear beneath the pound sign/hashtag in case you make a mistake; press it as many times as needed. Note that there is no space button. Once you find the person, tap their name and their phone number will automatically be entered. Then hit the green call button, and you're set.

If there is more than one person in your Contacts app with a given name, as you type you'll see an option beneath the top result that says something like "3 more…" Tap that to open a new menu with all the corresponding contacts. Then tap the call button next to the correct contact, and your call will start immediately. Be prepared to explain why you're calling the other person instead of just texting them.

For more on iOS 18, here's what you need to know about iOS 18.5 and iOS 18.4, as well as our iOS 18 cheat sheet. You can also check out what you should know about iOS 26.
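The letter-to-digit mapping described above is simple enough to sketch in a few lines of Python. This is purely a toy illustration of how a name becomes a T9 digit sequence; the function name `t9_digits` is our own, not anything from iOS.

```python
# T9 keypad: each digit 2-9 carries three or four letters.
T9_KEYS = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}

# Invert the table so we can look up the digit for any letter.
LETTER_TO_DIGIT = {
    letter: digit
    for digit, letters in T9_KEYS.items()
    for letter in letters
}

def t9_digits(name: str) -> str:
    """Return the T9 digit sequence for a name, ignoring non-letters."""
    return "".join(
        LETTER_TO_DIGIT[ch] for ch in name.lower() if ch in LETTER_TO_DIGIT
    )

print(t9_digits("dad"))    # 323, matching the example above
print(t9_digits("mom"))    # 666
print(t9_digits("hello"))  # 43556
```

The iPhone's dialer does the reverse lookup: it takes the digits you type and matches them against the T9 sequences of every name in your contacts.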

How Steve Jobs would have reacted to Apple's WWDC liquid glass redesign

The Star

a day ago

  • Entertainment


There are two Steve Jobs keynotes that stand out in my memory more than any other. The first, of course, is the moment he introduced the iPhone in 2007. The entire keynote was a master class in storytelling, engineering and showmanship. But my favourite part was when Jobs, in the middle of a live demo, prank-called a Starbucks and calmly said he wanted to order 4,000 lattes to go. Then he quickly added, 'Just kidding,' and hung up. It was a small thing, but it was unforgettable. It was unexpected. It was … fun.

But there's another moment that sticks with me. It's less iconic, but only because on the scale of the iPhone, everything is less iconic. It was, however, just as telling about how Jobs thought about products and how to talk about them. It was 2000, when Jobs introduced Mac OS X's Aqua interface. The new design was fluid, full of gradients and transparency. It was colourful and reflective – almost glossy. It looked unlike anything else at the time. And when Jobs talked about it, he said something that defined Apple's relationship with design for the next two decades: 'One of the design goals was that when you saw it, you wanted to lick it.' Then he paused and licked his lips.

I often think about the fact that the goal of designing a piece of software that millions of people would use was as much about how it made people feel as it was about being useful. Obviously, it had to be useful, but it also had to be fun. It had to be delightful.

This brings me to this year's WWDC. Apple announced a major redesign of all its software platforms with what it's calling 'Liquid Glass.' According to Alan Dye, Apple's VP of human interface design, the goal was to give the system 'depth, vibrancy, and a new level of expression.' It's a very different look, especially on the iPhone – but there are real changes on the Mac as well.

But the thing I keep thinking about is: Where's the fun? The keynote was impressive. It was polished. It was efficient. But it didn't quite feel joyful. It didn't feel like Apple was showing off something it loved. It felt like Apple was explaining something it had to get right. Dye used a lot of words to explain how the company studied the properties of glass and how it reflects and refracts light. The thing is, I think it would have been fine if he'd just said Apple thinks it's really cool.

I've heard and read critics saying that Jobs would roll over in his grave if he saw the new interface design. That's the kind of thing that's easy to say for views, but I don't think it's true at all. First, the new design is still an early beta. Yes, there are things that don't work from a design perspective – but it's far too early to pass judgment. I have confidence that Apple will fix them as it gets closer to September, when it ships the update to the public.

My point isn't that Jobs wouldn't have liked what Apple is doing with Liquid Glass. My point is that he would have had a lot more fun with it than the company seems to be having. Perhaps it's harder now than in 2000. Perhaps that's because Apple is under intense pressure, now more than ever. It's been a year since Apple teased the arrival of a smarter Siri and its broader vision for AI, now branded 'Apple Intelligence.' Expectations are high, especially as it seems the competition is delivering on Apple's promises with more speed and consistency than Apple itself. But the Liquid Glass redesign – what should have been the most obviously delightful part – felt strangely sterile.

During the Aqua introduction, Jobs said that 'when you design a new user interface, you have to start off humbly. You have to start off saying, "What are the simplest elements in it? What does a button look like?" And you spend months working on a button.' The implication was that even something as small as a button can carry emotion, weight and personality.

I miss the company that wasn't afraid to get weird. To call a design 'lickable.' To order 4,000 lattes from the stage. I'm not saying Apple needs to recreate Steve Jobs's persona. That would be impossible – and probably a bad idea. But I do think it needs to rediscover a little of that energy. That sense of play. That design isn't just functional, or even beautiful. It's emotional. It's fun.

A user interface doesn't have to be revolutionary to be memorable. It just has to make you feel something. Better yet if that feeling is: 'I kinda want to lick this.' – Inc./Tribune News Service

It's time to stop talking about AirPods with cameras… for now

Digital Trends

a day ago



A lingering question remains for me around the likely forthcoming AirPods Pro 3, and it's not about when they'll launch. It's still unclear whether we'll see new AirPods Pro alongside the iPhone 17 launch later this year, but if they do appear, there still seems to be debate over whether they'll have cameras embedded.

The general consensus is that if they do arrive in September, they'll be similar in shape to the current AirPods Pro 2, but with longer battery life, better sonic performance and enhanced health sensors. However, a new feature in iOS 26, spotted by 9to5Mac, has suggested that there will be some camera smarts in the new earbuds. The site points out: 'in iOS 26's Settings app, AirPods' "Remote Control" option is included in a new AirPods menu titled "Camera Control." There are currently no other settings in that menu, making the heading name seem more noteworthy.'

I agree that it's odd to have a menu with only one option inside, but to me that suggests more controls are coming for the AirPods, not that they're going to pack camera technology in the future. Apple announced at WWDC that iOS 26 will let users start recording video from their AirPods, so that makes sense. Still, the simple fact that this was even questioned highlights that some people believe Apple could launch headphones with camera technology inside this year – perhaps because the only 'facts' we have to go on are predictions, albeit from noted journalists and analysts. But there are plenty of other reasons why camera-enabled AirPods are unlikely this year, placing the potential launch date of such a game-changing new feature firmly in next year at the earliest.

So no new camera-packing AirPods?

The first thing to clear up about the rumors: if Apple does put cameras into its AirPods, it's hugely unlikely to be the same kind of lens found in a smartphone or even on smart glasses. The reason is twofold. First, there are serious design issues. Meta's reported 'Camerabuds', which were talked about last year, were intriguing, but the company apparently had trouble figuring out the design. Making a camera that small and packing it into a tiny earbud brings problems with power consumption, heat and shrinking the components without sending the cost skyrocketing. Oh, and of course: what about when hair gets in the way?

Then there are the privacy issues. The Ray-Ban Meta smart glasses have an LED light on the front that's automatically enabled so people know when you're recording – how would something similar work on an earbud? Even if there were a light, it would be hard to see twinkling away on the side of someone's head. Apple wouldn't want to wade into a privacy war without a decent solution there.

That's why most of the rumors suggest that if Apple does add a camera to the AirPods Pro, it will focus on an infrared (IR) camera instead. This would allow the headphones to understand the world around the user, using light sensing to 'see' objects and feed more information to the phone. That would mean sound playback could be more immersive and enriched, especially when used in conjunction with the Vision Pro. Apple Intelligence would also benefit from more contextual understanding of its surroundings, again highlighting how the brand is thinking not about LLM chatbots for AI, but about using the new technology to enhance the phone's capabilities overall.

The issues mentioned above, around miniaturisation and hair getting in the way, will still be present with infrared technology, but they wouldn't be as severe, as Apple already has a lot of experience with IR emitters thanks to the Face ID camera on the front of every new iPhone. IR tech in headphones could also enable gesture controls, similar to those already used in the Vision Pro. It sounds crazy that someone would want to walk around waving their hands in the air to control their mobile device (although never forget this guy when the Vision Pro came out). But just try it now – imagine you're at a keyboard and you want to change a track. Swipe to the right with your hand and see how that feels. If changing a track were that easy, I'd be doing it all the time.

When will Apple likely deploy the tech?

When you start breaking down all the hurdles that would need to be overcome, it feels unlikely that we'd see AirPods Pro with cameras embedded this year, even if they are 'only' IR. What makes more sense is that the new AirPods Pro – whether that's Pro 3 or Pro 4 – launch alongside a refined Vision Pro headset, complementing one another and providing a great chance to upsell. Want the new Vision Pro? Enhance it with new AirPods. Enjoying the new AirPods? Try the Vision Pro – they work really well together. Given that the new headset is tipped for 2026 at the earliest, according to noted analyst Mark Gurman, it would make sense that we don't see the IR-enhanced AirPods Pro until then.

Which leads us back to this year. This is a classic case of two rumors clashing. References to new AirPods Pro 3 in the iOS 26 source code suggest they're imminent, and Gurman has already suggested we'll see new AirPods Pro with heart-rate tracking this year – a launch that would fit the previous cadence, given Apple took three years to go from the original AirPods Pro to the Pro 2 (2019 to 2022). But well-respected analyst Ming-Chi Kuo, the 'other' strong source of Apple info, suggests they'll debut in 2026 to allow for the IR tech to be included, alongside an upgraded Vision Pro (according to analyst Jeff Pu).

So when the iPhone 17 launch happens later this year, keep an eye out for new headphones. Although having just outlined all of this, it's now inevitable that we'll see someone from the AirPods team on stage, showing off the 'third generation of AirPods, now with full cameras for enhanced Apple Intelligence…'

Apple must steal these 3 tab features to make Safari truly irresistible

Digital Trends

2 days ago



When Apple took to the stage at WWDC 2025 a few weeks ago, I was expecting some key improvements to Safari. Instead, what the company served was a redesign and the promise of faster performance. Now, Safari isn't a devastating laggard. For a lot of users in the Apple ecosystem, it gets the job done. But over the past few years, rivals – both established players and upstarts – have come up with features that make Safari feel stuck in the past. When Arc came out, it reimagined what a browser can offer, despite being built atop the same engine as Chrome. Of particular interest was its intuitive tab management system.

Safari's approach to tabs, beyond syncing them across devices, remains stagnant. Over the past few months, I've tried a host of browsers, especially underrated gems like Opera and new-age AI-powered options such as Dia, which reimagine how you interact with browser tabs. Here's a list of the most innovative tab tricks that Safari should draw inspiration from.

Talk with tabs

Dia is an AI-focused browser, but it doesn't stuff AI down every user's throat. It serves a meaningful kind of tab action powered by AI. Let's say you are scrolling a webpage and select a word or passage. As soon as you do, the selected text, as well as the entire tab's content, is automatically copied to the AI assistant's chat feed. You just go ahead and type in your query to get the answer without any copy-paste chore, or even opening another tab. For example, if I merely highlight a technical term such as '10GbE Ethernet' on a webpage, all I need to type in the sidebar is 'Explain' and hit enter. The AI will crawl the web, find answers from reliable sources, and present them in a well-formatted structure. Alternatively, you can just chat about the webpage without ever highlighting a word, passage or image. It's a low-friction, high-reward tactic for getting work done without opening a dedicated tab each time you need some background info.

On a similar note, Dia lets you pull information from across multiple tabs and turn it into a report. For example, if you have opened eight shopping tabs for earbuds on Amazon, you can simply use the '@' command in the search box, type the tab name (the letters you see on the tab card) or pick the tabs from the drop-down list, and type your query, like: 'Compare these earbuds, create a table with their pros and cons, and sort them by price.' Doing so pulls information from all the tabs in the background and presents it neatly as a table detailing everything you asked for. It's like pulling intelligence from across multiple tabs, without the manual back and forth.

Tab control made easy

Opera offers one of the most forward-looking approaches to tab management. It sticks with the traditional route of handling tabs with cursor movement or keyboard shortcuts, but for users who want an extra dash of convenience, it offers a chat-like system as well. Think of it as talking to Siri or Gemini to handle your basic browser chores. Opera comes with an assistant that understands natural language commands for handling tabs. For example, you can ask it to perform chores such as 'put all my IEEE tabs in a group,' 'close all the background tabs,' 'bookmark my Reuters and MIT tabs,' and more. Just like Dia's, Opera's assistant can also handle in-tab chores, such as summarizing an article or answering queries about content on the page, even if that requires web research in the background. It works flawlessly, and as a journalist, it saves me a lot of time when doing research and keeping things in order. All of this is paired with Opera's thoughtfully designed tab island system, which is color-coded, collapsible, and supports drag-and-drop gestures. Furthermore, thanks to workspaces, all your tabs and tab groups can be neatly arranged across different browsing profiles without any overlap.

Saving and sharing tabs

There are only so many tabs you can keep active at a time before the browser starts slowing down. But more than just keeping tabs alive, one needs a system where they can be saved, like a neat digital notebook, in a presentable fashion – and in a shareable format, as well. Unfortunately, Safari doesn't deliver here. Microsoft and Opera, on the other hand, offer fantastic solutions.

In Opera, you get Pinboards. Think of it as Pinterest, but for your web browser. You can create and organize as many pinboards as you want, and save your browsing notes, complete with an active web preview, personal notes, custom images and wallpapers. Pinboards let you play with how the tab previews look – like a notebook, or a vertically scrolling, social media-inspired content feed. And when you share one, the pinboard is turned into a URL that opens in the same format as you created it. For the recipient, there is no log-in hassle or Opera requirement.

Microsoft's Edge browser offers a similar system called Collections. It also lives in the sidebar, lets you add custom notes, and lets you assign a name to each tab cluster. With a single click, you can copy all the contents of a collection to the clipboard and share it. The sharing system for Edge Collections is not as elegant as Opera's Pinboards, but it gets the job done.

I just hope Apple pays attention to these meaningful tab interactions that rivals have adopted and delivers its own take in Safari down the road. I am hopeful, but at the moment, I am sticking to the browsers that do it better.

INVESTOR DEADLINE: Robbins Geller Rudman & Dowd LLP Announces that Apple Inc. Investors with Substantial Losses Have Opportunity to Lead Class Action Lawsuit

Business Wire

2 days ago

  • Business


SAN DIEGO--(BUSINESS WIRE)--The law firm of Robbins Geller Rudman & Dowd LLP announces that investors who suffered substantial losses have the opportunity to lead the Apple class action lawsuit. Captioned Tucker v. Apple Inc., No. 25-cv-05197 (N.D. Cal.), the Apple class action lawsuit charges Apple as well as certain of Apple's top current and former executives with violations of the Securities Exchange Act of 1934. If you suffered substantial losses and wish to serve as lead plaintiff of the Apple class action lawsuit, please provide your information to the firm.

CASE ALLEGATIONS: The Apple class action lawsuit alleges that defendants throughout the Class Period made false and/or misleading statements and/or failed to disclose that: (i) Apple misstated the time it would take to integrate the advanced artificial intelligence ('AI')-based Siri features into its devices; (ii) accordingly, it was highly unlikely that these features would be available for the iPhone 16; (iii) the lack of such advanced AI-based features would hurt iPhone 16 sales; and (iv) as a result, Apple's business and/or financial prospects were overstated.

The Apple class action lawsuit further alleges that on March 7, 2025, Apple announced it was indefinitely delaying promised updates to its Siri digital assistant. The Apple class action lawsuit alleges that on this news, the price of Apple stock fell. Then, on March 12, 2025, the Apple class action lawsuit further alleges that Morgan Stanley published a report in which analyst Erik Woodring lowered his price target on Apple from $275 to $252, asserting that the delay in introducing advanced Siri features would impact iPhone upgrade cycles throughout 2025 and 2026, and presenting evidence that roughly 50% of iPhone owners who did not upgrade to the iPhone 16 attributed their decision to such delays. On this news, the price of Apple stock fell further, according to the complaint.

Thereafter, the Apple class action lawsuit alleges that on April 3, 2025, the Wall Street Journal published an article titled 'Apple and Amazon Promised Us Revolutionary AI. We're Still Waiting,' which stated, in relevant part, that '[w]ith "more personal" Siri . . . , the tech giant[] marketed features [it] ha[s] yet to deliver,' and suggested that while 'this is challenging technology and the cost of getting it wrong is devastatingly high, especially for [a] compan[y] like Apple . . . that must build trust with customers,' 'the same responsibility applies to marketing: They shouldn't announce products until they're sure they can deliver them.' On this news, the price of Apple stock fell more than 7%, according to the complaint.

Finally, on June 9, 2025, Apple hosted its Worldwide Developer Conference ('WWDC'), almost one year to the day after first announcing the suite of supposedly forthcoming Apple Intelligence features at the 2024 WWDC, and Apple failed to announce any new updates regarding advanced Siri features, according to the complaint. On this news, the price of Apple stock fell further, according to the complaint.

Last year, Robbins Geller secured a $490 million recovery in a securities fraud class action case alleging Apple CEO Timothy Cook made false and misleading statements to investors – the third-largest securities class action recovery ever in the Northern District of California and the fifth-largest such recovery ever in the Ninth Circuit. In the order granting final approval of the settlement, the court recognized the 'skill and strategic vision, as well as the risk taken by [Robbins Geller]' in securing the sizeable recovery while efficiently managing the 'uniquely complex' aspects of the case against 'highly sophisticated and experienced counsel and defendants.'

THE LEAD PLAINTIFF PROCESS: The Private Securities Litigation Reform Act of 1995 permits any investor who purchased or acquired Apple securities during the Class Period to seek appointment as lead plaintiff in the Apple class action lawsuit. A lead plaintiff is generally the movant with the greatest financial interest in the relief sought by the putative class who is also typical and adequate of the putative class. A lead plaintiff acts on behalf of all other class members in directing the Apple class action lawsuit. The lead plaintiff can select a law firm of its choice to litigate the Apple class action lawsuit. An investor's ability to share in any potential future recovery is not dependent upon serving as lead plaintiff of the Apple class action lawsuit.

ABOUT ROBBINS GELLER: Robbins Geller Rudman & Dowd LLP is one of the world's leading law firms representing investors in securities fraud and shareholder litigation. Our Firm has been ranked #1 in the ISS Securities Class Action Services rankings for four out of the last five years for securing the most monetary relief for investors. In 2024, we recovered over $2.5 billion for investors in securities-related class action cases – more than the next five law firms combined, according to ISS. With 200 lawyers in 10 offices, Robbins Geller is one of the largest plaintiffs' firms in the world, and the Firm's attorneys have obtained many of the largest securities class action recoveries in history, including the largest ever – $7.2 billion – in In re Enron Corp. Sec. Litig.
