What Is an AI PC? How AI Will Reshape Your Next Computer
AI, in one form or another, is poised to redefine just about all new tech products, but the tip of the spear is the AI PC. The simple definition of an AI PC could be "any personal computer built to support AI apps and features." But know: It's both a marketing term (Microsoft, Intel, and others toss it around freely) and a general descriptor of where PCs are going.
As AI evolves and encompasses more of the computing process, the idea of the AI PC will simply become the new norm in personal computers, resulting in profound changes to the hardware, the software, and, eventually, our entire understanding of what a PC is and does. AI working its way into mainstream computers means your PC will predict your habits, be more responsive to your daily tasks, and even adapt into a better partner for work and play. The key to all that will be the spread of local AI processing, in contrast to AI services served up solely from the cloud.
I'm one of PCMag's AI experts, having thoroughly tested and reviewed many of the first AI PCs. I also write a regular AI-focused column, Try AI, with Emily Forlini, our senior reporter on the AI beat. Below, I'll break down the definition of an AI PC even further, detail the new processing hardware behind this AI PC revolution, and give an overview of what this new technology is capable of now.
Simply put: Any laptop or desktop built to run AI apps or processes on the device, which is to say, "locally," is an AI PC. In other words, with an AI PC, you should be able to run AI services similar to ChatGPT, among others, without needing to get online to tap into AI power in the cloud. AI PCs will also be able to power a host of AI assistants that do a range of jobs—in the background and the foreground—on your machine.
But that's not the half of it. Today's PCs, built with AI in mind, have different hardware, modified software, and even changes to their BIOS (the computer's motherboard firmware that manages basic operations). These key changes distinguish the modern AI-ready laptop or desktop from the systems sold just a few years ago. Understanding these differences is critical as we enter the AI era.
Unlike traditional laptops or desktop PCs, AI PCs have additional silicon for AI processing, usually built directly onto the processor die. On AMD, Intel, and Qualcomm systems, this is generically called the neural processing unit, or NPU. Apple has similar hardware capabilities built into its M-series chips with its Neural Engine.
In all cases, the NPU is built on a highly parallelized and optimized processing architecture designed to crunch many more algorithmic tasks simultaneously than standard CPU cores can. The regular processor cores still handle routine jobs on your machine—say, your everyday browsing and word processing. The differently structured NPU, meanwhile, can free up the CPU and the graphics-acceleration silicon to do their day jobs while it handles the AI stuff.
Having an NPU is an AI enabler. Before NPUs, on-device neural processing was best handled by discrete graphics processors (GPUs), leveraging highly parallel silicon arrangements like Nvidia's CUDA cores to run neural networks. If your PC has a dedicated GPU, you generally don't want it tied up in AI processing when it has other work to do. Also, discrete graphics chips tend to use a lot of power. NPUs, explicitly designed for AI tasks, are much more power-efficient out of the gate. They allow similar on-device AI processing without killing battery life.
In sum, NPUs are purpose-built for neural networks, which have different processing needs than the sequential threaded processing typical of most PC applications. Hardware optimized for neural network architectures is a better match for AI-specific tasks, and much more efficient. Even though some AI tasks can be handled by GPU or CPU hardware, using an NPU frees up those other components to handle the workloads they were meant to support.
Mind you, even with an NPU present, AI functions sometimes draw on CPU and GPU compute resources, too, with different systems and apps balancing the load differently across these core components. Still, NPUs make possible AI crunching that traditionally wasn't realistic on lighter-hitting PCs like mainstream laptops. The early requirement for a dedicated GPU to run the first ChatGPT-like open-source models on consumer hardware was a big limiter since, in the laptop world, only mobile workstations, gaming systems, and powerful content-creation models would have one in the first place.
Even then, running local models on that hardware was extremely taxing, necessitating a move to dedicated silicon that could take on the work without monopolizing the CPU and GPU. To alleviate that, NPUs were integrated into processors by AMD, Intel, and Qualcomm and have been filtering into laptops (and, to a lesser extent, desktops) over the last two years.
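To make the hand-off concrete, here's a minimal sketch of how an app might steer an AI workload toward the NPU using ONNX Runtime's execution providers. The provider names and the "model.onnx" file below are illustrative assumptions; what's actually available depends on your chip and on which runtime packages are installed.

```python
# Minimal sketch (assumptions noted above): ask ONNX Runtime which execution
# providers exist on this machine, prefer an NPU-backed one, fall back to CPU.
import onnxruntime as ort

available = ort.get_available_providers()
print("Available providers:", available)

# Common NPU-backed providers: Qualcomm ("QNNExecutionProvider"),
# Intel ("OpenVINOExecutionProvider"), and DirectML ("DmlExecutionProvider").
preferred = [p for p in ("QNNExecutionProvider",
                         "OpenVINOExecutionProvider",
                         "DmlExecutionProvider") if p in available]
providers = preferred + ["CPUExecutionProvider"]

# Load a model (path is a placeholder) and let the runtime place the work.
session = ort.InferenceSession("model.onnx", providers=providers)
print("Running on:", session.get_providers()[0])
```

The point isn't the specific library: It's that the app, not the user, decides which silicon handles the AI math, with the CPU remaining the universal fallback.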
One measurement dominates current conversations around AI capability: trillions of operations per second, or TOPS. TOPS measures the maximum number of 8-bit integer (INT8) operations a chip can execute each second, which translates roughly into AI inference performance. (INT8 is one of the common number formats used to process AI functions and tasks.)
Sometimes, the TOPS figure you see rates the NPU's capability alone. At other times, a manufacturer will quote "total system TOPS," which folds the CPU's and GPU's AI-crunching muscle into the overall number; that figure will be a lot higher than the NPU number alone. In simple terms, though, a higher TOPS count means a faster, more capable system for AI applications. Among the most recent laptop processors from AMD, Intel, and Qualcomm, a figure of around 50 TOPS for the NPU indicates a cutting-edge chip; earlier-generation ones (especially from Intel) may have TOPS counts in the teens.
It's worth noting that TOPS ratings aren't the result of some benchmark or performance test we run. Rather, they indicate a vendor-rated theoretical peak, found by multiplying the chip's clock frequency by the number of operations it can perform per clock cycle. But again, this is theoretical, and real-world performance can be much lower, slowed by limitations like memory bandwidth, data-transfer bottlenecks, thermal throttling, and the specific workloads in question.
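For a sense of that arithmetic, here's the back-of-the-envelope version with purely illustrative numbers; the unit count and clock speed below are assumptions, not any vendor's actual specifications.

```python
# Illustrative peak-TOPS math (hypothetical figures, not vendor specs):
# peak TOPS = MAC units x 2 operations per MAC x clock speed, divided by 1e12.
mac_units = 12_288      # hypothetical number of multiply-accumulate units
ops_per_mac = 2         # each MAC counts as a multiply plus an add
clock_hz = 2.0e9        # hypothetical 2.0GHz NPU clock

peak_tops = mac_units * ops_per_mac * clock_hz / 1e12
print(f"Theoretical peak: {peak_tops:.1f} TOPS")  # about 49.2 TOPS
```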
Also, sometimes the whole point of running an AI task on an NPU is to run it more efficiently than you otherwise could, or to run it in the background. (It's not necessarily about drag-race speed.) So don't obsess over maxing out the TOPS count in your next PC's chip. It's often best just to know that yours is within striking distance of other recent AI PCs. (For example, early NPU-equipped laptops using Intel's "Meteor Lake" Core Ultra 100 chips may be rated for just 11 NPU TOPS versus newer Core Ultra 200 chips' 48 NPU TOPS.)
Neural processing is only one ingredient in what makes the modern AI PC: You need AI software to take advantage of the hardware. Software has become the main battleground for companies eager to define the AI PC in terms of their own brands.
Intel defines an AI PC as one equipped with an NPU. Microsoft then layered on its own requirements: a minimum level of hardware potency for running its Copilot AI assistant in Windows 11, plus the dedicated Copilot key now required on all Windows laptops with AI hardware inside.
Microsoft's Copilot PCs and more premium Copilot+ PCs are already what many people associate with the term "AI PC." The combination of Copilot Assistant, the Copilot Key on the laptop to call up this software, and a growing range of locally run AI features (from on-the-fly webcam image enhancement to Microsoft Windows' Recall) shows how varied the software side of things can be.
Apple emphasizes the Neural Engine hardware less, but Apple Intelligence still uses it. Apple Intelligence combines on-device and cloud-computing features, and they're part and parcel of the macOS user experience, available in individual apps and at the operating system level. Writing assistance, for example, is available across apps like Mail, Messages, Notes, and Pages (and even some third-party apps). Siri's Apple Intelligence enhancements can now adjust system settings, manage device features, and automate workflows.
Apple is also leaning on its broader ecosystem, with Apple Intelligence integrated into Macs, iPhones, and iPads and tightly woven across devices and apps. That means you're not limited to using features on a single device: Information and settings can carry over from, say, your phone to your laptop.
Google's ChromeOS, on the other hand, is more web-focused. It includes AI-powered features that run on the device, emphasizing integrated functions like audio recording and transcription and video-call enhancements that rely on local hardware. Other AI features lean on Google's web-based Gemini assistant and embed AI into many corners of the system, such as right-click summaries ("Help Me Read") and the ability for Gemini to see what's on your screen for more contextual assistance.
With so many different takes on AI-enabled devices, not to mention a rapidly changing list of capabilities and features, a brief discussion of AI PC software cannot be comprehensive. But our ongoing coverage of these big players is a fine place to start.
The biggest implementation of AI on the laptop is the addition of generative AI in all its flavors. What follows isn't an exhaustive list but rather an overview of some of the standout functions AI PCs can carry out now.
Language models can summarize text, define terms, explain concepts, and reword your writing for clarity, tone, or audience. These tools use large and small language models, giving you functionality similar to what you'd get from ChatGPT, but without needing internet access. Running these models on your machine also means fewer data-privacy worries, since nothing gets shared with a third party. Examples of common locally run language models include Microsoft's Phi and Orca models, Apple's proprietary LLMs in Apple Intelligence, and brand-specific tools like Lenovo AI Now (built on Meta's Llama 3).
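As a rough illustration of what "locally run" means in practice, here's a minimal sketch using the open-source Hugging Face transformers library. The model name is just one example of a compact model (the first run downloads its weights, after which no internet connection is needed), and a plain setup like this runs on the CPU or GPU; routing work to the NPU generally requires a vendor-specific runtime.

```python
# Minimal sketch: run a small language model entirely on-device.
# Model name is illustrative; weights download on the first run only.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="microsoft/Phi-3-mini-4k-instruct",  # one example of a small model
)

prompt = "Rewrite this for a formal audience: hey, the report's gonna be late."
result = generator(prompt, max_new_tokens=80)
print(result[0]["generated_text"])
```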
Generative AI also extends to image generation on your PC, the process behind making AI art, designing custom emoji, and performing complex image edits. What once required specialized skills and software can now be done by everyday users, often with nothing more than a simple prompt.
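Local image generation works along the same lines. Below is a loose sketch using the open-source diffusers library; the model ID and file name are illustrative, the weights download on first use, and (as with the language-model example) a vanilla setup like this targets the CPU or GPU rather than the NPU.

```python
# Minimal sketch: on-device text-to-image generation (illustrative model ID).
import torch
from diffusers import AutoPipelineForText2Image

pipe = AutoPipelineForText2Image.from_pretrained(
    "stabilityai/sdxl-turbo", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")  # use "cpu" (and a float32 dtype) if there's no GPU

image = pipe(
    prompt="a watercolor sketch of a laptop on a sunny desk",
    num_inference_steps=1,  # this particular model is built for one-step output
    guidance_scale=0.0,
).images[0]
image.save("sketch.png")
```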
AI also provides real-time captioning for videos and meetings. It can intelligently turn audio into text, differentiate between speakers, and even translate languages in real time. We've seen this applied via locally run resources in applications like Microsoft's Live Captions on Copilot+ systems and Google's similarly named Live Caption in ChromeOS.
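For a hedged sense of the underlying idea, here's a sketch built on the open-source Whisper speech-recognition model. The audio file name is a placeholder, and unlike the NPU-accelerated captioning from Microsoft and Google, a generic install like this runs on the CPU or GPU.

```python
# Minimal sketch: local speech-to-text with Whisper (file name is a placeholder).
import whisper

model = whisper.load_model("base")        # small, CPU-friendly checkpoint
result = model.transcribe("meeting.wav")  # transcribes entirely on-device
print(result["text"])

# Whisper can also translate non-English speech into English text.
translated = model.transcribe("meeting.wav", task="translate")
print(translated["text"])
```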
New AI features like Microsoft's Recall, mentioned earlier, also transform how users interact with their computers. Recall continuously captures screenshots and indexes your activity locally, letting you search your files, browsing history, and chats with natural, conversational language instead of exact file names or folder paths. That makes retrieving information faster and easier. Say you know that you were browsing for tennis rackets some months back but are unsure when, on what site, or even in which browser. Recall could scan those screenshots and put you back on track by interpreting their content.
However, this always-watching, always-recording feature has raised some security hackles. In response, Microsoft has made this tool an opt-in feature and bolstered it with strong encryption and security measures to protect user data.
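For the curious, here's a loose, illustrative sketch of how a Recall-style index could work in principle: grab a screenshot, pull the text out with OCR, and stash it in a searchable local database. The library choices and file names are assumptions, and this toy version has none of the encryption or access controls Microsoft describes.

```python
# Toy sketch of a Recall-style local index (libraries and paths are illustrative).
import sqlite3
import time

import mss          # screenshot capture
import pytesseract  # OCR; requires the Tesseract engine installed
from PIL import Image

db = sqlite3.connect("activity.db")
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS snaps USING fts5(ts, text)")

def capture_and_index():
    # Grab the primary monitor, OCR the image, and index the recovered text.
    with mss.mss() as screen:
        path = screen.shot(output="snap.png")
    text = pytesseract.image_to_string(Image.open(path))
    db.execute("INSERT INTO snaps VALUES (?, ?)", (time.ctime(), text))
    db.commit()

capture_and_index()

# Later: search everything you've seen on screen by keyword.
query = "tennis racket"
for ts, snippet in db.execute(
        "SELECT ts, snippet(snaps, 1, '[', ']', '...', 8) FROM snaps "
        "WHERE snaps MATCH ?", (query,)):
    print(ts, snippet)
```

A real implementation would capture on a timer, encrypt the store, and layer a language model on top for conversational queries, but the core pattern of capture, index locally, and search locally is the same.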
AI chatbots are also becoming full-fledged "assistants"—by which we mean they can carry out tasks on your behalf across different apps and contexts. Mind you, that capability is still in the early days.
For one, Copilot+ PCs provide personalized help by analyzing your notes, photos, and calendar locally. That can help Copilot write context-rich email drafts that pull details from those sources, or let you search your photos with loose, conversational language. Apple's Siri, meanwhile, has advanced tech-support knowledge and the ability to control smart home devices, automate daily routines, and perform specific actions in different apps on request, like closing tabs in Safari or adding to Notes based on a comment.
As these tools get more contextual access, build individual profiles around your habits and behavior patterns, and gain access to functions across apps, they'll continue to be more useful in various ways. Imagine an assistant that can order items when you express interest in a recipe, or that can plan a weekend trip and book the necessary hotel and transportation for you, all based on an itinerary tailored to your interests. As AI tools move from simple commands and prompts to begin planning, reasoning, and coordinating actions across multiple apps, those sorts of uses could become the norm.
But the AI in an AI PC isn't limited to the digital assistants or generative AI tools that garner the most attention. Some of the hardware and software benefits don't live in those tools at all.
Many of the latest systems include AI enhancements that run in the background, managing various subsystems and boosting performance for a better user experience. You won't need to tweak settings or have technical knowledge to benefit from this tuning.
Intelligent system management has been in laptops for years, but new AI capabilities allow AI PCs to actively manage power and fan settings, balance CPU and GPU performance, and fine-tune battery usage, all in the background. These optimizations are dynamic, adjusting to user behavior and learned patterns, ensuring the laptop runs smoothly and efficiently throughout the day. For example, Copilot+ PCs use the NPU to watch user telemetry data in the background and automatically adjust hardware settings such as fan speed to optimize cooling.
AI can also improve the look and sound of video calls, boosting webcam image and microphone audio quality. Automated adjustments to white balance, image exposure, and subject framing in real time can make your video calls look more professional without you having to make the tweaks yourself. Likewise, eliminating background sounds and enhancing audio clarity improves your sound without the need to invest in additional equipment. While many of these features are similar to what you might get in Zoom or Google Meet, running them on-device means you can get those benefits without being tied to a specific service. Plus, added features like eye-contact correction go above and beyond the enhancements you'll get online.
These are just some ways that AI enhances your computer experience, even without chatting with a bot or asking for an image. AI PCs provide all of these enhancements as part of the overall package, and the NPU hardware handles these background improvements without draining your battery or monopolizing your CPU.
One of the most significant ways an AI PC can improve your computing experience isn't in the OS or the hardware: It's in the apps you use daily. That's where things are changing most rapidly, as AI-accelerating hardware becomes more common and developers push forward to leverage it in new, creative ways.
Many content-creation tools have new AI functions that would have been mind-blowing for pros just a couple of years ago but are now within reach of everyday users. Anyone with an AI PC can take advantage of these AI-enhanced tools for both work and leisure:
Adobe Premiere Pro leverages the NPU to accelerate key AI-powered features in video editing. These include intelligent tools for automatic aspect-ratio adjustment, tracking and framing subjects on the fly; sophisticated audio tagging that distinguishes among dialogue, music, and sound effects; and even generative AI capabilities for tasks like extending video clips by creating extra frames and footage as fill.
For image manipulation, Adobe Photoshop integrates AI filters, often accelerated by the NPU, enabling enhanced portrait adjustments, effortless background removal, and the intelligent removal of unwanted objects or people from photographs. Before, tasks like these took pros lots of time and manual effort. Now, they're done in moments.
DaVinci Resolve simplifies complex tasks with AI, notably allowing users to perform intricate object masking in videos without the traditional time-consuming manual workflows.
In 3D and graphics, Blender incorporates AI tools capable of generating 3D-rendered objects directly from text descriptions, dramatically speeding up the initial modeling stages.
Audio editing is also transforming, with programs like Moises Live and DJay Pro utilizing AI to isolate individual instruments or vocal tracks from existing audio recordings.
Even everyday communication sees AI boosts. Zoom incorporates AI to dynamically adjust sharpness and lighting in video feeds during calls, improving visual clarity. Doing this kind of thing in real time, locally, is resource-intensive without an NPU; conversely, it's a strain on the cloud provider if everyone in a huge call is applying effects and relying on the service to execute them remotely. This is where an AI PC shows its strength: sharing the load.
For faster video production, CapCut employs AI to automate various video editing processes. These include transcript-based edits (to remove dead time and filler words), automatic captioning, NPU-driven video resizing, and identifying key moments in long videos that can be repackaged into short clips for social media. All this can make polished video more attainable for a wider swath of users.
Finally, in coding and development, Visual Studio Code integrates generative AI tools that can significantly accelerate the coding process through intelligent code suggestions and generation.
Powered by integrated NPU hardware and new AI software, everyday apps are getting more powerful and easier to use. Thanks to the shift to AI PCs, professional-grade tools are more accessible, creation and productivity are more intuitive, and everyday computer use is leveling up.
Of course, it's still early days for the AI PC. As AI and PC hardware evolve, you will surely see new features, new tools, and advanced versions of these rudimentary capabilities improving month after month, year after year.
Apple's ecosystem hints at where things might go next, with AI features that work across laptops, phones, and other devices, making AI assistance available beyond the confines of a single machine by drawing on data from your other devices. Smart home, wearable, and fitness devices are all ripe for integration with your AI assistant, eliminating the need to manually control devices and apps and enabling personalized automation and content that understands more about your life, your home, and your body. And it's looking like all of that capability might center on your PC. From personalized workouts to context-aware energy management, your AI PC could become the assistant that proactively orchestrates your daily life to be more convenient, healthy, and secure.
I've barely touched upon major players in the AI world, like OpenAI's ChatGPT, Google's Gemini, and Anthropic's Claude. You can use these online tools on any PC, but they will inevitably become entwined with our personal devices and the data that lives on them.
As AI tools and AI-capable devices become more common, they raise all sorts of questions that demand careful consideration. Long-term concerns around security, ethics, and data privacy loom larger than ever as our devices get smarter and our tools more powerful. Shorter-term concerns about affordability arise, too, as AI features make for more premium PCs and subscriptions to different AI tools accumulate. The actual usefulness of AI tools will come under scrutiny as the "AI PC" label fades away and just becomes part of our understanding of what personal computers are and do.
