The NPU: Neural processing unit or needless pricey upsell?

Tech for tech's sake with niche uses that traditional hardware can handle


Opinion If you haven't heard of neural processing units (NPUs) by now, you must have missed a year's worth of AI marketing from Intel, AMD, and Qualcomm.

In the past 12 months, these AI-focused processors have been touted as the next essential upgrade – one that everyone apparently needs to make the most of artificial intelligence. But is this all just marketing hype, or do NPUs genuinely offer the transformative value they promise?

What are NPUs?

NPUs are specialized processors within system-on-chips (SoCs) designed to handle AI-specific tasks, such as background noise suppression, real-time video enhancement, and basic generative AI functions. Companies including Intel with its VPU in Meteor Lake, AMD with Ryzen AI, and Qualcomm with the Hexagon AI processor have all embedded NPUs into their silicon, claiming they will revolutionize the computing experience by making devices smarter and more efficient. The idea is to offload AI workloads from the CPU and GPU to save power, theoretically improving battery life and providing faster on-chip AI processing.

But are these AI-enabled processors genuinely game-changing features or are they occupying precious die space that could be better utilized to meet the real needs of users?

The reality behind NPU benefits

While NPUs do add efficiency, particularly in mobile devices where every watt saved is valuable, their impact on laptops – where battery life is already robust – is harder to justify. The tasks NPUs handle are largely niche and have a limited impact on the average user's experience. If you're someone who frequently uses voice commands or relies heavily on video call enhancements, an NPU might save some battery life. But for most users, at present it's a nice-to-have feature rather than an essential one. CPUs and GPUs have managed these functions adequately for years, and while NPUs might lower power consumption slightly, the innovation is more about incremental efficiency than offering meaningful new capabilities.

Take Intel's Meteor Lake VPU, for instance. It's marketed as a solution for on-device AI tasks like video call background blurring and noise cancellation – tasks that CPUs and GPUs have been handling effectively. The primary benefit is a marginal boost in power efficiency, which, while useful, is unremarkable when you consider the overall computing experience. AMD's Ryzen AI takes a similar approach, offering efficiency gains without groundbreaking functionality. Qualcomm's Hexagon processor, drawing from its mobile pedigree, brings similar capabilities to laptops but doesn't significantly expand the range of applications for most users.

TOPS trumps

When discussing NPUs, vendors often highlight TOPS (tera operations per second) as a metric of performance. Intel's upcoming Lunar Lake platform boasts a 48 TOPS NPU, AMD's Ryzen AI 300 series is capable of 55 TOPS, and Qualcomm's Snapdragon X Elite features a 45 TOPS NPU. These numbers are thrown around as if they have substantial meaning for real-world users.

However, TOPS is a theoretical measurement of peak performance under ideal lab conditions. It's calculated based on the number of multiply-accumulate (MAC) units and operating frequency, but it doesn't necessarily translate to actual performance gains in everyday use. For the average user, these figures are as meaningful as theoretical horsepower in a car they will never drive at top speed.
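To see how hollow the headline figure can be, consider how it's derived: each MAC unit performs two operations per clock cycle (one multiply, one add), so peak TOPS is roughly 2 × MAC units × clock frequency, with every unit assumed busy every cycle. A minimal sketch, using an illustrative MAC count and clock speed rather than any vendor's real configuration:

```python
def peak_tops(mac_units: int, clock_hz: float) -> float:
    """Theoretical peak TOPS: each MAC unit performs 2 ops
    (multiply + add) per cycle, with 100% utilization assumed."""
    return mac_units * 2 * clock_hz / 1e12

# Hypothetical NPU: 12,288 MAC units clocked at 2.0 GHz
print(round(peak_tops(12_288, 2.0e9), 1))  # prints 49.2
```

A marketing-friendly "~49 TOPS" thus falls out of two spec-sheet numbers, with no account of memory bandwidth, utilization, or the numeric precision (often INT8) at which the figure is quoted.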

The trade-off: Die space utilization

Including an NPU consumes valuable die space, which could instead be allocated to more universally beneficial features, such as additional CPU cores or expanded GPU capabilities. Using AMD's Zen 5-based Ryzen AI 300 mobile SoC as an example, the NPU occupies about 10-15 percent of the die space – a significant portion. If that space were instead used to add more CPU cores, users could experience noticeable improvements in multi-threaded applications, benefiting developers, content creators, and power users alike.

Alternatively, expanding the integrated GPU could offer better graphics performance, a feature that would be appreciated by gamers and professionals using graphics-intensive applications. Given that GPUs have traditionally been the go-to hardware for AI workloads, enhancing GPU capabilities could serve a dual purpose.

Are NPUs truly future-proofing?

Manufacturers promote NPUs as essential for future-proofing laptops in an AI-driven world. However, given the rapidly evolving nature of AI, it's challenging to predict which hardware features will remain relevant. While NPUs do offer some advantages for specific AI tasks, most users are unlikely to notice their absence. The majority of everyday computing tasks – like web browsing, document editing, and media consumption – do not require AI-driven optimization.

Even for users who occasionally interact with AI-powered features, CPUs and GPUs are typically sufficient for handling these workloads, albeit with slightly higher power consumption. The promise of NPUs lies more in potential future applications than in current, tangible benefits for the average consumer.

Gimmick or genuine innovation?

While AI has many practical applications – such as speech-to-text conversion and real-time translation tools – the inclusion of NPUs in laptops feels premature. The technology seems to be a solution in search of a problem, driven more by marketing strategies than by actual user demand. Until AI applications become truly mainstream and indispensable in daily computing, NPUs may remain an overhyped feature rather than an essential component.

In the meantime, consumers might benefit more from enhancements in processing power, graphics capabilities, and overall system performance – improvements that offer immediate and noticeable advantages. As it stands, NPUs are an interesting development, but perhaps not the game-changing innovation they're touted to be – at least not for most users.

PC makers are keen to promote hardware containing an NPU, possibly because AI PCs promise to lift average sales prices across the sector by five to ten percent. ®
