Laptops Are All In on AI, With Little to Show for It

The best laptops and laptop brands are fully embracing AI. Compared to a year ago, the list of top laptops is now filled with NPUs and a new generation of processors, all promising to integrate AI into every aspect of our daily lives. Yet even a couple of years into this AI revolution, there’s not much to show for it.

We now have Qualcomm’s long-awaited Snapdragon X Elite chips in Copilot+ laptops, and AMD has entered the competition with Ryzen AI 300 chips. Eventually, we’ll also have Intel Lunar Lake CPUs. The more time we spend with these processors, the clearer it becomes that they are built not for an AI future but for today’s needs – and you often can’t have it both ways.

It comes down to space

One crucial aspect of chip design that isn’t discussed often enough is space. If you spend time on hardware forums and enthusiast sites, you already know how significant die area is.

But for most people, it’s not something you’d ever think about. Companies like AMD and Intel could build colossal chips with immense computing power (and an equally huge appetite for power and cooling, but that’s a different topic), yet they don’t. Much of the art of chip design lies in how much performance you can squeeze into a given area.

It’s essential to understand this because adding hardware to a chip isn’t free – it consumes die area that could otherwise be used for something else. Take the annotated die shot of a Ryzen AI 300 CPU below: in the upper right, you can see how much space the XDNA NPU occupies. It’s the smallest of the chip’s three processors – speculation puts it at roughly 14mm² – but it still claims a considerable amount of area. AMD could have used that space for more cores or, more likely, additional Infinity Cache for the GPU.

This is not to single out AMD or suggest that Ryzen AI 300 CPUs perform poorly. They don’t, as you can read in our Asus Zenbook S 16 review. AMD, Intel, and Qualcomm are constantly making trade-offs in design to fit everything they need onto the chip, and it’s not as straightforward as just adding more cache and calling it a day. Pulling one lever affects countless other values, and all of these need to be balanced.

But it illustrates that adding an NPU to a chip is not something designers can do without making concessions elsewhere. And right now, these NPUs sit largely idle. Even AI-accelerated apps would rather use the integrated GPU, and if you have a discrete GPU, it’s far faster than the NPU. There are some use cases for an NPU, but for the vast majority of people, it mainly provides (slightly) better background blur in video calls.
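That preference order – discrete GPU first, then integrated GPU, with the NPU near the bottom – can be sketched as a simple fallback chain. This is a minimal illustration, not a real API; the device names and the `pick_device` function are hypothetical, and real apps negotiate this through frameworks like ONNX Runtime or DirectML.

```python
# Illustrative sketch of how an AI-accelerated app might pick a compute
# device today. Device names and the preference order are assumptions
# for illustration, not a real API.
PREFERENCE_ORDER = ["discrete_gpu", "integrated_gpu", "npu", "cpu"]

def pick_device(available):
    """Return the most preferred device present on this machine."""
    for device in PREFERENCE_ORDER:
        if device in available:
            return device
    return "cpu"  # always safe to fall back to the CPU

# A Copilot+ laptop without a discrete GPU: the integrated GPU still
# wins over the NPU, matching how most apps behave today.
print(pick_device({"integrated_gpu", "npu", "cpu"}))  # integrated_gpu
```

The NPU only gets picked when no GPU is available at all – which is exactly why that silicon sits idle on most machines.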

Ryzen AI 300 is the only example we have for now, but Intel’s upcoming Lunar Lake chips will face a similar situation. Both AMD and Intel are chasing Microsoft’s approval for Copilot+ PCs, and are therefore including NPUs powerful enough – at least 40 TOPS – to meet Microsoft’s seemingly arbitrary requirement. Both companies were already shipping AI co-processors before Copilot+, but those earlier co-processors are essentially useless now that the bar is so much higher.

It’s impossible to determine if AMD and Intel would design their processors differently without the Copilot+ push. Currently, however, we have a piece of silicon that serves little purpose on Ryzen AI 300 and eventually on Lunar Lake. It reminds us of Intel’s push with Meteor Lake, which has become almost obsolete in the face of Copilot+ requirements.

Promised AI features

Both AMD and Intel have promised that they’ll eventually be brought into the Copilot+ fold. For now, only Qualcomm’s Snapdragon X Elite chips have Microsoft’s approval, but AMD, at least, claims its chips will get access to Copilot+ features before the end of the year. That points to the other problem, though – we don’t really have any Copilot+ features.

The headline feature since Microsoft announced Copilot+ has been Recall, and not a single person outside the press has been able to use it. Microsoft delayed it, restricted it to Windows Insiders, and, by the time Copilot+ PCs were ready to launch, pushed it back indefinitely. AMD and Intel might be brought into the Copilot+ fold before the end of the year, but that doesn’t mean much if we don’t have more local AI features.

We’re witnessing the consequences of Microsoft’s influence on the PC industry in action. We have a series of new chips from Qualcomm and AMD, and soon Intel, all of which carry a component that isn’t doing much. It feels rushed, much like Bing Chat did, and I wonder if Microsoft is truly as committed to this platform as it claims. Not to mention that what’s actually driving sales of Copilot+ PCs isn’t AI features but better battery life.

In the next few years, it’s estimated that half a billion AI-capable laptops will be sold, and by 2027, they’ll make up more than half of all PC shipments. It’s clear why Microsoft and the broader PC industry are pushing so hard into AI. But when it comes to the products we have today, it’s difficult to say they are as essential as Microsoft, Intel, AMD, and Qualcomm would have you believe.

Laying the groundwork

It’s still important to consider the reasons in this situation, though. We have a classic chicken-and-egg problem with AI PCs, and even with the introduction of Copilot+ and the delay of Recall, that hasn’t changed. Intel, AMD, and Qualcomm are attempting to lay the foundation for AI applications that will exist in the future, when, hopefully, they will be so seamlessly integrated with how we use PCs that we won’t even think about having an NPU. It’s not a crazy idea – Apple has been doing this exact thing for years, and Apple Intelligence feels like the natural progression of that.

That’s not where we are right now, though, so if you’re investing in an AI PC, you need to be prepared for the consequences of being an early adopter. Few apps can use your NPU unless you go looking for them, and even apps with local AI features would rather run on your GPU. Not to mention the moving goalposts we’ve already seen with Copilot+ and the first wave of NPUs from AMD and Intel.

I have no doubt that we’ll reach that point – there’s simply too much money being invested in AI right now for it not to become a staple in PCs. I’m still waiting to see when it will truly be as essential as we’ve been led to believe.

By mayask
