Let's be honest: we're all tired of seeing AI plastered across every technology product, and the trend is unlikely to abate any time soon. Its latest target is the PC market, with AMD, Intel, Microsoft, and Qualcomm all talking about AI PCs for about a year now. Microsoft will hold an event on March 21st titled “A New Era of Work,” and AMD, Intel, and Qualcomm will deliver CEO keynotes at Computex in Taipei, Taiwan. Get ready for a flood of AI PCs this year.
In fairness, Tirias Research has long promoted AI processing as the next big wave of computing, one that uses trained models to improve predictive processing and user interfaces. AI processing has already significantly improved PC tasks such as voice recognition, video upscaling, video call optimization, microphone noise reduction, and power/battery management. Generative AI (GenAI), built on large language models (LLMs) that can generate new material and content from text prompts, has unlocked another level of AI applications. With GenAI, tasks such as image generation, creative and business writing, chatbot assistance, and even video creation are possible with minimal user input. So far, however, GenAI has run almost entirely in cloud datacenters, with only a few limited examples on client devices, and the processing and power requirements of GenAI's ever-increasing demands threaten to overwhelm those datacenters.
Next-generation AI PCs aim to rein in runaway cloud datacenter requirements by running GenAI workloads locally, with reduced or no cloud support.
Who defines an AI PC?
According to recent reports, Microsoft has set minimum requirements for an AI PC: at least 16 GB of DRAM and a neural processing unit (NPU) capable of at least 40 TOPS (trillions of operations per second). No currently shipping Windows PC processor meets the 40 TOPS requirement. Qualcomm's Snapdragon X Elite platform, scheduled to ship in late 2024, will meet Microsoft's performance bar with around 45 TOPS and support for at least 16 GB of DRAM. Intel's Meteor Lake platform falls slightly short at around 34 TOPS, but Intel's Lunar Lake platform could reach the requirement when it ships in late 2024. AMD's next Ryzen processor (Strix Point) may also meet or exceed the 40 TOPS requirement. The AMD, Intel, and Qualcomm keynotes at Computex this year will give a sense of where each company stands in the race to meet Microsoft's requirements.
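As a rough sketch of how those two thresholds combine, the snippet below checks platforms against the reported minimums. The TOPS and DRAM figures are the approximate numbers cited above, not official shipping specs, and the platform list is illustrative only.

```python
# Reported AI PC minimums per the article: >= 40 TOPS NPU, >= 16 GB DRAM.
MIN_TOPS = 40
MIN_DRAM_GB = 16

# Approximate figures cited in the article (illustrative, not official specs).
platforms = {
    "Qualcomm Snapdragon X Elite": (45, 16),
    "Intel Meteor Lake": (34, 16),
}

def meets_ai_pc_minimums(npu_tops: float, dram_gb: int) -> bool:
    """Return True only if a platform clears both reported minimums."""
    return npu_tops >= MIN_TOPS and dram_gb >= MIN_DRAM_GB

for name, (tops, dram) in platforms.items():
    verdict = "meets" if meets_ai_pc_minimums(tops, dram) else "falls short of"
    print(f"{name}: {tops} TOPS, {dram} GB DRAM -> {verdict} the reported bar")
```

Note that both conditions must hold: a 34 TOPS NPU fails the check regardless of how much DRAM the system carries.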
This raises a question: should buyers wait for these new processors in the second half of the year, and is Microsoft even the final arbiter of what counts as an AI PC? AMD and Intel already show early examples of AI capabilities on platforms shipping today, and Qualcomm has equipped its Snapdragon PC processors with NPUs since the Snapdragon 850 in 2018.
Apple's M-series processors for MacBooks have featured a neural processor (the Neural Engine) since the M1's release in 2020, with the goal of running more AI workloads on the client. From the recent MacBook Air product announcement press release: “The move to Apple silicon makes every Mac a great platform for AI. M3 features a faster, more efficient 16-core Neural Engine, along with accelerators in the CPU and GPU to power on-device machine learning, making MacBook Air the world's best consumer laptop for AI. Leveraging this incredible AI performance, macOS delivers intelligent features that improve productivity and creativity, so users can enable powerful camera features, real-time speech-to-text, translation, text prediction, visual understanding, accessibility features, and more.”
Nvidia, for its part, will claim that any PC with an RTX 20-series or later discrete graphics processing unit (GPU) is an AI PC, because the GPU's Tensor cores can run AI workloads. Nvidia already uses AI to upscale display resolutions and enhance ray-traced graphics in video games, and the fact that most AI models are trained on Nvidia GPUs supports its claim. Running inference workloads on GeForce RTX GPUs is relatively easy; the problem is that it is far less power efficient than running the same AI model on a dedicated NPU.
The argument for running AI inference workloads on a dedicated NPU is that it is the most power-efficient approach, running AI workloads without sacrificing battery life. That matters especially for business notebooks, which outsell desktop PCs.
How will AI PCs advance over the rest of 2024?
Microsoft is expected to push support for its AI Copilot not only as a cloud service but also on client PCs. Copilot could change the way people use their PCs and increase productivity; this may be the biggest change to the PC since text-based DOS gave way to Windows' graphical user interface. Today's PC processors with NPUs offer some AI capabilities, but a step-function increase in AI processing power is very close.
Intel has been active in developing related software through the AI PC Acceleration Program it launched last year. The company claims it will bring AI to more than 100 million PCs based on Intel Core Ultra processors by 2025. It is working with over 100 ISVs and expects to enable over 300 AI-accelerated features that will improve the PC experience across audio effects, content creation, gaming, security, streaming, video collaboration, and more. Companies already participating in the program include Adobe, Audacity, BlackMagic, BufferZone, CyberLink, DeepRender, Fortemedia, MAGIX, Rewind AI, Skylum, Topaz, VideoCom, Webex, Wondershare Filmora, XSplit, and Zoom. AMD, meanwhile, is focused more on large ISVs such as Microsoft and Adobe, while Qualcomm hopes the years of work it has put into its AI stack, AI Hub, and mobile platforms will give it a head start.
All of this activity builds toward the major announcements expected from all parties at Microsoft's event in late March and, especially, at Computex in June, where we will learn more about how AI PCs will develop later this year. You will hear a lot about AI PCs throughout the year, but they will become widely available over the next two years. The benefits of AI PCs will initially be greatest for creative work, but AI capabilities will be incorporated into many applications; some will remain cloud-based, while many will benefit from local AI. The only question about whether to buy an AI PC this year is whether you want to be on the cutting edge of this change.
Tirias Research tracks and consults for companies across the electronics ecosystem, from semiconductors to systems and sensors to the cloud. Members of the Tirias Research team have consulted for companies including AMD, Arm, Intel, Nvidia, and Qualcomm across the CPU and GPU ecosystem.