TECH NEWS | CPUs gain importance as agentic AI expands data center workloads



The rise of agentic artificial intelligence is increasing demand for central processing units (CPUs) in AI data centers, as systems require more coordination, data handling and decision-making alongside graphics processing units (GPUs).

Agentic AI systems, described in June by AMD chief executive Lisa Su, refer to continuously operating software that accesses data, applications and services to complete complex tasks with minimal human input. These systems rely on GPUs for high-throughput computation but depend on CPUs to manage workflows and support overall system operations.

In modern AI infrastructure, CPUs handle scheduling, memory management, input/output processes and control flow, ensuring GPUs remain utilized. As AI workloads shift from training to inference, CPU responsibilities increase, particularly in coordinating multistep processes and interpreting outputs.
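The coordination role described above can be sketched as a simple producer-consumer pattern: a CPU thread prepares batches of data ahead of time so the accelerator-facing loop never stalls waiting for input. This is a minimal illustration only; the function names (prepare_batch, run_on_gpu) are hypothetical stand-ins, not part of any vendor's API.

```python
import queue
import threading

def prepare_batch(i):
    # CPU-side work: decode, normalize and batch input data.
    return [x * 2 for x in range(i, i + 4)]

def run_on_gpu(batch):
    # Stand-in for dispatching work to an accelerator;
    # here it is just a reduction so the sketch runs anywhere.
    return sum(batch)

def producer(q, n_batches):
    for i in range(n_batches):
        q.put(prepare_batch(i))   # blocks if the buffer is full
    q.put(None)                   # sentinel: no more work

def consume(q):
    results = []
    while (batch := q.get()) is not None:
        results.append(run_on_gpu(batch))
    return results

q = queue.Queue(maxsize=2)        # bounded buffer of ready batches
t = threading.Thread(target=producer, args=(q, 3))
t.start()
results = consume(q)
t.join()
print(results)
```

The bounded queue is the key design choice: it lets CPU preparation and accelerator execution overlap, which is the same idea, at a toy scale, as the scheduling and input/output handling the article describes.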

While GPUs remain central to training neural networks due to their ability to process large volumes of data in parallel, CPUs play a supporting role by feeding data, running operating systems and managing tasks. In inference, however, CPUs take on a more active role by routing data, managing application logic and determining outcomes.

This shift reflects a broader move toward system-level optimization in AI deployments, where performance depends on the balance between compute, networking and software components.

Recent benchmark estimates indicate that a system using AMD EPYC processors can deliver up to 2.1 times higher performance per core and up to 2.26 times better energy efficiency compared with systems based on the Nvidia Grace Superchip. The x86 architecture used in EPYC processors also supports a wide range of existing enterprise software without requiring code modifications.

In AI clusters, CPUs act as controllers that organize GPU workloads, prepare and transfer data, and manage system-level processes. This coordination becomes more complex in agentic AI, where systems must handle tool calls, application programming interface requests and data queries while maintaining throughput.
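The orchestration described here can be illustrated with a minimal CPU-side control loop: the orchestrator inspects each action an agent requests, routes it to the matching tool (an API request, a data query), and collects the results. The tool names and the action format below are illustrative assumptions, not any specific framework's interface.

```python
# Hypothetical tool registry; real agentic systems would wrap
# API clients, database queries and so on behind each entry.
TOOLS = {
    "search": lambda arg: f"results for {arg}",
    "calc": lambda arg: str(eval(arg, {"__builtins__": {}})),
}

def run_agent(actions):
    """Route each requested action to its tool until a finish step."""
    transcript = []
    for kind, arg in actions:
        if kind == "finish":
            transcript.append(("answer", arg))
            break
        result = TOOLS[kind](arg)   # CPU routes the tool call
        transcript.append((kind, result))
    return transcript

steps = [("search", "EPYC specs"), ("calc", "2*3"), ("finish", "done")]
print(run_agent(steps))
```

Every step in this loop is CPU work: dispatching, bookkeeping and interpreting results. The model's own inference happens elsewhere, which is why agentic workloads raise CPU utilization even when GPUs do the heavy computation.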

The emergence of agentic AI is also increasing CPU utilization, as these systems repeatedly process data, refine outputs and interact with multiple applications. This creates additional demand for compute capacity in data centers.

Advanced Micro Devices is positioning its EPYC processors as part of a broader AI infrastructure that includes GPUs, networking and software. The company said its CPUs are designed to work alongside its Instinct GPUs and ROCm software stack to support scalable AI deployments.

Next-generation EPYC processors, codenamed “Venice,” are expected to power upcoming AI architectures aimed at improving performance and energy efficiency across workloads.

As AI adoption grows, industry demand for servers is increasing, driving a new cycle of infrastructure upgrades. Data center operators are focusing on balanced systems that optimize both CPU and GPU performance to support expanding AI applications.


by TechSabado.com editors
