
Report claims Arm chips will power 90% of AI servers based on custom processors in 2029

Virtually all hyperscale cloud service providers (CSPs), as well as some of the leading developers of AI accelerators, now run custom-silicon programs focused not only on AI accelerators but also on custom general-purpose CPUs, usually based on the Arm instruction set architecture (ISA). According to Counterpoint Research, custom Arm-based CPUs will grow to a 90% share inside AI servers over the next several years, leaving x86 and off-the-shelf Arm processors with around 10%.


Emerging: The topic has initial corroboration, but the newsroom is still waiting on stronger confirmation.
Reference image for: Report claims Arm chips will power 90% of AI servers based on custom processors in 2029 (image from Tom's Hardware).



What is happening now

Hyperscale cloud service providers and leading AI-accelerator developers are increasingly designing their own Arm-based general-purpose CPUs alongside their AI accelerators, rather than relying on off-the-shelf parts. The main reference behind this piece is Tom's Hardware.

Where the sources line up

Tom's Hardware is the main source layer for now, and the rest should be read as a signal that is still widening. It relays Counterpoint Research's projection that custom Arm-based CPUs will reach a 90% share inside AI servers over the next several years, leaving x86 and off-the-shelf Arm processors with around 10%.


The details worth keeping

x86 processors from AMD and Intel have long dominated general-purpose servers, which is why most early AI servers relied on Opteron and Xeon processors.

Why this matters most

The signal is strong enough to deserve attention, but it should still be read as a developing story rather than one that is fully settled. With only one source layer on the table, the part worth reading most closely is where firm facts meet the market's early reaction. The core argument: while x86 has long dominated servers, Arm-based custom CPUs tailored for specific data-intensive AI workloads are more cost- and power-efficient.

What to watch next

The next readout is pricing, deployment coverage, and whether the change feels real once the hardware reaches users. Patrick Tech Media will keep checking rollout speed, market reaction, and how Tom's Hardware updates its next pieces. In this pass, the story was distilled from a single signal into one source reference that is genuinely useful to readers.
