Want your server to access more than 100,000 DIMM slots in one go? This Korean startup claims its CXL 3.1-based technology can help you scale to more than 100PB of RAM — but it will cost nearly $5 billion


Ever imagined drawing on up to 100 petabytes of RAM? Well, this startup could be the key to unlocking groundbreaking memory capabilities.

Korean fabless startup Panmnesia unveiled what it described as the world’s first CXL-enabled AI cluster featuring CXL 3.1 switches during the recent 2024 OCP Global Summit.

The solution, according to Panmnesia, has the potential to markedly improve the cost-effectiveness of AI data centers by harnessing Compute Express Link (CXL) technology.

Scalable - but costly

In an announcement, the startup revealed the CXL-enabled AI cluster will be built around its main products, the CXL 3.1 switch and CXL 3.1 IP, both of which support the connections between the CXL memory nodes and GPU nodes responsible for storing large data sets and accelerating machine learning.

Essentially, this will enable enterprises to expand memory capacity by adding further memory and CXL devices without having to purchase costly server components.

The cluster can also be scaled to data center levels, the company said, thereby reducing overall costs. The solution also supports connectivity between different types of CXL devices and is able to connect hundreds of devices within a single system.
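To make that scaling idea concrete, here is a minimal, purely illustrative Python sketch — not Panmnesia's software or any real CXL API — that models a switch pooling memory devices alongside GPU nodes. The device names and sizes are hypothetical; the point is simply that pooled capacity grows by attaching more devices behind the switch rather than by buying whole new servers.

```python
from dataclasses import dataclass, field

# Illustrative model only: names and capacities are hypothetical,
# not taken from Panmnesia's product specifications.

@dataclass
class CXLMemoryDevice:
    name: str
    capacity_gb: int

@dataclass
class GPUNode:
    name: str
    local_memory_gb: int

@dataclass
class CXLSwitch:
    """Toy stand-in for a CXL switch fanning out to many devices."""
    memory_devices: list[CXLMemoryDevice] = field(default_factory=list)
    gpu_nodes: list[GPUNode] = field(default_factory=list)

    def attach_memory(self, device: CXLMemoryDevice) -> None:
        # Capacity grows by adding devices, not by adding servers.
        self.memory_devices.append(device)

    def pooled_capacity_gb(self) -> int:
        return sum(d.capacity_gb for d in self.memory_devices)

if __name__ == "__main__":
    switch = CXLSwitch(gpu_nodes=[GPUNode("gpu-0", 80), GPUNode("gpu-1", 80)])
    for i in range(4):
        switch.attach_memory(CXLMemoryDevice(f"cxl-mem-{i}", 512))
    print(f"Pooled CXL memory: {switch.pooled_capacity_gb()} GB")  # 2048 GB
```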

The costs of such an endeavor could be untenable

While drawing upon 100PB of RAM may look like overkill, in the age of increasingly demanding AI workloads, it’s not exactly out of the question.


In 2023, Samsung revealed it planned to use its 32GB DDR5 DRAM memory die to create a whopping 1TB DRAM module. The rationale behind this move was to help contend with increasingly large AI workloads.

While Samsung is yet to provide a development update, we do know the largest RAM units Samsung has previously produced were 512GB in size.

First unveiled in 2021, these were aimed for use in next-generation servers powered by top-of-the-range CPUs (at least by 2021 standards), including the AMD EPYC Genoa CPU and Intel Xeon Scalable ‘Sapphire Rapids’ processors.

This is where costs could be a major inhibiting factor for the Panmnesia cluster, however. Pricing on comparable products, such as the Dell 370-AHHL memory modules at 512GB, currently stands at just under $2,400.

That would require significant investment from an enterprise by any standards. If one were to harness Samsung’s top-end 1TB DRAM module, the costs would simply skyrocket, given their expected price last year stood at around $15,000.
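As a rough back-of-the-envelope check, the short Python sketch below multiplies the per-module prices quoted above out to 100PB. It is an assumption-laden estimate: it covers memory modules only, uses binary units, and deliberately ignores the GPU nodes, CXL switches, servers, cabling and volume discounts that would make up the rest of the cluster's multi-billion-dollar bill.

```python
# Rough cost estimate for 100PB of RAM built from off-the-shelf modules.
# Per-module prices are the figures cited above; servers, switches and
# everything else needed for a full cluster are deliberately excluded.

TARGET_PB = 100
GB_PER_PB = 1024 * 1024  # binary units

def memory_cost(module_gb: int, module_price_usd: float) -> tuple[int, float]:
    """Return (module count, total cost in USD) to reach TARGET_PB of RAM."""
    modules = (TARGET_PB * GB_PER_PB) // module_gb
    return modules, modules * module_price_usd

for label, size_gb, price in [("512GB module (~$2,400)", 512, 2_400),
                              ("1TB module (~$15,000)", 1024, 15_000)]:
    count, total = memory_cost(size_gb, price)
    print(f"{label}: {count:,} modules, ~${total / 1e9:.2f} billion in memory alone")
```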

More from TechRadar Pro

  • Want to have access to 96TB (yes, terabytes) of RAM? This CXL expansion box shows what the future of memory looks like
  • With AMD's fastest mobile CPU, 64GB RAM and a pair of OLED screens, GPD Duo may be the best mobile workstation ever
  • We've rounded up the best mini PC choices around