South Korean firm unveils faster AI data centre architecture with CXL-over-Xlink

July 17, 2025

South Korean company Panmnesia has introduced a new architecture for AI data centres aimed at improving speed and efficiency.

Instead of relying solely on PCIe or RDMA-based interconnects, its CXL-over-Xlink approach combines Compute Express Link (CXL) with high-speed accelerator links such as UALink and NVLink.

The company claims the design can deliver up to 5.3 times faster AI training and cut inference latency sixfold. Because CPUs and GPUs can access large shared memory pools over the CXL fabric, AI workloads are no longer constrained by the fixed memory capacity inside each GPU.
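
Panmnesia has not published a programming interface for the system, so the sketch below is only a hypothetical Python illustration of the pooling idea; the class and method names (MemoryTier, TieredAllocator, try_alloc, alloc) and the capacities are invented for the example. It shows a tensor being placed in local HBM when capacity allows and spilling into a fabric-attached pool instead of failing.

```python
# Hypothetical sketch of tiered allocation: place a tensor in local HBM
# when capacity allows, otherwise spill into a fabric-attached CXL pool.
# All names and capacities are invented for illustration; this is not a
# Panmnesia API.

from dataclasses import dataclass, field


@dataclass
class MemoryTier:
    name: str
    capacity_gib: float
    used_gib: float = 0.0

    def try_alloc(self, size_gib: float) -> bool:
        # Accept the allocation only if it fits in the remaining capacity.
        if self.used_gib + size_gib > self.capacity_gib:
            return False
        self.used_gib += size_gib
        return True


@dataclass
class TieredAllocator:
    local_hbm: MemoryTier    # one GPU's high-bandwidth memory
    cxl_pool: MemoryTier     # shared pool reached over the CXL fabric
    placements: dict = field(default_factory=dict)

    def alloc(self, tensor_name: str, size_gib: float) -> str:
        # Prefer local HBM for latency; fall back to the pooled tier
        # instead of failing when the GPU's fixed capacity runs out.
        if self.local_hbm.try_alloc(size_gib):
            tier = self.local_hbm.name
        elif self.cxl_pool.try_alloc(size_gib):
            tier = self.cxl_pool.name
        else:
            raise MemoryError(f"no capacity left for {tensor_name}")
        self.placements[tensor_name] = tier
        return tier


if __name__ == "__main__":
    alloc = TieredAllocator(
        local_hbm=MemoryTier("HBM", capacity_gib=80),
        cxl_pool=MemoryTier("CXL pool", capacity_gib=1024),
    )
    print(alloc.alloc("weights", 60))          # fits in HBM
    print(alloc.alloc("optimizer_state", 50))  # spills to the CXL pool
```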

The architecture also lets data centres scale compute and memory independently, adapting to changing workload demands without overprovisioning hardware.

Panmnesia's system further reduces communication overhead by carrying CXL traffic over accelerator-optimised links, maintaining high throughput at sub-100 ns latency.
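
To see why fabric latency matters for pooled memory, the short calculation below averages local and fabric-attached access times. The latency figures and access mix are illustrative assumptions, not published Panmnesia measurements.

```python
# Illustrative arithmetic only: the latency figures and the 80/20 access
# mix are assumptions chosen for the example, not measured values.

def effective_access_ns(local_fraction: float,
                        hbm_ns: float,
                        fabric_ns: float) -> float:
    """Average access time when a share of accesses crosses the fabric."""
    remote_fraction = 1.0 - local_fraction
    return local_fraction * hbm_ns + remote_fraction * (hbm_ns + fabric_ns)


if __name__ == "__main__":
    # Assume ~100 ns HBM access and 20% of accesses hitting the pooled tier.
    # A sub-100 ns fabric hop keeps the average close to local speed,
    # whereas a microsecond-class hop (typical of RDMA paths) dominates it.
    print(effective_access_ns(0.8, 100, 90))    # ~118 ns
    print(effective_access_ns(0.8, 100, 2000))  # ~500 ns
```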

The architecture incorporates a hierarchical memory model blending local high-bandwidth memory with pooled CXL memory, alongside scalable CXL 3.1 switches that connect hundreds of devices efficiently without bottlenecks.
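
As a rough illustration of how a switched fabric reaches hundreds of devices, the sketch below counts the endpoints of a hypothetical two-level leaf-spine arrangement; the port counts are assumptions made for the example, not CXL 3.1 or Panmnesia specifications.

```python
# Hypothetical two-level switch hierarchy: port counts are illustrative
# assumptions, not Panmnesia or CXL 3.1 specifications.

def leaf_spine_capacity(ports_per_switch: int, spine_switches: int) -> int:
    """Devices reachable when each leaf reserves one uplink per spine switch."""
    downlinks_per_leaf = ports_per_switch - spine_switches
    max_leaves = ports_per_switch  # each spine port feeds one leaf switch
    return max_leaves * downlinks_per_leaf


if __name__ == "__main__":
    # e.g. 32-port switches with 4 spines: 32 leaves * 28 downlinks = 896 devices
    print(leaf_spine_capacity(ports_per_switch=32, spine_switches=4))
```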
