Crucial has started shipments of its fastest and highest-density server-class memory modules to date. Crucial’s 128 GB DDR4-2666 LRDIMMs are compatible with the latest memory-dense servers. These modules should be usable in both AMD EPYC and Intel Xeon systems; however, Crucial states that they are optimized for Intel’s Xeon Scalable CPUs (Skylake-SP) launched earlier this year, and are aimed at mission-critical, RAM-dependent applications. Due to the complexity of such LRDIMMs, and because of their positioning as ultra-dense memory, the price is very high.

Crucial’s 128 GB LRDIMMs are rated to operate at a 2666 MT/s interface speed with CL22 timings at 1.2 V. The modules are based on Micron’s 8 Gb DRAM ICs, which are made using 20 nm process technology and assembled into 4Hi stacks using TSVs. Each LRDIMM uses 36 of these stacks. Stacking naturally makes the organization of the module very complex: this is an octal-ranked LRDIMM featuring two physical ranks with four logical ranks each. Making such a module run at 2666 MT/s is a challenge, so the modules end up running at relatively high latencies (higher than the CL17 – CL20 specified by JEDEC for DDR4-2666). This somewhat diminishes the benefit of the relatively high clocks, but it is not surprising given the need to keep such dense modules stable.
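
As a quick sanity check on the organization described above, here is a minimal Python sketch of the capacity math; the x4 device width and standard 72-bit ECC data path are assumptions on our part rather than figures Crucial quotes.

    # Capacity sanity check for a 36-package, 4Hi-stacked, 8 Gb-per-die LRDIMM.
    # Assumptions (not stated by Crucial): x4 DRAM devices and a standard
    # 72-bit ECC data path, so 64 of every 72 bits hold user data.
    DIE_DENSITY_GBIT = 8     # Micron 8 Gb DDR4 dies
    DIES_PER_STACK = 4       # 4Hi TSV stacks
    STACKS_PER_MODULE = 36   # DRAM packages on the LRDIMM

    raw_gbit = DIE_DENSITY_GBIT * DIES_PER_STACK * STACKS_PER_MODULE
    raw_gb = raw_gbit / 8            # 144 GB including ECC bits
    usable_gb = raw_gb * 64 / 72     # 128 GB visible to the host

    print(f"raw: {raw_gb:.0f} GB, usable: {usable_gb:.0f} GB")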

The key advantage of 128 GB LRDIMMs is their density. For example, a dual-socket Xeon Scalable platform using the M-suffixed processors, with 12 memory slots per socket, can double its maximum memory capacity per socket from 768 GB to 1.5 TB by using 128 GB LRDIMMs instead of 64 GB LRDIMMs. For DRAM-dependent applications, such as large in-memory databases, keeping the entire working set in memory is critical for performance. Obviously, such an advantage comes at a price.
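
The capacity arithmetic is straightforward; a minimal sketch follows, assuming 12 DIMM slots per socket (six channels with two DIMMs per channel).

    # Per-socket capacity with 12 DIMM slots (assumed: 6 channels x 2 DIMMs/channel)
    SLOTS_PER_SOCKET = 12

    for dimm_gb in (64, 128):
        print(f"{dimm_gb} GB LRDIMMs -> {SLOTS_PER_SOCKET * dimm_gb} GB per socket")
    # 64 GB LRDIMMs  ->  768 GB per socket
    # 128 GB LRDIMMs -> 1536 GB (~1.5 TB) per socket, or ~3 TB in a dual-socket system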

Specifications of Crucial's Server 128 GB DDR4-2666 LRDIMM
  Module           Capacity   Latencies   Voltage   Organization
  CT128G4ZFE426S   128 GB     CL22        1.2 V     Octal Ranked

According to Crucial, production of 128 GB LRDIMMs involves 34 discrete stages with over 100 tests and verifications, making them particularly expensive to manufacture. These costs are passed on to customers buying such modules. The company sells a single 128 GB DDR4-2666 module online for $3,999, though server makers naturally get them at different rates based on quantity and support. At this price, fully populating a Xeon-SP socket (12 DIMMs, 1.5 TB) comes to roughly $48k in memory alone, while populating an EPYC socket (16 DIMMs, 2 TB) comes to roughly $64k. At these rates, spending $13k or $4k on a CPU suddenly becomes a smaller part of the initial hardware cost (which is some justification for high-priced CPUs). At the Xeon-SP launch, Intel stated that fewer than 5% of its customers would use high-memory-capacity server configurations, though given how big the server market is, that is still a sizeable portion.
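
For reference, the per-socket memory cost works out as follows at the $3,999 list price; the DIMM counts per socket (12 for Xeon-SP, 16 for EPYC) are the usual platform maximums, and actual OEM pricing will differ.

    # Memory cost per socket at the quoted $3,999 list price per 128 GB module.
    PRICE_PER_MODULE = 3_999

    sockets = {
        "Xeon-SP (12 DIMMs, 1.5 TB per socket)": 12,
        "EPYC (16 DIMMs, 2 TB per socket)": 16,
    }
    for name, dimms in sockets.items():
        print(f"{name}: ${dimms * PRICE_PER_MODULE:,}")
    # Xeon-SP: $47,988 (~$48k per socket); EPYC: $63,984 (~$64k per socket)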

Crucial says that its 128 GB LRDIMMs are compatible with Xeon Scalable-based servers from various OEMs, as well as software like Microsoft SQL Server and SAP HANA. We would expect them to be compatible with AMD EPYC servers too; however, Crucial again stresses that they are optimized for Intel. Each module is tested individually to ensure maximum reliability for mission-critical applications. Specific support for these modules will be down to individual OEMs, as with other memory.

Source: Crucial

Comments

  • versesuvius - Sunday, December 3, 2017 - link

    Providing RGB would put that beyond the reach of even the most enterprising of enterprises.
  • msroadkill612 - Sunday, December 3, 2017 - link

    So the takeaway is a big win for EPYC: 16 DIMM sockets vs 12 for Intel.

    Epyc is less inclined to push users into expensive larger modules.
  • HStewart - Sunday, December 3, 2017 - link

    If I need that much memory and would spend that much money on memory, why would I skimp on the CPU with EPYC? I would go with the real stuff and get a top-of-the-line Intel CPU.
  • brucethemoose - Sunday, December 3, 2017 - link

    Intel also has 4P servers if you really, really need as much memory in 1 spot as possible. That's 48 DIMMs vs 32 for Epyc.
  • Pork@III - Sunday, December 3, 2017 - link

    Samsung has been shipping 128GB 2400 MT/s DDR4 modules since the beginning of 2016.

    M386AAK40B40-CUC Samsung 128GB PC4-19200 DDR4-2400MHz ECC Registered CL17 288-Pin Load Reduced DIMM 1.2V Octal Rank Memory Module
    Mfg Part No: M386AAK40B40-CUC
    $1,947.58 after 25% Cyber Week Discount. Availability: In Stock
  • Guspaz - Monday, December 4, 2017 - link

    It’s almost like you’re comparing the street price during a sale of a slower product to the MSRP of a newer and faster product.
  • Pork@III - Tuesday, December 5, 2017 - link

    Faster? By what percentage? Theoretically 11% (2666 vs 2400), but with that much higher latency (CL22 vs CL17) I don't think so. A ~9-10% gain for double the price...
