When first announced, 3D XPoint memory was largely seen as a high-speed caching technology to accelerate the performance of SSDs or hard drives in PCs.
As the “first new memory class” to ship in volume in decades, 3D XPoint was hailed as “revolutionary” by its creators – Intel and Micron – because depending on the application it offered 10X the performance of NAND flash and “1,000 times” the endurance.
Rather than a splash across industries, however, the new non-volatile memory launched by Intel as “Optane DC Persistent Memory” and Micron as QuantX made more of a kerplunk when first shipped by Intel two years ago.
While Intel insists it has sold “millions” of Optane SSDs, revenue from those sales is still lackluster.
“Given that the first shipments were in spring 2017, I would expect more,” said Jim Handy, director of research at Objective Analysis. “They’re losing a lot of money on Optane. It had darn-well-better catch on.”
While all the other NAND makers were basically printing money in 2017, Handy quipped, Intel was losing money in its Nonvolatile Solutions Group, which only sells NAND SSDs and Optane. That trend continued into 2018, and has worsened this year, he said.
“Intel’s SSDs were probably equally profitable to everyone else’s, so the loss comes after all of the NAND SSD profits got gobbled up. That’s a lot of money,” Handy said.
After originally planning to release its product two years ago, Micron now says it expects to ship QuantX before the end of this year.
Intel, which first targeted the PC and gamer community with its Optane memory by touting faster boot-ups and game load times, this week announced plans to ship its second-generation Optane memory (code-named Barlow Pass) in a DIMM form factor for use in big data analytics environments and by cloud service providers, alongside its third-generation quad-level cell (QLC, 4 bits per cell) 3D NAND SSDs. Both are expected to ship next year, Intel said.
Intel also announced plans for a second-generation Optane SSD, code-named Alder Stream.
Currently, there are four generations in development for Optane DC Persistent memory, according to Kristie Mann, Intel’s senior director of product management for non-volatile memory solutions.
Intel’s Optane DC Persistent Memory DIMM first arrived in April with the current Cascade Lake line of Xeon Scalable data center processors instead of with its Skylake processors as originally planned.
Micron’s upcoming QuantX product news and Intel’s latest plans for Optane are set against the backdrop of the two companies ending their 13-year memory development partnership created under their joint venture IM Flash Technologies LLC. Intel will hand Micron the keys to IM Flash Technologies’ fab facility in Lehi, Utah, and will in turn open its own fab in Rio Rancho, New Mexico. (Intel abruptly stopped using the 3D XPoint development name early this year.)
Intel’s DIMM Optane module, which was released in April in 128GB, 256GB and 512GB capacities, is now shipping in the servers of multiple OEMs; Intel also plans to expand Optane memory modules to workstations and then to client PCs.
“NAND has not kept up with capacity or compute power and created [a] gap between NAND and DRAM. Optane filled it,” said Frank Hady, Intel fellow and chief Optane systems architect. “We’re making [Optane] available as memory and high performance storage. That’s how we’re filling that growing gap between DRAM and NAND.”
Gartner Research agreed, stating in a recent report the 3D XPoint non-volatile memory represents a new opportunity for transforming data center environments.
With the growth of online transactional data processing, cloud computing, artificial intelligence and big data analytics, workloads will require higher performance storage. Enter 3D XPoint. Intel is selling it alongside its QLC SSDs, which use floating gate memory cells that Intel claims hold onto data longer than the charge trap cells commonly used in 3D NAND SSDs.
Floating gate technology, Hady said, will also enable its 3D NAND flash memory to be extensible to five bits per cell.
“Five-bit-per-cell technology would further increase density and reduce the cost beyond QLC. So, at the media level, the SSD level and all the way up into the system level we’re increasing density and decreasing costs to better handle data set size,” Hady said.
While there’s no way of knowing by how much Intel will slash prices for its next-generation Optane and QLC SSDs, Gartner’s Joseph Unsworth said he assumes it will be a minimum of 20%.
Optane’s key selling points are its performance and “persistence,” a fancy way of saying data remains stored even when the memory isn’t powered up. That makes it a storage-class memory like NAND flash, but vastly faster. At the same time, Optane is slower than DRAM, but also much less expensive.
“I feel like we’re changing the way the world manages data. Now, you can have multiple tiers of memory,” said Intel’s Kristie Mann.
Intel is positioning Optane to work in conjunction with its QLC 3D NAND memory. Intel this week announced its fourth-generation QLC flash memory, which increases its stacked structure from 96 to 144 layers; the denser NAND flash is expected to greatly impact both the capacity of SSDs and the price of the memory.
Optane is capable of running in two modes: Memory Mode, in which applications and the OS perceive the DIMM as a large pool of volatile memory that requires no software optimization (it acts as a less expensive, slower DRAM); and App Direct Mode, in which applications and the OS see it as non-volatile memory that can be used for loading and storing data.
Intel markets App Direct Mode for in-memory databases, in-memory analytics frameworks and “ultrafast storage applications.”
In App Direct Mode, the DRAM and the Intel Optane DC Persistent Memory are seen by the server as the total platform memory. In Memory Mode, the DRAM is used as a cache and does not appear as an independent memory resource, so it is not included in the total memory perceived by the OS.
For example, a platform with 1.536 TB of Intel Optane DC Persistent Memory and 192 GB of DRAM would register with the OS as 1.728 TB of total memory in App Direct, but only appear as 1.536 TB in Memory Mode, according to Alper Ilkbahar, general manager of Intel’s Data Center Memory and Storage Solutions.
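Ilkbahar's arithmetic can be sketched as a quick back-of-the-envelope calculation. This is purely an illustration of how the two modes report capacity; the function name and structure are my own, not an Intel API:

```python
def os_visible_memory_gb(optane_gb: int, dram_gb: int, mode: str) -> int:
    """Return the memory capacity the OS would report, per Intel's description.

    In App Direct Mode, Optane and DRAM both count toward platform memory.
    In Memory Mode, DRAM acts as a cache and only the Optane capacity is visible.
    """
    if mode == "app_direct":
        return optane_gb + dram_gb
    if mode == "memory":
        return optane_gb
    raise ValueError("mode must be 'app_direct' or 'memory'")

# The example from the article: 1,536 GB of Optane plus 192 GB of DRAM.
print(os_visible_memory_gb(1536, 192, "app_direct"))  # 1728 GB, i.e. 1.728 TB
print(os_visible_memory_gb(1536, 192, "memory"))      # 1536 GB, i.e. 1.536 TB
```

The DRAM simply disappears from the OS's accounting in Memory Mode because it has been repurposed as a transparent cache in front of the Optane DIMMs.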
“I think the persistence is really the key differentiator over the long run. And we make it adoptable and easy to use by end customers because we work with the software ecosystem to ensure the applications are enabled,” Mann said.
Adopting and adapting applications to use Optane is “medium to hard” for enterprises because it’s a new technology and there are perceived risks, according to Mann, who suggests IT managers start with proof of concepts and see if it will benefit a specific workload.
“Not every workload is going to see the same results,” Mann said. “What we’re seeing is most enterprises after a PoC move into pilot mode and then they go into scale. I think that’s a very pragmatic approach.”
Intel is currently working with 100 users who’ve deployed a PoC using Optane memory, with more than 500 users queued up and “waiting in the wings” to deploy it, Mann said.
“What we’re seeing is very high conversion rate to pilot. Given that we’ve only been in the market for a few months, it’s going to take us time to see how many of them will move into scale,” Mann said. “We’re seeing a lot more adoption in the cloud [service providers].”
Cloud service providers are adopting Optane because they’re able to optimize their own software, and don’t need to wait on software vendors for application enablement, Mann said.
Last year, for example, Google worked with Intel and SAP to offer Compute Engine VMs running Optane memory for SAP HANA workloads.
Alibaba, China’s largest online retailer, is deploying Optane in support of “real-time” data analytics.
Chinese search giant Baidu is building an in-memory database using Optane to increase the performance of search engine results used to feed its streaming data services.
In October, Microsoft also plans to go live with an Optane Memory and Storage Management application via its Azure cloud service.
Optane’s greatest opportunity is in hyperscale environments, or the very largest data center operators, such as ISVs, according to Joseph Unsworth, Gartner’s research vice president for semiconductors and NAND flash.
While hyperscale customers are expected to lead initial adoption, in-memory computing, virtualized environments and artificial intelligence will be leading application workload drivers, according to a Gartner report authored by Unsworth.
While Optane won’t convince companies to rip and replace all of their server DRAM, it will allow IT managers to cut costs by replacing some of it — while also augmenting the performance of their NAND flash-based SSDs.
System configurations will require a mix of both persistent memory and DRAM, but early indications are that the ratio will vary from 3:1 to 8:1, depending on the workload, according to Gartner.
“The ROI is bigger memory footprint that can be more economical than DRAM, favorable TCO, and potentially better application boot-up speeds, which has high availability and productivity implications,” Unsworth said. “As for the industries, it’s more about applications that demand high performance and customers with the advanced infrastructure to exploit the benefits.”
Optane is best positioned for industries such as finance, for high-frequency trading; natural resources, such as oil and gas; healthcare, for genomics; and government, for high-performance computing and AI/analytics.
“It’s really about which industries would benefit from higher levels of performance and where is the ecosystem optimized for it like SAP and its optimizations for Optane for its databases — that can span industries just like highly virtualized environments,” Unsworth said, referring to applications such as data analytics and artificial intelligence.
Optane is most efficient closest to a processor where there is little to no bottleneck from interface specifications. For example, NAND flash-based SSDs are limited in performance by the PCIe interface.
“This is why CXL [Compute Express Link] was announced by Intel for the next generation interconnect,” Unsworth added.
Optane will continue to find its best use case in the data center since it is about half the price of DRAM but seven times the cost of NAND, according to Gartner research.
The price of an Optane DIMM module compatible with a DDR4 slot is $6.57 per gigabyte, according to one published report. Depending on the retailer, the 128GB Optane DC Persistent Memory module is priced from $842 to $893, and the 256GB module is priced at $2,668 to $2,850, according to two retail sites: CompSource.com and ShopBLT.com. Intel’s 512GB module is selling for as much as $7,816, according to Tom’s Hardware.
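The retail figures above can be reduced to per-gigabyte prices with a quick calculation (a hypothetical sanity check using only the prices cited in the article; note the per-GB cost actually rises with capacity):

```python
# Retail price ranges (USD) for each module capacity, as cited in the article.
modules = {
    128: (842, 893),
    256: (2668, 2850),
    512: (7816, 7816),  # only a single high-end figure was reported
}

for capacity_gb, (low, high) in sorted(modules.items()):
    # Divide each end of the price range by the capacity to get $/GB.
    print(f"{capacity_gb} GB: ${low / capacity_gb:.2f}-${high / capacity_gb:.2f} per GB")

# The 128GB module works out to roughly $6.58-$6.98 per GB, close to the
# $6.57/GB figure in the published report; the 512GB module is over $15/GB.
```

The premium on larger modules suggests the quoted $6.57/GB figure applies to the entry-level capacity rather than the line as a whole.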
In 2017, Intel launched its first Optane product, an M.2 form factor expansion card that came in 16GB and 32GB capacities and was made available as an option for Intel-based client platforms. It connected to PC motherboards via storage slots using the PCIe/NVMe 3.0 x2 interface.
Intel positioned Optane at the consumer market, saying it would fill the gap between faster but more expensive DRAM and cheaper but slower non-volatile NAND flash memory or hard drives using SATA connections. The company also demonstrated that Optane memory cards performed 10 times faster than conventional SSDs.
Earlier this year, Intel launched its first Optane memory and solid-state drive (SSD) combo product, the Optane H10. Optane H10 combines the faster performance of Optane technology with the higher storage capacity of Quad Level Cell (QLC) 3D NAND technology in an M.2 form factor.
In 2016, Micron said it would market its 3D XPoint memory under the name QuantX, and its go-to-market plans were “completely separate” and distinct from Intel’s, according to Jon Carter, Micron’s vice president of emerging storage solutions.
The first generation of QuantX solid-state drives (SSDs) were to be aimed squarely at data center applications beginning in the second quarter of 2017, Carter said at the time. That never happened.
In fact, Micron had planned on garnering its first revenues from QuantX sales in the second half of 2017, with 2018 being a “bigger year,” and 2019 being the “break-out” revenue year.
One reason Micron has tapped the brakes on releasing QuantX may be that it has watched its partner Intel struggle to turn a profit with its Optane memory, according to Objective Analysis’s Handy.
“It’s pretty obvious if you look at their financials, which I did [at] the Flash Memory Summit,” Handy said, referring to the August event in Santa Clara, Calif. “It is most likely from the economies of scale. If you make a lot of something, then it’s pretty cheap to make. If you make only a few, the costs go up.”
This story, “Storage trends: What are the best uses for Optane?” was originally published by