For years SanDisk has been presenting a memory roadmap (this post’s graphic is one rendition) that anticipates a move to ReRAM after 3D NAND has run through its natural life, which was expected to be as little as three generations. This has been backed by the idea that a 3D NAND stack would only be able to reach a certain number of layers before it would encounter difficulties caused by the need to etch a high aspect ratio hole through an increasing number of layers.
The aspect ratio issue is not hard to understand: assume that the hole through a 24-layer stack has an aspect ratio of 40:1. If the hole diameter stays the same, then a 32-layer hole would have an aspect ratio of about 53:1, and a 64-layer stack would be close to 107:1. Today’s technology starts to have trouble etching holes with an aspect ratio higher than 60:1.
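This linear scaling is easy to check with a few lines of Python. The 40:1-at-24-layers baseline comes from the assumption above; holding the hole diameter constant across generations is my own simplifying assumption:

```python
# Aspect ratio (depth : diameter) grows linearly with layer count
# if the hole diameter stays fixed.  Baseline: 40:1 at 24 layers.
def aspect_ratio(layers, base_layers=24, base_ratio=40):
    return base_ratio * layers / base_layers

for n in (24, 32, 48, 64):
    print(f"{n} layers -> {aspect_ratio(n):.0f}:1")
```

By this simple proportion the 60:1 etch limit is reached at about 36 layers, which is why a three-generation ceiling looked plausible.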
These high aspect ratios were thought to be the limiting factor that would prevent 3D NAND from continuing for more than three generations. 3D NAND could only have as many layers as the aspect ratio could support.
On a panel that I moderated at this year’s Flash Memory Summit one panelist, Dr. Myoung Kwan Cho of SK hynix, explained that although there is a limit …
How short is that list? Interestingly, Intel and Micron have different lists. The Micron list, shown in this post’s graphic (click to enlarge), cites seven types: “Ram” (showing a vacuum tube), PROM, SRAM, DRAM, EPROM, NOR flash, and NAND flash. Intel’s list adds magnetic bubble memory, making it eight. (Definitions of these names appear in another Memory Guy blog post.)
The Memory Guy finds both lists puzzling in that they left out a number of important technologies.
For example, why did Intel neglect EEPROM, which is still in widespread use? EEPROMs (or E²PROMs) are found in nearly every application that has a serial number (ranging from WiFi routers to credit cards), requires calibration (like blood glucose monitoring strips and printer ink cartridges), or provides operating parameters (e.g. the serial presence detect – SPD – in DRAM DIMMs), and they still ship in the billions of units every year. In its time EEPROM was an important breakthrough, and over the years it has had a much greater impact than PROM.
And, given that both companies were willing to include tubes, a non-semiconductor technology, why did both …
A July 13 Wall Street Journal article disclosed that China’s state-owned Tsinghua Unigroup has bid to buy Micron Technology for $21 a share or $23 billion, which would make this the largest-ever Chinese takeover of a U.S. company.
Objective Analysis has been telling our clients for the past few years that either China or India would create a new DRAM/NAND manufacturing company, especially since memory chip makers have enjoyed a long period of profits, and this usually motivates outsiders to invest in new DRAM makers. We did not anticipate an acquisition.
Countries with heavy industry typically move into the semiconductor business during an extended upturn, and become DRAM suppliers since DRAM is an undifferentiated commodity. Commodities sell almost solely on price and success is based on little more than manufacturing strength. This is a business model that industrial economies understand.
In addition to Micron’s tangible assets, including …
In a comment to a recent Memory Guy post I stated that NAND flash can reduce DRAM requirements, even in PCs. Some readers have told me that they wonder how this could be, so I will write this post to explain.
Some years ago Objective Analysis noticed that clever server administrators were able to use SSDs to reduce their systems’ DRAM requirements. Not only did this save them money, but it lowered power and cooling requirements as well.
Thinking that this might work on other kinds of computers, we commissioned a number of benchmarks to be performed on a PC.
These benchmarks found that once a system already has a certain minimum amount of DRAM, users get a bigger performance boost by adding a dollar’s worth of NAND flash than by adding a dollar’s worth of DRAM.
In every case the minimum amount of DRAM was very small.
There has been quite a lot of interest over the past few days about the apparently inadvertent disclosure by Intel of its server platform roadmap. Detailed coverage in The Platform showed a couple of slides with key memory information for the upcoming Purley server platform, which will support the Xeon “Skylake” processor family.
One slide, titled: “Purley: Biggest Platform Advancement Since Nehalem” includes this post’s graphic, which tells of a memory with: “Up to 4x the capacity & lower cost than DRAM, and 500x faster than NAND.”
The Memory Guy puzzled a bit about what this might be. The only memory chip technology today with a cost structure lower than that of DRAM is NAND flash, and there is unlikely to be any technology within the leaked roadmap’s 2015-2017 time span that will change that. MRAM, ReRAM, PCM, FRAM, and other technologies can’t beat DRAM’s cost, and will probably take close to a decade to get to that point.
Since that’s the case, what is this mystery memory? If we think of memory systems, rather than memory chips, we can come up with one very plausible answer. Intel may be very …
A lone inventor has developed a data compression algorithm that defies the theoretical “Shannon Limit”. The press hasn’t covered this recent news, even though it has dramatic implications, probably because the technique is so very arcane. The inventor is none other than the great-great-great-granddaughter of Herman Hollerith, the inventor of the tabulated punch card.
The algorithm reduces most of the data while converting the remaining information into as many ones as possible. This not only shrinks storage requirements and costs, but in the case of flash memory, it also has an important impact on total power. Flash is erased by setting all bits to ones, and bits are written by either leaving them alone (one) or by changing them (zero). The fewer zeros in the code, the less energy required to change the bits. Energy is also saved during an erase, since fewer bits need to be brought back to the erased state.
To explain the algorithm in its simplest terms, a byte of data is evaluated. If it has more zero bits than one bits the byte is inverted and an index bit is set to reflect this fact. Next, the four bits on either side of the byte are evaluated and if one has more zeros than ones it is inverted and another index bit is set. This process continues until …
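The first step spelled out above — invert any byte with more zero bits than one bits, and record the inversion in an index bit — can be sketched in a few lines of Python. This is my own minimal illustration of that one step (the function name is hypothetical, and the excerpt’s later steps are not shown); note that on its own it is a data-inversion code that raises the share of one bits for flash, not compression:

```python
def invert_zero_heavy_bytes(data):
    """For each byte, if it has more zero bits than one bits,
    invert it and set the matching index bit.  Returns the
    transformed bytes and the index bits.  Because inversion is
    its own inverse, the index bits suffice to recover the input."""
    out, flags = bytearray(), []
    for b in data:
        ones = bin(b).count("1")
        if ones < 4:                 # more zeros than ones
            out.append(b ^ 0xFF)     # invert to maximize one bits
            flags.append(1)
        else:
            out.append(b)
            flags.append(0)
    return bytes(out), flags
```

Recovery simply re-inverts every byte whose index bit is set, so the transform is lossless — though of course the index bits themselves are extra data, which is where the Shannon Limit quietly reasserts itself.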
The following is excerpted from an Objective Analysis Alert sent to our clients on March 26: On March 25 SanDisk and Toshiba announced sampling of their 3D NAND flash technology, a 128Gb (gigabit) 48-layer second-generation product based on the BiCS technology that the companies pioneered in 2007. Pilot production will begin in the second half of 2015 with meaningful production targeted for 2016. This release was issued at the same time that Intel and Micron were briefing the press and analysts for their March 26 announcement of their own 3D NAND offering (pictured), which is currently sampling with select customers, and is to enter full production by year-end. The Micron-Intel chip is a 32-layer 256Gb device, which the companies proudly point out is the densest flash chip in the industry.
Similarities and Differences
These two joint ventures (Intel-Micron and SanDisk-Toshiba) are taking very different …
Last week Micron and IBM announced that Micron would be IBM’s main supplier of NAND flash chips. The week before Micron announced a strategic agreement with Seagate to supply NAND flash. Why all this activity?
It comes down to today’s budding NAND flash shortage and the fact that suppliers tend to groom their customer lists when supplies get short.
Neither IBM nor Seagate represents the enormous opportunity that a major consumer electronics firm like Apple does. Since many NAND suppliers are very cost-focused, they look for customers that need very little support and purchase in high volumes.
IBM and Seagate require a lot of support, and, since they mostly ship enterprise flash systems or SSDs, they consume relatively small unit volumes of NAND flash chips.
These companies need to have an understanding of …
Last week Toshiba and SK hynix announced an agreement to jointly develop Nano Imprint Lithography (NIL), building on a memorandum of understanding (MOU) that the two companies signed in December of last year. Development efforts will begin this April and practical adoption is expected to start in 2017. The collaboration is expected to reduce risk and accelerate commercialization of this technology.
NIL is expected to produce next-generation lithography at high throughput rates more economically than established lithography tools. It should compete against Extreme Ultraviolet (EUV) lithography, an alternative technology whose use has been delayed by numerous technical challenges. EUV, a euphemism for X-Rays, cannot use transmissive optics like glass lenses, so a completely new reflective imaging technology has had to be developed to support its use. The advantage of EUV is that the light wavelength is only 13nm, which is an order of magnitude smaller than the 193nm light currently used to produce leading-edge chips, allowing it to print significantly smaller features.
Unlike today’s lithography, which uses a purely photographic process, NIL mechanically stamps a pattern into the photoresist in a similar manner to the sealing wax stamp shown in the photo (courtesy of BackToZero, a wax stamp maker). The stamp is produced using …
SanDisk co-founder Eli Harari was awarded the National Medal of Technology and Innovation on November 20 by President Obama. The medal, which was bestowed upon Dr. Harari in a White House ceremony, is the United States’ highest honor for scientific and technological achievement, and recognizes those whose lasting contributions have created a greater understanding of the world and improved the lives of many.
Harari co-founded SanDisk more than 25 years ago with the vision that flash memory would be used to store data in mobile products, a vision that initially took root in photography in the 1990s and has since become the fastest-growing …