EE Times-India > Networks
Significance of SRAMs in nextgen IoT and wearables

Posted: 17 Dec 2014

Keywords: SRAM, processors, DRAMs, Flash

In the mid-90s, Intel's decision to incorporate SRAM into its processors spelled doom for stand-alone SRAM suppliers across the world. Overnight, the biggest market for SRAMs (PC cache) vanished, leaving only a few niche applications. The SRAM value proposition of being a high-performance memory (low access time, low standby power consumption) was severely restricted by its higher price and density limitations (the highest density available today is 288 Mb). Since SRAMs have four to six transistors per cell, it is impossible to compete with DRAMs and Flash memories (both have one transistor per cell); fewer transistors per cell translates to higher density and lower cost. Thus, for traditional storage applications – which comprise 98% of the market – SRAM is an impractical solution.
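The cost argument above can be tallied with a back-of-the-envelope sketch. The cell transistor counts (six for SRAM, one for DRAM) and the 288 Mb top-end density come from the text; the calculation itself is only illustrative:

```python
# Rough transistor-count comparison for a 288 Mb memory array,
# illustrating why SRAM cannot match DRAM on density or cost.
MEGABIT = 1 << 20
capacity_bits = 288 * MEGABIT        # highest SRAM density cited in the article

sram_transistors = capacity_bits * 6  # six-transistor SRAM cell
dram_transistors = capacity_bits * 1  # one-transistor DRAM cell

print(f"SRAM: {sram_transistors / 1e9:.2f} billion transistors")
print(f"DRAM: {dram_transistors / 1e9:.2f} billion transistors")
print(f"SRAM needs {sram_transistors // dram_transistors}x more transistors")
```

Since die cost scales roughly with area, and area with transistor count, a six-fold transistor overhead at the same capacity is what pushes SRAM out of commodity storage.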

Since Intel started embedding SRAMs, most SRAM suppliers have moved on, either by shutting shop or by diversifying their product portfolios beyond SRAMs. SRAM use shifted to specific applications requiring high performance, mostly in the industrial, automotive, and defence spaces. The overall market for SRAMs declined at a -13% CAGR (compound annual growth rate) from 2002 to 2013. However, it would be incorrect to assume that the technology is set to die. In fact, in the coming years we expect to see a revival of the good ol' stand-alone SRAM, driven by a variety of factors. In this article, we will discuss the technology advances that necessitate SRAMs, as well as the evolving trends in SRAM technology that make it ready to serve the needs of the future.
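To put the cited -13% CAGR in perspective, compounding it over the 2002-13 period shows how severe the decline was. The starting index of 100 below is a placeholder, not a market figure from the article:

```python
# Compound the article's cited -13% CAGR over 2002-13 (11 periods)
# against a hypothetical 2002 market index of 100.
years = 2013 - 2002          # 11 compounding years
cagr = -0.13

start = 100.0                # placeholder index, not real market data
end = start * (1 + cagr) ** years
print(f"2013 market at {end:.1f}% of its 2002 size")
```

At that rate, roughly four-fifths of the 2002 market had evaporated by 2013, which is why so many suppliers exited.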

The return of SRAMs to mainstream embedded design
The irony of the return to SRAMs is that it is being driven by a reversal of the very trend that sought to replace them. When Intel decided to embed SRAM, it was an intelligent course to take. Apart from being more cost-effective, it was also the technologically superior solution – embedded SRAMs offer better access time than external SRAMs, and access time is the most important factor for cache memories.

Between then and now, processors have become more powerful and have shrunk in size [1]. As processors become more powerful, they require commensurate improvements in cache memory. At the same time, increasing the embedded cache memory becomes a growing challenge with every new process node. SRAM has a six-transistor architecture (logic typically uses four transistors per cell). This means that at smaller process nodes, the number of transistors per square centimeter becomes extremely high (figure 1). Such high transistor density can cause many problems, including:

Increased susceptibility to soft errors: Soft error rates are expected to increase seven-fold [2] as process technology scales from 130nm to 22nm (figure 1).

Lower yields: Due to the shrinking of the bit-cells coupled with higher transistor density, the SRAM area is more susceptible to defects due to process variation. Such defects reduce the overall yield of the processor chip.

Increased power consumption: If the SRAM bit cells have to be the same size as the logic bit cells, then the SRAM transistors must be scaled smaller than the logic transistors. The smaller transistors cause an increase in leakage current [3], which in turn increases the standby power consumption.
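The density pressure behind all three problems can be sketched numerically. The six-transistor cell is from the article; the bit-cell areas below are rough illustrative assumptions, not vendor data:

```python
# Illustrative estimate of transistors per square centimeter for a
# six-transistor SRAM cell as the process node shrinks.
# Cell areas are assumed values for illustration only.
CELL_TRANSISTORS = 6
cell_area_um2 = {            # node -> assumed 6T bit-cell area (um^2)
    "130nm": 2.5,
    "65nm": 0.6,
    "22nm": 0.1,
}

UM2_PER_CM2 = 1e8            # 1 cm^2 = 1e8 um^2
for node, area in cell_area_um2.items():
    density = CELL_TRANSISTORS * UM2_PER_CM2 / area
    print(f"{node}: ~{density:.2e} transistors per cm^2")
```

Even with these placeholder areas, density grows by more than an order of magnitude across the shrink, which is what drives the soft-error, yield, and leakage concerns listed above.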


Figure 1: Soft error rate (SER) increases as the process node shrinks.




