Looking for a sound investment in the wake of the global financial crisis? Then talk to your investment adviser...
about solid state disks (SSDs), a technology taking consumer and enterprise electronics markets by storm.
Enthusiasm for the technology is driven by several factors, foremost among them the fact that SSDs read data faster than conventional hard disks. They are also more physically resilient, thanks to their lack of moving parts. In these environment-conscious times, their low power consumption is a bonus, while their small size and low heat emissions add further green credentials, leaving disks that rely on spinning platters eating SSDs’ digital dust.
While SSDs are still more expensive than hard disks per gigabyte of storage, their input/output (I/O) capacity per gigabyte exceeds that of today’s disks. For applications that demand swift retrieval of data, SSDs can therefore sometimes be more cost-effective than conventional disks.
These qualities mean that IDC forecasts 200% compound annual growth in the SSD market from 2007 to 2011, according to its study “Worldwide Solid State Drive 2007-2011 Forecast and Analysis”.
How do SSDs deliver this performance? The devices are based on NAND gates, basic silicon circuits that generally store a single bit of information. When assembled into various configurations, NAND gates can be used to perform logical operations. In SSDs, however, the gates are not asked to get smart. Instead, they simply sit side by side with millions of identical gates which together store data.
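The point that NAND gates can either compute or, in bulk, store can be made concrete. Here is a minimal Python sketch of a NAND gate and, since NAND is functionally complete, the other basic gates that can be wired up from it (the function names are purely illustrative):

```python
def nand(a: int, b: int) -> int:
    """A NAND gate: output is 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

# NAND is "functionally complete": assembled in the right
# configurations, it reproduces every other logic gate.
def not_gate(a: int) -> int:
    return nand(a, a)

def and_gate(a: int, b: int) -> int:
    return not_gate(nand(a, b))

def or_gate(a: int, b: int) -> int:
    return nand(not_gate(a), not_gate(b))
```

In an SSD, of course, the gates do none of this: each simply holds its one bit alongside millions of neighbours.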
NAND is not the fastest type of silicon circuit that can be used for Flash memory; that honour goes to the NOR gate.
“With NAND, the disk needs to go through a sequence of bits to reach the bit it wants to write,” explains Intel Australia’s Sean Casey. “With NOR, a disk can get to every bit simultaneously.”
But NAND gates are cheaper to make than NOR, and more of them can be crammed onto a piece of silicon, resulting in smaller chips. NOR Flash therefore tends to be reserved for jobs that need fast random access, such as holding firmware code, rather than for bulk storage.
Chipmakers have also tweaked NAND gates and the electronics that surround them to make the technology more suitable for use in large-scale storage devices.
“One of the things we do is to build 10 channels of NAND into an SSD,” Casey says. “You put a bunch of NAND chips in parallel to get the read/write capability: we can support 32 concurrent operations.”
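The parallelism Casey describes can be sketched in Python. The channel count comes from his quote; the page-read latency, the striping scheme and all the function names here are illustrative assumptions, not Intel’s actual design:

```python
import concurrent.futures
import time

CHANNELS = 10           # the article cites 10 NAND channels per SSD
READ_LATENCY_S = 0.02   # hypothetical per-page read time

def read_page(channel: int, page: int) -> bytes:
    """Simulate one NAND page read on a given channel."""
    time.sleep(READ_LATENCY_S)
    return bytes([channel]) * 4  # dummy page contents

def read_striped(num_pages: int) -> list[bytes]:
    """Stripe page reads across all channels so they run in parallel."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=CHANNELS) as pool:
        futures = [pool.submit(read_page, p % CHANNELS, p)
                   for p in range(num_pages)]
        return [f.result() for f in futures]

start = time.monotonic()
pages = read_striped(10)  # ten reads, one per channel
parallel_time = time.monotonic() - start
# Because the ten reads overlap, the elapsed time is close to one
# READ_LATENCY_S rather than the ten it would take serially.
```

The design point is the one in the quote: throughput comes from keeping many chips busy at once, not from making any single chip faster.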
Other SSD makers have similar hacks ticking away inside their machines. The result is that these disks are blazingly fast at reading data.
Helping these tricks along is a controller that sits between the collection of NAND memory and the SATA interface, while a Flash file system also does a lot of work in the background.
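One of the controller’s background jobs can be illustrated with a sketch. Because NAND pages cannot be rewritten in place, the controller keeps a map from logical sectors to physical pages and redirects every overwrite to a fresh page. This toy Python version is a hypothetical illustration, not any vendor’s design:

```python
class TinyFTL:
    """A toy flash translation layer: the controller's mapping from
    logical sector numbers to physical NAND pages."""

    def __init__(self, num_pages: int):
        self.pages = [None] * num_pages  # simulated physical NAND pages
        self.mapping = {}                # logical sector -> physical page
        self.next_free = 0               # naive free-page cursor

    def write(self, sector: int, data: bytes) -> None:
        # NAND pages cannot be overwritten in place, so every write,
        # including an overwrite, goes to a fresh page.
        page = self.next_free
        self.next_free += 1
        self.pages[page] = data
        self.mapping[sector] = page      # the old page becomes stale
        # (a real controller also garbage-collects stale pages and
        # wear-levels writes across the chips)

    def read(self, sector: int) -> bytes:
        return self.pages[self.mapping[sector]]

ftl = TinyFTL(num_pages=8)
ftl.write(0, b"v1")
ftl.write(0, b"v2")  # the overwrite lands on a new physical page
```

Garbage collection and wear levelling, the parts elided in the comment, are exactly the background work the Flash file system and controller are called on to do.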
Today’s Flash drives are also offered in multi-level cell (MLC) or single-level cell (SLC) configurations.
The former distinguishes four voltage levels per cell rather than two, enough to store two bits instead of one. This arrangement doubles the information each NAND cell can hold. The trade-off is slower write speeds. Single-level cell SSDs are a simple on/off proposition and are faster than their multi-level cousins.
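A two-bit multi-level cell distinguishes four voltage levels, which is how one cell comes to hold two bits. The difference can be sketched in Python; the bit encodings below are illustrative (the MLC ordering is one possible Gray code, not any particular chip’s):

```python
def decode_slc(level: int) -> str:
    """SLC: two voltage levels per cell, one bit."""
    return ["1", "0"][level]  # erased cells conventionally read as 1

def decode_mlc(level: int) -> str:
    """Two-bit MLC: four voltage levels per cell, two bits.
    The ordering here is one illustrative Gray code."""
    return ["11", "10", "00", "01"][level]

levels = [0, 1, 1, 0]                              # four sensed cells
slc_bits = "".join(decode_slc(l) for l in levels)  # 4 bits
mlc_bits = "".join(decode_mlc(l) for l in levels)  # 8 bits, same cells
```

The same number of cells yields twice as many bits with MLC, but telling four levels apart when programming a cell is what makes MLC writes slower.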
Single-cell SSDs are therefore destined for enterprise storage arrays, while the slower but more compact multi-cell SSDs will likely find their way into drives for consumer electronics and laptop computers.
In either scenario, SSDs are simply packaged up and have the appropriate interface attached so they can be dropped into either a computer or a storage array.
PART TWO: Are SSDs unreliable?