Tuesday, October 17, 2006

"EtherRAM" speeds data center I/O

Startup Gear6 emerges from stealth mode today [Oct 16] with a novel concept for boosting data center performance: rackable RAM cache systems that sit on an Ethernet network. The CacheFx appliance creates a scalable pool of shared semiconductor memory that fills a gap between an individual server’s main memory and the data center’s storage-area network. Future products will sit on Fibre Channel and iSCSI networks.

Even over an Ethernet link, access to the cache appliance is 10 to 50 times faster than fetching data from a storage network, the company claims.
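
Conceptually, the appliance is a look-aside cache tier between a server's own main memory and the storage network: reads that miss in local RAM go first to the shared RAM pool over Ethernet, and only fall back to the SAN on a miss there. Gear6 has not published its software interface, so the sketch below is only a rough illustration of that read path, with the class name, methods, and dict-like tiers all assumed for the example rather than taken from the product.

    class TieredReader:
        """Look-aside read path: local RAM -> shared network RAM cache -> SAN.

        The tier names and interface here are assumptions for illustration;
        Gear6 has not published how its software actually works.
        """

        def __init__(self, local_cache, network_cache, storage):
            self.local_cache = local_cache      # dict-like: this server's main memory
            self.network_cache = network_cache  # dict-like: shared RAM pool on Ethernet
            self.storage = storage              # dict-like: the storage-area network

        def read(self, block_id):
            # 1. Cheapest path: the server's own main memory.
            if block_id in self.local_cache:
                return self.local_cache[block_id]

            # 2. Shared RAM pool: one Ethernet round trip, still far
            #    cheaper than going to the disk-backed storage network.
            if block_id in self.network_cache:
                data = self.network_cache[block_id]
                self.local_cache[block_id] = data
                return data

            # 3. Miss everywhere: fetch from the storage network and
            #    populate both cache tiers on the way back.
            data = self.storage[block_id]
            self.network_cache[block_id] = data
            self.local_cache[block_id] = data
            return data

In a sketch like this, the first read of a block pays the storage-network price and later reads are served from RAM; the appeal of a shared pool is that the warm tier can be far larger than any single server's memory.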

A principal engineer from Google shared the stage with Intel Corp. CTO Justin Rattner at the Intel Developer Forum in September and noted the need for something to fill that performance gap.

The startup’s secret sauce is the software that makes the systems easy to use. Paving the way toward a possible eventual acquisition, Gear6 has already forged partnerships with Sun Microsystems and Network Appliance to ensure their systems interoperate with the CacheFx boxes. --rbm

3 comments:

Anonymous said...

Hi, I was looking over your blog and didn't quite find what I was looking for. I'm looking for different ways to earn money... I did find this though... a place where you can make some nice extra cash secret shopping. I made over $900 last month having fun!
make extra money

Anonymous said...

You should get a tool that filters out SPAM from blogs.

Anonymous said...

For workloads that use a lot of relatively static or long-lived content, a fabric-attached cache would provide benefit, but the number of such use cases is likely limited. Even then, is the cache really going to provide that much benefit once the overhead is taken into account?

Technically, some application still has to access the content, process it, and then forward it to the end consumer. Even the video-on-demand case cited requires a server to download content blocks at the receiver's rate, and if multiple consumers are accessing the same data, serving it from a cache isn't much of a stretch.

Other applications, such as data warehousing and decision support, require large amounts of data to be read and processed. While the fabric cache can compensate for some of the speed mismatch, its value there would be limited.

The company's website does not provide much material on its secret sauce, so it is difficult to judge whether this has potential beyond niche markets.
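
One rough way to frame the overhead question raised above is expected read latency: the hit rate times the cache latency, plus the miss rate times the cache lookup plus the storage access. The figures below are illustrative assumptions, not Gear6 or vendor measurements.

    def effective_latency_us(hit_rate, cache_us, storage_us):
        """Expected read latency for a look-aside cache.

        On a hit only the cache round trip is paid; on a miss the cache
        lookup *and* the storage access are paid. All figures here are
        assumptions for illustration, not vendor data.
        """
        return hit_rate * cache_us + (1.0 - hit_rate) * (cache_us + storage_us)

    # Assumed figures: ~200 us for a RAM cache reached over Ethernet,
    # ~5,000 us for an access that goes all the way to the storage network.
    for hit_rate in (0.5, 0.8, 0.95):
        print(hit_rate, round(effective_latency_us(hit_rate, 200, 5000)), "us")

Under those assumed numbers the pool only delivers a large win at high hit rates, which is consistent with the point about relatively static, long-lived content.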

 
interconnects