Monday, July 14, 2014

Using SSD As A Foundation For New Generations Of Flash Databases - Nati Shalom

“You just can't have it all” is a phrase most of us are accustomed to hearing, and one that many still believe holds true when discussing the speed, scale, and cost of processing data. High-speed data processing requires more memory resources, which drives up cost, because RAM is, on average, far more expensive per gigabyte than commodity disk drives. The idea that data systems cannot reliably provide both scale and fast access—let alone at the right cost—has long been debated, and the notion of such limitations was cemented by computer scientist Eric Brewer, who introduced the CAP theorem.

The CAP Theorem and Limitations for Distributed Computer Systems

Through this theorem, Brewer stated that it is impossible for any distributed computer system to provide all three of the following guarantees simultaneously:


  • Consistency (Every node sees the same data at the same time; a read returns the most recent write) 
  • Availability (Every request receives a response, even if it is not the most recent data) 
  • Partition Tolerance (The system continues to operate even when network failures prevent some nodes from communicating with each other)
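
The trade-off among these three guarantees can be illustrated with a small sketch (hypothetical code, not from the article): two replicas are separated by a network partition, and the cluster must either refuse requests (consistent but unavailable) or keep answering with possibly stale data (available but inconsistent).

```python
# Hypothetical two-replica cluster illustrating the CAP trade-off.
# All names here (Replica, Cluster, prefer_consistency) are assumptions
# made for this sketch, not part of any real database API.

class Replica:
    def __init__(self, name):
        self.name = name
        self.value = None


class Cluster:
    def __init__(self, prefer_consistency):
        self.a = Replica("A")              # replica that accepts writes
        self.b = Replica("B")              # replica that serves reads
        self.partitioned = False           # True = replicas cannot communicate
        self.prefer_consistency = prefer_consistency

    def write(self, value):
        """Write to replica A; replication to B requires the network."""
        if self.partitioned and self.prefer_consistency:
            # CP choice: refuse the write rather than diverge.
            raise RuntimeError("unavailable: cannot replicate during partition")
        self.a.value = value
        if not self.partitioned:
            self.b.value = value           # synchronous replication succeeds

    def read_from_b(self):
        if self.partitioned and self.prefer_consistency:
            # CP choice: refuse the read rather than serve stale data.
            raise RuntimeError("unavailable: replica may be stale")
        return self.b.value                # AP choice: may be stale


# CP system: stays consistent, but writes fail during a partition.
cp = Cluster(prefer_consistency=True)
cp.write("v1")
cp.partitioned = True
try:
    cp.write("v2")
except RuntimeError as e:
    print(e)                               # unavailable during partition

# AP system: always answers, but replica B can return stale data.
ap = Cluster(prefer_consistency=False)
ap.write("v1")
ap.partitioned = True
ap.write("v2")                             # accepted by A only
print(ap.read_from_b())                    # stale value "v1", not "v2"
```

Under a partition, the CP cluster sacrifices availability and the AP cluster sacrifices consistency; neither can avoid choosing, which is exactly what the theorem claims.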

