Handbook of Data Intensive Computing

Data-intensive computing refers to capturing, managing, analyzing, and understanding data at volumes and rates that push the frontiers of current technologies. Its central challenge is to provide the hardware architectures and the related software systems and techniques that can transform ultra-large data into valuable knowledge. The Handbook of Data Intensive Computing is written by leading international experts in the field; contributors from academia, research laboratories, and private industry address both theory and application. Data-intensive computing demands a fundamentally different set of principles than mainstream computing: data-intensive applications are typically well suited to large-scale parallelism over the data, and they also require an extremely high degree of fault tolerance, reliability, and availability. Real-world examples are provided throughout the book. The handbook is designed as a reference for practitioners and researchers, including programmers, computer and system infrastructure designers, and developers, and it can also be useful to business managers, entrepreneurs, and investors.

Excerpt from the book:

Table 29.1 The relational tables of some tablet products

Product    Manufacturer    OS     Hardware
iPad 2     Apple           iOS    wifi
TF101      ...

... techniques for different types of data sources is an important and interesting research problem [2, 9-11, 46, 48, 49].
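The point about large-scale parallelism over the data can be made concrete with a small sketch (not drawn from the handbook). The Python example below uses the standard multiprocessing module for a toy word count: the input is split into independent chunks, each chunk is counted in parallel (the map phase), and the partial counts are merged (the reduce phase). The task, the chunking scheme, and the worker count are illustrative assumptions only.

from collections import Counter
from multiprocessing import Pool

def count_words(chunk: str) -> Counter:
    # Map step: count word occurrences within one chunk of text.
    return Counter(chunk.split())

def parallel_word_count(text: str, workers: int = 4) -> Counter:
    # Split the input into roughly equal batches of lines, one batch per worker.
    lines = text.splitlines()
    chunk_size = max(1, len(lines) // workers)
    chunks = ["\n".join(lines[i:i + chunk_size])
              for i in range(0, len(lines), chunk_size)]

    # Map phase: process the chunks in parallel.
    with Pool(processes=workers) as pool:
        partial_counts = pool.map(count_words, chunks)

    # Reduce phase: merge the partial counts into a single result.
    total = Counter()
    for partial in partial_counts:
        total.update(partial)
    return total

if __name__ == "__main__":
    sample = "tablet data product\n" * 1000 + "tablet data\n" * 500
    print(parallel_word_count(sample).most_common(3))

Real data-intensive systems apply the same pattern across clusters of machines and add what this toy version omits: distributed storage, scheduling, and the fault tolerance, reliability, and availability that the description above emphasizes.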

Title: Handbook of Data Intensive Computing
Author: Borko Furht, Armando Escalante
Publisher: Springer Science & Business Media, 2011-12-10
