CIO Review recently spoke with Jeff Cotrupe about issues holding back the adoption and deployment of Big Data solutions, and the result is a two-page article in the May 2014 issue. In one of the key sections of the piece, Cotrupe urged those deploying Hadoop to "seriously consider Hadoop, but get all the Hadoop you need." Said Cotrupe, "It is certainly not an exact analogy, but I liken Hadoop to my Firefox browser: I love it, but only with my favorite add-ons like Tab Mix Plus, FireShot, QuickTime, and Adobe Shockwave. Similarly with Hadoop, to get the most out of your data management solution you need modules that have been developed by other Apache Software Foundation working groups: modules such as Pig, to accommodate semi-structured data and simplify various tasks; Hive, to use Hadoop as an enterprise data warehouse (EDW); Sqoop and Flume, to import Web log data into a Hadoop Distributed File System (HDFS) and to integrate structured data into the mix; and Ambari, Whirr, ZooKeeper, and Oozie, to accelerate deployment, simplify ongoing management, manage workflow, and maintain data synchronization."
The May 2014 issue of CIO Review appears here, and the article featuring Cotrupe, "Big Data: Delivering More with Less," appears on pages 31 and 32.