Red Hat’s Big Data Strategy: First Rip Out the Hadoop File System
Whether Hadoop on GlusterFS performs at scale, and what counts as scale, remains to be seen
By: Maureen O'Gara
Feb. 22, 2013 07:45 AM
Red Hat, which has staked its fortunes on hybrid computing, sees Big Data as a “killer app for the open hybrid cloud.”
It sketched out the direction it plans to take with Big Data and the cloud the other day when it said it would open source its Hadoop plug-in (based on the Gluster File System, the open source widgetry Red Hat bought for $136 million in 2011) and hand it to the Apache Software Foundation sometime later this year, once the plug-in gets out of preview.
The company says “Red Hat Storage brings enterprise-class features to Big Data environments, such as geo replication, high availability, POSIX compliance, disaster recovery and management, without compromising API compatibility and data locality.”
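What "API compatibility" means in practice is that MapReduce jobs and other code written against Hadoop's generic FileSystem interface shouldn't have to change when the storage layer underneath does. The minimal sketch below illustrates the idea; the glusterfs:// scheme, the host name and the assumption that the plug-in registers itself as a Hadoop FileSystem implementation are illustrative, not details taken from Red Hat's announcement.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ListInput {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Only the default filesystem URI changes; code written against the
        // generic FileSystem API stays the same. Scheme and host are hypothetical.
        conf.set("fs.defaultFS", "glusterfs://gluster-host:9000");
        FileSystem fs = FileSystem.get(URI.create(conf.get("fs.defaultFS")), conf);
        // List an input directory exactly as you would on HDFS.
        for (FileStatus status : fs.listStatus(new Path("/user/hadoop/input"))) {
            System.out.println(status.getPath() + "\t" + status.getLen());
        }
    }
}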
And because Red Hat has joined its storage and virtualization at the hip, “Customers now have a unified data and scale-out storage software platform to accommodate files and objects deployed across physical, virtual, public and hybrid cloud resources.”
EMC, NetApp and Quantcast also have HDFS alternatives, and Quantcast's is open source.
Naturally Red Hat’s move comes with strings that tie back into its greater vision of hybrid cloud computing.