

Big Data Security for Apache Hadoop
Ten Tips for Hadoop World Attendees

Big Data takes center stage today at the Strata Conference & Hadoop World in New York, the world’s largest gathering of the Apache Hadoop™ community. A key conversation topic will be how organizations can improve data security for Hadoop and the applications that run on the platform. Hadoop and similar data stores hold a lot of promise for organizations finally looking to extract value from the immense amount of data they're capturing. But HDFS, Hive and other nascent NoSQL technologies were not designed with comprehensive security in mind. Often, as big data projects grow, sensitive data such as HIPAA-protected health records, PII and financial records gets captured and stored. It's important that this data remain secure at rest.

I polled my co-workers at Gazzang last week and asked them to come up with a top-ten list for securing Apache Hadoop. Here's what they delivered. Enjoy:

Think about security before getting started – You don’t wait until after a burglary to put locks on your doors, and you should not wait until after a breach to secure your data. Make sure a serious data security discussion takes place before installing and feeding data into your Hadoop cluster.

Consider what data may get stored – If you are using Hadoop to store and run analytics against regulatory data, you likely need to comply with specific security requirements. If the stored data does not fall under regulatory jurisdiction, keep in mind the risks to your public reputation and potential loss of revenue if data such as personally identifiable information (PII) were breached.
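One lightweight way to act on this tip is to scan records for obvious PII patterns before they ever land in the cluster. The sketch below is illustrative only, assuming a simple regex-based check; real PII discovery needs far broader pattern coverage than the two rules shown here.

```python
import re

# Illustrative patterns only -- real PII detection needs many more rules.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def find_pii(record: str) -> list[str]:
    """Return the names of any PII patterns matched in a record."""
    return [name for name, pat in PII_PATTERNS.items() if pat.search(record)]

# Flag a record before feeding it into the cluster.
hits = find_pii("user=jdoe, ssn=123-45-6789, contact=jdoe@example.com")
```

A check like this won't catch everything, but it makes the "what might we be storing?" conversation concrete before ingestion rather than after a breach.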

Encrypt data at rest and in motion – Add transparent data encryption at the file layer as a first step toward enhancing the security of a big data project. SSL encryption can protect big data as it moves between nodes and applications.

As Securosis analyst Adrian Lane wrote in a recent blog, “File encryption addresses two attacker methods for circumventing normal application security controls. Encryption protects in case malicious users or administrators gain access to data nodes and directly inspect files, and it also renders stolen files or disk images unreadable. It is transparent to both Hadoop and calling applications and scales out as the cluster grows. This is a cost-effective way to address several data security threats.”
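The at-rest half of this tip is normally handled by a transparent encryption layer beneath Hadoop, not application code. Purely to illustrate the round-trip property such a layer provides, here is a toy sketch built from Python's standard library; the HMAC-keystream cipher below is a teaching aid, not something to use in production.

```python
import hashlib, hmac, secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy CTR-style keystream derived from HMAC-SHA256 -- illustration only."""
    out, counter = b"", 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with a fresh-nonce keystream; prepend the nonce."""
    nonce = secrets.token_bytes(16)
    ks = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Reverse encrypt(): split off the nonce, regenerate the keystream, XOR."""
    nonce, ct = blob[:16], blob[16:]
    ks = _keystream(key, nonce, len(ct))
    return bytes(a ^ b for a, b in zip(ct, ks))

key = secrets.token_bytes(32)
blob = encrypt(key, b"patient_id=1234,status=ok")
```

The point Lane makes holds in the sketch: the file on disk is unreadable without the key, while callers that go through the layer see plaintext as usual.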

Store the keys away from the encrypted data – Storing encryption keys on the same server as the encrypted data is akin to locking your house and leaving the key in your front door. Instead, use a key management system that separates the key from the encrypted data.
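The pattern behind this tip can be sketched in a few lines: the data node holds only a key reference, and the key itself lives behind a policy check on a separate service. The `KeyServer` class below is a stand-in for a real key management system; its names and the boolean policy check are illustrative assumptions.

```python
import secrets

class KeyServer:
    """Stands in for a remote KMS: keys live here, never on the data node."""
    def __init__(self):
        self._keys = {}

    def create_key(self, key_id: str) -> None:
        self._keys[key_id] = secrets.token_bytes(32)

    def get_key(self, key_id: str, client_authorized: bool) -> bytes:
        # A real KMS would evaluate policy (host identity, time, process);
        # the boolean here is a placeholder for that check.
        if not client_authorized:
            raise PermissionError("client failed KMS policy check")
        return self._keys[key_id]

kms = KeyServer()
kms.create_key("hdfs-zone-1")

# The data node stores only the key ID "hdfs-zone-1", not the key bytes.
key = kms.get_key("hdfs-zone-1", client_authorized=True)

denied = False
try:
    kms.get_key("hdfs-zone-1", client_authorized=False)
except PermissionError:
    denied = True
```

Stealing a disk image from the data node then yields ciphertext plus a key ID, which is the whole point of separating the two.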

Institute access controls – Establishing and enforcing policies that govern which people and processes can access data stored within Hadoop is essential for keeping rogue users and applications off your cluster.
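At its core, this tip is a default-deny policy check on every path. A minimal sketch of that idea follows; the paths and principal names are made up, and a real deployment would enforce this with Kerberos authentication plus HDFS permissions or a policy engine rather than an in-process dictionary.

```python
# Illustrative default-deny ACL: access requires an explicit grant.
ACL = {
    "/data/finance": {"etl_service", "finance_analysts"},
    "/data/web_logs": {"etl_service", "marketing"},
}

def can_access(principal: str, path: str) -> bool:
    """Allow access only if the principal is explicitly granted on the path."""
    return principal in ACL.get(path, set())
```

Note the behavior on an unknown path: no entry means no access, which is what keeps rogue users and applications off the cluster by default.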

Require multi-factor authentication – Multi-factor authentication can significantly reduce the likelihood of an account being compromised or access to Hadoop data being granted to an unauthorized party.
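The most common second factor is a time-based one-time password (TOTP, RFC 6238), which can be implemented with nothing but the standard library. This is a minimal sketch of the algorithm itself, verified against the RFC's published test vector, not a full MFA enrollment flow.

```python
import hashlib, hmac, struct, time

def totp(secret: bytes, timestamp=None, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 time-based one-time password (SHA-1 variant)."""
    if timestamp is None:
        timestamp = time.time()
    counter = int(timestamp // step)
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B vector: secret "12345678901234567890", T=59s -> "94287082"
code = totp(b"12345678901234567890", timestamp=59, digits=8)
```

Requiring a code like this alongside a password means a leaked credential alone no longer grants access to the cluster.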

Use secure automation – Beyond data encryption, organizations should look to DevOps tools such as Chef or Puppet for automated patch and configuration management.
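What makes Chef and Puppet safe for this job is idempotence: you declare desired state, and each run converges to it, changing nothing when the system is already compliant. A hedged Python sketch of that pattern, using file permission bits as the stand-in resource (the file and mode are illustrative):

```python
import os, stat, tempfile

def ensure_mode(path: str, mode: int) -> bool:
    """Converge a file to the desired permission bits.
    Returns True only if a change was made (config-management style)."""
    current = stat.S_IMODE(os.stat(path).st_mode)
    if current == mode:
        return False          # already compliant: no-op
    os.chmod(path, mode)
    return True

# Demo: lock a config file down to owner read/write only.
fd, cfg = tempfile.mkstemp()
os.close(fd)
os.chmod(cfg, 0o644)
changed_first = ensure_mode(cfg, 0o600)   # drifted -> converges
changed_second = ensure_mode(cfg, 0o600)  # already compliant -> no-op
```

Running the same recipe on every node, on every run, is what keeps patch levels and configurations from silently drifting apart across a growing cluster.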

Frequently audit your environment – Project needs, data sets, cloud requirements and security risks are constantly changing. It’s important to make sure you are closely monitoring your Hadoop environment and performing frequent checks to ensure performance and security goals are being met.
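Frequent checks are easiest to sustain when they are scripted. As one small, repeatable example among the many checks an audit should include, the sketch below flags files whose permission bits allow world-read access; the file setup is only for demonstration.

```python
import os, stat, tempfile

def world_readable(paths):
    """Return the subset of paths readable by 'other' -- one simple,
    repeatable audit check among many a real audit would run."""
    return [p for p in paths
            if stat.S_IMODE(os.stat(p).st_mode) & stat.S_IROTH]

# Demo: one world-readable file, one locked down to its owner.
fd1, open_file = tempfile.mkstemp()
os.close(fd1)
os.chmod(open_file, 0o644)

fd2, locked_file = tempfile.mkstemp()
os.close(fd2)
os.chmod(locked_file, 0o600)

flagged = world_readable([open_file, locked_file])
```

Wiring checks like this into a scheduled job turns "frequently audit" from a good intention into a report that lands in someone's inbox.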

Ask tough questions of your cloud provider – Be sure you know what your cloud provider is responsible for. Will they encrypt your data? Who will store and have access to your keys? How is your data retired when you no longer need it? How do they prevent data leakage?

Centralize accountability – Centralizing the accountability for data security ensures consistent policy enforcement and access control across diverse organizational silos and data sets.

Did we miss anything? If so, please comment below, and enjoy Strata + Hadoop World.

About David Tishgart
David Tishgart is a Director of Product Marketing at Cloudera, focused on the company's cloud products, strategy, and partnerships. Prior to joining Cloudera, he ran business development and marketing at Gazzang, an enterprise security software company that was eventually acquired by Cloudera. He brings nearly two decades of experience in enterprise software, hardware, and services marketing to Cloudera. He holds a bachelor's degree in journalism from the University of Texas at Austin.
