We’ll have a number of Q&As coming your way in the days and weeks to come. Here’s one to kick things off — it’s with Red Hat’s Scott Clinton. Scott is senior director of product management & marketing for Storage and Big Data.
Storage has come a long way since magnetic tape…and, given the seismic shift to software and virtualization, its journey is just beginning. Right before the kick-off to this year’s Red Hat Summit in San Francisco, we sat down with Scott Clinton, who leads product management and product marketing for Red Hat Storage, to discuss the ins and outs of storage as well as some of the most important trends impacting it.
Q: Why has storage become such a hot topic?
A: There has really been a transformation around storage. It’s gone from a vertically integrated, locked-down approach, which has held IT back, to becoming innovative, open, and agile. One of the driving forces behind this is the growth of big data. Before, much of the data we collected was structured and resided in a data warehouse. Now, that data is largely unstructured. But the cost of holding this new type of data on traditional storage devices is prohibitive. This means we need new ways to store and access this information. There’s also the increasing use of open-source analytic tools like Hadoop, which are creating new ways to pull the value out of this information.
Q: Recently Gartner identified software-defined storage as the No. 2 most important tech trend for 2014. What exactly is software-defined storage and why is it important?
A: There have been a lot of definitions and they often come from the point of view of where a vendor started. Just like virtualization started with compute, the storage side of the world needs to start with the data. Some vendors talk about software-defined but what they are really talking about is virtualizing existing systems.
Software-defined storage is really about being able to truly decouple the data from wherever it’s stored, whether it’s a physical piece of hardware or running in the cloud. This gets around the big challenge IT organizations face: moving data from one storage platform to another. Software-defined storage eliminates that challenge. The data becomes portable.
But what makes storage really interesting is when we combine both applications and data in the same infrastructure, running applications directly on the storage nodes. That’s something we’ve talked about for years, and it’s creating a whole new way of thinking about your big data infrastructure. It enables you to run jobs where the data is.
So storage has really moved from being this dumb device where you dump data to being a smart platform.
Q: We’re just beginning to see storage’s full potential, right?
A: Yes, mining the business value in all that unstructured data is still in its early days. Right now, most companies are simply making sure that they keep the data they used to throw out. They might not really understand where this data exists or have the ability to store attributes of the data, which can be used to do analytics. At Red Hat we have the unique ability to store the metadata with the data itself, so you become smarter about what your data is and what attributes it possesses.
In the end data becomes smarter, which hasn’t been the case in the past.
Q: How else is Red Hat addressing this new disruptive power of storage?
A: This year when Gartner looked at the key disruptors of storage, they identified four: open source, which plays directly into where we are creating value for customers; software-defined storage; flash; and the cloud. We are providing a platform to help companies capitalize on all four of these innovations. We are one-third the cost of some traditional storage solutions. It’s really a sea change.
There are a lot of software-defined solutions out there that aren’t truly open, and if they are open they usually have a proprietary component to them. But that hybrid model is really closed. The big value that Red Hat brings to the market is delivering a truly open solution to software-defined storage.