Red Hat Ceph Storage is a proven, petabyte-scale object storage solution designed to meet the scalability, cost, performance, and reliability challenges of organizations that serve rich media at large scale. Designed for web-scale object storage and cloud infrastructures, Red Hat Ceph Storage delivers the scalable performance necessary for rich-media and content-distribution workloads.
While most of us are familiar with deploying block or file storage, object storage expertise is less common. Object storage is an effective way to provision flexible and massively scalable data storage without the arbitrary limitations of traditional proprietary or scale-up storage solutions. Before building object storage infrastructure at scale, organizations need to understand how to best configure and deploy software, hardware, and network components to serve a range of diverse workloads. They also need to understand the performance and scalability they can expect from given hardware, software, and network configurations.
This combined reference architecture and performance and sizing guide describes Red Hat Ceph Storage coupled with QCT (Quanta Cloud Technology) storage servers and networking as object storage infrastructure. Testing, tuning, and performance are described for both large-object and small-object workloads. This guide also presents the results of tests conducted to evaluate how well the configurations scale to host hundreds of millions of objects.
After hundreds of hours of [Test ⇒ Tune ⇒ Repeat] exercises, this reference architecture provides empirical answers to a range of performance questions surrounding Ceph object storage, such as (but not limited to):
What are the architectural considerations before designing object storage?
What networking is most performant for Ceph object storage?
What does performance look like with dedicated vs. co-located Ceph RGWs?
How many Ceph RGW nodes do I need?
How do I tune object storage performance?
What are the recommendations for small/large object workloads?
What should I do when I have millions of objects to store?
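On that last question, one concrete tuning knob the guide's workload testing touches is bucket index sharding: RGW splits a bucket's index across shards so that buckets holding millions of objects don't bottleneck on a single index object. The sketch below is illustrative only, assuming the commonly cited rule of thumb of roughly 100,000 objects per index shard; the constant and helper name are ours, not from the guide, and the right value for a real cluster should come from testing like that described above.

```python
import math

# Rule-of-thumb objects per bucket index shard (assumption, not a hard limit).
OBJECTS_PER_SHARD = 100_000

def recommended_index_shards(expected_objects: int) -> int:
    """Suggest a starting shard count for a bucket expected to hold
    expected_objects objects (e.g. for rgw_override_bucket_index_max_shards)."""
    return max(1, math.ceil(expected_objects / OBJECTS_PER_SHARD))

# A bucket sized for 25 million objects would start at 250 shards.
print(recommended_index_shards(25_000_000))  # → 250
```

A value like this would then be validated empirically, following the same Test ⇒ Tune ⇒ Repeat approach the guide uses, since small-object and large-object workloads stress the index very differently.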
Strata+Hadoop World 2015 is underway in San Jose, CA, and Red Hat Storage is on the scene. Today, Brian Chang chats with Greg Kleiman, director of storage and big data at Red Hat, to learn what news Greg is expecting to hear at the show and what Red Hat is up to. Watch the video or read on for a summary.
Continue reading “Storage & Big Data Tutorial – Live at Strata+Hadoop World with Greg Kleiman”
One of the ongoing challenges of IT is managing the never-ending demand for more data processing and information storage. One organization facing this challenge was the Metro de Madrid. Its IT infrastructure managed 45-50 terabytes of information, divided into two main blocks: high-speed storage over fiber optic channels and network-based storage. Responsible for the administration and maintenance of Metro de Madrid's operational computer systems, such as train traffic, energy management, audio systems, traveler information systems, and more, the team required:
Continue reading “Keeping the trains on time: Improved scalability, capacity, availability and zero downtime on 90% of critical systems for Metro de Madrid”
by Irshad Raihan, Red Hat Storage – Big Data Product Marketing
The trusty paper shredder in my home office died last week. I'm in the market for a new one. Years ago, when I purchased "Shreddy" (of course, it had a name) after a brief conversation with a random store clerk, choices were few and information scarce. In fact, paper shredders weren't really considered standard personal office equipment as they are today. Most good shredders were built for offices, not homes. Back in the market more than a decade later, it's clear that the search for a new shredder is going to be trickier than I had imagined.
A paper shredder is a lot like big data.
Continue reading “What Can a Paper Shredder Teach Us About Big Data?”
by Irshad Raihan, Red Hat Storage – Big Data Product Marketing
Digital data has been around for centuries in one form or another. Commercial tabulating machines have been available since the late 1800s, when they were used for accounting, inventory, and population censuses. Why, then, do we label today the Big Data age? What dramatically changed in the last 10-15 years that has the entire IT industry chomping at the bit?
More data? Certainly. But that’s only the tip of the iceberg. There are two big drivers that have contributed to the classic V’s (Volume, Variety, Velocity) of Big Data. The first is the commoditization of computing hardware – servers, storage, sensors, cell phones – basically anything that runs on silicon. The second is the explosion in the number of data authors – both machines and humans.
Continue reading “The Data Life Cycle Has Changed. Are You Ready?”
We’ll have a number of Q&As coming your way in the days and weeks to come. Here’s one to kick things off — it’s with Red Hat’s Scott Clinton. Scott is senior director of product management & marketing for Storage and Big Data.
Continue reading “A Q&A with Red Hat’s Scott Clinton”
Summit Spotlight: Don’t Miss These Storage Tracks and Sessions
Red Hat Summit kicks off this year from April 14-17 in San Francisco, CA. We've organized more than 150 breakout sessions, each with a unique solution focus, and attendees of all experience levels will see a variety of products, demos, customer success stories, and more.
Continue reading “Summit Spotlight: Don’t Miss These Storage Tracks and Sessions”
By Steve Bohac, Red Hat Storage Product and Solution Marketing
Open software-defined storage is transforming the way organizations tackle their data management challenges. More and more customers are realizing that an open, software-based approach can create opportunities to significantly reduce costs and efficiently contend with their exploding data landscape. Additionally, open software-defined storage solutions can help organizations discover new roles and value for enterprise storage.
Continue reading “Manageability Becoming A Key Component of Open, Software Defined Storage (Red Hat Storage Console Now Available!)”
Several weeks ago, we posted the blog “Open Software Defined Storage – Don’t Get Fooled By The False ‘Open’ and Get Locked-In Again”. Today’s entry is the conclusion of this four-part mini-series.
We understand how difficult it is to optimize your storage for innovation and growth, and our goal is to help enterprises on their journey to convert their data centers from cost centers into revenue-generators. Red Hat Storage Server has helped businesses of all varieties achieve their objectives. Here’s how open, software-defined storage has helped a few organizations get to the next level:
Continue reading “Red Hat’s Approach with Open, Software-Defined Storage (A Four Part Series)”