Published: April 20, 2011
How many of you believe you have too much data? Do you even know how much you have? Or how many copies have been made? At what point is it too much?
Now that we have "Big Data"--does that mean our data centers need to be Super Sized to hold all that valuable information? Managing data growth is already a concern for many IT departments that are required to support business process improvements while controlling costs (according to ESG's 2011 Spending Intentions Survey). As data accumulates in databases, application performance deteriorates and more storage and compute resources must be added--forces that run directly counter to these corporate priorities.
I recently posted a Market Landscape report reviewing solutions that help organizations manage data growth. The report covers how technology can be applied to shrink data volumes or reduce the data footprint, but there is no silver bullet for managing data growth.
One could argue that deleting data is the answer. Deleting database data reduces data volumes and shrinks the data footprint--deleting takes the proverbial "bricks out of the database trunk."
So why not just delete? What would happen if you purged any data older than 7 years that hadn't been touched in the last 5? Just think of the savings--fewer database software licenses, freed-up server and storage resources, and DBAs able to focus on the data created today that supports mission-critical business processes.
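To make that kind of age-based purge policy concrete, here is a minimal Python sketch. The record structure, field names, and date cutoffs are all hypothetical assumptions for illustration; in practice you would run this logic against the database itself, and only after the retention requirements had been signed off.

```python
from datetime import datetime, timedelta

# Hypothetical retention rule: a record is a purge candidate if it was
# created more than 7 years ago AND has not been touched in 5 years.
CREATED_CUTOFF_YEARS = 7
ACCESSED_CUTOFF_YEARS = 5

def purge_candidates(records, now):
    """Return the subset of records eligible for deletion.

    Each record is a dict with 'created' and 'last_accessed' datetimes.
    """
    created_cutoff = now - timedelta(days=365 * CREATED_CUTOFF_YEARS)
    accessed_cutoff = now - timedelta(days=365 * ACCESSED_CUTOFF_YEARS)
    return [
        r for r in records
        if r["created"] < created_cutoff
        and r["last_accessed"] < accessed_cutoff
    ]

# Example: record 1 is old and stale; record 2 is old but still in use.
now = datetime(2011, 4, 20)
records = [
    {"id": 1, "created": datetime(2002, 1, 1),
     "last_accessed": datetime(2004, 6, 1)},
    {"id": 2, "created": datetime(2002, 1, 1),
     "last_accessed": datetime(2010, 6, 1)},
]
stale = purge_candidates(records, now)
print([r["id"] for r in stale])  # → [1]
```

Note that both conditions matter: age alone isn't enough, because old data that the business still touches is still earning its keep.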
I recently worked on a project where we deleted so much data that the organization avoided over $14 million in data center and resource costs. They nicknamed me "The Deleter"--because I like to delete.
Here's the catch: the only reason we were able to delete so much is that they had, on average, 14 copies of the same data. We started from the top, evaluating their records retention requirements and business processes and mapping data flows all the way down to the specific databases and tables where the data resided. We then did a bottom-up assessment quantifying how much data they had, what it was stored on, what it was costing them to retain, and what it would cost them if they didn't start cleaning house. It was an eye-popping exercise.
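The bottom-up math is simple but sobering. A hypothetical sketch of it, with entirely made-up figures (the 14-copy average is from the project above; the data volume and cost per terabyte are assumptions for illustration only):

```python
# Hypothetical bottom-up cost calculation: when each dataset exists in
# an average of 14 copies, the redundant copies dominate the bill.
PRIMARY_TB = 50            # assumed unique ("golden copy") data, in TB
AVG_COPIES = 14            # average total copies of each dataset
COST_PER_TB_YEAR = 3000.0  # assumed fully loaded $/TB/year

total_tb = PRIMARY_TB * AVG_COPIES      # everything actually stored
redundant_tb = total_tb - PRIMARY_TB    # everything beyond copy #1
annual_waste = redundant_tb * COST_PER_TB_YEAR

print(f"Total stored: {total_tb} TB")
print(f"Redundant copies: {redundant_tb} TB")
print(f"Annual cost of redundancy: ${annual_waste:,.0f}")
```

Even at modest assumed rates, 13 extra copies of everything turns a manageable storage line item into a seven-figure recurring cost--which is why the exercise was eye-popping.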
While this organization had historically been a data hoarder and its storage vendor's best friend, it didn't hit them until Oracle conducted an audit and the license true-up landed like a sucker punch to the face. Our project immediately took the shotgun seat right next to the New Data Center Build Out project. It turned out that if they had implemented a data management strategy from the beginning, they probably wouldn't have needed the second data center at all. Hindsight is 20/20.
Not every organization will have the luxury of deleting that much data--but that doesn't mean you shouldn't consider it. In the meantime, if you are suffering from too much data, take a look at my Market Landscape Report for Managing Data Growth. Remember, the only prize for having the Biggest Data Center is the pile of invoices from your vendors.
*All views and opinions expressed in ESG blog posts are intended to be those of the post's author and do not necessarily reflect the views of Enterprise Strategy Group, Inc., or its clients. ESG bloggers do not and will not engage in any form of paid-for blogging. Click to see our complete Disclosure Policy.