
Prevent Data Cholesterol from Clogging Your Enterprise Applications

Keeping too much data can have dire consequences

What were you doing when the clock struck midnight on January 1, 2000? If you had a job managing enterprise systems, you were probably toasting yourself for having upgraded those systems in time for the big deadline—and I’m sure you deserved that glass of bubbly. After all, you prevented your company from getting stuck on legacy applications that might not have worked as of 1/1/2000.

Perhaps unwittingly, you also helped usher in a golden age of enterprise resource planning (ERP) software. In the late 1990s, many organizations made large-scale investments in ERP solutions in part to avoid potential Y2K problems with legacy systems.

Those initial ERP investments—though based on weak return on investment (ROI) justifications—are now generating an enormous ROI 10 to 12 years later. So if you implemented an ERP in the late 1990s, you averted the Y2K crisis and delivered ongoing benefits to your organization. But are you ready for Y2K Part Two?

Too much of a good thing?

All those gigantic ERP implementations set the stage for another crisis that’s now looming on the horizon. After companies purchased their new ERP systems, they transferred countless gigabytes of enterprise data into these systems and launched an unprecedented wave of data collection that continues today. Most major companies now store their data in massive relational databases that can be accessed by employees across the enterprise using standard systems.

Having centralized, accessible corporate data is a good thing. Data is the raw ingredient of analysis. Analysis drives strategic decisions. Strategic decisions lead to greater profits, heightened competitive advantage, and higher productivity.

But you can have too much of a good thing. After about 10 years of frenzied activity in moving data into ERP systems, IT departments are now dealing with the effects of what can best be described as data cholesterol.

Today’s ERP crisis: Data cholesterol

Data cholesterol is a condition in which the excessive buildup of data leads to sluggishness across your production systems. It extends to nonproduction data copies and affects the way all data is managed. Just as too much cholesterol in the human body can lead to serious health problems, data cholesterol hinders the smooth functioning of enterprise systems. It slows response times to customer service requests and report queries. It drags out testing cycles. It ripples through everything you do in IT, forcing you to devote more labor to supporting your infrastructure. And it can expose your corporation to needless litigation.

None of the leading enterprise software vendors have provided an easy way for customers to archive or purge their data. Because they focused on creating integrated repositories—that is, on making it easy to get the data in—these vendors did not consider that customers might not want to keep that data forever. And given the complexities of ERP data models and their referential integrity, it is very difficult to pull out data without breaking something.
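
To make the referential-integrity problem concrete, here is a minimal sketch using Python's built-in sqlite3 module and an invented two-table schema (the customers and orders tables are illustrative assumptions, not any vendor's actual data model). A real ERP application links hundreds or even thousands of tables this way, which is exactly why pulling data out is so hard.

    import sqlite3

    # Invented two-table schema for illustration; a real ERP model has far more tables.
    conn = sqlite3.connect(":memory:")
    conn.execute("PRAGMA foreign_keys = ON")  # SQLite only enforces foreign keys when asked
    conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("CREATE TABLE orders ("
                 "id INTEGER PRIMARY KEY, "
                 "customer_id INTEGER NOT NULL REFERENCES customers(id), "
                 "total REAL)")
    conn.execute("INSERT INTO customers VALUES (1, 'Acme Corp')")
    conn.execute("INSERT INTO orders VALUES (100, 1, 2500.0)")

    # A naive purge of an 'old' customer fails: the orders table still references it.
    try:
        conn.execute("DELETE FROM customers WHERE id = 1")
    except sqlite3.IntegrityError as err:
        print("Purge blocked:", err)  # FOREIGN KEY constraint failed

    # Archiving has to walk the dependency graph in order: child rows first, then the parent.
    conn.execute("DELETE FROM orders WHERE customer_id = 1")
    conn.execute("DELETE FROM customers WHERE id = 1")
    print("Purge succeeded only after dependent rows were handled first.")

An archiving or purge process has to automate that children-first traversal across the entire application data model, without losing the business context needed to retrieve the records later.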

That’s why so many corporations now suffer from data cholesterol. Even midsized companies are amassing databases larger than one terabyte that are expanding at 30 to 70 percent each year. There are several reasons why databases of this size are not ideal (despite the fact that Moore’s Law keeps bringing down the price of hardware):

  • The current legal and regulatory environment creates potential liabilities when organizations keep data longer than is legally necessary. (Envision getting entangled in multimillion-dollar lawsuits.)
  • Maintaining predictable application performance is much tougher with large data volumes. (Imagine the system outages, frustrated employees, and lost customers.)
  • Although CPU and storage costs continue to decline, it’s not easy to free up IT budget to throw hardware at the problem. (Think of the boardroom arguments.)
  • At a time when CIOs must do more with less, the labor costs associated with managing terabyte-sized application environments are growing. (Do you want to beg your CFO for more money?)
  • The proliferation of production data in nonproduction systems (for development, testing, and training) exposes companies to data privacy liability. (Consider the devastating PR consequences.)
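
To put the 30 to 70 percent annual growth rates cited above in perspective, here is a back-of-the-envelope compounding sketch in Python. The one-terabyte starting point and five-year horizon are illustrative assumptions, not figures from any particular company.

    # Back-of-the-envelope compounding of database size (illustrative numbers only).
    start_tb = 1.0   # assumed starting size: one terabyte
    years = 5        # assumed planning horizon

    for rate in (0.30, 0.50, 0.70):  # the annual growth rates cited above
        size = start_tb * (1 + rate) ** years
        print(f"{rate:.0%} per year: {start_tb:.0f} TB grows to about {size:.1f} TB in {years} years")

Even at the low end, the database nearly quadruples over five years; at the high end it grows roughly fourteen-fold, which is why throwing hardware at the problem rarely keeps pace.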

Companies are turning to EDM

The painful combination of tighter IT budgets, data cholesterol buildup, and strict regulatory requirements has driven savvy companies to focus on how they manage their data. That's why they're applying the principles of enterprise data management (EDM) as they implement their data governance functions. EDM focuses on creating accurate, consistent, and lean data content and integrating it into business applications. Today's EDM solutions address the critical issues of data growth risk management; data privacy compliance; nimble test-data management; e-discovery; and application upgrades, migrations, and retirements, providing an effective way to avoid data cholesterol's potentially dire effects.

How is your company dealing with data cholesterol? Let us know in the comments.


Marc Hebert

Marc Hebert joined Estuate in December 2008 as its first Chief Operating Officer. Marc brings more than 30 years of IT industry experience, including 11 years at Oracle in a variety of vice president roles and as Oracle's first CIO. In 2010, 2011, and 2012, IBM awarded Marc the IBM Champion award in the Data Management segment, a recognition of his role in evangelizing IBM Information Management.