By Leon Katsnelson
How do you make sure you get the most value from business analytics? Not by empowering the tool users, but by enabling their internal clients, the consumers of the information.
But organizational structure, costs, onboarding challenges, and lack of coordination get in the way—and the problems multiply with the size of the organization. IBM is no different; it’s one of the largest organizations in the world, and its business intelligence (BI) processes and practices had become more and more complex, redundant, and expensive over time. In 2008 it set out to change that with a System z–hosted, cloud-based initiative named Blue Insight, and the company is now well on its way toward its goal of saving US$20 million over five years. How it got there is a lesson in effective management of technology, culture, resources—and restraint.
Like other giant companies, IBM was awash in tools in 2008: Hyperion, Actuate, Arcplan, Brio, and others were in wide use. Lawrence Yarter, chief architect of the IBM Worldwide Business Analytics Center of Competence (BACC), notes that analytics investments were driven by business needs but were not coordinated. “We had a lot of BI strategies; it was essentially ‘roll your own’ within business unit within geography,” he says.
Every internal organization—from product development to sales and marketing, finance, and beyond—produced reports. Each organization had its own BI tools and infrastructure, each was funded with its own budget, and each was driven by its own metrics and priorities. From all appearances, the situation was only going to deteriorate. IBM’s many internal CIOs had BI investments on their radar, and most involved new hardware and software acquisitions. As many as 50 new installations—sizable ones—were already on the drawing board when Yarter and his team kicked off the Blue Insight project.
Yarter and his team began a six-month process to create a three-year plan with three watchwords: consolidate, virtualize, automate. It began with a simple set of objectives—centralized operational support and onboarding processes, common hardware and software, shared data, and a BACC to provide and promote the value of a service provider model to internal clients.
The overarching objective was to rationalize costs and improve effectiveness yet preserve the individual businesses’ ownership of their content, says Yarter. “It’s essential in any plan to respect the way business units want to function. They don’t want to lose the autonomy that lets them be effective at what they are being measured on: creating revenue.”
The planning process set the stage, and Yarter was ultimately able to use the big-picture perspective to identify opportunities for early successes that would help motivate the team and the clients.
Early results from the analysis revealed complexity that extended beyond product duplication. In six weeks, the portfolio analysis uncovered more than 50 multiproduct, departmental deployments, each with over 100 users, and running across more than 60 data sources—200,000 global named users in all.
Legacy portals had been deployed with custom authorization and authentication methods, each with custom code that needed to be updated and maintained. The resulting redundancy, lack of standards, absence of sharing, and opacity of costs were leading to inefficient systems that lacked agility and relied on multiple nonstrategic skill sets.
These metrics showed the scope of the problem, and they created a basis for documenting improvement as the project progressed. By paying attention and staying flexible, Yarter’s team was able to identify and incorporate opportunities to improve on the original plan. For example, the team revised the project goals to include a services delivery model that would simplify deployment to new audiences. This would help reduce costs and lower barriers to entry, which would likely make the needed changes an easier sell.
Business unit autonomy also guided decisions on staffing issues. If business managers were to have confidence in the outcome, they would want to keep their specialists on their own teams, not surrender them to a centralized authority. “We had many talented people with different skill sets—whatever transformation program emerged, I could not assume that I could move people around,” says Yarter. Since the title “analytics IT specialist” did not mean the same thing in every department, the BACC’s mandate to be a service provider, training center, and evangelist was crucial.
At every turn, Yarter made it a point to think about how the project would be perceived by local executives. One of the important socialization tactics that Yarter adopted was to be careful not to appear to provide a “solution.” Executives are responsible for their own success—they want to piece together their own answers. “If they feel they maintain autonomy,” says Yarter, “nobody fights you.”
In fact, defining standard services—creating a “data deli” that the business units could access—became the point of the spear for Yarter. This facilitated even more recognition of business unit autonomy within the final structure. “We agreed to maintain the hardware and put security around it,” says Yarter. “After that, we made a clear statement: ‘Data strategy for process owners is outside our purview. You decide that the data is trustworthy; then we’ll connect to it and use it.’ ”
This was a clear signal that Blue Insight was not yet about centralizing governance. Recognizing that the participants “owned” their data was not only useful politically, it also sped up the adoption process by eliminating lengthy negotiations and the creation of data “ownership standards”—at least for the time being. “We will facilitate that at some point,” says Yarter. “I can analyze the replication and redundancy, and as more participants join, we can add governance later. Right now, it would affect our ability to go fast.”
A few months later, the project was well under way. A dedicated infrastructure was in place, the cross-functional BACC was staffed, and the converged BI infrastructure—which includes IBM WebSphere Application Server, IBM HTTP Server, DB2, Java Runtime Environment 32-bit and IBM SDK for Java, and IBM Cognos 8 BI—had been built. Defined, standardized processes for onboarding participants were in place. The team set about migrating the identified business units into Blue Insight.
The trade-off for participants is clear: participation drives value. “The named user model, where participants sign up for services, has driven down costs for all,” says Yarter. Participants no longer need to buy new hardware and software from their own budgets, and where those assets were already in place, they shed the maintenance costs for both.
Even with the latest upgrades, Blue Insight costs about US$10 per user per year. That price point helps get past the difficult negotiations over standards and general availability: business units simply cannot build and deploy a comparably equipped environment of their own for less than what Blue Insight costs.
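As a rough, hedged back-of-the-envelope sketch using only the figures cited in the article (US$10 per user per year and the eventual 200,000-user base), the total annual chargeback for the consolidated service works out to about US$2 million:

```python
# Back-of-the-envelope cost sketch using figures cited in the article.
# These are illustrative inputs, not measured data.

cost_per_user_per_year = 10   # US$, per the article
target_users = 200_000        # projected user base at steady state, per the article

annual_chargeback = cost_per_user_per_year * target_users
print(f"Estimated annual chargeback: US${annual_chargeback:,}")
# → Estimated annual chargeback: US$2,000,000
```

Set against the company's stated goal of US$20 million in savings over five years, the scale of that chargeback helps explain why individual business units cannot match it on their own.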
Thus far, the project has exceeded expectations. The target was 55,000 users in the first year—a high bar that Yarter and his team beat by 31 percent, getting to 72,000 by the end of 2009. The new target is 120,000 users by the end of 2010 and 200,000 in 2011, at which point growth is expected to level off.
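The 31 percent figure follows directly from the numbers given, as this quick sanity check shows (all inputs come from the text):

```python
# Sanity check of the adoption figures cited in the article.

first_year_target = 55_000   # named users targeted for year one
first_year_actual = 72_000   # named users onboarded by end of 2009

overshoot_pct = (first_year_actual - first_year_target) / first_year_target * 100
print(f"Target beaten by {overshoot_pct:.0f}%")
# → Target beaten by 31%
```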
As of midyear 2010, Yarter reports that they have onboarded 113,000 users, a giant piece of the year’s 120,000-user goal. “Adopting executives represent all of our geographies and business process areas. We have more than 50 adopters in production, and user communities range from fewer than 50 users to more than 50,000 users,” Yarter says. He also expects to meet Blue Insight’s 2010 target allocation of the projected US$20 million savings in short order. “More than 80 percent of the projected savings for 2010 has already been booked,” says Yarter.
The softer goals are being met as well. Within IBM, adding users to the existing z-based system is now 13 times faster than creating new deployments on prior IBM System p–based stand-alone platforms. Because of the elasticity of the environment, Yarter says, “I don’t have to be concerned with whether resources are there.”
Blue Insight is now ubiquitous, always on, and well defined. Because it’s positioned as a service, not a solution, data preparation may need to occur—new constituents may want to add content and deliver it via a portal with a specific lens. Blue Insight need not become involved with that; new users understand the services offered. As a result, the time to value is much shorter. In this case, if they build it, they have already come. Lesson learned.