March 20, 2003

http://www.intelligententerprise.com/030320/605feat1_2.shtml

BI's Promised Land

Performance management's value transcends that of business intelligence.
Can Six Sigma techniques extend that value even further?

By Erik Thomsen

Like Moses leading his tribe out of Egypt toward the land of milk and honey, business intelligence (BI) analysts, associations, and vendors over the last year have attempted — once again, after many tries — to lead the tribes of BI consumers toward their version of the Promised Land.

Instead of the original milk and honey, BI consumers are promised the representation and management of processes, not just data. IDC, S.G. Cowen Securities, and Meta Group use the term business performance management (BPM) to describe the process of managing organizational performance, whereas Gartner Group uses the term corporate performance management (CPM) for the same thing. Not surprisingly, vendors have begun to adopt the same slogans. Cognos Inc., for example, falls into the CPM camp, while Hyperion Solutions Corp. has positioned itself as a BPM provider. However, because the underlying concepts, technology, and methods are relevant to any organization, I'll use the term organizational performance management (OPM).

OPM's emphasis on process does legitimately differentiate it from traditional BI, which emphasizes data and the querying of data. I can't imagine any members of the BI tribe arguing against the need for managing organizational performance. And in the months and years to come, I expect the operational definition of OPM to mature along with the available technology and solutions.

But is anybody else already delivering on these promises? Are there any other tribes in BI's Promised Land? In this article — the first of two installments — I'll describe how one tribe, which calls itself "Six Sigma," is at least close to this Promised Land (if not already in it), and why and how the essence of Six Sigma should be incorporated into OPM.

A Tale Of Two Tribes

The term Six Sigma is a buzzword meant to evoke images of BI's Promised Land. Specifically, "Six Sigma" is shorthand for an organizational process that's performing at such a high level of quality that the customer-defined limits of acceptability for the result of that process lie six standard deviations away from the mean outcome. This translates into only 3.4 unacceptable outputs, or defects, per million repetitions of the process, meaning that the process is operating at 99.99966 percent of perfection. (See Table 1.)
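Under the hood, that figure is just a normal-distribution tail probability. Here's a minimal sketch in Python, assuming the conventional 1.5-standard-deviation allowance for long-run drift in the process mean (the convention under which the 3.4-per-million figure is usually derived; the function name and the shift parameter are illustrative, not anything defined in this article):

    import math

    def defects_per_million(sigma_level: float, mean_shift: float = 1.5) -> float:
        """Approximate defects per million opportunities (DPMO) for a given
        sigma level, allowing the usual 1.5-sigma long-run drift of the mean.
        Only the tail beyond the nearer specification limit is counted;
        the far tail is negligible at these levels."""
        survival = lambda z: 0.5 * math.erfc(z / math.sqrt(2))  # P(Z > z) for a standard normal
        return 1_000_000 * survival(sigma_level - mean_shift)

    for level in (3, 4, 5, 6):
        print(f"{level}-sigma process: about {defects_per_million(level):,.1f} defects per million")
    # The 6-sigma line prints roughly 3.4, the figure quoted above.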

Business intelligence, as a destination or purpose term, replaced decision support, executive information systems, and management information systems (and is itself on its way to being replaced by BPM, CPM, or OPM). Similarly, the term Six Sigma, which was coined at Motorola in the 1980s, publicly adopted by such companies as AlliedSignal and Cisco Systems, and made famous by Jack Welch at General Electric, has replaced — and to some degree competes with — other tribe-defining terms such as total quality management and continuous improvement.

In the same way that the real substance behind BI comprises differently named things such as relational and multidimensional databases, query optimization, and visualization, the substance behind Six Sigma derives from statistical process control and methods for organizational improvement that trace back to the early parts of the 20th century. (Key contributors included F.W. Taylor, who spearheaded the move for organizational efficiency by attempting to scientifically measure and improve all aspects of a process; R.A. Fisher, the leading statistical thinker behind both statistical process control and the design of experiments; and W.E. Deming and J.M. Juran, who applied Fisher's techniques to industrial settings, most notably post-WWII Japan.)

The difference between the BI tribe and the Six Sigma tribe is less about the destination than the origin. BI (and now OPM) began with finance, sales, and marketing managers in the financial and retail industries and is slowly moving to encompass at least the top third of employees across all organizational functions, including manufacturing, through ideas such as activity-based management. In contrast, Six Sigma began with statisticians and engineers in manufacturing firms and is slowly moving to encompass all the functions of an organization, including senior management, finance, and sales and marketing, as well as moving beyond manufacturing firms to encompass a range of industries.

According to Michael Edwards, senior statistician at Rexam (www.rexam.com), a multinational consumer packaging company with more than 20,000 employees, the company's initial successes using Six Sigma to manage its manufacturing have led to the adoption of Six Sigma techniques for management as well as for the design of information systems. Edwards says that Six Sigma techniques are so helpful to organizational management that, if it could do so, Rexam would start over by training its management team and then pushing the techniques out to the rest of the company.

Dan Thorpe, director of statistical resources at W.L. Gore & Associates (www.gore.com), the makers of Gore-Tex fabrics among many other innovative products, says that his company uses Six Sigma techniques in forecasting and financial performance, where the emphasis is the same as in manufacturing: separating the noise from what's real in a process.

Key Concepts And Methods

There are four key concepts and methods behind Six Sigma, but I'll only discuss the first two here (saving the rest for the next installment):

  • Think in terms of processes.
  • Measure and interpret processes against a backdrop of historically determined relative likelihoods.
  • Discover, and then think in terms of, the measurable attributes of the process that generated an outcome (the drivers of the outcome's measurable attributes), rather than thinking only in terms of the outcome's measurable attributes.
  • Improve processes through intentional experimentation with the drivers of measurable attributes of processes (or quality indicators). This method requires randomizing exogenous influences.

Thinking In Terms Of Processes

Manufacturing products is a process; developing software or producing a news program is a process; selling shoes, cars, insurance, or software is a process; creating a brand is a process; managing people is a process; predicting sales, costs, and operating profit is a process; improving a customer's happiness or value function is a process; deciding whether to buy a company is a process. Anything that can be described using action, motion, or activity terms within a sequence of steps that occupies some space(s) can be thought of as a process.

Thinking in terms of processes sets the stage for you to look at entire sets of interconnected events with an emphasis on the factors that explain, drive, or cause outcomes rather than just the outcomes themselves.

Take sales, for example. Within a BI context, organizations typically measure and report how many products are sold in any time period (and by location, channel, brand, and so on). Within a Six Sigma context, by contrast, an organization would think about sales as the outcome of a sales process. The organization may initially track only the measurable attributes of that outcome (how well sales are doing, for example). But if it wants to improve product sales, whether by increasing the number of products sold or by lowering the cost of sales, it can't directly manipulate the outcome or the fact that sales came in below forecast (not without cooking the books, anyway). Rather, it can only act on aspects of the sales process: the way the product is marketed, the way sales calls are followed up, the way demos are given, or the way the company works with its channel partners.

Within a process-centric framework, Six Sigma combines a statistically informed way of representing processes and a method for improving them in an iterative loop.

Framework For Representing Processes

There are two key steps to the representation of processes: representing the boundaries of acceptable and unacceptable process quality, and representing the inherent variability in the quality of those same processes.

Step one: Define what's acceptable. Where's the line between acceptable and unacceptable? And, is there one line or two? Some (unidimensional) process measurements, such as baking temperature for a ceramic process or driving speed in the curves for a race car at the Indy 500, have two boundaries: an upper and a lower. Other processes, such as customer service, may have just one.

Some acceptability boundaries, as is frequent in manufacturing, may be inherent in the process; others may be drawn by the customers (which is common among services-based businesses): How long is a customer willing to be kept on hold? How often is a customer willing to find a bug in your software, or your pizza?

Figure 1 shows the output of a sales forecasting process in the form of a graph where the upper and lower quality boundaries are defined to be where Plan is more than 10 percent above or below Actual. Because these forecasts are for consumption by the organization (as opposed to the financial markets), "beating the estimates" by a big margin isn't a good thing. That's because the estimates are used for capital budgeting, human resource planning, and other planning purposes, and it can be just as damaging to the organization to underplan as to overplan.

Step two: Measure how you're doing. Figure 2 fills in the graph from Figure 1; each point represents a monthly forecast. For each forecast point, it's easy to see (in hindsight) whether the forecast was acceptable or unacceptable.
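To make steps one and two concrete, here's a minimal sketch in Python that applies the plus-or-minus 10 percent acceptability band from the example above to a set of made-up monthly plan and actual figures (the numbers are purely illustrative, not taken from Figures 1 or 2):

    # Hypothetical monthly (plan, actual) sales figures, in thousands of dollars.
    monthly = [
        ("Jan", 1050, 1000), ("Feb", 980, 1020), ("Mar", 1200, 1010),
        ("Apr", 990, 1005), ("May", 1150, 995), ("Jun", 1030, 1045),
    ]

    LIMIT = 0.10  # acceptable band: Plan within +/-10 percent of Actual

    for month, plan, actual in monthly:
        error = (plan - actual) / actual  # signed forecast error
        verdict = "acceptable" if abs(error) <= LIMIT else "out of bounds"
        print(f"{month}: plan {plan}, actual {actual}, error {error:+.1%} -> {verdict}")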

Now, imagine you're the new CFO and you just asked one of your analysts how recent sales forecasts compared with their respective actuals. And let's say your analyst told you the last two plans averaged about 15 percent above actuals. What could you do with that information aside from grumble? The answer is: not too much.

Here's where Six Sigma is a great approach. To be able to do something with the knowledge that your most recent, and reconciled, forecasts were averaging about 15 percent over actuals, you need to know whether it was a random fluke caused by chance or whether it's a sign of a systematic problem. And to know that, you have to calculate the inherent variability in the sales forecasting process, which can be done either by measuring every sales forecasting instance (called the population), or by measuring a sample drawn from the population.
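That calculation needs nothing exotic. Here's a small sketch using Python's standard library on made-up historical forecast errors; pstdev treats the data as the entire population of forecasting instances, while stdev treats it as a sample drawn from that population:

    from statistics import pstdev, stdev

    # Hypothetical historical forecast errors, (Plan - Actual) / Actual, in percent.
    historical_errors = [2.1, -4.0, 1.5, 6.2, -3.3, 0.8, -1.9, 5.0, -2.7, 3.4]

    print(f"Population standard deviation: {pstdev(historical_errors):.2f}%")
    print(f"Sample standard deviation:     {stdev(historical_errors):.2f}%")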

For example, if the inherent variability in your sales forecasting process were very small (see Figure 3), the chance that two forecasts in a row differ from actuals by an average of 15 percent would be about one in 5,000! Thanks to your knowledge of the historical variability in your forecasting process, you can be pretty sure that some extra-systemic thing is affecting your otherwise high-quality sales forecasting process. Searching for whatever is affecting your sales forecasting process would be a rational response.

In contrast, if the inherent variability in your sales forecasting process were large (see Figure 4), the likelihood that two forecasts in a row differ from actuals by an average of 15 percent would be about one in two! In other words, your forecasting process is so poor that a series of two poor forecasts in a row had a 50 percent chance of occurring naturally. Here, a rational response would be to look for ways to reduce the variability in your sales forecasting process.
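The "one in 5,000" and "one in two" odds come from Figures 3 and 4, which aren't reproduced here, but the underlying arithmetic is just a tail probability for the average of two forecast errors. Here's a sketch that assumes a zero-mean, normally distributed forecast-error process; the two standard deviations (about 5.7 percent and 30 percent) are illustrative values chosen only because they land near the odds quoted above:

    import math

    def odds_avg_error_at_least(threshold_pct: float, sigma_pct: float, n: int = 2) -> float:
        """Probability that the average of n independent, zero-mean, normally
        distributed forecast errors lands at least threshold_pct away from
        zero on either side."""
        z = threshold_pct / (sigma_pct / math.sqrt(n))
        return math.erfc(z / math.sqrt(2))  # two-sided tail: 2 * P(Z > z)

    for label, sigma in (("low-variability process", 5.7), ("high-variability process", 30.0)):
        p = odds_avg_error_at_least(15.0, sigma)
        print(f"{label} (sigma = {sigma}%): about 1 in {1 / p:,.0f}")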

Most Six Sigma implementations provide a number of basic process scorecarding techniques that work within an event-based triggering system to alert the user to events, or series of events, that have an unusually low risk of occurring purely by chance. For example, four out of five consecutive observations falling two or more standard deviations above the mean, or six or more consecutive observations all increasing (or all decreasing), indicates that something systematic is happening and needs to be investigated. This stuff is useful for any OPM system!
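As a rough illustration, here's a sketch of just those two run rules in Python (the function names, window sizes, and sample data are illustrative; commercial SPC and Six Sigma packages ship with much larger rule sets):

    def four_of_five_above_two_sigma(values, mean, sigma):
        """Flag any window of five consecutive points in which at least four
        lie two or more standard deviations above the mean."""
        threshold = mean + 2 * sigma
        return any(
            sum(v >= threshold for v in values[i:i + 5]) >= 4
            for i in range(len(values) - 4)
        )

    def six_in_a_row_trending(values):
        """Flag any run of six consecutive points that is strictly increasing
        or strictly decreasing."""
        for i in range(len(values) - 5):
            window = values[i:i + 6]
            diffs = [b - a for a, b in zip(window, window[1:])]
            if all(d > 0 for d in diffs) or all(d < 0 for d in diffs):
                return True
        return False

    # Hypothetical forecast-error series (percent), with an assumed historical
    # mean of 0 and standard deviation of 5.
    errors = [1, -2, 0, 3, 2, 4, 6, 9, 11, 12, 13, 14]
    print(four_of_five_above_two_sigma(errors, mean=0, sigma=5))  # True: four of the last five points are >= 10
    print(six_in_a_row_trending(errors))                          # True: 2, 4, 6, 9, 11, 12 is a strictly increasing run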

Regardless of whether the inherent variability in your organizational processes is large or small, knowing that inherent variability, which can only come from an ongoing process of measurement, is the key to understanding the meaning of real-time events and the appropriate decisions that need to be taken as a result. As Samuel S. Wilks (in his oft-quoted presidential address to the American Statistical Association in 1951) incorrectly attributed to H.G. Wells: "Statistical thinking will one day be as necessary for efficient citizenship as the ability to read and write."

OPM's process-centric approach is a great step forward for the BI community. But if OPM is going to lead BI consumers all the way to the Promised Land, it needs to incorporate the statistically grounded process representation techniques of its sister tribe, Six Sigma.

Erik Thomsen [ethomsen@dsslab.com] is a researcher and consultant for DSS Lab Inc. and focuses on integrated multitechnology analytic solutions. He is the author of OLAP Solutions, Second Edition (John Wiley & Sons, 2002) and coauthor of Microsoft OLAP Solutions (John Wiley & Sons, 1999).


RESOURCES

Cochran, William G., and G.M. Cox. Experimental Designs: Second Edition. John Wiley & Sons, 1957.

Johnson, Richard A. Miller & Freund's Probability and Statistics for Engineers: Sixth Edition. Prentice-Hall, 2000.

Pande, Peter S., R.P. Neuman, and R.R. Cavanagh. The Six Sigma Way: How GE, Motorola, and Other Top Companies are Honing Their Performance. McGraw-Hill, 2000.

Taylor, Frederick W. The Principles of Scientific Management. W.W. Norton & Co., 1967.

Geishecker, L., and N. Rayner. "Corporate Performance Management: BI Collides with ERP." Gartner Research, Dec. 17, 2001.

Blumstein, Robert, and H. Morris. "Analytic Applications for Business Performance Management: Worldwide Financial/Business Performance Management — Software Forecast and Analysis, 2002-2006." IDC, June 2002.

S.G. Cowen Securities Corp. Business Performance Management, January 2002.

ASQ: www.asq.org

Juran Institute: www.juran.com

Six Sigma Academy: www.6-sigma.com

W. Edwards Deming Institute: www.deming.org
