Old Lean Dude

Artificial Ignorance

Written by GBMP | Jan 29, 2015 9:24:03 PM

For a few years back in the early ’80s I fell prey to information automation fascination. I managed an IT department that transitioned from a basic accounting system run by an external service bureau, first to a batch inventory control system and then to an order processing and manufacturing control system, running on a succession of minicomputers with names like DEC 11/70 and HP 3000. If you recognize those machines, or if you are familiar with RPG, assembler, COBOL or FORTRAN, then you too may be an old lean dude. The hardware of that decade was slow, flimsy and subject to frequent crashes; the term “user-friendly” as applied to the user interface had not yet been invented.

Today, by comparison, I regularly carry around in my coat pocket a thumb drive with a million times the storage capacity and one thousand times the speed of what was available to my entire company in 1980. And that’s puny compared to the multi-parallel processing power available to businesses today. Today’s supercomputers pile up so much logged data that decision rules under the heading of “artificial ignorance” have been created to intentionally ignore data that has been deemed (by someone) to be insignificant. Amazing!

What’s more amazing, however, is that most of the software running on today’s super machines follows essentially the same network-scheduling model we were using in 1980. Call it MRP or ERP; it may be zippier and have more tentacles today than it did then, but the deterministic model that assumes we can know what to produce today based upon a forecast created weeks or months ago is still alive and well. Shigeo Shingo called this model “speculative production.” I call it computerized fortune telling. If production lead times are very long, the fact that we can now run that forecasting model one thousand times faster doesn’t improve its efficacy.

If we add to this model of forecasting and back scheduling a standard cost accounting system whose operating assumptions go back a century, then we have created a model that systematically optimizes local efficiency to four decimal places as it pyramids inventories. Eli Goldratt used to call these calculations “precisely wrong.”
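To make that concrete, here is a minimal sketch in Python of what back scheduling from a forecast amounts to: subtract fixed lead times from a forecast due date and call the result a plan. The stage names, lead times, and quantities below are made up for illustration; the only thing the sketch shares with a real MRP run is its core assumption, namely that a weeks-old forecast will still be true when the material finally arrives.

```python
from datetime import date, timedelta

# Hypothetical stages and fixed lead times (days), purely for illustration.
LEAD_TIMES = {
    "assemble": 10,
    "machine": 15,
    "procure": 30,
}

def back_schedule(due_date: date, quantity: int) -> list[tuple[str, date, int]]:
    """Classic back scheduling: walk upstream from the forecast due date,
    subtracting each stage's fixed lead time to get its start date."""
    plan, start = [], due_date
    for op, lead in LEAD_TIMES.items():
        start = start - timedelta(days=lead)
        plan.append((op, start, quantity))
    return plan

# A forecast made weeks ago says we need 500 units 55 days from now.
for op, start, qty in back_schedule(date.today() + timedelta(days=55), 500):
    print(f"{op:>9}: start {start}, qty {qty}")

# Whether customers will still want 500 units on that date is exactly
# what this model cannot know: Shingo's "speculative production."
```

Layer a standard cost system on top and the same deterministic arithmetic happily reports local efficiencies to four decimal places, however wrong the forecast turns out to be.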

I was at a company several weeks ago that is in the process of replacing a 1980s-vintage MRP system with a later-model ERP system. “We’ll be able to allocate our parts for specific orders,” the materials manager, Bob, explained to me. “Hmm,” I thought, “why is that a good thing?”

Bob continued, “We’ll have real-time data.” I reflected, “What does that mean? At best he’ll have a rear-view mirror. He’ll be reading yesterday’s news.” Assuming the transactions have been completed correctly and in a timely fashion, Bob will still only know the last place the material has been. Is it still in department A, or is it in transit? Or has it arrived in department B but not yet been transacted? This out-of-phase situation causes many a supervisor to chase down either parts or transactions just to keep production moving.

“What happened to the pull system you were implementing last year?” I asked.

“We’ve had to put our continuous improvement activities on hold until we go live,” Bob apologized, “but things will run much smoother once the new system is completely rolled out. We’re discussing an electronic kanban – going paperless.”

And this is where I cringe. I know that it’s been months since the continuous improvement effort was mothballed in order to redeploy resources to the ERP implementation. And I also know that Bob will likely have many more reasons to postpone CI efforts once they do go live. There will be ugly discoveries regarding the differences in rules and assumptions between the new and legacy systems. Material will be over-planned to compensate for shortages arising from start-up misunderstandings. Overtime will be rampant to catch up on late deliveries.

Pardon me for sounding cynical. I’ve witnessed it too many times. In the last three months alone, I’ve heard similar stories from nine different organizations, large and small. Immense resources are consumed to install hardware and software that runs counter to the objectives of improvement efforts. Thousands of resource hours are spilled into the abyss of information automation with a promise of productivity improvement – hours that would have been far better spent on simplifying or eliminating questionable business processes. But nobody wants to talk about it publicly. One executive confided recently, “We’ve spent too much money to turn this off now.”

In 1976, Joe Weizenbaum, one of the early leaders in the field of artificial intelligence, warned in his landmark book, Computer Power and Human Reason (on Amazon for $.01), that while computers can make decisions based upon rules, only humans should make choices. I worry that with each step change in computer power, human reason takes a step back. Dr. Weizenbaum foresaw the era we now live in, where choice is reduced to a set of rules that hide beneath the legitimacy of the “system.” Ironically, this system, built up of nothing more than 0’s and 1’s and once described by Weizenbaum as the “universal machine” because it could be programmed to do anything, has on the contrary become the most monolithic process in any organization. Today’s popular ERP systems, with more than a quarter-billion lines of code, have too often become the tails that wag the dogs.

Maybe I’m just old school, but it seems that thirty-five years after my first love affair with computerization I’m still feeling jilted. How about you? Is your IT strategy supporting productivity and competitiveness or is it the tail that wags the dog? Tell me I’m wrong. Or share a story.

Happy New Year. :)

O.L.D.