Posts Tagged ‘Human judgment’

Changing the role of the supply chain planner

Published May 25th, 2012 by Lori Smith 0 Comments

We try to keep the self-promotion to a minimum, but a great customer of ours, TriQuint Semiconductor, recently allowed us to write up their case study, and I think the story is worth sharing.

There are lots of good bits on how they are using RapidResponse and the benefits they see (like realizing inventory reductions a month before the solution even went live). But the jewel of the story is where they say that, with RapidResponse, they “are changing the role of the planner to be more analytical rather than just the data hunter.” A simple statement… with enormous implications.

From the corporate perspective, this means higher productivity, more value-added activity, and results-oriented decisions… all leading to better business results. But equally important, from the supply chain professional’s perspective, think of the satisfaction that comes when someone feels equipped and able to actively and effectively contribute to the business. It becomes not just about getting things done, but about making a difference.

JP Swanson of TriQuint said it best…

“Previously we spent so much time gathering the information that we weren’t able to spend enough time analyzing the information, understanding it and doing something about it…Our team likes the fact that they can use their minds more and they are getting to touch a different skill level because they have more time to do it and more information in front of them to do it with.”

Awesome.

A big thank you to TriQuint for letting us tell their story.  We love what you are doing with the product!

Posted in Supply chain management


Forget social, it’s embedded apps that matter

Published January 13th, 2012 by Trevor Miles @milesahead 0 Comments

I’ve been going through all the 2012 predictions from different analysts and bloggers, and while there have been some interesting takes, the one that stands out for me is Vinnie Mirchandani’s post titled “The Real Mega-trend in IT. Hint: it’s not social, cloud, Big Data”, in which Vinnie writes about “smart” products.

Vinnie mentions that Daimler AG has over 1,000 developers in R&D and a few hundred IT people supporting internal enterprise systems such as SAP. Of course they still have lots of people designing cars, but it is the amount of software in a car that is amazing. We are all aware of the more obvious innovations such as OnStar, which, with the recent announcement of a developer community at CES, is developing into a platform much like iPad or Android.

But it is the stuff ‘under the hood’ that is truly revolutionary, especially for the supply chain. We are not all the way there yet, but things are happening, especially in the industrial world. We are getting ‘smart’ machines that can self-diagnose, determine the corrective action, start a case, check inventory availability, and schedule a technician to make a physical change when the material is available. And now that so much software is embedded in these systems, the corrective action may instead be a software patch that is developed and downloaded when available.
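
To make the flow concrete, here is a minimal, purely illustrative sketch of such a self-diagnosing sequence in Python. The fault codes, the rule table, and the inventory and calendar structures are all invented for the example; they do not represent any particular vendor's system.

    from datetime import date, timedelta

    # Hypothetical rule table: fault code -> (corrective action, required part or None).
    DIAGNOSTIC_RULES = {
        "E042": ("replace drive belt", "BELT-7"),
        "E017": ("download firmware patch", None),   # software-only fix
    }

    def handle_fault(machine_id, fault_code, inventory, technician_calendar):
        """Self-diagnose, open a case, check material availability, book a technician."""
        action, part = DIAGNOSTIC_RULES.get(fault_code, ("manual inspection", None))
        case = {"machine": machine_id, "fault": fault_code, "action": action, "scheduled": None}

        # Only schedule a physical visit once the required material is available.
        if part is not None and inventory.get(part, 0) == 0:
            case["status"] = f"waiting on replenishment of {part}"
            return case

        if part is None:
            case["status"] = "patch queued for download"        # no technician needed
            return case

        # Book the first open slot on the technician calendar.
        case["scheduled"] = next(d for d, free in technician_calendar.items() if free)
        case["status"] = "technician scheduled"
        return case

    inventory = {"BELT-7": 3}
    calendar = {date.today() + timedelta(days=i): i % 2 == 0 for i in range(1, 8)}
    print(handle_fault("M-001", "E042", inventory, calendar))

The interesting part for the supply chain sits in the middle steps: the machine's diagnosis immediately becomes demand for a part and for a technician's time.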

Of course this is an extension of concepts first described 10-15 years ago when RFID became available, or even longer ago when bar codes were becoming standard. But those technologies could only really expand the use of inventory data. I don’t mean to belittle the value of inventory data; far from it, since ultimately it is the availability of inventory on a store shelf or in a warehouse that triggers the need for replenishment to satisfy demand. But ‘smart’ machines go so much further. Naturally there will also be analytic apps that absorb the information coming from many ‘smart’ machines in order to detect trends, for example that a certain mechanical part is failing more frequently or after a certain length of use.

Naturally the ‘smart’ machine scenario I paint above is most applicable to the service supply chain, but with the increased technology content in nearly everything that is man-made, the ability for a company to change how their product behaves by simply downloading new software is an incredible change for both the users and manufacturers/designers of the equipment.  I am old enough to remember washing machines and dishwashers that had electro-mechanical controllers for different cycles, and the mechanical systems themselves were often designed specifically for the cycles offered in that model. Now the mechanics are nearly identical, with different cycles being driven by software downloaded onto identical controllers.  I can see the day when we will upgrade our appliances by simply downloading new software, assuming the mechanics hold up and the aesthetics are still acceptable.  But why not have a ‘smart’ skin with selectable color and/or pattern that the user can change at any time?

But while I agree with Vinnie that this is a major trend, at the same time, as illustrated by OnStar recruiting an independent app development community, ‘smart’ machines should really be viewed as simply another user experience, much like tablets. In fact, I don’t see any reason why a tablet couldn’t be embedded in a car’s dash, or a dishwasher’s front panel. So in the end I think many of the other trends are still very relevant, especially big data and cloud. But this is ‘structured’ data. The ‘smart’ machines will be programmed to issue clear and consistent instructions.

So where does social fit into the mix? Andrew McAfee wrote an interesting blog post in 2010 titled “Did Garry Kasparov Stumble Into a New Business Process Model?” in which he explores trends in machine intelligence. Machines, including ‘smart’ machines, are very good at doing repetitive things quickly. As McAfee notes,

Kasparov notes that computers play chess not by simulating human reasoning, but instead by comparing all possible moves and their consequences — the resulting board positions, subsequently available countermoves, possible counter-countermoves, etc. — until time runs out and a decision is necessary. And time will always run out; there are 10^40 possible legal board positions and 10^120 possible games, so even today’s fastest computers can’t be exhaustive. But they can be thorough, precise, and consistent. They evaluate lots of options, compare them rigorously, and never ever overlook or forget anything that they’ve been programmed to take into account.

In other words, machines, even ‘smart’ machines, struggle to deal with nuance and ambiguity, though of course IBM’s recent experiment pitting Watson against top Jeopardy winners shows that even in this realm computers are beginning to gain ground. However, as Greg Lindsay states in his blog post titled “How I Beat IBM’s Watson at Jeopardy (3 Times!)”, Watson can be beaten by competing in the areas in which humans excel, namely ambiguity and nuance.

Binary relationships–countries and their capitals, for instance–would be easy for him to figure out, and he would beat me to the buzz every time. So I had to steer him into categories full of what I called “semantic difficulty”–where the clues’ wordplay would trip him up. I would have to out think him.

So I am confident that in supply chains we will always need human judgment; what computers bring is primarily speed. It will be in the exchange of unstructured data and the exploration of nuance and ambiguity that social will play its biggest role within the supply chain. After all, most decisions made between two supply chain partners are about compromise, where there isn’t one right answer, even though many supply chain vendors have promoted that view over the years through the use of optimization.

In conclusion, I agree with Vinnie that ‘smart’ machines and embedded apps are big news and a major trend in IT and user experience. But from a supply chain perspective my sense is that ‘smart’ machines will mostly be an additional contributor to big data, which in itself is big news in the supply chain. It will be the manner in which we exploit social concepts in the supply chain that will be the real breakthrough, because so much of supply chain management is about risk mitigation across a range of possibilities in a very ambiguous world.

Posted in Best practices, Milesahead


Is collaboration the next supply chain optimizer?

Published June 17th, 2010 by John Sicard 2 Comments

On June 15th, I had the privilege of presenting at the world’s first Chief Supply Chain Officer Summit alongside a very well-known and respected supply chain leader. I say alongside because Angel Mendez, Senior Vice President of Customer Value Chain Management at Cisco (NASDAQ: CSCO), really did the majority of the work. On this occasion, his message focused on the path he’s taking towards creating the “Next Generation Value Chain to Deliver Customer Value” for Cisco. While it is still a work in progress, with a team over 9,000 strong across 90+ locations and 32 countries, my money is on Mr. Mendez succeeding in his endeavor.

It begins with what he believes defines the customer experience value chain:

“Network of internal and partner processes, people and capabilities that translate innovation into customer value while delivering an unrivaled customer experience”

While it closely follows Forrester’s definition, loosely stated as “activities through which companies create value, competitive advantage, and superior customer experiences”, what I find unique and interesting about Cisco’s definition is its specific attention to, and promotion of, “people” and their “capabilities”. Perhaps this resonates so much with me because I have long believed that collaboration is the next supply chain “optimizer”, and collaboration is decidedly a purpose-driven human activity. To be more precise, it is the unifying of actions taken by uniquely capable people for a common good (more on this later).

Angel identified four legs required to support the creation of strategic advantage: Customer Focus, Agility, Collaboration and Sustainability. At first glance, you might find these obvious and perhaps not so unique, and indeed, many companies are talking about these elements in one form or another. What is different about Angel’s message, for one, is the maturity and execution of the model. For example, I’ve never met a company that would say it is “not customer focused”; however, most continue to govern themselves according to traditional, and very operationally focused, metrics (e.g. cost, quality, delivery and speed). Cisco, on the other hand, measures customer focus through perfect product launch, perfect order, order-to-invoice cycle time and, last but not least, “moment of truth” customer satisfaction measurements, thus redefining its balanced scorecard to align with its customer focus.

A significant portion of Angel’s presentation was spent on the Flexibility/Agility leg. What caught my eye most is a theme I am seeing across multiple manufacturing segments, one that is becoming a key requirement for many companies looking to improve their supply chain management and S&OP processes: the growing gap between the demand chain and the supply chain. Today, it is not uncommon to see completely disconnected demand-side planning (S&OP) and detailed supply chain planning solutions, and yet it is between the two that a significant amount of efficiency and performance can be lost. I believe the gap is widening at a steady rate, and this is what is driving the need for new and innovative solutions to “collaborate and effect change in real time”.

So we’re back to Collaboration, the third leg. In my humble opinion, this is the area where excellence will be won or lost. You might look at collaboration as the combination of people + processes + technology/tools, but I was very impressed to see a slight variant of this long-standing equation. In Angel’s vision, it is “culture” + process + technology/tools. I admit never having thought about it as a cultural challenge, but having worked with many large organizations on this problem, I’ve come to realize how unique a problem it is: collaboration amongst peers and employees is often challenging enough across departments. The type of collaboration Angel is talking about is inter-enterprise, which means that on a given day you may very well be collaborating with a complete stranger living on a different continent. Indeed, there are cultural implications to achieving this level of maturity.

Again, I might say there is nothing new about promoting collaboration as a key to success; however, it is what Cisco is doing about it that distinguishes them from the rest. They are leveraging many of their own technologies to produce what they call an “Integrated Workforce Experience” (IWE) platform capable of bringing teams together to collaborate and solve ‘moment of truth’ problems that occur in the gap between demand chain and supply chain planning. Unlike social networking platforms, such as Facebook, MySpace and the like, which use friends, family and fun as a hook, I believe platforms like IWE will motivate productive usage and involvement through content, context, and consequence.

Finally, we have Sustainability, which is extremely topical these days as we watch in horror the catastrophe still hemorrhaging in the Gulf of Mexico. Here, we heard some common themes on creating efficiencies and innovations in product design, educating and increasing employee involvement, and a particularly catchy tag line: “Don’t just ‘comply’, lead, innovate, differentiate”. The one resonating message around sustainability, more of a lesson really, is the reminder that sustainability should not be viewed as a factor for competitive advantage, but rather as the one common flag around which everyone can unite and learn from one another. Industry collaboration will be the key to effecting meaningful and lasting change.

Does Angel’s vision align with yours? Do you see effective collaboration as an emerging competency that will distinguish your company’s performance?

By the way, if you missed the presentation, grab a soda and a sandwich and watch the replay by registering here.

Posted in Best practices, Sales and operations planning (S&OP), Supply chain collaboration, Supply chain management


Do YOU have enough SUPPLY?

Published May 17th, 2010 by Carol McIntosh 0 Comments

The supply chain is tightening up again.  We are hearing about this in high tech and automotive and most likely in other industries as well.  While we never seem to avoid the cyclical nature of demand and supply, it is refreshing to see the level of interest in making good decisions with limited supply.

In working with clients, I have never seen so much interest in supply allocation options, demand prioritization and customer segmentation. Companies are realizing that a FIFO approach to customer demand isn’t good enough anymore. However, you do find some companies that think they need VERY complex rules. A supply chain doesn’t have to be complex to be effective. In fact, I would argue that the opposite is true.

Companies all want to maximize revenue and minimize stock levels. I believe that global competition has also had a significant influence on the need for supply allocation options. If you can’t satisfy everyone on time, which customers are most important, and what impact will not satisfying some customers have on your business? Many companies have top-tier customers that represent such a significant portion of their business that they must always be satisfied. Other companies reward the customers that give them the best forecast accuracy.

Supply is being allocated at either the finished goods level or the material component level. If you are managing the finished goods side of the business, you may need a combination of automated supply allocation rules and the option to manually allocate or create firm allocations to distribution centers or regions.
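
To make this concrete, here is a minimal sketch of one possible automated rule in Python: manually firmed allocations are honored first, tier-1 demand is then filled in full where possible, and whatever remains is fair-shared across the rest. The customer tiers, quantities, and field names are invented for illustration; as noted below, the real rule is a collaborative business decision.

    def allocate(supply, demands, firm_allocations=None):
        """Tiered allocation sketch: firm allocations, then tier 1, then fair share.

        demands: list of dicts like {"customer": "A", "tier": 1, "qty": 100}
        firm_allocations: dict of customer -> manually firmed quantity
        """
        firm_allocations = firm_allocations or {}
        result = {}

        # 1. Honor manual (firm) allocations before any automated rule.
        #    In this sketch a firm allocation replaces the automated allocation.
        for customer, qty in firm_allocations.items():
            granted = min(qty, supply)
            result[customer] = granted
            supply -= granted

        # 2. Fill tier-1 demand in full where possible.
        tier1 = [d for d in demands if d["tier"] == 1 and d["customer"] not in result]
        for d in sorted(tier1, key=lambda d: -d["qty"]):
            granted = min(d["qty"], supply)
            result[d["customer"]] = granted
            supply -= granted

        # 3. Fair-share whatever is left across the remaining customers.
        rest = [d for d in demands if d["customer"] not in result]
        total_rest = sum(d["qty"] for d in rest)
        for d in rest:
            result[d["customer"]] = int(supply * d["qty"] / total_rest) if total_rest else 0

        return result

    # Example: 1,000 units of supply against 1,400 units of demand.
    demands = [
        {"customer": "A", "tier": 1, "qty": 400},
        {"customer": "B", "tier": 2, "qty": 600},
        {"customer": "C", "tier": 2, "qty": 400},
    ]
    print(allocate(1000, demands, firm_allocations={"C": 100}))

Swap this for strict ranking, fair-share everywhere, or anything else; the point stands that the rule is a business choice to be agreed collaboratively, with the software simply executing it consistently.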

There is, however, no single rule for supply allocation. These are collaborative decisions that need to be supported by software, not decided by it.

What has been your experience with limited supply? How are you solving this today and do you need a better way? I would be really interested in hearing your comments.

Posted in Demand management, Inventory management, Supply chain management


Human intelligence and machine stupidity: Supply chains are about effectiveness, not only efficiency

Published May 13th, 2010 by Trevor Miles @milesahead 1 Comment

Before I start on the body of my blog posting, let me state unequivocally that I believe, no, that I know, that computers and software have a huge role to play in decision making and execution in a wide range of business functions.  After all, I have worked in the software industry for the past 25 years.  I am also not one of those wacky people who think that machines are going to take over the world.  However, I am one of those people who believe that humans have unique skills that no machine is able to match currently, particularly the ability to evaluate nuance, uncertainty, and risk.  Computers and programs, on the other hand, are capable of processing huge amounts of data far more quickly than humans, but they always assume that the data they are fed and the algorithms/heuristics they are using to analyse the data are absolutely correct.  In other words, computers are hopeless at evaluating nuance, uncertainty, and risk.

All too often we don’t put processes in place which couple the human ability to evaluate nuance “intelligently” with the machine ability to evaluate vast amounts of data “dumbly”.  All too often we confuse efficiency with effectiveness, and pursue efficiency over effectiveness, exemplified by the use of the term “machine intelligence”.

Nothing brings this out more clearly than the recent stock market behaviour. All the “quants” were quick to identify “human error” initially. Not only did they say it was human error, but that it was female error. I’m surprised they didn’t suggest she was a blonde too. After all, we know how they confuse their B’s with their M’s. How ridiculous! Now that calmer analysis has taken place, it would seem that nothing of the sort happened, and not at the hands of a female trader either. There is a very interesting article (I am sure there must be many more out there) in the Wall Street Journal (WSJ) by Aaron Lucchetti titled “Exchanges Point Fingers Over Human Hands” that analyzes what really went on last Thursday. Lucchetti makes no bones about the fact that this is a man-vs.-machine tussle:

“In the man-vs.-machine argument for financial markets, proponents of technology say machines do it faster and cheaper. Those in support of human involvement say people can use their experience and pull the emergency brake when the computers, or their programmers, make mistakes.

But when that happened Thursday, it appeared that some humans couldn’t react quickly enough, while those using computers just kept pushing the market lower.”

I would argue that human involvement should have been used to prevent the situation from occurring, not just as an “emergency brake”.

Let’s start by understanding the role of the “quants” in financial organizations. A “quant” is short for quantitative analyst. These are math and physics whizzes who have been brought into financial institutions to create mathematical models to evaluate market behaviour, particularly for algorithmic trading. Algorithmic trading is a trading system that uses very advanced mathematical models to make transaction decisions in the financial markets. The strict rules built into the model attempt to determine the optimal time for an order to be placed so that it causes the least impact on a stock’s price. Large blocks of shares are usually purchased by dividing the large share block into smaller lots and allowing the algorithms to decide when the smaller blocks are to be purchased. Algorithmic trading is most commonly used by large institutional investors due to the large number of shares they purchase every day. Complex algorithms allow these investors to obtain the best possible price without significantly moving the stock’s price and driving up purchasing costs.
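
As a toy sketch only, and nothing like a production trading system, the basic slicing idea can be illustrated in a few lines of Python: a parent order is broken into child orders, and each child is capped at an assumed fraction of the expected volume in its interval so the parent order has limited price impact. All figures and the participation cap are invented.

    import math

    def slice_order(total_shares, num_slices, max_participation, expected_volume):
        """Toy illustration of order slicing: break a large block into child orders,
        capping each slice at a fraction of expected interval volume.
        Not a real trading algorithm."""
        target = math.ceil(total_shares / num_slices)
        child_orders = []
        remaining = total_shares
        for interval, volume in enumerate(expected_volume[:num_slices]):
            cap = int(volume * max_participation)      # don't exceed x% of interval volume
            qty = min(target, cap, remaining)
            if qty > 0:
                child_orders.append({"interval": interval, "qty": qty})
                remaining -= qty
        return child_orders, remaining                 # remaining > 0 means unfilled

    # Example: 100,000 shares over 10 intervals, at most 10% of expected volume each.
    orders, leftover = slice_order(100_000, 10, 0.10, expected_volume=[150_000] * 10)
    print(orders, leftover)

The real systems described above layer far more sophisticated timing and pricing models on top of this basic decomposition.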

Let me come clean: I am an engineer, so I am a “quant” by nature and by training. But I had the good fortune to study “decision under uncertainty” at the PhD level. During this time I also came across “fuzzy logic”. Forget the math and theory. Fundamentally, what it comes down to is that some people (quants) believe that any and all systems can be modelled exactly, given enough time and insight, and that the models can then be used to predict behaviour under any other circumstances. I think this is a load of hogwash. No mathematical model is ever complete, and data is never 100% accurate. However, when computers are used by humans to arrive at “directionally correct” decisions, they are of huge benefit. In other words, the model of the supply chain may indicate a 5.21% improvement in gross margin, from 23.42% to 28.63%, if supplier A is used rather than supplier B. I would interpret that result to mean that it is highly likely we could increase gross margin by more than 2.5% by using supplier A. It would probably have taken a human months to gather, collate, and analyse the data by hand, and probably with a great deal of “human error”. The same analysis could be achieved in a few hours using a computer, provided some of the primary data was already available.
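
One hedged way to read such a result “directionally” is to perturb the inputs within an assumed error band and see how often the conclusion survives. The sketch below does this for the gross margin example; the price, the supplier costs, and the 5% error band are all invented so that the nominal margins roughly match the figures above.

    import random

    def gross_margin(price, unit_cost):
        return (price - unit_cost) / price

    def directional_check(price, cost_a, cost_b, noise=0.05, threshold=0.025, trials=10_000):
        """Monte Carlo sketch: with +/-5% noise on the inputs, how often does
        supplier A still beat supplier B's margin by more than 2.5 points?"""
        def jitter(x):
            return x * random.uniform(1 - noise, 1 + noise)

        wins = 0
        for _ in range(trials):
            margin_a = gross_margin(jitter(price), jitter(cost_a))
            margin_b = gross_margin(jitter(price), jitter(cost_b))
            if margin_a - margin_b > threshold:
                wins += 1
        return wins / trials

    # Invented figures chosen so the nominal margins are roughly 28.63% vs 23.42%.
    print(directional_check(price=100.0, cost_a=71.37, cost_b=76.58))

If supplier A still wins by more than 2.5 points in the large majority of trials, the decision is directionally robust even though no single number in the model is exact, which is precisely the point being made above.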

There is an interesting little snippet in the Wikipedia description of quants which I think is of particular relevance.

“Because of their backgrounds, quants draw from three forms of mathematics: statistics and probability, calculus centered around partial differential equations (PDE’s), and econometrics. The majority of quants have received little formal education in mainstream economics, and often apply a mindset drawn from the physical sciences. Physicists tend to have significantly less experience of statistical techniques, and thus lean to approaches based upon PDEs, and solutions to these based upon numerical analysis.”

Statistical techniques are based upon uncertainty, or randomness. Physicists, mathematicians, and engineers, on the other hand, hate uncertainty, and spend enormous amounts of time looking deeper and deeper into atoms trying to prove that everything is predictable, if only we had the knowledge and wisdom to understand the observations. And they bring this perspective to the analysis of financial market behaviour, as pointed out in the Wikipedia quote. Einstein once made the statement that “God doesn’t play dice with the universe”, which, incidentally, he came to regret. He was questioning the notion of randomness as opposed to determinism. Determinism is the idea that every event in nature has a particular cause. Randomness describes aspects of nature that can only be assigned a probability, as in quantum uncertainty. My engineering training was replete with the deterministic attitude that informed Einstein’s statement, as was the training of my fellow engineers and scientists. So the quants are in constant pursuit of the ultimate model to describe all situations so that they can predict the movement of the market under any and all conditions. This is an attitude that is very common in supply chain management too. I think it is flawed from the start.

A separate article in the WSJ titled “Did a Big Bet Help Trigger ‘Black Swan’ Stock Swoon?” makes it clear that what happened last Thursday was not “human error” but rather “model error”, in the sense that there was an overreliance on computer models, which in turn drove market behaviour.

The non-quants have been fighting back for some time since the 2008 market crash and the whole CDO mess. A good example of this is Scott Patterson’s book “The Quants: How a New Breed of Math Whizzes Conquered Wall Street and Nearly Destroyed It”. It is a fascinating read and very instructive, but also fairly predictable in the blame game. What I found most interesting was a comment from a reader on a Globe and Mail review of “The Quants”. Tellingly, the title of the review is “Quants accept no blame for financial crisis”; you can’t be more explicit than that. The reader wrote:

“In finance, you have a lot of people in high positions who are surprisingly innumerate (MBAs and the like) – they didn’t really understand what the quants were doing but didn’t mind as long as they were making money. Let’s not forget who hired the quants in the first place! When you combine this lack of technical oversight with poor regulation, you have a toxic mix.”

I believe we have a very similar situation in manufacturing operations, particularly supply chain management. Senior management doesn’t really understand the complexities of operations and relies too heavily on the quants. As long as they see inventories go down and stock prices go up, all is well.

To go back to Lucchetti’s article in the WSJ, the first act in the blame game for last Thursday’s market behaviour was to focus on “human error”. Clearly a first salvo from the quants. Later in the article, Lucchetti quotes Jamie of White Cap Trading as stating:

“Markets are a mix of technology and human judgment. Thursday, we saw far too much technology and not enough (human) judgment.”

I could not agree more.  I think I am going to print out that statement in 94 pt font and put it in a frame on my wall.  I would like to see everyone in supply chain management follow my example.

All too often I see this same behaviour in supply chain management, where optimization engines are thrown at a problem. I do not have too much of an issue with the use of optimization engines. What I struggle with is the slavish belief that the results are accurate to the nth decimal. There is no understanding of the likelihood of achieving this optimum, nor of the degree to which the model is inaccurate, nor of the degree to which the result is affected by inaccurate data. What happened in the stock markets is a classic example of relying too much on machines in the pursuit of efficiency. The parallel in the supply chain space is that we rely too much on optimization, be it Lean or mathematical optimization. Do you know the first sign that the quants have taken over your supply chain? It’s when you hear that your data isn’t clean enough after you have already spent millions implementing an ERP system and countless hours “cleaning” data.

I am not suggesting that we unplug the ERP and APS systems we have deployed over the past 20 years. A huge amount of value has been derived from these tools. But they are tools. Let us treat them in that manner.

As always, I look forward to a robust debate, perhaps including some of my erstwhile colleagues.


Posted in Milesahead, Miscellanea, Supply chain management


I am adamant that good decisions can be made without perfect data

Published April 27th, 2010 by Trevor Miles @milesahead 4 Comments

Tom Wailgum over at CIO.com wrote a blog post titled “Supply Chain Data: Real-Time Speed Is Seductive and Dangerous” in which he quotes from an Aberdeen report by Nari Viswanathan and Viktoriya Sadlovska. On the adoption of real-time data, Tom writes: “Before any company hits the accelerator, it would be wise to ensure that the existing and new supply chain data is sound: Bad data delivered that much faster is still bad data—and can lead to worse decision-making.” I agree with nearly everything Tom writes, but I don’t buy into this quest for data nirvana.

Let us look outside of supply chain for examples where the data quality is good enough to make sensible decisions. We know that child mortality is higher in poor countries than in rich countries. The UNICEF mission is “To reduce child mortality by two-thirds, from 93 children of every 1,000 dying before age five in 1990 to 31 of every 1,000 in 2015.” I’m good with the first part of the UNICEF mission (reduce child mortality by two-thirds) and will continue to donate to them on this basis. It’s the second part that confuses me. Does it really matter whether child mortality in poor countries is 93 per 1,000 births or 100 per 1,000 births? I would just go with “more than 90 per 1,000 births”. I just don’t see how the precision of the statistics improves the quality of UNICEF’s decisions. And I think too often we confuse the two issues of quality of decision and quality of data.

Before I am misunderstood, let me state quite clearly that data quality can always be improved.  There is no question in my mind that all enterprises should have “data police” that ensure that the data is of reasonably good quality.  But let’s all recognize that data quality is like a sales forecast.  We all need to get better at it, but we will NEVER get it absolutely right, as in complete, correct, and there when we need it.  What we want to avoid is getting it absolutely wrong.

In addition, I think that data latency is a key element of data quality, so I don’t agree with Tom Wailgum on this point. The speed with which you receive data is a big part of its quality. I would much rather have partially correct data quickly than precise data slowly. One of the most important insights that can be gained from data is trend. Trend is often more important than the actual value, and trend is totally absent from Tom Wailgum’s discussion. In other words, data should have three major measures of quality (a simple scoring sketch follows the list):

  • Completeness
  • Correctness
  • Timeliness
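
As a rough sketch of how those three measures might be scored together in Python: the field names, the validator, and the linear timeliness decay are arbitrary choices for illustration, not a prescribed metric.

    from datetime import datetime, timezone

    def quality_score(records, required_fields, is_valid, max_age_hours=24.0):
        """Score a set of records along the three axes above: completeness,
        correctness, and timeliness. The 24-hour horizon is arbitrary."""
        if not records:
            return {"completeness": 0.0, "correctness": 0.0, "timeliness": 0.0}

        now = datetime.now(timezone.utc)
        complete = correct = fresh = 0.0
        for rec in records:
            has_all = all(rec.get(f) is not None for f in required_fields)
            if has_all:
                complete += 1
                if is_valid(rec):
                    correct += 1
            age_hours = (now - rec["updated_at"]).total_seconds() / 3600.0
            fresh += max(0.0, 1.0 - age_hours / max_age_hours)   # 1.0 when fresh, 0.0 when stale

        n = len(records)
        return {"completeness": complete / n, "correctness": correct / n, "timeliness": fresh / n}

    # Two hypothetical purchase-order records, one missing its quantity.
    records = [
        {"po": "PO-1", "qty": 500, "updated_at": datetime.now(timezone.utc)},
        {"po": "PO-2", "qty": None, "updated_at": datetime.now(timezone.utc)},
    ]
    print(quality_score(records, ["po", "qty"], is_valid=lambda r: r["qty"] > 0))

The point is not the particular formula; it is that timeliness sits alongside completeness and correctness rather than being an afterthought.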

In case we forget, people are operating supply chains right now with the quality of the data they have right now. They are making multi-million dollar long-term decisions based upon the current data. They are making thousands of decisions on a daily basis (expedite this PO, cancel that PO, promise this date to a customer, …) based upon the current data. In many cases they are using paper and pencil and gut instinct to make these decisions, and more often than not they are the right decisions. Maybe not precisely correct, but still correct. I am sure many of you have horror stories of bad decisions made using bad data. I am equally sure that you have horror stories of bad decisions made on good data. And, by the way, I am sure you have many stories of good decisions made on bad data. Above all, I am sure that many of your horror stories will revolve around having learned about something too late, and many of your good stories will revolve around having learned about something quickly. The value of knowing sooner is the central lesson of the famous Beer Game, which illustrates the bullwhip effect.

Also, let us not confuse the quality of the decision with the quality of the data. In other words, the decision might be directionally correct without being precise, and infinitely better than doing nothing. For example, it may be correct to split a purchase order (PO) for 1,000 units and expedite part of the quantity in order to meet unexpected customer demand. We may choose to expedite 500 when it would have been better to expedite 600, but expediting 500 would be a lot better than not expediting any of the PO. The decision to split the order and expedite part of it is 100% correct. The quality of the data may mean the difference between expediting 500 units and expediting 600. I can accept that imprecision better than the inaction caused by waiting for “better” data before making a decision.

Naturally, we all want to avoid the situation where it would have been better not to split the order, but because of poor data quality a decision is made to split it. In general I think the quality of supply chain data is a lot better than that. The reason to have “data police” is that, of course, no one knows which incorrect data will lead to disastrous decisions.

If the current data says “go North East” that is good enough for me. Leave it to the “accountants” to decide to “go 47 degrees 28 seconds”, especially if it takes 2 minutes to decide to “go North East” and 2 days to decide to “go 47 degrees 28 seconds”.  By the time the “accountants” have reached their conclusion, the entire demand and supply picture will have changed anyway.

In closing, I think we should all take a word of advice from Warren Buffett when he wrote in the Washington Post that “… it is better to be approximately right than precisely wrong.” I argue that most of the time waiting for precise data is precisely wrong, and that acting quickly based upon the existing data is approximately right.

Posted in Best practices, Milesahead, Response Management


Old-school organizational power structures thwart business performance: The old dogs need to learn new tricks

Published March 11th, 2010 by Trevor Miles @milesahead 0 Comments

John Westerveld, a colleague of mine, wrote a great two-part blog post titled “Top ten reasons YOU should be doing S&OP” in which he gives a practical example of when S&OP can be of great benefit to an organization. The first reason John selects is alignment across different functions in an organization. This set me thinking about the fundamental reasons for a lack of alignment across functions. Of course, in today’s multi-tier outsourced supply chains, alignment is also an issue between organizations. Hau Lee at Stanford has written a lot about this in his concept of “Agility, Adaptability, and Alignment”, which Lee says is driven by “extreme information exchange”.

While trying to formulate my ideas about the causes of this lack of alignment, I came across a set of postings by Dustin Mattison, on his Logipi blog and in one of the LinkedIn discussions, which postulate that the problems at Toyota can be boiled down to organizational structure and culture. This has manifested itself in power “fiefdoms”, a lack of transparency, and therefore a lack of alignment between different functions.

There is a great section in “The Big Switch” in which Nicholas Carr traces the origins of organizational structures and their impact on performance. (I wish I had a more formal source, and I am sure some of our readers can point me to one.) Our organizational structures have been inherited from the military and really date from as far back as Roman times, when there was no ability to communicate in real time. Imagine the time it took to get a message from Rome to Cairo. As a consequence, hierarchical structures were developed to ensure a process of central command and control. Loyalty was prized above all else, and disloyalty was dealt with very harshly. The 20th-century phenomenon of the corporation used the same organizational structures and the same command-and-control attitudes, largely because the means of communication had not progressed much since Roman times, though the penalties for disloyalty (or poor performance) are considerably less harsh.

The business process reengineering (BPR) efforts led by Michael Hammer, Tom Peters, and Peter Drucker in the 1990s were the first attempt to correct this by “de-layering” management. But think about it: they were doing this before the widespread adoption of the internet, when faxes were still considered state of the art. While the enthusiasm for BPR has waned because, when put into practice, it focused too much on efficiency (read headcount reduction), the fundamental idea that business processes can be more effective, not just more efficient, has been carried forward by Lean and Six Sigma concepts. And the internet specifically, but technology more generally, is the enabler. This is what provides the transparency Richard Wilding of Cranfield University talks about in an interview with Dustin Mattison, which is so crucial in breaking down the power barriers to more effective sharing of information across functional and organizational boundaries.

And yet we still have senior management (and professors in business schools) to whom IT in general, but the internet specifically, is a learned phenomenon.  Before anyone thinks “Yeah, yeah”, let me point out that I am one of the people who have “learned” how to use the internet and I am still not comfortable with “tweeting” and “blogging”.  In short, I am not comfortable with that level of personal “transparency”.  At the same time, I am staggered at how many mid-tier managers, let alone senior managers, still receive paper-based reports, scribble all over them, and then send the scribbled notes back to an underling who is supposed to act on the scribbled notes.  This is all about power and has little to do with effectiveness.  They could have just as easily made changes to values in a system and annotated these with some comments. This information would be available immediately to anyone who had to take action or make further decisions based upon the inputs from the senior manager.

Exacerbating the fact that much of senior management does not come from the “internet” generation is the difficulty of using existing IT applications and systems. The fundamental drawback of existing supply chain systems specifically, and operations systems in general, the one that prevents their wide adoption by senior management, is that they do not allow people (read: senior management) to perform quick and effective what-if analysis. It takes too long, and in truth it is also too complex, for them to create and analyze scenarios themselves, so they devolve this to more junior people who don’t really understand what it was the senior manager wanted to investigate in the first place. More correctly, the senior manager is forced to take a structured approach to investigating and solving an issue, whereas in reality problem solving is a very unstructured process governed strongly by exploration and discovery. Even when senior managers have monster spreadsheets available to them, there is:

  • little to no connection to the current situation
  • insufficient level of detail to get a realistic evaluation of the future consequences of their decisions on financial and operational metrics, and
  • very limited ability to explore multiple scenarios.

They have to wait until the month end or quarter end to get a report on what has happened, and by that time it is almost impossible to deconstruct the cause and effect.

While I realize the limitations of my thinking (fundamentally I am an operations person), and while I recognize the impact, both short term and long term, that Finance and HR, as examples, can have on the performance of a company, in companies that sell, design, and/or manufacture a physical product it is Operations that is the core business process determining the current and future success of the organization.

All of this gets me to a brief discussion of Sales and Operations Planning (S&OP). There are many definitions of S&OP out there and also a lot of discussion of S&OP “maturity” models. At its heart and in its simplest form, S&OP is all about demand/supply balancing, in other words, alignment between the demand and supply sides of the organization. In a multi-tiered outsourced environment this is not a simple exercise, so my use of “simplest” is not meant to denigrate this level of S&OP adoption.

The greatest long-term benefit of S&OP, even if it is difficult to quantify, is increased transparency and alignment, as noted by John Westerveld and discussed by Richard Wilding. AMR Research calls this “East-West” alignment. And yet there are so many more benefits achievable by linking Operations to the Executive, by linking financial measures and objectives such as revenue, margin, and cash flow to operational metrics such as orders delivered on-time and in-full, inventory turns, and capacity utilization. AMR Research calls this “North-South” alignment. A number of analysts, such as Ventana Research, Aberdeen, Gartner, and AMR Research (now part of Gartner), have referred to this North-South alignment as Integrated Business Planning. Tom Wallace and Oliver Wight have referred to it as Executive S&OP, and now Accenture is referring to it as “Profit, Sales, and Operations Planning”. Whatever we call it, there are lots of benefits.

The principal barrier to tapping into these phenomenal benefits is the organizational power structures we have inherited from a previous era. These will not be easy to break down. But an S&OP process, however sophisticated or rudimentary, will start this process of greater transparency and alignment. I’ve been participating in two discussions on LinkedIn (Has Sales & Operations Planning (S&OP) improved your forecast accuracy? and What is your biggest S&OP pet peeve?, both of which require membership), and in both discussions there is consensus that the greatest contributor to the successful adoption of an S&OP process is executive support, because this is required to get everyone to “play nicely” with each other. Clearly this is simply a symptom of the organizational power structures. S&OP challenges these power structures, which leads to resistance. There is plenty of technology out there to assist in this process, but ultimately you will need both executive support and technology for a truly successful S&OP process that contributes massively to your company’s future success. But there is no need to wait until you have organizational buy-in. As with all organizational change, showing people how they will benefit from adopting new practices is the best way of getting their buy-in. So start small, give people information that is useful to them, and over time you will be able to ask for information that is useful to you. If this is too slow for you, make a pitch to your executive team to make sure they back you up to get faster adoption. Either way, you should not wait. The benefit to your company is too great to ignore. Help us create the organizational structures of the future.

Posted in Milesahead, Miscellanea, Sales and operations planning (S&OP), Supply chain management


Top companies empower employees to act on insights

Published January 11th, 2010 by John Westerveld 1 Comment

I was catching up on some reading and came across this article from last month in IndustryWeek.  The article describes a study done by IBM that shows “top-performing companies were 15 times more likely to apply analytics to strategic decisions” and “were 22 times more prepared to challenge the status quo in their organizations”.  These top performing organizations were also “six times more likely to entrust a broader base of employees with greater authority to make decisions and act on insights”.

This lines up with what I’ve thought for years. The strength of a company is in its people. Those on the front line are willing and able to make strategic and tactical decisions every day. However, companies are often afraid that their employees will make the wrong decisions and therefore push decision making up the organizational hierarchy. We’ve all seen these types of organizations: significant lag time between event and response because people at the point of action are not allowed to make a decision. These companies often miss significant opportunities because they can’t react fast enough. From the article, it’s clear that top-performing companies are those that provide their employees with appropriate tools, empower them to make decisions, and build a culture of trust.

What tools are needed to empower decision making in the workplace?

  • Visibility – You need to see events as they happen, you need to understand the impact of those events across your enterprise, and you need to understand what impact your response to those events will have.
  • Alerts – In order to respond quickly to an event, you need to know that the event has occurred. Events come in many shapes and sizes. Not only is it often difficult to recognize when an event has occurred, it can be very difficult to understand whether the event is important. Does this late supply order actually impact revenue? Does this forecast change create a problem? To be effective, an alert must be based on the impact the event has, not necessarily the event itself (a simple sketch of this idea follows the list).
  • Analytics – The world happens at a pace faster than your ERP system can respond. New orders, forecast changes, and supply disruptions all occur on a minute-by-minute basis. ERP systems have always been batch-focused and cannot give you an adequate picture of the impact of an event until the next batch run. To respond and make accurate decisions, you need real-time analytics that run continuously as new information comes in.
  • Analysis tools – You need tools that allow you to see the impact an event will have. You need to see the details of components, inventory, and other orders that will be impacted by the change. All this information must be visible in an intuitive, easy-to-read, customizable format.
  • Simulation – Once you understand the impact and wish to respond to the event, you need to be able to simulate various resolution options and, more importantly, compare those options to one another and against corporate metrics to ensure that the resolution solves the problem AND meets the goals of the enterprise.
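
Here is the sketch referred to in the Alerts bullet above: a minimal, illustrative check that only raises an alert when a late purchase order actually puts promised customer orders, and therefore revenue, at risk. The order fields, dates, and threshold are all invented for the example.

    from datetime import date, timedelta

    def late_po_alert(po_delay_days, dependent_orders, revenue_threshold=10_000):
        """Impact-based alert sketch: alert only if the late purchase order pushes
        customer orders past their promise dates and the revenue at risk crosses
        a threshold. Field names are hypothetical."""
        at_risk = []
        for order in dependent_orders:
            new_ship_date = order["planned_ship"] + timedelta(days=po_delay_days)
            if new_ship_date > order["promise_date"]:
                at_risk.append(order)

        revenue_at_risk = sum(o["revenue"] for o in at_risk)
        if revenue_at_risk >= revenue_threshold:
            return {"alert": True,
                    "orders": [o["id"] for o in at_risk],
                    "revenue_at_risk": revenue_at_risk}
        return {"alert": False, "revenue_at_risk": revenue_at_risk}

    # Example: a 3-day supplier delay against two dependent customer orders.
    orders = [
        {"id": "SO-100", "planned_ship": date(2010, 1, 20), "promise_date": date(2010, 1, 21), "revenue": 25_000},
        {"id": "SO-101", "planned_ship": date(2010, 1, 20), "promise_date": date(2010, 1, 28), "revenue": 40_000},
    ]
    print(late_po_alert(3, orders))

Only the first order is actually put at risk by the delay, and the alert fires because the revenue at risk crosses the threshold; that is exactly the "impact, not event" distinction made in the Alerts bullet.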

What it boils down to is this: give your people the tools to understand the issues and the authority to make decisions, then stand back and let them get to work.

Posted in Best practices, Supply chain management