Archive for April, 2010

Can you get too Lean?

Published April 30th, 2010 by Trevor Miles @milesahead 12 Comments

Without a doubt, Lean Manufacturing has been a transformative idea that has its genesis in the Toyota Production System.  Many companies have been able to reduce inventories and reduce order-to-delivery times simultaneously.  There is nothing new in this statement.

What is interesting is an article in Business Week discussing how Deere & Co is struggling to satisfy customer demand because of its adoption of Lean. While, as the Business Week article states, a strong emphasis on Lean adoption at Deere has led to large reductions in inventories, I think there is more that can be drawn from the story. The question is: can you get too Lean?

Source: AGCO Corp. http://www.AGCOcorp.com

Before I start though, here are some caveats:  I am not a Lean practitioner, I am not a financial analyst, and I do not know the strategies of the companies I will discuss below.  But I am a farm boy, and had a Massey Ferguson, one of the AGCO brands.  I spent many happy hours ploughing fields and reaping crops on “big red”.  I learned to drive the tractor when I was 5 and had to pull down on the steering wheel because I wasn’t heavy enough to push the clutch down.  Going to the tractor dealer was infinitely more interesting than going to the car dealership or toy store.  However, I digress.  As a child I experienced the side of the buyer, and as an adult I have been inside more than one of the Ag equipment manufacturers as a consultant, so I know a little of their operations too.

As the Business Week article states, this is a highly seasonal business. One thing that has changed dramatically since my father bought tractors is that they are now highly configurable, with all sorts of options that include air conditioning and GPS. Combining seasonality with configurability is a toxic mix for getting Lean wrong. A central tenet of Lean is “level loading”, which is all about keeping a regular cadence in production. This is fine when you can predict demand very well, but outside of those tight boundaries it is really easy to get into trouble, either by missing customer shipments (as is the case at Deere) or by not making the best use of capacity. From the numbers, it appears that Deere has focused on reducing inventories without implementing an adequate postponement plan to reduce the order-to-delivery cycle. Let’s look at the numbers.

The data in the table above is based upon the 2009 financial results. (You can do the same analysis using our free benchmarking service.) From this, we can see that Deere & Co has by far the lowest days of inventory (DOI). However, look at their cash-to-cash (C2C) cycle and days of sales outstanding (DSO). Their DSO is seven times that of AGCO, which is best-in-class. What I suspect is that the Deere DSO represents a lot of inventory sitting on dealer floors for which Deere has extended long payment terms to the dealers. I don’t know this for a fact, but from what I know of the relationships between OEMs and dealers in the Ag equipment industry, I suspect this is the case.
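
For readers who want to run this kind of comparison themselves, here is a minimal sketch of how the working-capital metrics mentioned above fit together. The input figures are placeholders for illustration only, not Deere’s or AGCO’s actual 2009 numbers.

```python
# Minimal sketch of the standard working-capital metrics used in the comparison.
# The input figures below are placeholders, not actual Deere or AGCO financials.

def days_of_inventory(inventory, cogs, days=365):
    """DOI: how long current inventory would last at the current cost of sales."""
    return inventory / cogs * days

def days_sales_outstanding(receivables, revenue, days=365):
    """DSO: how long it takes, on average, to collect cash from customers."""
    return receivables / revenue * days

def days_payables_outstanding(payables, cogs, days=365):
    """DPO: how long the company takes, on average, to pay its suppliers."""
    return payables / cogs * days

def cash_to_cash(doi, dso, dpo):
    """C2C cycle: days cash is tied up between paying suppliers and collecting from customers."""
    return doi + dso - dpo

# Placeholder example (all values in $ millions)
doi = days_of_inventory(inventory=2_400, cogs=16_000)
dso = days_sales_outstanding(receivables=3_000, revenue=20_000)
dpo = days_payables_outstanding(payables=1_800, cogs=16_000)
print(f"DOI={doi:.0f} days, DSO={dso:.0f} days, DPO={dpo:.0f} days, "
      f"C2C={cash_to_cash(doi, dso, dpo):.0f} days")
```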

More interesting from an operational perspective is to analyze Deere’s inventories, especially in comparison with those of AGCO. We see that Deere has roughly double the finished goods (FG) inventory of AGCO and roughly half the raw material (RM) inventory. The work-in-progress (WIP) inventories are roughly the same, though in absolute terms AGCO’s are 33% lower. The conclusion I come to is that Deere makes FG and stuffs the channel. If the FG values don’t convince you, what about the DSO? It would appear that AGCO has worked out how to forecast dependent demand – components and sub-assemblies – so they buy a bunch of stuff and then do late-stage assembly to meet market demand. I come to this conclusion because they have the highest RM yet the lowest WIP and FG. Compare that to Deere, which is organized the other way around, with more FG and relatively little WIP and RM. My conclusion is that AGCO is the company that has truly embraced Lean. Simply reducing inventories without a good postponement strategy is a recipe for poor performance. Where we see the real benefit to AGCO investors is in the return on invested capital (ROIC). Clearly there is a lot that Deere is doing correctly. Of the big Ag equipment companies, they have the highest margins, all the way from gross margin through to net margin. So I hope they get this right.

Let me repeat at this point that none of my analysis is based upon deep knowledge of how these companies operate.  I could be wildly wrong, but I don’t think I am.  What do the Lean experts out there think?

Posted in Inventory management, Lean manufacturing, Milesahead


Will companies think differently after suffering the consequences of Eyjafjallajokull?

Published April 29th, 2010 by Carol McIntosh 3 Comments

There has been much written about Iceland’s Eyjafjallajökull volcano. It certainly has had a significant impact on the global supply chain. One would need a very good crystal ball to predict this unplanned event, but it certainly exposes the vulnerability of distributed networks.

Here’s the big question: Will companies think differently after suffering the consequences of this natural disaster? What will they do differently?

I don’t think the answer is building more just-in-case inventory. To stay competitive, supply chains have to be lean. (In fact, they are becoming even leaner, with late-stage postponement to satisfy increasing levels of customization on consumer goods.)

Here are some questions for consideration:

  1. Can you proactively analyze and understand the risk of unplanned events? This may be upside or downside swings in demand, or supply disruptions. It also includes identifying sole-sourced material.
  2. Do you have the visibility and access to information in your supply network that you need? More and more companies are looking for a global view of all of their inventory, with the need to rebalance as demand and supply fluctuate.
  3. Do you know what to do when you have a problem that you just can’t solve? When a volcano erupts, there is not much you can do about it. The question is: are you making the best use of the supply that you have? How do you want to prioritize demand and allocate your supply? How quickly are you able to make these decisions? (See the sketch below.)
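
To make the third question a little more concrete, here is a minimal sketch of priority-based allocation when supply suddenly falls short. The priority ranking and the quantities are illustrative assumptions, not a recommendation for any particular scheme.

```python
# Minimal sketch: allocate constrained supply to demand in priority order.
# The priority ranking and quantities are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Demand:
    customer: str
    quantity: int
    priority: int  # lower number = more important

def allocate(supply: int, demands: list) -> dict:
    """Greedy allocation: fill the highest-priority demand first until supply runs out."""
    allocation = {}
    remaining = supply
    for d in sorted(demands, key=lambda d: d.priority):
        filled = min(d.quantity, remaining)
        allocation[d.customer] = filled
        remaining -= filled
    return allocation

demands = [
    Demand("Key account", 400, priority=1),
    Demand("Distributor", 300, priority=2),
    Demand("Spot order", 200, priority=3),
]
print(allocate(supply=600, demands=demands))
# {'Key account': 400, 'Distributor': 200, 'Spot order': 0}
```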

While there may never be another volcanic eruption that disrupts the supply chain on this scale, there are smaller disruptions that affect companies every day, and taken in sum they can have a material impact on the business. How do you deal with them? Send in your stories!

Posted in Best practices, Response Management, Supply chain risk management


Excel doesn’t excel in all cases…

Published April 28th, 2010 by Monique Rupert 4 Comments

I recently read a blog post titled “Beware Supply Chain Excel Users—YOU are DOOMED!!!!” by Khudsiya Quadri of TEC. I completely agree with the author that there is a big risk to SCM professionals who rely too heavily on Excel. The article lists reasons such as lack of collaboration, visibility, and control, and no ability to perform “what-if” scenarios. I would like to add some additional thoughts to this discussion.

A big limitation of Excel in my view is that it cannot mimic the analytics in the company’s source ERP system.  Why is this important?  If someone is using Excel to make business decisions without all the capabilities the ERP source system has, then they may not be making the right decisions.  How can you make planning decisions if your spreadsheet doesn’t take into consideration functionality like sourcing rules, constraints and order priorities?
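
As a toy illustration of that gap, here is a sketch comparing a naive spreadsheet-style promise, which assumes anything demanded can simply be built, with one that respects a single capacity constraint. The figures and the lone constraint are made-up assumptions; a real ERP or planning engine models far more (sourcing rules, lead times, order priorities, and so on).

```python
# Toy illustration: a naive "spreadsheet" promise vs. a capacity-constrained one.
# All figures are made up for illustration.

weekly_demand   = [500, 700, 900]   # units requested per week
on_hand         = 300               # starting inventory
weekly_capacity = 500               # units the plant can actually build per week

# Naive spreadsheet logic: promise everything that was asked for.
naive_promise = list(weekly_demand)

# Constrained logic: only promise what inventory plus capacity can cover.
constrained_promise = []
available = on_hand
for demand in weekly_demand:
    available += weekly_capacity          # this week's production
    shipped = min(demand, available)
    constrained_promise.append(shipped)
    available -= shipped

print("Naive promise:      ", naive_promise)        # [500, 700, 900]
print("Constrained promise:", constrained_promise)  # [500, 700, 600]
```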

A company’s supply chain map is very complex: typically there are internal manufacturing data sources, external manufacturing data sources, inventory site data, and so on. It is possible to get data from multiple sources into Excel, but the big challenge is that the data is not always the same from each source system, so many organizations end up with multiple spreadsheets to perform the same function or analysis. But can any of those spreadsheets be truly accurate if they don’t show a true picture of the whole supply chain?

It is almost impossible to control the integrity of spreadsheet data and access to the spreadsheet. With multiple people accessing the spreadsheet and no security, how can anyone have any confidence in the data? In addition, most spreadsheets need to be reviewed by many people, which typically requires pushing the spreadsheet around. Without standard system security, data integrity suffers and auditing who made changes becomes difficult. How can there be a high level of confidence in the data and in the subsequent business decisions?

I have known many companies that make critical supply chain decisions based on spreadsheets. For example, one company used spreadsheets to analyze big order drop-ins. If a large order dropped in, they would use their spreadsheet(s) to determine the effect on the business and when they could commit to the customer to deliver the order. This typically required multiple spreadsheets pulling data from multiple sources, tons of manipulation to tie the data together, and many different users across the organization looking at their piece. The whole exercise would take several days, by which time the data had changed, and the end user would have only a 50% confidence level in the answer given back to the customer. This can be crippling if your products are very expensive, as in the aerospace industry, where products cost millions of dollars and the customer is the government, which may impose penalties if orders aren’t delivered when promised.

You need to:

  • get all the supply chain data in one place for visibility (with frequent data refreshes),
  • mimic the source system analytics,
  • have all the standard system security functionality, and
  • output data in a familiar “Excel-like” format.

True nirvana is one source of the truth, multiple users having access at the same time, data integrity, and “what-if” capability, all with the power and flexibility of “Excel-like” outputs.

Posted in Products, Supply chain management


I am adamant that good decisions can be made without perfect data

Published April 27th, 2010 by Trevor Miles @milesahead 4 Comments

Tom Wailgum over at CIO.com wrote a blog titled “Supply Chain Data: Real-Time Speed Is Seductive and Dangerous” in which he quotes from an Aberdeen report by Nari Viswanathan and Viktoriya Sadlovska. Writing about the adoption of real-time data, Tom says: “Before any company hits the accelerator, it would be wise to ensure that the existing and new supply chain data is sound: Bad data delivered that much faster is still bad data—and can lead to worse decision-making.” I agree with nearly everything Tom writes, but I don’t buy into this quest for data nirvana.

Let us look outside of supply chain for examples where the data quality is good enough to make sensible decisions. We know that child mortality is higher in poor countries than in rich countries. The UNICEF mission is “To reduce child mortality by two-thirds, from 93 children of every 1,000 dying before age five in 1990 to 31 of every 1,000 in 2015.” I’m good with the first part of the UNICEF mission (reduce child mortality by two-thirds) and will continue to donate to them on this basis. It’s the second part that confuses me. Does it really matter whether child mortality in poor countries is 93 per 1,000 births or 100 per 1,000 births? I would just go with “more than 90 per 1,000 births”. I just don’t see how the precision of the statistics improves the quality of UNICEF’s decisions. And I think too often we confuse the two issues of quality of decision and quality of data.

Before I am misunderstood, let me state quite clearly that data quality can always be improved.  There is no question in my mind that all enterprises should have “data police” that ensure that the data is of reasonably good quality.  But let’s all recognize that data quality is like a sales forecast.  We all need to get better at it, but we will NEVER get it absolutely right, as in complete, correct, and there when we need it.  What we want to avoid is getting it absolutely wrong.

In addition, I think that data latency is a key element of data quality, so I don’t agree with Tom Wailgum. The speed with which you receive data is a big part of its quality. I would much rather have partially correct data quickly than precise data slowly. One of the most important insights that can be gained from data is trend. Trend is often more important than the actual value, and trend is totally absent from Tom Wailgum’s discussion. In other words, data should have three major measures of quality:

  • Completeness
  • Correctness
  • Timeliness

In case we forget, people are operating supply chains right now with the quality of the data they have right now. They are making multi-million dollar decisions in the long term based upon the current data. They are making thousands of decisions on a daily basis – expedite this PO, cancel that PO, promise this date to a customer, … – based upon the current data. In many cases they are using paper, pencils, and gut instinct to make these decisions, and more often than not they are the right decisions. Maybe not precisely correct, but still correct. I am sure many of you have horror stories of bad decisions made using bad data. I am equally sure that you have horror stories of bad decisions made on good data. And, by the way, I am sure you have many stories of good decisions made on bad data. Above all, I am sure that many of your horror stories will revolve around having known about something too late, and many of your good stories will revolve around having known about something quickly. The value of knowing sooner is the central lesson of the famous Beer Game that illustrates the Bull-Whip Effect.

Also, let us not confuse the quality of the decision with the quality of the data. In other words, the decision might be directionally correct without being precise, and infinitely better than doing nothing. For example, it may be correct to split a purchase order (PO) for 1,000 units and expedite part of the quantity in order to meet unexpected customer demand. We may choose to expedite 500 when it would have been better to expedite 600, but expediting 500 would be a lot better than not expediting any of the PO. The decision to split the order and expedite part of it is 100% correct. The quality of the data may mean the difference between expediting 500 rather than 600. I can accept that imprecision better than the inaction caused by waiting for “better” data before making a decision.
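
To put rough numbers on that argument, here is a small sketch comparing the cost of an imprecise-but-prompt expedite decision against doing nothing while waiting for better data. The quantities and per-unit costs are illustrative assumptions only.

```python
# Illustrative sketch: cost of an imprecise-but-prompt decision vs. inaction.
# Quantities and per-unit costs are assumptions, not real figures.

unexpected_demand = 600       # units of demand the plan did not anticipate
expedite_cost_per_unit = 5    # premium freight / overtime per expedited unit
shortage_cost_per_unit = 50   # lost margin and penalties per unit short

def total_cost(expedited_units: int) -> int:
    """Cost of expediting some units now and falling short on the rest."""
    short = max(unexpected_demand - expedited_units, 0)
    return expedited_units * expedite_cost_per_unit + short * shortage_cost_per_unit

for units in (0, 500, 600):
    print(f"Expedite {units:3d} units -> total cost {total_cost(units):6d}")

# Expedite   0 units -> total cost  30000   (wait for perfect data, do nothing yet)
# Expedite 500 units -> total cost   7500   (act now on approximate data)
# Expedite 600 units -> total cost   3000   (act now on perfect data)
```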

Naturally, we all want to avoid the situation where it would have been better not to split the order, but because of poor data quality a decision is made to split the order. In general I think the quality of supply chain data is a lot better than that.  The reason to have “data police” is that of course no-one knows which incorrect data will lead to disastrous decisions.

If the current data says “go North East” that is good enough for me. Leave it to the “accountants” to decide to “go 47 degrees 28 seconds”, especially if it takes 2 minutes to decide to “go North East” and 2 days to decide to “go 47 degrees 28 seconds”.  By the time the “accountants” have reached their conclusion, the entire demand and supply picture will have changed anyway.

In closing, I think we should all take a word of advice from Warren Buffett, who wrote in the Washington Post that “… it is better to be approximately right than precisely wrong.” I argue that most of the time waiting for precise data is precisely wrong, and that acting quickly based upon the existing data is approximately right.

Posted in Best practices, Milesahead, Response Management


An imminent threat to Western brand owners

Published April 23rd, 2010 by Trevor Miles @milesahead 0 Comments

It is always good to have one’s ideas validated. It is fantastic when the validation comes from no less than the Economist. I wrote a blog in June 2009 titled “Recession or Reset?” in which I explored what the new normal would look like after the recession. It is always easier to analyze, and a lot trickier to predict. However, I felt secure in using the Nirma case study to bring out two key points:

  • There is a huge consumer market in the rapidly developing economies (these being principally the BRIC countries) largely untapped by companies in the developed economies.
  • To reach the consumers in these markets will require a different type of innovation, exemplified by the Nirma case study, focused on product simplicity (and price) and distribution effectiveness.

In their April 15th, 2010 edition, the Economist ran a special report called “The new masters of management” (subscription may be required) in which the authors state

“Emerging countries are no longer content to be sources of cheap hands and low-cost brains. Instead they too are becoming hotbeds of innovation, producing breakthroughs in everything from telecoms to carmaking to health care. They are redesigning products to reduce costs not just by 10%, but by up to 90%. They are redesigning entire business processes to do things better and faster than their rivals in the West. Forget about flat—the world of business is turning upside down.”  They go on to say “the rich world is losing its leadership in the sort of breakthrough ideas that transform industries.”

In a supplemental report “The world turned upside down”, the Economist states that

“They (the BRIC countries) are coming up with new products and services that are dramatically cheaper than their Western equivalents: $3,000 cars, $300 computers and $30 mobile phones that provide nationwide service for just 2 cents a minute. They are reinventing systems of production and distribution, and they are experimenting with entirely new business models. All the elements of modern business, from supply-chain management to recruitment and retention, are being rejigged or reinvented in one emerging market or another.”

On the issue of reaching the broad consumer market the Economist goes on to state that

“It is not enough to concentrate on the Gucci and Mercedes crowd; they have to learn how to appeal to the billions of people who live outside Shanghai and Bangalore, from the rising middle classes in second-tier cities to the farmers in isolated villages. That means rethinking everything from products to distribution systems.” (My emphasis.)

And then there is Apple, with record sales into the BRIC countries confusing the issue.  A Wall Street Journal article in September 2009 titled “Apple Rides Recent Growth in Asia to Earn Top Honors” states that “Apple held just a 1.6% share of the personal-computer market in Asia in the second quarter of this year, and a 0.6% sliver of the region’s mobile-phone market, according to technology market-research firm IDC.”  It is Apple’s latest results that are startling.  Shipment of iPhone units grew 474% in Asia Pacific, 183% in Japan, and 133% in Europe. Total revenue from iPhones was $5.45 billion, and China accounted for $1.3 billion, up 200% following the iPhone’s launch at China Unicom.  I am not sure what this means in terms of market share growth, but the unit growth is impressive.

I must say I consider Apple’s results to be the exception rather than the norm.  I think Nokia’s approach is a safer bet for most Western companies that do not have the “trendiness” of Apple, even though Nokia’s stock price has plummeted on the back of Apple’s gains.  Focus on bringing innovation to large populations, not the elites in the BRIC countries.  Work out how to get your products to the “last mile” in countries that do not have the most sophisticated infrastructure.  On the other hand, perhaps Apple’s approach is correct because of the huge increase in disposable income in the BRIC countries.

Whatever your approach, I think it is absolutely necessary for Western companies to place a lot of emphasis on their growth in the BRIC countries.  Many of the large companies are doing this already.  What about the mid-sized companies that employ the bulk of the people in the Western countries?  What are they doing in terms of supply chain innovation to reduce costs?  I’d really like to hear your stories and opinions.

Posted in Milesahead, Supply chain management


3D TV – Technical marvel, Forecasting puzzle

Published April 21st, 2010 by John Westerveld 1 Comment

3D TVs were the big story at the CES show in January, and now I’m starting to see them show up at Best Buy. While I tend to be in the early majority of the technology adoption curve, I don’t think I’ll be buying a 3D TV any time soon. So, am I representative of the population, or am I just drifting toward the late-adopter side of the curve? Is my geekiness declining as my age advances? Or are others as apathetic about this as I am?

On the one hand, there is definitely a “gee whiz” factor that you simply can’t deny. It’s that coolness that has been drawing people into 3D movies since the 1950s. Classics like House of Wax and It Came from Outer Space drew people into the movie theatres despite having questionable plots and an excess of things flying out of the screen at you. Recent hits like Up, Avatar and Alice in Wonderland are fanning the flames of interest in 3D. Now you can have this experience in your home.

On the other hand, there are several issues that could impact the adoption of 3D:

  • Glasses – Today’s TV and movie theatre technology requires you to wear special glasses to view 3D. A CNET article points out that the 3D glasses are specific to a given set and cost as much as $150 per pair. This might not be prohibitive if you are watching by yourself, but if you have a family, you need a pair for each person. And what if you have friends or family over for a visit – do you need a pair for them as well? Besides…I have enough trouble keeping track of my remote control; who needs another thing to look for?
  • Content – Aside from a few movies and some sporting events, there is not a whole lot of content currently available. As with the rollout of HD, more content will come as the devices become more common. It’s the traditional chicken-and-egg problem. From the consumer’s perspective: what is the point of spending thousands of dollars on a new TV if there is nothing to watch? From the content producer’s perspective: why should I spend millions making 3D content when very few people have 3D sets?
  • Consumer fatigue – Many consumers have just upgraded their home theatre set-ups to consume high-definition content: large 1080p televisions, HD TiVos, new set-top boxes from the cable company or satellite provider, and of course new upconverting DVD players or Blu-ray decks. Those who have just recently upgraded are not likely to want to go back and buy another new system. Those who haven’t yet gone HD will likely not go 3D when the time comes to replace their systems.
  • Technical issues – Multiple formats and differences in perspective (kids see 3D differently than adults do because their eyes are closer together) put the market in a high state of flux.

So the question then becomes: if you are a TV manufacturer offering a 3D TV, how do you forecast future demand? Do you go bullish and forecast big sales? Or do you play it conservative and risk stockouts and the loss of significant potential revenue?

The big concern is that the technology is still in a state of flux and if you are holding significant inventory when the next advance hits, you could be stuck with a lot of obsolete inventory.  One example of potential game changing technology is the introduction of 3D televisions that don’t require special glasses.   You don’t want to be sitting on a pile of old stock when that change hits.
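
One way to frame the bullish-versus-conservative choice is as a newsvendor-style trade-off between the write-down on sets made obsolete by the next technology shift and the margin lost on a stockout. The sketch below uses made-up costs and demand scenarios purely to illustrate why high obsolescence risk pushes the answer toward the conservative side.

```python
# Newsvendor-style sketch of the bullish vs. conservative stocking decision.
# Costs and demand scenarios are made up for illustration only.

unit_margin = 300        # profit per 3D TV sold
obsolescence_loss = 900  # write-down per unsold set if glasses-free 3D arrives

# Critical ratio: stock up to the demand quantile where underage and overage costs balance.
critical_ratio = unit_margin / (unit_margin + obsolescence_loss)
print(f"Critical ratio: {critical_ratio:.2f}")  # 0.25 -> stock at the 25th percentile of demand

# With equally likely demand scenarios, compare expected profit of two order sizes.
demand_scenarios = [2_000, 5_000, 8_000, 12_000]

def expected_profit(order_qty: int) -> float:
    profits = []
    for demand in demand_scenarios:
        sold = min(order_qty, demand)
        unsold = order_qty - sold
        profits.append(sold * unit_margin - unsold * obsolescence_loss)
    return sum(profits) / len(profits)

for qty in (2_000, 12_000):  # conservative vs. bullish
    print(f"Order {qty:6d}: expected profit {expected_profit(qty):12,.0f}")
```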

Traditional forecast accuracy mitigation techniques can’t help in this case. Normally, you could use approaches like postponement strategies to limit inventory costs while improving responsiveness. With 3D TVs, it is entirely likely that the technology changes and that the majority of your strategically placed inventory would need to be replaced.

So what to do?  Personally, I’d probably go conservative, but as I’ve already pointed out, I’m not that excited about 3D. Let’s hear from you…  Are you excited about the upcoming “3D revolution”?  If you were to forecast 3D TV sales, what factors would you consider?   Comment back and let us know.


Posted in Demand management, Inventory management


Guess who won BtoB Magazine’s 2010 Social Media award alongside Cisco?

Published April 20th, 2010 by John Sicard 0 Comments

Please forgive the self-promotion, but we wanted to share some exciting news that we are quite proud of. BtoB Magazine‘s inaugural 2010 Social Media Award recently recognized Kinaxis for our use of social media. And we are in great company, being honored alongside Cisco Systems and Microsoft Corp. in the category of Best Integrated Campaign from a tech company. We were also selected as one of the judges’ top four favorite entries, making us eligible for a People’s Choice Award. (Online voting is open until this Friday at www.BtoBonline.com/vote.)

The award honors Kinaxis for our multi-faceted ‘Learn, Laugh, Share and Connect’ program, which includes this blog, our LinkedIn activity, our use of Twitter and, most significantly, our Supply Chain Expert Community. The judges were particularly intrigued by our use of comedy, which has added much personality to our social media efforts. The “Suitemates” series was recognized in particular. The fourth episode, Suite Caroline, was posted today, by the way (it’s the boldest of the bunch!).

Entering the social media arena has been a fun and natural extension of who we are as a company. Special thanks goes to all of you, our readers, who have contributed greatly along the way. We’ll continue to challenge ourselves to use this blog, our community and other social media forums to facilitate thoughtful industry dialogue and share insightful perspectives in ways that we hope will entertain, educate and inspire.

Posted in General News


Top ten signs your S&OP process is in trouble – a little supply chain humor

Published April 19th, 2010 by Bill DuBois 0 Comments

There seems to be a great deal of content posted recently on the subject of S&OP.  Analysts are writing about it, providing great information on where Sales and Operations Planning has been and where it is heading. There have been conferences which have been very well attended. This kind of activity indicates that there is a real need and market for improving the Sales and Operations Planning process.

Does your S&OP process need improvement? Here are some signs that all may not be well in your S&OP world.

  1. Your CEO pronounces it Sales and Oooops!
  2. Your VP of Manufacturing calls it Sales OR Operations Planning, but not both.
  3. The S&OP team is proud they reduced their monthly S&OP process cycle time down to 6 weeks. (This one is true; sorry if I offended anyone.)
  4. The S&OP team is asking why they have to go through the process again; they just did it last year.
  5. You are celebrating the first time your forecast accuracy has hit double digits.
  6. Your Director of IT is extremely excited about the new S&OP tool just purchased: it’s called Excel.
  7. Operations can’t support all sales so you hold “tribal council” to vote off the customers that will not get their orders.
  8. Who needs S&OP when you built up all that inventory during the recession?
  9. You’ve just hired Zelda, an online Psychic as your S&OP consultant.
  10. You’ve come up with a new, more descriptive acronym for S&OP: 3M, the Misalignment Management Meeting.

Any more suggestions?

Posted in Sales and operations planning (S&OP)