Archive for March, 2010

Have you tried to buy a Wii Fit lately?

Published March 31st, 2010 by Max Jeffrey 1 Comment

About a month ago, someone in my family had to have a Wii Fit.  We already had the Wii console, so we just needed the Wii Fit add-on.  I thought, OK, I will just check where I can pick one up or maybe order it online.  Looking at all the standard places where I would normally buy something like a Wii Fit here in the US (Walmart, Best Buy, Target, etc.), I found that everywhere I looked it was out of stock, both in the local stores and online.  I ended up having to order it at a significant premium from a distributor who was fortunate or smart enough to accurately speculate on the high demand versus short supply of this product.  Obviously, there was advance information in the market that this product was going to be in short supply.

I do not know the details around the supply of this product, whether the forecast was low, manufacturing capacity was just not there, or it was simply a distribution issue, but it got me thinking about what OEMs can do to mitigate this type of situation: one where, at some point, the forecasted demand turned out to be very different from the actual demand.  As I was taught years ago, the first rule of forecasts is that they are always wrong.  My conclusion is that since the forecast is always wrong, the solution has to lie in quickly responding to the changes, or variances, between the forecast and the actual demand.
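
To make that idea concrete, here is a minimal sketch (hypothetical item names, numbers, and tolerance) of the simplest form of responding to variance: compare actuals to forecast each period and flag only the items whose deviation exceeds a tolerance, so planners act on the exceptions rather than re-planning everything.

```python
# Minimal sketch with hypothetical data: flag items whose actual demand deviates
# from forecast by more than a tolerance, so the response focuses on exceptions.
TOLERANCE = 0.25  # assumed: act when actuals are more than 25% off forecast

# item -> (forecasted units, actual units) for the period
demand = {
    "balance_board": (10_000, 18_500),   # demand far above forecast
    "console":       (25_000, 24_000),
    "fitness_game":  (5_000,  5_600),
}

for item, (forecast, actual) in demand.items():
    deviation = (actual - forecast) / forecast
    if abs(deviation) > TOLERANCE:
        action = "expedite supply / reallocate" if deviation > 0 else "cut orders / rebalance"
        print(f"{item}: actuals {deviation:+.0%} vs forecast -> {action}")
```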

I recommend you read the short white paper “Are Yesterday’s Solutions Conflicting with Today’s Challenges?”, written by Charlie Barnhart when he was with Technology Forecasters.  While the paper was written a while ago, the premise remains very true. In this paper, Charlie describes a response management approach to dealing with change (such as variations between forecasted demand and actual demand), especially in an outsourced manufacturing environment with shrinking product life cycles and rapidly shifting customer preferences.  The white paper also raises some interesting points as to why increasing inventory is not a viable solution.  As stated in the paper,

“Response Management provides organizations the ability to rapidly test and score options for responding to change by identifying what’s possible today with today’s resources”.

What are your thoughts?  Do you have any insight into this particular supply issue with the Wii Fit or similar ones?  Do you feel that the response management approach described in the white paper can effectively mitigate forecast errors?

Posted in Inventory management, Response Management, Sales and operations planning (S&OP)


Collaboration – The good, bad and ugly

Published March 30th, 2010 by Kerry Zuber 0 Comments

Collaboration is without a doubt a key component of many companies’ strategic initiatives.  The touted benefits (the good) are significant, and they extend well beyond the financials to include service improvements and risk sharing.  While some companies have begun the journey by collaborating with their customers, others have elected to start on the supply side.  In virtually all cases, the challenges of making collaboration successful are a series of painful lessons in the dynamics of people, process, and technology.  Attempts to initiate collaboration without careful consideration of all three elements (the bad) can lead to projects that deliver only a fraction of the promise.  This in turn leads to faulty conclusions about the value of continuing the journey.

I was recently exposed to a customer collaboration project that was focused on establishing a clearly defined process within the limits of the existing legacy systems.  The ugly result was overly complex, and I seriously questioned its sustainability.  In this case, the relevance of technology as an enabler to simplify and accelerate the exchange of data was being ignored.  On the other hand, investments in technology without serious consideration of the impact on the people and processes can be equally fatal.

I would strongly encourage any company traveling down this path to evaluate how new technologies are dramatically simplifying collaboration tasks.  My experience in implementing lean systems has shown that high-quality results are best achieved when the human component is simple, repeatable, and designed with mistake-proofing in mind.  Consider whether the tools currently at hand will meet these standards and, if not, invest in learning what is available to help deliver the full value of the collaboration promise.

Posted in Supply chain collaboration


Leggo my Eggo

Published March 29th, 2010 by Monique Rupert 1 Comment

As I went into the freezer this morning to retrieve the frozen waffles for my kids’ breakfast, I was warmed by the fact that there were actually waffles to get.  The last few months have been very difficult, as procuring Eggo waffles has been a real challenge.  As many of you may be aware, two things have plagued Kellogg’s production of these wonderful waffles: a flood at their Atlanta factory last fall and production line repairs at their largest bakery in Tennessee.  Try explaining that to a 3-, 5-, and 6-year-old.  They don’t care about the problems; they just want their cinnamon waffles.  My family is not alone, either; there are numerous posts on Twitter and Facebook, as well as many other blog posts, about it.  Unfortunately, these shortage problems are expected to last until the middle of this summer.

Each week that I went to the store and saw a sign reading “Eggo waffles are temporarily experiencing a shortage” on the freezer door, my heart started pounding as I thought about the drama I would have to endure at home.  This made me think about the whole issue of the supply chain breaking down for a product that is beloved by so many people.  I can’t remember any time in recent history (other than Elmo at Christmas) when a consumer product was missed so much by so many.  How could this happen?

Obviously, you can’t predict a natural disaster or even some technical repairs, but you could prepare for both of these unpredictable events if you had the ability to simulate these types of changes and understand the financial and operational impact in advance.  With a capability to simulate demand or supply changes, a company could put backup measures in place so that, if such an event occurs, the disruption to the business is minimized.  Being able to respond to unpredictable events is clearly not unique to Kellogg’s; it is something all manufacturers should think about.  The world today is very unpredictable, with natural disasters, the state of the economy, political issues, and so on; I think it would be wise for everyone to plan for the unexpected and be able to respond quickly to change… as I would hate to go through the “Great Eggo Disaster” again with any other product.
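
To give a sense of what “simulating these types of changes” could look like in its simplest form, here is a minimal sketch (hypothetical numbers throughout) that projects inventory and lost sales week by week under an assumed multi-week production outage, the kind of what-if a planner might run in advance to size safety stock or backup capacity.

```python
# Minimal sketch with hypothetical numbers: project inventory and lost sales over a
# 12-week horizon when production is knocked out for several weeks.
weekly_demand = 100_000        # cases per week (assumed)
weekly_supply = 100_000        # normal production rate (assumed)
inventory = 150_000            # starting cases on hand (assumed)
outage_weeks = range(3, 7)     # production down in weeks 3-6 (assumed)

lost_sales = 0
for week in range(1, 13):
    supply = 0 if week in outage_weeks else weekly_supply
    inventory += supply
    shipped = min(inventory, weekly_demand)
    lost_sales += weekly_demand - shipped
    inventory -= shipped
    print(f"week {week:2d}: ending inventory {inventory:>8,}   lost sales to date {lost_sales:>8,}")
```

Re-running the same loop with more starting inventory, or with a backup supply source switched on during the outage, shows how much of the shortfall each backup measure would actually cover.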

Posted in Supply chain risk management


How accurate does the forecast need to be?

Published March 26th, 2010 by Bill DuBois 4 Comments

In getting ready for a trip, I went into the drugstore to buy travel-sized toothpaste and contact lens solution. Looking at the packaging, I started to wonder how accurate a forecast needs to be. (You know you’re consumed with everything supply chain when that’s what you think about while shopping!)

I’m sure no one was predicting the need for these products in 100 ml sizes a couple of years ago. And what if the airlines lifted the size requirement on liquids or reduced it to 50 ml? What chaos would that cause for the demand planners of the world? Walking to the front of the store, I noticed some Olympic wear. As you know, Vancouver just finished hosting very successful Olympic and Paralympic Games. I could only imagine the heroics and horrors that were experienced to make these Games the success they were. Everything from scheduling materials for the new venues to the clothing, flags, food and everything else required for the Games. Will the promotions to sell off Olympic paraphernalia make up for the excess inventories now on the shelves and in the warehouses?

In a discussion thread on the supply chain expert community, Joshua Gao asked what your “Vision of the Supply Chain” is.  Well, if we look to the past, many things are different from our grandparents’ supply chain. Two of the biggest differences stand out.  First, customers are more demanding. I mean that in a positive sense, in that customers can quickly research products, understand trends in technologies and purchase what they want with a few clicks of a mouse. The second is that supply has become more fragile. Outsourcing, margin pressures and even catastrophic events can cause supply challenges. So this gets us back to the vision of the future and the question: how accurate does the forecast need to be?

In the past, good enough may have worked because there were fewer demand and supply pressures. But today and in the future, is it better to have an accurate forecast or should the focus be on handling the deviation?

If the focus is to manage the deviation and leverage your supply chain as a competitive advantage, then how much effort should go into developing the forecast if you know it is going to be wrong anyway? This is where it would be helpful to get your feedback, since the answer may vary by industry, among other factors. Does the forecast need to be more accurate given the supply chain challenges of today, or do you just need some number to start with since you will have to handle change regardless of what the forecast states? How close does the forecast need to be: 40%, 60%, 80%?
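
For reference, one common way to put a number on “how close”, shown in the minimal sketch below with made-up data, is to express accuracy as one minus the mean absolute percentage error (MAPE) across periods; the 40%/60%/80% question then becomes a question of how much absolute error you can live with.

```python
# Minimal sketch with made-up data: forecast accuracy expressed as 1 - MAPE.
forecast = [120, 80, 200, 150]   # forecasted units per period (hypothetical)
actual   = [100, 95, 160, 180]   # actual units per period (hypothetical)

ape = [abs(a - f) / a for f, a in zip(forecast, actual)]   # absolute % error per period
mape = sum(ape) / len(ape)
accuracy = 1 - mape

print(f"MAPE: {mape:.0%}  ->  forecast accuracy: {accuracy:.0%}")
```

Some businesses weight the error by volume or measure it at a different level of aggregation, so even the 40%/60%/80% yardstick depends on how you choose to count.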

Just one final request for feedback: if you were involved in any Olympic related supply chain stories, it would be great to hear them. Maybe your story will make the podium and win gold, silver or bronze!

Posted in Demand management, Inventory management, Sales and operations planning (S&OP), Supply chain management, Supply chain risk management


Talent and the supply chain

Published March 25th, 2010 by Trevor Miles @milesahead 0 Comments

So often we discuss the nitty-gritty of supply chains in terms of inventory levels, customer service, supplier performance, capacity availability, etc.  All too often we gloss over the organizational structures required to operate an effective and efficient supply chain.  I was fortunate enough to attend an AMR Research call on March 23rd titled “The Supply Chain Top 25: Lessons From Leaders” (which I suspect will require registration to view).  Since the presenters were Debra Hoffman and Kevin O’Marah, the focus of the webinar was on the AMR Top 25 and the hard numbers that can be used to measure supply chain performance in a balanced manner.

As part of the discussion, Kevin brought up an extended SCOR model, organized around the concepts of demand, supply, and products/engineering, that includes the usual Plan, Source, Make, and Deliver functions.  To the SCOR model, AMR added Customer Management, Post-Sales Support, and New Product Development and Launch (NPDL).  AMR has overlaid a maturity model on top of this, captured in the numbers 1-4.  Lastly, they included some enabling skills in the outer circle around the concepts of Governance, Strategy and Change Management, Performance Measurement and Analytics, and Technology Enablement.

AMR conducted a survey to understand which of these functions are included in the supply chain organization at participating companies, the results of which are captured in the diagram below.  What is remarkable is that only the ‘Deliver’ function is included in the supply chain organization in more than 75% of cases, while the other functions are included in at best 50%-75% of cases.  What is amazing is that the ‘Make’ function is most often not part of the supply chain organization. Instead, ‘Make’ usually reports up to the COO through a completely different hierarchy.  I was also very surprised that the ‘Plan’ function is included in supply chain in only 50%-75% of cases.  I was less surprised that ‘Post-Sales/Service’ and ‘New Product Development and Launch (NPDL)’ are often not included in supply chain.  NPDL has long been a separate function, and Post-Sales/Service has long been the “poor cousin” of many organizations, even though in some cases it accounts for a large part of a company’s revenue and margin.

The diagram was published in the AMR report “Supply Chain Talent: State of the Discipline” (see page 10 of the report, available here).  The report goes into some depth on the skills required in supply chain and their relative importance to the respondents, including the availability of these skills on the employment market, which is captured in the diagram below (see page 13 of the report, available here).  What I find really interesting is how these two diagrams relate to each other, in that many of the functions that are included in supply chain less than 50% of the time also score low in terms of the level of skill sought.  While this may seem obvious, I still find it surprising that companies in general are not looking for a high level of skill in some of the functions, such as ‘Make’, whether they report into the supply chain function or into a different one.  This must be a reflection of the high degree of manufacturing outsourcing and off-shoring that has occurred in the US over the past 20-30 years.  The notable exceptions to this observation are “Strategy and Change Management” and “Technology Enablement”.  Kevin O’Marah made the observation that “Technology Enablement” scoring less than 50% indicates that “someone else” is choosing the systems that supply chain uses to carry out its day-to-day activities.  My guess is that it is usually IT making these decisions for the business.

I am very pleased to note that my alma mater, Penn State, was ranked #1 in a 2009 AMR study of US universities offering programs in supply chain management.  (AMR is currently refreshing this information.)

What I find fascinating, having been in and around the industry for about 25 years now, is that supply chain management has matured from an ill-defined discipline promoted by AMR and some software vendors into a recognized function within companies and a program of study at universities.  The AMR study includes 19 universities.  When I was at Penn State, the closest I could get to studying supply chain management was Operations Research and Industrial Engineering.  Don’t get me wrong, I have been very pleased with the skills I learned from studying OR and IE, but all the real supply chain knowledge I have gained has been through experience.  I think OR and IE gave me a very sound theoretical foundation with which to understand what I have experienced and, perhaps even more importantly, to extrapolate from what I have experienced into new situations.  Believe me folks, I am not that comfortable talking about myself to this degree, but the relevance is in an observation made by AMR in the university study that “… industry has stated its most pressing need: the additional capabilities required for most advanced supply chain organizations demand a different academic experience that educates generalists.”  So while I am encouraged to see the emergence of so many supply chain programs at universities, let us not forget to give broad knowledge and skills to the graduates of these programs, particularly a grounding in Finance and Economics.

My greatest satisfaction in all of this is seeing some of the amazing supply chain talent in our customers and prospects, and the emergence of supply chain organizations in most of the AMR Top 25.  Even some companies in the $100M annual revenue range (which are not included in the AMR Top 25) have come up with amazing concepts and ideas on customer or demand segmentation, methods of collaborating in an outsourced environment, and methods of ensuring maximum on-time delivery at minimum cost, to name just a few.

Yet it is quite clear from the first AMR diagram in this blog that we have a long way to go before the supply chain function has a broad enough scope to manage the end-to-end supply chain within an organization, let alone across organizational boundaries.

What is your experience?  Is the need for supply chain management skills being recognized within your company?  What about your experience at some of the universities?  Did the program prepare you for the “real” world?

Posted in Milesahead, Supply chain management


The SCM SaaS debate: multi-tenant vs. multi-instance

Published March 24th, 2010 by Lori Smith 0 Comments

Posted on behalf of Rob Bell, Senior Director, Service Ops and IT, Kinaxis.

We are becoming more and more familiar with the key benefits of SaaS.

To deliver these benefits, should a SaaS provider use a multi-tenant architecture? What are the pros and cons of this architecture?

Multi-tenancy, if done correctly, can drastically reduce costs for the SaaS provider. If an application can service multiple customers on ONE piece of hardware, with ONE OS license and ONE database license, it is far less expensive to operate than a multi-instance architecture. In addition, the cost of resiliency investments like server clustering can be kept much lower. All in all, the provider can keep its costs lower… and pass these savings on to its customers. (One hopes!)
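
A back-of-the-envelope sketch (entirely hypothetical license and hardware figures) shows where the savings come from: the fixed platform costs are paid once in the multi-tenant case but once per customer in the multi-instance case.

```python
# Back-of-the-envelope sketch with hypothetical annual costs per server stack.
customers = 50
server, os_license, db_license = 8_000, 1_000, 15_000   # assumed annual $ per stack

# One shared stack (in practice it would need to be beefier, but licenses are paid once)
multi_tenant = server + os_license + db_license
# One full stack per customer
multi_instance = customers * (server + os_license + db_license)

print(f"multi-tenant platform cost:   ${multi_tenant:,}")
print(f"multi-instance platform cost: ${multi_instance:,}")
```

The real gap narrows once the shared stack is sized for the combined load, but the license and operational consolidation is the heart of the cost argument.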

The ultimate expression of this model is Google. They have driven the cost to zero for email and personal productivity applications. They leverage a massively multi-tenant architecture that is free to users. Talk about a value proposition. We can all see what’s in it for the customers… the very best price going!

But as we all know, Google is providing a ‘commodity as a service’ which implies a very different business model from other software/service vendors. What about supply chain applications deployed to meet a particular company’s needs? What value does a multi-tenant architecture bring to them?

Well, if supply chain services were free, the case would be simple. However, there may be factors at play that are more important than cost: security, availability, performance and scalability, to name a few. How does multi-tenancy improve value to the customer in these areas?

SECURITY: There is always the possibility of a ‘leak’ in the security model of a multi-tenant architecture that could be exploited by a creative individual. Data from different customers lives in close physical proximity, in the same database. A multi-instance architecture, by contrast, is more secure by its insular nature.
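
A minimal sketch (hypothetical schema and table names) of how the two models typically isolate data illustrates the point: in the shared database, isolation depends on every query being scoped to the right tenant, while in the multi-instance model it comes from the connection itself.

```python
# Minimal sketch with a hypothetical 'orders' table, contrasting the two isolation models.
import sqlite3

# Multi-tenant: all customers share one database; every row carries a tenant_id and
# every query MUST filter on it. One missed WHERE clause exposes another tenant's data.
def get_orders_multi_tenant(conn: sqlite3.Connection, tenant_id: str):
    return conn.execute(
        "SELECT order_id, part, qty FROM orders WHERE tenant_id = ?",
        (tenant_id,),
    ).fetchall()

# Multi-instance: each customer has its own database (or schema/server), so isolation
# comes from which database you connect to rather than from query discipline.
def get_orders_multi_instance(customer_db_path: str):
    conn = sqlite3.connect(customer_db_path)   # e.g. one database file/server per customer
    try:
        return conn.execute("SELECT order_id, part, qty FROM orders").fetchall()
    finally:
        conn.close()
```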

AVAILABILITY: A reputable SaaS provider takes great care to meet the Service Level Agreement in the contract with its customers. It’s an axiomatic part of the operations of a SaaS company, and customers can depend on it. How does multi-tenancy help deliver value to customers here? Other than the risk of a massive hit to the provider’s reputation when hundreds or thousands of customers are without service, there’s nothing inherent in a multi-tenant architecture that delivers better availability. High availability is all about great design, great equipment and great process. (As I write this, I just got notified that our Salesforce.com service is down. Ouch.)

PERFORMANCE: All multi-tenant architectures are built to govern and limit read/write access to prevent resource over-utilization. This type of design can restrict flexibility whenever user extensions to standard applications are built. This is a clear limitation for customers in the multi-tenant model. Also, there are application areas that don’t lend themselves to this type of ‘governed’ model. Multi-instance might be the only alternative that can deliver satisfactory performance without the risk of compromising another customer’s performance.

SCALABILITY: Does multi-tenancy deliver scalability? It’s obvious that an application built for hundreds of customers using it simultaneously must be scalable. But in which dimensions is it built to scale? Users, data size, transaction volume? Again, the multi-tenant model does not by nature deliver this scalability; the application, whether multi-tenant or multi-instance, must be designed to scale in these dimensions… with the CPU, network cycles, and memory space to back it all up.

UPSHOT: It is clear that well-designed multi-tenancy is essential for delivering software service on a large scale where profit margins are thin. However, by its inherent nature, multi-tenancy does not have a customer value delivery advantage in many respects. On the contrary, for delivering a single-customer software service, like a supply chain solution or even a social media solution, a great single-instance design and operating environment might just be superior, with better security and dedicated resources allowing consistent performance, both inherent in the model.

WHAT DO YOU THINK?? When is multi-tenancy an ‘anti-feature’? Are you willing to pay more for features or benefits that are only available in the multi-instance model?

Posted in On-demand (SaaS)


Can you reduce inventory by rescheduling late demand?

Published March 17th, 2010 by Max Jeffrey 2 Comments

A fact of life for many manufacturers is that there are customer orders or forecasts for products that are going to be late because one or more of the components will be received late from the supplier.  In many cases, the components that are not late are received from the suppliers and held in inventory, and those suppliers have been paid or are in the process of being paid.  Furthermore, sub-assemblies may have been built and may be sitting in inventory awaiting the late component(s).  Since the end products cannot be built and delivered until all the components are received, excess inventory is being carried.

The obvious first approach is to fix the situation with the late supply, but a lot of the time this cannot be accomplished.  Many manufacturers have strategies in place to mitigate this type of situation, such as vendor-managed inventory, lean manufacturing, and schedule sharing with suppliers, but regardless, at many of the customers I have worked with I have seen a lot of late and past-due end-product demand.

So when late supply, and therefore late end-product demand, is inevitable, what is the best way to deal with the situation and reduce inventory?  From my point of view, to effectively plan and reduce inventory, there are some key capabilities required:

  1. Ability to identify the gating components and determine when they will be available.  Required here is a tool to easily identify gating components, as well as an effective way to collaborate with suppliers and get reliable commitment dates from them.
  2. Visibility into the gating components far enough into the future that the end-product demand can be rescheduled and purchase orders and production orders for other components can be delayed in time to realize inventory savings.
  3. Capability to determine whether rescheduling the demand is worth the inventory savings, given the administrative effort involved as well as the change and disruption at the suppliers.  This implies an ability to simulate the change and calculate the potential inventory savings as well as the amount of rescheduling that will need to be executed (see the sketch after this list).
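
As a rough illustration of these capabilities, here is a minimal sketch (hypothetical components, dates, and cost assumptions): it identifies the gating component for an order, treats its availability date as the new need date, and estimates the carrying-cost savings from delaying the other components to arrive just in time for that date.

```python
# Minimal sketch with hypothetical data: find the gating component, reschedule the
# end-product demand to its availability date, and estimate the carrying-cost savings
# from delaying the other components to match.
from datetime import date

ANNUAL_CARRYING_RATE = 0.20  # assumed 20% annual inventory carrying cost

# component -> (committed availability date from supplier, extended cost in $)
components = {
    "display":    (date(2010, 4, 5),  120_000),
    "main_board": (date(2010, 5, 14), 300_000),   # the late part
    "enclosure":  (date(2010, 4, 1),   60_000),
}

# 1. Identify the gating component and when it will be available.
gating_part, (gating_date, _) = max(components.items(), key=lambda kv: kv[1][0])

# 2. Estimate how long each other component would sit in inventory if received as
#    originally planned, and what delaying it to the gating date would save.
savings = 0.0
for part, (avail, cost) in components.items():
    if part == gating_part:
        continue
    idle_days = (gating_date - avail).days
    savings += cost * ANNUAL_CARRYING_RATE * idle_days / 365

# 3. Weigh the savings against the rescheduling effort and supplier disruption.
print(f"Gating component: {gating_part}, available {gating_date}")
print(f"Estimated carrying-cost savings from rescheduling: ${savings:,.0f}")
```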

I would like to know your thoughts on this subject. If this situation is applicable to your manufacturing operations, how do you deal with it? What tools or applications do you have that assist you in effectively managing late supply against customer satisfaction and inventory levels?

Posted in Inventory management


The supernatural supply chain – it’s a SCARY place!

Published March 16th, 2010 by John Westerveld 0 Comments

A few months ago, I blogged about zombies in the supply chain. Now, it turns out, zombies aren’t your only problem: you also need to worry about ghosts!

I heard about this issue on TWIT (This Week in Tech, a technology podcast).  I followed up and found this story from the LA Times.  The LA Times story is based on a post on Bunnie Huang’s personal blog.

It goes something like this…

Bunnie Huang, founder of Chumby Industries, was called in to look at a quality problem with one of his products, the Chumby One, a handheld digital device. It turns out that the memory card being used in the product failed the quality tests. The failing memory cards were all Kingston-branded and all from a single batch.  When Bunnie tried to exchange the cards from that batch, Kingston refused because the memory cards had already been programmed.

Not to be dissuaded, Bunnie did some detailed (and I mean DETAILED; check out Bunnie’s post to see the lengths he went to) investigation and was able to determine that the defective cards were very likely produced on the same machines as the certified Kingston memory. This led Bunnie to believe that the microSD cards he had been sold had been run in a “ghost shift”.  A ghost shift is when a rogue worker walks into the factory after hours and runs off a couple hundred units of a product without the knowledge or consent of the factory.  Further, there are no quality control checks made on the finished product, and the products are often made with rejected materials.  When presented with this evidence, Kingston decided to exchange Bunnie’s defective chips for new ones.

This raises issues for both component buyers and component suppliers:

For component buyers, the source of your supply is as important as the brand of your supply.  There have been numerous stories outlining the risk and impact of counterfeit components. Similarly, we need to be aware of the risk from parts that aren’t really counterfeit: they are actually produced in the same factory, on the same machines, but are not certified by the brand owner.  In Bunnie’s case, he was very lucky that the QA process caught the bad parts before they went out to his customers.  This won’t always happen.  The flaws may well show up weeks or months after the customers get their hands on the products.  Depending on the nature of the flaws, the impact could be anywhere from an inconvenience to a full-blown disaster (à la Toyota).  When buying components, make sure that your supply source is a reputable dealer. You may end up paying a bit more, but you have a better chance of getting what you paid for.

For component manufacturers, make sure your equipment is being used only to run the components you have authorized.  How many customers would have had the perseverance and technical wherewithal to do the analysis that Bunnie did?  Most would have chalked up the bad chips to poor quality on the part of the manufacturer, in this case Kingston; after all, the chips were Kingston branded, right?  Companies such as Kingston that have (and deserve) a stellar quality reputation can see that market perception erode if rogue branded products start entering the market.

Do you have a similar story to tell?  Have you run into counterfeit products?  Respond back and let us know.

Posted in Inventory management, Supply chain risk management