Forecasting & Planning Learnings from Day 2 of IBF Academy: An Attendee's Perspective
Wed, 16 Sep 2015

Last month, I had the opportunity to attend IBF's Business Forecasting & Planning Academy held in Las Vegas. I recently shared some insights from the first day of the program. Day 2 was similarly eventful. Here are some highlights.

Forecast Error

The first session I attended on Tuesday was "How to Measure & Reduce Error, and the Cost of Being Wrong," an advanced session presented by Dr. Chaman Jain from St. John's University. Dr. Jain reviewed the basic methods and mechanics of computing forecast error and the pros and cons of each technique. It was interesting to learn that IBF has found more and more companies moving from MAPE (Mean Absolute Percentage Error) to Weighted MAPE (WMAPE) to focus attention on the errors with the largest business impact. Standard MAPE treats all errors "equally," while WMAPE places greater significance on errors associated with the "larger" items. The weighting mechanism can vary; unit sales are typically used, but I was intrigued by the notion of weighting by sales revenue or profit margin as well. If a company has low-volume items that are big revenue and profit generators, it would not want to miss the opportunity to focus attention on why those items carry significant errors.
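For readers who want to see the difference concretely, here is a minimal sketch in Python. The metric definitions are standard; the function names and numbers are mine for illustration:

```python
def mape(actuals, forecasts):
    """Mean Absolute Percentage Error: every item counts equally."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals)

def wmape(actuals, forecasts, weights=None):
    """Weighted MAPE: errors weighted by volume, revenue, or margin.
    With weights equal to unit sales, this reduces to sum|error| / sum(actuals)."""
    if weights is None:
        weights = actuals  # default: weight by unit sales
    weighted = sum(w * abs(a - f) / a for a, f, w in zip(actuals, forecasts, weights))
    return 100 * weighted / sum(weights)

# A big item missed by 10% and a small item missed by 50%:
# MAPE averages the two misses equally; volume-weighted WMAPE
# pulls the figure toward the big item's smaller miss.
print(mape([100, 10], [90, 5]))   # ~30
print(wmape([100, 10], [90, 5]))  # ~13.6
```

Swapping revenue or profit margin in for `weights` is what surfaces the low-volume, high-value items Dr. Jain mentioned.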

Another interesting concept Dr. Jain discussed was the use of confidence intervals around error measurements. Many companies report their error measurement as a single number and rarely present it as a range of likely errors. Having a view into the potential range of errors allows firms to use scenario planning to understand the impact on supply chain operations and the associated sales under multiple forecast errors instead of a single number.
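To make the idea concrete, one simple way (my own sketch, not from the session) to turn a history of forecast errors into a range is to take empirical quantiles of past errors:

```python
def error_range(errors, coverage=0.8):
    """Empirical range of past forecast errors (e.g., in percent).
    Returns the lower and upper bounds covering roughly `coverage`
    of the observed errors, for use in scenario planning."""
    s = sorted(errors)
    lo = int((1 - coverage) / 2 * (len(s) - 1))
    hi = int((1 + coverage) / 2 * (len(s) - 1))
    return s[lo], s[hi]

# Feed the low/high bounds into supply scenarios instead of only
# the single average error the monthly report shows.
```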

My last takeaway relates to the question of how much history should be used to support time series analysis. Dr. Jain stated, and I believe rightly so, that it depends. Is there potential seasonality, trend, a business cycle, or one-time events? How much history does one need to see these? What if the past is really no longer a good indicator of the future? What if the drivers of demand for a product have substantially shifted? One suggested technique that seems sound is to test the forecasting model's performance using different periods of historical data. Use a portion of the history to build the model, and the remaining portion to test the accuracy of the forecast against the actuals held out of model construction. Try different lengths until you find the one that yields the lowest error, and allow the process to use a different history length for each time series forecast.
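The holdout test can be sketched in a few lines. This is my own toy illustration (a one-step-ahead moving-average forecaster; a real implementation would try richer models):

```python
def moving_average(history):
    """Forecast the next period as the mean of the window."""
    return sum(history) / len(history)

def best_history_length(series, holdout=4, forecaster=moving_average):
    """Hold out the last `holdout` periods, forecast them using rolling
    windows of different lengths, and keep the length with the lowest
    mean absolute error on the holdout."""
    train_end = len(series) - holdout
    best_len, best_mae = None, float("inf")
    for length in range(2, train_end + 1):
        errs = [abs(series[t] - forecaster(series[t - length:t]))
                for t in range(train_end, len(series))]
        mae = sum(errs) / len(errs)
        if mae < best_mae:
            best_len, best_mae = length, mae
    return best_len, best_mae

# Run this per time series so each item keeps its own history length.
```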

Lean Forecasting & Planning

Next I attended another advanced session led by Jeff Marthins from Tasty Baking Company/Flowers Foods on "Lean Forecasting & Planning: Preparing Forecasts Faster with Less Resources". The session focused on doing more with less, a common theme that has permeated the business world these last several years. Marthins' session was really about focusing on what matters in demand planning: looking at the overall process, agreeing to and sticking with the various roles and responsibilities in the process, and understanding how the resulting forecasts and plans will be used by the various consumers in the business, which in turn drives the level of detail, accuracy, and frequency of updates.

To gain an understanding of the demand planning process, Marthins asked the participants to look at a picture of his refrigerator and answer “Do I have enough milk?” This relatively simple, fun question elicited numerous inquiries from the participants around consumption patterns, replenishment policies and practices, sourcing rules, supplier capacity and financial constraints that illustrated the various types and sources of information that are required to develop a solid, well-thought-out demand plan. It was a very effective approach that can be applied to any product in any company.

To illustrate the need to understand the level of accuracy required of a forecast, Marthins used the weather forecast. How accurate is the weather forecast? How often is it right? How precise does it need to be? Once we know the temperature is going to be above 90 degrees Fahrenheit, does it matter if it is 91 or 94 degrees? Is there a big difference between a 70% chance of rain and an 85% chance? What will you do differently with a more precise weather forecast? Should I plan to grill tonight? Will I need to wear a sweater this evening? Can we go swimming? If the answer is nothing, then the precision does not really matter, and spending time and effort creating or searching for greater forecast accuracy is a "waste", and wastes should be eliminated or reduced in Lean thinking.

Marthins also stressed the value of designing your demand planning process with the usage of information in mind. Adopting a Forecast Value Add (FVA) mentality, assessing whether each step in your forecasting and demand planning process is adding value, will help to accomplish this. Start by asking if the first step in your forecasting process results in greater accuracy than a naïve forecast, such as reusing the number from the last time you forecasted, or a simple moving average. When your accuracy improves with each step in the process, is it worth the effort or time it takes? Can I be less accurate and more responsive and still not have a negative impact? If I can update my forecast every day with 90% accuracy versus once a week with 92% accuracy, or once a month with 96%, which is better? How responsive can I be to the market by making daily adjustments that are nearly as accurate as weekly ones?
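The FVA idea is easy to operationalize. A minimal sketch, assuming MAPE as the accuracy metric (the step names and numbers are hypothetical):

```python
def mape(actuals, forecasts):
    """Mean Absolute Percentage Error."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals)

def forecast_value_add(actuals, naive, steps):
    """FVA per process step, in MAPE points versus a naive baseline.
    Positive means the step beat the naive forecast; negative means
    the step is 'waste' in the Lean sense."""
    base = mape(actuals, naive)
    return {name: base - mape(actuals, fcst) for name, fcst in steps.items()}

# Example shape: steps = {"statistical": [...], "sales override": [...]}
fva = forecast_value_add([100, 100], [90, 110], {"statistical": [95, 105]})
# The statistical step halves the naive error, so its FVA is ~+5 points.
```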

In yet another session, the topic of scenario analysis was raised. The team at IBF is getting this one right by making sure it is discussed in multiple sessions. What I wonder is how many companies are adopting scenario analysis in their demand planning and S&OP processes? From my experience it is not the norm. Marthins suggested testing the impact that various forecasts, and hence forecast accuracies, would have on supply chain performance, and even using scenario analysis to understand whether a systematic bias, either high or low, might make sense. I have known companies that employed a policy of deliberate overestimation to ensure their resulting demand plan was on the high side. Carrying more inventory, even with all the associated costs, was of greater benefit to the company than a lost sale or backorder. Bias is not a bad thing if you understand how it is used and its resulting impact, just as inventory is not an evil when used in a planned and methodical manner.

Data Cleansing

After lunch I attended my second session delivered by Mark Lawless from IBF, "Data Cleansing: How to Select, Clean, and Manage Data for Greater Forecasting Performance". As in any analytical process, the quality of the inputs is crucial to delivering quality results. Unfortunately I had another commitment during the session and could not stay for all of it.

Lawless discussed a variety of ways to look at the data available, decide if it should be used, update or modify it, fill in missing values, and apply various forecasting techniques. He offered simple reminders and tips, such as being aware of how data is provided in time periods, e.g., fiscal months (4/4/5) or calendar months, and how it should be reported. It was a good reminder to make sure the data inputs are clearly understood, as well as how the output from the forecasting process will be used.
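As a small illustration of why calendar awareness matters, here is how a 4/4/5 fiscal calendar maps weeks to fiscal months. This is a sketch under the simplifying assumption of exactly 52 weeks and no restatement weeks:

```python
def fiscal_month_445(week):
    """Map a week number (1-52) to its fiscal month under a 4/4/5
    calendar: each 13-week quarter is split into months of
    4, 4, and 5 weeks."""
    quarter, week_in_q = divmod(week - 1, 13)
    month_in_q = 0 if week_in_q < 4 else (1 if week_in_q < 8 else 2)
    return quarter * 3 + month_in_q + 1

# The third month of each quarter has five weeks, so comparing its
# totals directly against a four-week month overstates "growth" by ~25%.
```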

While most of what I heard was related to the data going into the forecasting process, Lawless did spend time talking about various analytics for assessing the output of the process. You might be expecting me to talk about various error and bias metrics again, but that is not the case. Rather, the idea is to look at the error measurement over time. What is the distribution of errors? Do they show a pattern, or are they random? If there is a pattern, there is likely something "wrong" with the forecasting process. It made me think about the application of Statistical Process Control (SPC) techniques, which are most often applied to manufacturing processes but can be applied to any process. SPC control charts can be used to check for patterns such as trends, systematic sustained increases, extended periods of unexpectedly high or low errors, non-randomness of errors, and more. It gets back to the notion that in order to improve the quality of the demand planning process, it must be evaluated on a regular basis and the causes of its underperformance understood and corrected as much as possible or warranted.
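A Shewhart-style control chart on forecast errors is straightforward to sketch (my own illustration, not from the session):

```python
import statistics

def out_of_control(errors, k=3):
    """Flag periods whose forecast error falls outside mean +/- k sigma.
    Errors inside the limits that still show a pattern (runs, trends)
    also deserve investigation, as in classic SPC run rules."""
    mean = statistics.mean(errors)
    sd = statistics.stdev(errors)
    lo, hi = mean - k * sd, mean + k * sd
    return [i for i, e in enumerate(errors) if e < lo or e > hi]

# A run of small errors followed by one large miss: only the
# final period breaches the tighter 2-sigma limits.
print(out_of_control([0, 1, -1, 0, 1, -1, 0, 50], k=2))
```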

Regression Analysis/ Causal Modeling

The final advanced session of the Academy was delivered by Charles Chase from the SAS Institute on “Analytics for Predicting Sales on Promotional Activities, Events, Demand Signals, and More”.  This session was about regression modeling on steroids.  As someone who has used regression models throughout my career I could easily relate to and appreciate what Chase was discussing.  In two hours Chase did a great job exposing attendees to the concepts, proper use, and mechanics of multivariate regression modeling that would typically be taught as an entire course over weeks.

While time series models are a staple for forecasting future demand, they provide little to no understanding of what can be done to influence demand to be higher or lower. They can be used to decompose demand into components such as trend, seasonality, and cycles, which are important to understand and respond to. They are focused on the "accuracy" of the predicted future. Regression models, however, describe how inputs affect outputs, which makes them an excellent tool for shaping demand. Regression models can help us understand the effect that internal factors such as price, promotional activity, and lead times, as well as external factors such as weather, currency fluctuations, and inflation rates, have on demand. The more we can build predictive models of demand based on internal factors, the more we can influence the resulting demand, as these are factors we control or influence as a firm. If external factors are included, forecasts of the future values of these inputs will be needed, and we become more reliant on the accuracy of those input forecasts to drive our modeled demand.
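To give a flavor of what Chase covered, here is a bare-bones multivariate regression fit by ordinary least squares via the normal equations. This is a pure-Python sketch with made-up data; in practice you would use a statistics package, and the drivers, coefficients, and numbers below are illustrative only:

```python
def ols(X, y):
    """Solve (X'X) b = X'y by Gauss-Jordan elimination.
    Rows of X are observations: [1, driver1, driver2, ...]."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):
        piv = A[i][i]
        A[i] = [v / piv for v in A[i]]
        b[i] /= piv
        for m in range(k):
            if m != i:
                f = A[m][i]
                A[m] = [u - f * v for u, v in zip(A[m], A[i])]
                b[m] -= f * b[i]
    return b

# Hypothetical history generated by: demand = 100 - 5*price + 20*promo_flag
X = [[1, 1, 0], [1, 2, 1], [1, 3, 0], [1, 4, 1]]
y = [95, 110, 85, 100]
intercept, price_coef, promo_lift = ols(X, y)
# price_coef ~ -5 (demand drops ~5 units per unit of price) and
# promo_lift ~ +20 (a promotion adds ~20 units): levers we can pull.
```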

In case you missed it, you can see pictures from the 2015 IBF Academy HERE.

I trust I have brought some insight into IBF's recent Academy in Las Vegas and perhaps offered a nugget or two for you to improve your forecasting and demand planning activities. If only I had learned something to bring forecasting success to the gaming tables :).

New Learnings from Day 1 of IBF's Business Forecasting & Planning Academy: An Attendee's Perspective
Tue, 25 Aug 2015

Last week I had the opportunity to attend IBF's Business Forecasting & Planning Academy held in Las Vegas. The two days were filled with fourteen educational sessions, three roundtable discussions, and multiple opportunities for connecting with peers and instructors.

Each educational session, organized as introductory or advanced level, was two hours in length, allowing for a deeper dive into content with plenty of opportunity for participant interaction. The instructors were academics, industry practitioners, and software providers, giving attendees a nice blend of viewpoints and experiences.

The first session I attended on Monday was conducted by Dr. Larry Lapide from MIT on Designing and Implementing a Successful Collaborative Demand Forecasting Process. The introductory-level session was hands-on and highly interactive. Participants were placed into four teams and asked to work through a case study with questions around organizational design of the demand planning function, reporting needs of the Sales & Marketing, Operations, and Finance organizations, and the various forecasting methods to employ. Dr. Lapide "challenged" the answers provided by the teams in a manner that allowed for deeper understanding and awareness.

One of my takeaways from the session, and one I heard in several others, is the ongoing challenge companies face in not replacing the unbiased, unconstrained statement of demand, or for that matter the demand plan, with the financial budget. Too often firms ignore the demand signals in the market and turn the projection of future demand (the forecast) into a demand plan that simply mirrors a financial budget created weeks, months, or even quarters before.

Another takeaway was the reminder to design a forecasting process that incorporates multiple methods based upon the various characteristics of the customers, markets, channels, and products. Applying segmentation approaches prior to selecting techniques such as time series forecasting, lifecycle forecasting, and collaboration to gain real-time knowledge and expertise will allow for a more robust and effective process tailored to the needs of each segment.

Next I attended the introductory session How to Sell Forecasts to Top Management and Understand the Power of One Number Planning, given by Jeff Marthins, Director of Supply Chain Operations, Tasty Baking Company/Flowers Foods. This was a very pragmatic session, with Marthins sharing Tastykake's journey with S&OP, which began in 2005. He spoke about the value of running the business from one set of numbers and using the budget as a benchmark rather than as the demand plan or forecast. He made it clear that the forecasts need to be in terms that the various consumers of the information can relate to: revenue, units, capacity, and so on.

I was intrigued by one of his questions related to demand planner capabilities: if you could pick between analytical or communication skills which would you choose? While both are needed, I believe the analytical skills are the easier of the two to become good at. I would start with solid communication skills. To develop a comprehensive plan that is adopted, a demand planner needs to be an excellent listener, taking information and insights from various sources; an engaging and thoughtful facilitator to guide consensus dialogues; and a crisp, clear, and confident speaker to communicate and defend the rationale for the demand plan being presented and ultimately agreed to by senior leaders and stakeholders.

Marthins discussed the need to spend more time understanding why the plan differs from actual demand. Was the forecast and/or demand plan low or high because of promotional lift errors; unforeseen market changes; new product launch timing, trajectory, or cannibalization estimates for existing products; or outside influences such as weather and competitor actions, to name just a few? Root cause analysis is something that we, as a supply chain planning and analysis community, need to do more. Demand plans and forecasts will always be wrong. Hopefully over time they will become more and more accurate. But if we are not researching the reasons why our plans and KPI targets are not being met, we should not have high expectations that they will be achieved in the future.

I had a huge smile and kept nodding my head when Marthins started praising the need for, and benefits of, scenario management and contingency planning as part of the S&OP process. While the output of an S&OP cycle is typically an agreed-to set of numbers, those numbers should not be obtained by looking at only one set of "inputs". Understanding the implications of various scenarios with changes to demand and supply is needed to reach a comprehensive understanding and agreement on a course of action. Scenario management is an excellent means to show decision makers the impact of their opinions about the future while keeping the discussion fact-based. Contingency planning allows for a higher degree of responsiveness when risk mitigation actions need to be put in place.

The final session of the day I attended was presented by Mark Lawless, Senior Consultant from IBF, on Long Term Demand Planning & Forecasting: The Key to Corporate Strategic Planning. Lawless did a nice job throughout the session educating the attendees on the differences between long-term (three to five years) and short-term demand planning and forecasting. It was helpful to be reminded of the difference between a forecast (an unbiased prediction or estimate of an actual value at a future time) and a demand plan (a desired outcome at a future time). Time was spent discussing how firms can shape future demand, the more aggregated levels of detail to plan with, and the need to engage external subject matter experts in the planning process.

Looking three to five years into the future is not just about applying a time series technique. Companies must rely on internal and external domain experts to assess potential changes in markets, competitors, customers, and consumers; technology and business cycle impacts; shifts in demographics and the regulatory environment; and many other areas of potential impact. Thinking about where competition will come from is not always obvious. Five or more years ago, would camera manufacturers have seen their market being challenged by smartphones? Not directly related to the event, but I was intrigued enough to search for more: in 2000, 86 billion photos were taken, 99% of them analog (film); in 2011, over 380 billion photos were taken, only 1% analog. If you were the long-range demand planner for camera film, would you have seen this coming? Another crazy statistic, showing that history alone is not always a great gauge for developing future demand plans: in 2011 we snapped as many photos in two minutes as humanity as a whole snapped in the 1800s. Would this long-range trend have been detected by a time series technique?

Long range demand planning requires us to understand the drivers of our demand even more so than short term demand. Our ability to respond to short term sharp changes may be limited, while changes in long term demand can be addressed. Regression, ARIMA, or ARIMAX models are very helpful in this area. Developing models that help explain demand as a function of price, feature/function, market trends, economic factors, age, income, education, marketing, and numerous others allows us to not only see the impact to demand of changes in these variables, but enables us to determine the levers to pull to shape the demand in our favor.

See my next post on the highlights from day two of the Academy. Your thoughts and feedback are always welcomed!  You can also see pictures from the IBF Academy HERE.
