My Favorite Forecasting Model
https://demand-planning.com/2022/06/24/my-favorite-forecasting-model/
Fri, 24 Jun 2022

One of the questions I get asked most frequently is “What is your favorite forecasting model?” My answer is “it depends” because not all problems need a hammer. Sometimes you need a wrench or a screwdriver, which is why I advocate having a forecasting toolbox that we can draw on to tackle whatever forecasting project arises.

When it comes to forecasting methods, we have everything from pure qualitative methods to pure quantitative methods. On the far left of the image below you’ll see judgmental, opinion-based methods with knowledge as the inputs. On the far right, we have unsupervised machine learning – AI, Artificial Neural Networks etc. – where the machine decides on the groupings and optimizes the parameters as it learns from the training data. In between these two extremes we have naïve models, causal/relationship models, and time series models.

All of the models should be in our toolbox as forecasters.

But with dozens of methods available to you, how do you decide which ones to use? There are cases when sophisticated machine learning will help you and there are cases when pure judgement will help, but somewhere in between the extremes is where you’ll find the models you’ll need on a day-to-day basis.

Picking The Right Model For A Particular Forecast

The main thing is to have a toolbox full of different methods that you can draw on depending on the data available and the resources you have. We must balance 3 key elements when choosing a model:

Time available: How much time do you have to generate a forecast? Some models take longer than others.

Interpretability of outputs: Do you need to explain how the model works to stakeholders? Outputs of some models are difficult to explain to non-forecasters.

Data: Some models require more data than others and we don’t always have sufficient data.

For example, a sophisticated machine learning model could take months to build and train, plus extra time for it to provide a useable output. When a forecast is needed now, this kind of model won’t help. Similarly, if you have little or no data, as with new products, you may have to use judgmental methods.

Balancing interpretability and accuracy is also key. There are models whose accuracy can be finetuned to a great degree, but as we become more accurate, interpretability (explaining the rationale behind the number) often becomes more difficult. Artificial Neural Networks, for example, can be very accurate, but if you need to explain to partners in the S&OP process or to company execs how the model works and why they should trust it, well, you might have some difficulty.

Time series models like regression or exponential smoothing are much easier for stakeholders to understand. So what kind of accuracy do you need? Do you need 99% accuracy for a particular forecast, or is some margin of error acceptable? Remember that there are diminishing returns to finetuning a model for accuracy – more effort doesn’t necessarily provide more business value.

This is why the best model depends on the context you’re working in.

 

Judgmental Methods

These are not sophisticated but they have their place. When I have no historic data, i.e. for a new product or customer, I have nothing to forecast with. Remember, human judgement based on qualitative factors is a forecast and it’s better than no forecast at all.

Judgments are also important in overriding statistical forecasts when an external variable emerges that isn’t accounted for. A model doesn’t know if you’ve just opened a new store or if supply constraints just unexpectedly emerged. Of course, human judgement has bias – be sure to identify it if you’re using judgmental models. In the Judgmental category we have:

Salesforce Method: This involves asking what salespeople think about future demand based on their knowledge of the market and customers.

Jury Method: This simply involves surveying stakeholder’s opinions and letting the consensus decide what future demand is likely to be.

Delphi Method: A more systematic version of the Jury Method where stakeholders blindly submit their estimates/forecasts. You then take a mean of the responses, which is a more robust/accurate method than you might think.

Time Series Models

58% of planning organizations use time series methods. This approach is popular because we all have the data we need for it – we can use sales data or shipment data. Also, our colleagues in Finance, Inventory Management and Production can all use these forecasts. Here we identify patterns (level, trend, seasonality) and extrapolate them going forward.

The key assumption here is that what happened in the past is likely to continue into the future. This means these methods work best in stable environments with prolonged demand trends. They don’t perform so well with volatile products/customers or new products, and they don’t explain noise.

Averaging Models

Instead of using one single data point like a naïve forecast, here we use multiple data points and smooth them, the theory being that this provides a more accurate value. In this category we have simple moving averages and exponential moving averages. The difference between the two is that the SMA simply calculates an unweighted average of the data, while the EMA applies more weight to more recent data points.
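As a minimal sketch of the two averages (the 12-point demand series, the 3-period window, and the smoothing constant are invented for illustration):

```python
# Toy demand history, oldest first
demand = [100, 104, 98, 110, 107, 112, 105, 118, 121, 115, 124, 130]

def sma(series, window=3):
    """Simple moving average: equal weight to the last `window` points."""
    return sum(series[-window:]) / window

def ema(series, alpha=0.5):
    """Exponential moving average: weight decays by (1 - alpha) per period."""
    value = series[0]
    for point in series[1:]:
        value = alpha * point + (1 - alpha) * value
    return value

print(sma(demand))   # equal-weighted average of the last 3 points
print(ema(demand))   # recent points weighted more heavily
```

Because this toy series trends upward, the EMA sits above the SMA – the extra weight on recent periods picks the trend up faster.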

Decomposition Models

These models separate out the level, trend, seasonality, and noise components, and add them back together for a forward-looking projection. It’s a good statistical method for understanding the seasonality and trend of a product.

Exponential Smoothing  

These are the most widely used methods and include single and double exponential smoothing, with the Holt and Winters models being common choices. There is also Holt-Winters, a combination of the two that models level, trend, and seasonality, so we capture all 3 attributes of the time series while weighting past observations on an exponentially decaying curve.

Where a naïve model takes a single point and an averaging model treats multiple points equally, here we take multiple points and weight them differently, considering level, trend and seasonality. I find this to be a very versatile model that is appropriate for a lot of data sets. It’s easy to put together, can be used with relatively little data, and is easy to interpret and explain.
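A compact sketch of double exponential smoothing (Holt’s level-plus-trend model) shows how little is needed to put one together; the toy history and the smoothing constants alpha and beta are assumed values:

```python
def holt_forecast(series, alpha=0.5, beta=0.3, horizon=1):
    """Holt's double exponential smoothing: smooth a level and a trend,
    then extrapolate `horizon` periods ahead."""
    level, trend = series[0], series[1] - series[0]
    for point in series[1:]:
        last_level = level
        level = alpha * point + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
    return level + horizon * trend   # project level plus trend forward

history = [100, 105, 111, 114, 120, 126]   # steady upward trend (invented)
print(holt_forecast(history, horizon=1))
```

With an upward-trending series, the one-step forecast lands above the last actual, which is exactly what a level-plus-trend model should do.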

Going Beyond Time Series Models 

Not all data is time related or sequential, and not all information is necessarily contained within a dataset. Causal or relationship methods assume that there is an external variable (a causal factor) that explains demand in a dataset. Examples of causal factors include economic data like housing starts, GDP, weather etc. Relationship models include penetration and velocity models, where you add variables to a model.

These carry on nicely from exponential smoothing models that identify level, trend, seasonality and noise. Causal models can explain the noise and identify whether there is an external variable (or several) at work. This is useful when there is a lot of noise in your data. Generally speaking, these models are useful alongside time series models to explain the consumer behavior changes that are causing the changing demand patterns/noise.
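As an illustration of a causal model at its simplest, a one-variable least-squares regression relates demand to an external driver; the housing-starts and demand figures below are invented, and made deliberately linear so the fit is easy to check:

```python
def ols(x, y):
    """Ordinary least squares for one explanatory variable.
    Returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

housing_starts = [1.10, 1.25, 1.05, 1.40, 1.30]   # causal factor (millions, invented)
demand         = [220,  250,  210,  280,  260]    # units sold (invented)

slope, intercept = ols(housing_starts, demand)
forecast = slope * 1.35 + intercept   # demand forecast if starts hit 1.35M
print(forecast)
```

The value of the causal view is that the forecast now moves with a driver you can monitor (or scenario-plan) rather than only with the calendar.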

Machine Learning Models

Machine learning models take information from a previous iteration or training data set and use it to build a forecast. They can handle multiple types of data, which makes them very useful. There are interpretability issues with these models, however, and there is a learning curve when it comes to using them. But it’s not too difficult to get started with the basics – Naïve Bayes is a good place to start.

Clustering Models

Clustering, a form of segmentation, allows us to put data into smaller, more manageable sub-groups of like data. These subgroups can then be modeled more accurately. At a simple level, classification can be the Pareto rule, or it can be more complex, like hierarchical clustering using a dendrogram (a tree diagram showing how points nest into groups by distance) or K-means, where we group data points based on their distance from a central point. They’re all ways of breaking up large data sets into more manageable groups.
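The Pareto rule is the simplest of these, and can be sketched as an ABC segmentation; the SKU volumes and the 80%/95% cut-offs below are illustrative assumptions:

```python
def abc_classes(volumes, a_cut=0.80, b_cut=0.95):
    """Rank items by volume; the items covering the first 80% of cumulative
    volume are 'A', the next 15% 'B', and the tail 'C'."""
    total = sum(volumes.values())
    running, classes = 0.0, {}
    for sku, vol in sorted(volumes.items(), key=lambda kv: kv[1], reverse=True):
        running += vol / total
        classes[sku] = "A" if running <= a_cut else ("B" if running <= b_cut else "C")
    return classes

# Invented SKU volumes for illustration
volumes = {"SKU1": 500, "SKU2": 290, "SKU3": 130, "SKU4": 50, "SKU5": 30}
print(abc_classes(volumes))
```

Each class can then get its own forecasting treatment – statistical rigor for the A items, something much lighter for the C tail.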

Picking the Best Model

Understand why you’re forecasting. Think about how much time you have, the data you have, your error tolerance and the need for interpretability, then balance these elements. Start simple (naïve might get you there) and work from there. You might need a hammer, screwdriver, or wrench – be open to using all the tools in your toolbox.


To add the above-mentioned models to your bag of tricks, get your hands on Eric Wilson’s new book Predictive Analytics For Business Forecasting. It is a must-have for the demand planner, forecaster or data scientist looking to employ advanced analytics for improved forecast accuracy and business insight. Get your copy.

 

What Forecasters Can Learn From Punxsutawney Phil On Groundhog Day
https://demand-planning.com/2021/02/02/what-forecasters-can-learn-from-punxsutawney-phil-on-groundhog-day/
Tue, 02 Feb 2021

Along with the 14th of March, today should be our field’s public holiday. Why? Because today we celebrate one of the greatest forecasters and prognosticators in history, Mr. Punxsutawney Phil, our field’s hairiest analyst, who provides his annual forecast on one of America’s quirkiest holidays. That day is, of course, February 2nd, and today is once again Groundhog Day.


His methods and results might be questionable but, as practitioners of business forecasting and demand planning, we should give him respect. He only comes out once a year to make a prediction – it really doesn’t matter if he is right or not – and people throw him a party and celebrate what he says regardless. If only we were treated with the same respect at our companies.

For those living in a groundhog hole, or living outside of our borders and wondering what these celebrations are about (I don’t blame you), every year we wait anxiously for the forecast of a famous groundhog from the little town of Punxsutawney, Pennsylvania. Legend has it that, since 1887, if the groundhog named Punxsutawney Phil sees his shadow on February 2nd, six more weeks of winter weather lay ahead; no shadow indicates an early spring. Contrary to popular belief, Phil doesn’t actually have to see his shadow; he just has to cast one to make his wintery prophecy.

The new holiday for forecasting professionals? IBF wishes you a Happy Groundhog Day.

Phil Prefers a Random Walk Model

If we think about this, what Phil is doing is pretty much a naïve forecast. A naïve model is one in which minimal effort and data manipulation are used to prepare a forecast. The commonly used example is a random walk, which uses the current or prior value as the forecast for the next period.

What Phil is doing in his forecast is taking the current period, February 2nd, and using that to forecast the next period, or next six weeks. It is easy to make the mistake of viewing his “shadow model” as more of a causal model, based on a cause-and-effect relationship in which the independent variable of his shadow is correlated with the length of winter. In actuality he is relying more on time series data and simply doing a random walk, using current observations to predict the next period.
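In code, Phil’s random walk is about as simple as a forecast gets – the next period is just the current observation (the toy outcome history is invented):

```python
def random_walk_forecast(history):
    """Random-walk naïve forecast: carry the latest observation forward."""
    return history[-1]   # next period = current period's actual

# Toy history of season outcomes, newest last (invented for illustration)
outcomes = ["early spring", "long winter", "long winter"]
print(random_walk_forecast(outcomes))
```

The same one-liner works on numeric demand data, which is why the random walk is the usual baseline any fancier model must beat.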

How Accurate Is Phil?

In 2014, Phil was right on the money. After he saw his shadow, the country endured the 37th coldest February on record (1.6°F below the 20th century average) and the 43rd coldest March (1.0°F below the 20th century average). Compared with his other predictions, however, it seems that Phil might have just got lucky.

According to the Groundhog Club’s records, Punxsutawney Phil has issued 108 forecasts of more winter and 18 of early spring. If we look at the actuals for those same periods, the data shows that 49 of Phil’s six-week prognostications were correct. Looking at the totals, his average Mean Absolute Percentage Error (MAPE) is just over 61 percent, meaning he has been correct about 39 percent of the time.

It also appears that this groundhog’s naïve or random walk prediction model is slightly better when he doesn’t see his shadow. When Phil predicted a short winter, he was much more likely to be right. Of the 18 times that he didn’t see his shadow and predicted an early spring, he got it right 8 times – that is a 44 percent accuracy rate.
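The percentages above follow directly from the Groundhog Club’s counts:

```python
# Re-deriving the accuracy figures from the recorded counts
winter_calls, spring_calls = 108, 18
correct_total = 49       # correct six-week calls overall
correct_springs = 8      # correct early-spring calls

overall_accuracy = correct_total / (winter_calls + spring_calls)
early_spring_accuracy = correct_springs / spring_calls
print(round(overall_accuracy * 100), round(early_spring_accuracy * 100))
```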

Phil Should Consider a Seasonal Model

It would appear that, when looking at the forecast accuracy, Phil is committing one of the cardinal sins of forecasting. He may be selecting a model solely to fit history – in this case, a naïve forecast fitted to the most recent observation – while ignoring other data and external factors.

Looking at the actuals, we have had 67 early springs and 59 late thaws and extended winters. I believe one of the key factors that Punxsutawney Phil may be missing is seasonality. Taking this into account, if he still wants to utilize a naïve forecast, it would serve him better to consider a seasonal random walk model that would take the prior period from the prior year instead of the current or last period.

An example of a seasonal random walk would be to use the actual from a year ago as the forecast for the next six weeks. Thus, Phil would have been correct a little over 53 percent of the time. And considering the contiguous United States just experienced its 18th consecutive year with an above-average annual temperature, Phil may be wise to use the seasonal random walk compared to the random walk model and more often predict an early spring.
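In code, the seasonal random walk just looks back one full season instead of one period; the quarterly demand numbers below are invented for illustration:

```python
def seasonal_random_walk(history, season_length):
    """Forecast the next period with the actual from the same period one
    season earlier."""
    return history[-season_length]   # same period, one season back

quarterly_demand = [80, 120, 150, 90,    # last year, Q1-Q4 (invented)
                    85, 125, 155, 95]    # this year, Q1-Q4 (invented)
# Forecast for next year's Q1 = this year's Q1 actual
print(seasonal_random_walk(quarterly_demand, season_length=4))
```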

Phil’s MAPE is Poor – Is Forecast Bias To Blame?

Another fatal flaw that may be impacting Phil’s predictions is an overly complex and politicized forecasting process. Politics and bias can be the most damaging influences on a forecast, worse than poor forecast accuracy. Looking at Phil’s results, his Mean Percentage Error (MPE) for the 126 years is 33 percent, and in the past 20 years Phil has over-forecasted (projected longer winters with spring actually coming earlier) over 40 percent of the time.

Unfortunately, Phil is not alone and any forecasting process can be degraded in various places by the biases and personal agendas of participants. It seems the more elaborate the process, with more human touch points, the more opportunity exists for these biases to pollute what should be an objective and unbiased process. That said, can we really believe a rodent when it is being manipulated by multiple human touch points and a third party is interpreting his forecast?

Do We Have an Unrealistic Expectation of Forecast Accuracy?

I refuse to repeat the mantra that “the forecast is always wrong”, but forecast accuracy is ultimately limited by the nature of the behavior of what we are trying to forecast. There is inherent variability in the data and many times inherent flaws in the process. With Phil, logic would conclude that we have roughly a little better than a 50 percent probability of forecasting an early spring. Why would we expect him to do better than that with the tools he is provided? Consider this same statement when judging the forecast accuracy of your team or your own forecasting ability.

If the behavior is wild and erratic (referring to the data, not the groundhog) with no structure or stability, then we have no hope of using it to generate a reasonable forecast, no matter how much time, money, and ceremony goes into it. The most sophisticated methods cannot forecast unforecastable behavior, and we have to learn to live with that reality.

Should We Depend On A Marmot As A Genuine Forecasting Method?

I think the ultimate lesson here is not that Phil is wrong, but that he falls prey to many of the same forecast problems that we face every day. What we can learn from Phil is how not to fall into those traps ourselves: work on taking bias out, finding the right models and the proper use of fitting, understanding variability and probabilities, and using error to improve instead of to judge. Finally, the greatest lesson Phil teaches us is that no matter how bad your forecasts are, there is still hope to become a celebrity and have your company celebrate you for what you do.

 

 

 

Product Portfolio Optimization – Journal of Business Forecasting (Special Issue)
https://demand-planning.com/2016/02/29/product-portfolio-optimization-journal-of-business-forecasting-special-issue/
Mon, 29 Feb 2016

Within the pages of this particularly exciting issue, you will read articles written by the best minds in the industry discussing multiple important aspects of Product Portfolio Optimization. This is an important topic because in today’s highly competitive market, it is becoming more important than ever to look for ways to cut costs, and increase revenue and profit. Markets are now demand driven, not supply driven.

Globalization has intensified competition. Every day, thousands and thousands of new products enter the market, but their window of opportunity is very narrow because of shorter life cycles. Plus, too much uncertainty is associated with new products. Their success rate ranges from poor to dismal—25% according to one estimate. Despite that, they are vital for fueling growth. Big box retailers are putting more pressure on suppliers to provide differentiated products. Consumers want more choices and better products. All these factors contribute to the greater than ever number of products and product lines, making management of their demand more complex, increasing the working capital needed to maintain safety stock, raising the liability of slow-moving and obsolete inventory, and increasing the cost of production because of smaller lots and frequent changeovers. Product portfolio optimization deals with these matters.

Product portfolio optimization includes the following: one, how to rationalize products and product lines and, two, how to manage their demand most effectively. Product rationalization includes deciding which products and product lines to keep and which ones to kill, based on the company’s policy. Demand management, on the other hand, is leveraging what Larry Lapide from the University of Massachusetts, an MIT Research affiliate, calls the 4Ps (Product, Promotion, Price, and Place) to maximize sales and profit. The sales of low-performing product lines may be bumped up with a price discount, promotion, line extensions, or by finding new markets.

Although the S&OP process has a component of product portfolio optimization, its team members pay nothing more than lip service to it. Pat Bower from Combe Incorporated discusses in detail the process of product portfolio optimization in the framework of new products: how new products should be filtered from ideation to development and, after launch, how they should be leveraged. Their window of opportunity is very small; most CPG products flame out within the first year of their existence, says Pat.

Mark Covas from Coca-Cola describes in detail 10 rules for product portfolio optimization. He suggests companies should divest low margin brands, no matter how big they are. Many companies such as ConAgra Foods, General Mills, Procter & Gamble, and Estée Lauder are doing it. This makes the allocation of marketing dollars more productive—taking funds away from low performing brands and giving to high performing ones.

Charles Chase from SAS and Michael Moore from DuPont recommend the Pareto principle of 80/20 to determine which products or product lines to concentrate on in portfolio optimization efforts. Greg Schlegel from SherTrack LLC goes even further and proposes that this principle should be extended even to customers. He categorizes customers into four groups: 1) Champions, 2) Demanders, 3) Acquaintances, and 4) Losers. He then describes a strategy for dealing with each one of them. Greg Gorbos from BASF points out the hurdles, political and otherwise, that stand in the way of implementing an optimization policy, and how to deal with them. Clashes occur among different functions because of differences in their objectives: Sales looks to achieve revenue targets, while Marketing looks to hold market share and increase profit. Finance also looks at profit, but seeks to reduce cost and increase capital flow, while Supply Chain looks at cost savings. Communication is another issue Greg points out: the company may decide to deactivate a product, but information about it is not communicated to all the functions. Jeff Marthins from Tastykake talks, among other things, about the exit strategy, which he believes is equally important. He says that we cannot deactivate a product without knowing its inventory position, as well as its holdings of raw and packaging materials.

For survival and growth in today’s atmosphere, it is essential to streamline the product portfolio to reduce costs, and increase revenue, profit, and market share. This issue shows how.

I encourage you to email your feedback on this issue, as well as on ideas and suggested topics for future JBF special issues and articles.

Happy Forecasting!

Chaman L. Jain
Chief Editor, Journal of Business Forecasting (JBF)
Professor, St. John’s University
EMAIL:  jainc [at] stjohns.edu

DOWNLOAD a preview of this latest Journal of Business Forecasting (JBF) Issue

Click HERE to join IBF and receive a JBF Complimentary Subscription

 

Risk-Adjusted Supply Chains Help Companies Prepare for the Inevitable
https://demand-planning.com/2016/02/19/risk-adjusted-supply-chains-help-companies-prepare-for-the-inevitable/
Fri, 19 Feb 2016

Each time I get in my car and drive to work, or the grocery store, or wherever, there is a myriad of dangers that I might encounter. I could get t-boned at an intersection by a distracted driver; I might blow a tire and swerve into a ditch; or a piece of space debris could crash through my windshield. Some perils are, obviously, less likely than others, but the reality is, anything can happen.

While I don’t obsessively worry about every possible risk, I am aware of the possibilities and I take measures to lower both the odds and severity of a mishap. I keep my vehicle well maintained, I buckle up and I pay my auto insurance. Similarly, today’s supply chain professionals must be more conscientious and proactive in their efforts to mitigate the risk of a supply chain disruption and to minimize the impact when the inevitable does occur.

As much as we may feel at the mercy of disruptions from severe weather, natural disasters, economic instability or political and social unrest, members of today’s high tech supply chain have never been better equipped to minimize the risks and capitalize on the opportunities that may arise from a supply chain disturbance.

One of the most simple, but powerful, tools at our disposal is information. Twenty-four hour news stations, social media and cellular communications give us literally instant access to events occurring in the most remote reaches of the world.

More tactically, mapping the physical network of the supply base, including manufacturing facilities, warehouses and distribution hubs, is an important part of any risk management strategy. The key here is mapping the entire supply chain network, not just top-spend suppliers or first-tier contract manufacturers. Most of this information is relatively accessible through supplier audits and, with the help of Google maps, you can create a pretty comprehensive picture of your physical supply chain.


Remember, though, supply chains are much more fluid than they have ever been. Today’s multinationals are likely to rely on three to five different contract manufacturers (CMs) and original design manufacturers (ODMs), and scores of other suppliers around the world for the tens of thousands of parts needed to build and maintain their products. With outsourced production so commonplace, production lines can be shifted between locations within a matter of weeks, so frequent monitoring and updating of supply chain shifts is critical.

IoT technology such as sensors and RFID tracking can also provide meaningful intelligence that may be used to identify and mitigate risk throughout the end-to-end supply chain process. The ability to gather and analyze these constant data inputs is a recognized challenge throughout the supply chain profession. Those who master the digital supply chain sooner will enjoy a substantial competitive advantage.

Once these various vehicles are used to create a composite picture of the risk landscape, then risk mitigation strategies take center stage. These efforts can range from traditional techniques such as the assignment of a cache of safety stock to more intricate maneuvering of storage facilities and full network design. Deployment of these mitigation strategies requires a detailed recovery and communications plan.

In my upcoming presentation at IBF’s Supply Chain Forecasting & Planning Conference at the DoubleTree Resort by Hilton in Scottsdale, AZ, February 22-23, 2016, I will delve deeper into the growing range of potential disruptors in the high tech supply chain. I will outline the core elements of a comprehensive supply chain risk management strategy, including how to define and map the physical supply chain, the landscape around supply chain risks and their impact on financial metrics, and how to proactively assess potential risk. I hope to see you there.

Forecasting & Planning Learnings from Day 2 of IBF Academy: An Attendee’s Perspective
https://demand-planning.com/2015/09/16/forecasting-planning-learnings-from-day-2-of-ibf-academy-an-attendees-perspective/
Wed, 16 Sep 2015

Last month, I had the opportunity to attend IBF’s Business Forecasting & Planning Academy held in Las Vegas. I recently shared some insights from the first day of the program. Day 2 was similarly eventful. Here are some highlights.

Forecast Error

The first session I attended on Tuesday was “How to Measure & Reduce Error, and the Cost of Being Wrong”, an advanced session presented by Dr. Chaman Jain from St. John’s University. Dr. Jain reviewed the basic methods and mechanics of how to compute forecast error and the pros and cons of each technique. It was interesting that IBF has found that more and more companies are moving from MAPE (Mean Absolute Percentage Error) to a Weighted MAPE (WMAPE) to focus their attention on the errors that have a relatively large impact rather than those with little to no impact at all. Standard MAPE treats all errors “equally”, while WMAPE places greater significance on errors associated with the “larger” items. The weighting mechanism can vary – typically unit sales are used – but I was intrigued by the notion of using sales revenue and profit margin as well. If a company has low volume items that are big revenue and profit items, it would not want to miss an opportunity to focus attention on why it has significant errors on those items.
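The difference is easy to see in a two-item sketch (the actuals and forecasts are invented): the small item’s 50% miss dominates the MAPE, while the WMAPE is dominated by the high-volume item.

```python
def mape(actuals, forecasts):
    """Mean Absolute Percentage Error: every item's miss counts equally."""
    return sum(abs(a - f) / a for a, f in zip(actuals, forecasts)) / len(actuals) * 100

def wmape(actuals, forecasts):
    """Weighted MAPE: each item's miss is weighted by its actual volume."""
    return sum(abs(a - f) for a, f in zip(actuals, forecasts)) / sum(actuals) * 100

actuals   = [1000, 100]    # one high-volume item, one low-volume item (invented)
forecasts = [ 950, 150]    # a 5% miss on the big item, a 50% miss on the small one

print(round(mape(actuals, forecasts), 1))    # both misses count equally
print(round(wmape(actuals, forecasts), 1))   # the high-volume item dominates
```

Swapping unit sales for revenue or margin in the denominator and weights gives the revenue- or profit-weighted variants mentioned above.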

Another interesting concept that Dr. Jain discussed was the use of confidence intervals around error measurements.  Many companies report their error measurement as a single number and rarely present the error measure in terms of a range of potential errors that are likely. Having a view into the potential range of errors can allow firms to exercise scenario planning to understand the impact to supply chain operations and the associated sales based upon multiple forecast errors instead of a single number.

My last takeaway is related to the question of how much history should be used to support time series analysis. Dr. Jain stated, and I believe rightly so, that it depends. Are there potential seasonality, trend, business cycles, or one-time events? How much does one need to see these? What if the past is really not a good indicator anymore of the future? What if the drivers of demand for a product have substantially shifted? One technique suggested that seems sound is to test the forecasting model’s performance using different periods of historical data. Use a portion of the history to build the model, and the remaining portion to test the accuracy of the forecast against the actuals held out of model construction. Try different lengths until you find the one that has the lowest error and also allow the process to have different history lengths for each time series forecast.
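The suggested technique might be sketched like this: hold out the last few periods, roll a simple moving-average forecast across them, and pick the history length (window) with the lowest error. The demand series and candidate windows below are assumed for illustration.

```python
def holdout_mae(series, window, holdout=3):
    """Mean absolute error of a rolling moving-average forecast over the
    held-out tail of the series."""
    train, test = list(series[:-holdout]), series[-holdout:]
    errors = []
    for actual in test:
        forecast = sum(train[-window:]) / window   # moving-average forecast
        errors.append(abs(actual - forecast))
        train.append(actual)                       # roll forward one period
    return sum(errors) / len(errors)

demand = [100, 102, 98, 104, 103, 107, 105, 110, 108, 112]  # invented series
best_window = min([2, 3, 5], key=lambda w: holdout_mae(demand, w))
print(best_window)
```

The same loop works with any forecasting method in place of the moving average, and per Dr. Jain’s point, each time series can keep its own winning history length.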

Lean Forecasting & Planning

Next I attended another advanced session led by Jeff Marthins from Tasty Baking Company/Flowers Foods on “Lean Forecasting & Planning: Preparing Forecasts Faster with Less Resources”. The session focused on doing more with less, a common theme that has permeated the business world these last several years. Marthins’ session was really about how to focus on what matters in demand planning: looking at the overall process, agreeing to and sticking with the various roles and responsibilities in the process, and understanding how the resulting forecasts and plans are to be used by various consumers in the business which drives the level of detail, accuracy and frequency of updates.

To gain an understanding of the demand planning process, Marthins asked the participants to look at a picture of his refrigerator and answer “Do I have enough milk?” This relatively simple, fun question elicited numerous inquiries from the participants around consumption patterns, replenishment policies and practices, sourcing rules, supplier capacity and financial constraints that illustrated the various types and sources of information that are required to develop a solid, well-thought-out demand plan. It was a very effective approach that can be applied to any product in any company.

To illustrate the need to understand the level of accuracy required of a forecast, Marthins used the weather forecast. How accurate is the weather forecast? How often is it right? How precise does it need to be? Once we know the temperature is going to be above 90 degrees Fahrenheit, does it matter if it is 91 or 94 degrees? Is there a big difference between a 70% chance of rain and an 85% chance of rain? What will you do differently in these situations with a more precise weather forecast? Should I plan to grill tonight? Will I need to wear a sweater this evening? Can we go swimming? If the answer is nothing, then the precision does not really matter, and spending time and effort creating or searching for greater forecast accuracy is a “waste” – and wastes should be eliminated or reduced in Lean thinking.

Marthins also stressed the value of designing your demand planning process with the usage of information in mind. Adopting a Forecast Value Add (FVA) mentality to assess whether each step in your forecasting and demand planning process is adding value will help to accomplish this. Start by asking if the first step in your forecasting process results in greater accuracy than a naïve forecast, such as using the same number as the last time you forecasted, or a simple moving average. When your accuracy improves with each step in the process, is it worth the effort or time it takes? Can I be less accurate and more responsive and still not have a negative impact? If I can update my forecast every day with 90% accuracy versus once a week with 92% accuracy, or once a month with 96%, which is better? How responsive can I be to the market by making daily adjustments that are nearly as accurate as weekly ones?
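The FVA idea reduces to a subtraction: each step’s accuracy versus the naïve baseline. The MAPE figures below are invented to mirror the discussion.

```python
# Forecast Value Added (FVA) sketch: all MAPE figures are invented examples.
naive_mape = 30.0
step_mape = {"statistical model": 24.0, "planner override": 22.5, "consensus": 23.0}

# Positive FVA means the step beats the naïve forecast; comparing steps shows
# the consensus meeting here gives back part of the override's improvement.
fva = {step: naive_mape - m for step, m in step_mape.items()}
print(fva)
```

A step whose FVA is zero or negative is a candidate for removal – exactly the “waste” Lean thinking targets.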

In yet another session, the topic of scenario analysis was raised. The team at IBF is getting this one right, making sure it is discussed in multiple sessions. What I wonder is how many companies are adopting scenario analysis in their demand planning and S&OP processes? In my experience it is not the norm. Marthins suggested testing the impact that various forecasts, and hence forecast accuracies, would have on supply chain performance, and even using scenario analysis to understand whether a systematic bias, either high or low, might make sense. I have known companies that have employed a policy of deliberate overestimation to ensure their resulting demand plan was on the high side. Carrying more inventory, even with all the associated costs, was of greater benefit to the company than a lost sale or backorder. Bias is not a bad thing if you understand how it is used and its resulting impact, just as inventory is not evil when used in a planned and methodical manner.
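
The deliberate-bias argument can be made concrete with a toy expected-cost calculation. The costs and probabilities below are purely hypothetical; the point is only that when a lost sale costs far more than a unit of excess inventory, planning high can lower expected cost:

```python
# Hypothetical asymmetric costs: under-planning (stockout) hurts much more
# than over-planning (holding). All figures are illustrative.
HOLDING_COST = 2.0    # cost per unit over-planned
STOCKOUT_COST = 15.0  # margin plus goodwill lost per unit under-planned

def expected_cost(bias_units, demand_outcomes):
    """demand_outcomes: (demand delta vs. the unbiased plan, probability) pairs."""
    cost = 0.0
    for delta, prob in demand_outcomes:
        gap = bias_units - delta  # positive => excess stock, negative => shortfall
        cost += prob * (gap * HOLDING_COST if gap > 0 else -gap * STOCKOUT_COST)
    return cost

outcomes = [(-50, 0.25), (0, 0.50), (50, 0.25)]
unbiased = expected_cost(0, outcomes)      # 212.5
biased_high = expected_cost(50, outcomes)  # 100.0
```

With these illustrative numbers, planning 50 units high cuts the expected cost from 212.5 to 100.0, which is the sense in which a deliberate bias can be rational.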

Data Cleansing

After lunch I attended my second session, delivered by Mark Lawless from IBF: "Data Cleansing: How to Select, Clean, and Manage Data for Greater Forecasting Performance". As in any analytical process, the quality of the inputs is crucial to delivering quality results. Unfortunately, I had another commitment during the session and could not stay for all of it.

Lawless discussed a variety of ways to look at the data available, decide if it should be used, update or modify it, fill in missing values, and apply various forecasting techniques. Simple tips, such as being aware of how data is bucketed into time periods, e.g., fiscal months (4/4/5) versus calendar months, and how results should be reported, were good reminders to make sure the data inputs are clearly understood, as well as how the output from the forecasting process will be used.

While most of what I heard was related to the data going into the forecasting process, Lawless did spend time talking about various analytics for assessing the output of the process. You might be expecting me to talk about various error and bias metrics again, but that is not the case. Rather, the idea is to look at the error measurement over time. What is the distribution of errors? Do they follow a pattern, or are they random? If there is a pattern, there is likely something "wrong" with the forecasting process. It made me think about the application of Statistical Process Control (SPC) techniques, which are most often applied to manufacturing processes but can be applied to any process. SPC control charts can check for patterns such as trends, systematic sustained increases, extended periods of unexpectedly high or low errors, randomness of errors, and many more. It gets back to the notion that in order to improve the quality of the demand planning process, it must be evaluated on a regular basis and the causes of its underperformance understood and corrected as far as is possible or warranted.
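
The SPC idea can be sketched simply: estimate control limits from a stretch of errors believed to be in control, then flag later errors that fall outside them. The data and cutoffs below are illustrative:

```python
import statistics

# Hypothetical monthly forecast errors (actual minus forecast).
baseline = [3, -2, 1, -4, 2, 0, -1, 3]  # history assumed to be in control
new_errors = [-2, 15]                   # recent months to check

center = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl = center + 3 * sigma  # upper control limit (classic 3-sigma)
lcl = center - 3 * sigma  # lower control limit

out_of_control = [e for e in new_errors if not (lcl <= e <= ucl)]
# Here 15 breaches the upper limit, signaling something "wrong" in the process.
```

The same scheme extends to other SPC signals, such as a long run of same-sign errors indicating bias.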

Regression Analysis/ Causal Modeling

The final advanced session of the Academy was delivered by Charles Chase from the SAS Institute on “Analytics for Predicting Sales on Promotional Activities, Events, Demand Signals, and More”.  This session was about regression modeling on steroids.  As someone who has used regression models throughout my career I could easily relate to and appreciate what Chase was discussing.  In two hours Chase did a great job exposing attendees to the concepts, proper use, and mechanics of multivariate regression modeling that would typically be taught as an entire course over weeks.

While time series models are a staple for forecasting future demand, they provide little to no understanding of what can be done to influence the demand to be higher or lower. They can be used to decompose the demand into components such as trend, seasonality, and cycles, which are important to understand and respond to. They are focused on the "accuracy" of the predicted future. Regression models, however, describe how inputs affect outputs. They are an excellent tool for shaping demand. Regression models can help us understand the effect that internal factors such as price, promotional activity, and lead times, as well as external factors such as weather, currency fluctuations, and inflation rates, have on demand. The more we can create predictive models of demand based on internal factors, the more we can influence the resulting demand, as these are factors we control or influence as a firm. If external factors are included, forecasts for the future values of these inputs will be needed, and we become more reliant on the accuracy of those input forecasts to drive our modeled demand.
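
A minimal illustration of the causal idea, with hypothetical weekly data: fit an ordinary least squares line of units sold on promotional discount, then use the fitted slope as a demand-shaping lever. A real promotional model would include many more drivers (price, media, seasonality) and interaction terms:

```python
# Hypothetical weekly observations: promotional discount (%) vs. units sold.
discount = [0, 5, 10, 0, 15, 5, 20, 10]
units = [100, 118, 135, 97, 160, 121, 181, 138]

n = len(discount)
mean_x = sum(discount) / n
mean_y = sum(units) / n

# Ordinary least squares fit for a single driver.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(discount, units))
         / sum((x - mean_x) ** 2 for x in discount))
intercept = mean_y - slope * mean_x

def predicted_units(disc_pct):
    return intercept + slope * disc_pct

# The slope is the lever: roughly 4 extra units per point of discount.
lift_at_15 = predicted_units(15) - predicted_units(0)
```

Here the fitted slope, about four extra units per point of discount on a base of roughly 98 units, is what a demand shaper would act on; in practice you would also validate the fit on held-out data.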

In case you missed it, you can see pictures from the 2015 IBF Academy HERE.

I trust I have brought some insight into IBF's recent Academy in Las Vegas and perhaps offered a nugget or two for you to improve your forecasting and demand planning activities. If only I had learned something I could apply for forecasting success at the gaming tables :).

Are You Effectively Leveraging Point-of-Sale (POS) Data In Your Forecasting & Inventory Management? https://demand-planning.com/2015/09/09/are-you-effectively-leveraging-point-of-sale-pos-data-in-your-forecasting-inventory-management/ Wed, 09 Sep 2015

Today, we have an explosion of data. It is estimated that 2.5 quintillion bytes of data are created every day, with 90% of the world's data created in the past two years!

The key question becomes what do we do with all this data? In the past, companies have always struggled with managing and analyzing large sets of data and could seldom generate any insights.

However, what's different today vis-à-vis five years ago is that we now have the ability to cleanse, transform, and analyze this data to generate actionable insights. Moreover, today's retail consumers are extremely demanding and want choices on "When", "Where", and "How" to purchase product. Whether it is a traditional stand-alone retail store, shop-in-shop, website, or mobile app, consumers want the flexibility to research, purchase, and return product across multiple channels.

Today, many retailers and wholesalers have a vast amount of POS data available. However, many of them still don't use the data at the lowest level of detail in their demand planning cycle. The result is significant out-of-stocks and the inability of consumers to find product at the stores.

For a company to be successful in today’s Omni-channel environment, three key steps are needed:

1) Use Point-of-Sale (POS) data as a key input into demand plans: POS is the data that is closest to the consumer and is the purest form of demand, so it is critical to leverage this data at the right level of detail in a product's demand plans. Information available at the stock-keeping-unit (SKU) level should be aggregated and disaggregated to ensure that all attributes of a product are factored into the planned forecast.

2) Link Point-of-Sale (POS) data to your Allocation & Inventory Management Systems: Today's allocation systems have the ability to read sell-through at POS and react and replenish based on what product is selling and what is not. It is critical to make sure that these systems are linked together so that the process is automated and seamless. Linking these systems will allow retailers to send the right product to the right store at the right time, thereby maximizing the chances of making a sale. This will not only contribute to the top line, but will also make our inventory investments more productive.

3) Collaboration with Value Chain Partners to share Point-of-Sale (POS) data: Today's retail world is complex; many companies have multi-channel operations and work with a number of channel partners to distribute their products. In such a scenario, it is not always easy to gain access to POS data. However, it is important for companies to invest in a CPFR program (Collaborative Planning, Forecasting and Replenishment) that can give them access to downstream POS data, which can be used to build better forecasts. It is critical to emphasize a "Win-Win" relationship for both companies and channel partners to bring everyone along on the collaboration journey.
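
A toy sketch of steps 1 and 2 above, with entirely hypothetical numbers: disaggregate a category forecast to SKUs using the POS sales mix, then drive a simple order-up-to replenishment from POS sell-through. Real allocation systems layer on pack sizes, presentation minimums, and service-level targets:

```python
# Step 1: disaggregate a category forecast to SKU level using the POS sales mix.
pos_units = {"SKU-1": 600, "SKU-2": 300, "SKU-3": 100}  # hypothetical sell-through
category_forecast = 1200
total = sum(pos_units.values())
sku_forecast = {sku: category_forecast * u / total for sku, u in pos_units.items()}
# SKU-1 gets 720, SKU-2 gets 360, SKU-3 gets 120.

# Step 2: order-up-to replenishment driven by the POS daily sell-through rate.
def replenish(on_hand, daily_rate, lead_time_days, safety_days):
    """Order enough to cover expected demand over lead time plus a safety period."""
    order_up_to = daily_rate * (lead_time_days + safety_days)
    return max(0, round(order_up_to - on_hand))

qty = replenish(on_hand=20, daily_rate=4, lead_time_days=5, safety_days=3)  # 12 units
```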

Along with Rene Saroukhanoff, CPF, Senior Director at Levi Strauss & Co., we'll be talking about the above, as well as how to use size forecasting, optimized allocation, and visual analytics, at IBF's Business Planning & Forecasting: Best Practices Conference in Orlando USA, October 18-21, 2015. I look forward to hopefully meeting you at the conference! Your comments and questions are welcomed.

New Learnings from Day 1 of IBF's Business Forecasting & Planning Academy: An Attendee's Perspective https://demand-planning.com/2015/08/25/new-learnings-from-day-1-of-ibfs-business-forecasting-planning-academy-an-attendees-perspective/ Tue, 25 Aug 2015

Last week I had the opportunity to attend IBF's Business Forecasting & Planning Academy held in Las Vegas. The two days were filled with fourteen educational sessions, three roundtable discussions, and multiple opportunities for connecting with peers and instructors.

Each educational session, organized as introductory or advanced level, was two hours in length, allowing for a deeper dive into content with plenty of opportunity for participant interaction. The instructors were academics, industry practitioners, and software providers, giving attendees a nice blend of viewpoints and experiences.

The first session I attended on Monday was conducted by Dr. Larry Lapide from MIT on Designing and Implementing a Successful Collaborative Demand Forecasting Process. The introductory level session was hands on and highly interactive. Participants were placed into four teams and asked to focus on a case study with questions around organizational design of the demand planning function, reporting needs of the Sales & Marketing, Operations and Finance organization, and various forecasting methods to employ. Dr. Lapide “challenged” the various answers provided by the teams in a manner that allowed for deeper understanding and awareness.

One of my takeaways from the session, and one I heard in several others, is the ongoing challenge companies have in not letting the financial budget replace the unbiased, unconstrained statement of demand, or for that matter the demand plan. Too often firms are not paying attention to the demand signals in the market, and instead turn the projection of future demand (the forecast) into a demand plan that mirrors a financial budget created weeks, months, or even quarters before.

Another takeaway was the reminder to design a forecasting process that incorporates multiple methods based upon the various characteristics of the customers, markets, channels, and products. Applying segmentation approaches before selecting techniques such as time series forecasting, lifecycle forecasting, and collaboration to gain real-time knowledge and expertise will allow for a more robust and effective process tailored to the needs of each segment.
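
One common segmentation approach, sketched here with hypothetical volumes and illustrative cutoffs: classify SKUs by cumulative volume share (ABC) and by demand variability via the coefficient of variation (XYZ), then match technique to segment:

```python
# Hypothetical SKUs: (annual volume, coefficient of variation of demand).
skus = {
    "A100": (60000, 0.2),
    "A200": (25000, 0.6),
    "B300": (9000, 0.4),
    "C400": (4000, 1.2),
    "C500": (2000, 0.7),
}

def abc_xyz(skus, abc_cuts=(0.80, 0.95), xyz_cuts=(0.5, 1.0)):
    """ABC by cumulative volume share, XYZ by demand variability (CV)."""
    total = sum(vol for vol, _ in skus.values())
    ranked = sorted(skus.items(), key=lambda kv: kv[1][0], reverse=True)
    classes, cum = {}, 0.0
    for name, (vol, cv) in ranked:
        cum += vol / total
        abc = "A" if cum <= abc_cuts[0] else "B" if cum <= abc_cuts[1] else "C"
        xyz = "X" if cv < xyz_cuts[0] else "Y" if cv < xyz_cuts[1] else "Z"
        classes[name] = abc + xyz
    return classes

segments = abc_xyz(skus)  # e.g., "A100" -> "AX", "C400" -> "CZ"
```

High-volume, stable AX items suit automated statistical forecasting, while erratic CZ items may warrant judgment or collaboration.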

Next I attended the introductory session How to Sell Forecasts to Top Management and Understand the Power of One Number Planning given by Jeff Marthins, Director Supply Chain Operations, Tasty Baking Company/Flower Foods. This was a very pragmatic session with Marthins sharing Tastykake’s journey with S&OP starting in 2005. He spoke about the value of running the business from one set of numbers and using the budget as a benchmark rather than the demand plan or forecast. He made it clear that the forecasts need to be in terms that the various consumers of information can relate to: revenue, units, capacity, etc…

I was intrigued by one of his questions related to demand planner capabilities: if you could pick between analytical and communication skills, which would you choose? While both are needed, I believe analytical skills are the easier of the two to become good at, so I would start with solid communication skills. To develop a comprehensive plan that is adopted, a demand planner needs to be an excellent listener, taking information and insights from various sources; an engaging and thoughtful facilitator to guide consensus dialogues; and a crisp, clear, and confident speaker to communicate and defend the rationale for the demand plan being presented and ultimately agreed to by senior leaders and stakeholders.

Marthins discussed the need to spend more time understanding why the plan is different from the actual demand. Was the forecast and/or demand plan low or high because of promotional lift errors; unforeseen market changes; new product launch timing, trajectory, or cannibalization estimates for existing product; or outside influences such as weather and competitor actions, to name just a few? Root cause analysis is something that, as a supply chain planning and analysis community, we need to do more. Demand plans and forecasts will always be wrong. Hopefully, over time they will become more and more accurate. But if we are not researching the reasons why our plans and KPI targets are not being met, we should not have high expectations that they will be achieved in the future.

I had a huge smile and kept nodding my head when Marthins started praising the need for, and benefits of, scenario management and contingency planning as part of the S&OP process. While the output of an S&OP cycle is typically an agreed-to set of numbers, those numbers should not be obtained by looking at only one set of "inputs". Exploring the implications of various scenarios, with changes to both demand and supply, is needed to reach a comprehensive understanding of, and agreement on, a course of action. Scenario management is an excellent means to show decision makers the impact of their opinions about the future while keeping the discussion fact based. Contingency planning allows for a higher degree of responsiveness, with risk mitigation actions ready to be put in place.

The final session of the day I attended was presented by Mark Lawless, Senior Consultant from IBF, on Long Term Demand Planning & Forecasting: The Key to Corporate Strategic Planning. Lawless did a nice job throughout the session educating the attendees on the differences between long term (three to five years) and short term demand planning and forecasting. It was helpful to be reminded of the difference between a forecast (an unbiased prediction or estimate of an actual value at a future time) and a demand plan (a desired outcome at a future time). Time was spent discussing how firms can shape future demand, the more aggregated levels of detail to plan with, and the need to engage external subject matter experts in the planning process.

Looking three to five years into the future is not just about applying a time series technique. Companies must rely on internal and external domain experts to assist with potential changes in markets, competitors, customers, and consumers; technology and business cycle impacts; changes in demographics and the regulatory environment; and many other areas of potential impact. Thinking about where competition will come from is not always obvious. Five or more years ago, would the camera manufacturers have seen their market being challenged by smartphones? Totally unrelated to the event, but I was intrigued enough to search for more: in 2000, 86 billion photos were taken, 99% of them analog (film); in 2011, over 380 billion photos were taken, just 1% analog. If you were the long range demand planner for camera film, would you have seen this coming? Another crazy statistic that shows history alone is not always a great gauge for developing future demand plans: in 2011 we snapped as many photos in two minutes as humanity as a whole snapped in the 1800s. Would this long range trend have been detected by a time series technique?

Long range demand planning requires us to understand the drivers of our demand even more so than short term demand. Our ability to respond to short term sharp changes may be limited, while changes in long term demand can be addressed. Regression, ARIMA, or ARIMAX models are very helpful in this area. Developing models that help explain demand as a function of price, feature/function, market trends, economic factors, age, income, education, marketing, and numerous other variables allows us not only to see the impact on demand of changes in these variables, but also to determine the levers to pull to shape demand in our favor.
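
Once such a driver model is fitted, pulling the levers is a matter of evaluating scenarios. The coefficients below are assumed purely for illustration, not the output of any real fit:

```python
# Assumed coefficients from a hypothetical fitted demand model (illustrative only):
# demand = base + b_price * price + b_marketing * marketing_spend
COEF = {"base": 500.0, "price": -20.0, "marketing": 0.8}

def predicted_demand(price, marketing_spend):
    return COEF["base"] + COEF["price"] * price + COEF["marketing"] * marketing_spend

scenarios = {
    "status quo": (10.0, 100.0),
    "price cut": (8.0, 100.0),
    "ad push": (10.0, 250.0),
}
forecasts = {name: predicted_demand(p, m) for name, (p, m) in scenarios.items()}
# status quo -> 380, price cut -> 420, ad push -> 500 (units)
```

Comparing the scenario outputs is what turns the model from a forecasting tool into a demand-shaping tool.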

See my next post on the highlights from day two of the Academy. Your thoughts and feedback are always welcomed!  You can also see pictures from the IBF Academy HERE.

How Many SKUs Can A Forecaster Manage? —IBF Research Report 14 https://demand-planning.com/2015/08/03/how-many-skus-can-a-forecaster-manage-ibf-research-report-14/ Mon, 03 Aug 2015

It is difficult to arrive at one fixed number of SKUs that a forecaster can manage, because situations vary from industry to industry and company to company. There are several factors at play. It depends on how easy or difficult it is to forecast, what the lead time is, the cost of forecast error, whether forecasts are prepared at an aggregate or granular level, the type of data used, whether ABC classification is used to allocate forecasting time, whether customers' input is used in reconciling forecasts, and the sophistication of the technology used to generate forecasts.

This Institute of Business Forecasting & Planning (IBF) Research Report provides guidance on how many demand planners we really need, as well as how many SKUs each should manage.

The Table of Contents includes:

1. Introduction
2. How Easy or Difficult to Forecast
3. Cost of Forecast Error
4. Level of Aggregation Required
5. Type of Data Used
6. Segmentation / ABC Classification
7. State of Technology
8. Survey Results
9. Conclusion
10. Table 1 | Number of SKUs Per Forecaster By Size of Company
11. Table 2 | Number of SKUs Per Forecaster By Total Number of SKUs at the Company

Preview this IBF Research Report 14, HERE.

Passion & Spice Missing From Your S&OP Relationship? https://demand-planning.com/2015/06/25/passion-spice-missing-from-your-sop-relationship/ Thu, 25 Jun 2015

Patrick Bower

During a recent IBF conference in Chicago I was asked numerous times about how to make Sales & Operations Planning (S&OP) less boring. It is not the first time; this is a very common question. I suspect it is becoming more commonplace as S&OP processes mature past "storming" and into "norming". When processes hit this inflection point, it helps to draw on communication theory while asking hard questions about how the process is managed. Here is an example of the kind of letter I receive:

Subject: My S&OP meeting is boring?!?

Dear S&OP Doctor,

My S&OP executive review meetings are boring. Is this normal? I was hoping they would be more interesting, but they are not. Even I am bored and my executives seem to be bailing out of my meetings. What can I do to save my process?

Signed,
Unbalanced In Poughkeepsie.

—————————————————————–

Dear Unbalanced In Poughkeepsie,

You have a very common problem. I am asked this question all the time. Let’s be honest: boring happens. You can choose to accept this and act on it, or you can ignore it. If you ignore it, you run the risk of losing executive support for your S&OP process.

Of course, some would say that boring S&OP meetings aren’t all that bad; that boring is probably an indicator of process stability. If you are bored in a marriage or other long-term relationship, it might not be great but it is probably not bad. A good marriage counselor might suggest putting a little spice into your relationship. Maybe you schedule a date night or introduce a romantic wish bowl into the equation to reconnect and reengage in slightly different ways. Unfortunately, there is no easy advice for spicing up bad S&OP meetings.

It may be helpful to break up the routine every now and then to add some interest to the process. But this requires looking at how you orchestrate your process to determine if there are ways to make the meetings more relevant and interesting to your executives:

  • Are you gap-closing prior to the executive meeting? Some process leaders try to resolve all plan gaps prior to the meeting, leaving nothing for the executive team to decide. Ideally the executive S&OP meeting is the forum where senior-level meeting participants leverage their insights and expertise to determine the best ways to resolve gaps to the plan. If there are no gaps in the business then count your blessings (lucky you). However, if you are closing all gaps prior to the meeting, you are essentially discouraging the executive-level dialogue that can be such a valuable part of the process. Like a good marriage, S&OP should be a participative process.
  • Are you inadvertently stifling conversation during the executive meeting? There is often a tendency to overly discuss or vet content prior to the executive S&OP meeting, which often leads to boring S&OP meetings. Think of it this way: if you trade text messages and e-mails with your significant other and tweet or post every action and thought you may have all day long, you effectively remove all mystery from the relationship, leaving little to discuss in person. If you preview and debate all content prior to an S&OP meeting, in smaller settings, you leave nothing for the group as a whole to chew on. Save some conversation for the meeting.
  • Is the meeting managed in a way that is too routine, so there are no items of interest? While keeping a consistent agenda may be a safe way to run a process meeting, it will surely lead to boredom. It is important to mix up the content a bit—to change formats or occasionally add some tangential information that may not be required of the process—to effectively change up the routine. For example, if you always take your partner bowling on Wednesdays and have dinner at Luigi's on Friday, eventually they will likely find you boring! Mixing up the routine is as good for S&OP as it is for your relationship. For instance, I often ask sales and marketing personnel to present additional information. We commonly review and discuss details about new products, new package designs, new media campaigns, or the analysis of new promotions. Such factoids not only spark interest and awareness by mixing up the standard meeting content, they also yield insights that add meaningful (sometimes actionable) context to the overall plan.
  • Imagine going to the same dinner party with the same people every month. After a few years, you will have heard Joe’s struck-by-lightning story a hundred times and know about every bolt and screw in Mary’s 1970 302 Boss Mustang. Want to mix things up? Try bringing a new guest each month. Consider inviting a salesperson from a special class of trade or account to talk about what is selling well and what is not. Or bring in a plant manager or a distribution manager or a logistics manager to talk about their challenges. There is always a standard S&OP invite list, but adding other people to the mix can promote interest as well as drive greater awareness of the S&OP process throughout your organization. I have invited folks from general counsel, treasury, consumer affairs, and HR, as well as media and creative to S&OP meetings. They are universally grateful for a chance to see inside the process, and they often add great insight.
  • Have you ever asked your partner “How are we doing?” or “Are you happy?” Sure, these can be risky questions to ask, but openness and honesty are keys to effective communication and proactively developing strong relationships. Similarly, astute S&OP process leaders should not hesitate to discuss questions related to risk. I often ask presenters during S&OP meetings to discuss any risks in their business. There could be risk in the plan, or in net income, or there could be supply side risk such as capacity issues or obsolete inventory. Risk-related discussions stimulate deeper thought and engagement.
  • When I was a kid I used to hate when my Aunt Ann would make us watch slides from her trips to Mexico or Japan. I always seemed to fall asleep somewhere in the middle of the show. For this same reason I do not always display S&OP content using an overhead projector. I consciously mix it up. If I see highly graphical information being presented I will put it on the screen, otherwise I have found that working off of paper (or tablets, or laptops) is much better in terms of stimulating a dialogue than showing my S&OP content on a screen.
  • Are you at the front of the room giving a standard presentation? If so, you may want to consider moving to the center of the room. Positioning yourself within the midst of meeting participants engages more people in the process. I've learned to do this all the time. I feel closer to the people I am talking with, and it allows me to pick up on important visual cues and microexpressions. Imagine observing your significant other ever so subtly shaking his or her head no, even when they are agreeing with you. If you are adept at this skill, you may want to ask how they really feel (and of course you will look brilliant for caring so much). When I observe people expressing doubt, either via their facial expressions or their body language, I will stop and ask their opinion – in the hope of defusing that negative energy from the room.
  • Sometimes, you need to have a deep conversation. When something appears to be impacting the business, I will often ask our market research group to present a deep-dive review—on the competition, or market/category trends, or external influences on the business (economic trends, weather, etc.). I normally focus on a business line that is struggling and ask the market researcher to examine in detail what is happening. Most S&OP presentations lack true depth; knowing this, it is important to force in-depth conversation from time to time based on pressing need or urgent circumstances.
  • In relationships—in life and in S&OP meetings—you need to have some fun. Start your meeting with a corny joke (suitable for the workplace) or some self-effacing humor. Many studies have found that adding humor at the start of a meeting will create much more fluid and honest communication.

S&OP is chock-full of facts and numbers and can be decidedly uninteresting. Sparking interest, excitement, and a sense of engagement in the S&OP process is an unwritten part of the S&OP leader’s job description.

So, Unbalanced in Poughkeepsie, please try to introduce some fun into your meeting. Mix things up, try a little variety, think about how to add more passion & spice, and don’t be afraid to point out sensitive topics when it is important to discuss them.  Good luck!

Patrick Bower
Senior Director, Corporate Planning & Customer Service
Combe Incorporated

Patrick Bower has a wide area of expertise, including S&OP, Demand Planning, Inventory, Network Optimization, and Production Scheduling. A recognized expert on demand planning and S&OP, and a self-professed "S&OP geek", Patrick was previously Practice Manager of Supply Chain Planning at the consulting firm Plan4Demand, where his client list included Diageo, Bayer, GlaxoSmithKline, Pfizer, Foster Farms, Cabot Industries, and American Girl. Patrick's experience encompasses tenures with Cadbury, Kraft Foods, Unisys, and Snapple. Patrick also worked for the supply chain software company Numetrix, and was Vice President of R&D at Atrion International. He was the recipient of IBF's 2012 award for Excellence in Business Forecasting & Planning and will be speaking at IBF's Leadership Business Planning & Forecasting Forum taking place on October 19, 2015 in Orlando USA. He is a regular contributor to IBF's Journal of Business Forecasting and other well-known publications on various topics around S&OP.

Organizational Agility: S&OP and Financial Integration Creates Integrated Business Planning (IBP) https://demand-planning.com/2015/06/16/organizational-agility-sop-and-financial-integration-creates-integrated-business-planning-ibp/ Tue, 16 Jun 2015

Mike Pape

Does your S&OP process require you to integrate plants across a global organization? Do you have multiple forecasts for different functions and possibly different regions? Increasingly, companies are challenged to lead the S&OP process across a global, multi-national company.

As everyone is aware, the S&OP process starts with an accurate forecast. While there is no magic button to increase accuracy, there are benefits to having one forecast that is used by Sales, Operations, and Finance. Measuring the accuracy of the forecast is critical to driving ownership, which leads to an improved forecast.

Once you align on a forecast, you need to drive it across the global organization. If you have an integrated planning system, you can quickly identify how the new demand impacts production and purchases across the entire supply chain. As many companies experienced in 2009, the ability to react quickly to changing demand is important for survival.

For a public company, the ability to understand the financial impact of the business plan is just as critical as defining the operations plan. Using a single source of data across Sales, Operations, and Finance provides one source of truth and enables functions to align on the best direction for the company.

I'll be speaking more about this journey at the APICS & IBF Best of the S&OP Conference taking place in Chicago, June 18-19, 2015. S&OP is an ongoing journey, and we will share what we have been able to accomplish as we proceed along our path to improve our business planning. In this discussion we will review a global S&OP process that leverages a common set of data, as well as the technology used to develop an integrated plan.

Mike Pape
Director, Global Planning & Logistics
Cabot Microelectronics
