Transitioning From Time Series To Predictive Analytics

I recently had a fascinating and enlightening conversation with one of the leading figures in predictive analytics and business forecasting, Dr. Barry Keating, Professor of Business Economics & Predictive Analytics at the University of Notre Dame.

He is really driving the field forward with his research into advanced analytics and applying that cutting-edge insight to solve real-world forecasting challenges for companies. So I took the opportunity to get his thoughts on how predictive analytics differs from what we’ve been doing so far with time series modeling, and what advanced analytics means for our field. Here are his responses.

What’s the Difference Between Time Series Analysis & Predictive Analytics?

In time series forecasting, the forecaster aims to find patterns like trend, seasonality, and cyclicality, and makes a decision to use an algorithm to look for these specific patterns. If the patterns are in the data, the model will find them and project them into the future.

But at some point we as a discipline realized that there were a lot of things outside our own four walls that affected what we were forecasting, and we asked ourselves what would happen if we could include those factors in our models. Now we can go beyond time series analysis by using predictive analytics models like simple regression and multiple regression, using a lot more data.

The difference compared to time series is that time series looks only for those specific patterns, whereas predictive analytics lets the data reveal what the patterns are. The result is much improved forecast accuracy.
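To make the contrast concrete, here is a minimal sketch in Python of the two views side by side, assuming a hypothetical monthly sales file with a couple of made-up external drivers (a promotion flag and price columns); the column names and the use of a plain linear regression are illustrative, not a prescribed method.

```python
# Illustrative only: a time-series-style feature set (trend + seasonality)
# versus the same model with hypothetical outside drivers added.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("sales.csv", parse_dates=["month"])  # hypothetical file

# Time series view: only patterns inside the history itself
df["trend"] = range(len(df))
df["month_of_year"] = df["month"].dt.month
inside = pd.get_dummies(df[["trend", "month_of_year"]], columns=["month_of_year"])

# Predictive analytics view: bring in data from outside the four walls
outside = df[["promo_flag", "price_index", "competitor_price"]]  # assumed columns
features = pd.concat([inside, outside], axis=1)

model = LinearRegression().fit(features, df["sales"])
print(dict(zip(features.columns, model.coef_.round(2))))
```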


Does Time Series Forecasting Have a Place in the Age of Advanced Analytics?

Time series algorithms will always be useful because they’re quick and easy to use. Time series is not going away – people will still be using Holt-Winters, Box-Jenkins, time series decomposition, and the like long into the future.
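For example, a classic Holt-Winters run takes only a few lines with an off-the-shelf library; the monthly series below is invented purely to show how quick the method is to apply.

```python
# Holt-Winters on an invented monthly series - quick to fit, quick to forecast.
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

sales = pd.Series(
    [112, 118, 132, 129, 121, 135, 148, 148, 136, 119, 104, 118] * 4,
    index=pd.date_range("2018-01-01", periods=48, freq="MS"),
)

fit = ExponentialSmoothing(
    sales, trend="add", seasonal="mul", seasonal_periods=12
).fit()
print(fit.forecast(12))  # the next 12 months
```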

What’s the Role of Data in all This?

The problem now isn’t using the models but collecting the data that lies outside our organization. Today’s data is different, starting with its volume. We used to think that 200 or 300 observations in a regression was a lot of data – now we might use 2 or 3 million observations.

“We used to think 200 observations was a lot of data – now we might use 2 or 3 million”

Today’s data is different not only in its size, but also in its variety. We don’t just have numbers in a spreadsheet – it may be streaming data, and it may not be numbers at all but text, audio, or video. Velocity is also different; in predictive analytics we don’t want to wait for monthly or weekly information, we want information from the last day or hour.

The data is also different in terms of value: data is much more valuable today than it was in the past. I always tell my students not to throw data away – what you think isn’t valuable probably is.

Given we are Drowning in Data, how do we Identify What Data is Useful?

When the pandemic started, digital purchases had been increasing at 1% a year and constituted 18% of all purchases. Then, in the first 6 weeks of the pandemic, they increased 10%. That’s 10 years’ worth of growth in online purchasing happening in just weeks. That shift means we now need more data, and we need it much more quickly.

“You don’t need to figure out which data is important; you let the algorithm figure it out”

You don’t need to figure out which data is important; you put it all in and let the algorithm figure it out. As mentioned, if you’re doing time series analysis, you’re telling the algorithm to look for trend, cyclicality and seasonality. With predictive analytics it looks for any and all patterns.
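As a rough illustration of “putting it all in”, one common approach is to fit a tree-based model on every candidate driver and let its feature importances show which ones it actually used; the file and column names here are hypothetical.

```python
# Illustrative: fit a tree-based model on every candidate driver and let the
# algorithm rank them. File and column names are hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("demand_with_drivers.csv")  # wide table of numeric drivers
y = df.pop("units_sold")

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(df, y)

importance = pd.Series(model.feature_importances_, index=df.columns)
print(importance.sort_values(ascending=False).head(10))
```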

“Predictive analytics assumes that you have a lot of data – and I mean a lot”

It’s very difficult for us as humans to take a dataset, identify patterns, and project them forward, but that’s exactly what predictive analytics does. It assumes that you have a lot of data – and I mean a lot – and data that is different from what we were using in the past.

Do you Need Coding Skills to do This?

Come to an IBF conference or training boot camp and you will learn how to do Holt-Winters, for example. Do we teach people how to do that in R, Python, or Spark? No. You see a lot of advertising for coding for analytics. Do you need to do that to be a forecaster or data scientist? Absolutely not.

There are commercial analytics packages where somebody who is better at coding than you could ever hope to be has already done it for you. I’m talking about IBM SPSS Modeler, SAS Enterprise Miner, or Frontline Systems XLMiner. All of these packages do 99% of what you want to do in analytics.

Now, you have to learn how to use the package and you have to learn enough about the algorithms so you don’t get in trouble, but you don’t have to do coding.

“Do you need to be a coder? Absolutely not”

What about the remaining 1%? That’s where coding comes in handy. It’s great to know coding. If I write a little algorithm in Python to pre-process my data, I can hook it up to any of those packages. And those packages I mentioned can be customized; you can pop in a little bit of Python code. But do you need to be a coder? Again, absolutely not.
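As an example of the kind of small pre-processing step he describes, here is a hedged sketch that cleans a hypothetical transaction file in Python and writes out a tidy monthly file that a commercial package could then ingest; the file and column names are assumptions for illustration.

```python
# A small pre-processing step in Python before handing data to a commercial
# analytics package. File and column names are assumptions for illustration.
import pandas as pd

raw = pd.read_csv("transactions.csv", parse_dates=["order_date"])

monthly = (
    raw.dropna(subset=["sku", "quantity"])
       .assign(month=lambda d: d["order_date"].dt.to_period("M").dt.to_timestamp())
       .groupby(["sku", "month"], as_index=False)["quantity"].sum()
)

monthly.to_csv("monthly_demand_for_modeler.csv", index=False)  # feed to the package
```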

Is Knowing Python a Waste of Time Then?

Coding and analytics are two different skills. It’s true that most analytics algorithms are coded in R, Python, and Spark, but these languages are used for a range of different things [i.e., they are not explicitly designed for data science or forecasting]. Knowing those languages allows you to do those things, but being a data scientist means knowing how to use the algorithms for a specific purpose. In our case as Demand Planners, it’s about using K-Nearest Neighbor, vector models, neural networks, and the like.

All this can look ‘golly gee whiz’ to a brand-new forecaster who may assume that coding ability is required, but these techniques can actually be taught in the 6-hour workshops that we teach at the IBF.

What’s the Best way to get Started in Predictive Analytics?

The best way to start is with time series; then, when you’re comfortable, add some more data and try predictive analytics with some simple algorithms, then get more complicated. When you’re comfortable with all that, move on to ensemble models where, instead of using one algorithm, you use two, three, or five. The last research project I did at Notre Dame used 13 models at the same time. We took an ‘average’ of the results and the results were incredible.
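A toy version of that ensemble idea might look like the sketch below: three models instead of thirteen, equal-weighted, on a simulated series, just to show the mechanics of averaging forecasts.

```python
# A toy ensemble: three forecasting methods, equal-weighted average.
# The series is simulated; real projects would use more models and real data.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing

sales = pd.Series(
    np.random.default_rng(0).poisson(120, 60).astype(float),
    index=pd.date_range("2017-01-01", periods=60, freq="MS"),
)

h = 12
hw = ExponentialSmoothing(sales, trend="add", seasonal="add",
                          seasonal_periods=12).fit().forecast(h)
arima = ARIMA(sales, order=(1, 1, 1)).fit().forecast(h)
drift = pd.Series(sales.iloc[-1] + np.arange(1, h + 1) * sales.diff().mean(),
                  index=hw.index)

ensemble = pd.concat([hw, arima, drift], axis=1).mean(axis=1)  # the "average"
print(ensemble)
```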

The IBF workshops allow you to start out small with a couple of simple algorithms that can be shown visually – we always start with K-Nearest Neighbor, and for a very good reason: I can draw a picture of it and show you how it works without putting any numbers on the screen. There aren’t even any words on the screen. Then you realize, “Oh, that’s how this works.”
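In code, the same “find the most similar past situations and average them” idea is only a few lines; the tiny feature set below (price, promotion flag, week number) is made up for illustration.

```python
# K-Nearest Neighbor as a forecaster: find the most similar historical weeks
# and average their demand. The tiny feature set here is invented.
import pandas as pd
from sklearn.neighbors import KNeighborsRegressor

history = pd.DataFrame({
    "price": [9.99, 9.99, 8.49, 7.99, 9.49, 8.99],
    "promo": [0, 0, 1, 1, 0, 1],
    "week":  [10, 11, 12, 13, 14, 15],
    "units": [120, 115, 210, 260, 130, 190],
})

knn = KNeighborsRegressor(n_neighbors=3).fit(
    history[["price", "promo", "week"]], history["units"]
)

# Predict next week from the three most similar weeks in the history
print(knn.predict(pd.DataFrame({"price": [8.49], "promo": [1], "week": [16]})))
```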

“Your challenge is to pick the right algorithm and understand if it’s done a good job”

It doesn’t matter how it’s coded because you know how it works and you see the power – and the downsides – of it. You’re off to the races; you’ve got your first algorithm under your belt, you know the diagnostic statistics you need to look at, and you let the algorithm do the calculation for you. Your challenge is to pick the right algorithm and to understand whether it’s done a good job.


To add advanced analytics models to your bag of tricks, get your hands on Eric Wilson’s new book, Predictive Analytics For Business Forecasting. It is a must-have for the demand planner, forecaster, or data scientist looking to employ advanced analytics for improved forecast accuracy and business insight. Get your copy.

 

 

 

 

Big Data & Advanced Analytics Will Not Save Your Company

I’ve got bad news for you . . . Well, perhaps not bad news but certainly news that is important.

All that data you have been collecting? Most of it is probably junk.

Those advanced analytical tools? Worthless.

Pretty bold statements, right?

Not really.

In my nearly 30 years in supply chain, I have never seen a system or data that by itself had any value. It was the people who used them that added the value.

Example:

This statement has no value: “This item is 92% in stock.”

So? Is 92% good or bad? For an ongoing item, it might be too low. For a discontinued item it could well be too high, since the goal is to run out of inventory.

OK, smarty-pants, what is your point?

Your data and analytical tools are only as good as the people who use them.

With all the focus on data and analytical tools, I think we are in danger of losing focus on the one element that is key to their successful use: the people who use them.

Tools that are not properly used are worthless and can actually be harmful.

Do not give your 2-year-old son a hammer. Just some advice.

Now that I have your attention – and I am sure I have made some of you angry – here are 5 key practices that will make or break your big data and advanced analytical dreams.

1. People will not effectively use data they do not understand

A person who does not understand data probably will not use it well. In fact, they may abuse it to explain away a problem. Someone who does not grasp that your comp sales percentage is negative, or that your in-stock level of 92% is well under the expected level, cannot possibly take the right actions to address either problem. And saying, “We’re only 8% out of stock on the item – it can’t be that bad,” is missing the point of using the metric.

2. People need to know how to use and evaluate the data and systems they use

This is largely a matter of education and training. Users need to know that the systems they use have limitations (they all do) and that they are good for some processes and not good for others. For example, does your demand planning system allow for adjusting the history of an item? And does the user understand why this is important and how to do it properly? When do your reports update, and how are key measures (in-stock, lead time, fill rate, etc.) calculated? Users need the training and tools to evaluate both the systems they use and the data that drives them, so they can use them effectively.
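To illustrate why the calculation details matter, here is a small sketch of two common ways “fill rate” can be defined, using a hypothetical order-line table; your own system may define the measure differently.

```python
# Two common definitions of "fill rate" from a hypothetical order-line table;
# which one a system reports changes how its numbers should be read.
import pandas as pd

lines = pd.read_csv("order_lines.csv")  # assumed columns: qty_ordered, qty_shipped

# Unit fill rate: shipped units as a share of ordered units
unit_fill = lines["qty_shipped"].sum() / lines["qty_ordered"].sum()

# Line fill rate: share of order lines shipped complete
line_fill = (lines["qty_shipped"] >= lines["qty_ordered"]).mean()

print(f"Unit fill rate: {unit_fill:.1%}  Line fill rate: {line_fill:.1%}")
```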

3. Users need to trust the data and tools they use

This follows from point # 2 above.

Users who do not understand the data and tools they use cannot properly evaluate and effectively use them. Reports may have bad data and systems may fail to update properly. Users need to be able to spot when a report has errors or a system gives an incorrect output. Some of this comes only with experience, but training users to evaluate and question these is important, since they and others will be using them to make decisions about where to spend company resources – including their own time and energy.

4. Users need to understand how to use the data and tools they are given

This is where training is key, and where many companies suffer because they leave this to chance. The approach is often, “Here’s a manual – figure it out.” Some users can manage learning this way, but many cannot. And while training is expensive and often hard to justify, it amazes me what companies will tolerate as users “learn how to use the systems.” (And in the interest of full disclosure, yes, I have “accidentally” ordered $2M of unneeded inventory. In this business we all get many opportunities to make spectacular errors.)

5. Users need to know how to challenge the validity of the data they use and the value of the systems they manage

To me, this is the goal of all training. When users understand how the tools they use are constructed, can evaluate their usefulness for themselves, trust them, and know how to challenge the data behind them, they become truly effective users of those tools.

And only at this point does all the data you have compiled, and all the fancy systems you purchased start to add significant value to your company.

So the next question is, where are your users in this process of growth?

Your data and analytical tools are only as good as the people who use them.

I know, training is expensive and hard to justify. But this is usually because we don’t calculate the cost of not training our people. We simply live with the cost and inconvenience and call this a “cost of doing business.”

I have personally trained hundreds of users in effectively managing complex replenishment and reporting systems. It’s challenging to try to meet all the levels of ability and understanding in even a small class. But the payback in terms of productivity and the students’ sense of personal accomplishment – while hard to price – is worth all the effort.

And it’s also the only way that any company will get the full value out of the data and systems that they invest in.

 

 

Big Data? Chill Out & Keep It Old School

Over the past few years, the Demand Planning community has become quite starry-eyed over advancements in predictive software and tools. The concepts of “Big Data” and “advanced analytics” are enough to make seasoned practitioners stand to attention – and even catch the interest of the Executive Team. But when many of us still struggle with the fundamentals, is it worth investing in new-fangled technology?

Admittedly, in a field where you know you will never be “right”, this fancy technology and these impressive phrases are quite attractive – they bring to mind a picture of a utopian state where analytical horsepower and near-infinite data points lead to a 100% accurate forecast. There may even be a unicorn there. For me, they call to mind two much-loved colloquialisms that I urge Demand Planning professionals to consider as we journey into the future with new tools and ideas that may or may not usher in a new age of Demand Planning.

The only thing we know about Advanced Analytics is that you must have a clear, fully costed plan as to how it is going to provide a return.

“Is The Juice Worth The Squeeze?”

This is a phrase I probably use too frequently when considering new tools, methods, and processes to improve Demand Planning. Does the effort required to explore and/or implement the new approach measure up to its expected return? For some organizations, intensified data collection (or data purchase) and the machine capability to chug through it may be cost-prohibitive.

Computing equipment and data aside, the organization may not have the human resources on hand to give these capabilities their due. Perhaps the organization already enjoys a high level of forecast accuracy. Is the expenditure worth that extra percentage point? Maybe, maybe not. The only thing we know is that you must have a clear, fully costed plan as to how this new tech is going to provide a return.

“Don’t Throw The Baby Out With The Bathwater”

Or, “don’t throw the fundamentals out when you get your shiny new tools”. Even if your organization does decide to invest in Big Data and/or Advanced Analytics, it’s important to not abandon some of the tried-and-true measures and methodologies of effective forecasting. If your organization decides not to invest in these buzzworthy tools, there is still a great amount of improvement that can be made using some tried-and-true Demand Planning basics. Additionally, these concepts can assist in answering the juice-vs-squeeze question of a potential upgrade or data investment if the organization chooses to entertain new solutions. Some of the most impactful are as follows:

Put down the Big Data Kool Aid – FVA is great low-hanging fruit to pursue prior to making a new technology investment.

Forecast Value Add Analysis (FVA)

Whether or not Advanced Analytics and insights are in your future, the impact of a simple Forecast Value Add (FVA) analysis cannot be overemphasized. FVA is a measurement of your forecasting process – from the statistical models utilized, to the overrides added by analysts and the insights from salespeople. Each step in the forecasting process is measured to determine the added value the step brings to the overall process. Advanced Analytics or sophisticated tools could of course be an added forecasting layer to be measured, but I would caution that if steps in your process are continuing to devalue the forecast, there are things to look at first. Put down the Big Data Kool Aid – FVA is great low-hanging fruit to pursue prior to making a new technology investment.
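A bare-bones FVA check can be done in a few lines once you have the actuals and each process step’s forecast side by side; the column names below are hypothetical and the naive forecast is the usual benchmark.

```python
# A bare-bones FVA calculation: accuracy of each process step versus the step
# before it. Column names are hypothetical; assumes no zero actuals.
import pandas as pd

df = pd.read_csv("forecast_history.csv")  # actual, naive_forecast, stat_forecast, analyst_override

def accuracy(actual, forecast):
    return 1 - (abs(actual - forecast) / actual).mean()  # 1 - MAPE

steps = ["naive_forecast", "stat_forecast", "analyst_override"]
acc = {s: accuracy(df["actual"], df[s]) for s in steps}

for prev, curr in zip(steps, steps[1:]):
    print(f"{curr}: FVA of {acc[curr] - acc[prev]:+.1%} over {prev}")
```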

Keeping an eye on tracking signal is important no matter how sophisticated the forecasting methodology.

Tracking Signal

While somewhat reactive in nature, I love using tracking signal as an indicator to let me know if my forecast needs a second look. Tracking signal is simply a measure of consistent bias over time. In short, if actual demand has come in lower than forecasted for each of the last three months, you may want to reinvestigate your demand assumptions.

Not only is consistent under- or over- forecasting a reliable indication to an analyst that their projections may be incorrect, it is also a great signal of potential inventory shortages or surpluses. Keeping an eye on tracking signal is important no matter how sophisticated the forecasting methodology.
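One common way to compute it is the running sum of forecast errors divided by the mean absolute error; the numbers below are invented to show a forecast that is consistently too high.

```python
# Tracking signal: running sum of forecast errors divided by mean absolute
# error. Values drifting past roughly +/- 4 flag persistent bias. Data invented.
import pandas as pd

actual   = pd.Series([100, 98, 105, 96, 93, 90])
forecast = pd.Series([102, 104, 108, 105, 104, 103])  # consistently too high

error = actual - forecast
tracking_signal = error.cumsum() / error.abs().mean()
print(tracking_signal.round(2))  # grows steadily negative -> over-forecasting
```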


Forecast Accuracy

I’m sure many of you are rolling your eyes at this point. Of course we measure forecast accuracy – this isn’t even worth talking about! I challenge you to revisit and audit your metric. Most organizations are familiar with the debate on precisely when forecast accuracy should be measured – is it a month before the actuals are due to come in? Three months? A week before the actuals come in? The answer is likely that a measurement at material lead time is the most appropriate. After all, this is the time in which the supply chain can, in a perfect world, respond to the demand signal appropriately and without expediting.

Recent analysis in my own organization found that the traditional “T (time) minus a generic lead time” approach was not allowing us to gather the proper insights from our forecast accuracy metrics because lead times are so wildly disparate. As a result, a change to the metric was required and more insightful conversations are now being driven during the S&OP process.
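One way to implement an item-specific lead-time lag is sketched below, assuming archived forecast snapshots, actuals, and an item master with lead times; the table layouts and column names are hypothetical.

```python
# Scoring each forecast at its item's own lead time rather than a generic lag.
# Table layouts and column names are hypothetical.
import pandas as pd

fc = pd.read_csv("forecast_snapshots.csv")  # item, created_month, target_month, forecast
actuals = pd.read_csv("actuals.csv")        # item, target_month, actual
items = pd.read_csv("items.csv")            # item, lead_time_months

fc = fc.merge(items, on="item")
t = pd.to_datetime(fc["target_month"])
c = pd.to_datetime(fc["created_month"])
fc["lag"] = (t.dt.year - c.dt.year) * 12 + (t.dt.month - c.dt.month)

scored = (fc[fc["lag"] == fc["lead_time_months"]]
          .merge(actuals, on=["item", "target_month"]))

wmape = (scored["forecast"] - scored["actual"]).abs().sum() / scored["actual"].sum()
print(f"WMAPE measured at lead time: {wmape:.1%}")
```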

There’s No Unicorn In Your Advanced Analytics Utopia

The latest and greatest technologies offer a very tempting vision of what the future could be; after all, who doesn’t want the powerhouse predictive analytics of an Amazon or Target? However, it’s important to approach these decisions with a healthy dose of skepticism. Be mindful to evaluate the promises being made and ensure they are aligned with your needs. And, if the juice truly is worth the squeeze and you embark along the new frontier of Demand Planning, don’t forget the babies floating in that bathwater.

 
