Demand Planning, S&OP/IBP, Supply Planning, Business Forecasting Blog – https://demand-planning.com

The 6 Models Used In Forecasting Algorithms
https://demand-planning.com/2019/04/22/forecasting-algorithms/
Mon, 22 Apr 2019

An algorithm is a procedure or formula for solving a problem, based on conducting a sequence of finite operations or specified actions. Generally speaking, when most people talk about algorithms, they’re talking about a mathematical formula or something that is happening behind the scenes, like the operations that power our social media news feeds.

While these are indeed algorithms and they are a sequence of steps designed to perform a task, algorithms are more than just math.

An algorithm is any detailed procedure used to carry out a task or solve a problem, and may be as simple and ‘non-mathy’ as the recipe for baking a cake.

How Algorithms Work In Forecasting

In demand planning, where the cake we are baking is a forecast, our recipe generally entails different prediction methods and approaches, along with layers built from inputs from various sources. The steps and sequence of the inputs, the configuration of the methods, the repeating of steps, and the outputs all come together to form an algorithm.

And this can easily consist of multiple methods and inputs reduced to three logical operations: AND, OR, and NOT. While these operations can chain together in extraordinarily complex ways, at their core, algorithms are built out of simple logical associations and a limited series of steps.

What this means is that an algorithm can be anything you like, for example, an exponential smoothing model that takes an input, uses a set of rules, parameters and steps to deliver an output to your forecasting process.
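As a minimal sketch of that idea, here is simple exponential smoothing in a few lines of Python: an input series, one parameter, a repeated step, and an output. The demand numbers and the alpha value are illustrative, not from the article.

```python
def exponential_smoothing(series, alpha):
    """Simple exponential smoothing: each pass blends the newest
    observation with the previous smoothed value."""
    forecast = series[0]  # initialize with the first observation
    for value in series[1:]:
        forecast = alpha * value + (1 - alpha) * forecast
    return forecast

demand = [100, 102, 98, 105, 110, 108]  # illustrative weekly demand
print(round(exponential_smoothing(demand, alpha=0.3), 2))
```

A higher alpha weights recent demand more heavily; a lower alpha smooths out noise — choosing it is part of the "set of rules and parameters" the algorithm encodes.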

After you have properly defined the need and have the right data in the right format, you reach the predictive modeling stage, which analyzes different algorithms to identify the one that will best predict future demand for that particular dataset.

The 6 Models Commonly Used In Forecasting Algorithms

Yet, while there are many different algorithms available to us, there’s a smaller set of fundamental predictive modeling techniques that are typically applied in forecasting, including the following:

Clustering analysis: This technique is a way to help understand and analyze data by putting it into smaller manageable subgroups to highlight attributes and manage or make better predictions. The resulting classification model can be used both to categorize new records and to do predictive modeling against the data for the designated subgroups.
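As a sketch of the subgrouping idea, here is a bare-bones k-means pass that splits SKUs into "slow mover" and "fast mover" groups by sales volume. The data and the single-feature setup are illustrative; real clustering tools use richer attributes.

```python
import random

def kmeans(points, k, iterations=20, seed=42):
    """Bare-bones k-means on 1-D values: assign each point to its
    nearest centroid, then move each centroid to its cluster mean."""
    random.seed(seed)
    centroids = random.sample(points, k)
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# illustrative weekly volumes: a slow-mover group and a fast-mover group
volumes = [8, 10, 12, 9, 95, 100, 105, 98]
print(kmeans(volumes, k=2))
```

The two resulting centroids summarize the subgroups; new SKUs can then be categorized by which centroid they fall closest to, as the paragraph above describes.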

Descriptive analysis: This helps tell you what has happened in the past and attempts to analyze and characterize it, with an eye toward predicting similar events in the future. Describing past behavior and then applying predictive models to the resulting data helps to frame opportunities for operational improvement and identify new business opportunities.

Outlier analysis: Detecting the outlying values in a dataset helps identify noise and anomalies and improve prediction. A database may contain data objects that do not comply with the general behavior or model of the data; these may be isolated to better understand their impact or determine a calculated response.
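A common minimal approach is a z-score test: flag any value that sits more than a chosen number of standard deviations from the mean. The sales figures and the 450-unit promotion spike below are illustrative.

```python
def zscore_outliers(series, threshold=2.0):
    """Return values more than `threshold` standard deviations
    from the mean of the series."""
    n = len(series)
    mean = sum(series) / n
    std = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    return [x for x in series if abs(x - mean) > threshold * std]

weekly_sales = [120, 118, 122, 119, 121, 450, 117, 123]  # 450 = promo spike
print(zscore_outliers(weekly_sales))
```

Once isolated, the planner can decide whether the outlier is noise to exclude from the model or a real event (like a promotion) to model explicitly.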

Factor analysis: This helps you understand relationships and dependencies between different data variables to predict how they’ll affect one another going forward. The information enables you to predict future developments related to the dependent variable based on what happens with related factors.
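Full factor analysis is more involved than a single statistic, but its basic building block is measuring how strongly two variables move together. A Pearson correlation sketch, with hypothetical promotion-spend and sales data:

```python
def pearson(x, y):
    """Pearson correlation coefficient: +1 means the variables move
    together perfectly, -1 perfectly opposite, 0 no linear relation."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

promo_spend = [0, 10, 20, 30, 40]        # hypothetical driver variable
units_sold = [100, 115, 128, 144, 160]   # hypothetical dependent variable
print(round(pearson(promo_spend, units_sold), 3))
```

A correlation this strong suggests promotion spend is a factor worth building into the forecast for the dependent variable.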

Time series analysis: This looks at a collection of values observed sequentially over time and is used to perform time-based predictions. Assuming that past data patterns such as level, trend, and seasonality repeat, this can create models using only the history of the series being forecasted to predict future patterns.
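One simple example is Holt's linear method, which smooths a level and a trend separately and projects them forward — a model built from nothing but the series' own history. The demand history and smoothing parameters below are illustrative.

```python
def holt_forecast(series, alpha, beta, horizon):
    """Holt's linear method: smooth the level and the trend
    separately, then project the trend `horizon` periods ahead."""
    level, trend = series[0], series[1] - series[0]
    for value in series[1:]:
        prev_level = level
        level = alpha * value + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + horizon * trend

history = [50, 54, 59, 63, 68, 72]  # illustrative series with upward trend
print(round(holt_forecast(history, alpha=0.5, beta=0.3, horizon=2), 1))
```

Adding a seasonal component on top of level and trend gives the Holt-Winters family of models often found in demand planning software.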

Regression analysis: This helps you understand relationships and predict continuous variables based on other variables in the dataset. The technique is designed to identify meaningful relationships among data variables, specifically the connections between a dependent variable and other independent factors that may or may not affect it.
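The simplest case is ordinary least squares with one predictor: fit a slope and intercept relating an independent factor to the dependent variable. The temperature-versus-sales numbers below are illustrative.

```python
def linear_regression(x, y):
    """Ordinary least squares for one predictor: returns the slope
    and intercept of the best-fit line y = slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

temps = [60, 65, 70, 75, 80]         # independent variable (e.g., degrees F)
sales = [200, 240, 290, 330, 380]    # dependent variable (units)
slope, intercept = linear_regression(temps, sales)
print(round(slope, 2), round(intercept, 2))
```

With the fitted line in hand, a forecasted value of the independent factor translates directly into a predicted value for demand.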

We’ll be discussing predictive analytics and data science at IBF’s Predictive Business Analytics, Forecasting & Planning Conference in New Orleans from May 6-8, 2019. Join us at Harrah’s Hotel in the heart of the city for 3 days of world-leading training, networking and socializing, featuring a special Data Science Workshop to help attendees leverage the latest forecasting techniques.

The Quantitative Skills Gap Means Leveraging Machine Learning Is Still 10 Years Away
https://demand-planning.com/2019/01/16/machine-learning-is-still-10-years-away/
Wed, 16 Jan 2019

As we start each new year, there comes a fresh list of ideas or prognostications about future trends in demand planning and S&OP. These days everyone is using the word digital—digital supply chain, digital transformation, digital quests, and digital horizons. Clearly, we are in the middle of a transformative moment. The real question is whether the moment is truly “digital” or actually something else.

Putting aside my general dislike for trendy projections and overly aspirational future-based dialogues, I believe that something else fundamental is happening and it is overshadowed by all the talk about digital everything.

I think we have entered the age of algorithms.

And it is this transformation that is more important and warrants more buzzworthy consideration than any talk of digital revolutions.

If we look at Amazon as an example, their pricing arbitrage is one of those black box curiosities (i.e., algorithmic/heuristic) that folks try to reverse engineer. The same goes for their stocking model and forecasting model. In 2012, Amazon filed for the patent officially known as “method and system for anticipatory package shipping”, an algorithm-based system that could conceivably ship products before you even place an order.

Algorithms, algorithms, algorithms.

The Revolutionary Tools Are All Algorithm Based

Nearly all of the emerging supply chain technologies under recent discussion, including machine learning and deep learning, artificial intelligence, predictive analytics, demand sensing, natural language processing, and blockchain, use algorithms of some sort.

Even facial recognition and other biometric tools use algorithms.

In the end, the real technology worth noting is the explosion of applied mathematics tools, not really “digital” anything. Digital seems noteworthy only because there is now a plethora of data to feed the algorithms. Ultimately, it is the algorithms that matter.


Revolutionary? Sure, But Don’t Get Too Excited Yet

Some posit that these technologies will change the fundamental nature of supply chain planning—be it supply or demand planning—potentially eliminating these roles.

As a pragmatist, I just shake my head, smirk, and move along to the next topic; it is silliness.

One recent projection was that machine learning will be used in 60% of demand planning tools by 2024. This is a perfect example of why I dislike discussions about future trends. Sixty percent represents a huge market share, and the projection fails to account for the organizational inertia inherent at companies that already have embedded demand planning applications. Sixty percent by 2024? I don’t see it happening.

Frankly, I believe it will take four to five years alone just to develop meaningful use cases, trials, and a return on investment model worth even considering. At best, five years to reach 60% is a stretch.

But this is the least of my concerns about such predictions.

Availability of Technology Is One Thing, Being Able To Use It Is Another

In the mid- to late ’90s, when I worked at supply chain software company Numetrix, we found ourselves in a quandary.

We had great tools, designed for sophisticated supply chains, but we were trying to sell to a prospective user base largely uninformed about applied mathematics and thus ill-prepared to appreciate the potential power of our tools or their advantages compared with competitors’ tools.

As an example, we offered the first graphical linear programming application. It was wonderful, but only a limited subset of our prospect list had a clue about linear programming, making it terribly difficult to advocate the use of our tool to optimize supply chains. Only a handful of top companies “got it”.
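The Numetrix application itself isn't reproduced here, but the flavor of the product-mix problems such tools optimize can be sketched in a few lines. For a linear program in two decision variables, the optimum lies at a vertex of the feasible region, so brute-force vertex enumeration suffices; all of the profit and capacity numbers below are illustrative.

```python
from itertools import combinations

# Maximize profit 30*x + 40*y for two products, subject to
# capacity constraints written as a*x + b*y <= c (illustrative).
constraints = [
    (2, 3, 120),    # machine hours
    (4, 2, 140),    # labor hours
    (1, 0, 30),     # x <= 30 (demand cap on product x)
    (0, 1, 35),     # y <= 35 (demand cap on product y)
    (-1, 0, 0),     # x >= 0
    (0, -1, 0),     # y >= 0
]

def solve_2d_lp(constraints, objective):
    """Enumerate pairwise intersections of the constraint lines and
    keep the feasible vertex with the best objective value."""
    best = None
    for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if det == 0:
            continue  # parallel lines: no single intersection point
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + 1e-9 for a, b, c in constraints):
            value = objective(x, y)
            if best is None or value > best[0]:
                best = (value, x, y)
    return best  # (profit, x, y)

print(solve_2d_lp(constraints, lambda x, y: 30 * x + 40 * y))
```

Real supply chain LPs have thousands of variables and use simplex or interior-point solvers rather than enumeration, but appreciating even this tiny version was exactly the hurdle our prospects faced.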

The talent pool is ill-equipped to fill the near term needs to support these new-fangled tools

The same was true for our finite capacity scheduling tool. The difficulty went beyond the sales cycle: these tools were so complex that we had to hire industrial engineering grads from some of the best operations research schools to help us implement them.

The potential userbase has to be ready for the revolution

These were heady applications, and the user base was simply not prepared for the revolution that such tools brought to supply chain thinking. The current expansive use of algorithms gives me similar cause for pause, and prompts me to revisit the same concerns I had twenty-plus years ago.

Most of today’s emerging technologies leverage higher-end math, yet the talent pool is limited and ill-equipped to fill the near-term needs to support these new-fangled tools. I often joke at conferences that many people in business develop severe headaches whenever any math is discussed. Therein lies the real problem: we are not quantitative enough in our work and decision-making processes.

Companies Don’t Understand Math Well Enough Yet To Use New Algorithmic Tools

I believe that the general lack of quantitative skills will not only slow the development of many of today’s hot, new applications but also diminish the acceptance of—and satisfaction with—the tools once in place as well as their potential benefit streams.

I see the quantitative skills gap as a dampening factor, slowing acceptance of many emerging concepts.

The real need that I see is to fast-forward education in the use of quantitative methods, so that these tools can provide maximum benefit to their potential users.

And I suspect it will take 10 or more years until users learn and understand the math well enough so that the marketplace will finally embrace the tools.

I hope I am wrong.

In the meantime, I will wait for use cases (or look for my own) for these tools with an eager and curious mind, tempered not to chase every shiny new object that comes down the road (until, of course, it proves worthy of my attention). Algorithms are here to stay, and I encourage everyone to learn as much as they can about the math underlying these application sets. It is the present and the future.
