Demand Planning, S&OP/IBP, Supply Planning, Business Forecasting Blog – https://demand-planning.com

How Much Data Is Enough In Predictive Analytics?
Mon, 22 Jun 2020
https://demand-planning.com/2020/06/22/how-much-data-is-enough-in-predictive-analytics/

If we can gain insights from just a small amount of internal structured data, then how much more could we glean from Big Data? I'm talking about that external mass of structured and unstructured data that is just waiting to be collected and analyzed.

But there's a balance between not enough data and too much. What's the right amount of data to work with as a demand planner or data scientist?

There is a debate about how much data is enough and how much data is too much. According to some, the rule of thumb is to think smaller and focus on quality over quantity. On the other hand, Viktor Mayer-Schönberger and Kenneth Cukier explained in their book Big Data: A Revolution That Will Transform How We Live, Work, and Think, that “When data was sparse, every data point was critical, and thus great care was taken to avoid letting any point bias the analysis. However, in many new situations that are cropping up today, allowing for imprecision—for messiness—may be a positive feature, not a shortcoming.”

The obsession with exactness is an artifact of the information-deprived analog era.

Of course, larger datasets are more likely to have errors, and analysts don’t always have time to carefully clean each and every data point. Mayer-Schönberger and Cukier have an intriguing response to this problem, saying that “moving into a world of big data will require us to change our thinking about the merits of exactitude. The obsession with exactness is an artifact of the information-deprived analog era.”

Supporting this idea, some studies in data science have found that even massive, error-prone datasets can be more reliable than smaller, simpler samples. The question is, therefore, are we willing to sacrifice some accuracy in return for learning more?

Like so many things in demand planning and predictive analytics, one size does not always fit all. You need to understand your business problem, understand your resources, and understand the trade-offs. There is no rule about how much data you need for your predictive modeling problem.

The amount of data you need ultimately depends on a variety of factors:

The Complexity Of The Business Problem You’re Solving

Not necessarily the computational complexity (although this is an important consideration). How important is precision versus information? You should define the business problem and then select the data closest to that goal. For example, if you want to forecast the future sales of a particular item, the historical sales of that item may be the closest to that goal. From there, other drivers that may contribute to future sales or help explain past sales should be next. Attributes that have no correlation to the problem are not needed.
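As an illustration of screening candidate drivers for relevance, here is a minimal sketch in Python; the column names and figures are invented for illustration. It ranks candidate features by their correlation with the sales history of the item being forecast:

```python
import pandas as pd

# Hypothetical monthly history for one item: sales plus candidate demand drivers.
df = pd.DataFrame({
    "sales":         [120, 135, 128, 150, 160, 155, 170, 182, 175, 190, 205, 198],
    "price":         [9.9, 9.9, 9.5, 9.5, 9.0, 9.0, 8.9, 8.9, 8.9, 8.5, 8.5, 8.5],
    "promo_weeks":   [0, 1, 0, 1, 2, 1, 2, 2, 1, 2, 3, 2],
    "web_visits":    [800, 860, 840, 900, 980, 950, 1020, 1100, 1060, 1150, 1230, 1200],
    "unrelated_kpi": [5, 7, 6, 5, 7, 6, 5, 7, 6, 5, 7, 6],
})

# Rank candidate drivers by absolute correlation with sales; weakly correlated
# attributes are candidates to leave out of the model.
correlations = df.corr()["sales"].drop("sales").abs().sort_values(ascending=False)
print(correlations)
```

Correlation is only a first screen, but it is a cheap way to test whether a data source is close enough to the business problem to be worth collecting at scale.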

The Complexity Of The Algorithm

How many samples are needed to demonstrate performance or to train the model? For some linear algorithms, you may find you can achieve good performance with a few dozen to a hundred examples per class. For some machine learning algorithms, you may need hundreds or even thousands of examples per class. This is true of nonlinear algorithms like random forests or artificial neural networks. In fact, some algorithms, such as deep learning methods, can continue to improve in skill as you give them more data.
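One practical way to gauge whether you have enough observations for a given algorithm is a learning curve: train on progressively larger slices of the data and watch where validation error stops improving. Here is a minimal sketch with scikit-learn, using synthetic data as a stand-in for a real demand dataset:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import learning_curve

# Synthetic stand-in for a forecasting dataset (feature matrix X, demand y).
X, y = make_regression(n_samples=2000, n_features=10, noise=10.0, random_state=0)

# Fit a nonlinear model on growing fractions of the data and record validation error.
sizes, train_scores, valid_scores = learning_curve(
    RandomForestRegressor(n_estimators=100, random_state=0),
    X, y,
    train_sizes=np.linspace(0.1, 1.0, 5),
    cv=3,
    scoring="neg_mean_absolute_error",
)

# When the validation error flattens out, extra observations add little skill.
for n, score in zip(sizes, valid_scores.mean(axis=1)):
    print(f"{n:5d} samples -> validation MAE {-score:,.1f}")
```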

How Much Data Is Available

Is the data's volume, velocity, or variety beyond your company's ability to store, process, or use it? A great starting point is working with what is available and manageable. What kind of data do you already have? In business-to-business settings, most companies already possess customer records or sales transactions. These datasets usually come from CRM and ERP systems. Many companies are also collecting, or beginning to collect, third-party data in the form of POS data. From here, consider other sources, both internal and external, that can add value or insights.

Summary

This does not settle the debate, and the "right" amount of data remains unknowable. Your goal should be to keep thinking big while working with what you have: gather the data you need for the specific problem and algorithm at hand.

When it comes to gathering data, remember the proverb about planting a tree: the best time was ten years ago, the second-best time is now. Focus on the data and insights you have available today while building the roadmap and capabilities you want for the future. Even if you won't use it now, don't wait until tomorrow to start collecting what you may need tomorrow.

 

 

Big Data & Advanced Analytics Will Not Save Your Company
Tue, 26 May 2020
https://demand-planning.com/2020/05/26/big-data-advanced-analytics-will-not-save-your-company/

I’ve got bad news for you . . . Well, perhaps not bad news but certainly news that is important.

All that data you have been collecting? Most of it is probably junk.

Those advanced analytical tools? Worthless.

Pretty bold statements, right?

Not really.

In my nearly 30 years in supply chain, I have never seen a system or dataset that had any value by itself. It is the people who use them that add the value.

Example:

This statement has no value: “This item is 92% in stock.”

So? Is 92% good or bad? For an ongoing item, it might be too low. For a discontinued item it could well be too high, since the goal is to run out of inventory.

OK, smarty-pants, what is your point?

Your data and analytical tools are only as good as the people who use them.

With all the focus on data and analytical tools, I think we are in danger of losing sight of the one element that is key to their successful use: the people who use them.

Tools that are not properly used are worthless and can actually be harmful.

Do not give your 2-year old son a hammer. Just some advice.

Now that I have your attention – and I am sure I have made some of you angry – here are 5 key practices that will make or break your big data and advanced analytical dreams.

1. People will not effectively use data they do not understand

A person who does not understand data probably will not use it well. In fact, they may abuse it to explain away a problem. Someone who does not grasp that your comp sales percentage is negative, or that your in-stock level of 92% is well under the expected level, cannot possibly take the right actions to address either problem. And saying, "We're only 8% out of stock on the item – it can't be that bad", is missing the point of using the metric.

2. People need to know how to use and evaluate the data and systems they use

This is largely a matter of education and training. Users need to know that the systems they use have limitations (they all do) and that they are good for some processes and not for others. For example, does your demand planning system allow for adjusting the history of an item? And does the user understand why this is important and how to do it properly? When do your reports update, and how are key measures (in-stock, lead time, fill rate, etc.) calculated? Users need the training and tools to evaluate both the systems they use and the data that drives them, so they can use them effectively.
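To make that concrete, here is a toy sketch with invented numbers and simplified formulas showing how two of those measures might be computed. A user who knows what sits behind the report can judge whether a figure like 92% in-stock is good or bad for the item in question:

```python
# Toy calculations behind two common metrics users are asked to act on.
days_in_stock, days_tracked = 23, 25      # item availability over the reporting window
units_shipped, units_ordered = 460, 500   # order fulfilment over the same window

in_stock_pct = 100 * days_in_stock / days_tracked
fill_rate = 100 * units_shipped / units_ordered

print(f"In-stock:  {in_stock_pct:.0f}%  (share of days the item was available)")
print(f"Fill rate: {fill_rate:.0f}%  (share of ordered units shipped)")
# Whether 92% is "good" depends on the item: probably too low for an ongoing SKU,
# possibly too high for one that is being discontinued.
```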

3. Users need to trust the data and tools they use

This follows from point # 2 above.

Users who do not understand the data and tools they use cannot properly evaluate and effectively use them. Reports may have bad data and systems may fail to update properly. Users need to be able to spot when a report has errors or a system gives an incorrect output. Some of this comes only with experience, but training users to evaluate and question these is important, since they and others will be using them to make decisions about where to spend company resources – including their own time and energy.

4. Users need to understand how to use the data and tools they are given

This is where training is key, and where many companies suffer because they leave this to chance. The approach is often, "Here's a manual – figure it out." Some users can manage learning this way, but many cannot. And while training is expensive and often hard to justify, it amazes me what companies will tolerate as users "learn how to use the systems." (And in the interest of full disclosure, yes, I have "accidentally" ordered $2M of unneeded inventory. In this business we all get many opportunities to make spectacular errors.)

5. Users need to know how to challenge the validity of the data they use and the value of the systems they manage

To me, this is the goal of all training. When users understand how the tools they use are constructed, can evaluate their usefulness for themselves, trust them, and know how to use them effectively, they can then challenge the data and systems constructively and get real value from them.

And only at this point does all the data you have compiled, and all the fancy systems you purchased start to add significant value to your company.

So the next question is, where are your users in this process of growth?

Your data and analytical tools are only as good as the people who use them.

I know, training is expensive and hard to justify. But this is usually because we don’t calculate the cost of not training our people. We simply live with the cost and inconvenience and call this a “cost of doing business.”

I have personally trained hundreds of users in effectively managing complex replenishment and reporting systems. It's challenging to try to meet all the levels of ability and understanding in even a small class. But the payback in terms of productivity and the students' sense of personal accomplishment – while hard to price – is worth all the effort.

And it’s also the only way that any company will get the full value out of the data and systems that they invest in.

 

 

Overcoming The Challenges of Big Data
Tue, 17 Dec 2019
https://demand-planning.com/2019/12/17/overcoming-the-challenges-of-big-data/

We can no longer ignore data. Now that we have begun to define it and find new ways of collecting it, we see it everywhere and in everything humans do. Our current output of data is roughly 2.5 quintillion bytes a day and as the world becomes ever more connected with an ever-increasing number of electronic devices, it will grow to numbers we haven’t even conceived of yet. 

We refer to this gigantic mass of data as Big Data. First identified by Doug Laney, then an analyst at Meta Group Inc., in a report published in 2001, Big data has commonly been defined as “information that is high-volume, high-velocity, and/or high-variety beyond normal processing and storage that enables enhanced insights, decision making, and automation”. 

The problem is that “high volume” and “normal” are relative to your company size and capabilities. For this reason, I prefer to look at Big Data as a continual growth in data Volume, Velocity, and Variety beyond your company’s ability to store or process or use it.

The Problem With Big Data

The challenge with the sheer amount of data available is assessing it for relevance. The faster the data is generated, the faster you need to collect and process it. Not only that, data can be structured in many different ways and comes from a wide variety of sources that need to be tied together and sorted out. And finally, when we talk about Big Data, we tend to think of it as raw information and not about the strategies to deal with it or the tools to manage it.

[Credit: Doug Laney]

Volume, Velocity and Variety – dubbed the three Vs – are used to describe Big Data according to three vectors and are key to understanding how we can measure big data compared to traditional datasets or collection methods.

Volume

Big data is about volume and refers to the large amount of data involved. The size of available data is growing at an increasing rate. If there ever was "small data", it was generated internally from enterprise transactional systems and stored on local servers. Today, businesses are constantly collecting data from many different outlets like social media, website lead captures, emails, eCommerce and more. This has begun to outgrow organizations' capabilities to manage these larger volumes of data – a major issue for those looking to put that new data to use instead of letting it go. If this sounds familiar, you are dealing with Big Data, and it's probably a big headache.

More data sources creating more data combine to increase the volume of data that has to be analyzed. The world holds an enormous amount of data, possibly an incomprehensible amount. With over 90% of today's data having been generated in the past 2 years, that comes to about 2.5 quintillion bytes daily. Perhaps 10 or 15 years ago, terabytes qualified as high-volume data, but these days you're not really in the Big Data world unless you're dealing with petabytes (1,000 TB) or exabytes (1 million TB).

To deal with these larger volumes of data, companies are moving from disparate, siloed data sources to data lakes, data warehouses, and data management systems. Storage is shifting from local servers to the cloud and external partners like Amazon and others. For processing, we are considering Apache tools like Hadoop and Spark. Business intelligence software for data cleansing and data visualization is becoming more prevalent. And in predictive analytics, we are considering new methods and approaches to analyze larger sets of data and capture greater insights.

Velocity

Velocity measures how fast the data is coming in. Big Data isn't just big; it's growing fast. It's also coming in at lightning speed and needs to be processed just as quickly. In the olden days (3 to 5 years ago), companies would usually analyze data using a batch process. That approach works when the incoming data rate is slower than the batch processing rate and when the result is still useful despite the delay. But with the new sources of data and the need to be more agile in decision making, the batch process breaks down. Data now streams into the server in real time, in a continuous fashion, and the result is only useful if the delay is very short.

Think about how many website clicks, consumer transactions and interactions, or credit card swipes are being completed every minute of every day. Consider the sheer number of SMS messages, the 300,000 social media status updates, the 140,000 photos uploaded, and the 500,000 comments made every minute. Add to this the Internet of Things and the constant real time transmissions and you’ll have a good appreciation of the speed at which data is being created.

We need real-time (or close to real-time) tools to collect, analyze, and manage all this data, and then act on it. Demand sensing is key here: capturing demand signals, predicting demand from them, and producing an actionable response with little to no latency.

According to the Summer 2012 issue of The Journal of Business Forecasting, demand sensing sorts out the flood of data in a structured way to recognize complex patterns and to separate actionable demand signals from a sea of “noise.”
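There is no single demand sensing algorithm, but the core idea of damping noise while still reacting quickly to a genuine shift can be sketched in a few lines: exponentially smooth a daily POS feed and flag days where actuals deviate from the smoothed signal by more than a chosen threshold. The numbers and thresholds below are purely illustrative:

```python
# Toy demand sensing sketch: smooth a daily POS stream and flag unusual jumps.
daily_pos = [102, 98, 105, 101, 99, 104, 100, 103, 160, 171, 168, 175]  # units sold per day

alpha = 0.3        # smoothing weight: higher reacts faster, lower damps more noise
threshold = 0.25   # flag when actuals deviate more than 25% from the smoothed signal

smoothed = daily_pos[0]
for day, actual in enumerate(daily_pos, start=1):
    deviation = (actual - smoothed) / smoothed
    if abs(deviation) > threshold:
        print(f"Day {day}: actual {actual} vs expected {smoothed:.0f} -> possible demand shift")
    smoothed = alpha * actual + (1 - alpha) * smoothed
```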

Besides this, velocity also calls for building Big Data solutions that incorporate data caching, periodic extractions and better data orchestration, and deploying the right architecture and infrastructure.

Variety

Data is big, data is fast, but data can also be extremely diverse. Data variety refers to all the different types of data available. Data was once collected from one place (more than likely internal) and delivered in one format, typically structured files and databases such as Excel, CSV and Access. Now there is an explosion of external data in multiple forms, and unstructured data that doesn't fit neatly on a spreadsheet. This, more than any of the other vectors, can quickly outpace an organization's ability to manage and process its data.

Variety is one of the most interesting developments in data as more and more information is digitized. A few decades ago, data would have lived in a structured database or a simple text file. Nowadays we no longer have control over the input data format. Consider the customer comments, SMS messages, or anything on social media that helps us better understand consumer sentiment. How do we bring together all the transactional data, POS data from trading partners, and sensor data we collect in real time? Where do we put it?

Although this data is extremely useful to us, it can create more work and requires more analytics to decipher it so it can provide insights. To help manage the variety of data, there is also a variety of techniques for resolving these problems. We no longer just extract and load; we are now importing data into universally accepted and usable formats such as Extensible Markup Language (XML). To sort through the volume and variety of data, we are using data profiling techniques to find interrelationships and abnormalities between data sources and data sets.

The Bottom Line

Big Data is much more than a buzzword or simply lots of data. It is a way to describe new types of data and new potential for greater insights. The three Vs do well to describe the data, but we should remember that even Big Data is still made up of small building blocks. For Big Data to be valuable, we need more data coming in faster from multiple sources – and we need the systems, analytics, techniques, and people to manage that process and derive value from it.

[Editor's note: The 3 Vs in Big Data concept is taken from "3D Data Management: Controlling Data Volume, Velocity, and Variety", Gartner/META Group, File No. 949, 6 February 2001]

 

 

 

Demand Planners’ Guide To Surviving the Amazon Onslaught
Mon, 21 May 2018
https://demand-planning.com/2018/05/21/demand-planners-guide-to-surviving-the-amazon-onslaught/

The "Amazon effect" has already changed shipping, logistics, employment and consumer behavior, not to mention giving a good kicking to brick-and-mortar stores. As a result, Demand Planning and Forecasting is very different from how it was just a few years ago, and our jobs have evolved drastically. We must understand that Big Data and shifting consumer demand have changed everything.

New Opportunities And Challenges in Demand Forecasting

When it comes to Demand Planning and Forecasting, Amazon, and e-Commerce generally, has been a phenomenally disruptive force. This new world is impacting what we forecast, how we forecast it, and when we forecast it. This creates challenges, but there is also a key opportunity presented by the new e-Planning environment, and that lies in the abundance and availability of data. Big Data is driving the following fundamental changes in Demand Planning:

  • Machine Learning and Neural Networks are replacing traditional time series methods
  • Web crawlers are replacing syndicated data
  • We are using more real-time or daily forecasts at a more granular level
  • We are using more analytical decision making and demand shifting techniques

Adapt Your Techniques

In the new e-Planning world, we are forecasting orders less and looking at a wider variety of inputs. The most well-established forecasting techniques are based on historical demand and they have served us well, but with e-Commerce we have a new wealth of information that tells us not only about the future, but also about the present. Predictive analytics encompasses a variety of statistical techniques like predictive modeling, Machine Learning and data mining that analyze current and historical facts to make predictions about the future – all in a way that we could never hope to replicate with time series forecasts.

Once you understand the drivers, you can influence demand like never before.

Instead of looking at just shipments or sales history, we have access to website clicks, rankings and the number of customer reviews. Using a multivariate regression or recurrent neural networks, for example, you can isolate variables and determine their impact to predict future sales. The greatest benefit of looking at these attributes and variables is that we may not only be able to predict sales by week going forward, but also understand what will change and by how much.
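As a minimal sketch of that idea (the weekly figures and column names below are hypothetical), a multivariate regression relates unit sales to a few e-commerce drivers so you can read off each driver's estimated impact and project sales forward:

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical weekly e-commerce data for one item.
data = pd.DataFrame({
    "web_clicks":   [5200, 5400, 6100, 5900, 7000, 7400, 8100, 8600],
    "search_rank":  [12, 11, 9, 10, 7, 6, 5, 4],    # lower rank = more visible
    "review_count": [150, 158, 170, 176, 190, 205, 221, 240],
    "units_sold":   [310, 325, 370, 355, 430, 455, 505, 540],
})

X = data[["web_clicks", "search_rank", "review_count"]]
y = data["units_sold"]
model = LinearRegression().fit(X, y)

# Coefficients approximate each driver's impact on weekly unit sales.
for name, coef in zip(X.columns, model.coef_):
    print(f"{name}: {coef:+.3f} units per one-unit change")

# Project next week's sales from assumed driver values.
next_week = pd.DataFrame({"web_clicks": [9000], "search_rank": [4], "review_count": [255]})
print("Predicted units:", round(float(model.predict(next_week)[0])))
```

With only a handful of weeks of history the coefficients will be unstable, which is exactly why the earlier question of how much data an algorithm needs still matters here.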

Prediction is becoming more about behavior than history. This is powerful because once you understand the drivers, you can influence demand like never before. In the age of Big Data, therefore, we can be proactive instead of reactive. Amazon is a perfect example of this; when shopping on the Amazon website, you will have seen the recommendation engine. It recommends items based on what you purchased before on the same website. Your recommendations are filtered by a variety of drivers such as genre, price, brand, and interests.

Be More Agile

In today's business environment, changes in the marketplace are swift and sudden and may not follow the historical pattern, meaning time series models cannot always be relied on for accurate forecasts. The new e-Planning environment is not only dynamic, it operates on the power of technology and innovation. If it is still taking you a month to gather data, add assumptions, and develop a forecast, I'll tell you now that you won't be able to keep up. But you may know that already.

The winners in this new era will be the ones that can see, interpret, and act on data the most efficiently.

In many industries, forecasting actual product sales is often a lengthy business process. With e-Commerce, however, you are competing almost in real time with price, features, and delivery promises. And feedback comes just as quickly in the form of reviews and competitive responses. To be more agile, companies are looking at demand sensing techniques to translate the drivers into rules-based or machine-learned responses. This brings us closer not only to the level of demand, but also to demand intent.

Even more importantly, the e-Business environment provides an opportunity to readily collect information about potential or future buyers. Along the different channels of communication between an organization and its potential customers, most companies now routinely log every visit to the product webpages, every call made to an inquiry response center and every email that was received. Many organizations also use every customer touch point as an opportunity to perform a brief customer survey to collect information about their customers and comments on their products.

Where traditional demand sensing focuses purely on Point Of Sale (POS) data from retailers aggregated weekly, you are now absorbing sales on an hourly basis or even quicker. Where third-party syndicated data from Nielsen and others could help you better understand markets and competitors, now you have web crawlers that traverse multiple sites and bring you relevant data whenever you want. We have more data than ever, and we're getting it faster than ever before – the winners in this new era will be the ones that can see, interpret, and act on it the most efficiently.

Get Detailed On Promotions

Targeted marketing has created complications in the analysis of promotional effects. The traditional way of applying a general “lift factor” to nominal demand when a certain promotion is performed may not be adequate.  Add to this the changes we are seeing with dynamic pricing models, and time series forecasting reveals its limitations.

Online reviews correlate to sales so much that you can even use them in modeling.

Changes in the selling price and the presence of product promotions are known to have a significant effect on demand in many industries. Today, in large part due to analytics and the proliferation of data, price changes and promotions are cheaper to run and have greater impact. A price change on the web, or a pop-up on your website that shifts demand, incurs little incremental cost. Even in traditional retail stores, the day will soon come when a button on a computer is pressed to issue a price change, and new prices will be reflected on an electronic label in a physical store a few seconds later. Such opportunities imply that price changes and promotion actions may be used very frequently, and so they can no longer be analyzed separately from "normal" demand. This requires a disciplined process to capture the information in a timely manner.

Offer What the Customer Wants

One of the biggest challenges we have seen in demand forecasting over the past few years is shorter and shorter product life cycles. Shorter cycles are necessary to meet consumer demand, but in the e-Planning realm they are compounded by a whole new set of challenges. Why? Because constant new products do little for building up the necessary reviews in the e-Commerce marketplace. Think of your experience shopping online – you have one item with 2,000 reviews and a 4-star rating and another with just 5 reviews and a rating of 4. It is actually more beneficial to keep older items with good rankings than to introduce new items every 18 months. Online reviews correlate to sales so much that you can even use them in modeling.

This works well for items with multiple reviews, but we still need to replace poorer-performing products. With e-Commerce going directly to consumers, speed to market is crucial. What's more, it makes planning very difficult. Because individual products are phased out and new products come in constantly, and because the hierarchy might be reorganized fairly frequently to reflect the fast-changing business environment (e.g. gaining or losing major customers or markets), the product hierarchy is dynamic.

And there you have it. Amazon has totally revolutionized the marketplace, and with it demand forecasting and Demand Planning. If there's one concept that all forecast analysts and Demand Planners must understand, it is that companies will live and die by their ability to gather, interpret and act on data.

 

 

 

 

 

Predictive Analytics: Real Life Use Cases
Thu, 29 Mar 2018
https://demand-planning.com/2018/03/29/predictive-analytics-real-life-use-cases/

Since 2011, I’ve been teaching Supply Chain Risk Management at Lehigh University. Developing a new MBA class in Supply Chain Risk for our supply chain MBA students gave us the opportunity to explore the landscape of Supply Chain disruptions and the statistical tools and techniques in the areas of Predictive Analytics and Big Data that are improving supply chain performance. Let’s start with the data landscape as of today, and chat about a few use cases where Big Data has real, practical applications.

According to IBM’s 2014 Big Data Study:

  • More than 90% of all data that exists in the world today was created in just the past 2 years!
  • We're in an age where more than 2.5 quintillion bytes of data are created EVERY DAY!
  • That's more data than all the data in all the libraries in the USA.

Here’s IBF’s definition of  Predictive Analytics:

“A group of statistical techniques including modelling, machine learning, and data mining that are used to analyze current and historical data in an effort to create projections about the future.”

The exciting aspect of Predictive Analytics in supply chain and risk management is that the computing power has now caught up to the algorithmic strength in the discipline, creating huge opportunity to leverage these age-old tools to enhance supply chain performance and mitigate supply chain risk.

Use Cases Reveal Predictive Analytics To Be a Growth Driver

As of today, there are over 20 different industries exploring the use of Predictive Analytics and Big Data. A recent E&Y study stated that 66% of companies with well-established advanced analytics are reporting operating margins and revenue growth of 15% or more. And 60% of the respondents said they also have improved their risk profiles. Very exciting! Let’s take a look at some quick-hit use cases.

Lowe’s Robot

Starting in 2016, Lowe’s began to utilize autonomous retail robots to help both customers and store staff. It helps customers find products while it helps staff by scanning items and identifying price discrepancies. More interestingly for us, however, is that it tracks inventory. It knows what is being sold and where, all in real-time. That provides great visibility into demand and supply, enabling its supply chain to react faster. The robot is bilingual and driven by Predictive Analytics – and could soon be replicated in a variety of retail environments.


IBM and Watson

Accenture's 2016 report into the Internet of Things concluded that Predictive Analytics could save up to 12% on scheduled repairs and 30% on maintenance, while reducing breakdowns by up to 70%. When it comes to reducing breakdowns, this is where Predictive Analytics is truly remarkable. Take the IBM commercial in which Watson predicts that an elevator will malfunction in 2 days, allowing the repairman to 'fix' it before it breaks.

IBM, a big advocate of end-to-end supply chain management, Predictive Analytics, Big Data and Supply Chain Risk Management (SCRM), is leveraging Watson to enhance its SCRM approach. IBM uses Watson to identify, assess and mitigate supply chain risks, 24/7, 365 days a year.

IBM, The Weather And Retail

Consumer behaviour is strongly influenced by the weather. IBM, which bought The Weather Company, is using Watson to make stunning predictions about the weather. Now, making predictions about the weather is nothing new, but integrating the findings into better supply chain decisions is. IBM uses databases from The Weather Company and a variety of Big Data, including news stories and social media, to figure out in real time how consumers are reacting to weather conditions, and the subsequent commercial opportunities.

Their weather predictions coupled with pattern recognition tools are providing forecasts months in advance, which are then integrated with retail companies’ supply chain data. Armed with this information, companies can arrange resources accordingly, with Demand Planners able to make more accurate demand plans, and Sales and Marketing able to promote certain items and price accordingly (Figure 1).

[Figure 1: Predictive Analytics in Supply Chain – predicting patterns in weather can mitigate supply chain risk]

Walmart

Jon Walker from Tech Emergence reveals that Walmart has seen some very interesting (and highly profitable) correlations between weather and consumer spending behavior. The company noticed that people tend to buy steaks when it’s warm, windy and cloudy. Why exactly? We don’t know, but they do. Meanwhile, hamburger sales increase in hot and dry weather. They utilized these correlations to promote hamburgers based on weather predictions and saw an 18% improvement in beef patty sales. Go figure!

Anecdotally, we know (broadly) how weather will impact spending behavior for different industries, but never before have we had this volume of data to support findings, this ability to reveal cause and effect, this ability to remove bias, nor this ability to automate the process and integrate it into supply chain systems.

 

 

 

 

 

 

 

Big Data? Chill Out & Keep It Old School
Thu, 22 Mar 2018
https://demand-planning.com/2018/03/22/big-data-chill-out-keep-it-old-school/

Over the past few years, the Demand Planning community has become quite starry-eyed over advancements in predictive software and tools. The concepts of "Big Data" and "advanced analytics" are enough to make seasoned practitioners stand to attention – and even catch the interest of the Executive Team. But when many of us still struggle with the fundamentals, is it worth investing in new-fangled technology?

Admittedly, in a field where you know you will never be "right", this fancy technology and these impressive phrases are quite attractive – they bring to mind a picture of a utopian state where analytical horsepower and near-infinite data points lead to a 100% accurate forecast. There may even be a unicorn there. For me, I can't help but be reminded of two much-loved colloquialisms that I urge Demand Planning professionals to consider as we journey into the future with new tools and ideas that may or may not usher in a new age of Demand Planning.

The only thing we know about Advanced Analytics is that you must have a clear, fully costed plan as to how it is going to provide a return.

“Is The Juice Worth The Squeeze?”

This is a phrase I say probably too frequently when considering new tools, methods, and processes to improve Demand Planning. Does the effort required to explore and/or implement the new approach measure up to its expected return? For some organizations, intensified data collection (or data purchase) and the machine capability to chug through it may be cost prohibitive.

Computing equipment and data aside, the organization may not have the human resources on hand to give these capabilities their due. Perhaps the organization already enjoys a high level of forecast accuracy. Is the expenditure worth that extra percentage point? Maybe, maybe not. The only thing we know is that you must have a clear, fully costed plan as to how this new tech is going to provide a return.

“Don't Throw The Baby Out With The Bathwater”

Or, “don’t throw the fundamentals out when you get your shiny new tools”. Even if your organization does decide to invest in Big Data and/or Advanced Analytics, it’s important to not abandon some of the tried-and-true measures and methodologies of effective forecasting. If your organization decides not to invest in these buzzworthy tools, there is still a great amount of improvement that can be made using some tried-and-true Demand Planning basics. Additionally, these concepts can assist in answering the juice-vs-squeeze question of a potential upgrade or data investment if the organization chooses to entertain new solutions. Some of the most impactful are as follows:

Put down the Big Data Kool Aid – FVA is great low-hanging fruit to pursue prior to making a new technology investment.

Forecast Value Add Analysis (FVA)

Whether or not Advanced Analytics and insights are in your future, the impact of a simple Forecast Value Add (FVA) analysis cannot be overemphasized. FVA is a measurement of your forecasting process – from the statistical models utilized, to the overrides added by analysts and the insights from salespeople. Each step in the forecasting process is measured to determine the added value the step brings to the overall process. Advanced Analytics or sophisticated tools could of course be an added forecasting layer to be measured, but I would caution that if steps in your process are continuing to devalue the forecast, there are things to look at first. Put down the Big Data Kool Aid – FVA is great low-hanging fruit to pursue prior to making a new technology investment.
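Here is a minimal FVA sketch with made-up numbers: compare the error of each step in the process (naive forecast, statistical forecast, analyst-adjusted final forecast) against the actuals; a step adds value only if it beats the step before it.

```python
# Forecast Value Add (FVA) sketch: did each process step beat the one before it?
actuals     = [100, 120, 115, 130, 125, 140]
naive       = [ 98, 100, 120, 115, 130, 125]   # naive forecast = last period's actual
statistical = [105, 118, 112, 128, 131, 137]   # baseline statistical forecast
final       = [110, 125, 118, 135, 128, 150]   # after analyst/sales overrides

def mape(forecast, actual):
    """Mean absolute percentage error across all periods."""
    return 100 * sum(abs(f - a) / a for f, a in zip(forecast, actual)) / len(actual)

naive_mape = mape(naive, actuals)
stat_mape = mape(statistical, actuals)
final_mape = mape(final, actuals)

print(f"Naive MAPE:       {naive_mape:.1f}%")
print(f"Statistical MAPE: {stat_mape:.1f}%  (value added vs naive: {naive_mape - stat_mape:+.1f} pts)")
print(f"Final MAPE:       {final_mape:.1f}%  (value added vs statistical: {stat_mape - final_mape:+.1f} pts)")
# A negative "value added" figure means that step is making the forecast worse.
```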

Keeping an eye on tracking signal is important no matter how sophisticated the forecasting methodology.

Tracking Signal

While somewhat reactive in nature, I love using tracking signal as an indicator to let me know if my forecast needs a second look. Tracking signal is simply a measure of consistent bias over time. In short, if actual demand has come in lower than forecasted for each of the last three months, you may want to reinvestigate your demand assumptions.

Not only is consistent under- or over- forecasting a reliable indication to an analyst that their projections may be incorrect, it is also a great signal of potential inventory shortages or surpluses. Keeping an eye on tracking signal is important no matter how sophisticated the forecasting methodology.
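The classic tracking signal is the cumulative forecast error divided by the mean absolute deviation (MAD); a signal that drifts beyond roughly ±4 suggests persistent bias. A short sketch with made-up numbers:

```python
# Tracking signal sketch: cumulative error / MAD, recomputed each period.
actuals   = [100, 105, 98, 110, 120, 125]
forecasts = [110, 112, 108, 118, 126, 132]   # consistently over-forecasting

cum_error, abs_errors = 0.0, []
for period, (a, f) in enumerate(zip(actuals, forecasts), start=1):
    error = a - f
    cum_error += error
    abs_errors.append(abs(error))
    mad = sum(abs_errors) / len(abs_errors)
    signal = cum_error / mad if mad else 0.0
    print(f"Period {period}: tracking signal = {signal:+.2f}")
# A signal that stays outside roughly +/-4 flags a forecast that needs a second look.
```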


Forecast Accuracy

I'm sure many of you are rolling your eyes at this point. Of course we measure forecast accuracy; this isn't even worth talking about! I challenge you to revisit and audit your metric. Most organizations are familiar with the debate on precisely when forecast accuracy should be measured – is it a month before the actuals are due to come in? Three months? A week before the actuals come in? The answer is likely that a measurement at material lead time is the most appropriate. After all, this is the time in which the supply chain can, in a perfect world, respond appropriately to the demand signal without expediting.

Recent analysis in my own organization found that the traditional “T (time) minus a generic lead time” approach was not allowing us to gather the proper insights from our forecast accuracy metrics because lead times are so wildly disparate. As a result, a change to the metric was required and more insightful conversations are now being driven during the S&OP process.
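One way to make the metric lead-time-aware is to store forecast snapshots by the lag at which they were made, then score each item against the forecast generated at its own material lead time rather than one generic lag. A simplified sketch with hypothetical data:

```python
import pandas as pd

# Hypothetical snapshots: the forecast for one month, made 1, 2 and 3 months ahead, per item.
forecasts = pd.DataFrame({
    "item":       ["A", "A", "A", "B", "B", "B"],
    "lag_months": [1, 2, 3, 1, 2, 3],
    "forecast":   [500, 480, 450, 120, 150, 170],
})
actuals = {"A": 510, "B": 115}
lead_times = {"A": 3, "B": 1}   # material lead time per item, in months

# Score each item at the lag that matches its own lead time.
for item, lag in lead_times.items():
    row = forecasts[(forecasts["item"] == item) & (forecasts["lag_months"] == lag)]
    f, a = float(row["forecast"].iloc[0]), actuals[item]
    accuracy = 100 * (1 - abs(f - a) / a)
    print(f"Item {item}: accuracy at its {lag}-month lead time = {accuracy:.1f}%")
```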

There’s No Unicorn In Your Advanced Analytics Utopia

The latest and greatest technologies offer a very tempting vision of what the future could be; after all, who doesn’t want the powerhouse predictive analytics of an Amazon or Target? However, it’s important to approach these decisions with a healthy dose of skepticism. Be mindful to evaluate the promises being made and ensure they are aligned with your needs. And, if the juice truly is worth the squeeze and you embark along the new frontier of Demand Planning, don’t forget the babies floating in that bathwater.

 

What Is Quick Response Forecasting?
Wed, 28 Feb 2018
https://demand-planning.com/2018/02/28/what-is-quick-response-forecasting-and-why-do-we-need-it/

Some time ago I was on a call with IBF board members discussing situations where social media data signals rapid changes in demand for a product. These data might include some favorable online reviews of products or a celebrity mentioning a certain product, leading to a rise in demand. These are great “predictive” downstream demand signals, but since they cause very rapid spikes in demand, current forecasting processes could not incorporate the information quickly enough to act upon it. I blurted out “We need to start thinking about Quick Response Forecasting techniques”, making up the term on the spot.

Since then, Quick Response Forecasting (QRF) has gained a lot of traction among forward-thinking Demand Planners and Forecasters, and I recently wrote a detailed piece on it in the Journal of Business Forecasting. Now we are at the point where demand planning organizations can identify very real applications for QRF in their own companies, and develop a framework to turn advance knowledge of changes into meeting spikes in demand with sufficient supply.

QRF leverages predictive analytics, social media information, and other Big Data.

What Is Quick Response Forecasting?

Quick Response Forecasting is updating forecasts in line with ‘real’ and rapid changes in demand, both during and between planning cycles. Data sources can be POS data or unstructured data like social media comments.

Quick Response Forecasting Solves The Availability Issue In Case of Spikes In Demand

A contact of mine works for a company that sells nail polish. Lady Gaga, at one of her concerts, wore a unique color of nail polish that his company sells. Shortly thereafter, social media lit up and sales of this item went through the roof. The company and its supply chain partners ran out of the product, as well as a key ingredient that went into making it. The company was not prepared to take full advantage of the rapid change of demand for this nail polish and missed out on a significant revenue opportunity. Had QRF been employed by the company, they would have been able to react quickly enough to ensure enough inventory was available.

Quick Response Forecasting Makes Sense Of Big Data

QRF leverages predictive analytics, social media information, and other Big Data. This relates to the explosion in digital data and the enormous amount of information available about customers and product users on the World Wide Web. As data get bigger, companies are looking for techniques and methods to both find and incorporate a few key demand signals among Big Data’s noisy information deluge.

QRF is a way to maximize revenue from rapidly emerging opportunities that are happening right now, and opportunities that are likely to develop in the very near future.

Quick Response Forecasting Supports Short-Cycle Planning

Like the nail-polish demand spike mentioned above, supply chains often cannot act quickly enough to take full advantage of the opportunity that the demand signal offers. More frequent forecasting has the potential to increase forecast accuracy in terms of identifying rapid changes in demand but supply chain responsiveness may be too sluggish to take full advantage of it. For example, manufacturing managers might complain about getting whipsawed by demand forecasts that change rapidly, despite their increased accuracy.

One quote I often use with regard to planning responsiveness came from a manager who ran the S&OP process at a high-tech company. Generally, these types of companies operate ‘responsive’ rather than “efficient” supply chains. Responsive supply chains handle high-margin, high-value products.

Their major goals are less about minimizing operating costs and inventories, and more about maximizing inventory availability at the point of sale/consumption in order to capture potential upside revenue. One thing QRF is not, is supply chain optimization.


How QRF Can Work In Practice

A typical S&OP process is a routine planning process that would be too disrupted by having to incorporate QRF, so it is not a good candidate process. QRF is needed to support teams that are specially put in place, on an ad hoc basis, to manage significant event-based and substantial demand changes like natural disasters, celebrity endorsements etc. These teams should be cross-functional and be supplemental to the S&OP process. The teams need to be quickly assembled once an on-going QRF forecasting organization detects that a demand spike or significant demand change has occurred, or is likely to occur.

Once the team is put in place, QRF forecasts for the event need to be continually provided to the quick response supply team. If forecasters ask managers whether they need QRF today, they will likely say no. They don’t want their operations to be whipsawed by frequently changing forecasts. This is why a separate and special quick-response supply process will be needed to handle each event.

If you are on the lookout for an organization to partner with to develop QRF and supply response teams, your sales organization is the best bet. The teams are focused on going after significant revenue opportunities that current processes are too slow to take advantage of. If you can identify a highly lucrative revenue opportunity, Sales will jump at the chance to exploit it. A supply response team will be tasked with taking full advantage of an opportunity, in terms of squeezing as much revenue from it as possible.

Bottom Line: We All Need QRF To Maximize Big Data Opportunities

In short, QRF is a way to maximize revenue from rapidly emerging opportunities that are happening right now, and opportunities that are likely to develop in the very near future. It incorporates the ability to forecast from unstructured data like Facebook comments or online reviews, and requires a specialist supply team that sits apart from the standard S&OP process to respond quickly, switching supply up or down as required. In the age of Big Data where technology promises to deliver greater insight and greater revenue opportunities, QRF is exactly what we need to make it a practical reality.

Larry first discussed QRF in the Spring 2018 issue of the Journal Of Business Forecasting. Become an IBF member and gain access to the journal, as well as a host of other benefits.

The Demand Planning Career, Is it a Curse or a Blessing?
Mon, 17 Apr 2017
https://demand-planning.com/2017/04/17/the-demand-planning-career-is-it-a-curse-or-a-blessing/

If you have any knowledge of Demand Planning, I am sure that you have heard the following: "Demand Planners are like meteorologists, they rarely get credit for doing the job correctly and they're only noticed when they get it wrong." Even so, the bottom line is that there are serious and costly ramifications if these decisions are wrong. For this reason, the demand planning position can be one of the most important and visible in the company. It is a great place to impact many areas of the business and gain corporate approval. It is best to take a positive approach and be an agent for fact-based decision making. Using this approach, along with good communication skills, is a great avenue to gain the knowledge and build the relationships that will prepare you for success and lead you along a career path with much variety.

Demand Planning is Transferable To Any Industry

Demand Planning touches every aspect of the business, and this impact can make the planner a valuable asset very quickly. It requires broad business knowledge and detailed customer interaction. It is also a functional area whose skills transfer to any industry. It involves working with several areas of the business simultaneously and provides an excellent opportunity to tap into the knowledge of others. Working in cross-functional teams can also be very rewarding, adding a lot of variety to the job and making it more enjoyable.

Demand Planning Is A Collaborative Process That Provides Visibility, And Opens Doors

It is a collaborative process which aids in developing many relationships throughout the internal organization, as well as with customers and other suppliers. It is a highly visible position which can lead to new and exciting projects. Along with the knowledge to be gained from these groups, the relationships become an asset for your forecasting success and, in turn, your career path. It can also be very rewarding to work with other people to help them attain their goals and reach a collaborative decision that will benefit the entire company. A successful demand planner must become a leader in fact-based decision making and a champion for change.

The Required Leadership In The Role Is a Challenge

Along with business knowledge and relationship building, leadership skills are also an asset to a successful demand planner. A successful demand planner uses the knowledge gained and is able to interact with customers, managers, sales representatives, marketing, pricing and supply chain colleagues. Becoming a good communicator is imperative to collaboration among internal and external customers. This will enable the demand planner to guide various groups in terms that make sense to them and to reach consensus among the group. All of these things together help the demand planner provide the best forecast possible, which in turn becomes a huge advantage for both the company and the demand planner.

Ultimately, a bad forecast leads to bad corporate decisions and the loss of career possibilities. Take the positive approach using business knowledge, building relationships and leading your colleagues to collaboration. Pave the way for fact based decisions that will benefit you and your company. Don’t become a victim and fall for the curse. I have learned over the years to approach my career and my life with gratitude and a “can I help you” attitude. This will take you farther than any expertise on any day. Curse or Blessing, well maybe it was best defined by the Beatles, “I get by with a little help from my friends.”

Sylvia Starnes
Demand Planning Leader
Continental Tire

Product Portfolio Optimization – Journal of Business Forecasting (Special Issue)
Mon, 29 Feb 2016
https://demand-planning.com/2016/02/29/product-portfolio-optimization-journal-of-business-forecasting-special-issue/

Within the pages of this particularly exciting issue, you will read articles written by the best minds in the industry to discuss multiple important aspects of Product Portfolio Optimization. This is an important topic because in today's highly competitive market, it is becoming more important than ever to look for ways to cut costs, and increase revenue and profit. Markets are now demand driven, not supply driven.

Globalization has intensified competition. Every day, thousands and thousands of new products enter the market, but their window of opportunity is very narrow because of shorter life cycles. Plus, too much uncertainty is associated with new products. Their success rate ranges from poor to dismal: 25% according to one estimate. Despite that, they are vital for fueling growth. Big box retailers are putting more pressure on suppliers to provide differentiated products. Consumers want more choices and better products. All these factors contribute to the greater than ever number of products and product lines, making management of their demand more complex, increasing working capital to maintain safety stock, raising the liability of slow-moving and obsolete inventory, and increasing the cost of production because of smaller lots and frequent changeovers. Product portfolio optimization deals with these matters.

Product portfolio optimization includes the following: one, how to rationalize products and product lines and, two, how to manage their demand most effectively. Product rationalization includes deciding which products and product lines to keep and which ones to kill, based on the company's policy. Demand management, on the other hand, is leveraging what Larry Lapide, from the University of Massachusetts and an MIT research affiliate, calls the 4Ps (Product, Promotion, Price, and Place) to maximize sales and profit. The sales of low-performing product lines may be bumped up with a price discount, promotion, line extensions, or by finding new markets.

Although the S&OP process has a component of product portfolio optimization, its team members pay nothing more than lip service to it. Pat Bower from Combe Incorporated discusses in detail the process of product portfolio optimization in the framework of new products. How new products should be filtered from ideation to development and, after launch, how they should be leveraged. Their window of opportunity is very small; most CPG products flame out within the first year of their existence, says Pat.

Mark Covas from Coca-Cola describes in detail 10 rules for product portfolio optimization. He suggests companies should divest low margin brands, no matter how big they are. Many companies such as ConAgra Foods, General Mills, Procter & Gamble, and Estée Lauder are doing it. This makes the allocation of marketing dollars more productive—taking funds away from low performing brands and giving to high performing ones.

Charles Chase from SAS and Michael Moore from DuPont recommend the Pareto principle of 80/20 to determine which products or product lines to concentrate on in portfolio optimization efforts. Greg Schlegel from SherTrack LLC goes even further and proposes that this principle should be extended even to customers. He categorizes customers into four groups: 1) Champions, 2) Demanders, 3) Acquaintances, and 4) Losers. He then describes a strategy for dealing with each one of them. Greg Gorbos from BASF points out hurdles, political and otherwise, that stand in the way of implementing the optimization policy, and how to deal with them. Clashes occur among different functions because of differences in their objectives. Sales looks to achieve revenue targets, while Marketing looks to hold market share and increase profit. Finance also looks at profit, but seeks to reduce cost and increase capital flow, while Supply Chain looks at cost savings. Communication is another issue Greg points out. The company may decide to deactivate a product, but information about it is not communicated to all the functions. Jeff Marthins from Tastykake talks, among other things, about the exit strategy, which he believes is equally important. He says that we cannot deactivate a product without knowing its inventory position, as well as holdings of raw and packaging materials for it.
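As a simple illustration of the Pareto screen mentioned above, the sketch below ranks products by revenue and marks the smallest set that covers roughly 80% of the total (all figures invented):

```python
# Pareto (80/20) sketch: find the smallest set of products covering ~80% of revenue.
revenue = {"P1": 520, "P2": 310, "P3": 95, "P4": 40, "P5": 20, "P6": 10, "P7": 5}

total = sum(revenue.values())
cumulative = 0.0
for product, rev in sorted(revenue.items(), key=lambda kv: kv[1], reverse=True):
    core = cumulative < 0.8 * total      # still below the 80% line before adding this item
    cumulative += rev
    status = "core" if core else "candidate for rationalization"
    print(f"{product}: {rev:>4}  cumulative {100 * cumulative / total:5.1f}%  -> {status}")
```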

For survival and growth in today’s atmosphere, it is essential to streamline the product portfolio to reduce costs, and increase revenue, profit, and market share. This issue shows how.

I encourage you to email your feedback on this issue, as well as on ideas and suggested topics for future JBF special issues and articles.

Happy Forecasting!

Chaman L. Jain
Chief Editor, Journal of Business Forecasting (JBF)
Professor, St. John’s University
EMAIL:  jainc [at] stjohns.edu

DOWNLOAD a preview of this latest Journal of Business Forecasting (JBF) Issue

Click HERE to join IBF and receive a JBF Complimentary Subscription

 

Risk-Adjusted Supply Chains Help Companies Prepare for the Inevitable
Fri, 19 Feb 2016
https://demand-planning.com/2016/02/19/risk-adjusted-supply-chains-help-companies-prepare-for-the-inevitable/

Each time I get in my car and drive to work, or the grocery store or wherever, there are a myriad of dangers that I might encounter. I could get t-boned at an intersection by a distracted driver; I might blow a tire and swerve into a ditch or a piece of space debris could crash through my windshield. Some perils are, obviously, less likely than others, but the reality is, anything can happen.

While I don’t obsessively worry about every possible risk, I am aware of the possibilities and I take measures to lower both the odds and severity of a mishap. I keep my vehicle well maintained, I buckle up and I pay my auto insurance. Similarly, today’s supply chain professionals must be more conscientious and proactive in their efforts to mitigate the risk of a supply chain disruption and to minimize the impact when the inevitable does occur.

As much as we may feel at the mercy of disruptions from severe weather, natural disasters, economic instability or political and social unrest, members of today’s high tech supply chain have never been better equipped to minimize the risks and capitalize on the opportunities that may arise from a supply chain disturbance.

One of the most simple, but powerful, tools at our disposal is information. Twenty-four hour news stations, social media and cellular communications give us literally instant access to events occurring in the most remote reaches of the world.

More tactically, mapping the physical network of the supply base, including manufacturing facilities, warehouses and distribution hubs, is an important part of any risk management strategy. The key here is mapping the entire supply chain network, not just top-spend suppliers or first-tier contract manufacturers. Most of this information is relatively accessible through supplier audits and, with the help of Google Maps, you can create a pretty comprehensive picture of your physical supply chain.


Remember, though, supply chains are much more fluid than they have ever been. Today’s multinationals are likely to rely on three to five different contract manufacturers (CMs) and original design manufacturers (ODMs), and scores of other suppliers around the world for the tens of thousands of parts needed to build and maintain their products. With outsourced production so commonplace, production lines can be shifted between locations within a matter of weeks, so frequent monitoring and updating of supply chain shifts is critical.

IoT technology such as sensors and RFID tracking can also provide meaningful intelligence that may be used to identify and mitigate risk throughout the end-to-end supply chain process. The ability to gather and analyze these constant data inputs is a recognized challenge throughout the supply chain profession. Those who master the digital supply chain sooner will enjoy a substantial competitive advantage.

Once these various vehicles are used to create a composite picture of the risk landscape, then risk mitigation strategies take center stage. These efforts can range from traditional techniques such as the assignment of a cache of safety stock to more intricate maneuvering of storage facilities and full network design. Deployment of these mitigation strategies requires a detailed recovery and communications plan.

In my upcoming presentation at IBF’s Supply Chain Forecasting & Planning Conference at the DoubleTree Resort by Hilton in Scottsdale, AZ, February 22-23, 2016, I will delve deeper into the growing range of potential disruptors in the high tech supply chain. I will outline the core elements of a comprehensive supply chain risk management strategy, including how to define and map the physical supply chain, the landscape around supply chain risks and their impact on financial metrics, and how to proactively assess potential risk. I hope to see you there.
