Data integrity: Bad data and 3 good things to do about it in supply chain planning


Every business plans, but not every business runs as planned. Delays, shortages, quality issues, catastrophic weather events and fluctuating commodity prices are just a few examples from the exhaustive list of worries that will throw a plan into disarray. Achieving a realistic forecast and aligning supply plans to it is a long shot at best.

The best supply chains need to manage business when it’s not business as usual. That’s what sets them apart. However, even the best supply chains struggle with a recurring issue – data integrity. The alignment of demand and supply is more difficult because most, if not all, supply chains have data integrity issues. That means even if you take away all the supply chain disruptions, your plans are off before you even get started.

Successful supply chain planning starts with data

Setting yourself up for successful planning starts with your data. Arguably the single biggest deterrent to undertaking a supply chain planning improvement project is, "my data is crap." Even though it's likely true, you're already using the current state of your data to plan, and there's still value in that. Data integrity shouldn't be the reason not to take on a process improvement initiative; it should be a part of any supply chain planning improvement project.

Why the data issues?

Like the supply chain disruptions listed earlier, there are just as many reasons why maintaining data accuracy is as difficult as maintaining forecast accuracy. Here are the big ones:

  • First off, there is just a lot of data. Depending on your company, you’re likely looking at record counts in the millions or billions. In addition, these record counts are never static. With that amount of data, something is going to be off.
  • New data sources. With mergers and acquisitions, new data sources are added along with the data in these systems. Depending on the system and processes inherited with the new source, data issues could be significant.
  • Product proliferation. Product innovation means new products being added and older versions of products becoming obsolete. One data slip and you could be planning to the wrong revision of a product.
  • New supply chain relationships. Establishing new customer or supplier relationships brings with it all the associated data elements, like order policies, costs and lead times, all of which are error prone.
  • Things change! As we all know, when it comes to supply chains, things change. Planning parameters set today may not be what’s required tomorrow.

For these reasons, it’s wise to have a plan of attack against data integrity issues.

What’s wrong with data?

The low-hanging fruit of data integrity issues is simply missing data. Standard costs, bill of material records like quantity per, safety stock or order policy information are all typical records that may be left unattended for any of the reasons mentioned above. A missing lead-time or quantity-per record means your plan will be wrong no matter how good your planning processes are.

Even when a field is populated, the value can be wrong. With fractured data systems, data elements may differ in each system, and it becomes a challenge to know which one is right. Yield and scrap factors, for example, will change as manufacturing processes improve after the first runs of a part.

You may set a Kanban policy based on current demand patterns, cycle times and lot sizes, but if any of those factors change, the Kanban policy could drive excess or shortage conditions. You may also have order status details that are not up to date, one of the most common being completed orders that were never closed. For these reasons, cleansing data isn't a one-time event.
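The missing-field problem above is easy to check for programmatically. Here's a minimal sketch in Python; the record structure and field names (`std_cost`, `lead_time_days`, `qty_per`) are illustrative assumptions, not a real ERP schema:

```python
# Hypothetical item-master records; None marks a field that was never filled in.
records = [
    {"item": "A100", "std_cost": 12.50, "lead_time_days": 14,   "qty_per": 2},
    {"item": "B200", "std_cost": None,  "lead_time_days": 21,   "qty_per": 1},
    {"item": "C300", "std_cost": 8.75,  "lead_time_days": None, "qty_per": None},
]

# Planning fields a plan cannot run correctly without (assumed list).
REQUIRED_FIELDS = ("std_cost", "lead_time_days", "qty_per")

def missing_data_report(records):
    """Return {item: [missing field names]} for every record with gaps."""
    report = {}
    for rec in records:
        gaps = [f for f in REQUIRED_FIELDS if rec.get(f) is None]
        if gaps:
            report[rec["item"]] = gaps
    return report

print(missing_data_report(records))
# {'B200': ['std_cost'], 'C300': ['lead_time_days', 'qty_per']}
```

Run on a schedule, a report like this turns "my data is crap" into a concrete, prioritizable work list instead of a reason to stall the improvement project.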

What’s the data integrity plan?

  1. Shine a light on the problem. There are companies that have found a way to interrogate their data with standard data integrity processes, views and metrics. It's important to be able to identify and prioritize data cleansing efforts. One company had a "top 10" view into data cleansing priorities. They knew they couldn't fix everything, but they could sort by revenue impact or customer to ensure the highest priority issues were addressed first. Because you need to be able to look at all data across the entire network, a single source of truth will be invaluable.
  2. Continually monitor your data. Some data anomalies are less obvious to spot and slowly come to a boil rather than explode on the scene. Actual lead times may trend away from planned lead times, so it's important to get notifications when planning parameters may need to be reviewed and adjusted. In these situations it would be difficult for a planner to spot the trend manually, making a strong use case for machine learning.
  3. Use scenario planning. When it’s time to set or modify data like planning parameters, it’s beneficial to be able to test the results of changes to planning data. Planning parameters impact each level of the supply network and the compounding effect on demand, supply, capacity and inventory can be staggering. Testing the impact of data changes will not only let you set more realistic performance expectations, but will also put out some fires before they start.
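Step 2 above, catching planned lead times drifting away from actuals, can be sketched as a simple tolerance check. The 20% threshold, field names and data here are illustrative assumptions; a real system would tune the tolerance per item or learn it from history:

```python
def drift_alerts(planned, actuals, tolerance=0.20):
    """Return (item, planned_days, mean_actual_days) for items whose
    average actual lead time deviates from plan by more than `tolerance`."""
    alerts = []
    for item, plan_days in planned.items():
        observed = actuals.get(item, [])
        if not observed:
            continue  # no receipts yet, nothing to compare against
        mean_actual = sum(observed) / len(observed)
        if abs(mean_actual - plan_days) / plan_days > tolerance:
            alerts.append((item, plan_days, round(mean_actual, 1)))
    return alerts

# Hypothetical planned lead times (days) and recent actual receipt lead times.
planned = {"A100": 14, "B200": 21}
actuals = {"A100": [14, 15, 13], "B200": [27, 29, 30]}

print(drift_alerts(planned, actuals))
# [('B200', 21, 28.7)]  -> B200's plan says 21 days, reality averages ~29
```

Even a crude check like this surfaces the slow-boil anomalies a planner would otherwise miss until orders start arriving late.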

Data integrity challenges are not going anywhere anytime soon. Recognizing there will always be issues and that data cleansing is a continual process is the hardest part of the battle. Let us know if you’ve taken any unique approaches to solving your own data issues.

Discussions

Charan
- May 03, 2018 at 1:45am
Another low-hanging fruit is to also do a check from a systems viewpoint: are the integrations extracting information from the right source, and if not, for what reason? This is an especially frequent occurrence in the presence of a customization or a data warehouse kind of situation.
Bill DuBois
- May 08, 2018 at 2:05pm
Hi Charan,

That is a great point. Thanks for the insight. Certainly the work that goes into validating the data as well as the source will pay off. If even a fraction of that data is off, your plans will be off before you even start to execute.

Cheers, Bill
