Everyone from procurement to the warehouse to the manufacturing floor sees bad data show up. A wrong lead time, an obsolete part on a BOM, a cost that doesn’t match reality. But in most cases, they don’t fix it. Sometimes it’s not seen as their priority. Other times, they don’t have the authority. Correcting a field can mean opening a ticket, waiting for approvals, or triggering a cross-functional debate. So people take the path of least resistance: work around it and move on.
The problem is that bad data behaves like a virus. It doesn’t stay where it starts. One wrong lead time spreads into bad POs, missed commits, and last-minute expedites. An outdated cost shows up in quoting, margins, and forecasts. Every infected record chips away at the real target: trust. And once trust is gone, the system isn’t a system anymore; it’s just a database everyone works around.
The Scope of the Problem
Bad data isn’t a corner case. It’s systemic. EMS companies report that 80% of incoming BOMs contain errors. Reviews of OEM and EMS databases show 50–60% of part numbers or BOMs are wrong. At the leadership level, 75% of executives don’t trust the data in their own systems.
That lack of trust filters straight down. In a recent survey, 86% of operations employees said they spend more than 30% of their day fixing data or chasing workarounds. A Forrester study estimates that as much as 30% of operations costs can be traced back to bad data.
And these aren’t isolated mistakes. They multiply. A single wrong lead time doesn’t just mess up one PO; it ripples through production schedules, expedites, vendor negotiations, and customer delivery. An obsolete part in a BOM doesn’t just waste engineering hours; it creates procurement churn, QA debates, and line stoppages.
Real World Example
Take a mid-size electronics manufacturer running 500 active assemblies. A buyer spots a wrong lead time on a key capacitor but doesn’t have authority to change it. They note it in a personal spreadsheet. Planning doesn’t see the correction, so the MRP run generates a late PO. To protect production, the buyer expedites, paying a premium to the vendor and adding freight cost. On the floor, the late delivery forces a partial build, which QA flags. Engineering gets pulled in to debate substitutions, delaying the run further. By the time the unit ships, margin has evaporated. All of it traces back to one outdated field that no one owned.
The numbers make it clear: this isn’t a nuisance. It’s a massive drag on productivity and cost, baked into day-to-day operations.
“The cost of bad data is staggering, but most people don’t personalize it. If you talk to a C-level executive, they’ll say, ‘I have an inventory problem,’ or ‘My supply chain has issues.’ After thousands of projects, we know that almost 90% of the time it’s caused by a data problem. Either the right data isn’t properly accessible, or the data’s not clean enough for what needs to be done.”
- Kevin Campbell, CEO of Syniti, on the change in the perceived value of data
Management Owns the Problem
It’s tempting to blame employees for bad data. After all, they’re the ones who work around it by keeping shadow spreadsheets, chasing down confirmations, or ignoring MRP recommendations. But the truth is simple: this isn’t an employee problem. It’s a management problem.
If the ERP isn’t trusted, people won’t act on it. And when people don’t act on it, the system collapses into noise. The responsibility for that trust sits squarely with management. Leaders must take the stance that ERP output is gospel: not a suggestion, not “for reference,” but the version of truth that must be acted on without hesitation.
That requires two things:
- Data quality worth trusting. If the system spits out nonsense, no amount of authority will force people to follow it.
- Cultural reinforcement. Teams need to know that acting on ERP output isn’t optional. If something is wrong, it should be flagged and fixed fast, not bypassed.
When a company can’t operate this way, the blame doesn’t belong to the planner, buyer, or operator. It belongs to management, for allowing a system that can’t be trusted and for tolerating workarounds that mask the real issue.
The Cure Comes in Three Steps
If bad data is a virus, then treating it takes more than good intentions. You need a process that identifies where it lives, eliminates it, and keeps it from coming back. That means three steps: Diagnose, Cure, Prevent.
1. Diagnose: Identify the Key Attributes
Start by deciding which data really matters. Not every field in the ERP deserves the same attention. Focus on the attributes that drive spend, planning, and execution. And don’t make this decision in a vacuum. The list should have cross-functional buy-in from procurement, warehouse, operations, manufacturing, and accounting. Each group feels the pain of bad data differently, and each needs confidence that their priorities are covered. (The sketch after the list below turns these attributes into concrete checks.)
- Items: Verified MPNs, no obsolete or NRND in active BOMs, accurate vendor mapping, current costs and lead times, ownership fields (buyer/planner).
- BOMs: No zero-qty lines, valid items only, realistic manufacturing lead times.
- Routings: Correct labor times per workstation.
- ERP controls: No zero-cost BOMs or routings, roll-ups up to date.
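To make the diagnosis concrete, here is a minimal audit sketch in Python with pandas. It assumes the item master and BOM lines can be exported to CSV; the file and column names (item_master.csv, lifecycle_status, lead_time_days, and so on) are illustrative placeholders, not any specific ERP’s schema.

```python
import pandas as pd

# Illustrative exports -- adjust file and column names to your ERP.
items = pd.read_csv("item_master.csv")  # part_number, lifecycle_status, std_cost, lead_time_days, buyer
boms = pd.read_csv("bom_lines.csv")     # parent_assembly, part_number, qty, bom_status

active = boms[boms["bom_status"] == "active"].merge(items, on="part_number", how="left")

findings = {
    # Obsolete or NRND parts sitting on active BOMs
    "obsolete_on_active_bom": active[active["lifecycle_status"].isin(["OBSOLETE", "NRND"])],
    # Zero-quantity lines that pollute MRP runs
    "zero_qty_lines": boms[boms["qty"] <= 0],
    # Items missing lead times, carrying zero cost, or lacking an owner
    "missing_lead_time": items[items["lead_time_days"].isna()],
    "zero_cost_items": items[items["std_cost"].fillna(0) == 0],
    "no_owner": items[items["buyer"].isna()],
}

for name, frame in findings.items():
    print(f"{name}: {len(frame)} records")
```

Even a script this small gives the cross-functional group a shared, countable definition of “bad data” for each attribute, which makes the cleanup scope a number instead of an argument.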
2. Cure: Fix the Data
Once the key attributes are defined, the next step is to correct what’s wrong. This isn’t “fix it when you see it.” It’s a structured cleanup effort.
In many companies, that means standing up a short-term tiger team, a cross-functional group from procurement, operations, manufacturing, and accounting that can focus exclusively on the cleanup project. Without dedicated ownership, the work will always get pushed aside by daily firefighting.
Access is another hurdle. In some organizations, the data lives in spreadsheets and can be cleaned directly. In others, the ERP is locked down, and IT has to be brought in. That might mean generating custom reports, building ad hoc queries, or even pulling directly from SQL tables. If IT is the gatekeeper, management needs to allocate resources up front. Otherwise, the project stalls before it starts.
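If the data has to come straight from the database, a read-only pull is usually the fastest route once IT signs off. Here is a minimal sketch assuming ODBC read access; the connection string, table, and column names are placeholders that IT would supply.

```python
import pandas as pd
import pyodbc  # or SQLAlchemy, depending on what IT supports

# Placeholder connection string -- IT provides the real DSN and credentials.
conn = pyodbc.connect("DSN=erp_readonly;UID=audit;PWD=changeme")

# Pull only the attributes the team agreed to audit.
query = """
    SELECT part_number, buyer, lifecycle_status, std_cost,
           lead_time_days, lead_time_updated, last_cost_update
    FROM item_master
"""
items = pd.read_sql(query, conn)
conn.close()

# Snapshot to CSV so the tiger team can work without touching production.
items.to_csv("item_master_snapshot.csv", index=False)
```

A read-only snapshot like this also sidesteps the approval debate: nobody is writing to production tables while the team figures out what’s wrong.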
Once access is secured and a team is in place, the cleanup should move fast:
- Bulk corrections for obvious errors.
- Assign appropriate authority: for example, procurement updates sourcing, engineering updates labor routings, and accounting updates standard costs. (A sketch of a bulk-correction pass follows below.)
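As a sketch of what a bulk pass can look like: assume approved fixes are collected in a simple corrections file, one row per change with a named approver, and that the corrected output goes through the ERP’s bulk-import template rather than straight into production tables. The file layout is an assumption for illustration.

```python
import pandas as pd

items = pd.read_csv("item_master_snapshot.csv")
# One row per approved fix: part_number, field, new_value, approved_by
corrections = pd.read_csv("approved_corrections.csv")

for _, fix in corrections.iterrows():
    mask = items["part_number"] == fix["part_number"]
    items.loc[mask, fix["field"]] = fix["new_value"]

# Feed this into the ERP's bulk-import template, not directly into production.
items.to_csv("item_master_corrected.csv", index=False)
```

Keeping corrections in a reviewable file preserves the authority model above: every change is traceable to the person who approved it.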
3. Prevent: Inoculate with Real-Time Monitoring
Once the data is clean, the real challenge is keeping it that way. Left alone, it will drift back into the same state: obsolete parts creeping into BOMs, lead times aging, costs going stale. Prevention means constant monitoring of high-leverage attributes.
In ERP environments:
- Build dashboards that auto-flag key risks: missing or stale lead times, zero-cost roll-ups, obsolete parts in active BOMs, orphan records.
- Power your dashboards using Tableau (the industry standard for visualization) or Qlik, one of the most widely adopted alternatives, especially in ERP-heavy industries.
- Qlik Sense connects directly to ERP databases (e.g., SAP, Oracle, NetSuite) and supports associative exploration, great for spotting anomalies fast.
- Regardless of tool, define alert thresholds clearly (e.g., lead times older than 90 days, no cost updates in 12 months) and push real-time alerts to the responsible teams rather than burying them in dashboards. (A monitoring sketch follows this list.)
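Here is a minimal monitoring sketch built around those two example thresholds. The column names are assumptions, and the notify step is a placeholder for whatever channel your team actually uses (email, Slack, a ticket).

```python
import pandas as pd

STALE_LEAD_TIME_DAYS = 90   # lead time not reviewed in 90 days
STALE_COST_DAYS = 365       # no cost update in 12 months

# Illustrative snapshot; column names are assumptions.
items = pd.read_csv("item_master_snapshot.csv",
                    parse_dates=["lead_time_updated", "last_cost_update"])
today = pd.Timestamp.today()

stale_lt = items[(today - items["lead_time_updated"]).dt.days > STALE_LEAD_TIME_DAYS]
stale_cost = items[(today - items["last_cost_update"]).dt.days > STALE_COST_DAYS]

def notify(owner, message):
    # Placeholder: wire this to email, Slack, or your ticketing system.
    print(f"ALERT for {owner}: {message}")

for _, row in stale_lt.iterrows():
    notify(row["buyer"], f"{row['part_number']}: lead time older than {STALE_LEAD_TIME_DAYS} days")
for _, row in stale_cost.iterrows():
    notify(row["buyer"], f"{row['part_number']}: no cost update in 12 months")
```

The key design choice is that every alert goes to a named owner, echoing the ownership fields defined in the Diagnose step.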
In spreadsheet-driven environments:
- Use conditional formatting and simple macros to flag blanks, duplicates, or out-of-date fields.
- Automate regular refreshes from source files, so the team isn’t relying on snapshots that go stale after a week.
- Keep ownership visible: each attribute should have a named person who gets the ping when something is off.
- Use online spreadsheets like Google Sheets or Airtable so flags, graphs, and charts update for everyone in real time. (A scripted version of these checks is sketched below.)
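Where the system of record really is a set of spreadsheets, the same flags can be scripted and run on a schedule. A minimal sketch, assuming the tracker can be exported to CSV; the column names are illustrative.

```python
import pandas as pd

# Illustrative export: part_number, lead_time_days, owner, last_reviewed
parts = pd.read_csv("parts_tracker.csv")

flags = pd.DataFrame({
    "blank_lead_time": parts["lead_time_days"].isna(),
    "duplicate_part": parts.duplicated("part_number", keep=False),
    "stale_review": pd.to_datetime(parts["last_reviewed"], errors="coerce")
                    < pd.Timestamp.today() - pd.Timedelta(days=90),
})

# Keep only rows with at least one flag, with the flags attached for context.
flagged = parts.join(flags)[flags.any(axis=1)]
flagged.to_csv("flagged_rows.csv", index=False)
print(f"{len(flagged)} rows need attention")
```

Run on a schedule (cron, a task scheduler, or a notebook job), this replaces the snapshot-that-goes-stale problem with a refresh nobody has to remember.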
Require BOM Scrubs
- Verify all attributes before importing into ERP or any planning system, even spreadsheets.
- Use distributors like DigiKey and Mouser or aggregators like Octopart and Findchips. (A minimal scrub sketch follows below.)
Read More: How I Learned to Stop Worrying and Love the BOM Scrub
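As a minimal scrub sketch: assume lifecycle data for your MPNs has been exported from a distributor or aggregator to CSV (each provider’s API and export format differ, so the layout here is illustrative). The gate rejects the import when any line fails.

```python
import pandas as pd

bom = pd.read_csv("incoming_bom.csv")              # mpn, qty, refdes
reference = pd.read_csv("distributor_export.csv")  # mpn, lifecycle_status

merged = bom.merge(reference, on="mpn", how="left")

problems = pd.concat([
    merged[merged["lifecycle_status"].isna()].assign(issue="MPN not recognized"),
    merged[merged["lifecycle_status"].isin(["OBSOLETE", "NRND"])].assign(issue="obsolete or NRND part"),
    merged[merged["qty"] <= 0].assign(issue="zero-quantity line"),
])

if problems.empty:
    print("BOM passed scrub; safe to import.")
else:
    problems.to_csv("bom_scrub_report.csv", index=False)
    raise SystemExit(f"{len(problems)} BOM lines failed scrub; see bom_scrub_report.csv")
```

Failing loudly before import is the point: a BOM that never enters the ERP can’t infect it.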
Use Dedicated Procurement Software
Software services like Cofactr automate this work up front, providing not only visibility into data issues but also fixes right inside the BOM. These systems act as a gate, effectively preventing bad part data from ever entering the system.
What Changes When Data Is Trusted
Clean data doesn’t just make the ERP look better. It changes how people behave. Once the system earns trust, employees stop second-guessing and start acting. The difference shows up everywhere:
- Procurement: Buyers stop chasing confirmations and expediting last minute. PO cycle times shrink because vendor–MPN pairings and lead times are accurate.
- Operations: Schedulers can actually use MRP runs to plan work. No more building around “phantom stock-outs” caused by stale on-hand or obsolete parts.
- Engineering: BOM roll-ups reflect real costs. Quotes stop swinging wildly, and engineering change orders don’t trigger surprise margin hits.
- Warehouse: Inventory counts match the system, so receiving and kitting don’t waste hours reconciling mismatched records.
- Accounting: Standard costs are current and auditable, so financial roll-ups stop turning into fire drills.
Even meetings get shorter. Instead of debating whose numbers are “real,” teams spend their time making decisions.
Trusted data doesn’t just cut wasted effort. It makes the entire company move faster because hesitation is gone.
The Bonus Payoff: People Stay Longer When Systems Work
The first gains from clean data show up in costs and cycle times. But there’s a second payoff that’s harder to measure and just as valuable: employee satisfaction.
When the system works, the job works. Buyers stop firefighting, operators stop second-guessing, engineers stop debating revisions. Instead of spending hours chasing fixes, people get to focus on the work they were hired to do.
That shift pays off in morale and retention. Employees who believe their tools are reliable don’t burn out as fast — and they don’t leave as quickly either. Clean data makes the company run smoother, and it makes the company a better place to work.
A meta-analysis covering 1.8 million employees across 82,000 business units in 49 countries found that higher employee well-being consistently correlated with stronger productivity and profitability.
Conclusion
Bad data is a virus that spreads across the company and attacks the one thing every system depends on: trust. When trust is gone, people slow down, build workarounds, and stop acting on the system that’s supposed to drive the business.
The cure is straightforward:
- Diagnose the attributes that matter most — with cross-functional buy-in.
- Cure the dataset with a focused cleanup, dedicated resources, and the right access.
- Prevent relapse with monitoring and dashboards that keep drift visible.
Do this well, and you get more than cleaner reports. You get an operation that moves faster, costs less, and keeps employees longer because the tools they use actually work.
Want to make this easy? Schedule a free, no-obligation Cofactr demo to see how we can help you automate price evaluation, component swaps, and much more.
Frequently Asked Questions
Why does bad data have such a big impact on productivity?
Bad data doesn’t stay confined to a single error — it spreads like a virus. A wrong lead time can ripple through purchase orders, production schedules, and customer deliveries, while outdated costs affect quoting, forecasting, and margins. Over time, these errors erode trust in the system, forcing employees to work around the ERP instead of relying on it.
How much does bad data really cost a company?
Industry studies show that bad data can consume up to 30% of operations costs. For EMS companies, 80% of incoming BOMs contain errors, and more than half of part numbers are wrong. On the shop floor, 86% of employees report spending a third of their day fixing or working around data issues.
Isn’t fixing bad data an employee responsibility?
No. While employees often create workarounds, the responsibility lies with management. If the ERP cannot be trusted, employees naturally bypass it. Leaders must ensure the system provides clean, accurate data and reinforce the cultural expectation that ERP output is the single source of truth.
What is the recommended approach to curing bad data?
The article outlines a three-step process:
- Diagnose – Identify the most critical attributes (e.g., lead times, vendor mapping, costs, BOM validity).
- Cure – Launch a structured cleanup effort, often with a cross-functional “tiger team” dedicated to data correction.
- Prevent – Use real-time monitoring, dashboards, and ownership assignments to keep data clean going forward.
What tools help prevent data issues from recurring?
Visualization and monitoring platforms like Tableau and Qlik Sense can automatically flag anomalies such as stale lead times, obsolete parts, or zero-cost BOMs. In spreadsheet-driven environments, conditional formatting, macros, and automated refreshes can also help maintain accuracy.
What changes when employees trust the data?
When systems are reliable, buyers stop firefighting, schedulers trust MRP runs, engineers get accurate cost roll-ups, and accountants avoid last-minute reconciliations. Meetings become shorter and more productive because teams debate decisions, not data validity.
Beyond cost savings, what other benefits come from clean data?
Clean data directly improves employee satisfaction and retention. When workers don’t have to constantly chase down errors or build workarounds, they can focus on the tasks they were hired to do. This reduces burnout, improves morale, and contributes to longer employee tenure.