
Sciemetric

What manufacturing data should I be collecting to improve part quality? | Manufacturing Data Series, Part 1


Rob Plumridge - Leak Applications Engineer


When it comes to the analysis of your manufacturing process, there’s one step that can’t be avoided—you can’t fix what you haven’t measured. And in order to effectively measure performance, you need to first collect the right data from your processes.

Manufacturing quality teams often ask us what data they should be collecting—what data is the most important to determine root cause when there is an issue with quality or throughput?

Our answer? “It depends.”

Collecting part production data for insights on part quality

Our advice for manufacturers seeking insight into their quality problems is to start by collecting part production data. This is the data that is directly generated by a process or test as it performs its function on each part in production. The best way to capture this data is in the form of a digital process signature—a waveform that clearly visualizes what happened through each millisecond of that process or test cycle.

Sciemetric does this by collecting the full digital process signature of each operation. With this data, we then develop a feature check for every way in which a manufacturing process can go wrong. Whether it relates to time, force, distance, pressure, or voltage, we gain a record of the entire process.
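To make the idea concrete, here is a minimal sketch in Python of what a feature check over a signature can look like. The signature, the feature names, and the limit values are all hypothetical, invented for illustration; they are not Sciemetric's actual implementation.

```python
# Illustrative sketch: running feature checks against a digital process
# signature. The signature here is a simulated press-fit force curve
# sampled once per millisecond; all limits are made-up example values.

def peak_force(signature):
    """Feature: maximum force seen anywhere in the cycle."""
    return max(signature)

def force_at_time(signature, ms):
    """Feature: force at a specific millisecond of the cycle."""
    return signature[ms]

def check(value, low, high):
    """A feature check passes when the value falls inside its limits."""
    return low <= value <= high

# Simulated 100 ms press-fit signature: force ramps up, then seats.
signature = [min(2.0 * t, 150.0) for t in range(100)]

results = {
    "peak_force":    check(peak_force(signature), 120.0, 160.0),
    "seating_force": check(force_at_time(signature, 99), 140.0, 160.0),
}
print(results)  # both checks pass for this simulated cycle
```

A real deployment would evaluate dozens of such features per signature, but the principle is the same: each failure mode gets its own named check with its own limits.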

With this setup, you can be confident that when a problem does arise, the answer is already somewhere in the signature data you have collected. We can then apply our analysis tools to trace root cause and determine the best fix for prevention.

This extent of data collection can, of course, quickly add up on a line with dozens of stations, or even hundreds in the case of a large OEM in a sector like automotive. Depending on the operation that occurs at a station, we could have a half dozen or more digital signatures and 1-10X that number of feature checks. And depending on the speed of the line, there could be hundreds of parts moving through each station per hour.
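A quick back-of-envelope calculation shows how fast these numbers multiply. The station count, signature count, and line speed below are illustrative assumptions pulled from the ranges above, not figures for any real line.

```python
# Back-of-envelope data volume for one line (illustrative numbers only).
stations = 50                 # "dozens of stations"
signatures_per_station = 6    # "a half dozen or more" per operation
checks_per_signature = 5      # "1-10X" the signature count
parts_per_hour = 200          # "hundreds of parts" per station per hour

signatures_per_hour = stations * signatures_per_station * parts_per_hour
checks_per_hour = signatures_per_hour * checks_per_signature

print(signatures_per_hour)  # 60000 signature waveforms per hour
print(checks_per_hour)      # 300000 feature-check results per hour
```

Even with these modest assumptions, a single shift generates hundreds of thousands of waveforms, which is why the storage and correlation strategy matters.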

For true data insight, this signature data should be stored in a way that allows it to be correlated with any other production data related to a part’s production, including scalars and machine vision. No relevant data should be left trapped in a silo somewhere on the floor. 

Prioritizing which data to collect

This, of course, ends up being a lot of data. Apply a Pareto approach and the old 80/20 rule. Have your team triage by considering which bottlenecks on the line most impact overall throughput or first-time yield—the 20 percent that is causing 80 percent of the grief.
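That triage step can be sketched in a few lines: rank stations by fail count and flag the ones that account for the first ~80 percent of all fails. The station names and counts below are hypothetical.

```python
# Minimal Pareto triage over hypothetical fail counts by station:
# the "vital few" are the stations covering ~80% of all fails.
fails = {"leak_test": 410, "press_fit": 350, "rundown": 90,
         "crimp": 60, "dispense": 50, "weld": 40}

total = sum(fails.values())
running, vital_few = 0, []
for station, count in sorted(fails.items(), key=lambda kv: -kv[1]):
    if running / total < 0.8:       # still under the 80% threshold
        vital_few.append(station)
    running += count

print(vital_few)  # ['leak_test', 'press_fit', 'rundown']
```

Those flagged stations are where to deploy signature collection and feature checks first.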

Focus your data collection efforts on these trouble spots first by deploying the feature checks and signature analysis capability to get to the root cause. We can then apply algorithms to find trends and patterns that reveal the "how" and the "why" so processes or tests can be adjusted as needed. We can apply this to any controlled process—from press fitting and leak test to rundown, crimping, welding and dispensing.

This approach works best if you have serialized production—each part you produce bears its own unique serial number. This makes it easy to create a birth history record for each production part that includes all the data from every process or test that touched it.
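In data terms, a birth history record is simply every station result keyed by the part's serial number. Here is a minimal sketch of that structure; the serial format, station names, and measurement fields are hypothetical examples.

```python
# Sketch of a "birth history" record keyed by serial number: every
# process result that touched a part lands under its serial, so
# signatures, scalars, and vision results can later be correlated.
from collections import defaultdict

birth_history = defaultdict(list)

def record(serial, station, payload):
    """Append one station's result to the part's birth history."""
    birth_history[serial].append({"station": station, **payload})

record("SN-0001", "press_fit", {"peak_force_N": 148.2, "pass": True})
record("SN-0001", "leak_test", {"leak_rate_sccm": 0.4, "pass": True})

print(len(birth_history["SN-0001"]))  # 2 station results for this part
```

With every result tied to one serial, tracing a field failure back through every process that touched the part becomes a simple lookup rather than a hunt across siloed databases.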

In our next blog, we will talk about data retention and storage.


Looking to improve data collection and analysis on your production line? We can help. Contact us.
