Sciemetric

The Repair Bay, Part 3: A day in the life

Contributed by: Patrick Chabot - Manufacturing IT Manager

In the first post of this series, we highlighted the benefits of turning the repair bay into a defect data management station that drives continuous improvement. In the second post, we emphasized that it takes more than tools and processes – buy-in, from the corner office to the plant floor, is crucial.

Now let’s take a look at the different ways in which individuals from the plant floor to the office engage with and benefit from a repair bay data analytics system.

The repair technician

For the repair technician, a data management and analytics platform that ties the stations on the production line to the repair bay does all the heavy lifting in the background. They need only input the right data during the normal course of their workday.

When a part comes to the repair bay with an issue, they can quickly draw on historical data to flag trends and patterns. It doesn’t matter whether it is a part from the production line or a finished product returned by a customer.

The technician can pull the full birth history record for the problem part and see if any of its assembly processes or tests were only borderline passes. This allows them to quickly trace the cause of the issue and decide what to do next – whether the part should be scrapped, reworked in the repair bay and put back into production, or sent back up the line and run through the same process or test station again.
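By way of illustration only, a borderline-pass check against a birth history record might look something like the sketch below. The record structure, field names and the 10% margin threshold are assumptions made for the example, not Sciemetric’s actual data model.

```python
# Hypothetical sketch: flag "borderline" passes in a part's birth history.
# The record fields and the 10% margin are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TestResult:
    station: str        # e.g. "leak_test", "press_fit"
    value: float        # measured value
    low_limit: float    # lower pass limit
    high_limit: float   # upper pass limit

def borderline_passes(history, margin=0.10):
    """Return results that passed, but sit within `margin` of a limit."""
    flagged = []
    for r in history:
        if not (r.low_limit <= r.value <= r.high_limit):
            continue  # outright failure, handled elsewhere
        span = r.high_limit - r.low_limit
        near_low = (r.value - r.low_limit) < margin * span
        near_high = (r.high_limit - r.value) < margin * span
        if near_low or near_high:
            flagged.append(r)
    return flagged

history = [
    TestResult("press_fit", value=9.6, low_limit=5.0, high_limit=10.0),
    TestResult("leak_test", value=2.1, low_limit=0.0, high_limit=5.0),
]
for r in borderline_passes(history):
    print(f"{r.station}: {r.value} is close to limits "
          f"[{r.low_limit}, {r.high_limit}]")
```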

Once a repair is made, the part can be run through test processes again and the new signature data compared to the old.
Whatever the outcome, all the work done by the technician is added to that birth history record.
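As a rough, hypothetical sketch of that before-and-after comparison, one could treat a test signature as a sampled waveform and score the difference with a simple RMS deviation. Real systems typically compare features extracted from the curves, so this is purely illustrative.

```python
# Hypothetical sketch: compare a post-repair test signature against the
# pre-repair one using a simple RMS deviation between sampled waveforms.
import math

def rms_deviation(before, after):
    """Root-mean-square difference between two equally sampled waveforms."""
    if len(before) != len(after):
        raise ValueError("waveforms must share the same sample count")
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(before, after)) / len(before))

pre_repair  = [0.0, 1.2, 2.9, 4.1, 4.0, 3.8]   # made-up sample values
post_repair = [0.0, 1.1, 2.7, 3.6, 3.5, 3.4]
print(f"RMS deviation after repair: {rms_deviation(pre_repair, post_repair):.2f}")
```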

The quality engineer

The quality engineer’s focus is on trends and patterns that can account for changes in yield. They review daily reports that highlight scrap and rework rates to see whether trends are positive, negative or holding flat.

An integrated data management and analytics platform allows them to dig deeper when they see red flags. They can tie negative trends back to which parts are showing up in the repair bay. From there, those birth history records allow them to see if there is a correlation with, for example, a specific test or process on the line (equipment problem?), a specific batch of components (supplier problem?), or a specific production shift (operator error?).

Data can be broken down in various granular ways so the quality engineer can apply the Pareto Principle, or 80/20 rule, to focus first on addressing those quality issues that will yield the greatest overall improvement. Specifications and tolerances up and down the line can then be adjusted to prevent the same quality issues from occurring again.
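As a simple illustration of that 80/20 breakdown, the hypothetical sketch below counts repair-bay defects by category and reports the “vital few” categories that account for roughly 80 percent of the total. The categories and counts are invented for the example.

```python
# Hypothetical Pareto (80/20) breakdown of repair-bay defects by category.
# Categories and counts are made-up illustrative data.
from collections import Counter

defects = (["porosity"] * 42 + ["missing_oring"] * 25 + ["torque_low"] * 12 +
           ["leak"] * 9 + ["cosmetic"] * 7 + ["other"] * 5)

counts = Counter(defects)
total = sum(counts.values())
cumulative = 0
print("category        count  cumulative %")
for category, n in counts.most_common():
    cumulative += n
    pct = 100 * cumulative / total
    print(f"{category:<15} {n:>5}  {pct:6.1f}")
    if pct >= 80:
        break  # the "vital few" worth addressing first
```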

The manager/executive

This individual is likely on the hot seat to prove the value of investing in the data and analytics platform. They don’t need to dig down to the granular level the quality engineer works at. Their focus is on the broader production trends over weeks, months and fiscal quarters. Is the operation running smoothly, hitting targets for yield and profitability, or missing the mark?

They need this insight for better decision making, to determine how best to allocate resources – people, equipment, training – and see where cuts can be made to rationalize costs. How does this line compare to that one, or this plant to that plant? They seldom look specifically at the repair bay data, but it is an essential piece that brings clarity to this broader picture.

Break down the repair bay silo

From the frontline repair technician to the executive focused on the big picture, making the repair bay an integrated piece of a plant’s data management puzzle is key to winning the Industry 4.0 quality race.