
The Future of Smart Manufacturing: How SPC and Digital Data Drive Efficiency

In the rapidly changing manufacturing industry, data has become a key driver of quality and efficiency. Advances in technology mean that digital data collection systems give companies unprecedented access to data. However, having large amounts of data does not automatically translate into better process control or quality improvement. This is where Statistical Process Control (SPC) comes in – a time-honoured, proven methodology for using data to monitor and control manufacturing processes.

Challenges of digital data collection

Digital data collection systems, such as digital databases, are powerful tools. They collect industrial data, such as measurement data, and store it in time-series database applications. Their advantage is the ability to track every data value (e.g., speed, feed, temperature) at all times, which helps drive manufacturing excellence.

However, problems often arise when we try to use control charts to evaluate this digitised data. Control charts are designed to provide insightful process control information from sampled data; they work best on regularly collected, well-thought-out samples. They are not well suited to analysing data collected in near real time, for example once per second.

This doesn’t mean that control charts are ineffective, but we need to use them intelligently. The key is to thoughtfully collect and use the available data: what are our goals? What questions are we trying to answer?

A sensible sampling plan allows us to collect small amounts of data on a regular, repeating basis and plot them on the control chart to distinguish normal from abnormal operating behaviour. These plot points may represent individual values, averages, or ranges, and the sampling frequency varies with the process characteristics and the sources of variation we wish to uncover.

Now, consider this question: how much do you think an oven's temperature changes from one second to the next? Most likely almost not at all. If we plotted the temperature in one-second increments on a control chart, we would probably see "chunks" of data points, since successive values are often identical. Not only does this look unnatural, it can also produce overly tight control limits.


For example, if we plot 100 points on a control chart and 12 of them fall outside the control limits, that is not necessarily a process change worth worrying about. More importantly, when data is sampled too frequently, the control chart can generate far too many false alarms, because second-to-second fluctuations do not genuinely distinguish normal from abnormal behaviour in the process.
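To make this concrete, here is a minimal Python sketch using simulated (not real) oven data. Because one-second readings barely change, the moving-range estimate of variation collapses, the individuals-chart limits become extremely tight, and a slow, perfectly ordinary drift pushes a large share of points "out of control".

```python
import numpy as np

rng = np.random.default_rng(0)

# One hour of simulated oven temperature, sampled once per second.
# The true temperature drifts slowly and the sensor reports to 0.1 degrees,
# so many successive one-second readings are identical ("chunks").
seconds = np.arange(3600)
true_temp = 180 + 0.002 * seconds + rng.normal(0, 0.05, seconds.size)
readings = np.round(true_temp, 1)

# Individuals-chart limits estimated from the average moving range (d2 = 1.128).
moving_range = np.abs(np.diff(readings))
sigma_hat = moving_range.mean() / 1.128
centre = readings.mean()
ucl, lcl = centre + 3 * sigma_hat, centre - 3 * sigma_hat

print(f"identical successive readings: {np.sum(moving_range == 0)} of {moving_range.size}")
print(f"control limits: [{lcl:.2f}, {ucl:.2f}] degrees C")
print(f"points outside the limits: {np.sum((readings < lcl) | (readings > ucl))} of {readings.size}")
```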

Three strategies for using control charts in conjunction with digitised data streams

Digital databases and statistical process control (SPC) systems are complementary technologies; neither should be considered a substitute for the other. With the vast amount of data provided by digital, intelligent plant software systems, SPC systems can be used to control processes accurately, improve products, solve problems, reduce costs, and increase efficiency.

Strategy 1: Develop a sensible sampling plan

First, it is important to define what "sensible" means here. Process experts (including operators, engineers, and quality professionals) need to be on the same page when deciding on the most appropriate data sampling method. Here are some key questions to consider when developing your sampling plan:

  1. How much data should we collect?
  2. How often should the data be collected?
  3. What should our sample size be (e.g., two or three results collected per sample)?
  4. How often should the control charts be reviewed?
  5. Which parameters are important enough to chart (e.g., critical quality characteristics)?

Although many methods exist for determining a sampling plan, our advice is simple but effective: collect the right amount of data at the right frequency. This advice may seem general, but the idea behind it is that developing a sampling plan is a process that requires discussion, different perspectives, and ultimately rational decision-making. A good sampling plan should be just right, neither too much nor too little.
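One way to make the answers concrete is to record the agreed plan as a small, shared data structure. The sketch below is purely illustrative and the field names are hypothetical; it simply mirrors the five questions above.

```python
from dataclasses import dataclass

@dataclass
class SamplingPlan:
    """One agreed sampling rule for a charted parameter (field names are illustrative)."""
    parameter: str         # what is measured
    sample_size: int       # results collected per sample
    interval_minutes: int  # how often a sample is drawn
    chart_review: str      # how often the control chart is reviewed
    rationale: str         # why the parameter is charted (e.g. a critical quality characteristic)

# A plan the stakeholders might agree on for the oven example used later in this article.
oven_plan = SamplingPlan(
    parameter="oven temperature",
    sample_size=1,
    interval_minutes=15,
    chart_review="once per shift",
    rationale="critical quality characteristic",
)
print(oven_plan)
```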

Once consensus on a sampling plan has been reached, stakeholders can select data from a digitised database (or other source) at an agreed time and manually enter it into the SPC program. This strategy is not very technical, but the benefits of a good sampling plan combined with an efficient and insightful control chart are clear.

The sampling plan should be dynamic; it should not be assumed to be fixed just because it was agreed at the outset. Instead, a second plan should be created from the lessons learnt with the first and improved iteratively as new information comes in. By optimising the sampling plan, the organisation learns more quickly about its key processes, how to control them better, and how to get the most out of them.

Strategy 2: Sample data directly from digitised databases

After implementing a sound sampling plan as described in Strategy 1, many SPC systems can automatically sample data directly from historical data streams. This capability greatly simplifies data collection.

Using the oven temperature mentioned earlier as an example, let's say an expert predicts that the temperature will gradually increase over several hours of the heating process. To track the temperature without drowning in data, the team might agree to automatically collect one temperature data point from the historical data stream every 15 minutes. Such a sampling schedule lets the team follow meaningful changes in temperature while avoiding data overload.
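As a rough sketch of what that automatic sampling could look like, assuming the historical stream can be read into a pandas Series (the names and values below are invented for illustration), one reading is pulled every 15 minutes:

```python
import numpy as np
import pandas as pd

# Hypothetical one-second historical stream of oven temperatures over four hours.
rng = np.random.default_rng(0)
index = pd.date_range("2024-01-01 06:00", periods=4 * 3600, freq="s")
stream = pd.Series(
    180 + np.linspace(0, 6, index.size) + rng.normal(0, 0.1, index.size),
    index=index,
    name="oven_temp_c",
)

# Pull one reading every 15 minutes for the control chart, per the agreed sampling plan.
sampled = stream.resample("15min").first()
print(sampled.head())
```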

For example, a control chart might show a gradual increase in temperature over time: points fall below the centre line early in the run and above it towards the end. Eventually, most of the points lie above the centre line, clearly depicting the overall upward trend.
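A minimal sketch of how the centre line and limits of such an individuals chart could be computed from the 15-minute samples is shown below; the temperature values are invented for illustration.

```python
import numpy as np

# Hypothetical 15-minute temperature samples taken over a gradual warm-up.
temps = np.array([180.1, 180.3, 180.2, 180.6, 180.9, 181.1, 181.4, 181.8,
                  182.0, 182.3, 182.5, 182.8, 183.0, 183.3, 183.5, 183.8])

# Individuals chart: centre line and 3-sigma limits from the average moving range (d2 = 1.128).
centre = temps.mean()
sigma_hat = np.abs(np.diff(temps)).mean() / 1.128
ucl, lcl = centre + 3 * sigma_hat, centre - 3 * sigma_hat

print(f"centre line {centre:.2f}, limits [{lcl:.2f}, {ucl:.2f}]")
print("plot points above the centre line:", np.flatnonzero(temps > centre) + 1)
```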

This strategy supports process control and cost efficiency by automatically extracting a small but representative number of data points from the historical data, and the automation keeps the approach easy to understand and operate. Remember, though, that the sampling plan should remain adjustable over time as insights into the process accumulate: depending on what is learnt, the team may choose to increase or decrease the sampling frequency. Adjusting the plan should be a simple matter when the SPC system is electronically connected to the other software systems.

Strategy 3: Determine the data model and select data to build control charts

In Strategy Two, we saw an example of collecting temperature data every 15 minutes. Sometimes, however, management or other stakeholders may demand that more data be utilised, such as using more than four temperature data points per hour. While this demand may seem excessive, it is feasible with the ability to programmatically modify the sampling plan in the SPC software.

Suppose the team decides to collect temperature data every minute, which would result in 60 data points per hour. Using these individual data points directly on the control chart could lead to information overload and increase the risk of false alarms.

The solution is to keep collecting the data but, instead of plotting every individual reading on the control chart, to calculate statistics from the collected data and plot those statistics instead.

The methodology for implementing this strategy is as follows (a short code sketch follows the list):

  1. Create subgroups of 15 data points, one data point for each one-minute period, so that each subgroup covers 15 minutes.
  2. Calculate the average of the 15 data points in each subgroup. Each subgroup average represents the mean temperature over that 15-minute period.
  3. Plot these averages on a control chart, treating each average as a single plot point. This chart shows how the average temperature changes from one 15-minute window to the next.
  4. Similarly, calculate the sample standard deviation of the 15 data points in each subgroup, which indicates the variability of the temperature over that period.
  5. Plot each subgroup's standard deviation on a second control chart. This chart shows how temperature variability changes over time.
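A minimal sketch of steps 1 to 5, using simulated one-minute readings rather than real plant data, might look like this:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical one-minute temperature readings over 5.5 hours (330 points),
# with a gradual warm-up and more variability during the first hour of the run.
minutes = np.arange(330)
temps = 180 + 0.01 * minutes + rng.normal(0, np.where(minutes < 60, 0.6, 0.2))

# Steps 1-2: subgroups of 15 consecutive one-minute readings -> one mean per 15 minutes.
subgroups = temps.reshape(-1, 15)        # 22 subgroups of 15 readings
means = subgroups.mean(axis=1)           # plotted on the mean chart (step 3)
stdevs = subgroups.std(axis=1, ddof=1)   # plotted on the standard-deviation chart (steps 4-5)

print(f"{means.size} plot points per chart")   # 22, matching the 5.5-hour run
print("first three subgroup means:", np.round(means[:3], 2))
print("first three subgroup standard deviations:", np.round(stdevs[:3], 2))
```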

The result of this method is two control charts, each with 22 plot points over a five and a half hour period.

Both charts provide valuable information:

Mean chart: depicts the expected trend of gradually increasing temperature.

Standard deviation chart: demonstrates the larger fluctuations in temperature at the beginning of the run, as well as the smaller temperature changes during the second, third, and fourth hours.

Additionally, using this strategy rather than a traditional X-bar and R chart or X-bar and S chart avoids the problem that readings taken a minute apart are too autocorrelated for the control limits to be calculated correctly. It also allows the original hypothesis about the process to be validated or adjusted based on the data collected.
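To see why autocorrelation undermines traditionally calculated limits, the sketch below simulates strongly autocorrelated one-minute readings (an AR(1) series, purely illustrative). The sigma implied by within-subgroup variation, which a traditional X-bar chart would use, comes out far smaller than the variation the subgroup means actually show, so the traditional limits would be much too tight.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical AR(1) temperature readings taken once per minute: each reading
# stays close to the previous one, i.e. the series is strongly autocorrelated.
n, phi = 330, 0.9
temps = np.empty(n)
temps[0] = 180.0
for i in range(1, n):
    temps[i] = 180 + phi * (temps[i - 1] - 180) + rng.normal(0, 0.2)

subgroups = temps.reshape(-1, 15)
means = subgroups.mean(axis=1)

# Sigma of the subgroup mean implied by within-subgroup variation (X-bar chart logic)...
within_sigma_of_mean = subgroups.std(axis=1, ddof=1).mean() / np.sqrt(15)
# ...versus how much the subgroup means actually vary when the data are autocorrelated.
observed_sigma_of_mean = means.std(ddof=1)

print(f"sigma of the mean implied by within-subgroup variation: {within_sigma_of_mean:.3f}")
print(f"observed standard deviation of the subgroup means:      {observed_sigma_of_mean:.3f}")
```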

This approach not only manages large amounts of data efficiently but also preserves the clarity and utility of the control charts, allowing the team to monitor and understand the process more accurately while avoiding the confusion and misinterpretation that too many data points can cause.

Conclusion

Manufacturing organisations around the world are already benefiting extensively from digital data collection systems, and SPC continues to prove its value. In fact, SPC and digital data collection do not merely co-exist; they complement each other. Used together, the two technologies unlock far more of the data's potential.

To ensure that SPC delivers the most valuable and relevant information, the key is to sample data from digital data collection systems in a way that makes sense. In this way, organisations can effectively use the intelligent capabilities of SPC to control their processes, thereby improving efficiency and production quality.

The ultimate benefit is clear: improved product quality and lower costs strengthen an organisation's competitiveness in the marketplace. In today's data-driven era, effectively combining SPC and digital data collection systems is a critical step forward.
