
Digitalisation in manufacturing – building on the foundations of good data

Mathew Price

Principal Consultant

In the world of modern manufacturing, many organisations have already started building advanced tools for digitalising their manufacturing processes. But are they being built on strong foundations?

Most manufacturers are currently focussing their investment on AI, as well as analysing, storing and securing data. While there are huge gains to be had from doing this, the outputs from these tools are only as good as the data that they’re based on.

The old adage of ‘garbage in, garbage out’ rings true, regardless of whether the manufacturing process uses an advanced Digital Twin with Machine Learning, or simple, traditional closed-loop control.  

In our previous article about Digital Twins, we said: 

  • Capturing good data is the foundation of any digitalisation concept. 
  • Regardless of your ambitions for digitalisation, you should take small, incremental, agile steps.

It therefore stands to reason that your first small (but significant) step should focus on laying down the foundations of good data. In this article, we give some guidance on how to do just that. 

Where does the data come from?

Manufacturing processes are often complex, finely-balanced, dynamic ecosystems, in which the slightest deviation can cause significant quality issues. As such, sensors play a pivotal role in controlling the process, by feeding the machine’s control system with the vital information that it needs to make effective control decisions. 

Unfortunately though, using an accurate sensor doesn’t necessarily mean that you’ll get good data. 

Don’t assume that your existing sensors are giving you good data, just because you’ve been using them for years and your processes seem stable.

You should be mindful that measurements from a sub-optimal measuring system can make a process seem more stable than it actually is. Remember, OEM machines are built to a cost-target, so some design decisions might have been compromised by cost. 
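As a minimal sketch of this effect (all figures below are illustrative assumptions, not from any real process): a slow or heavily damped sensor effectively averages over many samples, so it reports far less variation than the process actually has, and the process looks more stable than it is.

```python
import random
import statistics

# Hypothetical process: a value that should sit at 100 but fluctuates.
random.seed(42)
true_values = [100 + random.gauss(0, 2.0) for _ in range(1000)]

# Model a slow sensor as a moving average over the last `window` samples
# (an assumed, simplified model of sensor damping).
window = 20
smoothed = [
    statistics.mean(true_values[max(0, i - window + 1): i + 1])
    for i in range(len(true_values))
]

true_sd = statistics.stdev(true_values)
observed_sd = statistics.stdev(smoothed)
print(f"true process sigma:   {true_sd:.2f}")
print(f"sigma seen by sensor: {observed_sd:.2f}")  # noticeably smaller
```

The smoothed signal's standard deviation is a fraction of the true one, so a control chart fed by this sensor would suggest a much more capable process than actually exists.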

It’s probably better to start with no data source than with a bad one – even one that you believe is good. The former encourages you to think about the problem properly, in detail, whereas the latter might give you a false sense of confidence based on potentially misleading information.

Despite this, many organisations have already built advanced digitalisation tools, like Digital Twins, on top of a machine’s existing built-in sensors without a sufficient understanding of the sensor technology, its performance, or what exactly it is measuring.

What do we mean by ‘good data’, and how do you get it?

You can achieve good data by measuring the right parameters, in the right location, using the right sensor… but this isn’t as simple as it sounds! 

You should start by critically assessing the relevance, accuracy and value of your existing sensor setup and the data that you’re capturing. Taking the following steps will enable you to identify any gaps and weaknesses in your current data, and help you to get good data: 

1. Understand the science

The most important thing to have when implementing a measurement and control system is a deep understanding of the fundamental science… in both the manufacturing process and the measurement technology. 

A manufacturing process is essentially a sequence of transformation steps, each governed by the fundamental laws of physics and/or chemistry. Similarly, a sensor is governed by the laws of science in the way that it outputs an electrical signal in response to a physical or chemical stimulus. 
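As a hedged illustration of that transduction step: a type-K thermocouple outputs roughly 41 µV per °C near room temperature. The linear model below is a simplification for illustration only – real conversions use the standard NIST ITS-90 polynomials.

```python
# Simplified, assumed model: type-K thermocouple sensitivity of ~41 µV/°C.
# Real instruments use the NIST ITS-90 polynomial tables, not a constant.
SEEBECK_UV_PER_C = 41.0

def thermocouple_emf_uv(t_hot_c: float, t_ref_c: float = 0.0) -> float:
    """Approximate EMF (µV) for a hot-junction temperature vs. a 0 °C reference."""
    return SEEBECK_UV_PER_C * (t_hot_c - t_ref_c)

def temperature_from_emf(emf_uv: float, t_ref_c: float = 0.0) -> float:
    """Invert the linear model: electrical signal back to temperature (°C)."""
    return emf_uv / SEEBECK_UV_PER_C + t_ref_c

print(thermocouple_emf_uv(100.0))      # ~4100 µV
print(temperature_from_emf(4100.0))    # ~100 °C
```

The point is not the arithmetic but the chain of assumptions: every reading you log has already passed through a physical model like this, and any mismatch between the model and reality goes straight into your data.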

2. Identify exactly what you need to measure

On first impressions, it might seem obvious what you need to measure – e.g. the temperature in a biscuit-baking process. But exactly what temperature should you measure?

For example, in biscuit baking the critical temperature that you need to control (and should therefore want to measure) is the temperature of the dough throughout each biscuit, but this simply isn’t practical.

You will therefore need to measure something else that will help you to control the temperature throughout each biscuit, but what? As soon as you move away from directly measuring the critical parameter in the critical position you will need to consider the impact of the other variables. 

Unfortunately, the list of variables that affect heat transfer is vast, and it will likely include a mixture of both controlled and uncontrolled variables. You won’t want to measure all of them, but you should measure the ones that have a noticeable effect in your application. To do this, you will need to combine a deep understanding of the science with a healthy dose of pragmatism. 

3. Identify where you should take the measurements 

Using the example of temperature again: the temperature of an object or a system can vary significantly depending on where you measure it. Understanding the spatial distribution and transfer of heat is essential for gaining insights into the behaviour of a system. 

Example: In the food industry, heat-sealed packaging is widely used to prolong the shelf-life of food. As it’s not feasible to measure the temperature inside the packaging film as it melts, the next best thing is to measure the temperature of the sealing tools. The critical temperature to measure is at the sealing surface that comes into contact with the packaging film, but you might be surprised by where the thermocouples are located in many sealing tools… and not in a good way! We have measured temperature errors of greater than 30°C in some applications. 
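To see how large a placement error can get, here is a back-of-envelope sketch using Fourier’s law of conduction: under steady one-dimensional heat flow, the temperature drop across the tool material between the thermocouple and the sealing surface is ΔT = q·d/k. The heat flux, offset distance and conductivity below are illustrative assumptions, not measurements from the applications mentioned above.

```python
# Steady-state 1-D conduction (Fourier's law): delta_T = q * d / k,
# where q is heat flux (W/m²), d the sensor-to-surface distance (m),
# and k the tool material's thermal conductivity (W/m·K).

def placement_error_c(heat_flux_w_m2: float, distance_m: float,
                      conductivity_w_mk: float) -> float:
    """Temperature difference between thermocouple position and sealing surface."""
    return heat_flux_w_m2 * distance_m / conductivity_w_mk

# Illustrative assumption: 100 kW/m² flowing through 5 mm of stainless
# steel (k ≈ 16 W/m·K) between the thermocouple and the sealing surface.
print(f"{placement_error_c(100e3, 5e-3, 16.0):.1f} °C")  # ≈ 31 °C
```

Even with these rough numbers, a few millimetres of tool steel between the thermocouple and the sealing surface is enough to explain an error of the order of 30 °C.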

Furthermore, taking measurements in several different positions can help you to identify variations, trends, and patterns that may be indicative of underlying phenomena or process inefficiencies. 

4. Select the right sensor and how to mount it

There are many different types of sensor technology, each with their pros and cons. However, selecting the right sensor isn’t just about selecting the right technology, or even selecting the best performance according to the technical datasheet. You should also understand how the sensor works and how to use it properly. 

Common factors to consider when selecting a sensor are: 

– Technology 
– Size, shape 
– Materials 
– Accuracy 
– Range 
– Sample rate 
– Response time 
– Resolution 
– Stability 
– Signal to noise ratio 
– Calibration error 
– Sensitivity to other variables 
– etc.

Often overlooked, timing accuracy is crucial for measurement accuracy because many physical phenomena and processes occur dynamically over time, and sometimes change very quickly. 

Example: In a high-throughput heating process, you might need to measure any rapid changes in temperature. Although your measurement and control system might be capable of taking thousands of measurements per second, the thermal mass of your thermocouple will affect its ability to detect rapid changes in temperature.

Unfortunately, the thermocouples typically used in industrial/automation applications are generally quite large (i.e. slow to respond), so they’re often unable to give an accurate measurement of the temperature at the moment it was taken. Worse still, the time-lag isn’t constant, so it’s difficult to compensate for. 
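The lag described above can be sketched as a first-order system, dTs/dt = (T_true − Ts)/τ, where the time constant τ grows with the probe’s thermal mass. This is a simplified model with illustrative time constants, not data from a real thermocouple.

```python
# First-order sensor lag: dTs/dt = (T_true - Ts) / tau.
# A probe with more thermal mass has a longer time constant tau,
# so its reading trails the true temperature after a fast change.

def simulate_step_response(tau_s: float, dt_s: float, t_end_s: float):
    """Sensor readings over time for a step from 20 °C to 200 °C at t = 0."""
    t_true = 200.0   # true temperature after the step
    ts = 20.0        # sensor reading before the step
    readings = []
    for _ in range(int(t_end_s / dt_s)):
        ts += (t_true - ts) * dt_s / tau_s  # explicit Euler update
        readings.append(ts)
    return readings

# Illustrative time constants: a small, fast probe vs. a large, slow one.
fast = simulate_step_response(tau_s=0.1, dt_s=0.01, t_end_s=1.0)
slow = simulate_step_response(tau_s=2.0, dt_s=0.01, t_end_s=1.0)
print(f"after 1 s: fast probe reads {fast[-1]:.0f} °C, "
      f"slow probe reads {slow[-1]:.0f} °C")
```

One second after the step, the fast probe has essentially converged while the slow probe is still far below the true temperature – and in a real process the effective τ also drifts with flow, fouling and contact conditions, which is why the lag is hard to compensate for.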

For many sensors, the way you mount them can also have an impact on the accuracy of their measurements, so it’s vitally important to consider this too. With clever design and careful material selection the effect that the mounting method has on the data can be significantly reduced.  

Only once you’ve verified that you’re capturing good data should you start taking the next steps towards your exciting digitalisation goals. 

Don’t assume AI will fix bad data

If the existing data from your OEM machine isn’t showing the trends or correlations that you’d expect to see, it might simply be because the sensors aren’t actually measuring what you think they are. Don’t assume that using advanced analytics or Machine Learning will uncover any hidden patterns or correlations from bad data.

Your first priority should be getting good data, because the output from AI is only as good as the input data (garbage in, garbage out). AI has little value without good data, whereas good data still has significant value without AI. 

Conclusion

Digital Twins and other digitalisation tools have the potential to unlock immense value in the manufacturing sector, but only if they are built on the strong foundations of good data. 

Focusing on the science will help you to capture good data which will, in itself, unlock huge value for your organisation – including leaving you better informed to assess the business-case for your digitalisation ambitions. The data also lets you create a baseline for your digitalisation journey, to use as an indicator of progress. 

By taking small, iterative (agile) steps and re-evaluating your digitalisation ambitions (and business-case) after each step, you might discover that the majority of the value you want to achieve can be unlocked before you reach the level of digitalisation that you were originally aiming for. 

How can 42T help? 

At 42T, our objective is to help you achieve your business goals by the most appropriate means. We take a pragmatic approach to choosing the right solutions for you, which might not require any digitalisation whatsoever!

We can do this by combining our: 

  • wealth of experience in the manufacturing sector 
  • deep understanding of the underlying scientific principles behind a manufacturing process 
  • broad and extensive knowledge of different technologies  

Importantly, we’re also solution-agnostic because we’re not in the business of selling specific tools and software.