Data: if it’s not big, then it better be good
When it comes to data, huge data sets create huge opportunities. That’s how consumer-focused companies train the algorithms that make relevant recommendations about our music, food, movies, and much more.
The industrial sector, too, has large opportunities enabled by data. Huge volumes of information from assets such as airplane engines or gas turbines can yield insights that power software capable of predicting equipment failure or maintenance needs.
However, it becomes much harder to train algorithms and build powerful software when data is scarce, a common situation in the industrial sector. Industrial operators are protective of their data, so information about critical applications is rarely shared. Failure data is also scarce because production errors themselves are rare: an automobile engine manufacturer sees only a very small number of defective or out-of-specification engines, and the same holds for battery manufacturers.
The solution?
If you don’t have big data, then the quality of the data you do have becomes paramount.
That’s because when your data set runs into the millions, poor-quality data points are easily averaged out. But if your data set is small(er), every data point is important in building an accurate model, and a few bad readings can skew the entire set, making any algorithm trained on it useless in the real world.
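To see why, consider a minimal sketch, written in Python with made-up numbers rather than real inspection data: the same handful of bad readings from a (simulated) stuck sensor barely moves a model fitted to a large data set but wrecks one fitted to a small data set.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitted_slope(n_points: int, n_bad: int) -> float:
    """Least-squares slope of noisy y = 2x data containing a few bad readings."""
    x = rng.uniform(0, 10, n_points)
    y = 2.0 * x + rng.normal(0, 0.5, n_points)  # true slope is 2.0
    x[:n_bad], y[:n_bad] = 0.0, 50.0            # a stuck sensor: full-scale readings at x = 0
    slope, _intercept = np.polyfit(x, y, 1)
    return slope

# Five bad readings barely move a fit over a large data set...
print(fitted_slope(n_points=100_000, n_bad=5))  # stays close to 2.0
# ...but badly skew a small one.
print(fitted_slope(n_points=30, n_bad=5))       # nowhere near 2.0
```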
The next question, then, is: “How can I generate ‘good’ data?”
Nature and nurture
The short answer is that good data comes from good sensors. But that tells only half the story. Yes, good sensors are critical because they provide the sensitivity, accuracy, durability, and reliability to generate trustworthy data day after day, year after year, no matter how extreme the conditions.
Equally important, though, is expertise in the application of those sensors. Understanding what type of technology is appropriate in each setting, and where it should be placed, is crucial, particularly in industrial inspection and non-destructive testing (NDT), where so many types of sensors are available. From x-ray and ultrasound to CT scanners and vibration sensors, each situation requires the right technology, and it takes expertise to know which one to use and how to apply it efficiently and consistently.
Consistency is not only important across multiple lines in a single plant but also across multiple plants, whether located across the country or the globe. Consistent sensor application with replicable outcomes is at the heart of consistent quality data and, therefore, uniform product quality, no matter which facility it comes from.
The growth in just-in-time manufacturing and additive manufacturing highlights another reason why expertise in the application of sensors is so important. In this environment, speed is essential, and NDT experts can quickly assess and deploy testing requirements to provide immediate feedback on product quality. This allows for faster iterations and prototyping that shorten time-to-market, lower investment costs and improve productivity.
Generating business value
This reliance on NDT data early in the value chain and as a part of product lifecycle management is the future of our sector. Today, many customers come to us with quality and productivity issues that we can help them solve: What is the best way to improve the quality of 3D-printed parts? How do we ensure the quality of composite parts? This is an important evolution, because by serving as a partner and collaborator, we can help customers transform their view of testing, so they see it as a tool to achieve a range of business objectives around cost savings, quality improvements and shorter production timelines. NDT is also changing the way data is collected, analyzed and deployed. Powerful software, such as Waygate Technologies’ InspectionWorks™, helps companies turn the good data they collect into improved outcomes.
For example, InspectionWorks creates a holistic view of all relevant inspection data streams, which are often siloed behind different supplier technologies or in different parts of the plant. The platform collects and collates these diverse streams, analyzes the data to generate actionable insights, and helps customers drive improvements through workflows that connect and integrate with customer software packages such as Computer-Aided Design and Manufacturing (CAD/CAM), Manufacturing Execution Systems (MES), Product Lifecycle Management (PLM) and Enterprise Resource Planning (ERP) systems.
A wind energy OEM customer of ours can create complex multi-instrument, multi-user inspection workflows and push them to its ERP system, which deploys work orders to internal or external inspectors. Once inspectors conduct the inspections in the field, the data is automatically attached to the right “locations” on the asset and the inspection reports are pushed back to the PLM system. Finally, the inspection data, rich with annotations and expert notes, is processed, analyzed and converted into actionable insights that the customer’s CAD/simulation systems can digest to enable design and process improvements.
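As a rough illustration of that round trip, consider the minimal sketch below. To be clear, every name in it is a hypothetical stand-in of our own, not the InspectionWorks API or this customer’s actual integration; it only shows the pattern of attaching field data to an asset location, filing the report, and escalating out-of-spec findings.

```python
from dataclasses import dataclass, field

# All names below are hypothetical stand-ins, not the InspectionWorks API.

@dataclass
class InspectionResult:
    asset_id: str
    location: str              # e.g. a specific blade section on a turbine
    wall_thickness_mm: float   # an illustrative measurement
    notes: list[str] = field(default_factory=list)

class PLMStub:
    """Stand-in for a PLM connector that files reports against asset locations."""
    def attach_report(self, result: InspectionResult) -> None:
        print(f"PLM: report filed for {result.asset_id} @ {result.location}")

class SimStub:
    """Stand-in for a CAD/simulation connector that consumes flagged findings."""
    def queue_review(self, result: InspectionResult) -> None:
        print(f"SIM: design review queued for {result.asset_id}")

def route(result: InspectionResult, plm: PLMStub, sim: SimStub,
          min_thickness_mm: float = 4.0) -> None:
    # Attach the field data to the right location on the asset, push the
    # report back to PLM, and escalate out-of-spec findings to simulation.
    plm.attach_report(result)
    if result.wall_thickness_mm < min_thickness_mm:
        sim.queue_review(result)

route(
    InspectionResult("turbine-017", "blade-2/root-section", 3.6,
                     ["inspector: corrosion pitting near bolt circle"]),
    PLMStub(), SimStub(),
)
```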
As this type of software ecosystem continues to evolve, the importance of testing as a source of value creation, innovation and business differentiation will only grow.
Manufacturing and services are being transformed by data, and a big part of what will drive this sector forward is the data generated and analyzed by industrial inspection and NDT.
For business leaders, however, amid the growing buzz around data, it’s essential to remember that high-quality data is just as important as big data in driving the desired outcomes.