Mining and mining equipment, technology and services have traditionally been engines of economic growth and prosperity for Australia and New Zealand, so what operators in these sectors do has always mattered. In Australia, the sector contributed a record $455 billion in export revenue over the 2022-23 financial year.
Industrial sectors like mining have always been more attuned to the potential of data than most. At their scale, even a very small - often low single-digit percentage - improvement can deliver a multi-million dollar savings outcome. A major mining contractor in Australia, for example, once said that a 1% operational improvement could save it “close to 3 million litres of diesel”.
The corollary is that larger percentage improvements are also achievable, and their impact is even more pronounced. Newcrest Mining, for example, increased throughput by 650,000 tons in the first six months of using a soft IoT sensor to optimise the amount of newly crushed ore that could be emptied into a crushed-ore bin, from which it is fed onward for further processing.
Both examples reinforce the power of being data-driven in Australian and New Zealand industry, where the bottom-line benefits of process optimisation can be enormous.
And here’s the big thing: this is often achieved today with a fairly limited set of data analytics approaches. The data analytics space has evolved considerably over the past few years, and with it, the ‘art of the possible’.
Layering additional rigorous models and AI-based advanced analytics on top of traditional data platforms and capabilities promises to unlock a level of process optimisation and improvement for a wide variety of Australian and New Zealand industrial sectors - from mining to manufacturing - that was unimaginable a few short years ago. Data also enhances collaboration across large industrial sites with dispersed teams, empowering them to make more informed and timely decisions.
The evolution of data sources
Industrial companies have traditionally sought to capitalise on their wealth of historical data to maximise operational efficiency and profitability. Today, this data is increasingly used to meet their reporting and compliance requirements in areas such as environmental sustainability and corporate governance.
Many operations naturally start out on a data-driven journey by using data collected and stored in their data historian, a repository for historical plant data produced by process control systems.
This is a natural starting point: the data historian is, after all, capturing and storing high-fidelity process, alarm, and event history data, and allowing for it to be queried to enable troubleshooting and informed decision-making about production assets.
Traditional equipment monitoring programs rely on data measured throughout the process to inform maintenance decisions. For example, temperature and vibration data may be used to predict a variety of failure modes for a centrifugal pump. By using historical data, reliability engineers can determine a baseline value for each measurement and configure alerts when values fall outside of this range. This is known as condition-based monitoring and is a simple way to begin using measured data to improve process reliability.
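To make this concrete, the sketch below shows condition-based monitoring in its simplest form: a baseline band derived from historical readings, with an alert when a new reading falls outside it. The sensor values, the 3-sigma band and the function name are illustrative assumptions rather than any particular vendor's implementation.

```python
# A minimal sketch of condition-based monitoring, assuming vibration readings
# for a centrifugal pump are available as a pandas Series. The data, the
# 3-sigma band and the alert wording are illustrative assumptions only.
import pandas as pd
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical historical vibration data (mm/s RMS) used to set the baseline.
historical = pd.Series(rng.normal(loc=2.0, scale=0.15, size=5_000))

baseline_mean = historical.mean()
baseline_std = historical.std()
upper_limit = baseline_mean + 3 * baseline_std
lower_limit = baseline_mean - 3 * baseline_std

def check_reading(value_mm_s: float) -> str:
    """Flag a new reading that falls outside the historical baseline band."""
    if value_mm_s > upper_limit or value_mm_s < lower_limit:
        return f"ALERT: {value_mm_s:.2f} mm/s outside baseline ({lower_limit:.2f} to {upper_limit:.2f})"
    return f"OK: {value_mm_s:.2f} mm/s within baseline"

# Example: a reading drifting well above the baseline triggers an alert.
print(check_reading(2.1))
print(check_reading(2.8))
```

In practice the baseline would be derived from historian data for the relevant tag, but the principle is the same.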
While condition-based monitoring is useful for assets with relatively stable operation, accounting for different operating windows or process modes can quickly become a challenge.
The historian remains relevant and has a key role to play in asset monitoring, reporting and basic data-driven decision-making. But performing the latter using historical data alone can only achieve so much.
Operators today understand that both historical and real-time data are critical for optimising process performance, reducing production variation, and identifying crucial process events.
That, in turn, has driven industrial companies to adopt predictive analytics and, increasingly, artificial intelligence and machine learning (AI/ML) as part of a robust Asset Performance Management program.
AI and digital twins are a force multiplier
Forward-looking industrial companies are choosing to layer rigorous models and AI-based advanced analytics on top of their data historians, to extract value and gain faster and better insights into processes and assets.
AI and ML models, including deep learning approaches, can forecast an asset’s remaining useful life, giving teams critical information and prescriptive insights to weigh cost against risk and devise plans that maximise efficiency and profitability.
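As a rough illustration of the idea, the sketch below estimates remaining useful life by fitting a linear degradation trend to a health indicator and extrapolating it to an assumed failure threshold. Real RUL models are usually far more sophisticated; the indicator, threshold and trend model here are purely illustrative assumptions.

```python
# A minimal sketch of remaining-useful-life (RUL) estimation, assuming a
# trending health indicator (e.g. bearing vibration) and a known failure
# threshold. Both the data and the threshold are illustrative assumptions.
import numpy as np

# Hypothetical health-indicator history: one reading per day of operation.
days = np.arange(120)
health_indicator = 2.0 + 0.01 * days + np.random.default_rng(0).normal(0, 0.05, 120)

FAILURE_THRESHOLD = 4.5  # assumed level at which the asset is deemed failed

# Fit a simple linear degradation trend and extrapolate to the threshold.
slope, intercept = np.polyfit(days, health_indicator, deg=1)
if slope <= 0:
    print("No degradation trend detected; RUL cannot be estimated this way.")
else:
    day_of_failure = (FAILURE_THRESHOLD - intercept) / slope
    rul_days = day_of_failure - days[-1]
    print(f"Estimated remaining useful life: {rul_days:.0f} days")
```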
Operators can define leading indicators based on sensor and other operations data and use this information to detect even subtle changes in asset performance.
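One simple way to picture this is a drift detector running on a leading indicator. The sketch below uses a one-sided CUSUM statistic as a stand-in for the AI-based detectors described here; the reference mean, slack value and decision threshold are illustrative assumptions tuned to the synthetic data.

```python
# A minimal sketch of detecting a subtle, sustained shift in a leading
# indicator using a one-sided CUSUM statistic. All parameters below are
# illustrative assumptions, not tuned values from any real operation.
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical indicator: stable for 300 samples, then a subtle upward drift.
stable = rng.normal(1.0, 0.05, 300)
drifting = 1.0 + 0.003 * np.arange(100) + rng.normal(0, 0.05, 100)
indicator = np.concatenate([stable, drifting])

target_mean = 1.0   # expected value under healthy operation
slack = 0.02        # ignore deviations smaller than this
threshold = 0.5     # cumulative deviation at which to raise an alert

cusum = 0.0
for i, value in enumerate(indicator):
    cusum = max(0.0, cusum + (value - target_mean - slack))
    if cusum > threshold:
        print(f"Sustained upward shift detected at sample {i}")
        break
else:
    print("No sustained shift detected")
```

The point of a detector like this is that it flags the drift well before any single reading would breach a static alarm limit.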
Once teams have identified an anomaly, they can use advanced AI tools to predict performance degradation and component failures and then work together to prioritise maintenance needs based on urgency, schedules, available teams, resources, and spare part availability.
In addition to preventing asset failure, predictive AI-based guidance allows operators to minimise energy usage and compare asset performance, helping them meet regulatory and contractual obligations.
Some operators are taking the next step and creating digital twins enhanced with AI capabilities. These can be used to identify optimal operating conditions and act as advisors, helping operators get more out of their assets.
In particular, a digital twin can provide insight into otherwise unmeasurable process variables, allowing tools to proactively predict the best operating conditions to increase yield, reduce energy consumption and cut reliability issues for rotating and stationary assets across operations.
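A minimal way to picture this is a data-driven soft sensor: a model trained on measured process variables to estimate one that cannot be measured directly. The variable names, synthetic relationship and simple linear model below are illustrative assumptions, not a description of any specific digital twin product.

```python
# A minimal sketch of a data-driven "soft sensor", assuming a handful of
# measured process variables (feed rate, mill power, pulp density) can be
# used to infer an otherwise unmeasured quality variable such as particle
# size. Everything here is synthetic and purely illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 2_000

# Hypothetical measured inputs.
feed_rate = rng.normal(1_000, 50, n)      # t/h
mill_power = rng.normal(12_000, 400, n)   # kW
pulp_density = rng.normal(1.45, 0.03, n)  # t/m3
X = np.column_stack([feed_rate, mill_power, pulp_density])

# Hypothetical "true" relationship used only to generate training labels;
# in practice these would come from periodic lab assays or survey data.
particle_size = (
    80 + 0.03 * (feed_rate - 1_000) - 0.002 * (mill_power - 12_000)
    + 40 * (pulp_density - 1.45) + rng.normal(0, 1.0, n)
)

soft_sensor = LinearRegression().fit(X, particle_size)

# The trained model now estimates the unmeasured variable from the measured
# ones, e.g. for the latest set of readings:
latest = np.array([[1_050, 11_800, 1.47]])
print(f"Estimated particle size: {soft_sensor.predict(latest)[0]:.1f} microns")
```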
Every industrial company is on its own unique data-driven journey. Regardless of where they are today, they can still find new ways to optimise operations to increase profitability, while meeting safety requirements and sustainability goals.