InTech

JUL-AUG 2017

Issue link: http://intechdigitalxp.isa.org/i/858445


COVER STORY

More and more data is now available. Through superior analytics, users can pinpoint where to look for additional productivity gains across the value chain. Analytics solutions let them consider many additional variables that traditionally could not have been considered, or afforded, within the control, manufacturing execution system, or even ERP domains. The power and flexible architecture of this technology also allows starting small or scaling up by combining data from many domains and even across multiple sites. Third parties, such as original equipment manufacturers, benchmark their equipment information at a significant level of fidelity, and in some areas live trading information is at hand to be used. In the past, the lack of affordable computing power simply did not allow value chain modeling at this granularity. The combination of big data with artificial intelligence (AI) opens a whole new range of possibilities. As this technology matures, significant investment in this space will still be required to attain the next level of productivity and safety across the value chain, holistically.

The industry today still models product and process variables locally for what one would call unit or cross-unit optimization, such as ball mill optimization, grinding, or flotation. Sometimes these functions are performed in multivariable controllers in the process control layer; in other cases, the models sit in Layer 3 or 4 per the ISA-95 standard for functional or historical reasons. In the past couple of years, there has been increased investment in and uptake of this kind of model after organizations execute an initial analytics proof of concept (PoC), which is often aimed at proving its potential to traditional leadership.
Still, given the analytics hype, it is worth asking: how many companies have crossed the chasm and performed global rollouts of models in each of their core functions? From this perspective, mining is in the "early adopters" phase, whereas the energy, banking, telecommunications, and retail industries may be a step ahead, because the majority have adopted analytics. For mineral processing, the energy industry (upstream and refining) is likely where complex and mature models already exist and where the basic unit operation functions are more repeatable and predictable. Directly applying these models in the mining industry is not always possible, because in mining variability appears across the value chain in different areas. This is often overlooked when automation or software vendors want to bring in solutions from other industries. The attributes and variabilities of different commodities also require addressing quite different process flowsheets, posing the biggest challenge for today's tier 1 mining companies, which have traditionally aimed for a diverse portfolio to reduce financial exposure and risk.

How to start getting the right data?

Targeting the core activities that all the assets have in common is a good starting point when looking for additional productivity, and is part of the intent behind the ISA-95 standard. Another benefit of the standard is that its definitions provide a common language for communication. Complex, mining-specific naming conventions make it difficult for engineers to collaborate with analysts, programmers, and mining, equipment, and technology specialists (METS). These professionals are all essential to running today's mines, but traditionally do not all have a strong IT background. The ISA definitions for the main integration points between core activities are already well defined by B2MML interfaces, which typically use XML.
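To make the B2MML point concrete, here is a minimal sketch of reading a B2MML-style production request with Python's standard library. The element names and payload below are simplified, hypothetical stand-ins, not a schema-validated B2MML document; real B2MML exchanges are namespaced and considerably richer.

```python
# Sketch: flattening a simplified, B2MML-style XML production schedule
# into plain dicts for downstream analytics. Element names and values
# are illustrative assumptions, not the actual B2MML schema.
import xml.etree.ElementTree as ET

B2MML_SAMPLE = """\
<ProductionSchedule>
  <ProductionRequest>
    <ID>PR-1001</ID>
    <Material>iron_ore_fines</Material>
    <Quantity unit="t">5000</Quantity>
  </ProductionRequest>
  <ProductionRequest>
    <ID>PR-1002</ID>
    <Material>iron_ore_lump</Material>
    <Quantity unit="t">3200</Quantity>
  </ProductionRequest>
</ProductionSchedule>
"""

def parse_requests(xml_text: str) -> list[dict]:
    """Flatten production requests into dicts an analyst can load directly."""
    root = ET.fromstring(xml_text)
    requests = []
    for req in root.iter("ProductionRequest"):
        requests.append({
            "id": req.findtext("ID"),
            "material": req.findtext("Material"),
            "tonnes": float(req.findtext("Quantity")),
        })
    return requests
```

Because the interface definitions are standardized, this kind of flattening step can stay the same across sites, which is exactly the common-language benefit described above.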
These interfaces help accelerate how mining systems should communicate, both vertically and horizontally across the IT/OT solution architecture. In addition to the benefits of the standard, there are major technology advances in analytics and process control, where OPC-UA is now applied efficiently for secure data communication across domains. Earlier protocols had integration and security challenges, because they were not natively designed to solve those problems.

Is the answer to productivity directly creating one large, global, connected model of your entire value chain? The answer is "no." Analytics projects in early adopter mining companies have typically been more successful when approached locally before trying a more global approach. After performing a couple of focused PoCs, it becomes clearer how to scale and adopt models into the various functions for a more mature, automated, and global approach that considers parameters applicable to the entire value chain. Teams need to learn how to collaborate and determine what the data means before adjusting their processes. As team confidence grows, it is still wise to start with a low-fidelity approach to see the cross-value-chain effects of specific local process set point changes on the overall target functions before any heavy lifting is considered. A light "global" model with variables like production rate, grade, and perhaps residence time is a good starting point before considering the many other possible variables. With at least these parameters as a baseline, it becomes more obvious where model changes or oversimplification might be causing suboptimization.

Light control model

The light control model, therefore, is a good baseline for comparing the progress of the overall objective functions before investing more analytics dollars in larger and more complex multivariable (machine learning) models.
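A light model of this kind can be surprisingly small. The sketch below reduces each unit to just a rate and a grade (all numbers hypothetical) and checks how one local set point change moves a single global objective, metal produced per hour, which is enough to expose the kind of suboptimization described above.

```python
# Low-fidelity "light" value-chain model: each unit is only a rate (t/h)
# and a grade (%). Unit names and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class Unit:
    name: str
    rate_tph: float   # throughput, tonnes per hour
    grade_pct: float  # metal grade of output, percent

def metal_tph(units: list) -> float:
    """Global objective: total metal produced per hour across all units."""
    return sum(u.rate_tph * u.grade_pct / 100.0 for u in units)

def setpoint_effect(units, name, new_rate, new_grade):
    """Change one unit's local set point; return the global objective delta."""
    changed = [Unit(u.name, new_rate, new_grade) if u.name == name else u
               for u in units]
    return metal_tph(changed) - metal_tph(units)

units = [Unit("mill_A", 400, 2.1), Unit("mill_B", 350, 1.8)]
# Pushing mill_A harder raises local throughput but dilutes grade:
delta = setpoint_effect(units, "mill_A", new_rate=450, new_grade=1.7)
```

Here the local "win" (more tonnes through mill_A) produces a negative delta on the global objective, which is exactly the cross-value-chain effect a light model is meant to reveal before heavier investment.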
Having such a baseline is a good guide to whether you can still achieve more value by increasing the fidelity of your sub- or global models. If, after some early wins, the "value per analytics dollar invested" decreases, there is still the ultimate challenge of connecting to even larger enterprise models that go beyond specific assets. A good example of such an enterprise model is blending from multiple iron ore sites (e.g., Pilbara, Hunter Valley, and Minas Gerais) to optimize product giveaway, initially with stockpile inventory, but gradually reducing inventories and costs as the models run faster and better. This allows companies to change from traditional produce-to-stock models to just-in-time models. Such models only run well if they do not dilute the net present value and long-term strategy of your operation.
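The blending idea can be sketched in a few lines: mix ore from two stockpiles so the shipped grade just meets the contract grade, eliminating giveaway (metal shipped above what the customer pays for). The grades below are hypothetical and the two-pile closed-form solve is a deliberate simplification; a real enterprise blending model covers many piles, sites, and constraints.

```python
# Sketch: two-stockpile blending to minimize product giveaway.
# Grades (% Fe) are hypothetical; real models handle many more piles
# and constraints (moisture, impurities, logistics).

def blend_fraction(grade_high: float, grade_low: float, target: float) -> float:
    """Fraction of high-grade ore so the blended grade equals the target."""
    if not grade_low <= target <= grade_high:
        raise ValueError("target grade not reachable from these stockpiles")
    return (target - grade_low) / (grade_high - grade_low)

def giveaway_pct(shipped_grade: float, contract_grade: float) -> float:
    """Grade points shipped above the contract (paid-for) grade."""
    return max(0.0, shipped_grade - contract_grade)

# Hypothetical example: contract calls for 62.0% Fe.
x = blend_fraction(grade_high=63.5, grade_low=58.0, target=62.0)
blend_grade = x * 63.5 + (1 - x) * 58.0
```

Shipping the high-grade pile unblended would give away 1.5 grade points per tonne; the blend gives away none while also drawing down the low-grade stockpile, which is the inventory-reduction effect described above.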
