Mining Technology reached out to Xore’s CEO Mikael Normark again to discuss the outlook for improved automation in processing plants. Automation has come a long way in, for example, underground mines, but what is the situation above ground?
What is the general situation regarding automation in ore processing plants?
We have clients with very high levels of automation, but also clients that still use manual control of air valves, stirring speed and so on. I think this is quite representative of the industry as a whole: there are companies that are front runners, and there are companies that want to wait for the technology to mature before taking the next step.
One thing all our clients have in common, though, is that they have realized that the need for data is the same whether the actions based on that data are automated or not. Regardless of whether an algorithm or a human decides what to do next, they basically need the same input signals to make a good decision.
We hear buzzwords like ‘big data’ and ‘digital twin’, but what do they really mean?
I think it is a job for companies like Xore to fill the buzzwords with meaning. Sometimes these phrases make it harder, rather than easier, to understand what the essence really is.
To build a computer model of a process – a digital twin – a lot of data is needed. For ore processing, whether it is a flotation plant, a leaching plant or an electrolytic refinery, the most important data of all is the metal content at a number of process stages. What grade is the process feed, what grade is the concentrate, what grade is the material going to and from the repetition series, and what grade is the tailings?
On top of this foundation you want to add a layer of information about how the process is set up. What is the power consumption of a process stage, what is the position of a valve? All this is important too, but without knowing the recovery of a process stage one cannot know what the effort put into that stage is really worth.
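To illustrate why those assays are the foundation: under standard two-product assumptions, the recovery R of a stage can be estimated from the feed, concentrate and tailings grades (f, c and t) alone, using the well-known two-product formula:

$$R = \frac{c\,(f - t)}{f\,(c - t)} \times 100\%$$

For example, a feed at 1.0% Cu, a concentrate at 25% Cu and tailings at 0.1% Cu give a recovery of roughly 90%.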
What we really want to emphasize is that in order to move forward you need to have your data sources in place first. To build a good and accurate computer model you need data that is representative of your process. That translates to the accuracy of the analyser and the representativity of the samples going to it. The quality of your computer model is directly dependent on the quality of the data from your analyser.
It is important that the data you use for a digital twin covers a wide range of process variations and that the data points have a short sample interval. Essentially, everything you want the computer model to be able to show must be in the data you use to create it. And this can take a long time to log: if a particular condition only appears once a year or so, then you need to collect data for at least that amount of time. If you want to be really thorough you might want to register it more than once.
If you collect data points an hour apart, the computer model will at best only be able to simulate events with that rate of change, and most likely only slower changes. That is why the analysis time of your on-stream analyser is very important. Since the reaction times of some process stages are around 15 minutes, one would want two to three samples within that period to be able to simulate them with reasonable accuracy.
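Put as a rule of thumb, if the fastest process dynamics you want the model to capture play out over a period $T_{\text{reaction}}$ and you want $n$ samples within that period, the analysis cycle time must satisfy:

$$t_{\text{cycle}} \le \frac{T_{\text{reaction}}}{n}$$

With $T_{\text{reaction}} = 15$ minutes and $n = 3$, that means an analysis cycle of five minutes or less (7.5 minutes with two samples).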
But again, this is the same data you would need to be successful with manual process control. Upgrading your analyser will improve your decision-making process today, but also pave the way for future automation needs and improvements.
What benefits can a client expect on an ordinary day when they start automating their process control?
One of the main drivers for underground automation is that productivity drops radically at shift changes and during meal breaks. Even though these activities take much less time in the relative convenience of a plant above ground, they still happen two to three times every day the plant is in operation, so it makes a lot of sense to go after the easy wins first.
What we also hear from potential clients is that they can see a difference in performance between shift teams. That is quite natural: staff turnover used to be low in many places, but times have changed and people move on more often. The challenge for the companies becomes transferring knowledge and control strategies between shift teams, and the experienced people can only work so many hours.
Working 20-25 years gives a lot of experience and gut feeling for how to keep the process at its peak, but there are so many factors to consider that it is quite difficult to put into words or writing. And for a new employee, it would be completely overwhelming to try to learn all those things in a short period of time.
Collecting data gives companies insight into why some shifts have better results than others. The main benefit, though, is that the staff can browse back through the data to see what has been tried before and what the outcome was. When there is an issue, the staff can quickly find some alternatives for how to fix it, and also exclude approaches that probably will not work. This reduces the trial and error in production, and production dips will be recovered from more quickly, or not allowed to go as deep.
What benefits are there on a more strategic level?
We have some clients that mill batches of ore either from different mines or from different lenses in the same mine. They mill one ore for a few weeks, then switch to the next one for a few weeks, alternating between two and eight ores. The challenge they have is to keep recovery and productivity up through the transitions.
What they want to do is to essentially have some ready-made “recipes”, one basic plant setup for each ore. Naturally, they will know when the new ore starts coming into the mills, but the residence time in the process – or individual process stages – can vary. The on-line analysis will give the control room information about how the new ore propagates in the plant.
In the control room, they can then change the recipe for the plant, or for individual process stages, simply by selecting a pre-set from a list of alternatives. Everyone then knows that the process has been adapted to the new ore, and the team can focus on fine-tuning to reach maximum recovery.
A benefit of the Boxray on-stream analysers is that they also store pre-set calibrations tailored for up to 20 different ores. So, when the plant changes ore pre-set, the analyser can switch to the matching pre-set calibration, following the control room’s lead and making sure the quality of the data it delivers is maintained.
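To make the idea concrete, here is a minimal sketch of what such pre-set switching could look like from the control-system side. The recipe names, calibration slot numbers and the send_calibration_command() helper are hypothetical illustrations, not Boxray’s actual interface:

```python
# Hypothetical sketch: keep the analyser calibration in sync with the
# plant's ore "recipe". All names here are illustrative, not Xore's API.

# Map each ore recipe used by the control room to the matching
# pre-set calibration slot stored in the analyser (up to 20 slots).
ORE_CALIBRATIONS = {
    "mine_a_ore": 1,
    "mine_b_ore": 2,
    "lens_north_ore": 3,
}

def on_recipe_change(new_recipe, send_calibration_command):
    """When the control room selects a new ore recipe, switch the
    analyser to the calibration tailored for that ore."""
    slot = ORE_CALIBRATIONS.get(new_recipe)
    if slot is None:
        raise ValueError(f"No calibration stored for recipe {new_recipe!r}")
    send_calibration_command(slot)  # e.g. written via the plant's control system
```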
Mine sites are almost always in remote locations. In 2020 we have all become far too familiar with the problems that can occur when travel options are limited. Automation can certainly help, for example by making it possible to run the plant with remote access from a control room in a different location. In fly-in/fly-out locations, it might be difficult to arrange shift changes at certain times, and increased automation can make life easier and operations more resilient under these special circumstances.
We can access almost all of our installed base with remote desktop services, so we can give a lot of support and advice this way. We can even work on calibrations from our office and upload them; that is actually easier and faster than going out to do it on site.
How can Xore help clients move towards improved automation?
As mentioned, we have clients already working at the forefront in this area, so we have the experience to advise clients on what they need to think about to future-proof their plants. For the more advanced users, we can help with optimizing the analysis sequence or setting up calibrations to switch between.
We also have clients with little or no automation in operation, so we understand their challenges too. I think it is important for them to understand that you do not need to take on a massive project where everything is planned and executed in one go – that can seem intimidating because it requires a lot of resources. But it is never too early to start gathering data. Our analysers can integrate with any control system using industry-standard communication protocols, so integrating on-line analysis is actually quite easy. And the more data you have, the more successful your automation project is going to be whenever you are ready for it.
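As a closing illustration of how lightweight “start gathering data” can be, here is a minimal sketch of a polling loop that logs time-stamped assays to a CSV file. The read_assay() function is a hypothetical placeholder for whatever industry-standard interface (OPC UA, Modbus, etc.) the plant exposes – it is not a real Xore API:

```python
import csv
import time
from datetime import datetime, timezone

SAMPLE_INTERVAL_S = 5 * 60  # e.g. one assay every 5 minutes (see above)

def read_assay():
    """Hypothetical placeholder: fetch the latest assay from the on-stream
    analyser over the plant's control-system protocol (OPC UA, Modbus, ...)."""
    raise NotImplementedError("wire this up to the plant's control system")

def log_assays(path="assay_log.csv"):
    """Poll the analyser and append time-stamped assays to a CSV file,
    building up the history a future automation project will need."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        while True:
            assay = read_assay()  # e.g. {"stream": "feed", "Cu": 1.2, "Zn": 3.4}
            row = [datetime.now(timezone.utc).isoformat(),
                   assay.get("stream"), assay.get("Cu"), assay.get("Zn")]
            writer.writerow(row)
            f.flush()  # make each sample durable as soon as it is logged
            time.sleep(SAMPLE_INTERVAL_S)
```

Even a simple log like this becomes the raw material for comparing shift teams, building recipes and, eventually, a digital twin.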