#### Explore our team's technical articles featured in peer-reviewed journals

## Articles

Jelvez, E., Ortiz, J., Varela, N.M., Askari-Nasab, H. and Nelis, G., 2023. A Multi-Stage Methodology for Long-Term Open-Pit Mine Production Planning under Ore Grade Uncertainty. Mathematics, 11(18), p.3907.

The strategic planning of open pit operations defines the best strategy for extraction of the mineral deposit to maximize the net present value. The process of strategic planning must deal with several sources of uncertainty; therefore, many authors have proposed models to incorporate it at each of its stages: computation of the ultimate pit, optimization of pushbacks, and production scheduling. However, most works address uncertainty at each level independently, with few aiming at the whole process. In this work, we propose a methodology based on new mathematical optimization models and the application of conditional simulation of the deposit to address geological uncertainty at all stages. We test the method in a real case study and evaluate whether incorporating uncertainty increases the quality of the solutions. Moreover, we benefit from our integrated framework to evaluate the relative impact of uncertainty at each stage. This could be used by decision-makers as a guide for detecting risks and focusing efforts.

Riquelme, Á.I. and Ortiz, J.M., 2023. A Riemannian tool for clustering of geo-spatial multivariate data. Mathematical Geosciences, pp.1-21.

Geological modeling is essential for the characterization of natural phenomena and can be done in two steps: (1) clustering the data into consistent groups and (2) modeling the extent of these groups in space to define domains, honoring the labels defined in the previous step. The clustering step can be based on the information of continuous multivariate data in space instead of relying on the geological logging provided. However, extracting coherent spatial multivariate information is challenging when the variables show complex relationships, such as nonlinear correlation, heteroscedastic behavior, or spatial trends. In this work, we propose a method for clustering data, valid for domaining when multiple continuous variables are available and robust enough to deal with cases where complex relationships are found. The method looks at the local correlation matrix between variables at sample locations inferred in a local neighborhood. Changes in the local correlation between these attributes in space can be used to characterize the domains. By endowing the space of correlation matrices with a manifold structure, matrices are then clustered by adapting the K-means algorithm to this manifold context, using Riemannian geometry tools. A real case study illustrates the methodology. This example demonstrates how the proposed clustering methodology honors the spatial configuration of data, delivering spatially connected clusters even when complex nonlinear relationships are present in the attribute space.
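The core idea, clustering sample-wise correlation matrices with a manifold-aware K-means, can be illustrated with a simplified sketch. The snippet below uses log-Euclidean geometry (a common, cheaper alternative to other Riemannian metrics on symmetric positive-definite matrices; the paper's exact metric and local-neighborhood inference are not reproduced here), and the function names are illustrative.

```python
import numpy as np
from scipy.linalg import logm

def spd_kmeans(mats, k, iters=20):
    # Map SPD (correlation) matrices to the log domain: log-Euclidean geometry
    logs = [logm(M).real for M in mats]
    d = lambda A, B: np.linalg.norm(A - B, 'fro')
    # Deterministic farthest-point initialization
    centers = [logs[0]]
    while len(centers) < k:
        centers.append(max(logs, key=lambda L: min(d(L, C) for C in centers)))
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: d(L, centers[j])) for L in logs]
        # Frechet mean in the log domain is the ordinary average
        centers = [np.mean([L for L, g in zip(logs, labels) if g == j], axis=0)
                   if any(g == j for g in labels) else centers[j]
                   for j in range(k)]
    return labels
```

Matrices whose correlation structure differs strongly end up in different clusters, which is the property the domaining step relies on.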

Avalos, S., Ortiz, J.M. and Srivastava, R.M., 2023. Geostatistics Toronto 2021. Mathematical Geosciences, pp.1-2.

The International Geostatistics Congress has become a landmark for the geostatistics community to present the best work and most innovative research. The Congress is held once every four years.

The Eleventh Congress was originally planned for the summer of 2020 in Toronto, but the COVID-19 pandemic hit and, as with many other things, plans were completely derailed because of the impossibility to meet in person. After much discussion by the Geostatistics Executive Committee that supports the organization of the Congress, and given the many unknowns at the time, it was decided to postpone the Congress to the summer of 2021, hence rebranding it to Toronto 2021, as a reminder of the odd circumstances that we were faced with.

The Congress was held fully online, but with an original format, where all presentations were made available one week in advance, and the actual meeting time was used for brief summaries of each presentation and a Q&A session with extensive discussions about the topics. Despite being online, the discussions were lively and interesting, which again made this a special occasion. As always, presentations were split into sessions: Theory, Petroleum, Mining, Earth Science, and Domains. Authors were invited to submit a paper for a special issue on the conference in Mathematical Geosciences. These papers went through the rigorous review process that the journal has for all its papers. The special issue includes five articles on diverse topics, each of which is briefly discussed next.

Avalos, S. and Ortiz, J.M., 2023. Spatial Multivariate Morphing Transformation. Mathematical Geosciences, pp.1-37.

Earth science phenomena, in particular mineralization of ore deposits, can be characterized by the spatial and statistical features of multivariate information. The relationships among these variables are often complex, encountering non-linear features, compositional constraints, and heteroscedasticity. Capturing and reproducing their statistical and spatial distributions is essential for uncertainty management, allowing for better decision-making and process control. In this work, we present a novel spatial multivariate morphing transformation that maps the initial multivariate space into a spatially and statistically decorrelated multi-Gaussian space. The spatial structures of the Gaussian random variables are modeled independently, and values are simulated at unsampled locations using a conventional univariate geostatistical simulation algorithm. Multivariate features and relationships are reintroduced by mapping from the multi-Gaussian distribution into the initial space. The spaces are paired following the fundamentals of point cloud morphing using discrete optimal transport to minimize the distance between landmark points between spaces. New simulated values are mapped from the anchored multi-Gaussian space into the multivariate space via thin-plate spline interpolation conditioned to the k-spatially known closest samples. The effectiveness of the method is demonstrated in a 6-dimensional dataset with strong non-linear relationships and spatial continuity. The resulting multivariate statistical and spatial metrics have been compared with simulations obtained by projection pursuit multivariate transformation.
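The landmark-pairing step, a discrete optimal transport between point clouds, can be sketched as a one-to-one assignment that minimizes total squared distance (here solved with the Hungarian algorithm; the full morphing pipeline and the thin-plate spline mapping are omitted, and the function name is illustrative).

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def pair_landmarks(src, dst):
    # Cost matrix: pairwise squared Euclidean distances between the two clouds
    cost = cdist(src, dst, 'sqeuclidean')
    # Optimal one-to-one transport plan (Hungarian algorithm)
    rows, cols = linear_sum_assignment(cost)
    return cols  # dst index paired with each src point
```

For a translated copy of a cloud, the optimal pairing recovers the point-to-point correspondence exactly, which is what anchors the two spaces together.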

Avalos, S. and Ortiz, J.M., 2023. Multivariate Geostatistical Simulation and Deep Q-Learning to Optimize Mining Decisions. Mathematical Geosciences, pp.1-20.

In open pit mines, long-term scheduling defines how the mine should be developed. Uncertainties in geological attributes make the search for an optimal schedule a challenging problem. In this work, we provide a framework to account for uncertainties in the spatial distribution of grades in long-term mine planning using deep Q-Learning. Mining, processing and metallurgical constraints are accounted for as restrictions in the reinforcement learning environment. Such an environment provides a flexible structure to incorporate geometallurgical properties in production scheduling, as part of the block model. Geometric constraints (block precedence) and operational restrictions have been included as part of the agent-environment interaction. The effectiveness of the method is demonstrated in a controlled case study using a real multivariate drill-hole dataset, maximizing the net present value of the project. The present framework can be extended and improved to meet the particular needs and requirements of mining operations. We discuss the current limitations and potential for further research and applications.
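A stripped-down tabular analogue conveys the reinforcement-learning framing: a toy agent decides how deep to mine a single column of blocks, with precedence enforced by the state (blocks are mined top-down) and reward equal to the discounted block value. This is an illustrative sketch only; the paper trains deep Q-networks over a full block model, not this tabular toy.

```python
import numpy as np

def q_learn_depth(values, discount=0.9, episodes=5000, eps=0.3, lr=0.2, seed=0):
    # State s = number of blocks already mined (enforces top-down precedence).
    # Actions: 0 = stop mining, 1 = mine the next block in the column.
    n = len(values)
    Q = np.zeros((n + 1, 2))
    rng = np.random.default_rng(seed)
    for _ in range(episodes):
        s = 0
        while s < n:
            a = rng.integers(2) if rng.random() < eps else int(np.argmax(Q[s]))
            if a == 0:
                Q[s, 0] += lr * (0.0 - Q[s, 0])   # stopping ends the episode
                break
            r = values[s] * discount ** s          # discounted block value (toy NPV)
            Q[s, 1] += lr * (r + np.max(Q[s + 1]) - Q[s, 1])
            s += 1
    # Greedy policy after training: mine while action 1 is preferred
    s = 0
    while s < n and np.argmax(Q[s]) == 1:
        s += 1
    return s
```

With block values [-1, 3, -2, 5] and a 0.9 discount, the cumulative discounted value is maximized by mining the whole column, and the trained greedy policy does exactly that.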

Utili, S., Agosti, A., Morales, N., Valderrama, C., Pell, R. and Albornoz, G., 2022. Optimal pitwall shapes to increase financial return and decrease carbon footprint of open pit mines. Mining, Metallurgy & Exploration, 39(2), pp.335-355.

The steepness of the slopes of an open pit mine has a substantial influence on the financial return of the mine. The paper proposes a novel design methodology where overall steeper pitwalls are employed without compromising the safety of the mine. In current design practice, pitwall profiles are often planar in cross-section within each rock layer; i.e., the profile inclination across each layer tends to be constant. Here instead, a new geotechnical software, OptimalSlope, is employed to determine optimal pitwall profiles of depth-varying inclination. OptimalSlope seeks the solution of a mathematical optimization problem where the overall steepness of the pitwall, from crest to toe, is maximized for an assigned lithology, geotechnical properties, and factor of safety (FoS). Bench geometries (bench height, face inclination, minimum berm width) are imposed in the optimization as constraints which bind the maximum local inclination of the sought optimal profile, together with any other constraints, such as geological discontinuities, that may influence slope failure. The obtained optimal profiles are always steeper than their planar counterparts (i.e., the planar profiles exhibiting the same FoS), by up to 8° depending on rock type and severity of constraints on local inclinations. The design of a copper mine is first carried out employing planar pitwalls, and then repeated adopting the optimal pitwall profiles determined by OptimalSlope. The adoption of optimal slope profiles leads to a 34% higher net present value and reductions of carbon footprint and energy consumption of 0.17 Mt CO2 eq and 82.5 million MJ, respectively, due to a 15% reduction of rock waste volume.

Moraga, C., Kracht, W. and Ortiz, J.M., 2022. Process simulation to determine blending and residence time distribution in mineral processing plants. Minerals Engineering, 187, p.107807.

Mineral processing plant performance depends on multiple factors, including the feed and the parameters to control the process. In this work, we show how to assess plant performance using geometallurgical modeling and dynamic simulation. Several models that describe comminution, classification, flotation, and residence time distribution (RTD) are implemented as modules and then connected to represent generic plant configurations. The estimation of the RTD is used to assess the ore blending generated within the plant through a methodology based on weighting the ore contribution at the plant discharge. Additionally, the RTD is used to display the ore permanence at different plant stages, which can be used as an operational input to anticipate the consequences of a perturbation in the feed. Different simulation scenarios are tested using synthetic data, including different plant configurations, time support for blending assessment, and ore feeding sequence. The results show that the simulation is sensitive to these attributes. Significant differences are detected in the generated product compositions when the plant configuration is changed. Also, distinct mine plans can be evaluated through simulation, predicting their processing performance. Therefore, the simulation tool developed can be used to evaluate real mineral processing operations and to test different operative strategies.
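A minimal version of the RTD-based blending idea: represent the plant as N perfectly mixed tanks in series, discretize the resulting residence time distribution, and convolve it with the feed grade sequence to estimate the blended discharge grade. This is a textbook tanks-in-series sketch, not the paper's full modular simulator; the function names are illustrative.

```python
import numpy as np
from math import factorial

def tanks_in_series_rtd(t, n_tanks, tau_total):
    # E(t) for n equal perfectly mixed tanks in series; tau = residence time per tank
    tau = tau_total / n_tanks
    return t ** (n_tanks - 1) * np.exp(-t / tau) / (factorial(n_tanks - 1) * tau ** n_tanks)

def discharge_grade(feed, dt, n_tanks, tau_total):
    # Blended discharge = feed grade sequence convolved with the discretized RTD
    t = np.arange(0.0, 5.0 * tau_total, dt)
    w = tanks_in_series_rtd(t, n_tanks, tau_total)
    w = w / w.sum()                       # discrete RTD weights sum to 1
    return np.convolve(feed, w)[: len(feed)]
```

A step change in feed grade then appears at the discharge smeared over the residence time, which is precisely the blending effect the paper quantifies.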

Jelvez, E., Morales, N. and Ortiz, J.M., 2021. Stochastic final pit limits: an efficient frontier analysis under geological uncertainty in the open-pit mining industry. Mathematics, 10(1), p.100.

In the context of planning the exploitation of an open-pit mine, the final pit limit problem consists of finding the volume to be extracted so that it maximizes the total profit of exploitation subject to overall slope angles to keep pit walls stable. To address this problem, the ore deposit is discretized as a block model, and efficient algorithms are used to find the optimal final pit. However, this methodology assumes a deterministic scenario, i.e., it does not consider that information, such as ore grades, is subject to several sources of uncertainty. This paper presents a model based on stochastic programming, seeking a balance between conflicting objectives: on the one hand, it maximizes the expected value of the open-pit mining business and simultaneously minimizes the risk of losses, measured as conditional value at risk, associated with the uncertainty in the estimation of the mineral content found in the deposit, which is characterized by a set of conditional simulations. This allows generating a set of optimal solutions in the expected return vs. risk space, forming the Pareto front or efficient frontier of final pit alternatives under geological uncertainty. In addition, some criteria are proposed that can be used by the decision maker of the mining company to choose the final pit that best fits the return/risk trade-off according to its objectives. This methodology was applied to a real case study and compared with other proposals in the literature. The results show that our proposal better controls the risk of economic losses without sacrificing high expected profit.
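The expected-return versus risk trade-off can be sketched as follows: given a matrix of simulated profits for a set of candidate final pits, score each pit by a weighted combination of expected profit and CVaR of losses, and sweep the weight to trace the frontier. This is a simplified post-hoc sketch; the paper solves a stochastic program over pit contents rather than scanning a fixed candidate list, and the names are illustrative.

```python
import numpy as np

def cvar(losses, alpha=0.9):
    # Conditional value-at-risk: mean of the worst (1 - alpha) tail of losses
    q = np.quantile(losses, alpha)
    return losses[losses >= q].mean()

def efficient_frontier(profits, alpha=0.9, n_weights=11):
    # profits: (n_scenarios, n_pits), one column per candidate final pit
    expected = profits.mean(axis=0)
    risk = np.array([cvar(-profits[:, j], alpha) for j in range(profits.shape[1])])
    frontier = set()
    for lam in np.linspace(0.0, 1.0, n_weights):
        # maximize (1 - lam) * expected profit - lam * CVaR of losses
        frontier.add(int(np.argmax((1.0 - lam) * expected - lam * risk)))
    return expected, risk, sorted(frontier)
```

A high-mean, high-variance pit dominates at low risk aversion, while a safer pit takes over as the weight on CVaR grows, which is how the frontier of alternatives arises.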

Jélvez, E., Morales, N. and Askari-Nasab, H., 2020. A new model for automated pushback selection. Computers & Operations Research, 115, p.104456.

The design of pushbacks is essential to long-term open pit mine scheduling because it partitions the pit space into individual units, controlling ore and waste production. In this paper, a new model is proposed for the pushback selection procedure, which consists of characterizing the potential pushbacks based on the comprehensive family of nested pits and selecting those that meet a set of criteria, for instance, bounded ore and waste. An advantage of this method is the possibility to automate the pushback selection methodology, applying well-defined criteria for the selection and reducing the time employed in the planning task.
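The selection idea can be illustrated with a toy greedy rule over the cumulative tonnages of a family of nested pits: skip an intermediate pit whenever the next pit would still respect a bound on the tonnage increment, otherwise keep it, and always keep the final pit. This is a one-criterion caricature of the paper's model, which handles richer criteria such as separate ore and waste bounds.

```python
def select_pushbacks(pit_tonnages, max_increment):
    # pit_tonnages: cumulative tonnage of each nested pit, in increasing order
    selected, last = [], 0.0
    for i, t in enumerate(pit_tonnages):
        nxt = pit_tonnages[i + 1] if i + 1 < len(pit_tonnages) else None
        # keep pit i if it is the final pit, or if skipping it would push the
        # next tonnage increment over the bound
        if nxt is None or nxt - last > max_increment:
            selected.append(i)
            last = t
    return selected
```

Every increment between consecutive selected pits stays within the bound, so each selected pit can serve as one pushback of controlled size.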

Avalos, S., Kracht, W. and Ortiz, J.M., 2020. An LSTM approach for SAG mill operational relative-hardness prediction. Minerals, 10(9), p.734.

Ore hardness plays a critical role in comminution circuits. Ore hardness is usually characterized at sample support in order to populate geometallurgical block models. However, the required attributes are not always available and suffer from a lack of temporal resolution. We propose an operational relative-hardness definition and the use of real-time operational data to train a Long Short-Term Memory (LSTM) network, a deep neural network architecture, to forecast the upcoming operational relative-hardness. We applied the proposed methodology on two SAG mill datasets, each covering a one-year period. Results show accuracies above 80% on both SAG mills over short upcoming time periods and around 1% of misclassifications between soft and hard characterization. The proposed application can be extended to any crushing and grinding equipment to forecast categorical attributes that are relevant to downstream processes.
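The building block behind the forecaster, a single LSTM step, can be written out in a few lines of NumPy. This shows only the standard gate equations; it is not the paper's trained network, and the weight shapes are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    # One LSTM time step; gates stacked as [input, forget, candidate, output]
    n = h.size
    z = W @ x + U @ h + b
    i, f = sigmoid(z[:n]), sigmoid(z[n:2 * n])
    g, o = np.tanh(z[2 * n:3 * n]), sigmoid(z[3 * n:])
    c_new = f * c + i * g          # update the cell (long-term) state
    h_new = o * np.tanh(c_new)     # gated hidden state used for prediction
    return h_new, c_new
```

Running this step over a sequence of mill sensor readings (power draw, throughput, and so on) yields the hidden state from which a soft/hard class can be predicted by a final linear layer.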

Avalos, S. and Ortiz, J.M., 2020. Recursive convolutional neural networks in a multiple-point statistics framework. Computers & Geosciences, 141, p.104522.

This work proposes a new technique for multiple-point statistics simulation based on a recursive convolutional neural network approach coined RCNN. The work focuses on methodology and implementation rather than performance to demonstrate the potential of deep learning techniques in geosciences. Two- and three-dimensional case studies are carried out. A sensitivity analysis is presented over the main RCNN structural parameters using a well-known training image of channel structures in two dimensions. The optimum parameters found are applied to image reconstruction problems using two other training images. A three-dimensional case is shown using a synthetic lithological surface-based model. The quality of realizations is measured by statistical, spatial and accuracy metrics. The RCNN method is compared to standard MPS techniques, and an improved framework is proposed by using the RCNN E-type as secondary information. Strengths and weaknesses of the methodology are discussed by reviewing the theoretical and practical aspects.

Peredo, O.F., Baeza, D., Ortiz, J.M. and Herrero, J.R., 2018. A path-level exact parallelization strategy for sequential simulation. Computers & Geosciences, 110, pp.10-22.

Sequential simulation is a well-known method in geostatistical modelling. Following the Bayesian approach for simulation of conditionally dependent random events, the Sequential Indicator Simulation (SIS) method draws simulated values for K categories (categorical case) or classes defined by K different thresholds (continuous case). Similarly, the Sequential Gaussian Simulation (SGS) method draws simulated values from a multivariate Gaussian field. In this work, a path-level approach to parallelize SIS and SGS methods is presented. A first stage of re-arrangement of the simulation path is performed, followed by a second stage of parallel simulation for non-conflicting nodes. A key advantage of the proposed parallelization method is that it generates identical realizations as the original non-parallelized methods. Case studies are presented using two sequential simulation codes from GSLIB: SISIM and SGSIM. Execution time and speedup results are shown for large-scale domains, with many categories and maximum kriging neighbours in each case, achieving high speedup results in the best scenarios using 16 execution threads on a single machine.
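The re-arrangement idea can be sketched in one dimension: walk the simulation path in order and grow a batch while every new node's search neighborhood stays disjoint from those already in the batch. Nodes in a batch can then be simulated concurrently without changing the sequential result, and batches are processed in order. This is a simplified 1-D sketch of the conflict rule only, not the GSLIB implementation.

```python
def batch_path(path, radius):
    # Two nodes conflict when their search neighborhoods of the given radius
    # overlap (in 1-D: centers closer than 2 * radius)
    batches, current = [], []
    for node in path:
        if all(abs(node - other) > 2 * radius for other in current):
            current.append(node)          # no conflict: can run in parallel
        else:
            batches.append(current)       # conflict: close batch, start a new one
            current = [node]
    if current:
        batches.append(current)
    return batches
```

Because concatenating the batches reproduces the original path order, simulating batch by batch (with per-node random numbers fixed in advance) yields the same realization as the sequential code.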

Nelis, S.G., Ortiz, J.M. and Morales, V.N., 2018. Antithetic random fields applied to mine planning under uncertainty. Computers & Geosciences, 121, pp.23-29.

Traditional practice in mine planning often relies on estimation techniques that fail to account for the intrinsic uncertainty of geology and grades, which may have significant consequences in the mine operation. Dealing with this uncertainty has been a major topic in recent years, and different algorithms and stochastic optimization models have been proposed to tackle this issue. However, the increasing complexity of these stochastic models and the use of several simulations to represent the deposit variability impose a computational challenge in terms of solution times, making them difficult to apply to large datasets or complex mining operations. In this paper we explore the antithetic random fields approach as a variance reduction technique, to solve a stochastic short-term mine planning problem, aiming to reduce the number of simulations required to obtain a reliable NPV value. The reliability of the result is measured by the variance of the NPV when the problem is optimized with different sets of realizations. Our results show that this technique produces a significant variance reduction in the inference of the expected NPV value in the stochastic problem for a copper deposit application, generating a lower dispersion with a smaller sample size, compared to traditional simulation techniques.
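The variance-reduction mechanism can be demonstrated with a scalar Monte Carlo analogue: averaging a monotone function over antithetic pairs (z, -z) induces negative covariance between the two halves and reduces the estimator variance relative to independent pairs. This is a toy scalar demo; the paper applies the idea to full random fields inside a stochastic mine planning model.

```python
import numpy as np

def pair_estimates(g, n_pairs, n_reps, antithetic, seed=0):
    # Repeated Monte Carlo estimates of E[g(Z)], Z ~ N(0, 1), using paired samples
    rng = np.random.default_rng(seed)
    out = np.empty(n_reps)
    for r in range(n_reps):
        z = rng.standard_normal(n_pairs)
        z2 = -z if antithetic else rng.standard_normal(n_pairs)  # reuse -z
        out[r] = (0.5 * (g(z) + g(z2))).mean()
    return out
```

For a monotone g such as the exponential, the antithetic estimator has visibly smaller dispersion across repetitions at the same sample size, which is exactly the effect exploited to cut the number of simulations.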

Baeza, D., Ihle, C.F. and Ortiz, J.M., 2017. A comparison between ACO and Dijkstra algorithms for optimal ore concentrate pipeline routing. Journal of Cleaner Production, 144, pp.149-160.

One of the important aspects pertaining to the mining industry is the use of territory. This is especially important when part of the operations are meant to cross regions outside the boundaries of mines or processing plants. In Chile and other countries there are many long distance pipelines (carrying water, ore concentrate or tailings), connecting locations dozens of kilometers apart. In this paper, the focus is placed on a methodological comparison between two different implementations of the lowest cost route for this kind of system. One is Ant Colony Optimization (ACO), a metaheuristic approach belonging to the particle swarm family of algorithms, and the other one is the widely used Dijkstra method. Although both methods converge to solutions in reasonable time, ACO can yield slightly suboptimal paths; however, it offers the potential to find good solutions to some problems that might be prohibitive using the Dijkstra approach in cases where the cost function must be dynamically calculated. The two optimization approaches are compared in terms of their computational cost and accuracy in a routing problem including costs for the length and local slopes of the route. In particular, penalizing routes with either steep slopes in the direction of the trajectory or high cross-slopes yields optimal routes that depart from traditional shortest path solutions. The accuracy of using ACO in this kind of setting, compared to Dijkstra, is discussed.
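The Dijkstra side of the comparison is easy to sketch: build a graph whose edge costs penalize both length and slope, then run the standard shortest-path algorithm. The quadratic slope penalty below is an illustrative choice, not the paper's cost function.

```python
import heapq

def edge_cost(length, slope, k=20.0):
    # Illustrative cost: length inflated by a quadratic slope penalty
    return length * (1.0 + k * slope ** 2)

def dijkstra(graph, start, goal):
    # graph: {node: [(neighbor, cost), ...]} with slope-penalized edge costs
    dist, pq, done = {start: 0.0}, [(0.0, start)], set()
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            return d
        if u in done:
            continue
        done.add(u)
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float('inf')
```

With such a penalty, a longer but flatter detour can beat a short steep edge, which is how the optimized routes end up departing from the plain shortest path.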

Lobos, R., Silva, J.F., Ortiz, J.M., Díaz, G. and Egaña, A., 2016. Analysis and classification of natural rock textures based on new transform-based features. Mathematical Geosciences, 48, pp.835-870.

This work develops a mathematical method to extract relevant information about natural rock textures to address the problem of automatic classification. Classical methods of texture analysis cannot be directly applied in this context, since rock textures are typically characterized by both stationary patterns (a classic kind of texture) and geometric forms, which are not properly captured with conventional methods. Due to the presence of these two phenomena, a new classification approach is proposed in which each rock texture class is individually analyzed developing a specific low-dimensional discriminative feature. For this task, multi-scale transform domain representations are adopted, allowing the analysis of the images at several levels of scale and orientation. The proposed method is applied to a database of digital photographs acquired in a porphyry copper mining project, showing better performance than state-of-the-art techniques, and additionally presenting a low computational cost.
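The flavor of multi-scale, orientation-sensitive features can be sketched with directional Gaussian derivative energies at two scales. The paper's actual transform-based features are richer than this, and the function name is illustrative.

```python
import numpy as np
from scipy import ndimage

def orientation_features(img):
    # Energy of directional derivatives at two scales: a crude
    # scale-and-orientation texture descriptor
    feats = []
    for sigma in (1.0, 2.0):
        gx = ndimage.gaussian_filter(img, sigma, order=(0, 1))  # derivative along columns
        gy = ndimage.gaussian_filter(img, sigma, order=(1, 0))  # derivative along rows
        feats += [float(np.mean(gx ** 2)), float(np.mean(gy ** 2))]
    return np.array(feats)
```

Textures dominated by different orientations separate cleanly in this feature space, which is the property a per-class discriminative feature builds on.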

Peredo, O., Ortiz, J.M. and Leuangthong, O., 2016. Inverse modeling of moving average isotropic kernels for non-parametric three-dimensional gaussian simulation. Mathematical Geosciences, 48(5), pp.559-579.

Moving average simulation can be summarized as a convolution between a spatial kernel and a white noise random field. The kernel can be calculated once the variogram model is known. An inverse approach to moving average simulation is proposed, where the kernel is determined based on the experimental variogram map in a non-parametric way, thus no explicit variogram modeling is required. The omission of structural modeling in the simulation workflow may be particularly attractive if spatial inference is challenging and/or practitioners lack confidence in this task. A non-linear inverse problem is formulated in order to solve the problem of discrete kernel weight estimation. The objective function is the squared Euclidean distance between experimental variogram values and the convolution of a stationary random field with Dirac covariance and the simulated kernel. The isotropic property of the kernel weights is imposed as a linear constraint in the problem, together with lower and upper bounds for the weight values. Implementation details and examples are presented to demonstrate the performance and potential extensions of this method.
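The forward model being inverted is simple to state: convolve white noise with a kernel, and the resulting field has covariance equal to the kernel's autocorrelation. Below is a 1-D forward sketch; the paper solves the inverse problem of recovering the kernel from an experimental variogram map in three dimensions.

```python
import numpy as np

def moving_average_sim(n, kernel, seed=0):
    # Moving average simulation: white noise convolved with a spatial kernel
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(n + len(kernel) - 1)
    return np.convolve(noise, kernel, mode='valid')   # n correlated values
```

The covariance at lag h equals the kernel autocorrelation sum over k_i * k_(i+h), so the variogram of the simulated field is controlled entirely by the kernel weights.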

Peredo, O., Ortiz, J.M. and Herrero, J.R., 2015. Acceleration of the Geostatistical Software Library (GSLIB) by code optimization and hybrid parallel programming. Computers & Geosciences, 85, pp.210-233.

The Geostatistical Software Library (GSLIB) has been used in the geostatistical community for more than thirty years. It was designed as a bundle of sequential Fortran codes, and today it is still in use by many practitioners and researchers. Despite its widespread use, few attempts have been reported in order to bring this package to the multi-core era. Using all CPU resources, GSLIB algorithms can handle large datasets and grids, where tasks are compute- and memory-intensive applications. In this work, a methodology is presented to accelerate GSLIB applications using code optimization and hybrid parallel processing, specifically for compute-intensive applications. Minimal code modifications are added, decreasing the elapsed execution time of the studied routines as much as possible. If multi-core processing is available, the user can activate OpenMP directives to speed up the execution using all resources of the CPU. If multi-node processing is available, the execution is enhanced using MPI messages between the compute nodes. Four case studies are presented: experimental variogram calculation, kriging estimation, sequential Gaussian and indicator simulation. For each application, three scenarios (small, large and extra large) are tested using a desktop environment with 4 CPU-cores and a multi-node server with 128 CPU-nodes. Elapsed times, speedup and efficiency results are shown.