Sequential Gaussian Simulation Flashcards
State the steps of SGS
- Normalise data (Normal Score Transformation)
- Compute and model the variogram, covariance, or correlogram of the normalised data
- Define a random path that goes through each node of the grid representing the deposit
- Krige the normalised value at the selected node using both actual and simulated data to estimate the mean and variance of the normal local conditional distribution
- Simulate a value by randomly sampling the estimated normal local conditional distribution (via a uniform random number in ]0,1[)
- Add the new simulated value to the conditioning data set and move to the next grid node (Make every subsequent estimate dependent on initial and previously generated results)
- Repeat the process until all nodes are simulated
- Back transform the simulated values and validate the results
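A minimal 1-D sketch of this loop in Python, assuming hypothetical data, an exponential covariance model, simple kriging with a zero mean in Gaussian space, and a back-transform by table interpolation (none of these choices are dictated by the steps above):

```python
# Minimal 1-D SGS sketch (illustrative only; data and model are hypothetical).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

# Conditioning data: locations along a line and raw grades.
data_x = np.array([2.0, 5.0, 9.0, 14.0])
data_z = np.array([0.8, 1.6, 0.5, 2.3])

# 1) Normal score transform: rank -> standard normal quantile.
ranks = data_z.argsort().argsort() + 1
p = (ranks - 0.5) / len(data_z)
data_y = norm.ppf(p)
zs, ys = np.sort(data_z), np.sort(data_y)      # kept for the back-transform

# 2) Covariance model of the normal scores (unit sill, practical range a).
def cov(h, a=5.0):
    return np.exp(-3.0 * np.abs(h) / a)

# 3) Random path through the grid nodes.
grid_x = np.arange(0.5, 16.0, 1.0)
path = rng.permutation(len(grid_x))

cond_x, cond_y = list(data_x), list(data_y)
sim_y = np.full(len(grid_x), np.nan)

for node in path:
    x0 = grid_x[node]
    cx, cy = np.array(cond_x), np.array(cond_y)
    # 4) Simple kriging of the normal score at x0 from the original data
    #    plus previously simulated nodes.
    C = cov(cx[:, None] - cx[None, :]) + 1e-8 * np.eye(len(cx))
    c0 = cov(cx - x0)
    w = np.linalg.solve(C, c0)
    mean = w @ cy                               # SK estimate (global mean 0 in Gaussian space)
    var = max(1.0 - float(w @ c0), 1e-6)        # SK variance
    # 5) Sample the local conditional distribution via a uniform random number.
    y_sim = norm.ppf(rng.uniform(), loc=mean, scale=np.sqrt(var))
    # 6) Add the simulated value to the conditioning set and move on.
    sim_y[node] = y_sim
    cond_x.append(x0)
    cond_y.append(y_sim)

# 7) Back-transform the simulated normal scores to grade units.
sim_z = np.interp(sim_y, ys, zs)
print(np.round(sim_z, 2))
```

In practice the conditioning set is not allowed to grow without bound; a local search neighborhood is used instead (see the screen-effect approximation below).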
Case study: Mine is looking to convert ______ to __________ to reduce costs by $0.04/ton. However, increased _____________ will result in __________.
Mine is looking to convert 20ft benches to 25ft benches to reduce costs by $0.04/ton. However, increased bench height will result in increased dilution.
What is the goal of the case study in this section? What are the variables?
Determine whether it is worthwhile to increase the bench height. Variables: bench height and blasthole spacing/sampling
Increased bench height combined with _________________ may show the same dilution effects as the current bench height with a _________________.
Increased bench height combined with a denser blasthole spacing may show the same dilution effects as the current bench height with a sparser blasthole spacing.
What are some considerations for the case study about optimizing mining parameters?
- Define parameters to test for maximum profit; e.g. bench height 20 ft vs 25 ft and blasthole spacing 15 ft vs 18 ft
- Find a way to minimize misclassification (grade control) and consider related economics
What are possible ore destinations for the case study about optimizing mining parameters?
- Oxide Leach
- Oxide Mill
- Refractory Mill
- Waste
What steps did they take for the case study about optimizing mining parameters?
- Conditionally simulate several images of grade and material types (oxide and refractory)
- Sample the simulated images with the selected combinations of bench height and blasthole spacing
- Do grade control for each sampling (bench height and blasthole spacing)
- Add the related costs and evaluate the expected misclassification and dilution costs
- Decide
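A heavily simplified sketch of that loop, assuming only two destinations (ore vs waste), a hypothetical cut-off and misclassification cost, and an independent-lognormal stand-in for the simulated image (the study itself used four destinations and conditionally simulated grades and material types):

```python
# Simplified evaluation loop: sample a simulated image at a given blasthole
# spacing, classify blocks from the samples, compare with the "true" block
# grades, and tally a misclassification cost. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for one conditionally simulated image: point grades on a 5 ft grid.
nx, ny = 200, 200
true_grades = rng.lognormal(mean=0.0, sigma=0.6, size=(nx, ny))

cutoff = 1.0          # hypothetical ore/waste cut-off grade
misclass_cost = 5.0   # hypothetical $/ton penalty for a wrongly routed block

def expected_misclass_cost(spacing_cells, block_cells):
    """Grade control on one image for one (blasthole spacing, block size) pair."""
    cost, n_blocks = 0.0, 0
    for i in range(0, nx - block_cells + 1, block_cells):
        for j in range(0, ny - block_cells + 1, block_cells):
            block = true_grades[i:i + block_cells, j:j + block_cells]
            true_block = block.mean()
            # Blasthole "samples": a coarse sub-grid of the block at the chosen spacing.
            est_block = block[::spacing_cells, ::spacing_cells].mean()
            if (true_block >= cutoff) != (est_block >= cutoff):
                cost += misclass_cost        # block sent to the wrong destination
            n_blocks += 1
    return cost / n_blocks

# Compare scenarios, e.g. 20 ft blocks / 20 ft spacing vs 25 ft blocks / 15 ft spacing.
for spacing, block in [(4, 4), (3, 5)]:
    print(f"spacing {spacing * 5} ft, block {block * 5} ft: "
          f"{expected_misclass_cost(spacing, block):.3f} $/ton")
```

In the actual workflow this would be repeated over several simulated images and for each candidate combination, and the expected costs compared against the operational savings.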
What is a notable challenge for the case study about optimizing mining parameters? What is the solution?
- All images are generated on a dense grid with a 5 by 5 by 5 ft resolution
- The available blasthole data represent samples of a 20 ft length
Solution: To use them as conditioning data in the simulations they need to be ‘de-regularized’ to represent 5 ft composites
What is de-regularization?
It is the reverse of the typical procedure of compositing from exploration data and amounts to splitting each 20 ft length into four 5 ft samples with the same mean as the original sample and a variance equal to the variance difference between 20 ft and 5 ft composites. The latter variance is derived from geostatistical charts.
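A minimal sketch of the splitting step, assuming the extra spread is supplied as a known target variance (read from charts in practice) and that centred, rescaled normal deviates are an acceptable way to distribute the four sub-samples around the composite mean:

```python
# Split one 20 ft blasthole composite into four 5 ft sub-samples that reproduce
# the composite mean exactly and spread around it with a prescribed variance.
# The target variance below is hypothetical.
import numpy as np

rng = np.random.default_rng(1)

def deregularize(composite_grade, target_var, n_sub=4):
    # Draw normal deviates, then centre and rescale them so the n_sub values
    # have exactly the composite mean and exactly the target (population) variance.
    dev = rng.standard_normal(n_sub)
    dev -= dev.mean()
    dev *= np.sqrt(target_var) / dev.std()
    return composite_grade + dev

subs = deregularize(composite_grade=1.2, target_var=0.15)
print(np.round(subs, 3), "| mean:", round(subs.mean(), 3), "| var:", round(subs.var(), 3))
```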
Comment on the results and conclude the case study about optimizing mining parameters.
The change from 20 ft to 25 ft benches at 16 ft blasthole spacing has an expected dilution cost of $0.07 per ton, while the operational savings were estimated at $0.04 per ton. Conclusion: don’t do it; you would be losing $0.03 per ton.
How to SGS without samples
Knowing that the distribution of possible grades at x0 is normal with a mean of 5% and a standard deviation of 2%:
1. Take that distribution (grey curve: mean 5, std. dev. 2) and form its cumulative frequency plot.
2. Draw a random number, say 0.74, and read the corresponding grade value on the cumulative frequency plot.
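The same two steps in code, assuming SciPy's norm.ppf as the inverse of the cumulative frequency plot:

```python
# Drawing one simulated value at x0 when the local distribution is known to be
# normal with mean 5% and standard deviation 2% (inverse-CDF sampling).
from scipy.stats import norm

u = 0.74                                   # uniform random number in (0, 1)
grade = norm.ppf(u, loc=5.0, scale=2.0)    # read the 74th percentile of N(5, 2)
print(round(grade, 2))                     # about 6.29 %
```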
What is Screen-Effect Approximation?
The implementation considers a fixed-size neighborhood around each node and approximates the posterior probability density function using only the data within this nearby region; closer data effectively screen the influence of data farther away.
Advantage of SEA?
In SGS, the SEA approximates the posterior probability density function from a localized neighborhood of data around each node, which keeps every kriging system small. This makes the simulation computationally efficient and feasible for large datasets or models by significantly reducing computational and storage requirements.
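A small sketch of the neighborhood selection itself, assuming a k-d tree search with a hypothetical radius and maximum number of neighbors:

```python
# Fixed-size search neighborhood behind the SEA: only the nearest few
# conditioning data are retained when kriging a node, so the kriging system
# stays small no matter how many data (and previously simulated nodes) exist.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
cond_xy = rng.uniform(0, 100, size=(5000, 2))   # conditioning data locations
node = np.array([50.0, 50.0])                   # node about to be simulated

tree = cKDTree(cond_xy)
dist, idx = tree.query(node, k=16, distance_upper_bound=25.0)
keep = idx[np.isfinite(dist)]                   # drop slots beyond the search radius
print(f"kriging at this node uses {keep.size} of {len(cond_xy)} data")
```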
Dilution may be seen as the ______________ above the cut-off introduced by the block support-size compared to the theoretical situation where _____________ is that of data support-size.
Dilution may be seen as the percent decrease of average grade above the cut-off introduced by the block support-size compared to the theoretical situation where selectivity is that of data support-size.
Comment on this (2 main points)
- As the block size increases, the average grade decreases because of dilution.
- As the average grade decreases, more tonnes of material are available above the cut-off.
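A worked illustration of both points, assuming independent lognormal point grades and a hypothetical cut-off below the mean grade:

```python
# Average point grades into larger blocks and compare the grade and tonnage
# above a cut-off at each block size.
import numpy as np

rng = np.random.default_rng(3)
point_grades = rng.lognormal(mean=0.0, sigma=0.7, size=(240, 240))
cutoff = 1.0

for block in (1, 4, 12):     # block side length, in numbers of points
    n = 240 // block
    blocks = (point_grades[:n * block, :n * block]
              .reshape(n, block, n, block).mean(axis=(1, 3)))
    above = blocks[blocks >= cutoff]
    print(f"block {block:>2}: grade above cut-off {above.mean():.2f}, "
          f"tonnes above cut-off {above.size / blocks.size:.1%}")
```

Larger blocks smooth out the high grades (lower average grade above the cut-off) while pulling more marginal material above the cut-off (more tonnes).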