The Business of Geoscientific Data
When I began my career as a computer geologist for a mining house during the 1980s, databases resided on mainframes and data was captured in duplicate on punch cards at data capture bureaus in downtown Johannesburg. We needed a logging tool that would encourage geologists to speak the same language when describing rocks and to make all the relevant observations; details and geological relationships had to be catered for.
We began on Hewlett Packard portable technical micro PCs with graphics tablets for data collection and pen plotters for graphical reporting. Our mission was to standardise geological descriptions and produce scaled reports which could replace manual draughting. The captured data was verified through an error log produced at the same time as the scaled log report.
The 1980s were the era of large gold exploration projects undertaken in the Witwatersrand Basin and Barberton Mountain Land. Diamond drilling technology had to keep up with the demand for boreholes drilled to depths greater than 5 kilometres. The target was to reach multiple reefs with enough intersections to produce a representative sampling density from which the highly variable gold mineralisation could be predicted. This was the advent of geostatistics.
The mission was accomplished by SABLE 1 and SABLE 2 using digital files in the 1980s, and then by SABLE® Data 1, the Microsoft® Windows version, in the 1990s with the advent of the first PC database engines.
Standardised Approach to Borehole Logging for Exploration and Evaluation
By the beginning of the new millennium, there was demand for the varied geoscientific data sets that support exploration and mining processes to be held in a single database under the control of one data standard. This led to a change of database architecture on a different database engine. The product became known as SABLE® Data Warehouse.
By 2016, the development team had committed to taking the product to modern development platforms while retaining the ‘user focused’ approach of the past. This generation is known as SABLE®7.
Concerns with respect to the quality of the geoscientific data used to predict the value of mineral assets were mounting by 1997. This led to the concept of a Competent Person within the relevant field of practice who would bring professional rigour and accountability to the process. Peer reviews among fellow Competent Persons provide due diligence to mitigate the risk of error and personal bias. This, in turn, minimises the potential for damages claims by investors if predictions do not materialise. It is common for the Competent Person to be required to present professional certification, experience and know-how applicable to the style of mineralisation and commodities featured.
The Mineral Resource and Reserve declaration must be signed off by the Competent Person(s). This classification is expressed as levels of confidence in estimates which are based on data. This data provides geological evidence and geoscientific knowledge for digital processing. The Competent Person must be satisfied with the reliability of the data.
How To Achieve Data Integrity
- Data Management
Data management is the practice of organising and maintaining data processes to meet ongoing information lifecycle needs.
- Configurable business processes provide data governance and must interact directly with the metadata governing data life cycles within the database, and with data originating from and then returning to the database (as in the case of sample batch management under geochemical QC protocols).
- ETL processes must provide auditable interfacing for the import of external data. These procedures Extract, Transform (and conform) third-party data before Loading it into the governed database environment.
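Such an ETL step can be illustrated with a minimal sketch. The file layout, column names and validation rules below are hypothetical, invented for illustration only; a real import would conform to the site's own data standard and write its rejects to an audit log.

```python
import csv
import io

# Hypothetical mapping from a third-party assay file's columns
# to a governed database schema; the names are illustrative only.
COLUMN_MAP = {"HoleID": "borehole_id", "From": "depth_from",
              "To": "depth_to", "Au_ppm": "au_ppm"}

def extract(text):
    """Extract: read raw rows from delimited text."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform and conform: rename columns, coerce types, and
    reject rows that fail basic validation, keeping an audit trail."""
    loaded, errors = [], []
    for line_no, row in enumerate(rows, start=2):  # header is line 1
        try:
            rec = {COLUMN_MAP[k]: v for k, v in row.items()}
            rec["depth_from"] = float(rec["depth_from"])
            rec["depth_to"] = float(rec["depth_to"])
            rec["au_ppm"] = float(rec["au_ppm"])
            if rec["depth_to"] <= rec["depth_from"]:
                raise ValueError("interval has non-positive length")
            loaded.append(rec)
        except (KeyError, ValueError) as exc:
            errors.append((line_no, str(exc)))  # auditable rejects
    return loaded, errors

raw = ("HoleID,From,To,Au_ppm\n"
       "BH001,100.0,101.5,3.2\n"
       "BH001,101.5,101.0,4.1\n")  # second interval is inverted
good, bad = transform(extract(raw))
```

The first interval is loaded; the inverted second interval is rejected with its source line number, so the import remains auditable.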
- Measuring Data Reliability
In order to be judged reliable, data must be reproducible (precise); useable (computable); relevant and representative (material); and unbiased (accurate). This promotes data integrity.
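The reproducibility and bias criteria can be made concrete with a small sketch that judges replicate assays of a certified reference standard. The thresholds below are assumptions chosen for illustration; real QC protocols set acceptance limits per element and analytical method.

```python
import statistics

# Illustrative acceptance limits; real QC protocols define these
# per element and analytical method.
MAX_RSD_PCT = 10.0   # reproducibility: relative standard deviation
MAX_BIAS_PCT = 5.0   # unbiasedness: deviation from certified value

def assess_reliability(replicates, certified):
    """Judge replicate assays of a certified reference material:
    reproducible (precise) and unbiased (accurate)."""
    mean = statistics.mean(replicates)
    rsd = 100.0 * statistics.stdev(replicates) / mean
    bias = 100.0 * abs(mean - certified) / certified
    return {"reproducible": rsd <= MAX_RSD_PCT,
            "unbiased": bias <= MAX_BIAS_PCT,
            "rsd_pct": rsd, "bias_pct": bias}

# Four replicate gold assays (ppm) of a standard certified at 3.1 ppm.
result = assess_reliability([3.1, 3.2, 3.0, 3.15], certified=3.1)
```

Data that fails either test would be flagged for re-assay rather than loaded into the governed database.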
- Data Governance
Data governance (DG) is the overall management of the availability, usability, integrity and security of data used in an enterprise. A sound data governance program includes a governing body or council, a defined set of procedures and a plan to execute those procedures.
SABLE® Methodology Can Help You Achieve This
SABLE® provides a self-managing, auditable system which supplies data management tools and reports, integrated through an independent data standard supported by configurable validations and business processes. Software engineering techniques are applied to the geological problem domain, replacing the subjectivity of stand-alone applications that collect and hide proprietary data sets.
This methodology will reduce complexity in the primary data sets through standardisation of geoscientific descriptions. It classifies primary evidence into relational structures which promote cognition and data abstraction. It maximises data quality, integrity, transparency and useability.
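The idea of classifying primary evidence into relational structures can be sketched with a dictionary table that constrains what a logger may record. The table names and lithology codes below are hypothetical examples, not SABLE®'s actual schema.

```python
import sqlite3

# A lookup table of standardised lithology codes acts as the data
# standard: the log table may only reference codes defined there.
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # enforce referential integrity
con.execute(
    "CREATE TABLE lithology_code (code TEXT PRIMARY KEY, description TEXT)")
con.execute("""CREATE TABLE geology_log (
    borehole_id TEXT, depth_from REAL, depth_to REAL,
    lithology TEXT REFERENCES lithology_code(code))""")
con.executemany("INSERT INTO lithology_code VALUES (?, ?)",
                [("QTZ", "Quartzite"), ("CGL", "Conglomerate")])

# A standardised description is accepted...
con.execute("INSERT INTO geology_log VALUES ('BH001', 0.0, 12.5, 'QTZ')")

# ...while free-text outside the standard is refused by the database.
rejected = False
try:
    con.execute("INSERT INTO geology_log VALUES ('BH001', 12.5, 14.0, 'GRANITE')")
except sqlite3.IntegrityError:
    rejected = True
```

Pushing the vocabulary into a relational structure means the standard is enforced at the point of capture rather than cleaned up afterwards.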
Value Of Data
The financial resources, time and technical expertise invested in the geoscientific data collected and maintained over the last four decades in South Africa would be very difficult to reproduce. This data should remain available to future generations of geologists for research and exploration, which is why it is important to invest in a reliable database.
The SABLE® Methodology delivers the fundamental principles which support the banking of knowledge and intellectual property central to the mining industry, for the optimisation of its mineral exploration, mining and beneficiation processes. This promotes multiskilling in a multicultural environment.
For more information on SABLE®, visit: https://pages.dataminesoftware.com/sable-data-integrity
Or contact us: https://www.dataminesoftware.com/contact/