However, according to the APPEA 2011-12 Health, Safety and Environment Report, the total recordable injury rate in Australia is still more than two and a half times that of members of the International Oil and Gas Producers Association.
So how can we improve this? The key lies in an additional method used to identify and classify risk within oil and gas operations, using ‘big data’.
Tapping big data
Larger oil and gas operations, such as LNG production facilities, are relatively complex to run.
Various control systems are needed to manage the well flows, run the liquefaction process, track storage and load the product onto ships.
All of these control systems continuously produce data, storing flow rates, the state of valves and other production information.
Besides the control systems, a typical operator has numerous additional computer systems for managing work.
Integrated safe systems of work, production planning and optimisation, integrity management, asset maintenance, workforce scheduling and integrated logistics are just a few of the business functions that rely on dedicated systems. Alongside these sit the health and safety systems: mandated work processes and procedures, and the tracking of job hazard assessments, incidents and near-misses, and risk-management performance.
Finally, an operator has access to a number of external sources of information, such as the Bureau of Meteorology, that provide insight into the conditions under which work is performed.
Together, these systems and external sources produce a vast amount of data.
This data lives locked away in disparate databases, used only by the systems that produce it.
However, in recent years, IT technology has evolved to the point that large volumes of various data sources can be aggregated into in-memory data stores – the first step toward enabling analysis of complex data sets.
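As a loose sketch of that aggregation step, the snippet below folds records from three hypothetical sources (a control system, a safety system and a weather feed) into a single in-memory store keyed by hour, so they can be analysed side by side. All source names and fields are invented for illustration.

```python
from collections import defaultdict

# Hypothetical records from three disparate systems (fields are assumptions).
control_system = [
    {"hour": "2013-06-01T09", "flow_rate": 412.5, "valve_open": True},
    {"hour": "2013-06-01T10", "flow_rate": 398.1, "valve_open": True},
]
safety_system = [
    {"hour": "2013-06-01T10", "near_misses": 1},
]
weather_feed = [
    {"hour": "2013-06-01T09", "wind_kmh": 32},
    {"hour": "2013-06-01T10", "wind_kmh": 55},
]

def aggregate(*sources):
    """Fold every record into one dict per hour - a toy in-memory store."""
    store = defaultdict(dict)
    for source in sources:
        for record in source:
            store[record["hour"]].update(record)
    return dict(store)

store = aggregate(control_system, safety_system, weather_feed)
# The 10:00 slice now combines flow, safety and weather data in one place.
print(store["2013-06-01T10"])
```

Once the sources share one store and one key, a single query can cut across production, safety and weather data at the same time.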
Identifying the risk
Aggregated big data in itself enables various parts of the business to query and analyse its performance and address problem areas and bottlenecks.
This type of analysis works well when the business already knows what to look for. Where the challenges are not yet known, the goal is instead to uncover the circumstances and leading indicators that make the difference between a safe situation and an unsafe one.
For these types of problems, a big data store can be used to let the data ‘speak for itself’, using automated statistical tools to identify and classify the leading indicators.
The oil and gas industry also uses this approach, with various operations in Europe and other parts of the world utilising statistical analyses on the huge amounts of data that are produced by assets whilst they’re in operation.
These analyses provide insight into the early warning signs that indicate whether equipment is near failure, thus allowing the maintenance team to perform preventive repairs before faults occur.
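A minimal sketch of such an early-warning check, under assumed thresholds: flag a sensor reading as a potential failure precursor when it drifts more than three standard deviations away from the mean of a trailing window. The window size, threshold and vibration figures are all illustrative, not taken from any real plant.

```python
import statistics

def early_warnings(readings, window=5, threshold=3.0):
    """Return indices of readings that deviate sharply from recent history."""
    alerts = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mean = statistics.mean(history)
        stdev = statistics.stdev(history)
        # Flag the reading if it sits far outside the trailing window's spread.
        if stdev > 0 and abs(readings[i] - mean) > threshold * stdev:
            alerts.append(i)
    return alerts

# Stable vibration levels, then a sudden excursion in the final reading.
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 4.0]
print(early_warnings(vibration))
```

Real condition-monitoring models are far more sophisticated, but the principle is the same: statistics over historical readings define "normal", and departures from it trigger preventive maintenance before a fault occurs.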
A modern oil and gas operator has access to various sources of data that stem from activities in its operations, or from external sources that support those operations.
Furthermore, a number of systems are in use that record and track performance on health and safety.
Bringing these sources of data together and using automated statistical tools to identify and classify risk has created a new perspective on the risk profile of activities of oil and gas operations.
This enables organisations to pinpoint their risk-mitigation strategies where they will make the biggest impact.
How big data works
This big data technology now enables organisations to collect data from all sorts of data sources.
This can be either structured data that exists in the enterprise resource planning and other corporate systems or unstructured data, such as data dumps from historians and external sources including the Bureau of Meteorology or an on-site video stream.
All of this data can be stored in-memory, allowing very fast access and therefore enabling complex statistical analyses to be performed at high speed.
Whereas traditional business intelligence solutions need the source data to be thoroughly classified and modelled to be cross-referenced, big data solutions do not prescribe such strict modelling.
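The contrast can be sketched as "schema-on-read": rather than modelling every record up front as a business intelligence warehouse would, a big data store can hold records of different shapes and apply structure only when a question is asked. The record types and fields below are invented for illustration.

```python
# Heterogeneous records from different systems, held together unmodelled.
records = [
    {"type": "incident", "site": "train-1", "severity": 2},
    {"type": "weather", "site": "train-1", "wind_kmh": 60},
    {"type": "incident", "site": "train-2", "severity": 1},
    {"type": "valve_log", "site": "train-1", "state": "open"},
]

def query(records, **criteria):
    """Filter mixed-shape records by whatever fields the question needs."""
    return [r for r in records
            if all(r.get(k) == v for k, v in criteria.items())]

# Structure is imposed at query time, not at load time.
print(len(query(records, type="incident", site="train-1")))
```

No field is mandatory across records, so new sources can be added without re-modelling everything already loaded.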
Statistical analysis tooling is then used to perform the data mining itself.
A typical definition of data mining is the computational process of discovering patterns in large data sets involving methods at the intersection of artificial intelligence, machine learning, statistics, and database systems.
According to a 2008 UCLA Anderson Graduate School of Management report Data Mining: What is Data Mining, the overall goal of the data mining process is to extract information from a data set and transform it into an understandable structure for further use.
In practice, a data mining effort allows identification of previously unknown or unexpected patterns or leading indicators for specific incidents.
When searching for leading indicators to link to near-misses, incidents or other significant health and safety occurrences, this method will point to both expected and unexpected circumstances.
For example, for an LNG producer in the northern areas of Australia, the combination of a tropical low developing offshore, with an LNG tanker nearing the plant, following a shutdown that lasted longer than scheduled, would be expected to place further pressure on operations.
However, more elaborate leading indicators could identify combinations of specific contractors, shift-changes, specific maintenance work in parts of the plant, and particular hours of the day, which are more prone to near misses.
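One simple way to surface such combinations is to compare the near-miss rate under each pair of conditions against the baseline rate across all records, a measure known as lift. The toy records and condition names below are invented; a real analysis would run over the full aggregated store.

```python
from collections import Counter
from itertools import combinations

# Each record: the set of conditions present, and whether a near-miss occurred.
records = [
    ({"shift_change", "night"}, True),
    ({"shift_change", "night"}, True),
    ({"shift_change", "day"}, False),
    ({"maintenance", "night"}, True),
    ({"maintenance", "day"}, False),
    ({"day"}, False),
]

def lift_by_pair(records):
    """Near-miss rate for each condition pair, relative to the baseline rate."""
    baseline = sum(miss for _, miss in records) / len(records)
    totals, misses = Counter(), Counter()
    for conditions, miss in records:
        for pair in combinations(sorted(conditions), 2):
            totals[pair] += 1
            misses[pair] += miss
    return {pair: (misses[pair] / totals[pair]) / baseline for pair in totals}

scores = lift_by_pair(records)
# A lift of 2.0 means near-misses are twice as likely under these conditions.
print(scores[("night", "shift_change")])
```

Pairs with lift well above 1.0 are candidate leading indicators, which analysts can then investigate for an underlying cause.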
An oil and gas operator in Western Australia acknowledged the disparate nature of its existing health and safety systems and the limited coherent insight those separate systems could provide.
As a resolution, a significant number of analysts were initially tasked with stitching together data from different health and safety systems, as well as other corporate systems.
In line with the approach described here, a data warehouse was created to automate the aggregation of all of the data from the health and safety and selected corporate systems.
The obvious immediate benefit was freeing up a number of analysts from the manual labour of stitching together separate data sets, while the integrated data from the data warehouse provided the operator with significantly increased coherent insights into their business.
An initial statistical analysis was trialled and areas of concern were identified across operations.
Specifically, deficiencies were identified in the on-boarding processes of a small number of contractors, leading to heightened risk profiles for new starters.
Based on that assessment, this issue was resolved.
Overall, the trial closed positively with the advice to continue exploring the big data technologies as a means to increase visibility on risk profiles in the operator’s organisation.
What lies ahead
As Australian major capital projects move into operations over the next couple of years, attention will increasingly focus on the health and safety performance of those operations.
Using new technologies such as big data in innovative ways will be a powerful means for the Australian oil and gas industry to demonstrate its ambition for health and safety excellence.