Digitalisation approaches – An oil & gas industry blueprint

Much has been written about digitalisation recently, but it is less often made clear that different approaches are needed depending on the nature of the business goal.

There are essentially two main types of digitalisation: those focused on using data to make coordination decisions that sanction and control work (referred to as Digital Decisions), and those that embed information processing so that the way work is performed changes (referred to as Digital Execution).

Examples of Digital Decisions include taking the data generated by measuring and recording field operations, combining it in new ways and generating insights. Physical examples of Digital Execution include autonomous vehicles, wearable technology, robotics and automated systems.

In oil and gas, I am seeing most Digital Execution innovation being driven through the supply chain, with the promise of increased productivity for existing activities. It is accessible to operators through smart procurement and teaming initiatives. Through their own actions, operators have the most opportunity to capture value by making better Digital Decisions.

Different approaches to delivering Digital Decisions address different problems, and the choice can be intimidating. There are three key considerations to weigh before choosing an approach for an initiative. Leaders should first ask:

  1. At what stage of life is the oil and gas field?
  2. Who will use the systems?
  3. What is the primary business case?

The stage of life of a field will guide which technology approach to consider. For example, technology that needs to integrate with established ways of working differs from technology that can mature alongside the design and build of new assets. Assets being decommissioned still need instrumentation, but not the same instrumentation as a field in full production.

Who uses the systems is also vitally important. Will it be the Offshore Installation Manager (OIM) and the team offshore, the data analytics department, or a functional expert in head office? This affects important design considerations: the interactivity of the systems, the computing power required for analytics, and the sophistication of any 3D virtual-reality representations.

Placing complex analytical tools with sophisticated interfaces into live production environments can cause frustration and lead to rejection, because they are seen as distracting and not practical enough to be useful "in the real world". This can happen even though high-end knowledge workers in onshore analytics roles can use the same tools to add considerable value, understanding what is going on and then issuing new work instructions to the field. Consider carefully the work context in which a system will be deployed and how it will be integrated into current ways of working.

The primary business case must address one of three fundamental value drivers:

  1. Reduce the time to access and then produce the resource;
  2. Increase the maximum amount of hydrocarbon extracted from the reservoir; or
  3. Reduce the cost to build and operate the required infrastructure.

In addition to these three, owners can maximise their localised value by redistributing risk and reward through commercial and trading arrangements. And, of course, all of this must be bounded by ensuring that HSSE requirements are met.

Technical approach one: Create a Data Warehouse / Data Lake & Plug-in Analytics 

This approach promises to uncover new insights by using trained algorithms to search historical data automatically for previously unrecognised patterns. These patterns may show, for example, that different configurations of settings will increase production rates.

They may also build a better understanding of reservoir conditions from changes in flow rates and pressures, or find ways to predict when machinery will fail.
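As a minimal sketch of this pattern-finding idea (not a reference to any specific commercial product), the example below trains an unsupervised anomaly detector on historical pump readings. The file name and sensor columns are hypothetical placeholders.

```python
# A minimal sketch of pattern detection on historical field data.
# The file name and sensor columns are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import IsolationForest

# Historical readings, e.g. one row per minute per pump.
history = pd.read_csv("pump_history.csv")  # hypothetical export from the data store
features = history[["flow_rate", "discharge_pressure", "vibration", "temperature"]]

# Fit an unsupervised model to learn "normal" operating behaviour.
model = IsolationForest(contamination=0.01, random_state=42)
history["anomaly"] = model.fit_predict(features)  # -1 marks outliers

# Outliers are candidate precursors of equipment trouble, flagged for
# engineering review rather than treated as an automatic verdict.
print(history.loc[history["anomaly"] == -1].head())
```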

There are currently two streams of activity underway in this area: one is the creation of "platforms" where data is stored, organised and made available for use; the other is the creation of "analytics" that use techniques such as AI and machine learning to access data from the platform and generate insights.

There is an obvious compatibility issue here: you don't want to implement a platform only to find that everyone has built analytics to work on another one. A number of commercial platforms are in development, and some operators are creating their own in-house. The issues that need to be addressed are data representation, performance and interoperability.

Many research companies and universities are working on AI, prediction and machine-learning approaches. There is not yet an established design pattern or a dominant set of commercial algorithms, but it is reasonable to expect them to emerge over time and connect to the data platforms.

Technical approach two: Collect Big Data in the Cloud and dig for answers 

This is similar in many ways to the data-warehouse approach in that data is copied into a central area for later use. The purported advantage of this method is that little structure is required in the data. The cost of building the infrastructure to transmit and collect all the sensor data centrally is justified by access to general-purpose tools and analytics packages developed in other industries.

In this area you see solutions such as Amazon Web Services, Google's analytics offerings and IBM's Watson being applied in the search for meaning across disparate data sets, without detailed engineering know-how being required to guide the analysis.
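As a sketch of the "collect first, structure later" idea, the snippet below lands raw sensor readings in cloud object storage as they arrive. AWS S3 stands in here for whichever cloud store is chosen, and the bucket name, tag and payload are hypothetical.

```python
# Minimal sketch: land raw, unstructured sensor readings in cloud object
# storage for later analysis. Bucket name and reading format are hypothetical.
import json
import time
import boto3

s3 = boto3.client("s3")  # assumes AWS credentials are already configured

reading = {
    "tag": "27-PT-1014",  # hypothetical pressure-transmitter tag
    "value": 182.4,
    "unit": "barg",
    "timestamp": time.time(),
}

# No schema is imposed up front; each reading is stored as it arrives,
# keyed by tag and time so general-purpose tools can dig through it later.
s3.put_object(
    Bucket="field-sensor-lake",  # hypothetical bucket
    Key=f"raw/{reading['tag']}/{int(reading['timestamp'])}.json",
    Body=json.dumps(reading),
)
```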

Technical approach three: 3D Visualisation and Augmented Reality 

A further approach places visualisation at its core. It works especially well for establishing the physical relationships between objects that you cannot actually see, either because they are far out to sea while you are in the office, or because they have not been built yet.

Loading data from an existing plant into models can be done through laser scanning and image-analysis software. For new plants, many design packages will automatically create 3D layouts that can be uploaded directly and kept up to date as the plant is built.
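As a toy illustration of working with scanned plant geometry, the sketch below loads a point cloud of x, y, z coordinates and renders it in 3D. The file name and its simple CSV layout are assumptions made for the example.

```python
# Minimal sketch: render a laser-scanned point cloud of plant equipment.
# The file name and its simple x,y,z CSV layout are hypothetical.
import numpy as np
import matplotlib.pyplot as plt

# Each row is one scanned point: x, y, z in metres.
points = np.loadtxt("module_scan.csv", delimiter=",")

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(points[:, 0], points[:, 1], points[:, 2], s=1)
ax.set_xlabel("x (m)")
ax.set_ylabel("y (m)")
ax.set_zlabel("z (m)")
plt.show()
```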

There are cases where this technology has added value. In new builds, for instance, simulating the construction can reveal the optimum assembly order, which helps avoid access issues. During modification work, engineering measurements can be taken and fitting instructions produced without the need for site visits. Augmented reality offers the promise of accurate on-site equipment identification, along with work orders, associated information and instructions.

Common Challenges  

Regardless of the end goal, there are certain prerequisite activities. These include connecting to the complex and diverse computer systems used to run operations, whether that means instrumentation data, real-time performance, maintenance records or warehouse inventory; and integrating analysis results into daily operating routines so that different actions can be taken to deliver business results.
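To make the integration challenge concrete, the sketch below pulls a live instrument value from a hypothetical REST endpoint and joins it with maintenance records. Every URL, field name and file name here is an assumed placeholder, since real operational systems vary widely.

```python
# Minimal sketch of stitching operational systems together. The REST
# endpoint, JSON fields and CSV file are all hypothetical placeholders.
import pandas as pd
import requests

# 1. Read a live value from an (assumed) data-historian web service.
resp = requests.get("https://historian.example.com/tags/27-PT-1014/latest")
live = resp.json()  # e.g. {"tag": "27-PT-1014", "value": 182.4, "unit": "barg"}

# 2. Pull maintenance history for the same equipment from another system.
maintenance = pd.read_csv("maintenance_records.csv")
recent = maintenance[maintenance["tag"] == live["tag"]].tail(5)

# 3. Combine the two views so an engineer can act on them together.
print(f"{live['tag']}: {live['value']} {live['unit']}")
print(recent[["date", "work_order", "description"]])
```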

Digitalisation initiatives should have ultimate end-goals that include automation, remote analytics and rich visualisations. Companies should learn how to combine sub-surface analysis with surface operations, and how to maximise value through the life of a field across concept, construction, operation and decommissioning.

Implementation will be key to gaining value. Only the largest super-majors can realistically pursue all three technical approaches at once. For everyone else it is about choosing a focus, setting an ultimate vision and then deciding how fast and how far to go down the path, while delivering value at each step.

Here at Eigen, our approach to digitalisation is to leverage the information that is already available by connecting existing systems without copying data. We enable engineers to create intuitive workflows that speed their work, and to build their own real-time alerting for operations, whether as companywide standards or as temporary personal watch-lists.

The rationale behind our approach is that it offers rapid access to information and creates value quickly. Because our system is built by process engineers, we know how to make the data interactive in context, enabling investigation work to be carried out directly, without involving IT or having to become an expert in a new analytics package.

Engineers know what they need and how to make the calculations that help them drive performance. They also know when they need to be alerted. The problem they have is getting hold of the data from across the organisation, applying calculations, setting up a system to watch on their behalf and then rapidly communicating their findings to colleagues. And, of course, there is the not insignificant problem of integrating their findings into ongoing work programmes. Without our system, this activity is conducted using Excel, PowerPoint, email and a large number of time-wasting group meetings.
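To make the pattern concrete, here is a generic sketch of engineer-defined, real-time watch-list alerting. It is not Eigen's product or API; the tag names, limits and simulated data source are all hypothetical.

```python
# A generic illustration of engineer-defined watch-lists; this is not
# Eigen's product or API. Tag names, limits and read_tag() are hypothetical.
import random
import time

# One shape covers both companywide standards and personal watch-lists.
WATCH_LIST = {
    "27-PT-1014": {"max": 200.0, "note": "Separator inlet pressure high"},
    "27-FT-2002": {"min": 50.0, "note": "Export flow below plan"},
}

def read_tag(tag: str) -> float:
    """Stand-in for a live read from a plant data historian."""
    return random.uniform(0.0, 250.0)  # simulated value for this sketch

def check_watch_list() -> None:
    for tag, rule in WATCH_LIST.items():
        value = read_tag(tag)
        if "max" in rule and value > rule["max"]:
            print(f"ALERT {tag}: {value:.1f} above {rule['max']} ({rule['note']})")
        if "min" in rule and value < rule["min"]:
            print(f"ALERT {tag}: {value:.1f} below {rule['min']} ({rule['note']})")

# Evaluate the rules on a fixed cadence (shortened here to three passes).
for _ in range(3):
    check_watch_list()
    time.sleep(1)
```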

Combining performance data, instrumentation output and systems of record in ways that make today's operations come alive promises quick wins and an on-ramp to other, more advanced approaches. This is all available now, without waiting for a breakthrough discovery, the design of a central collection point or changes in departmental responsibilities.

Gareth Davies, Chief Strategy Officer at Eigen 

Image Credit: Isakarakus / Pixabay