12 Mar
How Exploration Can Regain Its Digital Mojo
By far the most data-intensive function in the oil and gas value chain is exploration, yet exploration is far from digital. How might it get its mojo back?
Exploration – a digital early adopter
The exploration function of oil and gas has always been able to attract funding and technology investments. They’re necessary to help collect and interpret the vast volumes of data that characterize geologic formations and highlight the possible presence of oil and gas. After all, the upside can be highly lucrative if the exploration is successful. The upstream departments of oil and gas companies have plenty of rich data sets at hand, and oodles of analytic support, but many will readily admit that they are not particularly digital.
Exploration’s digital paradox
Based on my usual definition of "digital" (a combination of data, analytics and communications, each on an exponential growth trajectory), I wonder why exploration teams would classify themselves as digital laggards.
Let’s begin with data. The data volumes in oil and gas (seismic surveys, for example) are enormous, but historically data transmission speeds have been too slow to move them around. Many exploration departments therefore operate their own private data centers so that datasets do not have to travel over communications networks for analysis. Historically this has saved time for geologists and petroleum engineers, to the advantage of the larger firms able to afford the scale.
Oil and gas companies have traditionally viewed all of their geologic data as highly proprietary and deeply competitive. This makes sense when the value of the underground resource is so dependent on the data. Indeed, there is a lively market in the trade of subsurface data. In the main, however, the tendency has been to keep this data in a company-owned data center under lock and key. In fact, the silos of data inside some exploration departments could lead you to conclude that internal competition is greater than competition with the rest of the industry.
Exploration departments have been demanding users of analytic horsepower and have been highly compute-intensive. Some of the world’s largest and fastest computer installations have belonged to oil and gas companies. Cloud computing, a digital innovation that should be very appealing to the industry, has been embraced only tentatively, and in general not within the upstream. I believe this reflects an ongoing bias and long-held orthodoxy that third-party, generalized computer installations cannot compete, even at scale, with the captive compute facilities of oil and gas companies.
In a meeting with me, the CIO of one of Canada’s largest oil and gas companies dismissed cloud computing as a rent-extracting monopoly to be avoided. Putting data into a third party data center (outsourcing), or into a cloud computing environment, potentially enables the cloud provider to jack up prices to its now captive oil customers.
What about the use of new digital tools like artificial intelligence? Many geologists simply do not accept that the interpretation of geologic data, which involves marrying subsurface imagery with geological training, can be done by machines. Even suggesting that AI has a role to play attracts derision from the industry and immediate dismissal. I’ve personally been on the receiving end of this kind of reaction.
These widely held views no longer square with what we all observe in the world around us. Faster growth and better business valuations now accrue to digital companies that treat data as an open and shareable asset, and that leverage compute on demand. Artificial intelligence is giving us self-driving vehicles, energy optimization and data interpretation. AI already interprets medical imagery such as ultrasound, which relies on the same reflection physics as seismic.
The status quo looks doomed
As time marches on, the status quo position of building and operating a proprietary data center inside an oil and gas company is beginning to look risky. It is doubtful that any single large industrial company can cost-effectively stand up a facility to match those of the cloud computing companies. Size, scale, power needs and automation levels suggest the digital players have the upper hand in computing. I don’t see how a computer facility in an oil and gas company can even afford the kinds of cyber security protections necessary today; far larger organizations have been successfully hacked.
The sharpest skills in compute technology will blossom in the embrace of the cloud compute operators. Oil and gas has always been able to attract its share of computer science grads, but the new digital economy is looking more appealing. The career ladder in oil and gas for computer people has fewer rungs. I learned this personally, when my first employer (an oil company) told me I couldn’t amount to much in their company because I wasn’t an engineer. It was probably true, but prompted me to seek my fortune elsewhere.
Software assets are under threat too. As the world’s software products migrate to the cloud computing model, companies with a legacy investment in proprietary in-house software will increasingly find themselves with a growing stockpile of dated, unmaintained software. The most innovative and creative solutions to modern problems will not be available to organizations wedded to their own infrastructure. Business risks will start to rise.
Change is coming
A few industry players have started to explore whether this model of proprietary data, on premises data farms, and roll-your-own computer centers is still the only way to go. At least one very large oil and gas company has concluded that time may be up for this approach.
In mid-2017, I was invited to attend a workshop hosted by the International Energy Agency (IEA) to discuss the impact that digital would have on the energy industry. The workshop included a number of large oil and gas companies, one of which presented its experience trialing digital thinking in its exploration area. The results were startling.
This large global oil company set out to test the current state of big data, computing and analytics, and whether the digital capabilities available on the market could meet the demanding performance requirements of the oil industry. The company loaded a sizeable volume of seismic and other subsurface data into a commercially available cloud computing environment from one of the market leaders, and invited a cadre of data scientists and algorithm specialists to analyze the data using whatever math and tools they had at their disposal. The data had been anonymized so that its competitive value was neutral, and was presented in such a way that non-industry professionals could work on some of the problems the company wanted to study.
The company concluded that the commercial world of cloud computing services was now sufficiently robust that it could take on their demanding data volumes and analytic challenges. This is a profound change from the status quo.
As for interpreting the data, the data scientists, who were drawn from many industries, were able to match the company’s own geologists and petroleum engineers at interpreting the data and identifying the most prospective locations in the resource. Interpretation of seismic is supposed to be a core competence solely of oil and gas professionals. Geologists will soon be competing with math majors.
Grow those reserves
Aside from rendering in-house data centers obsolete, and creating a new crowd-sourced data interpretation model, what other impacts is digital expected to have on the exploration industry?
The biggest impact by far is the forecast growth in reserves. The IEA forecasts that digital, applied to unconventional oil and gas resources (that is, shale and other low-permeability, low-porosity resources), will expand global oil and gas reserves by just 5%, but that translates to 70 billion tonnes, or roughly 500 billion barrels of oil equivalent. With demand at about 100 million barrels per day, this volume equates to around 13 years’ supply. Do we need more supply?
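Those figures hang together arithmetically. A quick sanity check, assuming the common conversion factor of roughly 7.33 barrels per tonne of oil equivalent (a figure not given in the article):

```python
# Sanity check of the reserves arithmetic quoted from the IEA:
# 70 billion tonnes of oil equivalent, measured against demand of
# about 100 million barrels per day.
BARRELS_PER_TONNE = 7.33  # common industry approximation (an assumption here)

reserves_tonnes = 70e9
reserves_barrels = reserves_tonnes * BARRELS_PER_TONNE  # ~513 billion barrels

daily_demand_barrels = 100e6
years_of_supply = reserves_barrels / (daily_demand_barrels * 365)

print(f"{reserves_barrels / 1e9:.0f} billion barrels of oil equivalent")
print(f"{years_of_supply:.1f} years of supply")  # ~14, close to the ~13 quoted
```

The small gap between this result and the article’s 13-year figure comes down to rounding 70 billion tonnes to 500 billion barrels.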
While many countries in the world have shale resources (the USA, Canada, Argentina, China, the UK), only the US and Canada have figured out how to extract hydrocarbons in meaningful volumes from their shale deposits. Much of this growth in reserves will accrue to these two countries. This helps explain why the IEA also forecasts that the US may well become the world’s largest oil exporter in the next few years.
In addition to the low-permeability, low-porosity reservoirs, legacy conventional deposits, with recovery factors averaging just 40%, will also benefit from better analytics and understanding of the reservoirs. Suffice it to say, there’s a very big prize awaiting the early adopters of even a few digital techniques.
Five low risk actions to take
Exploration departments can move forward with just a few low risk steps:
• Break down internal data silos by creating large, accessible data pools or lakes that are open at least to internal users.
• Execute small trials of cloud computing to see if there are viable commercial options in your neighborhood.
• Start requiring that software purchases incorporate options to move to cloud versions.
• Run some crowd sourcing data interpretation competitions.
• Rent some AI tools to see how they fare with your subsurface data.
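On the last point, the bar to a first experiment is low. Here is a minimal, purely illustrative sketch of the idea: feed tabular "subsurface attribute" data to an off-the-shelf machine learning model and measure how well it separates prospective from non-prospective locations. The feature set and the synthetic data are my own inventions, not anything from a real survey:

```python
# Toy trial of an off-the-shelf ML model on synthetic "subsurface" data.
# In a real trial, X would hold attributes extracted from seismic and well
# logs, and y would hold known outcomes (e.g. prospective vs. dry).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000

# Three made-up attributes per location (e.g. amplitude, porosity, depth).
X = rng.normal(size=(n, 3))
# A made-up rule linking attributes to "prospectivity", plus noise.
signal = X[:, 0] + 0.5 * X[:, 1] - 0.3 * X[:, 2]
y = (signal + rng.normal(scale=0.5, size=n) > 0).astype(int)

# Hold out a test set so the accuracy figure is honest.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"hold-out accuracy: {accuracy:.2f}")
```

If a rented model performs respectably on data like this, that is the signal to graduate to real (suitably anonymized) subsurface datasets.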
In time, exploration departments can set out a proper long-term game plan to become more digital. Meanwhile, these steps will make a big difference.