Why AI Won’t Fly in Oil and Gas and How To Give It Wings


Algorithms, a kind of artificial intelligence, are going to struggle to get off the ground in oil and gas. Here’s why, and what management can do to give AI wings.

My first algorithm

I remember watching my Dad at work when I was growing up. He was a math whiz and could correctly tally long columns of numbers in his head. One day the boss issued new battery-powered calculators to the troops. So cool, with their space-age red readouts, tiny keys, and compact form factor. The funny thing was, Dad added up the numbers using the calculator, as the boss wanted, but then redid all the math by hand because he didn't quite trust the calculator, the algorithm of his day, to get it right.

Algorithms are now pervasive in our broader society. Consider my smart watch, for example. It's jammed with apps and algorithms. One of its cleverer features is the heart rate monitor: using a sensor to read the movement of blood just under the skin, and a bit of math, the monitor picks up my heart rate rather well. I compare its accuracy with the built-in monitors on gym equipment and the chest-strap monitor I use on my bike, and it's spot on.
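That "bit of math" can be sketched in a few lines. This is purely my illustration, not anything the watch vendor has published: assume the optical sensor yields a roughly periodic pulse signal, then counting its rising edges gives beats per minute.

```python
import math

def estimate_bpm(signal, sample_rate_hz, threshold=0.5):
    """Count upward threshold crossings in a pulse signal and return BPM."""
    beats = 0
    above = False
    for sample in signal:
        if sample > threshold and not above:  # rising edge = one heartbeat
            beats += 1
            above = True
        elif sample <= threshold:
            above = False
    duration_min = len(signal) / sample_rate_hz / 60.0
    return beats / duration_min

# Synthetic 10-second optical signal sampled at 25 Hz, pulsing at 1.2 beats
# per second (i.e. 72 BPM), standing in for the real sensor reading.
rate = 25
signal = [math.sin(2 * math.pi * 1.2 * t / rate) for t in range(rate * 10)]
print(round(estimate_bpm(signal, rate)))  # 72
```

A real monitor has to cope with motion artifacts, noise, and irregular rhythms, which is exactly where the medical objections below come in.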

I find it super valuable. When I’m out cycling, I use the heart monitor to guide my activity levels—I might feel winded, but the monitor tells me my heart rate is well below my norms, so I push on.

I have no clear idea how it works. But I accept it.

Medical professionals think this is junk. They use highly sophisticated heart rate measurement devices, way beyond the simple stethoscope. They bolt sensitive electrical sensors to the skin in a dozen places and can pick up the tiniest signals. They relate detailed heart measurements to lung performance and blood pressure, which I can't do with my watch. They rightfully challenge my watch's ability to pick up nuances in beat regularity, and they diss the occasional false positive where the reading is worryingly high or dangerously low. At best, they say, my watch merely confirms that I have a heart, a piece of data I really don't need to spend $500 to discover.

My anonymous health professional is my proxy for the oil and gas engineer who discovers that someone has strapped some new-fangled digital sensor to a piece of equipment, and is then asked to rely on an algorithm's interpretation of the sensor's data. Let's peel this apart to see why this is a fraught endeavour.

The uncertainties of new technology

The established technology in oil and gas carries little uncertainty. My engineer friend relies on a technology model designed 40 years ago, dating from the development of SCADA systems in the 70s and 80s. Analog sensors are built to known industrial standards, embedded directly in the piece of equipment, and hardwired to a control panel somewhere, or to a front-mounted gauge. There are no uncertainties in the industrial standard, or in the compliance tests that the devices and gauges must pass. Copper wire is highly reliable, and the SCADA system has been working dependably for decades. The data is extracted from the historian directly into an Excel spreadsheet on a trusted employee's computer for analysis and interpretation. The people involved are known quantities.

The engineers know how all this stuff works. It’s been part of the curriculum and disciplines for decades. The older guys hire the younger guys and pass on all this knowledge rather well.

New technologies introduce fresh uncertainties into this stable world. Imagine strapping a modern wireless sensor onto a piece of equipment, using an artificial intelligence engine in the cloud to interpret its data, and making decisions based on the results. Here are eight uncertainties:

  • The sensor itself, its technical features, and its compliance with industrial standards
  • The sensor mounting and how reliably it captures data
  • The power supply to the sensor and its reliability (AA batteries don't make the grade)
  • The data the sensor generates and the potential for compromise (an embedded sensor cannot be as easily compromised as a strap-on wireless one)
  • The integrity of the wireless network that moves the data from the sensor to the cloud analytics engine
  • The technical competence of the algorithm's author
  • The integrity of the algorithm itself
  • The results that the algorithm generates

AI enthusiasts are frustrated with the slow pace of AI adoption in oil and gas, but I find they offer little by way of helpful response to the uncertainties above, and in some ways they block their own progress. For example, they cannot, or will not, clearly explain how their AI algorithm technically works, perhaps out of concern that their IP will be compromised, which means the engineer simply has to trust it.

AI in the field

Let’s assume that some brave oil and gas company kicks off an AI initiative, rolls a shiny new algorithm out to the field, and hopes for results. Assume that they resolve the problems of sensor provenance, network reliability, power supply, and connectivity. Remember, the field team does not understand how the algorithm works. Soon the algorithm starts generating analysis.

Consider four scenarios:

Scenario 1. The algorithm correctly interprets the data, and the interpretation matches what the engineer thinks the result should be. The AI machine is improved, but the engineer is irritated that the company spent a lot of money on something he could already do. The full benefits of a smarter machine are deferred.

Scenario 2. The algorithm correctly interprets the data, but the interpretation differs from what the engineer thinks the result should be. Now the engineer is in a dilemma: is this a false positive? Does he take the recommended action for an uncertain outcome, or ignore it and rely on his intuition or a sidebar manual exercise? Performance metrics and targets compel him to weigh the business and personal consequences in determining the safe path.

What if the machine is correct and proves the engineer has been wrong all this time? Will there be repercussions? Who wants that embarrassment?

If our engineer reverts to his previous analysis and ignores the machine, he loses a learning opportunity for both the human and the machine, and he'll spend time and money trying to replicate the algorithm. At least he can claim to have avoided a possible catastrophe (which he would have avoided anyway), and he won't be embarrassed by a machine.

But if he follows the machine's recommendation, he is better off in both the short term (a smarter decision and fewer wasted resources) and the long run (the machine is made smarter).

Scenario 3. The algorithm incorrectly interprets the data, and the engineer can tell that the interpretation is incorrect, forcing him to rely on previous know-how and analysis. He's irritated that the company spent a lot of money on something that doesn't work, and he's no better off. His choices then inform the algorithm, which gets a bit smarter and might pay off later on, but at the risk of doing him out of a job.

Scenario 4. The algorithm incorrectly interprets the data, but the engineer can't tell that the interpretation is incorrect and has no better analysis to fall back on. Again, a dilemma: what if the machine is wrong? If she follows its recommendation and it fails, is she to blame?

She's in a bind with no option but to execute, and there's a failure. She'll wear it at the next performance review, but at least she can cast some shade at the algorithm. The algorithm, meanwhile, is a little smarter.

Boosting chances of success

If you think for a minute you’re going to throw AI at the field and it will get results, think again. The business model in place in oil and gas is stacked against you. Here’s what else you need to change:

Sponsorship and support.

Demonstrate unyielding support for AI. If the troops on the front line don't think they have management support, they won't invest much time in new technology. All four scenarios I outline are correctly seen as either a waste of time and money or a source of poor career outcomes for the troops.


Education.

Insist that the AI provider include as much education as possible on how the algorithm actually works, its self-learning ability, and its limitations. And the AI provider needs to get over its obsession with protecting its IP.

Performance metrics.

Tune the performance metrics to reward the behaviour required: supportive supervisors, agility and willingness to trial new concepts, and a smarter AI machine.

The success of AI in the field is not going to be based on how good the algorithms are but on how good management is at promoting change in the workplace.


If you liked this article, you might really like my new book, "Bits, Bytes, and Barrels: The Digital Transformation of Oil and Gas". Sign up here to learn more!


Mobile: ☎️+1(587)830-6900
email: ✉️ geoff@geoffreycann.com
Visit: 🖥 https://geoffreycann.com/
LinkedIn: 🚀 https://www.linkedin.com/in/digitalstrategyoilgas
