AI and Its Cousins

Artificial Intelligence (AI), Machine Learning and Deep Learning are trending subjects worldwide, with applications deployed across many business segments, e.g. medicine, self-driving vehicles and customer service.

The objective of this article is to understand why these subjects have become so popular, and to explore each of them through a high-level overview that acts as a gentle introduction to AI and its cousins.

Why now?

AI has been around since the 1950s. Machine Learning gained traction in the 1980s and, in recent years, Deep Learning has achieved its own breakthrough.

Birth of Deep Learning

Historically, computers were not very powerful. Because AI algorithms are computationally expensive, working with them and extracting value from them was impractical, so they appeared in few real-world applications.

Computing speeds have increased enormously since the 1950s, and high computing power is now mainstream. The cost of computing power and data storage has also fallen significantly. Furthermore, the cloud gives engineers and scientists on-demand, low-cost, parallel computing capacity (CPU and GPU) through the web browser. As a result, we now regularly see applications that implement AI.

The oil & gas industry has been generating large volumes of data in recent decades, and with Industry 4.0 accelerating, the amount of data generated will rise exponentially. This means companies across the supply chain are seriously considering how to work with this data and harness value from it effectively, using machine learning and analytics.

Moreover, there have been extensive research initiatives from Google, Apple, Facebook, Amazon and others; Apple, for instance, has recently shared some of its research findings publicly. Open-source libraries can be called from Python applications (for example), giving engineers powerful resources to implement AI and its cousins in their own work.


Before discussing the details of each subject, let’s look at their definitions.

Artificial Intelligence

“The study of the modelling of human mental functions by computer programs.” (Source: Collins Dictionary)

Machine Learning

“Machine learning is the science of getting computers to act without being explicitly programmed.” (Source: Stanford University)

Deep Learning

“Deep Learning is a subfield of machine learning concerned with algorithms inspired by the structure and function of the brain called artificial neural networks”. (Source: Machine Learning Mastery)

Breaking It Down

The three topics mentioned are related but not the same thing: Deep Learning is a subset of Machine Learning, which is in turn a subset of Artificial Intelligence.


AI is the building of a machine that performs a task as well as, or better than, a human. It therefore acts with intelligence, just like the robot shown below (Atlas by Boston Dynamics).

Machine Learning

Given a dataset, a machine learning model can be trained and tested to recognise patterns, then used for prediction. The algorithm learns from the data and gains experience at performing a task, whether that is regression, classification, clustering, etc.
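To make the train-then-predict loop concrete, here is a minimal sketch in plain Python using a toy nearest-centroid classifier. The data points, labels and function names are all invented for illustration; a real project would typically reach for a library such as scikit-learn instead.

```python
# A toy "learn from data, then predict" loop: a nearest-centroid
# classifier trained on made-up 2-D points (all data is illustrative,
# not from any real dataset).

def train(points, labels):
    """Learn one centroid (mean point) per class label."""
    centroids = {}
    for label in set(labels):
        cluster = [p for p, l in zip(points, labels) if l == label]
        centroids[label] = tuple(sum(c) / len(cluster) for c in zip(*cluster))
    return centroids

def predict(centroids, point):
    """Assign the label of the closest learned centroid."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(centroids[label], point))

# Training data: two obvious clusters
X = [(0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9)]
y = ["low", "low", "high", "high"]

model = train(X, y)
print(predict(model, (0.1, 0.2)))   # a point near the "low" cluster
print(predict(model, (5.1, 5.0)))   # a point near the "high" cluster
```

The point is that the decision rule (the centroids) comes out of the data itself, rather than being hard-coded by a programmer.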

It should be noted that there are several learning paradigms, in Machine Learning as in Deep Learning (e.g. supervised learning, unsupervised learning and reinforcement learning), and the choice depends on the data available and the objective of the user. Without Machine Learning, we would normally have a software engineer write a program that tells the computer exactly what to do.
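The contrast between explicit programming and (supervised) learning can be sketched in a few lines. The temperature data, labels and threshold below are invented purely for illustration:

```python
# Contrast: an explicitly programmed rule vs. a rule learned from
# labelled examples (toy data, invented for illustration).

# 1. Classic programming: a human hard-codes the decision rule.
def is_hot_hardcoded(temp_c):
    return temp_c > 30.0

# 2. Supervised learning: the rule is *learned* from labelled examples.
def learn_threshold(temps, labels):
    """Pick the midpoint between the warmest 'cold' and coolest 'hot' example."""
    hot = [t for t, l in zip(temps, labels) if l == "hot"]
    cold = [t for t, l in zip(temps, labels) if l == "cold"]
    return (max(cold) + min(hot)) / 2

temps = [10.0, 22.0, 28.0, 33.0, 40.0]
labels = ["cold", "cold", "cold", "hot", "hot"]
threshold = learn_threshold(temps, labels)   # 30.5 with this toy data

def is_hot_learned(temp_c):
    return temp_c > threshold
```

With more or different labelled examples, the learned threshold shifts automatically, with no programmer intervention; that is the essential difference the paragraph above describes.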

Deep Learning

Deep Learning structures try to mimic that of the human brain. To understand Deep Learning, let's look at neural networks, described as follows: "A neural network passes data through interconnected layers of nodes, classifying information and characteristics of a layer before passing the results on to other nodes in subsequent layers. The difference between a neural network and a deep learning network is contingent on the number of layers: A basic neural network may have two to three layers, while a deep learning network may have dozens or hundreds."
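The "layers of nodes" idea can be sketched as a forward pass in plain Python. The weights and inputs below are made-up numbers, not a trained model; the sketch only shows how data flows layer by layer:

```python
# A minimal sketch of data flowing through stacked layers of a neural
# network (weights and inputs are invented, not from a trained model).
import math

def dense_layer(inputs, weights, biases):
    """One fully connected layer: weighted sums followed by a sigmoid."""
    outputs = []
    for w_row, b in zip(weights, biases):
        z = sum(w * x for w, x in zip(w_row, inputs)) + b
        outputs.append(1.0 / (1.0 + math.exp(-z)))   # sigmoid activation
    return outputs

# A "basic" network: input -> one hidden layer -> output.
# A "deep" network would simply stack many more hidden layers.
hidden = dense_layer([0.5, -1.2], weights=[[0.8, 0.1], [-0.4, 0.9]], biases=[0.0, 0.1])
output = dense_layer(hidden, weights=[[1.0, -1.0]], biases=[0.2])
print(output)  # a single value between 0 and 1
```

Each extra call to `dense_layer` adds one more layer; the depth in "deep learning" is simply how many of these transformations are stacked between input and output.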

Final Thoughts

The subjects above may come across as buzzwords, but I believe they are buzzwords no longer. These algorithms are embedded in our daily lives, e.g. online banking fraud detection, access control for mobile phones and recommendation systems for e-commerce stores.

There is a lot of talk about the threats posed by AI and the potential negative effects of any malpractice. As discussed in a recent blog post, I believe that "AI and data will emerge as a key ally in the digital transformation process working seamlessly with humans". Human oversight is necessary to ensure that the end results we get are transparent, unbiased and ethical.

Artificial Intelligence & Machine Learning — Source: Pinterest

As discussed, AI is already superior to humans in several areas (e.g. image recognition and complex board games like chess and Go). Yet these intelligent algorithms do not perform as well in other areas; for example, conversational AI and chatbots still have a long way to go before they can compete with humans.

Memory versus Computational Speed — Source: Visual Capitalist

In the oil & energy sector, there are real gains to be had by adopting AI and its cousins, especially as:

  1. More and more assets come online and gather data to be stored in the cloud
  2. We re-think the way we execute work, tasks and processes to become more productive
  3. We try to unlock “hard to reach” energy sources

Added-value opportunities will come in many forms, for example:

  * Advanced insights for operations
  * Enhanced decision making
  * Optimisation of oil and gas production

A big wave of activity is currently happening in these fields, and what we see today is only the tip of the iceberg. More exciting times to come!


Thanks to Romain Faure and Francois Goupil for their guidance on these subjects.
