Maintworld Magazine 3/2021
- maintenance & asset management

DIGITALISATION

NLP CAN BE CONSIDERED A BRANCH OF AI AND IS ALL ABOUT MAKING SENSE OF HUMAN LANGUAGE.

…paragraph by revealing all the text was generated by GPT-3. A few months ago, The Guardian generated some publicity with a similar experiment.

There are a lot of other intriguing examples making GPT-3 a real buzz in the AI community and even beyond. But why is it such a big deal for Natural Language Processing (NLP)?

NLP so far…

NLP can be considered a branch of AI and is all about making sense of human language. NLP originates from the 1950s, but the last decade brought a real revolution. We went from vectorizing words and analyzing word similarities (e.g. "man" is to "boy" what "woman" is to "girl") with word2vec in 2013 to the Transformer model proposed in the "Attention Is All You Need" paper released in 2017. Transformers leverage attention mechanisms to gather information about the context of a given word and encode that information in the vector representing the word (e.g. I have a "date" tonight vs. What "date" is it today?). Transformers are the basis for state-of-the-art language models such as BERT and GPT.
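To make the analogy arithmetic tangible, here is a small sketch in Python using the gensim library and a set of small pre-trained GloVe vectors; the library and the vector set are my choice for a lightweight demo, not part of the original 2013 setup.

```python
# Minimal sketch of word-analogy arithmetic with pre-trained embeddings.
# gensim and the small "glove-wiki-gigaword-50" vectors are assumptions chosen
# so the demo downloads quickly; word2vec vectors behave the same way.
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")

# "man" is to "boy" what "woman" is to "girl":
# vector("boy") - vector("man") + vector("woman") should land near "girl".
print(vectors.most_similar(positive=["boy", "woman"], negative=["man"], topn=3))
```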

"Attention please"

Language models are the new kids on the block in NLP. They basically predict how likely one word will appear in a text, given all other words in that text (e.g. "It is [MASK] today. Let's go to the beach"). Knowing the conditional probability of words is the basis for a variety of downstream NLP tasks such as content creation, language translation, auto-completion, question answering and text classification. Most language models are pre-trained on a large dataset (such as Wikipedia) and afterwards fine-tuned to perform a specific NLP task on a smaller dataset. This process is called transfer learning.
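To see what that masked-word prediction looks like in code, here is a minimal sketch using the Hugging Face transformers library and a pre-trained BERT model; both are my choice of tooling rather than anything prescribed here.

```python
# Minimal sketch: asking a pre-trained masked language model to fill in [MASK].
# The transformers library and bert-base-uncased are illustrative assumptions.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Each prediction carries a candidate word and its conditional probability.
for prediction in fill_mask("It is [MASK] today. Let's go to the beach."):
    print(f"{prediction['token_str']:>10}  p={prediction['score']:.3f}")
```

The pre-trained model is simply queried here; fine-tuning it on a smaller, task-specific dataset is what the article calls transfer learning.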

What is GPT-3?

GPT-3 is the third version of the Generative Pre-trained Transformer models developed by OpenAI, the AI specialist co-founded by Elon Musk, although he is no longer on board. Even though there are no major breakthroughs in terms of architecture, it is considered the most powerful language model ever. Why? Because of its size!

The model has a stunning 175 billion parameters and was pre-trained on a corpus of nearly half a trillion words, mainly sourced from the internet. In fact, the model is so large that no fine-tuning at all is required. It knows so much about language that it can learn NLP tasks it has never encountered before from just a few examples. This is called few-shot learning. The idea of such a general model is very tempting because it opens the path to democratizing AI and making NLP accessible to everyone.
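What few-shot learning means in practice is easiest to see in a prompt. The sketch below builds a prompt from a handful of worked examples and sends it to the completion endpoint the way the 2021 beta (pre-1.0 openai Python client) exposed it; the engine name and the maintenance examples are illustrative assumptions, not an official recipe.

```python
# Sketch of few-shot prompting: the task is defined only by the examples in the
# prompt, with no fine-tuning. Assumes the pre-1.0 openai client used during
# the 2021 beta; the failure-mode examples are invented for illustration.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

prompt = """Failure observation: Bearing temperature rising steadily.
Failure mode: Bearing wear

Failure observation: Pump delivers reduced flow and makes a rattling noise.
Failure mode: Cavitation

Failure observation: Seal is leaking along the shaft.
Failure mode: Mechanical seal failure

Failure observation: Motor trips on overload shortly after start-up.
Failure mode:"""

response = openai.Completion.create(
    engine="davinci",   # GPT-3 base engine in the beta API
    prompt=prompt,
    max_tokens=10,
    temperature=0.0,
    stop=["\n"],
)
print(response["choices"][0]["text"].strip())
```

The model is expected to continue the pattern and name a failure mode, even though it was never explicitly trained on this task.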


Let's get practical

The idea is tempting. Unfortunately, only a limited number of people have access to the beta API for GPT-3 for now, but subscribing to the premium version of AI Dungeon gives you a workaround that lets you catch a glimpse of what the model really can do.

Let us have a look and see how it could support maintenance professionals and operations managers in their day-to-day jobs. I gave the model a little context for warming up and asked a question about pump failure modes. The text in bold (the answer to the question below) is fully generated by the AI system.

Centrifugal pumps are one of the world's most widely used type of pump, having an extensive range of applications, from food processing to water or sewage transportation. Problems that arise within these machines decrease the flow of the fluid within the pipelines, thus interrupting the production and transport of the fluid to its destination within the process.

What are the most common failure modes that are found in centrifugal pumps in the sewage industry?

A: The main cause for failures is a lack of lubrication. If there is no oil or grease on the gears, then it will not move smoothly and the machine may stop working altogether.
