Natural Language Processing (NLP) is a field focused on the interaction between computers and human language. With the use of Data Science and Artificial Intelligence, NLP algorithms can extract information from texts, analyze texts, and even generate texts.

NLP dates back to the 1950s, and the field has matured drastically over the last decades. In its early days, the approach to NLP was strongly linguistic: it concentrated on language structures and explored ways for computers to understand those structures. Nowadays, the linguistics of a language are less central, thanks to the use of Big Data and modern types of Neural Networks. These Neural Networks are models that, when large enough, can capture highly complex relationships in the input. Moreover, Neural Networks are able to learn tasks such as classification, prediction, and visualization purely from examples.

Recent developments in the field of NLP are a direct result of the application of Neural Networks and Deep Learning methods. Deep Learning emerged over the last decade and has become the foundation of innovations across all fields of AI in the past five years. Guus van de Mond, partner at Squadra and founder of Machine Learning Company, explains Deep Learning with the following example:

“Deep Learning is essentially dividing a certain problem into several layers. Every layer represents a certain function, defining an abstract model. Every layer that is added can use the information from the previous layers. Thus, imagine you want to teach the algorithm to recognize a picture of a dog. In this case, the first layer could be one that recognizes shapes (circles, triangles, etcetera). The second layer could be one that can identify eyes (two oval shapes next to each other). The third layer could be one that recognizes a face, and so on. Finally, the algorithm can recognize the picture of a dog.” The same principle can be applied to text data sources such as sentences.
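To make the layered idea concrete, below is a minimal sketch of such a stack of layers, assuming Python and the PyTorch library (neither is mentioned in the example above; any deep learning framework works the same way). Each layer takes the output of the previous one and builds a slightly more abstract representation, ending in a simple "dog or not a dog" decision.

import torch
import torch.nn as nn

# Each layer builds on the previous one: early layers pick up simple patterns,
# later layers combine them into higher-level concepts (a rough analogy to the
# shapes -> eyes -> face example above).
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # layer 1: simple shapes and edges
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # layer 2: combinations of shapes (e.g. eyes)
    nn.ReLU(),
    nn.Conv2d(32, 64, kernel_size=3, padding=1),  # layer 3: larger structures (e.g. a face)
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(64, 2),                             # final decision: "dog" or "not a dog"
)

image = torch.randn(1, 3, 224, 224)  # a random stand-in for a photo
print(model(image))                  # two scores, one per class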

Recently, the world was introduced to Transformers (e.g. BERT, T5 and GPT-3): revolutionary Deep Learning models that do not have to process data sequentially (from beginning to end), but use a mechanism known as attention to analyze a large body of text simultaneously. These innovations drastically improved the models’ understanding of linguistic context and make recent models capable of outperforming earlier models in a wide variety of tasks.
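As an illustration, here is a minimal sketch of the attention mechanism (scaled dot-product attention, the form used in Transformers), again assuming Python and PyTorch; the toy sizes and random vectors are purely illustrative. Every token compares itself to every other token at once, rather than reading the text strictly from left to right.

import torch
import torch.nn.functional as F

seq_len, dim = 5, 8                      # 5 tokens, 8-dimensional vectors (toy sizes)
queries = torch.randn(seq_len, dim)      # what each token is looking for
keys = torch.randn(seq_len, dim)         # what each token offers
values = torch.randn(seq_len, dim)       # the information each token carries

scores = queries @ keys.T / dim ** 0.5   # relevance of every token to every other token
weights = F.softmax(scores, dim=-1)      # normalize into attention weights
output = weights @ values                # each token becomes a weighted mix of the whole sentence
print(weights)                           # a 5x5 matrix: who attends to whom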

An example of such a task is missing word prediction. Missing word prediction is useful because it makes it easy to create a huge dataset, simply by taking a large body of text and masking words. To create a model for a specific purpose (such as answering questions about a text), researchers then train the model further on a much smaller dataset for that specific task (a process known as fine-tuning). The AI community was stunned to see BERT outperform existing AI models on a wide range of NLP tasks!
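For a hands-on impression of missing word prediction, the sketch below uses the publicly available bert-base-uncased model via the Hugging Face transformers library (an assumption for illustration; the text above does not prescribe any particular tooling).

from transformers import pipeline

# BERT was pre-trained on exactly this task: guess the word hidden behind [MASK].
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
for prediction in fill_mask("The dog [MASK] at the mailman."):
    print(prediction["token_str"], round(prediction["score"], 3))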

But the latest revolution comes from the GPT-3 model, an extremely powerful model consisting of no fewer than 175 billion parameters. It is able to understand English prompts and can generate texts without being given a single example. Jelmer Wind, Data Scientist at Machine Learning Company, experimented with the GPT-3 model by asking it to generate a text that opposes a human political argument. Without a single example (zero-shot learning), the GPT-3 model was able to generate a coherent counter-argument to that political argument. This capability is a direct result of the model’s improved understanding of human language.
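Zero-shot prompting can be sketched as follows. GPT-3 itself is only accessible through OpenAI's API, so this illustration uses the smaller, publicly available GPT-2 model via the Hugging Face transformers library as a stand-in; the prompt is hypothetical and the output will be far less convincing than GPT-3's.

from transformers import pipeline

# The model receives only an instruction, no examples (zero-shot).
generator = pipeline("text-generation", model="gpt2")
prompt = "Write a counter-argument to the claim that cities should ban all cars:"
result = generator(prompt, max_length=100, do_sample=True)
print(result[0]["generated_text"])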

At Squadra Machine Learning Company, multiple NLP applications are already operational. Two examples are PowerText.ai and PowerEnrich.ai.

PowerText.ai is a tool that generates unique, SEO-optimized commercial product descriptions for e-commerce platforms, based on the feature data of the products (e.g. color, capacity, type, etc.). It enables wholesalers and retailers to save time and costs on copywriting product descriptions and to gain a substantial increase in revenue through better search engine rankings.

PowerEnrich.ai is a tool for e-commerce organizations that analyses various text sources and automatically extracts data elements, with the goal of enriching product data for online publishing. It saves huge amounts of manual data entry effort on large datasets. With the use of Transformer models, Guus expects to add even more creativity to the generated texts and to achieve higher accuracy in the text extraction algorithms.

Because of their enormous power, these recent innovations in NLP could also have a negative impact when applied for unethical purposes. The GPT-3 model, for example, can easily be prompted to argue for anything, no matter how unethical, in a manner so lifelike that it is virtually indistinguishable from a human. In addition, recent models can generate human-like texts that are not necessarily truthful. This is why access to models like GPT-3 is limited, and why a balance must be struck between enabling technological innovation and preventing unethical use.

The field of NLP is changing rapidly, with new developments almost daily. Squadra Machine Learning Company is always on the lookout for developments that can improve its services and increase its ability to solve clients’ challenges. Sometimes the most recent applications are used to solve challenges that could not have been solved a year before the project started; that is how quickly the field of NLP is evolving!

Squadra Machine Learning Company is an innovative Dutch company that combines knowledge of business processes, algorithm development, and data visualization to help customers with Machine Learning algorithms and the application of Artificial Intelligence solutions. It has a proven track record in applying Data Science and AI solutions for suppliers, wholesalers, and retail organizations.