
Talking the talk: Why NLP is the next great AI opportunity

(Image credit: Computerizer / Pixabay)

New technologies that are revolutionising the world as we know it are being developed all the time. At one time, we relied on books and the experience of those around us for knowledge. Then came the internet, giving us an instant answer to any question we could think of. Now, with the development and success of personal assistant applications such as Alexa, Siri, Cortana, and OK Google, we don’t even have to move from our chair to find out what the weather is today or to learn historical facts. Personal assistant applications are enabled by Artificial Intelligence (AI) and Natural Language Processing (NLP), two technical innovations that are playing a significant role in bringing technology into our everyday lives.

NLP is the driving force behind language-focused applications such as Google Translate, word processors like Microsoft Word, and Grammarly, which uses NLP to check the grammatical accuracy of texts. But, as with all new developments, there are usually teething troubles, and in the case of NLP, there are still some challenges that must be overcome before it can reach its true potential.

Foundations first

In simple terms, NLP is a combination of AI, linguistics, computer science, and data engineering, allowing computers to understand human languages by applying algorithms to identify and extract natural language data. NLP manages the programming of computers to process and analyse this language data through syntactic and semantic analysis.

If we take a look at a common interaction between humans and machines, it would start with the human asking the machine a question. The machine would respond by capturing the audio and converting it to text. The text data is then processed, converted back to audio and played to the human, thus answering their question. This is a very simple, straightforward example, and if every conversation went just like this, there would be little room for error. However, just as moments in a human conversation are sometimes misunderstood, perhaps due to regional dialects or sarcasm, machines can also struggle to understand the context of text unless properly prepared and trained.
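The loop above can be sketched in a few lines of Python. This is purely illustrative: the speech-to-text and text-to-speech steps are stand-in functions (a real assistant would call a speech recogniser and a synthesiser), and the "processing" step is reduced to a lookup in a hypothetical FAQ table.

```python
# Toy sketch of the question-answering loop described above.
# The audio steps are stubbed: "audio" here is already a transcript.

FAQ = {
    "what is the weather today": "It is sunny and 21 degrees.",
    "who wrote hamlet": "William Shakespeare.",
}

def speech_to_text(audio):
    # Stand-in for a real speech recogniser: normalise the transcript.
    return audio.lower().strip(" ?")

def process(question_text):
    # The NLP step, reduced to its simplest possible form: a lookup.
    return FAQ.get(question_text, "Sorry, I don't know that yet.")

def text_to_speech(text):
    # Stand-in for a synthesiser: return the text to be "played".
    return text

def assistant(audio):
    return text_to_speech(process(speech_to_text(audio)))
```

Every real difficulty in NLP lives inside the `process` step, which here is trivially simple; the rest of the article is about what it takes to make that step genuinely understand language.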

Syntactic analysis is used to assess how natural language aligns with grammatical rules, ensuring sentences make grammatical sense. To do this, computer algorithms apply grammatical rules to a group of words and derive structure from them. The other key technique, semantic analysis, refers to the meaning that is conveyed by the text. This is slightly more challenging than syntactic analysis, as it requires the application of computer algorithms to understand the meaning and interpretation of words and how sentences are structured. Named entity recognition, word sense disambiguation, and natural language generation are three common techniques used in semantic analysis.
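To make the syntactic side concrete, here is a deliberately tiny rule-based sketch: a hand-written lexicon maps words to part-of-speech tags, and a sentence is accepted only if its tag sequence matches one grammar pattern. Real systems use statistical or neural taggers and far richer grammars, but the principle — tag the words, then check the sequence against rules — is the same.

```python
# Toy rule-based syntactic check: tag words, then test the tag
# sequence against a single grammar pattern (DET NOUN VERB DET NOUN).

LEXICON = {
    "the": "DET", "a": "DET",
    "dog": "NOUN", "cat": "NOUN", "ball": "NOUN",
    "chased": "VERB", "sees": "VERB",
}

def tag(sentence):
    # Map each word to its part-of-speech tag ("UNK" if unknown).
    return [(w, LEXICON.get(w, "UNK")) for w in sentence.lower().split()]

def is_grammatical(sentence):
    # Accept only the pattern DET NOUN VERB DET NOUN.
    tags = [t for _, t in tag(sentence)]
    return tags == ["DET", "NOUN", "VERB", "DET", "NOUN"]
```

Note what this sketch cannot do: "the cat chased the dog" and "the dog chased the cat" are both grammatical but mean opposite things — distinguishing them is the job of semantic analysis.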

Common applications

NLP has a wide variety of uses, playing a role in any language-related application, whether that is focused on language translation or editing grammar. NLP is also found within sentiment analysis, which involves classifying the emotion behind text content, and information extraction, a technique that extracts structured data from text.
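Sentiment analysis can be illustrated with the simplest possible approach, a lexicon-based classifier: count positive and negative words and compare. The word lists below are invented for the example; production systems use trained models that handle negation, sarcasm, and context, which this sketch does not.

```python
# Minimal lexicon-based sentiment classifier (illustrative only).
# Score = (# positive words) - (# negative words).

POSITIVE = {"great", "love", "excellent", "happy", "good"}
NEGATIVE = {"terrible", "hate", "awful", "bad", "slow"}

def sentiment(text):
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

A sentence like "I don't love it" would be scored as positive here, which is exactly the kind of failure that motivates the context-aware techniques discussed later.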

We can see the effect of NLP in HR, advertising, healthcare and retail use-cases, to name just a few examples. NLP has brought greater efficiency for HR professionals, who can now speed up the search for candidates by filtering out relevant resumes and creating job descriptions that are unbiased and gender-neutral. In advertising, professionals can identify new audiences potentially interested in their products by analysing their digital footprint. NLP does this by analysing social media, emails, search keywords, and browsing behaviour.

As the human population grows and lives longer due to medical developments, pressures are being placed on healthcare services around the globe. NLP enables virtual nurse assistants to interview and assess patients and can also act as a bridge between doctors and patients with a language barrier, relieving pressure.

If you are a regular online shopper, you will have noticed the role NLP is playing in the retail environment. Already reaping the benefits of technological innovations, retail organisations are placing AI-driven chatbots on their websites, and these have become a commonplace tool that improves the customer service experience. Whilst browsing, customers can ask any questions or request additional information that could affect their shopping decisions. By getting information to the customer quickly and efficiently, retailers are ensuring customer satisfaction whilst reducing their operating costs.

A work in progress

NLP is increasingly being applied to more and more challenges. The benefits it brings allow for greater efficiency, saving time and resources, and improving service and satisfaction. But as with many technological innovations, we are not at the end of the journey yet and there are still challenges that we must overcome in order to see the full impact of NLP.

One issue facing NLP is the building of a vocabulary. Leveraging part-of-speech (POS) tagging tools and dependency graphs can generate an authoritative vocabulary that the machine interprets in a way comparable to human understanding. This is key to the continued success of language-based applications, as we need to feel we are speaking to another human, not a machine. Following on from this, extracting semantic meaning is another obstacle that developers are working to smooth out. The challenge here is ensuring that the machine understands the semantics of every single word within the context of the text, to guarantee its response is accurate.
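The first step of vocabulary building — before any POS tags or dependency links are layered on — is usually just counting which words a corpus actually uses. A minimal sketch with the Python standard library:

```python
# Sketch of the vocabulary-building step: count word frequencies
# across a corpus and keep the most common terms. Real pipelines
# then attach POS tags and dependency relations to these entries.
from collections import Counter

def build_vocabulary(documents, max_size=10):
    counts = Counter()
    for doc in documents:
        counts.update(doc.lower().split())
    return [word for word, _ in counts.most_common(max_size)]
```

Capping the vocabulary size (`max_size`) is a common practical choice: rare words add noise and memory cost, so most systems keep only the top-N terms and map everything else to an "unknown" token.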

Talk of a skills gap is frequent throughout the technology industry, and with the rapid advancement of innovations, it can be difficult to recruit or develop the right talent for your business. The shortage of individuals skilled in NLP poses an issue to its continued development. Developers and IT teams need to be able to anticipate, understand and correctly interpret the particular challenges of language processing in order to overcome them and push NLP forward.

Looking ahead to the future

Whilst there are challenges that must be overcome in order to drive the development and benefits of NLP further, there are options available that can transform the way that it works. It is already leaving its mark across different sectors, verticals, and areas through a wide range of applications. NLP is enabling spell check, voice text messaging, and spam filters, as well as personal assistant applications and chatbots, helping to automate, streamline, and impress, all whilst reducing operational costs. However, it is the very core idea of NLP that makes continued development difficult. Human language is not always simple and easy to understand, and the rules that govern it can be tough for computers to grasp.

The large and growing amounts of language and data that NLP has the ability to process must be organised in an effective, integrated way as this will significantly impact the future performance of NLP. A successful way to ensure a data-driven approach is by implementing integration technologies that allow for the effective movement and integration of data. This ensures maximum performance and use, but can also have a positive impact on business users and productivity of workforces. Self-service integration platforms allow teams with diverse skill sets to drive innovation and results, and in doing so, expand their own capabilities and help to bridge the organisational skills gap.

Once the data is where it needs to be, NLP will be able to deliver on its full promise, and businesses, governments, and society as a whole will reap the benefits.

Diby Malakar, VP of Product Management, SnapLogic

Diby Malakar is VP of Product Management at SnapLogic and is responsible for driving product direction and strategy, with a focus on iPaaS, Big Data, AI and Machine Learning. He has over 25 years of experience in the industry and previously held positions at Oracle, Informatica and Cloud9 Analytics.