Getting machines to understand languages is the next big frontier in AI


Rob Toews’ article “Language is the next great frontier in AI”, published on Forbes.com in February of this year, caught my eye because of my interest in this topic. My doctoral research focused on automatic natural language processing and text mining. Building machines that can understand language has long been a central goal in the field of artificial intelligence. If certain predictions had come true, we would now have machines fully capable of understanding human language. But the goal has proved “extremely elusive,” Toews says.

Yet there have been major breakthroughs in language technologies in recent years. For example, I was fascinated by the free Google Translate app, which lets you point your phone’s camera at foreign-language text, such as Chinese or Japanese, and get instant translations.

“Technology is now at a critical inflection point, poised to move from academic research to widespread adoption in the real world. In doing so, large swaths of the business world and our daily lives will be transformed. Given the ubiquity of language, few areas of technology will have a greater impact on society in the years to come,” adds Toews.

What are language technologies?

Language technologies (LT) are computer applications that help us do useful things with human language, whether spoken or written; hence they are also known as Human Language Technologies (HLT). Text-to-speech converters, speech-to-text converters, text classification programs and machine translation systems are just a few examples of LTs.

LTs deal with the computational processing of human language, whether in spoken or written form, to facilitate both our interaction with machines and the processing of large amounts of textual information.

Why language technologies are important

Why are advances in language technologies (LT) important? The answer is simple. Without language, we cannot reason abstractly, develop complex ideas or communicate them to others. Civilization as we know it simply would not have evolved without language.

LTs are a key technology that will drive advances in AI and computing in general in the near future. A Harvard Business Review article published in September 2020, entitled “The next big breakthrough in AI will be around language”, says this: “The 2010s produced breakthroughs in vision-driven technologies, from accurate image searches on the web to computer vision systems for analyzing medical images or detecting defective parts in manufacturing and assembly, as we have described in detail in our book and our research. GPT-3, developed by OpenAI, indicates that the 2020s will be marked by major advances in language-based AI tasks.”

“Imagine being able to talk to your car and have it react intelligently, giving detailed route advice or summarizing the latest news you just missed on the radio. Or being able to speak or type queries to your web search engine in plain language, as you would to a person, and have it return just the document you’re looking for, perhaps in a summarized form for easy reading, translated from another language and with the key points for your needs highlighted. Some of these capabilities are already here, and more are on the horizon,” according to the Center for Language Technology, Macquarie University, Australia.

With advances in machine learning and robotics, people predict that household service robots may one day perform chores for us. Advances in LTs could allow seamless two-way communication between humans and their robot servants, much like communication between two people.

Recent Breakthroughs in LTs

The invention of the transformer by a group of Google researchers at the end of 2017 is considered a major breakthrough. It is a novel neural network architecture that has unlocked vast new possibilities in AI. “The great innovation of Transformers is to make language processing parallelized, i.e. all tokens in a given body of text are parsed at once rather than in sequence,” says the Toews article mentioned above. Many innovations have since been built on top of Google’s original architecture, including Facebook’s 2019 RoBERTa model.
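
For readers who want a concrete feel for this, here is a minimal sketch of the idea, assuming the open-source Hugging Face transformers library and the publicly available roberta-base checkpoint (my choice for illustration, not something the Toews article uses). It shows a transformer encoder producing representations for every token of a sentence in a single forward pass rather than word by word.

```python
# Minimal sketch: a transformer processes all tokens of a sentence at once.
# Assumes the Hugging Face "transformers" library and the "roberta-base" model.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModel.from_pretrained("roberta-base")

sentence = "Language is the next great frontier in AI."
inputs = tokenizer(sentence, return_tensors="pt")  # all tokens prepared together

outputs = model(**inputs)                 # one forward pass over the whole sequence
print(outputs.last_hidden_state.shape)    # (1, number_of_tokens, hidden_size)
```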

Some common applications of LTs

Text-to-speech systems

As already pointed out, text-to-speech systems are quite well developed for major languages like English. A great app that I use often is a text-to-speech app that reads me e-books I have downloaded in PDF format. The app simply opens the e-book saved on my phone and reads it to me while I’m driving. It is a free application, but the quality is quite good; I don’t really feel like a robot is reading to me. The app I’m referring to is @Voice Aloud Reader (TTS Reader) from Hyperionics Technology. You can download it for free from the Play Store and give it a try.
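
For the programmatically inclined, a rough sketch of the same idea is shown below, assuming the open-source pyttsx3 Python library (not the @Voice Aloud Reader app itself, which works quite differently): a few lines of code are enough to have the speech engine on your device read a sentence aloud.

```python
# Minimal text-to-speech sketch, assuming the pyttsx3 library is installed.
import pyttsx3

engine = pyttsx3.init()              # uses the speech engine available on the device
engine.setProperty("rate", 160)      # speaking speed in words per minute
engine.say("Language technologies help machines read text aloud.")
engine.runAndWait()                  # block until the speech has finished
```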

Speech-to-text systems

These systems capture what you say and convert it to text. This is already possible on most smartphone keyboards, and also when searching on Google with your smartphone. It essentially eliminates the need for a keyboard.
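
As a rough illustration of how simple this has become for developers, here is a minimal sketch assuming the open-source SpeechRecognition Python library and a working microphone (the library name and its free Google web API backend are my choices for illustration, not something mentioned above).

```python
# Minimal speech-to-text sketch, assuming the SpeechRecognition library
# (imported as speech_recognition) and a microphone are available.
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    print("Say something...")
    audio = recognizer.listen(source)            # record audio from the microphone

try:
    text = recognizer.recognize_google(audio)    # send audio to Google's free web API
    print("You said:", text)
except sr.UnknownValueError:
    print("Speech was not understood.")
```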

Chatbots

Chatbots are used by many businesses to answer common customer questions. In 2019, the media widely reported that a Chinese software engineer designed a chatbot to chat with his girlfriend while he was busy at work.
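
Commercial chatbots are far more sophisticated, but a toy sketch gives the flavour of the simplest approach: match keywords in a customer’s question against canned answers. The FAQ entries and the reply function below are invented purely for illustration.

```python
# Toy illustration of the simplest kind of customer-service chatbot:
# keyword matching against canned answers (hypothetical data).
FAQ = {
    "hours": "We are open from 9 am to 5 pm, Monday to Friday.",
    "price": "Pricing details are listed on our website.",
    "refund": "Refunds are processed within 7 working days.",
}

def reply(question: str) -> str:
    q = question.lower()
    for keyword, answer in FAQ.items():
        if keyword in q:
            return answer
    return "Sorry, I will pass your question to a human agent."

print(reply("What are your opening hours?"))
```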

Spoken language dialogue systems

These systems allow you to talk to a computer over the telephone. “These can be used to call on the phone and talk to a machine to buy or sell stocks and shares, or to get directions from one city to another,” sources say.

Automatic code generation

OpenAI announced Codex, a transformer-based model that can write computer code surprisingly well. Human users can give it a plain English description of a command or function, and Codex turns that description into working computer code.
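
To give a flavour of what this looks like, here is a hand-written illustration of the kind of exchange involved: a plain-English prompt and the sort of Python function a Codex-style model might produce. This is not actual Codex output, just an example of the task.

```python
# Illustration only: a plausible hand-written answer to the prompt
#   "Write a function that returns the n-th Fibonacci number."
# (Not actual Codex output.)
def fibonacci(n: int) -> int:
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fibonacci(10))  # 55
```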

Automatic translation

Machine translation technology takes a document in one language and translates it into another language. The best example of this is Google Translate; I’m sure most of you have already tried it. Machine translation is still not perfect, but it is improving. The day may not be far off when the need to learn another language is eliminated entirely!
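
For a sense of how accessible this technology has become, here is a minimal sketch assuming the Hugging Face transformers library and the publicly available Helsinki-NLP/opus-mt-en-fr English-to-French model (my choices for illustration; Google Translate itself does not expose a free Python API like this).

```python
# Minimal machine-translation sketch, assuming the Hugging Face "transformers"
# library and the public Helsinki-NLP/opus-mt-en-fr model.
from transformers import pipeline

translator = pipeline("translation_en_to_fr", model="Helsinki-NLP/opus-mt-en-fr")
result = translator("Machine translation is still not perfect, but it is improving.")
print(result[0]["translation_text"])
```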

What about LTs for Dzongkha?

We need to conduct NLP research for Dzongkha and develop relevant language technologies that can promote its use and aid in its preservation. Unfortunately, major research projects do not address minority languages such as Dzongkha because of the small potential market for products or services in these languages.
