By Alex Woodie
Natural language processing (NLP) has been one of the hottest sectors in AI over the past two years. Will the string of big data breakthroughs continue into 2022? We checked in with industry experts to find out.
There’s been a veritable arms race to develop large transformer models over the past couple of years. It started in 2020 with OpenAI’s GPT-3 with 175 billion parameters. Then Microsoft and Nvidia teamed up on MT-NLG (Megatron-Turing Natural Language Generation), which sported 530 billion parameters. Finally in 2021, Google gave us its Switch Transformer with 1.6 trillion parameters.
Don’t expect the race to build ever larger transformer models to slow down in 2022, says Natalia Vassilieva, director of product for machine learning at AI hardware maker Cerebras.
“These larger models promise better results in a variety of natural language tasks with arguably one of the most interesting being an ability to generate an answer to any posed question,” she writes. “However, these giant models are usually trained on very large corpora of publicly available generic texts crawled from all over the Internet. So the answers generated by these models will rely on that public data.”
What’s more, there’s a gap between what these models trained on generic data can do versus what a model trained on a company’s domain-specific data can do, Vassilieva says. She predicts we’ll start to close that gap with large transformer models this year.
“I expect that an ability to continuously pre-train (or fine-tune) these gigantic generic language models with proprietary domain-specific data will be of high interest and, once trained and deployed, will deliver better insights to domain scientists,” she writes. “In 2022 we will need to figure out how to do that efficiently, and also how to reduce the cost of running predictions with these humongous models once they are trained and tuned. Pruning and distilling the models might be a way to do so, as well as relying on special-purpose hardware.”
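The pruning Vassilieva mentions typically means zeroing out a model's smallest-magnitude weights so the network can be stored and served more cheaply. As a rough illustration (not Cerebras's method, just the common unstructured magnitude-pruning idea), here is a minimal NumPy sketch:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights
    (unstructured magnitude pruning)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = magnitude_prune(w, 0.5)  # at least half the weights are now zero
```

In practice, pruned production models are usually fine-tuned again afterward to recover accuracy, and distillation instead trains a small "student" model to mimic the large one's outputs.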
In 2022, we’ll get the first $100-million language model, predict a trio of tech execs, including Paul Barba and Jeff Catlin, the chief scientist and CEO of NLP solution provider Lexalytics, respectively, and Mehul Nagrani, the GM of AI product and technology for InMoment, which owns Lexalytics.
“The race to train the largest possible language model continues unabated, and whether GPT-4 weighs in at a particularly heavy parameter count or another of the tech giants reaches for this particular crown, an organization will announce a transformer-based deep network that cost at least $100M to train in 2022,” the execs write. “Each generation of language models has shown improvements on standard tasks and occasional new behaviors, but with inference costs also ballooning with model size, the commercial use case will be limited.”
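The "ballooning inference cost" follows directly from parameter count: a common rule of thumb from the scaling-law literature is that a dense transformer spends roughly two floating-point operations per parameter per generated token (one multiply, one add). A back-of-the-envelope sketch, with the caveat that this is an approximation and does not apply to sparse models:

```python
def inference_flops_per_token(n_params):
    """Rough rule of thumb for dense transformers:
    ~2 FLOPs per parameter per generated token."""
    return 2 * n_params

gpt3 = inference_flops_per_token(175e9)    # GPT-3: ~3.5e11 FLOPs/token
mtnlg = inference_flops_per_token(530e9)   # MT-NLG: ~1.06e12 FLOPs/token

# Caveat: mixture-of-experts models such as Google's Switch Transformer
# activate only a small fraction of their 1.6T parameters per token,
# so total parameter count overstates their per-token inference cost.
```

By this estimate, every token MT-NLG generates costs roughly three times as much compute as a GPT-3 token, which is why serving these models commercially is so expensive.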
NLP has come to the forefront as one of the most visible manifestations of our progress in AI. In 2022, the new capabilities will become even more widespread, says Michael Krause, senior manager of AI solutions at enterprise AI software provider Beyond Limits.
“In general, major breakthroughs in AI technologies are hard to time. However, 2022 will be an exciting year, [as] a potential new language model, GPT-4, brings with it hopes to dramatically improve natural language AI,” Krause writes. “Auto-generated articles that are indistinguishable from human writing, improved real-time language translation, and meta-learning capabilities are just a few ideas of what may come next. Taking this kind of human-like processing power and applying this to existing technologies such as the cloud will elevate the advancement of tech not just in one sector, but within every single industry.”
As AI models get bigger, they need more data to train them. One promising new source of materiel for the AI war is synthetic data, which has seen increased adoption over the past few years as companies ramp up AI initiatives. Wilson Pang, the CTO of AI solution provider Appen, sees synthetic data helping to create new NLP use cases in 2022.
“As early implementations of generative AI technology let companies do things like identify marketing content with a higher success rate and leverage highly nuanced NLP capabilities to diagnose health cases through text and image data, we may see more use cases emerge over the next year as experimentation and adoption pick up,” Pang says.
As ambient computing begins to surround us, human-computer interactions will evolve into something new, says Lenovo VP Jerry Paradise.
“The IoT will continue to mature as device manufacturers refine user inputs. NLP will change the user experience as we know it. Multiple devices will respond, in concert, with one voice query,” Paradise says. “As the user interface changes, our device interactions will automatically become both more natural and more secure. And, as adoption grows, we will start to see more ‘connected’ endpoints from the connected car to the connected city and beyond.”
Large language models, such as BERT, are behind many of the conversational interfaces and chatbots that have proliferated over the past few years. In 2022, the language models will begin to give information workers some interesting new abilities, says Natalie Monbiot, head of strategy for Hour One, a provider of synthetic characters based on real-life people.
“2022 will see the growth of a new hybrid workforce in which human employees share their workload with digital employees. They will offload repetitive or routine tasks to machines that can perform them just as well, and in some cases better,” Monbiot writes. “What’s more, employees will have their own digital avatars, with superhuman skills, such as the ability to speak any language. This will serve to break down geographical and cultural barriers and enable a whole new era of frictionless communications.”
In 2022, the machines will begin to understand not just what we said, but how we said it, which will help to eliminate bias, says Scott Stephenson, CEO and co-founder of Deepgram, a provider of AI-based automated speech recognition software.
“Voice is the most natural form of communication. However, machines have historically been locked out of listening and analyzing conversations,” Stephenson writes. “In 2022, machines will be able to do more than just describe which words were said, but how they were said. This will enable users to truly understand what their customers want and empathize with their needs. Reducing bias in speech infrastructure will also be a top priority for vendors so that their customers can more accurately understand the voices of users from various backgrounds, genders, and languages.”
Teun Schutte, managing consultant, digital strategy for healthcare at digital consultancy Mobiquity, says use cases in voice technology will increase across healthcare and life sciences.
“As voice technology has improved, we should expect to see it applied in more ways across the full journey of both patients and healthcare professionals,” Schutte says. “One major appeal on the patient side is accessibility. Literacy, specifically the average reading and writing levels of the entire population, is sometimes disregarded in traditional methods of collecting or recording important information. This can be seen in the simple exchange of information during appointments or in clinical trials where people are asked to record their own experience and data.”
We’ve come quite far in AI adoption over the past two years, particularly with automated chatbots. In 2022, we’ll have a little bit of a pullback on that front, predicts Jeff Gallino, CTO of CallMiner, a provider of software for analyzing omnichannel customer interactions.
“AI has long been positioned as the solution to all of our problems, especially for customer experience,” Gallino says. “2022 will be the year that the technology will lose some of its shine. Some organizations have already realized that AI solutions, like chatbots, do not deliver on CX the way they were sold, often frustrating customers more than they help. More organizations will become tired of how AI is positioned to them in the year ahead.
“To combat this, AI companies will shift how they sell,” Gallino continues. “Instead of positioning AI as a silver bullet, it will be portrayed for what it truly is: a supporting tool to help humans, like CX agents, do their jobs more effectively and help organizations uncover valuable customer insights. If handled properly, these insights have the potential to move past commodification to improve overall business outcomes. The more AI companies sell solutions as being able to generate data-driven insights, as well as embedding these findings and closing the feedback loop, the more they’ll win over buyers.”
The maturation of NLP and ML tech will help regular business users operate like more highly skilled data analysts, predicts Raj Gossain, chief product officer at Alation, a provider of data catalogs and governance solutions.
“Organizations with integrated data strategies will provide their employees with the tools that allow them to gain data analyst ‘superpowers’ by tapping into vast amounts of data and drive business results,” Gossain writes. “This improves the productivity of business users and eliminates bottlenecks caused by reliance on data analysts to find and analyze trusted data within their organizations, a reliance that makes the process more prolonged and arduous than necessary.”
Read more here: www.datanami.com/feed/