The Future of Coding is Conversation

Published on 17 NOV 2022

The way humans interact with computers is poised for a profound change.

For decades, people have been adapting to machines by typing code on a keyboard. But in the coming years computers will be the ones getting used to us. “We'll see computers adapting back to us, into the natural way that we're used to doing things,” Renen Hallak, founder and CEO of VAST Data, said last month at Goldman Sachs’ Private Innovative Company Conference. “They will be able to understand the natural world in a way that wasn't possible before.”

Developments in language understanding mean everyone from scientists to businesspeople can increasingly use AI tools without learning coding languages like Python or C++. “The progress that we have seen in natural language understanding dwarfs anything that happened in the previous 40 years,” said Vijay Saraswat, head of artificial intelligence research and development at Goldman Sachs.

As an example of what’s already available, Saraswat says he has used off-the-shelf software to pull information from documents. He didn’t have to use a large database to teach the model to understand the data, and it gave back answers in plain English. “Did no training on it,” he said at our conference in Las Vegas. “None whatsoever.”

Many companies are still in the early days of realizing the full potential of machine learning systems, says Justin Borgman, founder and CEO of Starburst Data. Recommendation engines — if a customer buys a pair of shoes, for example, the algorithm will suggest a pair of trousers — are popular. But firms still struggle to pull together the data those models need from different parts of the organization.
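The shoes-and-trousers example describes the simplest form of recommendation engine: suggest whatever is most often bought alongside an item. A minimal sketch of that co-occurrence approach, using made-up basket data (the item names and counts here are purely illustrative):

```python
from collections import Counter, defaultdict

# Hypothetical purchase history: each inner list is one customer's basket.
baskets = [
    ["shoes", "trousers"],
    ["shoes", "trousers", "belt"],
    ["shoes", "socks"],
    ["trousers", "belt"],
]

# Count how often each pair of items appears in the same basket.
co_counts = defaultdict(Counter)
for basket in baskets:
    for item in basket:
        for other in basket:
            if other != item:
                co_counts[item][other] += 1

def recommend(item, k=1):
    """Return the k items most often bought alongside `item`."""
    return [other for other, _ in co_counts[item].most_common(k)]

print(recommend("shoes"))  # → ['trousers']
```

Production systems are far more sophisticated, but even this toy version illustrates Borgman's point: it only works if purchase records from every sales channel land in one place the model can read.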

One way engineers are getting around those data silos is through an emerging concept called a “data mesh,” Borgman says. The architecture decentralizes data creation and ownership and makes the data a product that others can consume. He says his firm is seeing a lot of enterprises using a mesh across different data sources for their machine learning models.
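The core idea — each domain team owns its data and publishes it as a product that consumers discover through a shared catalog — can be sketched in a few lines. This is a conceptual illustration only; the class and domain names are invented, and real data meshes sit on top of query engines and governance tooling rather than in-memory objects:

```python
class DataProduct:
    """A domain-owned dataset exposed through a stable read interface."""
    def __init__(self, domain, name, rows):
        self.domain = domain   # owning team, e.g. "sales"
        self.name = name
        self._rows = rows      # stand-in for the team's actual storage

    def read(self):
        # Consumers go through the product API, never the raw store.
        return list(self._rows)

class Catalog:
    """Shared catalog where domain teams register their products."""
    def __init__(self):
        self._products = {}

    def register(self, product):
        self._products[(product.domain, product.name)] = product

    def get(self, domain, name):
        return self._products[(domain, name)]

catalog = Catalog()
catalog.register(DataProduct("sales", "orders", [{"sku": "shoe-1", "qty": 2}]))
catalog.register(DataProduct("web", "clicks", [{"sku": "shoe-1", "page": "/shoes"}]))

# A model-training pipeline pulls from several domains without knowing
# where or how each team physically stores its data.
orders = catalog.get("sales", "orders").read()
clicks = catalog.get("web", "clicks").read()
```

The decentralization is the point: ownership stays with the teams that create the data, while the catalog gives machine learning pipelines one consistent way to find and consume it.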

Hallak says AI has reached the point that old, seemingly unworkable, ideas are no longer farfetched. “In the ‘70s and ‘80s, neural nets were a four-letter word, at least in academia,” he said, referring to a field of machine learning that’s inspired by signaling between the brain’s neurons. “Over the last few years, with the abundance of both compute power and information that we can feed into these neural nets and slight tweaks into their structure, we're seeing that they actually do work.”

That availability of data is accelerating as we interact with computers more. “The last few years, even during the pandemic, has accelerated the digitization of life, of all aspects of life,” Borgman said. “Consumer behavior has leaped forward decades probably in terms of the pace with which we consume digital products, whether that's delivery services and various games that we play and so forth.”

Saraswat says fundamental breakthroughs are happening because of a few powerful research groups that are driving them, and they’re building on many years of research. While there’s still a lot of work to do and the field is still close to its beginning, he thinks AI is going through a golden age where breakthroughs are being applied outside the lab much more quickly than in the past.

George Lee, co-head of applied innovation at Goldman Sachs, added that an emergent quality of artificial intelligence and machine learning is the ability of these systems to develop code on their own. This hints at the ability for humans to operate in natural language and rely on AIs to render their ideas and designs into functional software.

AI has advanced so much that it’s growing faster than anything that came before it in computer science, including the Internet, Hallak says. “Computers will understand and be able to thrive in the analog world that we are used to,” he says.
