AI Smarts Have a Big Price Tag

Calvin Qi works for a search startup called Glean and would like to use the latest artificial intelligence algorithms to improve his company's products.

Glean provides tools for searching through applications such as Gmail, Slack, and Salesforce. Qi says new AI techniques for parsing language would help Glean's customers find the right files and conversations much faster.

However, training such state-of-the-art AI algorithms can cost millions of dollars. That's why Glean uses smaller, less capable AI models that cannot extract as much meaning from text.

Getting the same level of results as companies like Google and Amazon is "difficult for small places with low budgets," Qi says. The most powerful AI models, he adds, are "out of the question."

AI has produced exciting breakthroughs in the past decade: programs that can beat humans at complex games, steer cars through city streets under certain conditions, respond to spoken commands, and write coherent text from a short prompt. Writing in particular relies on recent advances in computers' ability to parse and manipulate language.

These advances are largely the result of giving the algorithms more text to learn from and more chips with which to digest it. And that costs money.

Consider OpenAI's language model GPT-3, a large, mathematically simulated neural network that was fed huge amounts of text scraped from the web. GPT-3 finds statistical patterns that let it predict, with impressive coherence, which words should follow others. Out of the box, GPT-3 is significantly better than previous AI models at tasks such as answering questions, summarizing text, and fixing grammatical errors. By one measure, it is 1,000 times more capable than its predecessor, GPT-2. But training GPT-3 cost nearly $5 million, according to some estimates.
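For readers curious what "predicting which words should follow others" looks like in practice, here is a minimal sketch using GPT-2, the much smaller, openly downloadable predecessor mentioned above, via the Hugging Face transformers library. The prompt is an arbitrary illustrative example; GPT-3 itself is not freely downloadable.

```python
# A minimal sketch of next-word prediction with GPT-2, the small, openly
# available predecessor of GPT-3, using the Hugging Face "transformers"
# library. The prompt below is an arbitrary illustrative example.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "Training a state-of-the-art language model can cost"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits   # a score for every vocabulary token at each position

next_token_scores = logits[0, -1]     # scores for the token that would come next
top = torch.topk(next_token_scores, k=5)
for score, token_id in zip(top.values, top.indices):
    # Print the five most likely next tokens and their raw scores
    print(repr(tokenizer.decode(int(token_id))), float(score))
```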

"If GPT-3 were accessible and cheap, it would completely supercharge our search engine," Qi says. "It's really, really powerful."

The rapidly rising cost of training advanced AI is also a problem for established companies looking to build out their AI capabilities.

Dan McCreary leads a team within one division of Optum, a health IT company, that uses language models to analyze transcripts of calls in order to identify higher-risk patients and recommend referrals. He says that even training a language model one-thousandth the size of GPT-3 can quickly eat up the team's budget: models need to be trained for specific tasks, and doing so can cost more than $50,000, paid to cloud computing companies to rent their computers and software.
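As a rough illustration of where figures like that come from, the back-of-the-envelope sketch below simply multiplies the number of rented accelerators by the length of the training run and an hourly rate. Every number in it is hypothetical; actual GPU counts, run lengths, and cloud prices vary widely by provider and model size.

```python
# A back-of-the-envelope sketch of a cloud training bill. All numbers are
# hypothetical and for illustration only; real GPU counts, run lengths,
# and hourly rates vary widely by provider and model size.
num_gpus = 64               # hypothetical number of rented accelerators
hours = 24 * 14             # hypothetical two-week training run
price_per_gpu_hour = 3.00   # hypothetical on-demand rate, in dollars

cost = num_gpus * hours * price_per_gpu_hour
print(f"Estimated cost of one training run: ${cost:,.0f}")  # -> $64,512
```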

McCreary says cloud computing providers have little reason to lower costs. "I don't believe cloud providers are working to reduce the cost of building AI models," he says. He is considering buying specialized chips designed to accelerate AI training.

Part of the reason AI has made such rapid progress recently is that many academic labs and startups could download and use the latest ideas and techniques. Algorithms that brought breakthroughs in image processing, for example, emerged from academic labs and were developed using off-the-shelf hardware and openly shared data sets.

Over time, though, it has become increasingly clear that advances in AI are tied to an exponential increase in the underlying computing power.

Of course, big companies have always had advantages in budget, scale, and reach. And large amounts of computing power are becoming an essential ingredient in industries such as drug discovery.

Some are now pushing scale even further. Microsoft said this week that, working with Nvidia, it had built a language model more than twice as large as GPT-3. Researchers in China say they have built a language model four times larger than that.

"The cost of training AI is definitely rising," says David Kanter, executive director of MLCommons, an organization that tracks the performance of chips designed for AI. He says the idea that larger models can unlock valuable new capabilities has taken hold in many areas of the tech industry. It may explain why Tesla is designing its own chips just to train AI models for autonomous driving.

Some are concerned that the rising cost of tapping the latest and greatest technology could slow the pace of innovation by reserving it for the biggest companies, and those that lease their tools.

"I think it dampens innovation," says Chris Manning, a Stanford University professor who specializes in AI and language. "When there are only a handful of places where people can play with the innards of models at that scale, that has to massively reduce the amount of creative exploration that happens."
