
OpenAI’s GPT-3 algorithm now produces billions of words a day



When OpenAI released its massive natural language algorithm GPT-3 last summer, jaws dropped. Coders and developers with special access to an early API quickly discovered new (and unexpected) things GPT-3 could do with nothing but a prompt. It wrote passable poetry, produced decent code, calculated simple sums and, with some light editing, wrote news articles.

All this, it turns out, was just the beginning. In a recently updated blog post, OpenAI said that tens of thousands of developers are now creating apps on the GPT-3 platform.

Over 300 apps (and counting) use GPT-3, and the algorithm generates 4.5 billion words a day for them.

Obviously, that’s a lot of words. But to get a sense of just how many, let’s try some napkin math.

The Coming Torrent of Algorithmic Content

Every month, users publish approximately 70 million posts on WordPress, which is hands-down the dominant online content management system.

Assuming an average article is 800 words long – which is speculation on my part, but neither especially long nor short – people turn out around 56 billion words a month, or roughly 1.8 billion words a day, on WordPress.

If our average word count is in the ballpark, GPT-3 produces more than twice the daily word count of WordPress posts. Even if the average post runs over 2,000 words (which seems high to me), the two are roughly on par.
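For the curious, here’s that napkin math as a runnable sketch in Python. The 800-word average is the speculative assumption stated above, not a measured figure:

```python
# Napkin math: WordPress word output vs. GPT-3's reported 4.5 billion words/day.
WP_POSTS_PER_MONTH = 70_000_000     # ~70 million posts published monthly
AVG_WORDS_PER_POST = 800            # speculative average (assumption from the text)
GPT3_WORDS_PER_DAY = 4_500_000_000  # OpenAI's reported figure

wp_words_per_month = WP_POSTS_PER_MONTH * AVG_WORDS_PER_POST  # 56 billion
wp_words_per_day = wp_words_per_month / 30                    # ~1.9 billion

print(f"WordPress: ~{wp_words_per_day / 1e9:.1f}B words/day")
print(f"GPT-3 output is ~{GPT3_WORDS_PER_DAY / wp_words_per_day:.1f}x that")
# At 2,000 words per post, WordPress would hit ~4.7B words/day -- roughly GPT-3's pace.
```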

Now, not every word GPT-3 produces is a word worth reading, and it isn’t necessarily producing blog posts (more on its applications below). But either way, just nine months in, GPT-3’s output foreshadows a looming torrent of algorithmic content.

GPT-3 powers a range of apps

So, how are all those words being used? Just as the initial burst of activity suggested, developers are building a variety of apps around GPT-3.

Viable, for example, surfaces themes in customer feedback – such as surveys, reviews and helpdesk tickets – and provides brief summaries for companies aiming to improve their services. Fable Studio brings virtual characters in interactive stories to life with GPT-3-generated dialogue. And Algolia uses GPT-3 to power an advanced search tool.

Instead of writing code, developers use “prompt programming”: they give GPT-3 a few examples of the kind of output they hope to generate. More advanced users can fine-tune the algorithm by providing it with a dataset of examples or even human feedback.
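To make “prompt programming” concrete, here’s a minimal, hypothetical sketch using the Completion endpoint that OpenAI’s early Python client exposed. The few-shot examples, engine choice, and parameters are illustrative, not taken from any of the apps above:

```python
import openai  # pip install openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# Few-shot "prompt programming": show the model examples of the desired
# output instead of writing explicit parsing or summarization code.
prompt = """Rewrite each review as a one-line summary.

Review: The battery lasts two days but the screen scratches easily.
Summary: Great battery life, fragile screen.

Review: Shipping took three weeks and the box arrived crushed.
Summary: Slow shipping, damaged packaging.

Review: Setup took five minutes and support answered right away.
Summary:"""

response = openai.Completion.create(
    engine="davinci",   # the base GPT-3 engine in the early API
    prompt=prompt,
    max_tokens=20,
    temperature=0.3,    # low temperature keeps completions close to the pattern
    stop="\n",          # stop at the end of the generated summary line
)
print(response.choices[0].text.strip())
```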

In this regard, GPT-3 (and similar algorithms) may accelerate the adoption of machine learning in natural language processing (NLP). While the learning curve for working with machine learning algorithms has previously been steep, OpenAI says many in the GPT-3 developer community have no background in AI or programming.

“It’s almost this new interface for working with computers,” Greg Brockman, OpenAI’s chief technology officer and co-founder, told Nature in an article earlier this month.

A walled garden for AI

OpenAI has licensed GPT-3 to Microsoft – which invested $1 billion in OpenAI in return for such partnerships – but has not released the code publicly.

The company says the revenue from its machine learning products helps fund its larger mission. It also says it can control how the technology is used by strictly gating access to it through an API.

One concern, for example, is that advanced natural language algorithms like GPT-3 could supercharge online disinformation. Another is that large-scale algorithms carry built-in bias, and that limiting its effects takes a great deal of care and attention.

At the peak of the initial frenzy, OpenAI CEO Sam Altman tweeted: “The GPT-3 hype is way too much. It’s impressive (thanks for the nice compliments!) but it still has serious weaknesses and sometimes makes very silly mistakes.”

Deep learning algorithms lack common sense and contextual awareness. So, naturally, with the right (or wrong) prompting, GPT-3 has readily parroted the internet’s ugliness that was embedded in its training dataset.

To address these issues, OpenAI vets developers and applications before granting access to GPT-3. It has also developed guidelines for developers, is working on tools to identify and reduce bias, and requires processes and people to be in place to monitor apps for misbehavior.

It remains to be seen whether these safeguards will be sufficient as access to GPT-3 scales.

Researchers want to give algorithms a measure of common sense, an understanding of cause and effect, and moral judgment. “What we have today is essentially a mouth without a brain,” Yejin Choi, a computer scientist at the University of Washington and the Allen Institute for AI, told Nature.

As long as these properties remain out of reach, researchers and GPT-3’s human handlers will have to work hard to ensure that the benefits outweigh the risks.

Alt-AI: open source alternatives to GPT-3

Not everyone agrees with the walled garden approach.

EleutherAI, a project aimed at building an open source competitor to GPT-3, released its latest model, GPT-Neo, last week. The project uses OpenAI’s published papers on GPT-3 as a starting point for its algorithms and trains them on distributed computing resources donated by the cloud computing company CoreWeave and by Google.

The project has also created a carefully curated training dataset called the Pile. EleutherAI co-founder Connor Leahy told Wired the project has “gone to great lengths over the months to curate this dataset, make sure it was well filtered and diverse, and document its shortcomings and assumptions.”

GPT-Neo’s performance doesn’t yet match GPT-3’s, but it is on par with the least advanced version of GPT-3, according to Wired. Meanwhile, other open source projects are also underway.
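For readers who want to experiment, EleutherAI’s released GPT-Neo checkpoints can be loaded through the Hugging Face transformers library. A minimal sketch, with the model size and prompt as illustrative choices of mine rather than details from the article:

```python
from transformers import pipeline  # pip install transformers torch

# Load one of EleutherAI's released GPT-Neo checkpoints from the Hugging Face
# hub. The 1.3B-parameter model is a multi-gigabyte download; smaller and
# larger variants also exist.
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")

result = generator(
    "Open source language models matter because",
    max_length=60,    # total length in tokens, prompt included
    do_sample=True,   # sample rather than greedy-decode for varied output
    temperature=0.9,
)
print(result[0]["generated_text"])
```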

“There is tremendous excitement right now for open source NLP and for producing useful models outside of the big tech companies,” said Alexander Rush, a professor of computer science at Cornell University. “It’s something like an NLP moonshot.”

The risk of open source remains: once the code is out in the wild, there’s no going back, and no one controls how it’s used.

But Rush argues that developing algorithms in the open allows researchers outside of large companies to study them, warts and all, and to solve problems.

The new command line

Open source or not, GPT-3 won’t be alone for long. Google Brain, for example, recently announced its own enormous natural language model, weighing in at 1.6 trillion parameters.

In a recent TechCrunch article, Oren Etzioni, CEO of the Allen Institute for AI, and venture investor Matt McIlwain wrote that they expect GPT-3 and the arrival of other large natural language models to bring greater availability and lower costs.

In particular, they see “prompt programming” as a significant shift.

Text, Etzioni and McIlwain wrote, could increasingly become the new command line, a universal translator of sorts that lets the “codeless” take advantage of machine learning and bring new ideas to life: “We believe this will empower a whole new generation of creators, with trillions of parameters at your fingertips, in a completely low-code/no-code way.”

Machines, it seems, are getting a lot chattier. And we’ve got our work cut out for us to make sure the conversation is meaningful.

Photo credit: Emil Widlund / Unsplash



