OpenAI




Sam Altman of Y Combinator and OpenAI has said that we are already seeing a possible early precursor to artificial general intelligence in OpenAI's GPT-3 natural language processing (NLP) transformer model. On his Exponential View podcast, Azeem Azhar probed him further about the technology, its applications, and the regulation and governance of GPT-3 in OpenAI's hands.

Using deep-learning techniques, GPT-3’s creators trained the system on nearly all the public text created by humanity through October 2019. This included the entirety of Wikipedia, tens of millions of books, and over one trillion words posted to Twitter, other social networks, and the public internet.

The end result is an artificial intelligence system with access to a massive share of the thoughts, facts, and opinions that humans have ever put into words and published, along with the ability to generalize from these sources, find connections between them, and process them mathematically. During training, GPT-3 learned roughly 175 billion parameters, the numerical weights through which it represents and processes human words and ideas. MIT Technology Review described the system as "shockingly good."
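To give a concrete sense of what "processing words mathematically" looks like in practice, here is a minimal sketch of querying GPT-3 through OpenAI's API. It assumes the `openai` Python package and an API key; the prompt and parameter values are illustrative choices, not details from the article.

```python
# Minimal sketch: asking GPT-3 to continue a prompt via OpenAI's API.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# "davinci" is the largest GPT-3 engine; the model generates a
# continuation of the prompt one token at a time.
response = openai.Completion.create(
    engine="davinci",
    prompt="In one sentence, GPT-3 is",
    max_tokens=60,      # upper bound on generated tokens
    temperature=0.7,    # sampling randomness; lower is more deterministic
)

print(response.choices[0].text.strip())
```

The same call, with only the prompt changed, can summarize, translate, or answer questions, which is what makes the model feel so general-purpose.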

GPT-3 will spark an additional wave of no-code and AutoML tools, and many would-be employers will opt for those tools rather than hire expensive programmers. Even Jack Dorsey, CEO of Twitter and Square, has said that software engineers, especially at the entry level, will be replaced by AI tools; GPT-3 is one of the more general-purpose of these, which is why it excites technologists and scientists everywhere. Tools like GPT-3, alongside no-code, low-code, and AutoML platforms, bring us much closer to potential artificial general intelligence, because these are all examples of software that can teach itself: self-learning software.
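To make the "self-learning software" idea concrete, here is a toy sketch of the kind of automated model search that AutoML tools perform, using scikit-learn's grid search as a stand-in for illustration rather than any specific no-code product: the program tries configurations itself and keeps the best one, with no hand-tuning.

```python
# Toy illustration of AutoML-style self-tuning: the software searches
# over model configurations and selects the best one automatically.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

# Grid search tries every hyperparameter combination and scores each
# by 5-fold cross-validation; the "tuning" is done by the program.
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [10, 50, 100], "max_depth": [2, 4, None]},
    cv=5,
)
search.fit(X, y)

print("Best configuration:", search.best_params_)
print("Cross-validated accuracy: %.3f" % search.best_score_)
```

Commercial AutoML platforms go further, searching over model families and feature pipelines as well, but the principle is the same as in this sketch.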


