Experimenting with GPT-3 — Part 1

The Generative Pre-trained Transformer 3 (GPT-3) is the latest language model of its kind, using deep learning to produce human-like text. GPT-3 was created by OpenAI (which counts Elon Musk among its founders).

The headline feature is its size: 175 billion machine learning parameters, each of which shapes how the model behaves. Just for comparison, the most advanced language model of this type until now was Microsoft’s Turing NLG, with *only* 17 billion parameters, as per the graphic below.

Turing NLG was capable of answering questions like “When did WW2 end?” and of generating complex summaries, as per the documentation here.

On the other hand, GPT-3 was trained on public information from the internet, which means it is capable of translating text, generating news articles and a range of other applications, which can be found here: https://gpt3examples.com/#examples

One of the coolest examples I found is about creating AWS automation (with OpsWorks, which is based on Chef or Puppet) from the following sentence in plain English: “create, deploy, list, and delete any services on AWS”.

How cool is that?
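If you want to try a similar experiment yourself, here is a minimal sketch using OpenAI’s Python client as it looked in 2020. The engine name, prompt wording and generation settings are my own assumptions for illustration, not the exact setup behind the demo above.

```python
# pip install openai
import openai

# Minimal sketch: ask GPT-3 to turn a plain-English request into AWS CLI-style commands.
# The engine name, prompt and parameters below are illustrative assumptions.
openai.api_key = "YOUR_API_KEY"

prompt = (
    "Translate the following request into AWS CLI commands:\n"
    "Request: create, deploy, list, and delete any services on AWS\n"
    "Commands:"
)

response = openai.Completion.create(
    engine="davinci",   # base GPT-3 engine available in 2020
    prompt=prompt,
    max_tokens=150,
    temperature=0.3,    # keep the output relatively deterministic
    stop=["\n\n"],      # stop at the first blank line
)

print(response.choices[0].text.strip())
```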

Thanks for Reading,

Javier Caceres

Originally published at http://jacace.wordpress.com on August 12, 2020.

Hands-on Sr Software Manager / Architect based in Ireland. Views are my own. Linkedin: https://ie.linkedin.com/in/jacace Twitter: https://twitter.com/jacace