Will GPT-3 take your job?

Think of GPT-3 as the autopredict feature on your phone or computer, but on steroids. The model was trained on 499 billion tokens (a token is a unit of text or punctuation in natural language processing) and has 175 billion parameters (the learned weights that encode patterns from the training data).
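To make the autopredict analogy concrete, here is a toy sketch of next-word prediction. This is not GPT-3's actual mechanism (GPT-3 is a Transformer neural network whose 175 billion parameters are learned, not counted), and the training snippet below is made up for illustration; the point is simply that both systems do the same job of guessing what comes next.

```python
from collections import Counter, defaultdict

# Toy "autopredict": count which word most often follows each word in a
# tiny training text, then suggest the most frequent follower.
# GPT-3 performs the same next-token prediction task, but with a
# 175-billion-parameter neural network instead of raw counts.

training_text = "the cat sat on the mat and the cat ate the fish"

follower_counts = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follower_counts[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` during 'training'."""
    if word not in follower_counts:
        return None
    return follower_counts[word].most_common(1)[0][0]

print(predict_next("the"))  # -> 'cat' (follows "the" twice in the toy text)
```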

It has been fed much of the Internet and then some: roughly 45 TB of training text, compared with the 40 GB used for its predecessor, GPT-2. The wealth of information it can draw on is enormous, which is why, with some prompting, it can produce a diverse range of material, from poems to news articles to lines of code, with astonishing coherence; a rough sketch of such a prompt appears below.

Read more via Techabal
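As an illustration of what "some prompting" looks like in practice, here is a minimal sketch using the OpenAI Python client as it worked when GPT-3 launched (the interface has since changed); the engine name, prompt, and settings are illustrative assumptions, not anything specified in the article.

```python
import os
import openai  # pip install openai (pre-1.0 interface shown here)

# Assumes an API key is available in the environment.
openai.api_key = os.environ["OPENAI_API_KEY"]

# A single natural-language prompt steers the model toward the kind of
# text we want: the same model can continue a poem, a news article, or code.
response = openai.Completion.create(
    engine="davinci",            # the largest GPT-3 engine at launch
    prompt="Write a four-line poem about autumn leaves:\n",
    max_tokens=60,               # cap the length of the generated text
    temperature=0.7,             # higher values -> more varied output
)

print(response.choices[0].text.strip())
```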