OpenAI is an AI research lab founded by Elon Musk, Sam Altman, and a who’s who of big names in machine learning. GPT-3 is OpenAI’s text-generating AI, a language model with 175 billion parameters that debuted about a month ago as the successor to GPT-2, which had only 1.5 billion parameters. In short, remember how we’ve discussed on this show before that AI can’t read and can’t speak? Yeah, well…
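
For anyone curious what “text-generating AI” looks like in practice, here’s a minimal sketch of prompting GPT-3 through OpenAI’s API as it existed during the 2020 private beta, assuming the `openai` Python package of that era; the engine name, key, and prompt are illustrative placeholders, not anything from the linked posts.

```python
# Minimal sketch: prompting GPT-3 via OpenAI's 2020-era beta API.
# Assumes the `openai` Python package and a beta API key; the "davinci"
# engine name and this Completion interface reflect that era and have
# since been superseded.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder, issued with beta access

response = openai.Completion.create(
    engine="davinci",   # the largest GPT-3 model exposed in the beta
    prompt="Q: Who wrote On the Origin of Species?\nA:",
    max_tokens=32,      # cap the length of the generated continuation
    temperature=0.0,    # keep the output as deterministic as possible
    stop=["\n"],        # stop at the end of the answer line
)

print(response.choices[0].text.strip())
```

The Q&A-style prompt is the same basic trick used in the Turing-test post linked below: there’s no task-specific interface, you just hand the model some text and it continues it.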

Quick thoughts on GPT3 (Delian Asparouhov)

Giving GPT-3 a Turing Test (Kevin Lacker)

For more stories like this, subscribe now to the Techmeme Ride Home podcast. 15 minutes of the latest tech news, weekdays at 5pm ET.