Hope you are doing great. We are back discussing the latest news from The Batch.
This week, DeepMind released three papers on language models. One of them introduces Gopher, a 280-billion-parameter language model whose architecture builds on OpenAI's GPT-2.
Although language models still struggle with logical reasoning and common-sense tasks, Gopher performed strongly in areas like reading comprehension, fact-checking, and identifying toxic language.
For more about Gopher and the other two papers released alongside it, check out the official release blog post.
This week’s newsletter also covers other news from the AI community, including a leaked document describing TikTok’s recommendation engine, Andrew’s note on the Data-Centric AI Workshop, and a new ConvNet that can recognize shifted (translated) images.
What was your favorite news this week? You’re welcome to share with us here.