Hi all,

This newsletter discusses accelerating science, memorizing vs. learning to look things up, and a Schmidhuber-centric view of the last decade. It also features slides on transfer learning and Deep Learning essentials, multiple translation corpora (speech-to-text, comprehensive translations for language learning), a Greek BERT, and ARC. Finally, it includes the blog posts and papers that I particularly enjoyed reading over the past months, including the Illustrated Reformer and the Annotated GPT-2, an analysis of NLP and ML papers in 2019, and oLMpics.

Contributions 💪 If you have written or have come across something that would be relevant to the community, hit reply on the issue so that it can be shared more widely.

I really appreciate your feedback, so let me know what you love ❤️ and hate 💔 about this edition. Simply hit reply on the issue.

If you were referred by a friend, click here to subscribe. If you enjoyed this issue, give it a tweet 🐦.