Hi all,
I hope you’ve had a good start to 2019! This newsletter covers a ton of material: slides from the creator of BERT and on using transfer learning for dialogue; lectures on the deep learning state of the art as of 2019 (from MIT), Gaussian processes, and variational inference, each from an authority in the area; NLP course curricula from CMU, Stanford, and Berkeley, plus new fast.ai lessons to kick-start your learning in the new year; an overview of some of the cool things people have been doing with BERT; a discussion of DeepMind’s AlphaStar; key takeaways on how to manage research teams; new resources covering the state of the art, lecture slides, and a guide to debugging your neural network; a long list of tools and exciting new ML packages; loads of articles and blog posts; and exciting new research papers.
A new section 🤓 This newsletter contains a new section, Meaningful Measures, inspired by FiveThirtyEight’s Significant Digits. In this section, I’ll list numbers related to ML and NLP that stuck out to me over the last month.
Contributions 💪 If you have written or come across something that would be relevant to the community, hit reply on the issue so that it can be shared more widely.
I really appreciate your feedback, so let me know what you love ❤️ and hate 💔 about this edition. Simply hit reply on the issue.
If you were referred by a friend, click here to subscribe. If you enjoyed this issue, give it a tweet 🐦.