Single Headed Attention RNN: Stop Thinking With Your Head with Stephen Merity - #325
From The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)
Length:
59 minutes
Released:
Dec 12, 2019
Format:
Podcast episode
Description
Today we’re joined by Stephen Merity, startup founder and independent researcher focused on NLP and deep learning. In our conversation, we discuss Stephen’s newest paper, Single Headed Attention RNN: Stop Thinking With Your Head; his motivations for writing it; the recent dominance of transformer models in NLP research, and the fact that these models are not the most accessible or trainable for broad use; the architecture of transformer models; why Stephen chose an SHA-RNN for this research; how he built and trained the model, the code for which is available on GitHub; his approach to benchmarking the project; and his goals for this research within the broader NLP research community. The complete show notes for this episode can be found at twimlai.com/talk/325, where you’ll find links to both the paper referenced in this interview and the code. Enjoy!