This session was not filmed.
In this session we’ll discuss the paper ‘Hierarchical Neural Story Generation’, which uses convolutional sequence-to-sequence models with self-attention to generate writing prompts and stories, trained on posts from reddit.com/r/writingprompts.
We’ll start with a brief introduction on the paper and then break into discussion groups:
- Paper - Hierarchical Neural Story Generation: https://arxiv.org/abs/1805.04833
- The model and dataset: https://github.com/pytorch/fairseq/tree/master/examples/stories
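The paper generates stories hierarchically (first a prompt, then a story conditioned on it) and decodes with top-k sampling: at each step, sampling is restricted to the k highest-scoring tokens to avoid the unreliable low-probability tail. A minimal NumPy sketch of that sampling step (function and parameter names are illustrative, not from the paper's code):

```python
import numpy as np

def top_k_sample(logits, k=10, rng=None):
    """Sample a token id from the k highest-scoring logits.

    Restricting the draw to the top k candidates (as in the paper's
    decoding scheme) keeps generation varied without sampling from
    the long tail of implausible tokens.
    """
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=float)
    top = np.argsort(logits)[-k:]               # indices of the k largest logits
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()                        # softmax over the top-k subset only
    return int(rng.choice(top, p=probs))
```

With k=1 this reduces to greedy decoding; the paper uses larger k (e.g. 10) to trade off fluency against diversity.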
The self-attention mechanism central to this paper was popularised by the 2017 paper ‘Attention Is All You Need’, which introduced the Transformer (previously discussed in Journal Club September 2017). Some background reading on the topic:
- The original ‘Attention Is All You Need’ paper: https://arxiv.org/abs/1706.03762
- Illustrated explanation of Transformers: http://jalammar.github.io/illustrated-transformer/
- Summary of Attention Is All You Need: https://medium.com/@hyponymous/paper-summary-attention-is-all-you-need-22c2c7a5e06
- Recent high-profile article on text generation: https://openai.com/blog/better-language-models/
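The core operation behind the attention reading above is scaled dot-product attention: softmax(QKᵀ/√d_k)·V. A small self-contained NumPy sketch, useful to have in mind during the discussion (single-head, no masking or batching):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention from 'Attention Is All You Need':
    each output row is a weighted average of the value rows V, with
    weights given by a softmax over query-key similarities."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ V                                   # convex combination of values
```

Because each output row is a convex combination of the rows of V, every output value lies within the range spanned by V.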
A note about the Journal Club format:
- The sessions usually start with a 5-10 minute introduction to the paper by the topic volunteer, followed by splitting into smaller groups to discuss the paper and other materials. We finish the session by coming together for about 15 minutes to discuss what we have learned as a group and ask questions around the room.
There is no speaker at Journal Club. A member of the community has volunteered their time to suggest the topic and start the session, but most of the discussion comes from within the groups.
You will get more benefit from the session if you read the paper or other materials in advance. We try to provide (where we can find them) accompanying blog posts, relevant code and other summaries of the topic to serve as entry points.
If you don't have time to do much preparation, please come anyway. You will probably have something to contribute, and even if you just end up following the other discussions, you can still learn a lot.
It’s OK just to read the blog post or watch the video :)
We don’t have spare copies of the paper during the session, so please print out your own if you want a hard copy for discussion. For digital copies, you are welcome to use your laptops/tablets/phones during the session.