SkillsCast

Hierarchical Neural Story Generation

29th April 2019 in London at CodeNode

This session was not filmed.

In this session we’ll discuss the paper ‘Hierarchical Neural Story Generation’, which uses Transformer-based neural networks, trained on posts from reddit.com/r/writingprompts, to generate writing prompts and stories.

We’ll start with a brief introduction to the paper and then break into discussion groups (a short sampling sketch follows the links below):
- Paper - Hierarchical Neural Story Generation: https://arxiv.org/abs/1805.04833
- The model and dataset: https://github.com/pytorch/fairseq/tree/master/examples/stories
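
Ahead of the session, one concrete idea worth knowing from the paper is top-k random sampling: at each generation step, only the k most likely next tokens are kept, and the next token is drawn from that renormalized set. Below is a minimal PyTorch sketch with made-up logits for illustration; it is not the fairseq implementation linked above.

    # Minimal sketch of top-k sampling: keep the k most likely tokens,
    # renormalize their probabilities, and sample one of them.
    import torch

    def top_k_sample(logits: torch.Tensor, k: int = 10) -> int:
        """Sample a token id from the k highest-probability candidates."""
        topk_logits, topk_ids = torch.topk(logits, k)
        probs = torch.softmax(topk_logits, dim=-1)  # renormalize over the top k
        choice = torch.multinomial(probs, num_samples=1)
        return topk_ids[choice].item()

    # Toy example: a 5-token vocabulary with made-up scores.
    logits = torch.tensor([2.0, 1.5, 0.3, -1.0, -2.0])
    print(top_k_sample(logits, k=3))  # prints 0, 1 or 2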

The Transformer mechanism used in this paper was first introduced in the 2017 paper ‘Attention Is All You Need’ (previously discussed in Journal Club in September 2017). Some background reading on the topic (a short attention sketch follows these links):
- The original ‘Attention Is All You Need’ paper: https://arxiv.org/abs/1706.03762
- Illustrated explanation of Transformers: http://jalammar.github.io/illustrated-transformer/
- Summary of Attention Is All You Need: https://medium.com/@hyponymous/paper-summary-attention-is-all-you-need-22c2c7a5e06
- Recent high-profile article on text generation: https://openai.com/blog/better-language-models/
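
As a concrete anchor for the background reading, the core operation of the Transformer is scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V. The snippet below is a minimal single-head sketch with random toy tensors; real models add learned projections, multiple heads and masking.

    # Minimal single-head scaled dot-product attention, as defined in
    # 'Attention Is All You Need'. Shapes here are toy values.
    import math
    import torch

    def attention(q, k, v):
        d_k = q.size(-1)
        scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)  # query-key similarities
        weights = torch.softmax(scores, dim=-1)            # distribution over positions
        return weights @ v                                 # weighted sum of values

    # Toy example: 4 positions with 8-dimensional queries, keys and values.
    q, k, v = torch.randn(4, 8), torch.randn(4, 8), torch.randn(4, 8)
    print(attention(q, k, v).shape)  # torch.Size([4, 8])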

A note about the Journal Club format:

  1. The sessions usually start with a 5-10 minute introduction to the paper by the topic volunteer, followed by splitting into smaller groups to discuss the paper and other materials. We finish the session by coming together for about 15 minutes to discuss what we have learned as a group and to ask questions around the room.

  2. There is no speaker at Journal Club. A member of the community has volunteered their time to suggest the topic and start the session, but most of the discussion comes from within the groups.

  3. You will get more benefit from the session if you read the paper or other materials in advance. We try to provide (where we can find them) accompanying blog posts, relevant code and other summaries of the topic to serve as entry points.

  4. If you don't have time to do much preparation, please come anyway. You will probably have something to contribute, and even if you just end up following the other discussions, you can still learn a lot.

  5. It’s OK just to read the blog post or watch the video :)

  6. We won’t have spare copies of the paper at the session, so please print your own if you want a hard copy for discussion. For digital copies, you are welcome to use your laptop/tablet/phone during the session.
