
LightGBM: A Highly Efficient Gradient Boosting Decision Tree

Tuesday, 17th April at CodeNode, London

This meetup was organised by London Data Science Journal Club in April 2018


Don't miss this month's LDSJC where we'll be learning more about LightGBM! Check it out.

This month we will be looking at the paper ‘LightGBM: A Highly Efficient Gradient Boosting Decision Tree’, presented at NIPS 2017.

Gradient boosted decision trees (GBDT) are a popular family of machine learning algorithms, and have demonstrated their utility very visibly through their rise to dominance in competitions such as Kaggle.

LightGBM is among the most efficient and scalable GBDT implementations yet created (up to 20x faster than conventional GBDT in the paper’s benchmarks), and is quickly overtaking XGBoost as the connoisseur’s choice for this technique.

How does this algorithm work? What are the trade-offs versus the related approaches? How should we think about applying LightGBM to real world problems? Come along to discuss the paper and the practice.
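One of the paper’s key ideas is Gradient-based One-Side Sampling (GOSS): keep all instances with large gradients, randomly sample the rest, and re-weight the sampled ones so that information-gain estimates stay approximately unbiased. As a taster for the discussion, here is a hypothetical pure-Python sketch of that sampling step (the function name and parameters are illustrative, not LightGBM’s actual implementation):

```python
import random

def goss_sample(gradients, a=0.2, b=0.1):
    """Illustrative sketch of Gradient-based One-Side Sampling (GOSS).

    Keeps the top-a fraction of instances by |gradient|, randomly samples
    a b fraction of the remainder, and re-weights the sampled instances
    by (1 - a) / b so gain estimates remain approximately unbiased.
    Returns a dict mapping instance index -> sampling weight.
    """
    n = len(gradients)
    # Sort indices by gradient magnitude, largest first.
    order = sorted(range(n), key=lambda i: abs(gradients[i]), reverse=True)
    top_k = int(a * n)
    rand_k = int(b * n)
    top = order[:top_k]
    sampled = random.sample(order[top_k:], rand_k)
    # Large-gradient instances keep weight 1.0; sampled small-gradient
    # instances are amplified to compensate for the sub-sampling.
    weights = {i: 1.0 for i in top}
    weights.update({i: (1 - a) / b for i in sampled})
    return weights

random.seed(0)
grads = [random.gauss(0, 1) for _ in range(100)]
w = goss_sample(grads)
print(len(w))  # 20 top instances + 10 sampled = 30
```

In the real algorithm these weights multiply the gradients when computing split gains, which is why the smaller sample can still find near-optimal splits.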

Paper:

LightGBM: A Highly Efficient Gradient Boosting Decision Tree (Ke, G. et al., NIPS 2017)

Background material:
A note about the Journal Club format:
  1. There is no speaker at Journal Club.

  2. There is NO speaker at Journal Club.

  3. We split into small groups of 6 people and discuss the papers. For the first hour the groups are random to make sure everyone is on the same page. Afterwards we split into blog/paper/code groups to go deeper.

  4. Volunteers sometimes seed the discussion by walking through the paper highlights for 5 minutes. You are very welcome to volunteer in the comments.

  5. Reading the materials in advance is really helpful. If you don't have time, please come anyway. We need this group to learn together.

Thanks to our sponsors
