Neural Nets with Automatic Differentiation

6th October 2016 in London at CodeNode




During this talk, you will learn how to use Haskell's powerful abstraction features to build a neural network library in native Haskell that makes it easy to create complex network architectures in a type-safe and flexible way. Automatic differentiation provides painless gradient descent and makes it easy to extend the library with new components, without first having to compute complicated partial derivatives by hand. Furthermore, the API uses the Pipes library for separation of concerns (data import, training, reporting, termination).
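To give a flavour of the idea, here is a minimal sketch of forward-mode automatic differentiation using dual numbers, and how it yields gradient descent without hand-written derivatives. This is an illustration only, not the talk's actual library API; the names `Dual`, `diff` and `descentStep` are hypothetical.

```haskell
-- A dual number carries a value together with its derivative.
data Dual = Dual { value :: Double, deriv :: Double }

-- Arithmetic on duals applies the usual differentiation rules,
-- so derivatives are propagated automatically.
instance Num Dual where
  Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
  Dual x dx * Dual y dy = Dual (x * y) (x * dy + dx * y)  -- product rule
  negate (Dual x dx)    = Dual (negate x) (negate dx)
  abs    (Dual x dx)    = Dual (abs x) (dx * signum x)
  signum (Dual x _)     = Dual (signum x) 0
  fromInteger n         = Dual (fromInteger n) 0

-- Differentiate any function written against the Num interface,
-- by seeding the input with derivative 1.
diff :: (Dual -> Dual) -> Double -> Double
diff f x = deriv (f (Dual x 1))

-- One gradient-descent step with learning rate eta:
descentStep :: (Dual -> Dual) -> Double -> Double -> Double
descentStep f eta x = x - eta * diff f x

main :: IO ()
main = do
  print (diff (\x -> x * x + 3 * x) 2)       -- derivative of x^2 + 3x at 2 is 7.0
  print (descentStep (\x -> x * x) 0.1 1.0)  -- 1.0 - 0.1 * 2.0 = 0.8
```

A real library extends this idea to gradients of many-parameter networks (e.g. via reverse mode), but the key point is the same: any new component written against the numeric interface gets its derivatives for free.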





Lars Brünjes

I am a pure mathematician by training and hold a PhD in the subject. I spent several years teaching and doing postgraduate research in arithmetic number theory at Cambridge University in the UK and at Regensburg University in Germany. I am currently a Lead Software Architect for an international IT company, where my work includes applying mathematical optimization techniques to the paper industry and developing web frameworks (mostly in C#, JavaScript and TypeScript). I have been interested in programming since my early teens, and I love learning new programming languages and paradigms, in particular those that offer a radically new way of looking at problems and thinking about solutions. I am especially fascinated by functional programming and its promise of elegant, bug-free code that can be developed rapidly.