Neural Nets with Automatic Differentiation

6th October 2016 in London at CodeNode

There are 42 other SkillsCasts available from Haskell eXchange 2016

During this talk, you will learn how to use Haskell's powerful abstraction features to create a neural network library in native Haskell that makes it easy to build complex network architectures in a type-safe and flexible way. Automatic differentiation provides painless gradient descent and easy extension with new components, with no need to compute complicated partial derivatives by hand first. Furthermore, the API uses the Pipes library for separation of concerns (data import, training, reporting, termination).
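To make the two central ideas concrete, here is a minimal, self-contained sketch in Haskell. It is not the talk's actual library API: all names (Dual, diff, step, training) are illustrative assumptions. It shows forward-mode automatic differentiation via dual numbers, and a Pipes pipeline that keeps training, termination, and reporting as separate stages.

import Pipes
import qualified Pipes.Prelude as P

-- A dual number carries a value together with its derivative.
data Dual = Dual { value :: Double, deriv :: Double }

instance Num Dual where
  Dual x x' + Dual y y' = Dual (x + y) (x' + y')
  Dual x x' * Dual y y' = Dual (x * y) (x' * y + x * y')
  negate (Dual x x')    = Dual (negate x) (negate x')
  abs    (Dual x x')    = Dual (abs x) (x' * signum x)
  signum (Dual x _)     = Dual (signum x) 0
  fromInteger n         = Dual (fromInteger n) 0

-- Differentiate f at x by seeding the derivative with 1.
diff :: (Dual -> Dual) -> Double -> Double
diff f x = deriv (f (Dual x 1))

-- One gradient-descent step; no hand-written derivative needed.
step :: Double -> (Dual -> Dual) -> Double -> Double
step rate loss x = x - rate * diff loss x

-- Training as an infinite stream of parameter values, here
-- minimising the toy loss (x - 3)^2 starting from 0.
training :: Monad m => Producer Double m ()
training = each (iterate (step 0.1 (\x -> (x - 3) * (x - 3))) 0)

-- The pipeline: training >-> termination >-> reporting.
main :: IO ()
main = runEffect $ training >-> P.take 20 >-> P.print

Swapping the termination criterion or the reporting consumer changes only one stage of the pipeline, which is the separation of concerns the abstract describes; a real network library would replace the single Double parameter with a typed network structure.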

Lars Brünjes

Lars is a pure mathematician by training and holds a PhD in the subject. He spent several years teaching and doing postgraduate research in arithmetic number theory at Cambridge University in the UK and at Regensburg University in Germany. He then worked for ten years as a Lead Software Architect for an international IT company, where his work included applying mathematical optimization techniques to the paper industry and developing web frameworks (mostly in C#, JavaScript and TypeScript). He is now a professional Haskell developer at Input Output Hong Kong, working on blockchain technology and teaching Haskell.

He has been interested in programming since his early teens, and he loves learning new programming languages and paradigms, in particular those that offer a radically new way of looking at problems and of thinking about solutions. He is especially fascinated by functional programming and its promise of elegant, bug-free code that can be developed rapidly.