In this talk, you will learn how to use Haskell's powerful abstraction facilities to build a neural-network library in native Haskell, one that makes it easy to construct complex network architectures in a type-safe and flexible way. Automatic differentiation provides painless gradient descent and makes the library easy to extend with new components, with no need to work out complicated partial derivatives by hand. Furthermore, the API uses the "pipes" library to separate concerns such as data import, training, reporting, and termination.
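To give a flavour of the idea, here is a minimal sketch of forward-mode automatic differentiation with dual numbers, the technique that lets a library compute exact derivatives without anyone deriving them by hand. This is an illustrative toy, not the talk's actual library; the names `Dual` and `diff` are assumptions for this example.

```haskell
-- A dual number carries a value (primal) together with a derivative (tangent).
data Dual = Dual { primal :: Double, tangent :: Double }
  deriving Show

-- Arithmetic on dual numbers applies the usual differentiation rules,
-- so any polynomial expression written against Num differentiates itself.
instance Num Dual where
  Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
  Dual x dx - Dual y dy = Dual (x - y) (dx - dy)
  Dual x dx * Dual y dy = Dual (x * y) (x * dy + dx * y)  -- product rule
  negate (Dual x dx)    = Dual (negate x) (negate dx)
  abs    (Dual x dx)    = Dual (abs x) (dx * signum x)
  signum (Dual x _)     = Dual (signum x) 0
  fromInteger n         = Dual (fromInteger n) 0          -- constants have derivative 0

-- Derivative of f at x: seed the tangent with 1 and read it back off.
diff :: (Dual -> Dual) -> Double -> Double
diff f x = tangent (f (Dual x 1))

main :: IO ()
main = print (diff (\x -> x * x + 3 * x + 2) 2)
```

Because `diff` works on any function polymorphic over `Num`, new components compose without extra derivative code; a full library would extend the same idea to `Fractional`/`Floating` operations and to whole networks.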