Building robust data pipelines in Scala

8th December 2014 in London at Business Design Centre

There are 54 other SkillsCasts available from Scala eXchange 2014

Over the past couple of years, Scala has become a go-to language for building data processing applications, as evidenced by the emerging ecosystem of frameworks and tools including LinkedIn's Kafka, Twitter's Scalding and our own Snowplow project (https://github.com/snowplow/snowplow).

In this talk, Alex will draw on his experiences at Snowplow to explore how to build rock-solid data pipelines in Scala, highlighting a range of techniques including:

  • Translating the Unix stdin/out/err pattern to stream processing
  • ""Railway oriented"" programming using the Scalaz Validation
  • Validating data structures with JSON Schema
  • Visualizing event stream processing errors in ElasticSearch
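
To make the first two bullets concrete, here is a minimal sketch of railway-oriented processing with Scalaz's ValidationNel. The RawEvent and Event types, the tab-separated field layout and the error messages are all hypothetical, invented for this illustration; the point is the pattern of accumulating failures applicatively and routing successes and failures to separate outputs, the stream-processing analogue of stdout and stderr:

    import scalaz._
    import Scalaz._

    // Hypothetical input and output types, invented for this sketch
    final case class RawEvent(line: String)
    final case class Event(userId: String, timestamp: Long)

    object Railway {

      type Validated[A] = ValidationNel[String, A]

      // Each parser puts a value on the success track or an error on the failure track
      def parseUserId(fields: Array[String]): Validated[String] =
        if (fields.length > 0 && fields(0).nonEmpty) fields(0).successNel
        else "missing user id".failureNel

      def parseTimestamp(fields: Array[String]): Validated[Long] =
        if (fields.length > 1)
          try fields(1).toLong.successNel
          catch { case _: NumberFormatException => s"bad timestamp: ${fields(1)}".failureNel }
        else "missing timestamp".failureNel

      // Applicative composition accumulates *all* failures rather than
      // short-circuiting on the first one
      def parse(raw: RawEvent): Validated[Event] = {
        val fields = raw.line.split("\t")
        (parseUserId(fields) |@| parseTimestamp(fields))(Event(_, _))
      }

      // The Unix analogue: good events to one output, failures to another
      def process(input: List[RawEvent]): (List[Event], List[NonEmptyList[String]]) =
        input.foldRight((List.empty[Event], List.empty[NonEmptyList[String]])) {
          case (raw, (good, bad)) =>
            parse(raw) match {
              case Success(e)    => (e :: good, bad)
              case Failure(errs) => (good, errs :: bad)
            }
        }
    }

Because Validation composes applicatively rather than monadically, a record with several problems reports all of them at once, which is exactly what you want when inspecting a stream of bad rows.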
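
And for the JSON Schema bullet, a sketch of validating an incoming event against a schema using the JVM json-schema-validator library (com.github.fge). The schema and the instance document are invented for illustration, and the library choice is an assumption rather than something the talk abstract specifies:

    import com.fasterxml.jackson.databind.ObjectMapper
    import com.github.fge.jsonschema.main.JsonSchemaFactory
    import scala.collection.JavaConverters._

    object SchemaCheck extends App {
      val mapper = new ObjectMapper

      // A hypothetical schema: an event must carry a non-empty string userId
      val schemaJson = mapper.readTree(
        """{
          |  "type": "object",
          |  "required": ["userId"],
          |  "properties": { "userId": { "type": "string", "minLength": 1 } }
          |}""".stripMargin)

      val schema = JsonSchemaFactory.byDefault().getJsonSchema(schemaJson)

      // An instance that should fail validation (empty userId)
      val instance = mapper.readTree("""{ "userId": "" }""")

      val report = schema.validate(instance)
      if (report.isSuccess) println("valid event")
      else report.iterator.asScala.foreach(m => println(s"invalid event: ${m.getMessage}"))
    }

A failed report carries one message per violated constraint, so the same accumulate-and-route pattern applies: valid events continue downstream, while the messages travel with the rejected event for later inspection.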

The talk draws on Alex's two and a half years of working with event streams in Scala at Snowplow, as well as his recent experience writing Unified Log Processing, a Manning book.

Alex Dean

I'm the co-founder and tech lead at Snowplow Analytics, the open-source web and event analytics platform (https://github.com/snowplow/snowplow). Snowplow is written almost exclusively in Scala, using a range of technologies including Scalaz, Scalding and Spray. I spend a lot of time working with distributed systems (historically Hadoop, increasingly Kinesis, Kafka and others) to deliver highly scalable event stream processing. I'm also the author of Unified Log Processing from Manning Publications (http://manning.com/dean/).