Combining near real-time stream processing with batch analytics is the goal for so many companies. Why? Rather than getting new insights the next day from a nightly batch job, you can start getting them within seconds with stream processing. The result? Fast, up-to-date results that also take into account vast amounts of historical data. Typically this means two technology stacks, e.g. Storm for stream processing and Hadoop for batch analytics. In this talk you will learn how to do it all with a single stack: Spark running on Cassandra.
This talk will explore: an overview of Cassandra and how to model time series data, hooking up Spark stream processing to do on-the-fly aggregates, and running Spark batch jobs.
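To make the two ideas in that outline concrete, here is a minimal, dependency-free Scala sketch (not taken from the talk itself; all names are illustrative). The `timeSeriesDdl` string shows one common Cassandra time-series layout, and `aggregate` mirrors the shape of an on-the-fly windowed aggregate that Spark Streaming would compute over a live stream, here run over an in-memory batch.

```scala
// Hedged sketch of the abstract's two themes: Cassandra time-series
// modeling and on-the-fly windowed aggregation. Illustrative only.
object StreamBatchSketch {

  // A typical Cassandra time-series model: a day bucket in the partition
  // key keeps partitions bounded, and clustering by event_time DESC makes
  // "latest readings first" queries cheap.
  val timeSeriesDdl: String =
    """CREATE TABLE readings (
      |  sensor_id  text,
      |  day        date,
      |  event_time timestamp,
      |  value      double,
      |  PRIMARY KEY ((sensor_id, day), event_time)
      |) WITH CLUSTERING ORDER BY (event_time DESC);""".stripMargin

  case class Reading(sensorId: String, epochMillis: Long, value: Double)

  // Truncate a timestamp to the start of its one-minute window.
  def minuteBucket(epochMillis: Long): Long =
    epochMillis - epochMillis % 60000L

  // Sum values per (sensor, minute window) -- the same shape of computation
  // a Spark Streaming windowed reduce (or a DataFrame groupBy on a time
  // bucket) performs over a live stream.
  def aggregate(readings: Seq[Reading]): Map[(String, Long), Double] =
    readings
      .groupBy(r => (r.sensorId, minuteBucket(r.epochMillis)))
      .map { case (key, rs) => key -> rs.map(_.value).sum }

  def main(args: Array[String]): Unit = {
    val sample = Seq(
      Reading("s1", 0L, 1.0),
      Reading("s1", 30000L, 2.0),  // same one-minute window as above
      Reading("s1", 61000L, 4.0),  // next window
      Reading("s2", 5000L, 10.0)
    )
    aggregate(sample).foreach(println)
  }
}
```

The same `aggregate` logic works whether the readings arrive as a stream or are read back from Cassandra as a batch, which is the unifying point of running both workloads on one stack.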
Oh and did Christopher mention? It's all in Scala.
Combining batch and stream analytics with Apache Spark and Apache Cassandra
Christopher is a Senior Engineer at Lightbend. He is currently on the core Akka team responsible for developing Akka (https://akka.io/), Akka HTTP, Akka Streams, Reactive Kafka and Alpakka (https://github.com/akka/alpakka). He has previously built trading systems and online television platforms, and has worked extensively with Apache Cassandra. Likes: Scala, Java, the JVM, Akka, distributed databases, XP, TDD, pairing. Dislikes: untested software and code ownership.