Combining near real-time stream processing with batch analytics is the goal for so many companies. Why? Well, rather than getting new insights the next day from a nightly batch job, with stream processing you can start to get them within seconds. The result? Fast, up-to-date results that also take into account vast amounts of historical data. Typically this means two technology stacks, e.g. Storm for stream processing and Hadoop for batch analytics. In this talk you will learn how to do it all with the same stack: Spark running on Cassandra.
This talk will explore: an overview of Cassandra and how to model time-series data, hooking up Spark stream processing to compute on-the-fly aggregates, and running Spark batch jobs.
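To give a flavour of the time-series modelling the talk covers: the usual idea in Cassandra is to bucket rows by a time window, so that each partition stays bounded in size. The sketch below is illustrative only (the `sensor_id`/`day_bucket` names and the one-day bucket size are assumptions, not taken from the talk) and shows the bucketing logic in plain Scala:

```scala
import java.time.Instant
import java.time.temporal.ChronoUnit

// Hypothetical Cassandra table for sensor readings:
//   PRIMARY KEY ((sensor_id, day_bucket), event_time)
// Bucketing the partition key by day keeps each partition
// to at most one day of readings per sensor.
case class PartitionKey(sensorId: String, dayBucket: String)

def partitionKey(sensorId: String, ts: Instant): PartitionKey =
  PartitionKey(sensorId, ts.truncatedTo(ChronoUnit.DAYS).toString)

// Two readings on the same day land in the same partition...
val a = partitionKey("sensor-1", Instant.parse("2014-06-01T10:15:00Z"))
val b = partitionKey("sensor-1", Instant.parse("2014-06-01T23:59:00Z"))
// ...while a reading on the next day starts a new partition.
val c = partitionKey("sensor-1", Instant.parse("2014-06-02T00:01:00Z"))
```

With a scheme like this, a Spark batch job can scan whole day-buckets efficiently, while a streaming job appends new readings to the current bucket.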
Oh and did Christopher mention? It's all in Scala.
Combining batch and stream analytics with Apache Spark and Apache Cassandra
Christopher is a Senior Engineer at Lightbend. He is currently on the core Akka team responsible for developing Akka (https://akka.io/), Akka HTTP, Akka Streams, Reactive Kafka and Alpakka (https://github.com/akka/alpakka). He has previously built trading systems and online television platforms, and has worked extensively with Apache Cassandra. Likes: Scala, Java, the JVM, Akka, distributed databases, XP, TDD, pairing. Dislikes: untested software and code ownership.