Rewriting code as you scale is a terrible waste of time. You have perfectly working code, but it doesn’t scale. What you really need is code that works at any size, whether that’s a megabyte or a terabyte. Apache Beam lets you learn a single API and process data as it grows, without rewriting at every step.
In this session, we will talk about Beam and its API. We’ll see how Beam executes on big data and small data alike, and we’ll touch on some of the advanced features that make Beam an interesting choice.
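To make the “single API at any size” idea concrete, here is a minimal sketch of a Beam word-count pipeline in the Java SDK. The class name, the input path `input.txt`, and the output prefix `word-counts` are illustrative placeholders; the runner (DirectRunner locally, or a distributed runner such as Flink, Spark, or Dataflow) is selected through pipeline options rather than code changes.

```java
import java.util.Arrays;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.FlatMapElements;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptors;

public class AnySizeWordCount {  // hypothetical class name for illustration
  public static void main(String[] args) {
    // The runner (DirectRunner, FlinkRunner, DataflowRunner, ...) is chosen
    // via command-line options; the pipeline code below never changes.
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply("ReadLines", TextIO.read().from("input.txt"))            // placeholder input path
        .apply("SplitWords", FlatMapElements
            .into(TypeDescriptors.strings())
            .via((String line) -> Arrays.asList(line.split("\\s+"))))
        .apply("CountWords", Count.perElement())
        .apply("FormatResults", MapElements
            .into(TypeDescriptors.strings())
            .via((KV<String, Long> kv) -> kv.getKey() + ": " + kv.getValue()))
        .apply("WriteCounts", TextIO.write().to("word-counts"));     // placeholder output prefix

    p.run().waitUntilFinish();
  }
}
```

Whether the input is a megabyte on a laptop or a terabyte on a cluster, the transforms stay identical; only the runner and the input location change.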
Processing Data of Any Size with Apache Beam
Jesse Anderson
Data Engineer, Creative Engineer and Managing Director
Big Data Institute