In this talk we will share our experiences of deploying statistical algorithms to a Hadoop cluster. We will discuss the approaches we used to scale R code from processing thousands of data points on a desktop to billions in the cloud.
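To illustrate the kind of scaling the talk covers, here is a minimal sketch of the map/reduce pattern in plain R: counting occurrences of keys, the same shape a Hadoop job takes. This runs entirely in-process and is illustrative only; the function names `map_fn` and `reduce_fn` are our own, and a real cluster job would submit equivalent map and reduce functions to Hadoop via a bridge package rather than run them locally.

```r
# Map step: emit one (key, value) pair per input record.
# (Hypothetical names for illustration; not a Hadoop API.)
map_fn <- function(record) {
  list(key = record, value = 1L)
}

# Reduce step: combine all values seen for one key.
reduce_fn <- function(key, values) {
  list(key = key, value = sum(values))
}

records <- c("a", "b", "a", "c", "a", "b")

# "Map" phase: apply map_fn to every record.
pairs <- lapply(records, map_fn)

# "Shuffle" phase: group the emitted values by key,
# which is what the Hadoop framework does between map and reduce.
keys    <- vapply(pairs, function(p) p$key, character(1))
values  <- vapply(pairs, function(p) p$value, integer(1))
grouped <- split(values, keys)

# "Reduce" phase: one output record per distinct key.
result <- lapply(names(grouped), function(k) reduce_fn(k, grouped[[k]]))
```

Because each map call sees one record and each reduce call sees only the values for one key, the same two functions scale from thousands of rows on a desktop to billions on a cluster: only the shuffle machinery changes.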
Taking Data Science to the Data Centre
Anette is a consultant for ThoughtWorks, where she builds people, teams, projects and occasionally a bit of code. She has worked across a number of countries, industries and development stacks to solve all sorts of problems.
Brian helps clients make the often difficult transition from a traditional analyse-develop-test delivery model to a more rapid, repeatable and agile mode of delivery.