In this talk we will share our experiences of deploying statistical algorithms to a Hadoop cluster, and discuss the approaches we used to scale R code from processing thousands of data points on a desktop to billions in the cloud.
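The talk itself covers the real deployment details; as a rough, in-memory sketch of the map/reduce pattern that such scaling builds on, the following example computes a per-group mean (a simple statistical reducer). It is written in Python purely for illustration rather than R, and all function and variable names are hypothetical, not taken from the talk:

```python
from collections import defaultdict

def map_phase(records):
    # Emit (group, value) pairs, as a Hadoop mapper would.
    for group, value in records:
        yield group, value

def shuffle(pairs):
    # Group values by key, mimicking Hadoop's shuffle/sort step.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Compute a per-key mean, standing in for a statistical reducer.
    return {key: sum(vals) / len(vals) for key, vals in grouped.items()}

records = [("a", 1.0), ("b", 4.0), ("a", 3.0), ("b", 6.0)]
result = reduce_phase(shuffle(map_phase(records)))
print(result)  # {'a': 2.0, 'b': 5.0}
```

On a real cluster the map and reduce phases run as separate distributed tasks (for example via Hadoop Streaming, which can invoke R scripts over stdin/stdout), and the framework performs the shuffle; the sketch above only mirrors that data flow in a single process.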
Taking Data Science to the Data Centre
Anette is a consultant for ThoughtWorks where she builds people, teams, projects and occasionally a bit of code. She has worked in a number of different countries, industries and development stacks to solve all sorts of problems, but lately it has be
Brian helps clients make the (usually difficult) transition from a traditional analyse, develop and test development model to a more rapid, repeatable and agile mode of delivery.