A number of talks on GPUs, recurrent neural networks, and developing better drugs with deep learning
Jeremy will be covering two NVIDIA products (DIGITS and cuDNN), as well as non-NVIDIA frameworks.
Jeremy Appleyard is a member of NVIDIA's European Developer Technology team. Based near Oxford, he works with developers accelerating applications on GPUs. He has experience with both deep learning and traditional scientific computing applications. He holds a Ph.D. in computational fluid dynamics from Cranfield University.
Stratified Medical is a tech company leveraging the latest in Deep Learning and Big Data technologies to improve people’s lives through better drugs. A vast amount is known about the human body and its internal mechanisms, but that knowledge is unfortunately siloed in millions of scientific articles and a smaller number of specialised databases. The Stratified team is building a big data platform that will mine large volumes of unstructured text as well as biomedical ontologies and structured data. The talk will describe how we are using Deep Learning for knowledge extraction, representation and reasoning, and how NVIDIA GPUs accelerate this research. The Stratified Medical big data AI platform will make it easier to connect knowledge, leading to new insights for better medicines.
Serial technologist, start-up founder, VC-fundraising experience and deep R&D strategist in Big Data, Natural Language Processing, state-of-the-art Deep Learning and deployment of AI platforms at internet scale for Tier-1 Silicon Valley companies. Doctorate in Machine Learning and Computer Vision, followed by 7 years of post-doctoral research in brain-inspired pattern recognition at Imperial College. Spun out a successful start-up from Imperial with multi-million VC investment and revenue from a major UK retailer within 10 months. Now working in big data and advanced machine learning to leverage the totality of human knowledge, teaching machines to understand and reason, with the goal of making a real difference in the world. Author of over 45 articles in scientific journals and conferences, with 3 granted patents in the US and EU and 4 pending patents.
Recurrent Neural Networks (RNNs), and specifically a variant with Long Short-Term Memory (LSTM), are enjoying renewed interest as a result of successful applications in a wide range of machine learning problems that involve sequential data. I will summarize my own experience with training these models for automated image captioning and for generating text character by character, with a particular focus on understanding the source of their impressive performance and their limitations.
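To give a flavour of the character-by-character generation the talk describes, here is a minimal sketch of a vanilla RNN sampling loop in NumPy. The weights are random and untrained (so the output is gibberish), and a real LSTM adds gating on top of this recurrence; the vocabulary and sizes are illustrative assumptions, not the speaker's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = list("helo ")          # toy character vocabulary (illustrative)
V, H = len(vocab), 16          # vocabulary size, hidden-state size

# Untrained random parameters of a plain RNN cell.
Wxh = rng.normal(scale=0.1, size=(H, V))   # input -> hidden
Whh = rng.normal(scale=0.1, size=(H, H))   # hidden -> hidden (the recurrence)
Why = rng.normal(scale=0.1, size=(V, H))   # hidden -> output logits

def sample(seed_char, n):
    """Generate n characters, feeding each sampled character back as input."""
    h = np.zeros(H)
    x = np.zeros(V)
    x[vocab.index(seed_char)] = 1.0        # one-hot encode the seed
    out = []
    for _ in range(n):
        h = np.tanh(Wxh @ x + Whh @ h)     # recurrent state update
        logits = Why @ h
        p = np.exp(logits - logits.max())
        p /= p.sum()                       # softmax over characters
        idx = rng.choice(V, p=p)           # sample the next character
        out.append(vocab[idx])
        x = np.zeros(V)
        x[idx] = 1.0                       # sampled character becomes next input
    return "".join(out)

print(sample("h", 20))
```

Training replaces the random weights with ones learned by backpropagation through time over a text corpus; the sampling loop itself stays the same.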
Andrej is a 5th year PhD student at Stanford University, working with Fei-Fei Li. His focus is on Deep Learning, with applications in Computer Vision, Natural Language Processing, and their intersection. He is visiting London during the summer as an intern at DeepMind.