Users of most automated decision systems lack trust in the decisions produced by such (AI) systems.
This is due to many factors, chief among them the way the decision is explained to the user. One purpose of explanations is to build trust between the user and the AI application.
This can be achieved by giving the user an understanding of the scope of automated decision-making and the reasons that led to a particular decision. Notably, the GDPR does not appear to require opening the "black box" and explaining the internal logic of the decision-making system to data subjects. With this in mind, this talk will try to answer the question: "How can we build trustworthy automated decisions?"
By using the "black art" of explanation techniques, automated decisions can be made fairer, more transparent, and more understandable to the user. The talk will also present a framework with two recommended approaches for developing explainability, highlighting some of the possible challenges.
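To make the idea of an explanation technique concrete, here is a minimal sketch of a post-hoc, per-feature explanation for a simple linear scoring model. The model, the feature names, and the weights are illustrative assumptions for this sketch only; they are not the framework or techniques presented in the talk.

```python
# Minimal sketch: explaining one automated decision by attributing the
# model's score to individual input features. For a linear model, each
# feature's contribution is simply weight * feature value.

def explain(weights, features, names):
    """Return the model score and each feature's contribution to it."""
    contributions = {n: w * x for n, w, x in zip(names, weights, features)}
    score = sum(contributions.values())
    return score, contributions

# Hypothetical learned weights and one hypothetical applicant's features.
weights = [0.6, -0.3, 0.1]
applicant = [0.8, 0.5, 0.2]
names = ["income", "debt", "tenure"]

score, contribs = explain(weights, applicant, names)
for name, value in sorted(contribs.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name}: {value:+.2f}")
```

For non-linear "black box" models the same style of per-feature attribution is produced by model-agnostic tools such as LIME or SHAP, which approximate the model locally around the decision being explained.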
Community Session: Women Leading in AI - Explaining the Automated Decisions is a Black Art
Dr Samara Banno holds a PhD in statistical machine learning and AI from the UK; she specialises in automated decision-making systems.