SkillsCast

Community Session: Women Leading in AI - Explaining the Automated Decisions is a Black Art

4th July 2019 in London at CodeNode

There are 15 other SkillsCasts available from Infiniteconf 2019 - A one-day community celebration of Big Data, Machine Learning and AI


Users of most automated decision systems lack trust in the decisions produced by such (AI) systems.

This is due to many factors, but chiefly to the way the decision is explained to the user. One potential purpose of explanations is to build trust between the user and the AI application.

This can be achieved by giving the user an understanding of the scope of automated decision-making and the reasons that led to a particular decision. Notably, the GDPR does not appear to require opening the “black box” to explain the internal logic of the decision-making system to data subjects. With this in mind, this talk will try to answer the question: “How can we build trustworthy automated decisions?”

By using the “black art” of explanation techniques, automated decisions can be made fairer, more transparent, and more understandable to the user. The talk will also present a framework suggesting two recommended approaches for developing explainability, and will highlight some of the possible challenges.
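As a concrete illustration of the kind of explanation technique the abstract refers to, the sketch below uses permutation feature importance: shuffle one input feature at a time and measure how much the model's accuracy drops, so a user can see which features a decision actually relies on. The dataset and model are illustrative assumptions, not from the talk itself.

```python
# Minimal sketch: explaining an automated classifier's decisions via
# permutation feature importance (an assumed example, not the talk's framework).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Illustrative data: a standard binary-classification benchmark.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Shuffle each feature in turn on held-out data; the bigger the accuracy
# drop, the more the model's decisions depend on that feature.
result = permutation_importance(
    model, X_test, y_test, n_repeats=5, random_state=0
)

# Report the five most influential features -- a simple, model-agnostic
# explanation a user can inspect.
ranked = sorted(
    zip(X.columns, result.importances_mean), key=lambda t: -t[1]
)
for name, score in ranked[:5]:
    print(f"{name}: {score:.3f}")
```

Because it only perturbs inputs and observes outputs, this kind of explanation does not require access to the model's internal logic, which matches the abstract's point about the GDPR and the “black box.”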


About the Speaker

Dr Samara Banno

Dr Samara Banno holds a PhD in statistical machine learning and AI from the UK; her speciality is automated decision-making systems.
