9TH – 11TH DECEMBER 2019

SEMINAR HALL – IIT DELHI, INDIA

ALGORITHMIC BIAS, TRANSPARENCY AND FAIRNESS: WHAT IS IT, WHY DOES IT MATTER AND WHAT’S BEING DONE ABOUT IT?

PAUL CLOUGH

Professor of Search and Analytics,
Information School,
University of Sheffield

11th December

12:30 PM – 1:10 PM

Paul Clough is a part-time Professor of Search and Analytics at the Information School, University of Sheffield (see http://www.shef.ac.uk/is/staff/clough). During his time in the department, as well as contributing to research and teaching activities, Paul has been Head of the Information Retrieval Group, Director of Research, and coordinator of the MSc Data Science programme. Paul conducts research in areas including information seeking and retrieval, data analytics, and natural language processing/text analytics.

Paul is also Head of Data Science for Peak Indicators (https://www.peakindicators.com/), a UK-based Business Intelligence and Analytics company, where he is helping to develop data products and services, produce educational resources and grow the data science capability within Peak Indicators. Prior to working at the University of Sheffield and Peak Indicators, Paul worked in R&D for British Telecommunications Plc at Adastral Park (UK). Paul is a Fellow of the Higher Education Academy and a Member of the British Computer Society.


ABSTRACT

Increasingly, algorithms drive information systems and services, where they influence people’s decision-making and behaviours. However, recent media coverage and academic research have shown negative effects of data-driven methods, such as discrimination and the reinforcement of social biases. This talk will review algorithmic bias, transparency and fairness, and reflect on the results of research conducted with colleagues on gender stereotypes and backlash within image search [1]. The results highlighted the need to understand how and why biases enter search algorithms, and at which stages of the engineering process. The findings also align with current concerns about the algorithms that underlie information services, especially search engines: the view of the world they present and the extent to which they are biased. The talk will also summarise initiatives, such as Microsoft’s Fairness, Accountability, Transparency and Ethics (FATE) in AI, and potential technical solutions, such as Explainable AI and tools to assist with the discovery and prevention of data and algorithmic biases.

—————————————————————————————————————————————————————————————————

[1] Otterbacher, J., Bates, J., and Clough, P. (2017). Competent Men and Warm Women: Gender Stereotypes and Backlash in Image Search Results. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ’17), ACM, New York, NY, USA, pp. 6620–6631.

PRESENTATION

Algorithmic Bias, Transparency and Fairness: What is it, Why Does it Matter and What’s Being Done About it?

