Podcast: Facial Recognition Technology Is Biased Against People of Colour and Impacts Their Lives

In conversation with filmmaker Shalini Kantayya, the director of 'Coded Bias'.

Artificial Intelligence is being used in many ways, and facial recognition is one of them. The technology is deployed by police forces in different parts of the world, in countries like China as well as in many democracies. The most troubling part is that such use is largely unregulated, with little or no independent oversight.

African Americans and Asians are at the greatest risk because the technology is trained predominantly on white faces and misidentifies people of colour at far higher rates. In China, a citizen's life can be affected in all sorts of ways – permission to travel and access to jobs can depend on behaviour that this technology monitors.

What can be done? Coded Bias, a new documentary on Netflix, explores this question with examples from around the world. Shalini Kantayya talks to Sidharth Bhatia in this podcast about how citizens' groups are fighting back against this intrusion.