
3: Can AI discriminate?
About this episode
But what about when AI is used for decisions that actually matter? Like whether a person with disabilities gets the support they need to live independently, or how the police predict who is going to commit a crime?
With people’s rights and freedom on the line, the stakes are much higher – especially because AI can discriminate.
To unpack all of this, we’re joined by Griff Ferris, Senior Legal and Policy Officer at the campaign organisation Fair Trials, to discuss the extent to which AI can discriminate, the impact it has on people who are already marginalised, and what we can do about it.
Mentioned in this episode:
- Fair Trials’ predictive policing quiz
- Fair Trials’ report Automating Injustice
- The HART algorithm used by Durham Police
- The Government’s AI regulation white paper
- Public Law Project’s Tracking Automated Government register