Predictive policing poses discrimination risk, thinktank warns

Machine-learning algorithms could replicate or amplify bias on race, sexuality and age

Predictive policing – the use of machine-learning algorithms to fight crime – risks unfairly discriminating against protected characteristics including race, sexuality and age, a security thinktank has warned.

Such algorithms, used to mine insights from data collected by police, are currently deployed for various purposes including facial recognition, mobile phone data extraction, social media analysis, predictive crime mapping and individual risk assessment.
