Training data that is meant to make predictive policing less biased is still racist
Training algorithms on crime reports from victims rather than arrest data was supposed to make the tools less biased. But it doesn't appear to.