
[Kalé Carey]
THE PRIME MINISTER’S OFFICE IN THE UK IS TESTING SOFTWARE TO PREDICT IF SOMEONE WILL BECOME A KILLER LATER IN LIFE.
IT’S A MODERN-DAY VERSION OF THE MOVIE MINORITY REPORT, WHERE TECHNOLOGY IS USED TO PREDICT AND ARREST PEOPLE BEFORE THEY COMMIT MURDERS.
SIMILAR TO THE FILM, THE MINISTRY OF JUSTICE IS USING THE HOMICIDE PREDICTION PROJECT TO GATHER POLICE AND GOVERNMENT DATA TO CREATE PROFILES AND ASSESS RISK.
THE PROJECT IS NOW TITLED SHARING DATA TO IMPROVE RISK ASSESSMENT.
STATEWATCH SUBMITTED A FREEDOM OF INFORMATION REQUEST TO GET MORE DETAILS ABOUT THE COLLABORATION BETWEEN POLICE AND THE UK’S DIVISION OF PUBLIC SAFETY.
THE GOVERNMENT EXPLAINS THE PILOT PROGRAM REVIEWS OFFENDER CHARACTERISTICS AND ASSISTS IN DETERMINING RISK ASSESSMENTS FOR HOMICIDES.
THE MINISTRY OF JUSTICE STATES IT’S FOR RESEARCH AND POLICY ONLY, NOT TO BE USED WITHIN THE COURT SYSTEM.
THE PROGRAM AIMS TO ANSWER THREE QUESTIONS: CAN DATA SCIENCE IMPROVE PREDICTIONS? HOW CAN LOCAL POLICE DATA ENHANCE ACCURACY IN PREDICTING SERIOUS VIOLENCE? AND CAN IT HELP UNDERSTAND OFFENDER RISK AND PROMOTE MORE COLLABORATION WITH LOCAL POLICE?
THE DOCUMENTS REVEAL A DATA-SHARING AGREEMENT THAT RELEASED INFORMATION ON A RANGE OF PEOPLE—BETWEEN 100,000 AND 500,000—TO CREATE WHAT THEY CALL A PREDICTIVE TOOL.
THE DATA INCLUDES INFORMATION FROM CASES INVOLVING VICTIMS, WITNESSES, MISSING PERSONS, SUSPECTS AND ANYONE FLAGGED AS A DANGER.
THE PROJECT ALSO EXPANDS ITS SYSTEM TO INCLUDE FACTORS SUCH AS MENTAL HEALTH, ADDICTION, SUICIDE, DISABILITY AND VULNERABILITY.
ALL INFORMATION DATES BACK TO BEFORE 2015, INCLUDING DEMOGRAPHIC DATA LIKE AGE, GENDER AND ETHNICITY.
PREDICTIVE MODELS ARE ALREADY USED WITHIN THE UK. ENGLAND AND WALES USE THE OFFENDER ASSESSMENT SYSTEM TO EVALUATE RISKS FOR INDIVIDUALS IN CUSTODY AND ON PROBATION WITHIN THE COMMUNITY.
A MINISTRY OF JUSTICE STUDY FOUND THAT THESE ALGORITHMS WERE MORE ACCURATE FOR WHITE OFFENDERS AND LESS EFFECTIVE FOR BLACK AND MIXED ETHNICITY OFFENDERS.
STATEWATCH RESEARCHERS ARGUE THAT THESE TOOLS ARE FLAWED BECAUSE THEY RELY ON DATA FROM A SYSTEM THAT IS INSTITUTIONALLY RACIST AND BIASED AGAINST LOW-INCOME COMMUNITIES.
THEY POINT TO A 2023 GOVERNMENT STUDY THAT USED STATISTICAL ANALYSIS TO PREDICT MURDER BASED ON PAST OFFENSES. THE STUDY, WHICH AIMED TO IMPROVE FUTURE PREDICTIONS OF SERIOUS REOFFENDING, FOUND ETHNICITY BIAS IN THESE PREDICTIVE TOOLS.
UK OFFICIALS SAY A REPORT WILL BE PUBLISHED TO ASSESS THE EFFECTIVENESS OF THE SYSTEM FOR GOVERNMENT USE.
FOR STRAIGHT ARROW NEWS, I’M KALÉ CAREY.
FIND MORE UNBIASED, FACT BASED NEWS RIGHT ON THE STRAIGHT ARROW NEWS MOBILE APP.