
What Is The Gangs Matrix?

The Gangs Matrix is a controversial database created by the Metropolitan Police (the Met) in the aftermath of London’s 2011 riots, which were sparked by the police killing of Mark Duggan. It purportedly identifies and surveils not only those at risk of committing gang-related violence, but also victims of it.

Based on a number of variables – such as previous offences, patrol logs, social media activity and friendship networks – the Gangs Matrix claims to rely on a mathematical formula to calculate a “risk score” – red, amber or green – for each person, reflecting the likelihood that they will be involved in gang violence. This intelligence is, in theory, meant to guide the efficient use of police resources and aid court prosecutions.
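
The Met has never published the formula itself, so any reconstruction is guesswork. Purely to illustrate the general shape of such a system – a weighted sum of factors banded into green, amber and red – here is a minimal sketch in Python, in which every factor name, weight and threshold is invented:

```python
# Hypothetical sketch only: the Met has not published the Gangs Matrix
# formula. The factor names, weights and thresholds below are invented
# to show the general shape of a weighted score banded green/amber/red.

FACTOR_WEIGHTS = {
    "previous_offences": 5.0,          # invented weight
    "patrol_log_mentions": 2.0,        # invented weight
    "flagged_social_media_posts": 1.5,
    "known_gang_associates": 3.0,
}

def risk_band(factors: dict) -> str:
    """Map raw factor counts to a colour band via a weighted sum."""
    score = sum(FACTOR_WEIGHTS[name] * count for name, count in factors.items())
    if score >= 40:   # invented threshold
        return "red"
    if score >= 15:   # invented threshold
        return "amber"
    return "green"

# A person with no convictions can still land in a high band, which is
# exactly the pattern critics of the matrix point to.
print(risk_band({
    "previous_offences": 0,
    "patrol_log_mentions": 2,
    "flagged_social_media_posts": 4,   # e.g. shared music videos
    "known_gang_associates": 3,
}))  # -> "amber"
```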

However, it is arguably one of the most racist and flawed policing initiatives of modern times. In a report last year, the human rights charity Amnesty International described it as “a racially biased database criminalising a generation of young black men”, revealing that 35 per cent of those on the matrix had no police intelligence linking them to gang violence or gang membership, and that most had never been charged with a crime. Sharing certain YouTube videos of grime or drill music, meanwhile, is considered a key indicator of gang affiliation by the Met.

The implications of being on the matrix are alarming, but finding out why you are on it – let alone how to be removed – is extremely difficult. One family received a letter warning that they would be evicted from their home if their son did not stop his involvement with gangs; the young man had been dead for over a year. A disabled mother’s council-provided car was seized after her son – who acted as her carer and was registered to drive the car – was arrested, only to be released without charge or further action (NFA). In Bill’s case, being on the matrix saw him forced out of his mother’s house and placed in a residential care home. He was later banned from attending the South London Learning Centre, which could have a long-term impact on his life chances.

In November, the Information Commissioner’s Office (ICO), Britain’s watchdog for data use, ruled that the matrix breached data protection rules. The investigation found that the matrix does not clearly distinguish between victims and perpetrators of crime, that some boroughs were keeping informal lists of people who were supposed to have been removed from the matrix, and that there was “blanket sharing” of data – in breach of GDPR rules – with third parties including schools, job centres and housing associations.

A separate, damning review published the following month by the Mayor of London’s Office, which oversees the Met, found that although there is a need to address violence in the capital, the number of young black people on the matrix was “disproportionate to their likelihood of criminality and victimisation.” The review ordered the force to radically reform the tool within a year. The Met said in a statement at the time that it “does not believe that the Gangs Matrix directly discriminates against any community”.

According to data obtained by WIRED via a Subject Access Request (SAR), children as young as 13 are currently listed on the Metropolitan Police’s Gangs Matrix. The list contains more than 3,000 people – the majority of them young black boys – including 55 children under 16. More than 7,000 individuals have been on the matrix at some point. About 80 per cent of those on the list are described as “African-Caribbean”, 12 per cent are from other ethnic minority backgrounds, and just eight per cent are “white European”. Yet the vast majority are considered, even by the police, to pose little threat of violence: 65 per cent currently have a green risk score, 30 per cent amber, and five per cent red.

As a result, campaigners are now calling for the matrix – whose remit covers more than eight million people – to be scrapped. “I think it’s deeply problematic,” says Tanya O’Carroll, director of Amnesty’s global technology and human rights programme. “It’s a rudimentary use of data in poorly thought-out ways that ends up being extremely discriminatory to young black boys. The way the matrix works, intelligence about people is essentially hearsay – feeling, not fact.”

The matrix is part of a growing trend of police forces across the UK using open-source intelligence, big data and machine learning in their efforts to stop crime. A report published by Liberty in February revealed that at least 14 forces across the UK – around a third – are already using what has been termed “predictive policing”.

It outlined two main strands: “predictive mapping”, which identifies areas where crime is likely to occur, and “individual risk assessment”, which predicts how likely an individual is to commit crime based on an officer’s perceptions. However, the first has led to over-policing of the Black community, and the second facilitates racial profiling, which is blatantly discriminatory. Critics argue that the broader practice of “predictive policing” is legally ambiguous, lacks accountability, and has not been proven effective.
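
To make the first strand concrete: in its crudest form, “predictive mapping” amounts to counting past incidents per map cell and sending patrols to the densest cells. The sketch below illustrates only that basic idea – real systems such as PredPol use proprietary models, and the coordinates, cell size and cutoff here are invented:

```python
# Illustrative only: the simplest form of "predictive mapping" – count
# past incidents per grid cell and flag the densest cells for patrol.
# Real systems such as PredPol use proprietary models; the coordinates,
# cell size and threshold here are invented.
import math
from collections import Counter

CELL_SIZE = 0.005  # grid cell of ~0.005 degrees (~500m of latitude); invented

def cell(lat: float, lon: float) -> tuple:
    """Snap a coordinate to its grid cell."""
    return (math.floor(lat / CELL_SIZE), math.floor(lon / CELL_SIZE))

past_incidents = [                       # invented incident locations
    (51.5412, -0.0034), (51.5410, -0.0031),
    (51.5409, -0.0037), (51.4891, -0.1120),
]

counts = Counter(cell(lat, lon) for lat, lon in past_incidents)
hotspots = [c for c, n in counts.most_common() if n >= 3]  # invented cutoff
print(hotspots)  # cells with three or more past incidents get extra patrols
```

Because the input is historical police records, cells that were heavily patrolled in the past generate more recorded incidents, and so get patrolled again – the feedback loop behind the over-policing critics describe.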

The Gangs Matrix does not exactly implement artificial intelligence or machine learning, unlike the tools of many other forces, says Hannah Couchman, a policy and campaigns officer at Liberty who authored the report. But there is still the concept of “pre-criminality” – being investigated by police without reasonable grounds, she adds. It comes into conflict with age-old concepts such as innocence until proven guilty and probable cause. “We are seeing a pattern of police forces rolling out these technologies without sufficient protections,” says Couchman.

Kent Police, the first UK police force to try to use computer algorithms to predict crime, ended its five-year deal with the US company PredPol last March, citing difficulties in proving that the technology could reduce crime. South Wales Police and the Met are testing Automated Facial Recognition (AFR) – despite high rates of incorrect identification, particularly for women and, once again, black people. Avon and Somerset Police have started using a broad mapping program to assess the likelihood of outcomes such as being a victim of stalking or taking stress-related sick leave.
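
Part of the reason those error rates matter is simple base-rate arithmetic. The purely illustrative calculation below – every number in it is invented, not a figure from the AFR trials – shows how even a low false-match rate yields mostly false alerts when a large crowd is scanned for a handful of watchlist targets:

```python
# Hedged, illustrative arithmetic – all numbers are invented, not figures
# from the AFR trials. It shows why a low false-match rate can still
# produce mostly false alerts when a large crowd is scanned.
false_match_rate = 0.001   # share of innocent faces wrongly flagged
crowd_size = 10_000        # faces scanned at an event
true_targets = 2           # watchlist members actually present
hit_rate = 0.9             # chance a present target is correctly flagged

false_alerts = false_match_rate * (crowd_size - true_targets)
true_alerts = hit_rate * true_targets
precision = true_alerts / (true_alerts + false_alerts)
print(f"{false_alerts:.0f} false alerts vs {true_alerts:.1f} true ones; "
      f"precision = {precision:.0%}")  # roughly 15%: most alerts are wrong
```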

Despite the significant problems, however, the UK’s national coordination body for law enforcement rejects any criticism. “For many years police forces have looked to be innovative in their use of technology to protect the public and prevent harm and we continue to develop new approaches to achieve these aims,” says Jon Drake, intelligence lead for the National Police Chiefs’ Council. The Black Child Agenda, however, feels this is a clear and direct act of discrimination against the Black community. But while police use of big data may be inevitable, a significant concern is the lack of accountability in these systems, whose processes remain a mystery even to the officers tasked with deploying them – and to the experts who built them. Automation bias – the hesitancy to overrule computers’ automated decisions – is also a significant problem, says Nick Jennings, a professor at Imperial College London.