Politic?

This is a blog dedicated to a personal interpretation of political news of the day. I attempt to be as knowledgeable as possible before commenting and committing my thoughts to a day's communication.

Wednesday, September 02, 2020

Predictive Policing: Algorithmic Technology

"We have to make sure that we're not putting forward technology as a solution to a problem that technology actually cannot solve."                                                   "The idea is [to] pinpoint with more accuracy where crime is more likely to happen, send officers to that particular [area] and deter the alleged future crime before it happens."                                                                                                              "We don't have a specific legal safeguard that says whether or not it's appropriate and sufficiently protective of privacy rights to re-purpose [those] databases in this way."                                                                                                                                           Cynthia Khoo, research fellow, technology and human rights lawyer, Citizen Lab, Toronto

City of Pittsburgh Police Officer Shane Kovach patrols Homewood, a Pittsburgh, Pennsylvania, neighborhood where predictive policing will soon be introduced. (Photo: Stephanie Strasburg)

Police agencies bring professional experience in community safety and security, along with the invaluable intuition of seasoned officers, to the never-ending confrontation between law and order on one side and social alienation and criminal activity on the other. That experience helps them distinguish which levels of society, which groups, and which neighbourhoods may be largely responsible for the anti-social and criminal acts that harm society as a whole. Police then respond within the parameters of allowable protocols, in aid and protection of community interests.

It's a difficult job, fraught with problems and exposure to personal danger, and liable to complaints from marginalized groups, ethnic groups and those mired in poverty, simply because much crime frequently emanates from among those disaffected groups who view themselves as victims of society. White-collar crimes are usually committed by the educated, the well-off classes of society; crimes of violence come more generally from among those disenchanted with a society in which, it is claimed, equality is an elusive goal.

"The most significant weakness of the predictive aspects of these models is the difficulty of identifying, anticipating, and factoring in the impact of the variables expected to influence future crime trends. The quantitative models identified in this research did not, and probably could not, factor in the vast array of variables that may influence crime; most of the time series models used to predict future crime rates only took into consideration one or two key influential (demographic and macro-economic) variables. While there is substantial evidence of the impact that these variables have on the crime rate, a key reason they are used in crime forecasts is that their trends can also be quantitatively documented, and hence, forecasted into the future. What is absent from these mathematical models is the integration of influential variables that are more difficult to quantify through historical time series data such as technology, life-style changes, criminal justice responses, and the level of private security and crime prevention efforts undertaken by the public."   Predicting Crime: The Review of Research, Department of Justice, Government of Canada

That notion, that crime is rooted in disadvantage, is in any event the view of the progressive mind, which feels the law should apply a different standard of expectation to those whose backgrounds are less than ideal, citing childhood traumas, unequal treatment and poverty as causes of a social alienation that edges into gangs, drugs, guns and violence. Citizen Lab, a University of Toronto research group, has issued a report warning that police use of algorithmic policing technology may infringe on residents' privacy and impinge on their Charter rights.

Algorithmic policing may be relatively new to Canada, but it is well advanced in the U.S. and the U.K. in particular. A few Canadian forces already use it, and Citizen Lab is warning against its use, calling for a halt until federal authorities complete a judicial inquiry into the technology and its pitfalls. The Vancouver police department uses a machine-learning-based system, launched four years ago, to forecast where break-and-enter crimes are likely to occur, responding by dispatching officers to patrol those areas.
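For a rough sense of what such a system does under the hood, here is a minimal sketch in Python, assuming scikit-learn and an entirely invented city grid and incident history, none of it drawn from the Vancouver system: past weekly break-and-enter counts per grid cell become simple features, a classifier scores each cell, and the highest-scoring cells are flagged for extra patrols.

# A toy sketch of place-based crime forecasting (hypothetical data throughout):
# divide the city into grid cells, build features from past weekly
# break-and-enter counts, and rank the cells by predicted risk.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_cells, n_weeks = 400, 104                          # invented grid and history
counts = rng.poisson(0.3, size=(n_cells, n_weeks))   # simulated incident counts

# Per-cell features: last week's count, the prior month's average,
# and the long-run base rate. Label: an incident in the final week.
X = np.column_stack([
    counts[:, -2],
    counts[:, -6:-2].mean(axis=1),
    counts[:, :-2].mean(axis=1),
])
y = (counts[:, -1] > 0).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Cells with the highest predicted risk would be flagged for extra patrols.
risk = model.predict_proba(X)[:, 1]
print("Ten highest-risk cells:", np.argsort(risk)[::-1][:10])

The deterrence argument, and the privacy objections, both attach to the final step: what happens once officers are routinely sent to the flagged locations.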

Sounds sensible, doesn't it? It's proactive, stopping crimes before they occur by the simple expedient of showing up and becoming more visible, so that the presence of police is in itself sufficient to deter potential criminal acts from proceeding. In Saskatoon, a partnership between the provincial government, the university and the police force feeds police data into a model that identifies youth at risk of going missing, a much-needed alert that can prevent young people from taking desperate action that often does them great harm.

In Toronto and in Calgary, law enforcement agencies used social media surveillance techniques even as the RCMP put out a tender for similar services, which other countries have used more widely. In the U.K., forces use facial recognition and automated risk-assessment systems. The Los Angeles Police Department ended a program using Palantir technology in response to intense protests and criticism from the Stop LAPD Spying Coalition, while retaining another crime-location prediction system.

In Saskatchewan, authorities decided to build an in-house system "to avoid the issues that they know can come with proprietary software", explained Ms. Khoo, such as private-sector vendors charging that the confidentiality of their technology had been breached. Citing commercial confidentiality, some forces chose to withhold documents that Citizen Lab sought through freedom-of-information requests to aid its research.

Predictive policing is built around algorithms that identify potential crime hotspots. (PredPol)

Citizen Lab contends that algorithmic policing technologies could infringe on such Charter rights as freedom of expression, peaceful assembly, equality and liberty, along with people's privacy rights. Its recommendations, following the research it undertook and the conclusions it reached, are that forces not rely on the technology as the sole basis for arrest or detention, and that court authorization be sought before deploying automated surveillance tools online and in public places.

The report asks government to impose moratoriums on the use or training of algorithms on historical police data sets until such time as a national judicial inquiry examines the technology and its implications for Charter rights. Typically, historical police data is used in systems forecasting where crime may occur or whether an individual is likely to commit a crime. Numerous studies, posits the report, as well as court decisions, have pointed out that Black and Indigenous people are over-represented in carding, street-check, arrest, sentencing and incarceration data as a result of "biased criminal justice practices or systemic discrimination."
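The report's concern about historical data can be made concrete with a small, entirely synthetic simulation; the rates and patrol shares below are invented for illustration and drawn from no real force's figures. If two areas offend at the same underlying rate but one is patrolled more heavily, the recorded incidents mirror the patrol pattern rather than the offending, and a system that allocates next year's patrols from last year's records simply carries that disparity forward.

# Synthetic illustration of how enforcement patterns, not offence rates,
# can drive the records that a predictive system learns from.
import numpy as np

rng = np.random.default_rng(1)
true_offence_rate = np.array([0.05, 0.05])   # identical underlying rates
patrol_share = np.array([0.3, 0.7])          # area B starts more heavily patrolled
population = 10_000

for year in range(5):
    # Recorded incidents depend on offending AND on where police are looking.
    recorded = rng.binomial(population, true_offence_rate * patrol_share)
    # A naive "predictive" step: allocate next year's patrols by this year's records.
    patrol_share = recorded / recorded.sum()
    print(f"year {year}: recorded {recorded}, next patrol share {np.round(patrol_share, 2)}")

Nothing in those records ever reveals that the two areas offend equally; the disparity the data shows is a disparity in attention, which is the point the studies cited above make about carding, street-check and arrest data.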

It is no great secret, on the other hand, that a high proportion of criminal activity emanates, and persists in emanating, from among the Black and Indigenous communities. The question is why any group within society should be held to a lower standard of civic behaviour than any other group. Why should expectations of respect for a law that applies equally to the entire population be lowered for these specific groups, who may demand it and, when their goal is achieved, feel justified in withholding their respect for law and order?

