
UW mathematics professors co-sign letter discouraging researchers from collaborating with police


Protesters march towards Greek Row on June 28, 2020. Organizers shared personal stories about experiencing discrimination from UW fraternities and sororities.

In the wake of protests over police violence against Black people, over 1,400 mathematicians signed a letter urging researchers to stop working with police departments and reexamine their work with a focus on ethics and racial justice.

The letter, which detailed the extent to which researchers had collaborated with law enforcement in the past, warned of the major issues embedded in working with police data, such as racial bias. The authors also called for a reexamination of data science courses taught at universities to specifically address racial justice issues and ethics in their curriculum. 

Among the co-signers of the letter were professors and researchers from the University of California, the University of Utah, Pomona College, the University of Wisconsin, and others. 

“We are really hoping that this is a moment in which more of our colleagues are ready to accept ethical responsibility for the outcomes of algorithms or other applications of mathematics to policing,” the authors, responding as a group, said in an email. “And given the deeply racist feedback loops of predictive policing algorithms, we cannot imagine an ethical reckoning resulting in anything short of ending such collaborations.”

For some of these mathematicians, the work is personal; one had previously worked on PredPol, a controversial predictive policing algorithm that purports to predict, from existing police data, where crime will occur and who will commit it.
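The feedback-loop concern the authors raise can be illustrated with a toy simulation. The sketch below is a hypothetical illustration, not PredPol's actual method or data: two neighborhoods are assumed to have the same underlying crime rate, but patrols are allocated in proportion to previously recorded incidents, so an initial disparity in the records compounds over time.

```python
# Hypothetical illustration of a predictive-policing feedback loop.
# All numbers are invented; this is not how any real system is implemented.
import random

random.seed(0)

TRUE_CRIME_RATE = 0.1           # identical underlying rate in both neighborhoods
recorded = {"A": 20, "B": 10}   # historical records skewed by past patrol levels
PATROLS_PER_WEEK = 10           # patrols available to allocate each week

for week in range(20):
    total_recorded = sum(recorded.values())
    for hood in recorded:
        # "Prediction": allocate patrols in proportion to past recorded crime.
        patrols = round(PATROLS_PER_WEEK * recorded[hood] / total_recorded)
        # More patrols means more incidents are observed and recorded,
        # even though the true rate is the same in both neighborhoods.
        observed = sum(random.random() < TRUE_CRIME_RATE for _ in range(patrols * 10))
        recorded[hood] += observed

# The gap between the two neighborhoods' records widens over time,
# despite equal true crime rates.
print(recorded)
```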

Though discussions about racial justice and algorithms have only recently reached mass attention, many researchers have worked with police departments, and crime-prediction algorithms remain in high demand. The authors of the letter said they felt a responsibility to speak up, especially given the risk that such work lends a scientific justification to racism.

“I think the research shows that the predictive policing as it exists now just gives a scientific veneer to a system which is racially biased,” Christopher Hoffman, UW professor and one of the co-authors, said. 

These algorithms have been criticized and controversial throughout their use. A 2016 ProPublica investigation into an algorithm that predicted defendants' likelihood of committing future offenses found it to be biased against Black people.

One way to address some of these biases is through education; the authors argued that university curricula have lacked a meaningful discussion of ethics.

“Most data science courses don't even begin to acknowledge the fact that real people are being affected by the algorithms that we design ... and how deeply biased a lot of data sets are,” they said in an email. “So there is a lot of space for growth.”

Some researchers, though, may see the discouragement of predictive policing research as a threat to the integrity of science and a suggestion that their work is political. The collective of authors countered that notion by emphasizing the inherently political nature of this kind of science.

“Research related to social issues — such as crimes and policing — is inherently political,” the collective said in an email. “Any successful model of real world effects requires expertise from those studying the real world; so it would be an odd choice for mathematicians studying crime and policing to not incorporate all expertise on the matter.”

The collective has been highlighting and uplifting ongoing efforts to integrate ethics into university curricula, such as drawing on the work of social scientists who have studied policing and race, and supporting activist organizations committed to change. Data for Black Lives is one such movement of researchers and data scientists committed to making actionable change for Black people.

Extending beyond policing, the researchers also pushed for algorithms with high social impact to undergo audits before they are used. These tools can drastically affect the lives of the people they come in contact with, underscoring the need for thorough ethical examination.

The consortium of authors also pointed to ongoing movements like Decriminalize UW as ways that students, whether math majors or not, could get involved in severing ties with police.


Reach reporter Thelonious Goerz at news@dailuw.com. Twitter: @TheloniousGoerz

