Police are looking to algorithms to predict domestic violence


Law enforcement officers use various tools, from simple questionnaires to algorithms, as a way to prioritize the highest-risk crimes, but it is unclear whether they are effective.

Domestic abuse is a widespread problem in the United States and around the world. Violence at the hands of an intimate partner has affected more than 600 million women globally, according to World Health Organization estimates, and the problem has only grown during the pandemic.

Law enforcement officers have turned to various tools, from simple questionnaires to algorithms, as a way to prioritize the highest-risk crimes. While some research has recognized the potential benefits of the tools, it has also left experts in the domestic violence community with questions about the ethics and efficacy of relying on technology to predict future violent acts.

Matthew Bland, an associate professor in evidence-based policing at the University of Cambridge, said there is broad acknowledgment that something needs to be done to improve services for domestic violence victims, but that how, or whether, to use technology as a solution is up for debate.

“We’re still quite polarized, I think, as a domestic abuse community, on the right way forward,” he said. 

Range of techniques

Some tools used by police are effectively just paper questionnaires. In the United Kingdom, police use a relatively simple tool called DASH, short for “Domestic Abuse, Stalking and Harassment and Honor-Based Violence.” After an incident, police question victims and add up the number of “yes” responses to produce a risk classification that guides their response.
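To make the mechanics concrete, here is a minimal sketch of that kind of checklist tally. The question wording, the number of items, and the cut-off scores below are invented for illustration; they are not the actual DASH items or the grading police use.

```python
# Illustrative sketch of a DASH-style checklist tally. Questions and
# cut-offs are hypothetical, not the real DASH checklist or grading.

CHECKLIST_QUESTIONS = [
    "Has the abuser ever used a weapon or threatened to use one?",
    "Is the abuse happening more often?",
    "Did the current incident result in injury?",
    # ...a real checklist runs to roughly two dozen items
]

def classify_risk(yes_count: int) -> str:
    """Map a count of 'yes' answers to a risk band (hypothetical cut-offs)."""
    if yes_count >= 14:
        return "high"
    if yes_count >= 8:
        return "medium"
    return "standard"

answers = [True, False, True]        # one entry per question asked
print(classify_risk(sum(answers)))   # -> "standard" with these answers
```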

Although the idea has gained the most traction in Europe, some police forces in the United States also use a basic form of risk assessment similar to DASH. 

Other systems are relatively advanced. The government of Spain launched an ambitious project in 2007 to battle domestic violence through a system called VioGén. Its goal was to build a centralized system for domestic violence cases that could also predict future incidents. 

VioGén is powered by an algorithm developed by researchers based on which factors in an incident have been linked to high-risk cases in the past. Police log details of a case, such as whether the aggressor has made death threats or uses drugs, and VioGén calculates a score from those inputs.

VioGén has since performed millions of “risk evaluations.” The scale rates risk from lowest to highest and guides how police respond, including whether to pursue charges or provide a victim with police protection.
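As a rough illustration of how a score like this can be computed and mapped to a lowest-to-highest scale, here is a minimal sketch of a weighted checklist. The factor names, weights, and thresholds are assumptions made for illustration; the article does not disclose VioGén's actual internals.

```python
# Minimal sketch of a VioGén-style weighted risk score. Factor names,
# weights, and band thresholds are illustrative assumptions only.

CASE_FACTOR_WEIGHTS = {
    "death_threats": 3.0,
    "uses_drugs": 1.5,
    "prior_violence": 2.0,
    "access_to_weapons": 2.5,
}

# Risk bands ordered from lowest to highest, with made-up score thresholds.
RISK_BANDS = [(0.0, "negligible"), (3.0, "low"), (6.0, "medium"),
              (9.0, "high"), (12.0, "extreme")]

def risk_band(case: dict) -> str:
    """Sum the weights of the factors present, then map the score to a band."""
    score = sum(w for factor, w in CASE_FACTOR_WEIGHTS.items() if case.get(factor))
    band = RISK_BANDS[0][1]
    for threshold, name in RISK_BANDS:
        if score >= threshold:
            band = name
    return band

print(risk_band({"death_threats": True, "uses_drugs": True}))  # -> "low" (score 4.5)
```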

Today, VioGén is likely the most advanced predictive domestic violence tool. According to a report from Eticas Foundation, a nonprofit tech advocacy group that studied the tool, there were more than 670,000 cases in the system at the beginning of 2022. 

Effectiveness and ethics

Are the tools effective at preventing domestic violence?

“That’s kind of the gigantic elephant in the room, not only in Spain but with all risk assessment tools,” said Juan Jose Medina Ariza, a researcher in crime sciences at the University of Seville. “We don’t really know” whether putting these tools in the hands of police improves their response to domestic violence, he said.

Researchers have found that some relatively simple tools like DASH are disappointing. One 2019 study by Medina Ariza and colleagues found that the system was “underperforming” and was “at best, weakly predictive of revictimization.”

The published research on VioGén has been relatively positive, Medina Ariza said—but it’s been criticized for being evaluated by researchers who work directly on the tool with the Spanish government. 

Eticas CEO Gemma Galdon said there needs to be more transparency from the Spanish Ministry of the Interior, which developed the system. Police have leeway to override the algorithm and manually raise the risk level of a case, but officers followed the algorithm 95 percent of the time, according to the Eticas report, which relied on the limited data available on the system.

Without independent third-party audits, Galdon said, the public can’t truly be assured that tools like VioGén are effective and resources are reaching the people they’re meant to help.

“When a woman with a low risk score is killed, the ministry cannot say, with confidence, ‘This is an anecdote, and the system works,’ ” Galdon said. “That is very, very, very concerning.” 

The Spanish Ministry of the Interior did not respond to a request for comment.

More options, more controversy 

Some officials and researchers have suggested using more data-intensive techniques. One controversial idea: machine learning.

VioGén’s decisions are based on factors that researchers determined in advance to be linked to violence; whether the aggressor has had suicidal thoughts, for example, is one of its inputs.

But a machine learning tool can draw its own conclusions about risk. Such a system could read through police data on crimes and decide autonomously which cases are the highest risk, based on factors like prior arrests and convictions. The system could even decide that cases from certain zip codes are higher risk because it sees more reports of abuse from those neighborhoods.
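For illustration, here is a hedged sketch of the kind of supervised model described above, trained on historical case records to rank new cases by risk. The dataset, column names, and the choice of scikit-learn's gradient boosting classifier are assumptions for the example, not a description of any deployed police system.

```python
# Hedged sketch: train a classifier on historical case records to rank cases
# by risk. File name, features, and model choice are illustrative assumptions.

import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical columns: prior_arrests, prior_convictions, prior_dv_reports,
# victim_injured, weapon_involved, and the outcome revictimized_within_year.
cases = pd.read_csv("historical_dv_cases.csv")  # hypothetical file

features = ["prior_arrests", "prior_convictions", "prior_dv_reports",
            "victim_injured", "weapon_involved"]
X_train, X_test, y_train, y_test = train_test_split(
    cases[features], cases["revictimized_within_year"],
    test_size=0.3, random_state=0)

model = GradientBoostingClassifier().fit(X_train, y_train)

# AUC gauges how well the predicted risk ranking separates cases that were
# later revictimized from those that were not.
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```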

Multiple researchers have found that they were able to improve on the predictions of simple risk assessments by using such a technique. But Medina Ariza, who also published a paper finding that a machine learning technique could improve on the predictive power of the United Kingdom’s DASH tool if it were implemented, said using machine learning in domestic abuse cases remains ethically controversial.

The technique relies on past data to make predictions about the future, raising the concern that it will reinforce past prejudices, like a focus on one racial group. If a machine learning algorithm is trained on arrest data, for example, it may overpredict abuse in groups that police disproportionately arrest.
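That concern can at least be measured. Continuing the previous sketch, and assuming a hypothetical "group" column in the same data, a basic audit would compare how often the model flags cases in each group; the 0.5 threshold here is an arbitrary assumption.

```python
# Continuation of the sketch above: compare flag rates across groups.
# The "group" column and the 0.5 threshold are hypothetical assumptions.

audit = X_test.copy()
audit["group"] = cases.loc[X_test.index, "group"]           # hypothetical label
audit["flagged"] = model.predict_proba(X_test)[:, 1] >= 0.5

# If one group is arrested more often, its historical labels (and so its flag
# rate) can be inflated even if underlying rates of abuse are similar.
print(audit.groupby("group")["flagged"].mean())
```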

“Our fear is that we are substituting really faulty and discriminatory human systems with even worse and more opaque technical systems,” Galdon said.

Still, the idea of using machine learning to sort cases is being toyed with. Last year, for example, police in Queensland, Australia, announced that they would pilot the use of a machine learning program trained on police data to predict the highest-risk domestic violence offenders. 

According to The Guardian, police officials said officers would use the tools to predict which cases would escalate and be “proactively knocking on doors without any call for service.” Matt Adams, a spokesperson for the Queensland Police Service, told The Markup that the trial has been delayed by COVID, but the police are moving ahead with the plan.

Medina Ariza said that, at the very least, researchers have shown that big data techniques have been better able to predict domestic abuse than the simplest risk assessments. 

“The question then becomes one of, is it O.K. to use a machine learning model, even with all of the debates that are going on about algorithmic fairness?” he said. “I think that that’s still very much an open question.”

This article was originally published on The Markup and was republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.
