Artificial intelligence could identify gang crimes—and ignite an ethical firestorm | Science | AAAS


15 bookmarks. First posted by nicklally march 2018.


Matthew Hutson:
…the partially generative algorithm reduced errors by close to 30%, the team reported at the Artificial Intelligence, Ethics, and Society (AIES) conference this month in New Orleans, Louisiana. The researchers have not yet tested their algorithm’s accuracy against trained officers.

It’s an “interesting paper,” says Pete Burnap, a computer scientist at Cardiff University who has studied crime data. But although the predictions could be useful, it’s possible they would be no better than officers’ intuitions, he says. Haubert [Doug Haubert, city prosecutor of Long Beach, California, quoted earlier in the article] agrees, but he says that having the assistance of data modeling could sometimes produce “better and faster results.” Such analytics, he says, “would be especially useful in large urban areas where a lot of data is available.”

But researchers attending the AIES talk raised concerns during the Q&A afterward. How could the team be sure the training data were not biased to begin with? What happens when someone is mislabeled as a gang member? Lemoine [Blake Lemoine, a Google software engineer in the audience] asked rhetorically whether the researchers were also developing algorithms that would help heavily patrolled communities predict police raids.

Hau Chan, a computer scientist now at Harvard University who was presenting the work, responded that he couldn’t be sure how the new tool would be used. “I’m just an engineer,” he said. Lemoine quoted a lyric from a [Tom Lehrer] song about the wartime rocket scientist Wernher von Braun, in a heavy German accent: “Once the rockets are up, who cares where they come down?” Then he angrily walked out.

Approached later for comment, Lemoine said he had talked to Chan to smooth things over. “I don’t necessarily think that we shouldn’t build tools for the police, or that we should,” Lemoine said (commenting, he specified, as an individual, not as a Google representative). “I think that when you are building powerful things, you have some responsibility to at least consider how could this be used.”

Two of the paper’s senior authors spent nearly 20 minutes deflecting such questions during a later interview. “It’s kind of hard to say at the moment,” said Jeffrey Brantingham, an anthropologist at the University of California, Los Angeles. “It’s basic research.” Milind Tambe, a computer scientist at the University of Southern California in Los Angeles, agreed. Might a tool designed to classify gang crime be used to, say, classify gang crime? They wouldn’t say.
ai  police  ethics  machinelearning 
march 2018 by charlesarthur
By Matthew Hutson
Feb. 28, 2018
doi:10.1126/science.aat4510
sp_issues  algorithmBias  ethics  reliability 
march 2018 by Frieda.Mendelsohn
march 2018 by slee2004
Design engineer's comment: "I'm just an engineer."
ethics  policing  artificial_intelligence  crime  algorithms  bias 
march 2018 by rvenkat
"How could the team be sure the training data were not biased to begin with? What happens when someone is mislabeled as a gang member? Lemoine asked rhetorically whether the researchers were also developing algorithms that would help heavily patrolled communities predict police raids.

Hau Chan, a computer scientist now at Harvard University who was presenting the work, responded that he couldn’t be sure how the new tool would be used. “I’m just an engineer,” he said. Lemoine quoted a lyric from a song about the wartime rocket scientist Wernher von Braun, in a heavy German accent: “Once the rockets are up, who cares where they come down?” Then he angrily walked out.

Approached later for comment, Lemoine said he had talked to Chan to smooth things over. “I don’t necessarily think that we shouldn’t build tools for the police, or that we should,” Lemoine said (commenting, he specified, as an individual, not as a Google representative). “I think that when you are building powerful things, you have some responsibility to at least consider how could this be used.”"
ethics  technology  police  future  profiling  ai 
march 2018 by ssam
Favorite tweet: hypervisible

Researchers build AI to identify gang members. When asked about potential misuses, presenter (a computer scientist at Harvard) says "I'm just an engineer." 🤦🏿‍♂️🤦🏿‍♂️🤦🏿‍♂️

— chris g (@hypervisible) March 4, 2018

http://twitter.com/hypervisible/status/970276540419395584
IFTTT  twitter  favorite 
march 2018 by tswaterman
software  politics  police  AI 
march 2018 by nicklally