jm + policing   16

China’s Operating Manuals for Mass Internment and Arrest by Algorithm - ICIJ
“The Chinese have bought into a model of policing where they believe that through the collection of large-scale data run through artificial intelligence and machine learning that they can, in fact, predict ahead of time where possible incidents might take place, as well as identify possible populations that have the propensity to engage in anti-state anti-regime action,” said Mulvenon, the SOS International document expert and director of intelligence integration. “And then they are preemptively going after those people using that data.”

Mulvenon said IJOP is more than a “pre-crime” platform, but a “machine-learning, artificial intelligence, command and control” platform that substitutes artificial intelligence for human judgment. He described it as a “cybernetic brain” central to China’s most advanced police and military strategies. Such a system “infantilizes” those tasked with implementing it, said Mulvenon, creating the conditions for policies that could spin out of control with catastrophic results.

The program collects and interprets data without regard to privacy, and flags ordinary people for investigation based on seemingly innocuous criteria, such as daily prayer, travel abroad, or frequently using the back door of their home.

Perhaps even more significant than the actual data collected are the grinding psychological effects of living under such a system. With batteries of facial-recognition cameras on street corners, endless checkpoints and webs of informants, IJOP generates a sense of an omniscient, omnipresent state that can peer into the most intimate aspects of daily life. As neighbors disappear based on the workings of unknown algorithms, Xinjiang lives in a perpetual state of terror.

The seeming randomness of investigations resulting from IJOP isn’t a bug but a feature, said Samantha Hoffman, an analyst at the Australian Strategic Policy Institute whose research focuses on China’s use of data collection for social control. “That’s how state terror works,” Hoffman said. “Part of the fear that this instills is that you don’t know when you’re not OK.”
terror  dystopia  china  algorithms  ijop  future  policing  grim-meathook-future  privacy  data-privacy  uighurs 
12 weeks ago by jm
Europol Tells Internet Archive That Much Of Its Site Is 'Terrorist Content' | Techdirt
'The Internet Archive has a few staff members that process takedown notices from law enforcement who operate in the Pacific time zone. Most of the falsely identified URLs mentioned here (including the report from the French government) were sent to us in the middle of the night – between midnight and 3am Pacific – and all of the reports were sent outside of the business hours of the Internet Archive.

The one-hour requirement essentially means that we would need to take reported URLs down automatically and do our best to review them after the fact.

It would be bad enough if the mistaken URLs in these examples were for a set of relatively obscure items on our site, but the EU IRU’s lists include some of the most visited pages on archive.org and materials that obviously have high scholarly and research value.'
eu  europol  policing  france  archive.org  archival  web  freedom  censorship  fail 
april 2019 by jm
Computer says "prison camp"
China: Big Data Fuels Crackdown in Minority Region:
Chinese authorities are building and deploying a predictive policing program based on big data analysis in Xinjiang, Human Rights Watch said today. The program aggregates data about people – often without their knowledge – and flags those it deems potentially threatening to officials. According to interviewees, some of those targeted are detained and sent to extralegal “political education centers” where they are held indefinitely without charge or trial, and can be subject to abuse.

“For the first time, we are able to demonstrate that the Chinese government’s use of big data and predictive policing not only blatantly violates privacy rights, but also enables officials to arbitrarily detain people,” said Maya Wang, senior China researcher at Human Rights Watch. “People in Xinjiang can’t resist or challenge the increasingly intrusive scrutiny of their daily lives because most don’t even know about this ‘black box’ program or how it works.”


(via Zeynep Tufekci)
via:zeynep  human-rights  china  grim-meathook-future  future  grim  policing  xinjiang  prison-camps  surveillance  big-data 
january 2019 by jm
A UK police force is dropping tricky cases on advice of an algorithm
Wow, this is a terrible idea. It will definitely launder existing human bias into its decisions.

However, because the technology bases its predictions on past investigations, any biases contained in those decisions may be reinforced by the algorithm. For example, if there are areas that don’t have CCTV and police frequently decided not to pursue cases there, people in those places could be disadvantaged. “When we train algorithms on the data on historical arrests or reports of crime, any biases in that data will go into the algorithm and it will learn those biases and then reinforce them,” says Joshua Loftus at Stanford University in California.
[...]

Police forces only ever know about crimes they detect or have reported to them, but plenty of crime goes unreported, especially in communities that have less trust in the police.
This means the algorithms are making predictions based on a partial picture. While this sort of bias is hard to avoid, baking it into an algorithm may make its decisions harder to hold to account compared with an officer’s. John Phillips, superintendent at Kent Police, says that for the types of crimes that EBIT is being used for, under-reporting isn’t an issue and so shouldn’t affect the tool’s effectiveness.


....well, I guess that's OK then? I would have assumed under-reporting would be a massive source of bias alright....
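To see how this plays out, here's a minimal simulation (invented rates, nothing from the article): two areas with identical true offence rates, where one reports crime to police at half the rate of the other. A triage model trained on the recorded data alone concludes the under-reporting area is safer.

```python
# Hypothetical numbers for illustration only.
import random

random.seed(0)
TRUE_OFFENCE_RATE = 0.10              # same in both areas
REPORT_RATE = {"A": 0.90, "B": 0.45}  # area B reports at half the rate
N = 10_000

recorded = {}
for area, report_rate in REPORT_RATE.items():
    count = 0
    for _ in range(N):
        offence = random.random() < TRUE_OFFENCE_RATE
        # The police dataset only ever contains offences that were reported.
        if offence and random.random() < report_rate:
            count += 1
    recorded[area] = count / N

# An EBIT-style triage tool trained on recorded crime alone "learns" that
# area B is safer and deprioritises cases there, which can depress trust
# and reporting further, reinforcing the gap.
for area, rate in recorded.items():
    print(f"area {area}: recorded rate {rate:.3f}, true rate {TRUE_OFFENCE_RATE}")
```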
bias  machine-learning  ml  ai  cctv  police  uk  kent  policing 
january 2019 by jm
Palantir Knows Everything About You
This is so fucking dystopian:
Operation Laser has made L.A. cops more surgical — and, according to community activists, unrelenting. Once targets are enmeshed in a [Palantir] spidergram, they’re stuck.

Manuel Rios, 22, lives in the back of his grandmother’s house at the top of a hill in East L.A., in the heart of the city’s gang area. [...] He grew up surrounded by friends who joined Eastside 18, the local affiliate of the 18th Street gang, one of the largest criminal syndicates in Southern California. Rios says he was never “jumped in”—initiated into 18. He spent years addicted to crystal meth and was once arrested for possession of a handgun and sentenced to probation. But except for a stint in county jail for a burglary arrest inside a city rec center, he’s avoided further trouble and says he kicked his meth habit last year.

In 2016, Rios was sitting in a parked car with an Eastside 18 friend when a police car pulled up. His buddy ran, pursued by the cops, but Rios stayed put. “Why should I run? I’m not a gang member,” he says over steak and eggs at the IHOP near his home. The police returned and handcuffed him. One of them took his picture with a cellphone. “Welcome to the gang database!” the officer said.

Since then he’s been stopped more than a dozen times, he says, and told that if he doesn’t like it he should move. He has nowhere to go. His girlfriend just had a baby girl, and he wants to be around for them. “They say you’re in the system, you can’t lie to us,” he says. “I tell them, ‘How can I be in the hood if I haven’t got jumped in? Can’t you guys tell people who bang and who don’t?’ They go by their facts, not the real facts.”

The police, on autopilot with Palantir, are driving Rios toward his gang friends, not away from them, worries Mariella Saba, a neighbor and community organizer who helped him get off meth. When whole communities like East L.A. are algorithmically scraped for pre-crime suspects, data is destiny, says Saba. “These are systemic processes. When people are constantly harassed in a gang context, it pushes them to join. They internalize being told they’re bad.”
palantir  surveillance  privacy  precrime  spidergrams  future  la  gangs  justice  algorithms  data-protection  data-privacy  policing  harrassment 
april 2018 by jm
A Closer Look at Experian Big Data and Artificial Intelligence in Durham Police
'UK police bought profiling data for their artificial intelligence (AI) system, deciding whether to hold suspects in custody, from ... Experian.'

'The AI tool uses 34 data categories including the offender’s criminal history, combined with their age, gender and two types of residential postcode. The use of postcode data is problematic in predictive software of this kind as it carries a risk of perpetuating bias towards areas marked by community deprivation.'
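A toy sketch of that proxy problem (invented numbers, not HART's actual model): score suspects purely from historical custody rates per postcode area, and the model reproduces whatever pattern past decisions baked into those areas, without ever being given a "deprivation" field.

```python
# Hypothetical historical data: (postcode_area, was_held_in_custody) pairs.
from collections import defaultdict

history = [
    ("AB1", True), ("AB1", True), ("AB1", False),   # deprived area: held 2/3
    ("CD2", False), ("CD2", False), ("CD2", True),  # affluent area: held 1/3
]

held = defaultdict(int)
total = defaultdict(int)
for area, was_held in history:
    total[area] += 1
    held[area] += was_held

def custody_score(area: str) -> float:
    """Score a new suspect purely from the historical custody rate of their area."""
    return held[area] / total[area]

# Two otherwise-identical suspects get different scores on address alone.
print(f"AB1 suspect: {custody_score('AB1'):.2f}")  # 0.67
print(f"CD2 suspect: {custody_score('CD2'):.2f}")  # 0.33
```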
experian  marketing  credit-score  data  policing  uk  durham  ai  statistics  crime  hart 
april 2018 by jm
The criminal exploits of "Prawo Jazdy"
Excellent policing folklore here....

'Eventually a letter was sent to the Polish embassy to ask for the Polish government's assistance in bringing this rogue motorist to justice.
Their reply was as swift as it was courteous. It said "Prawo Jazdy is Polish for driver's license".'
gardai  policing  ireland  polish  driving  safety  road-safety  funny  anecdotes 
march 2017 by jm
Founder of Google X has no concept of how machine learning as a policing tool risks reinforcing implicit bias
This is shocking:
At the end of the panel on artificial intelligence, a young black woman asked [Sebastian Thrun, CEO of the education startup Udacity, who is best known for founding Google X] whether bias in machine learning “could perpetuate structural inequality at a velocity much greater than perhaps humans can.” She offered the example of criminal justice, where “you have a machine learning tool that can identify criminals, and criminals may disproportionately be black because of other issues that have nothing to do with the intrinsic nature of these people, so the machine learns that black people are criminals, and that’s not necessarily the outcome that I think we want.”
In his reply, Thrun made it sound like her concern was one about political correctness, not unconscious bias. “Statistically what the machines do pick up are patterns and sometimes we don’t like these patterns. Sometimes they’re not politically correct,” Thrun said. “When we apply machine learning methods sometimes the truth we learn really surprises us, to be honest, and I think it’s good to have a dialogue about this.”


"the truth"! Jesus. We are fucked
google  googlex  bias  racism  implicit-bias  machine-learning  ml  sebastian-thrun  udacity  inequality  policing  crime 
october 2016 by jm
Machine Bias: There’s Software Used Across the Country to Predict Future Criminals. And it’s Biased Against Blacks. - ProPublica
holy crap, this is dystopian:
The first time Paul Zilly heard of his score — and realized how much was riding on it — was during his sentencing hearing on Feb. 15, 2013, in court in Barron County, Wisconsin. Zilly had been convicted of stealing a push lawnmower and some tools. The prosecutor recommended a year in county jail and follow-up supervision that could help Zilly with “staying on the right path.” His lawyer agreed to a plea deal.
But Judge James Babler had seen Zilly’s scores. Northpointe’s software had rated Zilly as a high risk for future violent crime and a medium risk for general recidivism. “When I look at the risk assessment,” Babler said in court, “it is about as bad as it could be.”
Then Babler overturned the plea deal that had been agreed on by the prosecution and defense and imposed two years in state prison and three years of supervision.
dystopia  law  policing  risk  risk-assessment  northpointe  racism  fortune-telling  crime 
may 2016 by jm
Dr TJ McIntyre: Fight against cybercrime needs funding, not more words - Independent.ie
Is the Irish policing system capable of tackling computer crime? A report this week from the Garda Inspectorate makes it clear that the answer is no. There is no Garda cybercrime unit, which is of serious concern given the threat posed by cybercrime to key national infrastructure such as energy, transport and telecommunications systems. [...]

A combination of inadequate resources and increased workload has swamped the [Computer Crime Investigation Unit]. Today, almost every crime is a computer crime, in the sense that mobile phones, laptops and even devices such as game consoles are likely to contain evidence. The need to forensically inspect all these devices - using outdated equipment - has resulted in several-year delays and seems to have forced the unit into a position where it is running to stand still rather than responding to new developments.
via:tjmcintyre  ireland  cybercrime  law  policing  hacking 
december 2015 by jm
Lisa Jones, girlfriend of undercover policeman Mark Kennedy: ‘I thought I knew him better than anyone’ | UK news | The Guardian
She thought they were a normal couple until she found a passport in a glovebox – and then her world shattered. Now she is finally getting compensation and a police apology for that surreal, state-sponsored deception. But she still lies awake and wonders: did he ever really love me?


I can't believe this was going on in the 2000s!
surveillance  police  uk  undercover  scandals  policing  environmentalism  greens 
november 2015 by jm
The Gardai haven't requested info on any Twitter accounts in the past 6 months
This seems to imply they haven't been investigating any allegations of cyber-bullying/harassment from "anonymous" Twitter handles, despite having the legal standing to do so. Enforcement is needed, not new laws.
cyber-bullying  twitter  social-media  enforcement  gardai  policing  harassment  online  society  law  government 
february 2014 by jm
Filters 'not a silver bullet' that will stop perverts, warns Interpol chief - Independent.ie
Sunday Independent interview with Interpol assistant director Mick Moran:
Moran spoke out after child welfare organisations here called on the Government to follow the UK's example by placing anti-pornography filters on Irish home broadband connections. The Irish Society for the Prevention of Cruelty to Children argued that pornography was damaging to young children and should be removed from their line of sight.

But Moran warned this would only lull parents into a false sense of security. "If we imagine the access people had to porn in the past – that access is now complete and total. They have access to the most horrific material out there. We now need to focus on parental responsibility about how kids are using the internet."
mick-moran  cam  interpol  policing  ispcc  filtering  parenting  children  broadband 
august 2013 by jm
