jm + facial-recognition   6

ACLU to Amazon: Get out of the surveillance business
This is a fair point from the ACLU:
Already, Rekognition is in use in Florida and Oregon. Government agencies in California and Arizona have sought information about it, too. And Amazon didn't just sell Rekognition to law enforcement, it's actively partnering with them to ensure that authorities can fully utilize Rekognition's capabilities.

Amazon has branded itself as customer-centric, opposed secret government surveillance, and has a CEO who publicly supported First Amendment freedoms and spoke out against the discriminatory Muslim Ban. Yet, Amazon is powering dangerous surveillance that poses a grave threat to customers and communities already unjustly targeted in the current political climate.
We must make it clear to Amazon that we won't stand by and let it pad its bottom line by selling out our civil rights.
aclu  amazon  rekognition  facial-recognition  faces  law  privacy  data-privacy  civil-rights 
4 days ago by jm
Face recognition police tools 'staggeringly inaccurate' - BBC News
"In figures given to Big Brother Watch, South Wales Police said its facial recognition technology had made 2,685 "matches" between May 2017 and March 2018 - but 2,451 were false alarms."

This is going to be a disaster.
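For scale: taking the two figures quoted above at face value, roughly nine out of every ten alerts the system raised were wrong. A quick back-of-the-envelope sketch — plain arithmetic on the reported numbers, not a model of the deployed system:

```python
# South Wales Police figures as quoted above (May 2017 - March 2018).
matches = 2685        # total "matches" (alerts) raised by the system
false_alarms = 2451   # alerts that turned out to be incorrect

true_matches = matches - false_alarms            # 234
precision = true_matches / matches               # fraction of alerts that were correct
false_discovery_rate = false_alarms / matches    # fraction of alerts that were wrong

print(f"true matches:         {true_matches}")
print(f"precision:            {precision:.1%}")             # ~8.7%
print(f"false discovery rate: {false_discovery_rate:.1%}")  # ~91.3%
```

Note that this says nothing about how many faces were scanned without triggering an alert at all; with large crowds and a low base rate of genuine targets, even a nominally accurate matcher will produce mostly false alarms.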
surveillance  bbc  wales  facial-recognition  privacy  false-positives  ml 
11 days ago by jm
Do algorithms reveal sexual orientation or just expose our stereotypes?
'A study claiming that artificial intelligence can infer sexual orientation from facial images caused a media uproar in the Fall of 2017. [...] Michal Kosinski, who co-authored the study with fellow researcher Yilun Wang, initially expressed surprise, calling the critiques “knee-jerk” reactions. However, he then proceeded to make even bolder claims: that such AI algorithms will soon be able to measure the intelligence, political orientation, and criminal inclinations of people from their facial images alone.'

'In [this paper], we have shown how the obvious differences between lesbian or gay and straight faces in selfies relate to grooming, presentation, and lifestyle  —  that is, differences in culture, not in facial structure. [...] We’ve demonstrated that just a handful of yes/no questions about these variables can do nearly as good a job at guessing orientation as supposedly sophisticated facial recognition AI. Therefore — at least at this point — it’s hard to credit the notion that this AI is in some way superhuman at “outing” us based on subtle but unalterable details of our facial structure.'
culture  facial-recognition  ai  papers  facial-structure  sexual-orientation  lgbt  computer-vision 
january 2018 by jm
Ireland goes Big Brother as police upgrade snooping abilities - The Register
The Garda Síochána has proposed to expand its surveillance on Irish citizens by swelling the amount of data it collects on them through an increase in its CCTV and ANPR set-ups, and will also introduce facial and body-in-a-crowd biometrics technologies. [...] The use of Automated Facial Recognition (AFR) technology is fairly troubled in the UK, with the independent biometrics commissioner warning the government that it was risking inviting a legal challenge back in March. It is no less of an issue in Ireland, where the Data Protection Commissioner (DPC) audited Facebook in 2011 and 2012, and scolded the Zuckerborg over its use of facial recognition technology.
afr  facial-recognition  minority-report  surveillance  ireland  gardai  cctv  anpr  biometrics  privacy 
june 2016 by jm
Red lines and no-go zones - the coming surveillance debate
The Anderson Report on RIPA in the UK introduces the concept of a "red line":
"Firm limits must also be written into the law: not merely safeguards, but red lines that may not be crossed." …   
"Some might find comfort in a world in which our every interaction and movement could be recorded, viewed in real time and indefinitely retained for possible future use by the authorities. Crime fighting, security, safety or public health justifications are never hard to find." [13.19] 

The Report then gives examples of such red lines:
- a perpetual video feed from every room in every house, with the police undertaking to view the record only on receipt of a complaint;
- blanket drone-based surveillance;
- licensed service providers required, as a condition of the licence, to retain within the jurisdiction a complete plain-text version of every communication, to be made available to the authorities on request;
- a constant data feed from vehicles, domestic appliances and health-monitoring personal devices;
- facial recognition software fitted to every CCTV camera; and
- a location-tracking chip inserted under every individual's skin.

It goes on:
"The impact of such powers on the innocent could be mitigated by the usual apparatus of safeguards, regulators and Codes of Practice. But a country constructed on such a basis would surely be intolerable to many of its inhabitants. A state that enjoyed all those powers would be truly totalitarian, even if the authorities had the best interests of its people at heart." [13.20] …  

"The crucial objection is that of principle. Such a society would have gone beyond Bentham's Panopticon (whose inmates did not know they were being watched) into a world where constant surveillance was a certainty and quiescence the inevitable result. There must surely come a point (though it comes at different places for different people) where the escalation of intrusive powers becomes too high a price to pay for a safer and more law abiding environment." [13.21]
panopticon  jeremy-bentham  law  uk  dripa  ripa  surveillance  spying  police  drones  facial-recognition  future  tracking  cctv  crime 
november 2015 by jm
