jm + data-privacy (34 bookmarks)

CCC's 10 requirements for the evaluation of "Contact Tracing" apps
"Corona apps" are on everyone's lips as a way to contain the SARS-CoV-2 epidemic. CCC publishes 10 requirements for their evaluation from a technical and societal perspective.

Currently, technically supported "contact tracing" is being considered as a means to counteract the spread of SARS-CoV-2 in a more targeted manner. The general motivation is to allow greater freedom of movement for the broad spectrum of society by allowing quick tracing and interruption of infection chains. Contacts of infected persons should be alerted more quickly and thus be able to quarantine themselves more quickly. This, in turn, should prevent further infections. A "corona app" would therefore protect neither us nor our contacts: it is designed to break chains of infection by protecting the contacts of our contacts.
covid-19  pandemics  contact-tracing  ccc  privacy  data-privacy 
3 days ago by jm
China’s Operating Manuals for Mass Internment and Arrest by Algorithm - ICIJ
“The Chinese have bought into a model of policing where they believe that through the collection of large-scale data run through artificial intelligence and machine learning that they can, in fact, predict ahead of time where possible incidents might take place, as well as identify possible populations that have the propensity to engage in anti-state anti-regime action,” said Mulvenon, the SOS International document expert and director of intelligence integration. “And then they are preemptively going after those people using that data.”

Mulvenon said IJOP is more than a “pre-crime” platform, but a “machine-learning, artificial intelligence, command and control” platform that substitutes artificial intelligence for human judgment. He described it as a “cybernetic brain” central to China’s most advanced police and military strategies. Such a system “infantilizes” those tasked with implementing it, said Mulvenon, creating the conditions for policies that could spin out of control with catastrophic results.

The program collects and interprets data without regard to privacy, and flags ordinary people for investigation based on seemingly innocuous criteria, such as daily prayer, travel abroad, or frequently using the back door of their home.

Perhaps even more significant than the actual data collected are the grinding psychological effects of living under such a system.  With batteries of facial-recognition cameras on street corners, endless checkpoints and webs of informants, IJOP generates a sense of an omniscient, omnipresent state that can peer into the most intimate aspects of daily life.  As neighbors disappear based on the workings of unknown algorithms, Xinjiang lives in a perpetual state of terror.

The seeming randomness of investigations resulting from IJOP isn’t a bug but a feature, said Samantha Hoffman, an analyst at the Australian Strategic Policy Institute whose research focuses on China’s use of data collection for social control. “That’s how state terror works,” Hoffman said. “Part of the fear that this instills is that you don’t know when you’re not OK.”
terror  dystopia  china  algorithms  ijop  future  policing  grim-meathook-future  privacy  data-privacy  uighurs 
november 2019 by jm
How ICE Picks Its Targets in the Surveillance Age - The New York Times
This article is terrifying.
Tracking immigrants in this country is an increasingly trivial exercise because it’s an increasingly trivial exercise to track any of us. [...] Over the course of more than a year, I tried to reverse-engineer individual ICE officers’ use of America’s vast post-Sept. 11 domestic-surveillance apparatus, retracing their hunt for targets down to the very searches they entered into their computers.

The scale of domestic surveillance of the general population in the US is huge. We need more friction:

'What may be most unusual about Washington State is not what it collects and not what it has shared but the degree to which it has been forced to become transparent about the vast quantity of personal data that courses through its bureaucracy. For decades, the overriding objective of American business and government has been to remove friction from the tracking system, by linking networks, by speeding connections, by eliminating barriers. But friction is the only thing that has ever made privacy, let alone obscurity, possible. If there’s no friction, if we can all be profiled instantly and intimately, then there’s nothing to stop any of our neighbors from being targeted — nothing, that is, except our priorities.'
ice  privacy  data-protection  data-privacy  immigration  us-politics  trump  surveillance  palantir  great-oak  tracking 
october 2019 by jm
The Plan to Use Fitbit Data to Stop Mass Shootings Is One of the Scariest Proposals Yet
“The proposed data collection goes beyond absurdity when they mention the desire to collect FitBit data,” Annas told Gizmodo. “I am unaware of any study linking walking too much and committing mass murder. As for the other technologies, what are these people expecting? ‘Alexa, tell me the best way to kill a lot of people really quickly’? Really?” [....]

Fridel said that “literally any risk factor identified for mass shooters will result in millions of false positives,” adding that the most reliable risk factor is gender, and that most mass murderers are male. “Should we create a list of all men in the United States and keep tabs on them?” she said. “Although it would be absurd and highly unethical, doing so would be more effective than keeping a list of persons with mental illness.”
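Fridel's false-positive point is plain base-rate arithmetic. Here is a minimal sketch of it; the population size, offender count, and screening accuracy are illustrative assumptions, not figures from the article:

```python
# Base-rate arithmetic behind "any risk factor will result in millions
# of false positives". All numbers are illustrative assumptions.
population = 250_000_000       # rough US adult population
offenders_per_year = 50        # generous upper bound on actual offenders
sensitivity = 1.0              # assume the screen catches every offender
specificity = 0.99             # assume only 1% of non-offenders are flagged

non_offenders = population - offenders_per_year
false_positives = non_offenders * (1 - specificity)
flagged = false_positives + offenders_per_year * sensitivity

print(f"people falsely flagged: {false_positives:,.0f}")
print(f"chance a flagged person is an offender: "
      f"{offenders_per_year * sensitivity / flagged:.4%}")
```

Even with an absurdly accurate screen, roughly 2.5 million people get flagged, and essentially none of them are offenders.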
dystopia  technology  grim-meathook-future  data-protection  data-privacy  fitbit  harpa 
september 2019 by jm
Irish State told to delete ‘unlawful’ data on 3.2m citizens
This is amazing:
The State has been told it must delete data held on 3.2 million citizens, which was gathered as part of the roll-out of the Public Services Card, as there is no lawful basis for retaining it.

In a highly critical report on its investigation into the card, the Data Protection Commission found there was no legal reason to make individuals obtain the card in order to access State services such as renewing a driving licence or applying for a college grant. [...]

Helen Dixon, the Data Protection Commissioner, told The Irish Times that forcing people to obtain such a card for services other than those provided by the department was “unlawful from a data-processing point of view”.
psc  ireland  politics  data-privacy  privacy  data-collection  dpo  dpc 
august 2019 by jm
IBM’s photo-scraping scandal shows what a weird bubble AI researchers live in - MIT Technology Review
scraping data from publicly available sources is so much of an industry standard that it’s taught as a foundational skill (sans ethics) in most data science and machine-learning training.

[...] this story highlights the need for the tech industry to adapt its cultural norms and standard practices to keep pace with the rapid evolution of the technology itself, as well as the public’s awareness of how their data is used.
scraping  privacy  data  ai  big-data  data-privacy  flickr  photos  machine-learning 
august 2019 by jm
Data isn't the new oil, it's the new CO2
great point.
We should not endlessly be defending arguments along the lines that “people choose to willingly give up their freedom in exchange for free stuff online”. The argument is flawed for two reasons.

First, the reason that is usually given: people have no choice but to consent in order to access the service, so consent is manufactured. We are not exercising choice in providing data, but are rather resigned to the fact that we have no choice in the matter.

The second, less well known but just as powerful, argument is that we are not only bound by other people’s data; we are bound by other people’s consent.  In an era of machine learning-driven group profiling, this effectively renders my denial of consent meaningless. Even if I withhold consent, say I refuse to use Facebook or Twitter or Amazon, the fact that everyone around me has joined means there are just as many data points about me to target and surveil. The issue is systemic, it is not one where a lone individual can make a choice and opt out of the system. We perpetuate this myth by talking about data as our own individual “oil”, ready to sell to the highest bidder. In reality I have little control over this supposed resource which acts more like an atmospheric pollutant, impacting me and others in myriads of indirect ways. There are more relations - direct and indirect - between data related to me, data about me, data inferred about me via others than I can possibly imagine, let alone control with the tools we have at our disposal today. 
data  ethics  data-privacy  privacy  surveillance  surveillance-capitalism  co2  future  profiling  consent  gdpr 
july 2019 by jm
Estimating the success of re-identifications in incomplete datasets using generative models | Nature Communications
Using our model, we find that 99.98% of Americans would be correctly re-identified in any dataset using 15 demographic attributes. Our results suggest that even heavily sampled anonymized datasets are unlikely to satisfy the modern standards for anonymization set forth by GDPR and seriously challenge the technical and legal adequacy of the de-identification release-and-forget model.


ouch.
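The intuition behind the 99.98% figure is that quasi-identifiers multiply: each extra attribute partitions the population further, so a handful of ordinary demographics singles most people out. A rough sketch below counts unique attribute combinations over synthetic records; the attributes and their distributions are assumptions for illustration, not the paper's generative model:

```python
# How quickly combinations of demographic attributes single people out.
# Synthetic data and attribute cardinalities are illustrative assumptions.
import random
from collections import Counter

random.seed(0)
N = 100_000  # synthetic population sample

def person():
    return (
        random.randint(18, 90),   # age
        random.choice("MF"),      # sex
        random.randrange(1000),   # 3-digit ZIP prefix
        random.randrange(31),     # day of birth
        random.randrange(12),     # month of birth
    )

people = [person() for _ in range(N)]

for k in range(1, 6):
    counts = Counter(p[:k] for p in people)
    unique = sum(1 for c in counts.values() if c == 1)
    print(f"{k} attributes: {unique / N:.1%} of records are unique")
```

Five coarse attributes already make most synthetic records unique; the paper's model goes further and estimates, for a real individual, how likely a re-identification is to be correct even when the dataset is heavily sampled.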
deanonymization  deidentification  anonymization  anonymisation  gdpr  privacy  data-privacy  papers 
july 2019 by jm
Rossa McMahon re GMI
Rossa McMahon with a twitter thread on the legality of GMI's genomic data collection program in Ireland:
GMI is a big, expensive company. It announced a planned investment injection of $400m last year. It is engaged in a hot industry - hot because of investor interest and hot because of regulatory/ethics concerns.

GDPR is not new. It has been known since 2016. Data protection law is not new. It has been known since 1988. The impact of these laws on genetic data collection & use is not a surprise. So if you have a $400m+ business and this is a key business issue, you have taken advice. And you have, no doubt, been in a position to take that advice from some of the best and/or most expensive advisors available. Assumptions are dangerous, but I think it is fair to assume this has happened.

So read the story again.

Would you be looking for repeated meetings with [Department of Health], answers to questions on regulatory matters and assurances from the State, if you had legal advice of your own to the effect that you are operating or can operate as you currently are?
gmi  genomics  genetics  data-privacy  privacy  gdpr  ireland 
july 2019 by jm
Ireland putting profit before people with genomic medicine strategy
From David McConnell and Orla Hardiman at TCD:
Much of the medical information sought by GMI [Genomics Medicine Ireland] has been collected from patients in public hospitals funded by the exchequer at great expense [...]. Clinicians are being contracted and asked to obtain consent from their patients to transfer clinical information to GMI, along with a tissue sample for WGS [Whole Genome Sequencing]. We understand GMI will pay for the additional hospital clinical costs required for the project. It will obtain the full genetic code for each patient (WGS), and it will analyse all the data. For the most part .... there is minimal tangible benefit to the patient who participates in this programme.

It is important to realise that GMI will own all the clinical and WGS data that they have acquired from the health service, which is of considerable commercial value. GMI will also have complete control over the research and any outcomes. Participating patients do not appear to have access to their data held by GMI – and there does not seem to be a “right to be forgotten”, despite the commercial nature of the enterprise. Moreover, the genomic and clinical data may also be transmitted outside of the European Union, and thus will not be protected by the stringent data-protection laws within the EU.[....]

The Government has made a very big investment in GMI. There may be a view that it is not necessary to provide any additional public investments in genomic medicine in Ireland. However, to those of us who care about the longer-term development of genomic medicine in Ireland, this would be a seriously short-sighted approach. One person in 20 will develop a genetic disorder in their lifetime and half of the Irish population will experience a form of cancer. These and many other patients should be able to benefit from a publicly-available genomics project that can drive new medical care in Ireland.

Genomic medicine is here to stay. We urgently need a properly governed genomics programme in Ireland that will ensure that Irish genomics remains within the public (non-commercial) domain, and that data obtained from Irish citizens will be used to benefit the entire Irish population.

(via Aoife McLysaght)
gmi  wgs  genome  open-data  data-privacy  gdpr  privacy  health  medicine  ireland  genomics 
july 2019 by jm
Palantir’s Top-Secret User Manual for Cops
The Palantir user guide shows that police can start with almost no information about a person of interest and instantly know extremely intimate details about their lives. The capabilities are staggering, according to the guide:

If police have a name that’s associated with a license plate, they can use automatic license plate reader data to find out where they’ve been, and when they’ve been there. This can give a complete account of where someone has driven over any time period.
With a name, police can also find a person's email address, phone numbers, current and previous addresses, bank accounts, social security number(s), business relationships, family relationships, and license information like height, weight, and eye color, as long as it's in the agency's database.
The software can map out a suspect's family members and business associates, and theoretically, find the above information about them, too.
All of this information is aggregated and synthesized in a way that gives law enforcement nearly omniscient knowledge over any suspect they decide to surveil.
police  surveillance  palantir  creepy  grim  data-privacy  privacy 
july 2019 by jm
nearly every site running ads has an /ads.txt
Pinboard on Twitter:
'I just learned that nearly every site running ads has a standardized ads.txt file that helpfully shows you how badly it murders your privacy. The file is a whitelist of all authorized resellers for programmatic advertising. For example, https://www.bostonglobe.com/ads.txt'
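The format itself is plain comma-separated text (ad system domain, publisher account ID, DIRECT or RESELLER relationship, optional certification authority ID), so summarising who a site authorises takes only a few lines. A rough sketch, assuming the file sits at the standard /ads.txt path:

```python
# Fetch and summarise a site's ads.txt (fields: ad system domain,
# publisher account ID, DIRECT/RESELLER, optional cert authority ID).
from collections import Counter
from urllib.request import urlopen

url = "https://www.bostonglobe.com/ads.txt"   # any ad-supported site
relationships = Counter()
ad_systems = set()

with urlopen(url) as resp:
    for raw in resp.read().decode("utf-8", errors="replace").splitlines():
        line = raw.split("#", 1)[0].strip()   # drop comments and blanks
        if not line or "=" in line:           # skip variables like contact=
            continue
        fields = [f.strip() for f in line.split(",")]
        if len(fields) >= 3:
            ad_systems.add(fields[0].lower())
            relationships[fields[2].upper()] += 1

print(f"{len(ad_systems)} distinct ad systems authorised")
print(dict(relationships))   # counts of DIRECT vs RESELLER entries
```

For a big publisher this typically turns up dozens of distinct ad systems and hundreds of reseller entries, which is exactly the point being made.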
ads.txt  advertising  pinboard  privacy  data-privacy  adtech  robots 
may 2019 by jm
Crumlin hospital sent DNA off without consent
a company called WuXi NextCODE owning the genome data for a large part of Ireland's population is very cyberpunk
wuxi-nextcode  genomics  genomes  data-privacy  privacy  crumlin-hospital  genes  ireland  gdpr 
january 2019 by jm
UIDAI’s Aadhaar Software Hacked, ID Database Compromised, Experts Confirm
The authenticity of the data stored in India's controversial Aadhaar identity database, which contains the biometrics and personal information of over 1 billion Indians, has been compromised by a software patch that disables critical security features of the software used to enrol new Aadhaar users, a three-month-long investigation by HuffPost India reveals.

The patch—freely available for as little as Rs 2,500 (around $35)— allows unauthorised persons, based anywhere in the world, to generate Aadhaar numbers at will, and is still in widespread use.

This has significant implications for national security at a time when the Indian government has sought to make Aadhaar numbers the gold standard for citizen identification, and mandatory for everything from using a mobile phone to accessing a bank account.
security  aadhaar  identity  india  privacy  databases  data-privacy 
september 2018 by jm
ACLU to Amazon: Get out of the surveillance business
This is a fair point from the ACLU:
Already, Rekognition is in use in Florida and Oregon. Government agencies in California and Arizona have sought information about it, too. And Amazon didn't just sell Rekognition to law enforcement, it's actively partnering with them to ensure that authorities can fully utilize Rekognition's capabilities.

Amazon has branded itself as customer-centric, opposed secret government surveillance, and has a CEO who publicly supported First Amendment freedoms and spoke out against the discriminatory Muslim Ban. Yet, Amazon is powering dangerous surveillance that poses a grave threat to customers and communities already unjustly targeted in the current political climate.
We must make it clear to Amazon that we won't stand by and let it pad its bottom line by selling out our civil rights.
aclu  amazon  rekognition  facial-recognition  faces  law  privacy  data-privacy  civil-rights 
may 2018 by jm
GDPR will pop the adtech bubble
Without adtech, the EU’s GDPR (General Data Protection Regulation) would never have happened. But the GDPR did happen, and as a result websites all over the world are suddenly posting notices about their changed privacy policies, use of cookies, and opt-in choices for “relevant” or “interest-based” (translation: tracking-based) advertising. Email lists are doing the same kinds of things.

“Sunrise day” for the GDPR is 25 May. That’s when the EU can start smacking fines on violators.

Simply put, your site or service is a violator if it extracts or processes personal data without personal permission. Real permission, that is. You know, where you specifically say “Hell yeah, I wanna be tracked everywhere.”

Of course what I just said greatly simplifies what the GDPR actually utters, in bureaucratic legalese. The GDPR is also full of loopholes only snakes can thread; but the spirit of the law is clear, and the snakes will be easy to shame, even if they don’t get fined. (And legitimate interest, an actual loophole in the GDPR, may prove hard to claim.)

Toward the aftermath, the main question is What will be left of advertising—and what it supports—after the adtech bubble pops?
advertising  europe  law  privacy  gdpr  tracking  data-privacy 
may 2018 by jm
DNA databases: biology stripped bare
Unlike other biometrics, [DNA] also provides revealing [data regarding] thousands of other related individuals, even an entire ethnic group.

Such markers may reveal a genetic predisposition towards cancer, or early onset dementia. Mining that data and linking it to family trees and thus, individuals, might interest insurance companies, or state health bodies, or – as ever – advertisers. Or? Who knows?

And the ability of a third-party potentially to reveal such information about me, about you, without us having any say, by providing their DNA profile for some personal purpose? Consider how furious so many have been on the basis of their Facebook profile data going to Cambridge Analytica via some Facebook friend deciding to do a quiz.

Facebook profile data is revealing enough. But DNA is you, fully, irrevocably, exposed. And whatever it displays about you right now, is trivial compared to what we will be able to read into it in the future.

That’s why this case isn’t just about a solitary law enforcement outcome, but about all of us doing an unintended, genetic full monty.
dna-matching  dna  data-privacy  privacy  future  health  cancer  insurance  karlin-lillington 
may 2018 by jm
I tried leaving Facebook. I couldn’t - The Verge
Facebook events, Facebook pages, Facebook photos, and Facebook videos are for many people an integral part of the church picnic, the Christmas party, the class reunion, the baby shower. (The growing scourge of gender reveal parties with their elaborate “reveal” rituals and custom-made cakes seems particularly designed to complement documentation on social media). The completeness of Facebook allows people to create better substitutes for in-person support groups in a wide range of ever-narrowing demographics — from casual interests like Instant Pot recipes for Korean food to heavy life-altering circumstances like rare forms of cancer.

Of all people, I know why I shouldn’t trust Facebook, why my presence on its network contributes to the collective problem of its monopolistic hold on people. Everyone is on Facebook because everyone is on Facebook. And because everyone is on Facebook, even the people who aren’t are having their data collected in shadow profiles. My inaction affects even the people who have managed to stay away. I know this, I barely use Facebook, I don’t even like Facebook, and I find it nearly impossible to leave.
privacy  facebook  deletefacebook  social-networking  social  life  social-media  data-privacy 
may 2018 by jm
The brave new world of genetic genealogy - MIT Technology Review
The combination of DNA and genealogy is potentially a huge force for good in the world, but it must be used responsibly. In all cases where public databases like GEDmatch are used, the potential for good must be balanced against the potential for harm. In cases involving adoptee searches, missing persons, and unidentified bodies, the potential for good usually markedly outweighs the potential for harm.

But the situation is not so clear-cut when it comes to the use of the methodology to identify suspects in rape and murder cases. The potential for harm is much higher under these circumstances, because of the risk of misuse, misapplication or misinterpretation of the data leading to wrongful identification of suspects. The stakes are too high for the GEDmatch database to be used by the police without oversight by a court of law. 

However, we are not looking at a dystopian future. In the long run the public sharing of DNA data, when done responsibly, is likely to have huge benefits for society. If a criminal can be caught not by his own DNA but through a match with one of his cousins he will be less likely to commit a crime in the first place. With the move to whole genome sequencing in forensic cases in the future, it will be possible to make better use of genetic genealogy methods and databases to identify missing people, the remains of soldiers from World War One and World War Two as well as more recent wars, and casualties from natural and manmade disasters. We will be able to give many more unidentified people the dignity of their identity in death. But we each control our own DNA and we should all be able to decide what, if anything, we wish to share.
gedmatch  genealogy  dna  police  murder  rape  dna-matching  privacy  data-privacy 
april 2018 by jm
The Australian Bureau of Statistics Tracked People By Their Mobile Device Data.
The ABS claims population estimates have a “major data gap” and so they’ve been a busy bee figuring out a way to track crowd movement. Their solution? Mobile device user data. “…with its near-complete coverage of the population, mobile device data is now seen as a feasible way to estimate temporary populations,” states a 2017 conference extract for a talk by ABS Demographer Andrew Howe.

While the “Estimated Resident Population” (ERP) is Australia’s official population measure, the ABS felt the pre-existing data wasn’t ‘granular’ enough. What the ABS really wanted to know was where you’re moving, hour by hour, through the CBD, educational hubs, tourist areas. Howe’s ABS pilot study of mobile device user data creates population estimates with the help of a trial engagement with an unnamed telco company. The data includes age and sex breakdowns. The study ran between the 18th April to 1st May 2016. [....]

Electronic Frontiers Australia board member Justin Warren also pointed out that while there are beneficial uses for this kind of information, “…the ABS should be treading much more carefully than it is. The ABS damaged its reputation with its bungled management of the 2016 Census, and with its failure to properly consult with civil society about its decision to retain names and addresses. Now we discover that the ABS is running secret tracking experiments on the population?”

“Even if the ABS’ motives are benign, this behaviour — making ethically dubious decisions without consulting the public it is experimenting on — continues to damage the once stellar reputation of the ABS.”

“This kind of population tracking has a dark history. During World War II, the US Census Bureau used this kind of tracking information to round up Japanese-Americans for internment. Census data was used extensively by Nazi Germany to target specific groups of people. The ABS should be acutely aware of these historical abuses, and the current tensions within society that mirror those earlier, dark days all too closely.”
abs  australia  tracking  location-data  privacy  data-privacy  mobile 
april 2018 by jm
Use the GDPR to find who has advertised to you on Facebook, and get them to delete your details
Sometimes you get ads on Facebook and you are just not interested in what they’re selling. This is a way to find out who has uploaded your email address into Facebook to target ads at you, and then - if you’re in the EU - how to use the new General Data Protection Regulation to get those advertisers to delete you from their system.


Totally going to do this. Roll on May 25.
gdpr  facebook  privacy  ads  data-privacy  eu 
april 2018 by jm
Palantir Knows Everything About You
This is so fucking dystopian:
Operation Laser has made L.A. cops more surgical — and, according to community activists, unrelenting. Once targets are enmeshed in a [Palantir] spidergram, they’re stuck.

Manuel Rios, 22, lives in the back of his grandmother’s house at the top of a hill in East L.A., in the heart of the city’s gang area. [...] He grew up surrounded by friends who joined Eastside 18, the local affiliate of the 18th Street gang, one of the largest criminal syndicates in Southern California. Rios says he was never “jumped in”—initiated into 18. He spent years addicted to crystal meth and was once arrested for possession of a handgun and sentenced to probation. But except for a stint in county jail for a burglary arrest inside a city rec center, he’s avoided further trouble and says he kicked his meth habit last year.

In 2016, Rios was sitting in a parked car with an Eastside 18 friend when a police car pulled up. His buddy ran, pursued by the cops, but Rios stayed put. “Why should I run? I’m not a gang member,” he says over steak and eggs at the IHOP near his home. The police returned and handcuffed him. One of them took his picture with a cellphone. “Welcome to the gang database!” the officer said.

Since then he’s been stopped more than a dozen times, he says, and told that if he doesn’t like it he should move. He has nowhere to go. His girlfriend just had a baby girl, and he wants to be around for them. “They say you’re in the system, you can’t lie to us,” he says. “I tell them, ‘How can I be in the hood if I haven’t got jumped in? Can’t you guys tell people who bang and who don’t?’ They go by their facts, not the real facts.”

The police, on autopilot with Palantir, are driving Rios toward his gang friends, not away from them, worries Mariella Saba, a neighbor and community organizer who helped him get off meth. When whole communities like East L.A. are algorithmically scraped for pre-crime suspects, data is destiny, says Saba. “These are systemic processes. When people are constantly harassed in a gang context, it pushes them to join. They internalize being told they’re bad.”
palantir  surveillance  privacy  precrime  spidergrams  future  la  gangs  justice  algorithms  data-protection  data-privacy  policing  harrassment 
april 2018 by jm
A flaw-by-flaw guide to Facebook’s new GDPR privacy changes | TechCrunch
Overall, it seems like Facebook is complying with the letter of GDPR law, but with questionable spirit. Sure, privacy is boring to a lot of people. Too little info and they feel confused and scared. Too many choices and screens and they feel overwhelmed and annoyed. Facebook struck the right balance in some places here. But the subtly pushy designs seem intended to steer people away from changing their defaults in ways that could hamper Facebook’s mission and business.
gdpr  design  facebook  privacy  data-protection  data-privacy  social-networking  eu  law 
april 2018 by jm
London police’s use of AFR facial recognition falls flat on its face
A “top-of-the-line” automated facial recognition (AFR) system trialled for the second year in a row at London’s Notting Hill Carnival couldn’t even tell the difference between a young woman and a balding man, according to a rights group worker invited to view it in action. Because yes, of course they did it again: London’s Met police used controversial, inaccurate, largely unregulated automated facial recognition (AFR) technology to spot troublemakers. And once again, it did more harm than good.

Last year, it proved useless. This year, it proved worse than useless: it blew up in their faces, with 35 false matches and one wrongful arrest of somebody erroneously tagged as being wanted on a warrant for a rioting offense.

[...] During a recent, scathing US House oversight committee hearing on the FBI’s use of the technology, it emerged that 80% of the people in the FBI database don’t have any sort of arrest record. Yet the system’s recognition algorithm inaccurately identifies them during criminal searches 15% of the time, with black women most often being misidentified.
face-recognition  afr  london  notting-hill-carnival  police  liberty  met-police  privacy  data-privacy  algorithms 
september 2017 by jm
'Let’s all survive the GDPR'
Simon McGarr and John Looney's slides from their SRECon '17 presentation
simon-mcgarr  data-privacy  privacy  data-protection  gdpr  slides  presentations 
september 2017 by jm
Australian Doctor on Twitter: "Outcry as MyHealthRecord default privacy setting left open to universal access"
Funnily enough, this is exactly what Ross Anderson warned about 10 years ago re patient record digitisation in the UK.

'Occupational therapists working for an employer, doctors working for insurance companies, a dietitian, an optometrist or a dentist or their staff can view the [patient] record and see if individuals have a sexually transmitted disease, a mental illness, have had an abortion or are using Viagra.'
privacy  heaith  australia  myhealthrecord  data-protection  data-privacy  healthcare  medicine 
april 2017 by jm
Research Blog: Federated Learning: Collaborative Machine Learning without Centralized Training Data
Great stuff from Google - this is really nifty stuff for large-scale privacy-preserving machine learning usage:

It works like this: your device downloads the current model, improves it by learning from data on your phone, and then summarizes the changes as a small focused update. Only this update to the model is sent to the cloud, using encrypted communication, where it is immediately averaged with other user updates to improve the shared model. All the training data remains on your device, and no individual updates are stored in the cloud.

Federated Learning allows for smarter models, lower latency, and less power consumption, all while ensuring privacy. And this approach has another immediate benefit: in addition to providing an update to the shared model, the improved model on your phone can also be used immediately, powering experiences personalized by the way you use your phone.

Papers:
https://arxiv.org/pdf/1602.05629.pdf , https://arxiv.org/pdf/1610.05492.pdf
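The mechanism described above is essentially Federated Averaging from the first paper linked: clients train on-device, only model updates travel, and the server averages them into the shared model. A minimal numpy sketch of that loop, using a toy linear model and synthetic per-client data (illustrative assumptions, not Google's production setup):

```python
# Minimal Federated Averaging sketch: each client improves the shared
# model on its own data and returns only updated weights; the server
# averages them. Toy linear model and synthetic data are assumptions.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -3.0])

def make_client(n=50):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    return X, y

clients = [make_client() for _ in range(10)]
w = np.zeros(2)                        # shared global model

def local_update(w, X, y, lr=0.1, epochs=5):
    w = w.copy()
    for _ in range(epochs):            # a few local gradient steps
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

for _ in range(20):                    # one round: download, train, average
    local_models = [local_update(w, X, y) for X, y in clients]
    w = np.mean(local_models, axis=0)  # only model updates leave the device

print("learned weights:", w)           # converges toward true_w
```

The averaging step is the whole trick: raw training data never leaves the device; only the locally updated model weights do.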
google  ml  machine-learning  training  federated-learning  gboard  models  privacy  data-privacy  data-protection 
april 2017 by jm
Self-driving cars: overlooking data privacy is a car crash waiting to happen
Interesting point -- self-driving cars are likely to be awash in telemetry data that gets "phoned home".
self-driving  cars  vehicles  law  data  privacy  data-privacy  surveillance 
july 2016 by jm
Amazon Echo security fail
Ughhhh.
Amazon Echo sends your WiFi password to Amazon. No option to disable. Trust us, it's in an "encrypted file".
amazon  echo  wifi  passwords  security  data-privacy  data-protection 
january 2016 by jm
WePromise.EU
'The European election will take place between 22 and 25 May 2014. Citizens, promise to vote for candidates that have signed a 10-point charter of digital rights! Show candidates that they need to earn your vote by signing our charter!'
europarl  ep  digital-rights  rights  ireland  eu  data-privacy  data-protection  privacy 
march 2014 by jm
Big doubts on big data: Why I won't be sharing my medical data with anyone - yet
These problems can be circumvented, but they must be dealt with, publicly and soberly, if the NHS really does want to win public confidence. The NHS should approach selling the scheme to the public as if it was opt-in, not opt-out, then work to convince us to join it. Tell us how sharing our data can help, but tell us the risks too. Let us decide if that balance is worth it. If it's found wanting, the NHS must go back to the drawing board and retool the scheme until it is. It's just too important to get wrong.
nhs  uk  privacy  data-protection  data-privacy  via:mynosql  big-data  healthcare  insurance 
february 2014 by jm
UK NHS will soon require GPs pass confidential medical data to third parties
Specifically, unanonymised, confidential, patient-identifying data, for purposes of "admin, healthcare planning, and research", to be held indefinitely, via the HSCIC. Opt-outs may be requested, however.
opt-out  privacy  medical  data  healthcare  nhs  uk  data-privacy  data-protection 
january 2014 by jm
Experian Sold Consumer Data to ID Theft Service
This is what happens when you don't have strong controls on data protection/data privacy -- the US experience.
While [posing as a US-based private investigator] may have gotten the [Vietnam-based gang operating the massive identity fraud site Superget.info] past Experian and/or CourtVentures’ screening process, according to Martin there were other signs that should have alerted Experian to potential fraud associated with the account. For example, Martin said the Secret Service told him that the alleged proprietor of Superget.info had paid Experian for his monthly data access charges using wire transfers sent from Singapore.

“The issue in my mind was the fact that this went on for almost a year after Experian did their due diligence and purchased” Court Ventures, Martin said. “Why didn’t they question cash wires coming in every month? Experian portrays themselves as the data-breach experts, and they sell identity theft protection services. How this could go on without them detecting it I don’t know. Our agreement with them was that our information was to be used for fraud prevention and ID verification, and was only to be sold to licensed and credentialed U.S. businesses, not to someone overseas.”


via Simon McGarr
via:tupp_ed  privacy  security  crime  data-protection  data-privacy  experian  data-breaches  courtventures  superget  scams  fraud  identity  identity-theft 
october 2013 by jm
Irish EU Council Presidency proposes destruction of right to privacy | EDRI
'For example, based on the current situation in Ireland, the idea is that all companies can do whatever they want with personal data, without fear of sanction. Sanctions, such as fines, “should be optional or at least conditional upon a prior warning or reprimand”. In other words, do what you want, the worst that can happen is that you will receive a warning.' Shame! Daragh O'Brien's comment: 'utter idiocy'. ( at https://twitter.com/daraghobrien/status/292041500873850880 )
privacy  ireland  eu  fail  data-protection  data-privacy  politics 
january 2013 by jm

