Facebook Wanted Us to Kill This Investigative Tool

24 bookmarks. First posted by ryanpitts 14 days ago.

They said we would need to shut down the tool (which was impossible because it’s an open source tool) and delete any data we collected (which was also impossible because the information was stored on individual users’ computers; we weren’t collecting it centrally).
We argued that we weren’t seeking access to users’ accounts or collecting any information from them; we had just given users a tool to log into their own accounts on their own behalf, to collect information they wanted collected, which was then stored on their own computers. Facebook disagreed and escalated the conversation to their head of policy for Facebook’s Platform, who said they didn’t want users entering their Facebook credentials anywhere that wasn’t an official Facebook site—because anything else is bad security hygiene and could open users up to phishing attacks. She said we needed to take our tool off GitHub within a week.
fb  privacy  opensource  githubeg  law  t&c 
12 days ago by paulbradshaw
How Facebook Wanted Us to Kill an Investigative Reporting Data Tool
13 days ago by girma
the lengths to which facebook will go are disgusting
nice-thinking  social-media 
13 days ago by mozzarella
Facebook wasn't happy that we built a tool to let users collect their own 'People You May Know' data
14 days ago by ashaw
Last year, we launched an investigation into how Facebook’s People You May Know tool makes its creepily accurate recommendations. By November, we had it mostly figured out: Facebook has nearly limitless access to all the phone numbers, email addresses, home addresses, and social media handles most people on Earth have ever used. That, plus its deep mining of people’s messaging behavior on Android, means it can make surprisingly insightful observations about who you know in real life—even if it’s wrong about your desire to be “friends” with them on Facebook.
14 days ago by burin
The episode demonstrated a huge problem to us: Journalists need to probe technological platforms in order to understand how unseen and little understood algorithms influence the experiences of hundreds of millions of people—whether it’s to better understand creepy friend recommendations, to uncover the potential for discrimination in housing ads, to understand how the fake follower economy operates, or to see how social networks respond to imposter accounts. Yet journalistic projects that require scraping information from tech platforms or creating fictitious accounts generally violate these sites’ terms of service.
privacy  facebook  tech_journalism 
14 days ago by gwijthoff
facebook 
14 days ago by lief
That’s why we’re telling this story for the first time: When we released a tool to help people study their People You Know recommendations, Facebook wasn’t…
14 days ago by mathewi
14 days ago by dwillis
Facebook wasn't happy that we built a tool to let users collect their own 'People You May Know' data
14 days ago by ryanpitts