Remarks at the SASE Panel On The Moral Economy of Tech
First, programmers are trained to seek maximal and global solutions. Why solve a specific problem in one place when you can fix the general problem for everybody, and for all time? We don't think of this as hubris, but as a laudable economy of effort. And the startup funding culture of big risk, big reward encourages this grandiose mode of thinking. There is powerful social pressure to avoid incremental change, particularly any change that would require working with people outside tech and treating them as intellectual equals.

Machine learning is like money laundering for bias. It's a clean, mathematical apparatus that gives the status quo the aura of logical inevitability. The numbers don't lie.

The reality is, opting out of surveillance capitalism means opting out of much of modern life.

We tend to imagine dystopian scenarios as ones in which a repressive government uses technology against its people. But what scares me about these scenarios is that each one could have broad social support, possibly majority support. Democratic societies sometimes adopt terrible policies.

We should not listen to people who promise to make Mars safe for human habitation, until we have seen them make Oakland safe for human habitation.

Techies will complain that trivial problems of life in the Bay Area are hard because they involve politics. But they should involve politics. Politics is the thing we do to keep ourselves from murdering each other.
tags: ai-policy, scary
april 2018 by elrob