No one’s coming. It’s up to us. – Dan Hon – Medium


50 bookmarks. First posted by kohlmannj 10 days ago.


Noting that Metcalfe's Law and Barlow's Manifesto may need to be challenged
technology  ethics  society  2018 
3 days ago by mechazoidal
Adapted from “We Are The Very Model Of Modern Humanist Technologists”, a lightning talk given at #foocamp 2017 in San Francisco on Saturday, November 4th,…
from instapaper
4 days ago by spinnerin
1. Clearly decide what kind of society we want; and then
2. Design and deliver the technologies that forever get us closer to achieving that desired society.
design  ethics  technology  society 
4 days ago by maxbarners
Transcribed talk about ethical computing.

Couple takeaways:
* "Metcalfe's law": value of a network is proportional to the square of number of users (measuring network effect)
* "When Barlow said 'the fact remains that there is not much one can do about bad behavior online except to take faith that the vast majority of what goes on there is not bad behavior,' his position was that we should accept the current state of affairs because there is literally no room for improvement."
humanist-technology  technology 
6 days ago by gunsch
No one’s coming. It’s up to us. – Dan Hon – Medium
from twitter
6 days ago by jackysee
“No one’s coming. It’s up to us.” by @hondanhon
from twitter_favs
6 days ago by cianw
Adapted from “We Are The Very Model Of Modern Humanist Technologists”, a lightning talk given at #foocamp 2017 in San Francisco on Saturday, November 4th,…
from instapaper
6 days ago by iany
Favorite tweet: hondanhon

2 things:

1. This is the Foo Camp talk I gave last year on the need for us technologists to accept our responsibility to society: https://t.co/RQaZnq5AZm

2. Help! I'm pulling things I see together. I tried to list influences, sources, academics etc. Who/what have I missed?

— dan hon (@hondanhon) February 12, 2018

http://twitter.com/hondanhon/status/963050055828582401
IFTTT  twitter  favorite 
7 days ago by tswaterman
Adapted from “We Are The Very Model Of Modern Humanist Technologists”, a lightning talk given at #foocamp 2017 in San Francisco on Saturday, November 4th,…
from instapaper
7 days ago by than
“the problem may be due to focusing on a specific requirement (“free speech”) rather than the full user story (“As a user, I need freedom of speech, so that I can pursue life, liberty and happiness”). Free speech is a means to an end, not an end” https://t.co/QOWB2TpO9N

— Jane Dallaway (@JaneDallaway) February 12, 2018
IFTTT  Twitter 
7 days ago by JaneDallaway
Dan Hon on the role & responsibility of humanist technologists.
newsletter  thingsconnl  design  technology  ethics 
7 days ago by thewavingcat
Adapted from “We Are The Very Model Of Modern Humanist Technologists”, a lightning talk given at #foocamp 2017 in San Francisco on Saturday, November 4th,…
from instapaper
7 days ago by anglepoised
Adapted from “We Are The Very Model Of Modern Humanist Technologists”, a lightning talk given at #foocamp 2017 in San Francisco on Saturday, November 4th,…
from instapaper
7 days ago by pelles
Adapted from “We Are The Very Model Of Modern Humanist Technologists”, a lightning talk given at #foocamp 2017 in San Francisco on Saturday, November 4th,…
from instapaper
8 days ago by h-lame
RT : If you have the privilege to choose what to work on, consider this perspective
from twitter_favs
8 days ago by rtanglao
Dan Hon | Medium
One of the ways people think about and illustrate Metcalfe’s Law is by observing that a single telephone or fax machine isn’t particularly useful on its own (the examples show how comparatively old the law is), but that two are much more useful, four even more so, and so on. These are folk understandings of the law.
Lately, Metcalfe’s law has been applied to social networks, not just to equipment that’s connected together. Applied to social networks, it becomes a folk understanding that the more users your social network has, the more valuable it is. If you value value, then, the law points you toward increasing the number of your users.
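
A minimal way to write that folk understanding down, for illustration only (the scaling constant k is my own assumption, not something from the talk): a network of n users contains n(n-1)/2 possible pairwise connections, so the law is usually stated as

    V(n) \approx k \cdot \frac{n(n-1)}{2} \propto n^2

For example, doubling a network from 100 to 200 users roughly quadruples its nominal value, since 200^2 / 100^2 = 4.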

The problem is, I like to ask dumb questions. Asking dumb questions has served me well in my career. Asking dumb questions in this case means I have two simple questions about the law:
First: Why do we call Metcalfe’s law a law? Has it been proven to be true?
Second: What do we mean, exactly, by value?
technology  society  networks  SocialMedia  facebook  google  platforms  twitter  from twitter
8 days ago by loughlin
Adapted from “We Are The Very Model Of Modern Humanist Technologists”, a lightning talk given at #foocamp 2017 in San Francisco on Saturday, November 4th,…
8 days ago by jkleske
"Getting from here to there

This is all very well and good. But what can we do? And more precisely, which “we”? There’s increasing acceptance of the reality that the world we live in is intersectional and we all play different and simultaneous roles in our lives. The society of “we” includes technologists who have a chance of affecting the products and services, it includes customers and users, it includes residents and citizens.

I’ve made this case above, but I feel it’s important enough to make again: at a high level, I believe that we need to:

1. Clearly decide what kind of society we want; and then

2. Design and deliver the technologies that forever get us closer to achieving that desired society.

This work is hard and, arguably, will never be completed. It necessarily involves compromise. Attitudes, beliefs and what’s considered just all change over time.

That said, the above are two high level goals, but what can people do right now? What can we do tactically?

What we can do now

I have two questions that I think can be helpful in guiding our present actions, in whatever capacity we might find ourselves.

For all of us: What would it look like, and how might our societies be different, if technology were better aligned to society’s interests?

At the most general level, we are all members of a society, embedded in existing governing structures. It certainly feels as if, in the recent past, those governing structures have come under increasing strain, and part of the blame is being laid at the feet of technology.

One of the most important things we can do collectively is to produce clarity and prioritization where we can. Only by being clearer and more intentional about the kind of society we want and accepting what that means, can our societies and their institutions provide guidance and leadership to technology.

These are questions that cannot and should not be left to technologists alone. Advances in technology mean that encryption is a societal issue. Content moderation and censorship are societal issues. Ultimately, it should be for governments (of the people, by the people) to set expectations and standards at the societal level, not organizations accountable only to a board of directors and shareholders.

But to do this, our governing institutions will need to evolve and improve. It is now easier, and faster, for platforms to react to changing social mores: for example, platforms responded to society’s reaction to “AI-generated fake porn” faster than governing and enforcement institutions did.

Prioritizations may necessarily involve compromise, too: the world is not so simple, and we are not so lucky, that it can be easily and always divided into A or B, or good or not-good.

Some of my perspective in this area is reflective of the schism American politics is currently experiencing. In a very real way, America, my adoptive country of residence, is having to grapple with revisiting the idea of what America is for. The same is happening in my country of birth with the decision to leave the European Union.

These are fundamental issues. Technologists, as members of society, have a point of view on them. But just as post-Enlightenment governing institutions were set up to protect against asymmetric distributions of power, technology leaders must recognize that their platforms are now an undeniable, powerful influence on society.

As a society, we must do the work to have a point of view. What does responsible technology look like?

For technologists: How can we be humane and advance the goals of our society?

As technologists, we can be excited about re-inventing approaches from first principles. We must resist that impulse here, because there are things we can do and learn now from other professions, industries and areas and apply to our own. For example:

* We are better and stronger when we are together than when we are apart. If you’re a technologist, consider this question: what are the pros and cons of unionizing? As the product of a linked network, consider the question: what is gained and who gains from preventing humans from linking up in this way?

* Just as we create design patterns that capture best practices, there are also patterns that are undesirable from our society’s point of view, known as dark patterns. We should familiarise ourselves with them and each work to understand why and when they’re used, and why their usage is contrary to the ideals of our society.

* We can do a better job of advocating for and doing research to better understand the problems we seek to solve, the context in which those problems exist and the impact of those problems. Only through disciplines like research can we discover in the design phase — instead of in production, when our work can affect millions — negative externalities or unintended consequences that we genuinely and unintentionally may have missed.

* We must compassionately accept the reality that our work has real effects, good and bad. We can wish that bad outcomes don’t happen, but bad outcomes will always happen because life is unpredictable. The question is what we do when bad things happen, and whether and how we take responsibility for those results. For example, Twitter’s leadership must make clear what behaviour it considers acceptable, and do the work to be clear and consistent without dodging the issue.

* In America especially, technologists must face the issue of free speech head-on without avoiding its necessary implications. I suggest that one of the problems culturally American technology companies (i.e., companies that seek to emulate American culture) face can be explained in software terms. To use agile user story terminology, the problem may be due to focusing on a specific requirement (“free speech”) rather than the full user story (“As a user, I need freedom of speech, so that I can pursue life, liberty and happiness”). Free speech is a means to an end, not an end, and accepting that free speech is a means involves the hard work of considering and taking a clear, understandable position as to what ends.

* We have been warned. Academics — in particular, sociologists, philosophers, historians, psychologists and anthropologists — have been warning of issues such as large-scale societal effects for years. Those warnings have, bluntly, been ignored. In the worst cases, those same academics have been accused of not helping to solve the problem. Moving on from the past, is there not something that we technologists can learn? My intuition is that after the 2016 American election, middle-class technologists are now afraid. We’re all in this together. Academics are reaching out, have been reaching out. We have nothing to lose but our own shame.

* Repeat to ourselves: some problems don’t have fully technological solutions. Some problems can’t just be solved by changing infrastructure. Who else might help with a problem? What other approaches might be needed as well?

There’s no one coming. It’s up to us.

My final point is this: no one will tell us or give us permission to do these things. There is no higher organizing power working to put systemic changes in place. There is no top-down way of nudging the arc of technology toward one better aligned with humanity.

It starts with all of us.

Afterword

I’ve been working on the bigger themes behind this talk since …, and an invitation to 2017’s Foo Camp was a good opportunity to try to clarify and improve my thinking so that it could fit into a five minute lightning talk. It also helped that Foo Camp has the kind of (small, hand-picked — again, for good and ill) influential audience who would be a good litmus test for the quality of my argument, and would be instrumental in taking on and spreading the ideas.

In the end, though, I nearly didn’t do this talk at all.

Around 6:15pm on Saturday night, just over an hour before the lightning talks were due to start, after the unconference’s sessions had finished and just before dinner, I burst into tears talking to a friend.

While I won’t break the societal convention of confidentiality that helps an event like Foo Camp be productive, I’ll share this: the world felt too broken.

Specifically, the world felt broken like this: I had the benefit of growing up as a middle-class educated individual (albeit, not white) who believed he could trust that institutions were a) capable and b) would do the right thing. I now live in a country where a) the capability of those institutions has consistently eroded over time, and b) those institutions are now being systematically dismantled, to add insult to injury.

In other words, I was left with the feeling that there’s nothing left but ourselves.

Do you want the poisonous lead removed from your water supply? Your best bet is to try to do it yourself.

Do you want a better school for your children? Your best bet is to start it.

Do you want a policing policy that genuinely rehabilitates rather than punishes? Your best bet is to…

And it’s just. Too. Much.

Over the course of the next few days, I managed to turn my outlook around.

The answer, of course, is that it is too much for one person.

But it isn’t too much for all of us."
danhon  technology  2018  2017  johnperrybarlow  ethics  society  calltoaction  politics  policy  purpose  economics  inequality  internet  web  online  computers  computing  future  design  debchachra  ingridburrington  fredscharmen  maciejceglowski  timcarmody  rachelcoldicutt  stacy-marieishmael  sarahjeong  alexismadrigal  ericmeyer  timmaughan  mimionuoha  jayowens  jayspringett  stacktivism  georginavoss  damienwilliams  rickwebb  sarawachter-boettcher  jamebridle  adamgreenfield  foocamp  timoreilly  kaitlyntiffany  fredturner  tomcarden  blainecook  warrenellis  danhill  cydharrell  jenpahljka  robinray  noraryan  mattwebb  mattjones  danachisnell  heathercamp  farrahbostic  negativeexternalities  collectivism  zeyneptufekci  maciejcegłowski 
8 days ago by robertogreco
If you are building software you can't just launch it into the world and then clutch your pearls in horror if it has unwanted societal effects. It is up to you to think about these effects in advance and do something about them, because no one else will.
software-will-eat-the-world  society  research  dan-hon  software 
8 days ago by mr_stru
Adapted from “We Are The Very Model Of Modern Humanist Technologists”, a lightning talk given at #foocamp 2017 in San Francisco on Saturday, November 4th,…
from instapaper
9 days ago by wacko42
No one’s coming. It’s up to us. – Dan Hon – Medium via Instapaper http://ift.tt/2nXg8nV
technology  design  architecture  nostalgia 
9 days ago by craniac
We’re responsible to society for the tools we make.
ethics  development  design  architecture 
9 days ago by pb
RT : A long, important and moving article from .
from twitter
9 days ago by fabianmu
No one’s coming. It’s up to us. – Dan Hon – Medium http://ift.tt/2nXg8nV
IFTTT  Instapaper 
9 days ago by ldodds
Adapted from “We Are The Very Model Of Modern Humanist Technologists”, a lightning talk given at #foocamp 2017 in San Francisco on Saturday, November 4th, 2017. For context, a lightning talk is normally around five minutes long.
IFTTT  Pocket 
9 days ago by bunch
Please read: “No one’s coming. It’s up to us.” by @hondanhon https://t.co/D8cIrKUoaQ

— Ryan D Gantz, JFC (@sixfoot6) February 10, 2018
from instapaper
9 days ago by mathewi
Please read: “No one’s coming. It’s up to us.” by @hondanhon
from twitter_favs
9 days ago by mathpunk
“No one’s coming. It’s up to us.” by @hondanhon (ht )
from twitter
9 days ago by mkb
Great piece - required reading “No one’s coming. It’s up to us.” by @hondanhon
from twitter_favs
10 days ago by danhon
Adapted from “We Are The Very Model Of Modern Humanist Technologists”, a lightning talk given at #foocamp 2017 in San Francisco on Saturday, November 4th,…
from instapaper
10 days ago by kohlmannj