asterisk2a + hawking   8

Hawking: Humans at risk of lethal 'own goal' - BBC News
Humanity is at risk from a series of dangers of our own making, according to Prof Stephen Hawking. Nuclear war, global warming and genetically-engineered viruses are among the scenarios which he singles out. And he says that further progress in science and technology will create "new ways things can go wrong". Prof Hawking is giving this year's BBC Reith Lectures, which explore research into black holes, and his warning came in answer to audience questions. He says that assuming humanity eventually establishes colonies on other worlds, it will be able to survive. "Although the chance of a disaster to planet Earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next thousand or ten thousand years." [...] But he also said that future generations of researchers should be aware of how scientific and technological progress is changing the world, and should help the wider public understand it.
humanity  blackswan  GFC  self-regulation  humanitarian  crisis  human  tragedy  geneticallyengineered  genetics  gene  editing  global  warming  climate  crisis  climate  change  food  security  National  health  crisis  antibiotics  antibiotic  resistance  post-antibiotic  era  sick  population  nuclear  power  nuclear  waste  nuclear  war  climate  science  climate  system  water  scarcity  watersupply  water  rights  water  supply  water  pollution  water  security  drinking  water  inequality  Super  Rich  1%  Wall  Street  shareholder  value  profit  maximisation  shared  economic  interest  Stephen  Hawking  ecological  disaster  environmental  disaster  mass  extinction  unknown  unintended  consequences  Fukushima  extreme  weather  weather  extreme  economic  damage  AI  artificial  intelligence  Elon  Musk 
january 2016 by asterisk2a
Nuclear 'Command And Control': A History Of False Alarms And Near Catastrophes : NPR
Mankind is a lot better at creating complex technological systems than at controlling them. // "When it comes to nuclear command and control, anything less than perfection is unacceptable because of how devastatingly powerful these weapons are." [...] His new book, Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety, is a critical look at the history of the nation's nuclear weapons systems — and a terrifying account of the fires, explosions, false attack alerts and accidentally dropped bombs that plagued America's military throughout the Cold War.
AI  artificial  intelligence  nuclear  power  nuclear  book  Elon  Musk  Stephen  Hawking 
august 2015 by asterisk2a
Elon Musk and Stephen Hawking warn against autonomous weapons - SPIEGEL ONLINE
Killer robots sound more like Hollywood than an acute threat to humanity. But now more than a thousand artificial-intelligence experts are warning that autonomous weapons systems could soon become "what Kalashnikovs are today".
AI  artificial  intelligence  Elon  Musk  Stephen  Hawking 
july 2015 by asterisk2a
Stephen Hawking to answer your questions via his first Reddit forum - CNET
In his first Reddit AMA (Ask Me Anything) forum, the renowned physicist plans to discuss his concerns that artificial intelligence could one day outsmart mankind if we're not careful.
artificial  intelligence  AI  Stephen  Hawking 
july 2015 by asterisk2a
The Future of Artificial Intelligence - Up Next - YouTube
Back in the 1990s, Jeffrey Hawkins became both rich and famous when he invented the Palm Pilot — a device that in no small way ushered in a whole new era of mobile computing. These days, though, he's on a far more ambitious mission. His goal: to build a machine that can think and reason on its own by mimicking the workings of the human brain. In this edition of Up Next, Hawkins opines on both the risks and rewards of artificial intelligence. Series: "Up Next: Perspectives on the Future of Everything" [5/2015] // By 2040 there will be no jobs a robot can't do, according to the head of Carnegie Mellon's Robotics Institute. // &! Now Bill Gates Is 'Concerned' About Artificial Intelligence // &! Nick Bostrom // &! Amara's Law of technology prediction: in the short term we overpredict/overestimate, and in the long run (because of the exponential growth of technology) we underestimate.
AI  artificial  intelligence  Stephen  Hawking  Elon  Musk  Three  Laws  of  Robotics  Future  of  Work  automation  Robotics  Software  Is  Eating  The  World  3D  printing  Manufacturing  Mobile  Creatives  Mobile  Creative  Moore's  Law  exponential  growth  augmented  intelligence  computer  science  STEM 
june 2015 by asterisk2a

related tags

1%  3D  AI  antibiotic  antibiotics  artificial  augmented  automation  blackswan  book  change  climate  computer  consequences  Creative  Creatives  crisis  damage  disability  disaster  drinking  Eating  ecological  economic  editing  Elon  environmental  era  Euthanasia  exponential  extinction  extreme  finite  food  Fukushima  Future  gene  geneticallyengineered  genetics  GFC  global  growth  Hawking  health  human  humanitarian  humanity  inequality  intelligence  interest  Is  Law  Laws  Manufacturing  mass  maximisation  Mobile  Moore's  Musk  National  nuclear  of  pollution  population  post-antibiotic  power  printing  profit  race  resistance  resources  Rich  rights  Robotics  scarcity  science  security  self-regulation  shared  shareholder  sick  singularity  Software  STEM  Stephen  Street  Super  supply  system  The  Three  tragedy  unintended  unknown  value  Wall  war  warming  waste  water  watersupply  weather  Work  World 