Superintelligence: The Idea That Eats Smart People
"Note especially that the constructs we use in AI are fairly opaque after training. They don't work in the way that the superintelligence scenario needs them to work. There's no place to recursively tweak to make them 'better', short of retraining on even more data."
ai  machinelearning  futurism 
You’re working in the wrong place. – Amar Singh – Medium
Open plan offices "optimize for a long tail event at the expense of something every single employee will benefit from — focus."
office  architecture  tech 
3 days ago
pretrained.ml - Deep learning models with demos
Sortable and searchable compilation of pre-trained deep learning models. With demos and code.
9 days ago
Who Will Command The Robot Armies?
The people who design these devices don't think about how they are supposed to peacefully coexist in a world full of other smart objects.

This raises the question of who will step up and figure out how to make the Internet of Things work together as a cohesive whole.

Of course, the answer is hackers!

Over the last two decades, the government's ability to spy on its citizens has grown immeasurably.

Mostly this is due to technology transfer from the commercial Internet, whose economic model is mass surveillance. Techniques and software that work in the marketplace are quickly adopted by intelligence agencies worldwide.
technology  politics 
10 days ago
Are Index Funds Communist? - Bloomberg
Again: I know this is silly. But as a wild extrapolation of the far future of financial capitalism, I submit to you that it is less silly than the "Silent Road to Serfdom" thesis. That thesis is that, in the long run, financial markets will tend toward mindlessness, a sort of central planning -- by an index fund -- that is worse than 1950s communism because it's not even trying to make the right decisions.

The alternative view is that, in the long run, financial markets will tend toward perfect knowledge, a sort of central planning -- by the Best Capital Allocating Robot -- that is better than Marxism because it is perfectly informed and ideally rational. And once you have that, you can shut down the market: The game is over, and the Best Capital Allocating Robot won. The Fraser-Jenkins thesis is that algorithmic investing runs the risk of destroying capitalism by abandoning the pursuit of knowledge. But the really fun alternative is that it runs the risk of destroying capitalism by perfecting that pursuit: Once you have solved the socialist calculation problem, what do you need markets for?
economics  algorithms  finance 
10 days ago
Why is Kullback-Leibler divergence not a distance?
As an example, consider the probability densities below, one exponential and one gamma with a shape parameter of 2.

The two densities differ mostly on the left end. The exponential distribution believes this region is likely while the gamma does not. This means that an expectation with respect to the exponential distribution will weigh things in this region more heavily. In an information-theoretic sense, an exponential is a better approximation to a gamma than the other way around.
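The asymmetry is easy to check numerically. Below is a quick sketch (not from the linked article) that computes both directions of KL divergence between an Exponential(1) and a Gamma(shape 2, rate 1) density by simple midpoint-rule integration; the densities and the integration scheme are my choices for illustration.

```python
import math

def expon_logpdf(x):      # Exponential(1): log f(x) = -x
    return -x

def gamma2_logpdf(x):     # Gamma(shape=2, rate=1): log g(x) = log x - x
    return math.log(x) - x

def kl(logf, logg, n=200000, xmax=50.0):
    """KL(f||g) = E_f[log f(x) - log g(x)], via midpoint rule on (0, xmax)."""
    h = xmax / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        total += math.exp(logf(x)) * (logf(x) - logg(x)) * h
    return total

kl_fg = kl(expon_logpdf, gamma2_logpdf)   # ≈ 0.5772 (Euler–Mascheroni γ)
kl_gf = kl(gamma2_logpdf, expon_logpdf)   # ≈ 0.4228 (1 − γ)
```

The two values differ, so KL is not symmetric, and KL(gamma || exponential) is the smaller of the two, matching the article's claim that the exponential is the better approximation to the gamma, not the other way around.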
statistics  InformationTheory 
13 days ago
[1710.06068] Data analysis recipes: Using Markov Chain Monte Carlo
David W. Hogg (Flatiron) (NYU) (MPIA), Daniel Foreman-Mackey (Flatiron) (UW)

Markov Chain Monte Carlo (MCMC) methods for sampling probability density functions (combined with abundant computational resources) have transformed the sciences, especially in performing probabilistic inferences, or fitting models to data. In this primarily pedagogical contribution, we give a brief overview of the most basic MCMC method and some practical advice for the use of MCMC in real inference problems. We give advice on method choice, tuning for performance, methods for initialization, tests of convergence, troubleshooting, and use of the chain output to produce or report parameter estimates with associated uncertainties. We argue that autocorrelation time is the most important test for convergence, as it directly connects to the uncertainty on the sampling estimate of any quantity of interest. We emphasize that sampling is a method for doing integrals; this guides our thinking about how MCMC output is best used.
statistics  probabilisticprogramming  probability  machinelearning  maths 
16 days ago
Beautiful differentiation
Automatic differentiation (AD) is a precise, efficient, and convenient method for computing derivatives of functions. Its forward-mode implementation can be quite simple even when extended to compute all of the higher-order derivatives as well. The higher-dimensional case has also been tackled, though with extra complexity. This paper develops an implementation of higher-dimensional, higher-order, forward-mode AD in the extremely general and elegant setting of calculus on manifolds and derives that implementation from a simple and precise specification.

In order to motivate and discover the implementation, the paper poses the question “What does AD mean, independently of implementation?” An answer arises in the form of naturality of sampling a function and its derivative. Automatic differentiation flows out of this naturality condition, together with the chain rule. Graduating from first-order to higher-order AD corresponds to sampling all derivatives instead of just one. Next, the setting is expanded to arbitrary vector spaces, in which derivative values are linear maps. The specification of AD adapts to this elegant and very general setting, which even simplifies the development.
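The paper's development is in Haskell, but the core of first-order forward-mode AD it starts from can be sketched in a few lines of Python with dual numbers (a value paired with its derivative, where the product rule falls out of ε² = 0). This is a minimal illustration of the general idea, not the paper's implementation:

```python
class Dual:
    """Dual number a + b·ε with ε² = 0: carries a value and its derivative."""
    def __init__(self, val, deriv=0.0):
        self.val, self.deriv = val, deriv
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.deriv + other.deriv)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # The product rule emerges from multiplying out (a + bε)(c + dε).
        return Dual(self.val * other.val,
                    self.val * other.deriv + self.deriv * other.val)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f at x + ε; the ε coefficient of the result is f'(x)."""
    return f(Dual(x, 1.0)).deriv

# f(x) = x³ + 2x, so f'(x) = 3x² + 2, and f'(2) = 14.
fprime_at_2 = derivative(lambda x: x * x * x + 2 * x, 2.0)
```

"Graduating from first-order to higher-order AD" in the paper's sense corresponds to carrying a whole tower of derivatives instead of the single `deriv` slot here.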
haskell  math  programming 
16 days ago
Floating Point Demystified, Part 1
To turn things around, think about time_t. time_t is a type defined to represent the number of seconds since the epoch of 1970-01-01 00:00 UTC. It has traditionally been defined as a 32-bit signed integer (which means that it will overflow in the year 2038). Imagine that a 32-bit single-precision float had been chosen instead.

With a float time_t, there would be no overflow until the year 5395141535403007094485264579465 AD, long after the Sun has swallowed up the Earth as a Red Giant, and turned into a Black Dwarf. However! With this scheme the granularity of timekeeping would get worse and worse the farther we got from 1970. Unlike the int32 which gives second granularity all the way until 2038, with a float time_t we would already in 2014 be down to a precision of 128 seconds — far too coarse to be useful.
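The 128-second figure can be verified directly: a single-precision float has a 24-bit significand, so for values between 2³⁰ and 2³¹ (which epoch seconds reached long before 2014), adjacent representable floats are 2⁷ = 128 apart. A small stdlib-only sketch (my own, using `struct` to bit-twiddle a float32):

```python
import struct

def float32_spacing(x):
    """Gap between x and the next representable 32-bit float above it (x > 0)."""
    rounded = struct.unpack('<f', struct.pack('<f', x))[0]
    bits = struct.unpack('<I', struct.pack('<f', x))[0]
    next_up = struct.unpack('<f', struct.pack('<I', bits + 1))[0]
    return next_up - rounded

seconds_in_2014 = 44 * 365 * 24 * 3600   # roughly 1.39e9 seconds after 1970
gap = float32_spacing(float(seconds_in_2014))  # 128.0: ticks 128 seconds apart
```

The same function shows the granularity degrading over time: near 1970 (small epoch values) the spacing is tiny, and it doubles every time the value crosses a power of two.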
math  programming 
16 days ago
Michael Clayton’s Motorola Pebl: A Remembrance
The phone is as perfectly chosen for drama as it is for characterization. There is a showmanship to opening a flip phone — a dramatic flair that’s not possible with the touchscreen swipe. According to an early 2006 CNET review, “When you hold it in one hand, you can open the Pebl by sliding the front flap toward you (away from the hinge) with your thumb. The flap then flips open in one easy stroke … It opens so quickly and with such force that we felt as if the Pebl would fly out of our hand.” (You know what else opens “so quickly and with such force”? The 2007 motion picture Michael Clayton, which begins with a blistering monologue by the great Tom Wilkinson.)
16 days ago
Hack the derivative
Finite difference with complex analysis trick
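The trick in question is complex-step differentiation: evaluate f at x + ih and read the derivative off the imaginary part, f'(x) ≈ Im f(x + ih) / h. Unlike the ordinary finite difference (f(x+h) − f(x)) / h there is no subtraction of nearly equal numbers, so h can be made absurdly small without cancellation error. A minimal sketch (my own example function, not from the article):

```python
import cmath

def complex_step_derivative(f, x, h=1e-20):
    """f'(x) ≈ Im f(x + ih) / h — no subtractive cancellation, so h can be
    far smaller than any usable finite-difference step."""
    return f(complex(x, h)).imag / h

# d/dx sin(x) = cos(x); the result is accurate to machine precision.
d = complex_step_derivative(cmath.sin, 1.0)   # ≈ cos(1) = 0.5403...
```

The catch is that f must be analytic and implemented so it accepts complex arguments, which is why the trick pairs naturally with Python's duck typing.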
calculus  math  python 
17 days ago
Web Design - The First 100 Years
When things are doubling, the only sane place to be is at the cutting edge. By definition, exponential growth means the thing that comes next will be equal in importance to everything that came before. So if you're not working on the next big thing, you're nothing.

This leads to a contempt for the past. Too much of what was created in the last fifty years is gone because no one took care to preserve it.

So the world of the near future is one of power constrained devices in a bandwidth-constrained environment. It's very different from the recent past, where hardware performance went up like clockwork, with more storage and faster CPUs every year.

And as designers, you should be jumping up and down with relief, because hard constraints are the midwife to good design. The past couple of decades have left us with what I call an exponential hangover.

A further symptom of our exponential hangover is bloat. As soon as a system shows signs of performance, developers will add enough abstraction to make it borderline unusable. Software forever remains at the limits of what people will put up with. Developers and designers together create overweight systems in hopes that the hardware will catch up in time and cover their mistakes.

Technological Utopianism has been tried before and led to some pretty bad results. There's no excuse for not studying the history of positivism, scientific Marxism and other attempts to rationalize the world, before making similar promises about what you will do with software.
technology  design  culture  history  aviation  web 
18 days ago
How I Start. Go
Excellent, practical intro tutorial to Go
programming  golang 
18 days ago
Time Bandits | Rick Perlstein
Comparing Trump and Nixon

"I peered into the blackness in my remote studio in Chicago as Heilemann asked me in my earpiece, “What parallels do you see between Trump as a candidate and the way Nixon ran in ’68 and ’72?” I said there were some, but the demagoguery that marked Trump should more accurately be traced to the broader context of Republican electioneering going back to Joseph McCarthy. I suggested that the genealogy of Trumpism runs not just through Nixon but also through Reagan and Newt Gingrich’s revolution of 1994, and really through all previous Republican campaigns. I also cautioned that, in important respects, the dunderheaded Trump was a very poor heir indeed to an experienced and subtle political and geostrategic actor like Nixon. I noted that “the candidate in 1968 who really defined the Trump position was this guy George Wallace,” and suggested that we need to begin broadening the discussion to encompass Europe’s experience with fascism if we really want to understand Trump. What’s more, I said, considering that the ur-establishment candidate Jeb Bush announced that he, too, would consider a ban on Muslim immigration, we need to think more about the Republican Party as an institution and less about Trump as an individual."
politics  usa  history  1960s  republican 
18 days ago
@20 (Ftrain.com)
I'm in the middle right now. Young company, young kids, unfinished book, 40s, sore back, facing bariatric uncertainties and paying down the mortgage. 20 years is arbitrary nonsense. A blip. Our software is bullshit, our literary essays are too long, the good editors all quit or got fired, hardly anyone is experimenting with form in a way that wakes me up, the IDEs haven't caught up with the 1970s, the R&D budgets are weak, the little zines are badly edited, the tweets are poor, the short stories make no sense, people still care too much about magazines, the Facebook posts are nightmares, LinkedIn has ruined capitalism, and the big tech companies that have arisen are exhausting, lumbering gold-thirsty kraken that swim around with sour looks on their face wondering why we won't just give them all our gold and save the time. With every flap of their terrible fins they squash another good idea in the interest of consolidating pablum into a single database, the better to jam it down our mental baby duck feeding tubes in order to make even more of the cognitive paté that Silicon Valley is at pains to proclaim a delicacy. Social media is veal calves being served tasty veal. In the spirit of this thing I won't be editing this paragraph.
internet  culture  politics  business  writing 
26 days ago