nhaliday + applicability-prereqs   29

Eliminative materialism - Wikipedia
Eliminative materialism (also called eliminativism) is the claim that people's common-sense understanding of the mind (or folk psychology) is false and that certain classes of mental states that most people believe in do not exist.[1] It is a materialist position in the philosophy of mind. Some supporters of eliminativism argue that no coherent neural basis will be found for many everyday psychological concepts such as belief or desire, since they are poorly defined. Rather, they argue that psychological concepts of behaviour and experience should be judged by how well they reduce to the biological level.[2] Other versions entail the non-existence of conscious mental states such as pain and visual perceptions.[3]

Eliminativism about a class of entities is the view that that class of entities does not exist.[4] For example, materialism tends to be eliminativist about the soul; modern chemists are eliminativist about phlogiston; and modern physicists are eliminativist about the existence of luminiferous aether. Eliminative materialism is the relatively new (1960s–1970s) idea that certain classes of mental entities that common sense takes for granted, such as beliefs, desires, and the subjective sensation of pain, do not exist.[5][6] The most common versions are eliminativism about propositional attitudes, as expressed by Paul and Patricia Churchland,[7] and eliminativism about qualia (subjective interpretations about particular instances of subjective experience), as expressed by Daniel Dennett and Georges Rey.[3] These philosophers often appeal to an introspection illusion.

In the context of materialist understandings of psychology, eliminativism stands in opposition to reductive materialism which argues that mental states as conventionally understood do exist, and that they directly correspond to the physical state of the nervous system.[8] An intermediate position is revisionary materialism, which will often argue that the mental state in question will prove to be somewhat reducible to physical phenomena—with some changes needed to the common sense concept.

Since eliminative materialism claims that future research will fail to find a neuronal basis for various mental phenomena, it must necessarily wait for science to progress further. One might question the position on these grounds, but other philosophers like Churchland argue that eliminativism is often necessary in order to open the minds of thinkers to new evidence and better explanations.[8]
concept  conceptual-vocab  philosophy  ideology  thinking  metameta  weird  realness  psychology  cog-psych  neurons  neuro  brain-scan  reduction  complex-systems  cybernetics  wiki  reference  parallax  truth  dennett  within-without  the-self  subjective-objective  absolute-relative  deep-materialism  new-religion  identity  analytical-holistic  systematic-ad-hoc  science  theory-practice  theory-of-mind  applicability-prereqs  nihil  lexical 
april 2018 by nhaliday
Prisoner's dilemma - Wikipedia
caveat to result below:
An extension of the IPD is an evolutionary stochastic IPD, in which the relative abundance of particular strategies is allowed to change, with more successful strategies relatively increasing. This process may be accomplished by having less successful players imitate the more successful strategies, or by eliminating less successful players from the game, while multiplying the more successful ones. It has been shown that unfair ZD strategies are not evolutionarily stable. The key intuition is that an evolutionarily stable strategy must not only be able to invade another population (which extortionary ZD strategies can do) but must also perform well against other players of the same type (which extortionary ZD players do poorly, because they reduce each other's surplus).[14]

Theory and simulations confirm that beyond a critical population size, ZD extortion loses out in evolutionary competition against more cooperative strategies, and as a result, the average payoff in the population increases when the population is bigger. In addition, there are some cases in which extortioners may even catalyze cooperation by helping to break out of a face-off between uniform defectors and win–stay, lose–switch agents.[8]
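The claim that extortionate ZD strategies are not evolutionarily stable can be probed with a small simulation. The sketch below is illustrative, not the cited paper's model: memory-one strategies play an infinitely iterated PD (long-run payoffs come from the stationary distribution of the induced Markov chain over the four outcome states), and strategy shares then evolve by discrete replicator dynamics. The strategy roster, the noise level, and the extortion factor are all my choices.

```python
import numpy as np

# Standard PD payoffs (T, R, P, S) = (5, 3, 1, 0).
# States from the row player's view, in order (CC, CD, DC, DD).
PAY_X = np.array([3.0, 0.0, 5.0, 1.0])
PAY_Y = np.array([3.0, 5.0, 0.0, 1.0])

def long_run_payoffs(p, q, eps=1e-3):
    """Stationary payoffs for memory-one strategies p (row) and q (column).
    p[s] = prob. the row player cooperates after state s; a little noise eps
    guarantees a unique stationary distribution."""
    p = np.clip(np.asarray(p, float), eps, 1 - eps)
    q = np.clip(np.asarray(q, float), eps, 1 - eps)
    qy = q[[0, 2, 1, 3]]          # the column player sees CD and DC swapped
    M = np.column_stack([p * qy, p * (1 - qy), (1 - p) * qy, (1 - p) * (1 - qy)])
    # stationary distribution: v M = v, sum(v) = 1
    A = np.vstack([M.T - np.eye(4), np.ones(4)])
    v = np.linalg.lstsq(A, np.array([0, 0, 0, 0, 1.0]), rcond=None)[0]
    return PAY_X @ v, PAY_Y @ v

strategies = {
    "Extort-3": [11/13, 1/2, 7/26, 0],   # ZD extortion with chi = 3
    "TFT":      [1, 0, 1, 0],
    "WSLS":     [1, 0, 0, 1],            # win-stay, lose-switch
    "ALLC":     [1, 1, 1, 1],
    "ALLD":     [0, 0, 0, 0],
}
names = list(strategies)
n = len(names)
A = np.array([[long_run_payoffs(strategies[a], strategies[b])[0]
               for b in names] for a in names])

# discrete replicator dynamics from a uniform initial population
x = np.full(n, 1.0 / n)
for _ in range(5000):
    f = A @ x                     # fitness of each strategy
    x = x * f / (x @ f)           # shares grow in proportion to fitness

for name, share in zip(names, x):
    print(f"{name:9s} {share:.3f}")
```

The key intuition from the text is visible in the payoff matrix itself: extortioners earn only the punishment payoff against their own type, while cooperative strategies like WSLS earn close to the reward payoff against theirs.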

https://alfanl.com/2018/04/12/defection/
Nature boils down to a few simple concepts.

Haters will point out that I oversimplify. The haters are wrong. I am good at saying a lot with few words. Nature indeed boils down to a few simple concepts.

In life, you can either cooperate or defect.

Used to be that defection was the dominant strategy, say in the time when the Roman empire started to crumble. Everybody complained about everybody and in the end nothing got done. Then came Jesus, who told people to be loving and cooperative, and boom: 1800 years later we get the industrial revolution.

Because of Jesus we now find ourselves in a situation where cooperation is the dominant strategy. A normie engages in a ton of cooperation: with the tax collector who wants more and more of his money, with schools who want more and more of his kid’s time, with media who wants him to repeat more and more party lines, with the Zeitgeist of the Collective Spirit of the People’s Progress Towards a New Utopia. Essentially, our normie is cooperating himself into a crumbling Western empire.

Turns out that if everyone blindly cooperates, parasites sprout up like weeds until defection once again becomes the standard.

The point of a post-Christian religion is to once again create conditions for the kind of cooperation that led to the industrial revolution. This necessitates throwing out undead Christianity: you do not blindly cooperate. You cooperate with people that cooperate with you, you defect on people that defect on you. Christianity mixed with Darwinism. God and Gnon meet.

This also means we re-establish spiritual hierarchy, which, like regular hierarchy, is a prerequisite for cooperation. It is this hierarchical cooperation that turns a household into a force to be reckoned with, that allows a group of men to unite as a front against their enemies, that allows a tribe to conquer the world. Remember: Scientology bullied the Cathedral’s tax department into submission.

With a functioning hierarchy, men still gossip, lie and scheme, but they will do so in whispers behind closed doors. In your face they cooperate and contribute to the group’s wellbeing, because incentives are such that contributing to group wellbeing heightens status.

Without a functioning hierarchy, men gossip, lie and scheme, but they do so in your face, and they tell you that you are positively deluded for accusing them of gossiping, lying and scheming. Seeds will not sprout in such ground.

Spiritual dominance is established in the same way any sort of dominance is established: fought for, taken. But the fight is ritualistic. You can’t force spiritual dominance if no one listens, or if you are silenced the ritual is not allowed to happen.

If one of our priests is forbidden from establishing spiritual dominance, that is a sure sign an enemy priest is in better control and has a vested interest in preventing you from establishing spiritual dominance.

They defect on you, you defect on them. Let them suffer the consequences of enemy priesthood, which is characterized by, among other things, the annoying tendency to say very little with very many words.

https://contingentnotarbitrary.com/2018/04/14/rederiving-christianity/
To recap, we started with a secular definition of Logos and noted that its telos is existence. Given human nature, game theory and the power of cooperation, the highest expression of that telos is freely chosen universal love, tempered by constant vigilance against defection while maintaining compassion for the defectors and forgiving those who repent. In addition, we must know the telos in order to fulfill it.

In Christian terms, looks like we got over half of the Ten Commandments (know Logos for the First, don’t defect or tempt yourself to defect for the rest), the importance of free will, the indestructibility of evil (group cooperation vs individual defection), loving the sinner and hating the sin (with defection as the sin), forgiveness (with conditions), and love and compassion toward all, assuming only secular knowledge and that it’s good to exist.

Iterated Prisoner's Dilemma is an Ultimatum Game: http://infoproc.blogspot.com/2012/07/iterated-prisoners-dilemma-is-ultimatum.html
The history of IPD shows that bounded cognition prevented the dominant strategies from being discovered for over 60 years, despite significant attention from game theorists, computer scientists, economists, evolutionary biologists, etc. Press and Dyson have shown that IPD is effectively an ultimatum game, which is very different from the Tit for Tat stories told by generations of people who worked on IPD (Axelrod, Dawkins, etc., etc.).
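The Press–Dyson result can be checked numerically: an extortionate ZD strategy unilaterally pins both long-run payoffs to a line, s_X − P = χ(s_Y − P), no matter what the opponent does, which is what makes the iterated game an ultimatum-like offer. A minimal sketch, using the standard payoffs (T, R, P, S) = (5, 3, 1, 0) and the χ = 3 extortion strategy from the Press–Dyson paper; the random opponents are my own illustration:

```python
import numpy as np

PAY_X = np.array([3.0, 0.0, 5.0, 1.0])   # row payoffs in states (CC, CD, DC, DD)
PAY_Y = np.array([3.0, 5.0, 0.0, 1.0])

def long_run_payoffs(p, q):
    """Payoffs under the stationary distribution of the Markov chain induced
    by two memory-one strategies (p, q = cooperation probabilities by state)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    qy = q[[0, 2, 1, 3]]                  # the opponent's view swaps CD and DC
    M = np.column_stack([p * qy, p * (1 - qy), (1 - p) * qy, (1 - p) * (1 - qy)])
    A = np.vstack([M.T - np.eye(4), np.ones(4)])   # v M = v, sum(v) = 1
    v = np.linalg.lstsq(A, np.array([0, 0, 0, 0, 1.0]), rcond=None)[0]
    return PAY_X @ v, PAY_Y @ v

extort3 = [11/13, 1/2, 7/26, 0]           # ZD extortion with chi = 3
rng = np.random.default_rng(0)
for _ in range(5):
    q = rng.uniform(0.05, 0.95, 4)        # an arbitrary interior opponent
    sx, sy = long_run_payoffs(extort3, q)
    print(f"s_X = {sx:.3f}  s_Y = {sy:.3f}  (s_X-1)/(s_Y-1) = {(sx - 1) / (sy - 1):.3f}")
```

Whatever the opponent plays, the extortioner's surplus over P is exactly three times the opponent's: the opponent can only choose where on the line to sit, like a responder facing a fixed split.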

...

For evolutionary biologists: Dyson clearly thinks this result has implications for multilevel (group vs individual selection):
... Cooperation loses and defection wins. The ZD strategies confirm this conclusion and make it sharper. ... The system evolved to give cooperative tribes an advantage over non-cooperative tribes, using punishment to give cooperation an evolutionary advantage within the tribe. This double selection of tribes and individuals goes way beyond the Prisoners' Dilemma model.

implications for fractionalized Europe vis-a-vis unified China?

and more broadly does this just imply we're doomed in the long run RE: cooperation, morality, the "good society", so on...? war and group-selection is the only way to get a non-crab bucket civilization?

Iterated Prisoner’s Dilemma contains strategies that dominate any evolutionary opponent:
http://www.pnas.org/content/109/26/10409.full
http://www.pnas.org/content/109/26/10409.full.pdf
https://www.edge.org/conversation/william_h_press-freeman_dyson-on-iterated-prisoners-dilemma-contains-strategies-that

https://en.wikipedia.org/wiki/Ultimatum_game

analogy for ultimatum game: the state gives the demos a take-it-or-leave-it bargain, and...if the demos refuses...violence?

The nature of human altruism: http://sci-hub.tw/https://www.nature.com/articles/nature02043
- Ernst Fehr & Urs Fischbacher

Some of the most fundamental questions concerning our evolutionary origins, our social relations, and the organization of society are centred around issues of altruism and selfishness. Experimental evidence indicates that human altruism is a powerful force and is unique in the animal world. However, there is much individual heterogeneity and the interaction between altruists and selfish individuals is vital to human cooperation. Depending on the environment, a minority of altruists can force a majority of selfish individuals to cooperate or, conversely, a few egoists can induce a large number of altruists to defect. Current gene-based evolutionary theories cannot explain important patterns of human altruism, pointing towards the importance of both theories of cultural evolution as well as gene–culture co-evolution.

...

Why are humans so unusual among animals in this respect? We propose that quantitatively, and probably even qualitatively, unique patterns of human altruism provide the answer to this question. Human altruism goes far beyond that which has been observed in the animal world. Among animals, fitness-reducing acts that confer fitness benefits on other individuals are largely restricted to kin groups; despite several decades of research, evidence for reciprocal altruism in pair-wise repeated encounters4,5 remains scarce6–8. Likewise, there is little evidence so far that individual reputation building affects cooperation in animals, which contrasts strongly with what we find in humans. If we randomly pick two human strangers from a modern society and give them the chance to engage in repeated anonymous exchanges in a laboratory experiment, there is a high probability that reciprocally altruistic behaviour will emerge spontaneously9,10.

However, human altruism extends far beyond reciprocal altruism and reputation-based cooperation, taking the form of strong reciprocity11,12. Strong reciprocity is a combination of altruistic rewarding, which is a predisposition to reward others for cooperative, norm-abiding behaviours, and altruistic punishment, which is a propensity to impose sanctions on others for norm violations. Strong reciprocators bear the cost of rewarding or punishing even if they gain no individual economic benefit whatsoever from their acts. In contrast, reciprocal altruists, as they have been defined in the biological literature4,5, reward and punish only if this is in their long-term self-interest. Strong reciprocity thus constitutes a powerful incentive for cooperation even in non-repeated interactions and when reputation gains are absent, because strong reciprocators will reward those who cooperate and punish those who defect.

...

We will show that the interaction between selfish and strongly reciprocal … [more]
concept  conceptual-vocab  wiki  reference  article  models  GT-101  game-theory  anthropology  cultural-dynamics  trust  cooperate-defect  coordination  iteration-recursion  sequential  axelrod  discrete  smoothness  evolution  evopsych  EGT  economics  behavioral-econ  sociology  new-religion  deep-materialism  volo-avolo  characterization  hsu  scitariat  altruism  justice  group-selection  decision-making  tribalism  organizing  hari-seldon  theory-practice  applicability-prereqs  bio  finiteness  multi  history  science  social-science  decision-theory  commentary  study  summary  giants  the-trenches  zero-positive-sum  🔬  bounded-cognition  info-dynamics  org:edge  explanation  exposition  org:nat  eden  retention  long-short-run  darwinian  markov  equilibrium  linear-algebra  nitty-gritty  competition  war  explanans  n-factor  europe  the-great-west-whale  occident  china  asia  sinosphere  orient  decentralized  markets  market-failure  cohesion  metabuch  stylized-facts  interdisciplinary  physics  pdf  pessimism  time  insight  the-basilisk  noblesse-oblige  the-watchers  ideas  l 
march 2018 by nhaliday
The weirdest people in the world?
Abstract: Behavioral scientists routinely publish broad claims about human psychology and behavior in the world’s top journals based on samples drawn entirely from Western, Educated, Industrialized, Rich, and Democratic (WEIRD) societies. Researchers – often implicitly – assume that either there is little variation across human populations, or that these “standard subjects” are as representative of the species as any other population. Are these assumptions justified? Here, our review of the comparative database from across the behavioral sciences suggests both that there is substantial variability in experimental results across populations and that WEIRD subjects are particularly unusual compared with the rest of the species – frequent outliers. The domains reviewed include visual perception, fairness, cooperation, spatial reasoning, categorization and inferential induction, moral reasoning, reasoning styles, self-concepts and related motivations, and the heritability of IQ. The findings suggest that members of WEIRD societies, including young children, are among the least representative populations one could find for generalizing about humans. Many of these findings involve domains that are associated with fundamental aspects of psychology, motivation, and behavior – hence, there are no obvious a priori grounds for claiming that a particular behavioral phenomenon is universal based on sampling from a single subpopulation. Overall, these empirical patterns suggest that we need to be less cavalier in addressing questions of human nature on the basis of data drawn from this particularly thin, and rather unusual, slice of humanity. We close by proposing ways to structurally re-organize the behavioral sciences to best tackle these challenges.
pdf  study  microfoundations  anthropology  cultural-dynamics  sociology  psychology  social-psych  cog-psych  iq  biodet  behavioral-gen  variance-components  psychometrics  psych-architecture  visuo  spatial  morality  individualism-collectivism  n-factor  justice  egalitarianism-hierarchy  cooperate-defect  outliers  homo-hetero  evopsych  generalization  henrich  europe  the-great-west-whale  occident  organizing  🌞  universalism-particularism  applicability-prereqs  hari-seldon  extrema  comparison  GT-101  ecology  EGT  reinforcement  anglo  language  gavisti  heavy-industry  marginal  absolute-relative  reason  stylized-facts  nature  systematic-ad-hoc  analytical-holistic  science  modernity  behavioral-econ  s:*  illusion  cool  hmm  coordination  self-interest  social-norms  population  density  humanity  sapiens  farmers-and-foragers  free-riding  anglosphere  cost-benefit  china  asia  sinosphere  MENA  world  developing-world  neurons  theory-of-mind  network-structure  nordic  orient  signum  biases  usa  optimism  hypocrisy  humility  within-without  volo-avolo  domes 
november 2017 by nhaliday
All models are wrong - Wikipedia
Box repeated the aphorism in a paper that was published in the proceedings of a 1978 statistics workshop.[2] The paper contains a section entitled "All models are wrong but some are useful". The section is copied below.

Now it would be very remarkable if any system existing in the real world could be exactly represented by any simple model. However, cunningly chosen parsimonious models often do provide remarkably useful approximations. For example, the law PV = RT relating pressure P, volume V and temperature T of an "ideal" gas via a constant R is not exactly true for any real gas, but it frequently provides a useful approximation and furthermore its structure is informative since it springs from a physical view of the behavior of gas molecules.

For such a model there is no need to ask the question "Is the model true?". If "truth" is to be the "whole truth" the answer must be "No". The only question of interest is "Is the model illuminating and useful?".
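Box's example is easy to make concrete. Here is a quick numerical check of how useful the "wrong" model PV = RT is, for one mole of CO2 at room temperature, against the van der Waals equation with standard textbook constants (a ≈ 3.59 L²·atm/mol², b ≈ 0.0427 L/mol); the specific gas and conditions are my choice of illustration, not from the article:

```python
R = 0.08206          # gas constant, L·atm/(mol·K)
T = 300.0            # temperature, K
V = 1.0              # molar volume, L/mol
a, b = 3.59, 0.0427  # van der Waals constants for CO2 (textbook values)

p_ideal = R * T / V                   # ideal gas law: PV = RT
p_vdw = R * T / (V - b) - a / V**2    # van der Waals: (P + a/V^2)(V - b) = RT

print(f"ideal gas:     {p_ideal:.2f} atm")
print(f"van der Waals: {p_vdw:.2f} atm")
print(f"relative error of the ideal model: {abs(p_ideal - p_vdw) / p_vdw:.1%}")
```

The ideal model is off by roughly ten percent here: wrong in exactly Box's sense, yet close enough to be illuminating and useful, and its error shrinks as the gas becomes more dilute.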
thinking  metabuch  metameta  map-territory  models  accuracy  wire-guided  truth  philosophy  stats  data-science  methodology  lens  wiki  reference  complex-systems  occam  parsimony  science  nibble  hi-order-bits  info-dynamics  the-trenches  meta:science  physics  fluid  thermo  stat-mech  applicability-prereqs  theory-practice 
august 2017 by nhaliday
Econometric Modeling as Junk Science
The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics: https://www.aeaweb.org/articles?id=10.1257/jep.24.2.3

On data, experiments, incentives and highly unconvincing research – papers and hot beverages: https://papersandhotbeverages.wordpress.com/2015/10/31/on-data-experiments-incentives-and-highly-unconvincing-research/
In my view, it has just to do with the fact that academia is a peer monitored organization. In the case of (bad) data collection papers, issues related to measurement are typically boring. They are relegated to appendices, no one really has an incentive to monitor it seriously. The problem is similar in formal theory: no one really goes through the algebra in detail, but it is in principle feasible to do it, and, actually, sometimes these errors are detected. If discussing the algebra of a proof is almost unthinkable in a seminar, going into the details of data collection, measurement and aggregation is not only hard to imagine, but probably intrinsically infeasible.

Something different happens for the experimentalist people. As I was saying, I feel we have come to a point in which many papers are evaluated based on the cleverness and originality of the research design (“Using the World Cup qualifiers as an instrument for patriotism!? Woaw! how cool/crazy is that! I wish I had had that idea”). The sexiness of the identification strategy has too often become a goal in itself. When your peers monitor you paying more attention to the originality of the identification strategy than to the research question, you probably have an incentive to mine reality for ever crazier discontinuities. It is true methodologists have been criticized in the past for analogous reasons, such as being guided by the desire to increase mathematical complexity without a clear benefit. But, if you work with pure formal theory or statistical theory, your work is not meant to immediately answer question about the real world, but instead to serve other researchers in their quest. This is something that can, in general, not be said of applied CI work.

https://twitter.com/pseudoerasmus/status/662007951415238656
This post should have been entitled “Zombies who only think of their next cool IV fix”
https://twitter.com/pseudoerasmus/status/662692917069422592
massive lust for quasi-natural experiments, regression discontinuities
barely matters if the effects are not all that big
I suppose even the best of things must reach their decadent phase; methodological innov. to manias……

https://twitter.com/cblatts/status/920988530788130816
Following this "collapse of small-N social psych results" business, where do I predict econ will collapse? I see two main contenders.
One is lab studies. I dallied with these a few years ago in a Kenya lab. We ran several pilots of N=200 to figure out the best way to treat
and to measure the outcome. Every pilot gave us a different stat sig result. I could have written six papers concluding different things.
I gave up more skeptical of these lab studies than ever before. The second contender is the long run impacts literature in economic history
We should be very suspicious since we never see a paper showing that a historical event had no effect on modern day institutions or dvpt.
On the one hand I find these studies fun, fascinating, and probably true in a broad sense. They usually reinforce a widely believed history
argument with interesting data and a cute empirical strategy. But I don't think anyone believes the standard errors. There's probably a HUGE
problem of nonsignificant results staying in the file drawer. Also, there are probably data problems that don't get revealed, as we see with
the recent Piketty paper (http://marginalrevolution.com/marginalrevolution/2017/10/pikettys-data-reliable.html). So I take that literature with a vat of salt, even if I enjoy and admire the works
I used to think field experiments would show little consistency in results across place. That external validity concerns would be fatal.
In fact the results across different samples and places have proven surprisingly similar across places, and added a lot to general theory
Last, I've come to believe there is no such thing as a useful instrumental variable. The ones that actually meet the exclusion restriction
are so weird & particular that the local treatment effect is likely far different from the average treatment effect in non-transparent ways.
Most of the other IVs don't plausibly meet the exclusion restriction. I mean, we should be concerned when the IV estimate is always 10x
larger than the OLS coefficient. This I find myself much more persuaded by simple natural experiments that use OLS, diff in diff, or
discontinuities, alongside randomized trials.

What do others think are the cliffs in economics?
PS All of these apply to political science too. Though I have a special extra target in poli sci: survey experiments! A few are good. I like
Dan Corstange's work. But it feels like 60% of dissertations these days are experiments buried in a survey instrument that measure small
changes in response. These at least have large N. But these are just uncontrolled labs, with negligible external validity in my mind.
The good ones are good. This method has its uses. But it's being way over-applied. More people have to make big and risky investments in big
natural and field experiments. Time to raise expectations and ambitions. This expectation bar, not technical ability, is the big advantage
economists have over political scientists when they compete in the same space.
(Ok. So are there any friends and colleagues I haven't insulted this morning? Let me know and I'll try my best to fix it with a screed)

HOW MUCH SHOULD WE TRUST DIFFERENCES-IN-DIFFERENCES ESTIMATES?∗: https://economics.mit.edu/files/750
Most papers that employ Differences-in-Differences estimation (DD) use many years of data and focus on serially correlated outcomes but ignore that the resulting standard errors are inconsistent. To illustrate the severity of this issue, we randomly generate placebo laws in state-level data on female wages from the Current Population Survey. For each law, we use OLS to compute the DD estimate of its “effect” as well as the standard error of this estimate. These conventional DD standard errors severely understate the standard deviation of the estimators: we find an “effect” significant at the 5 percent level for up to 45 percent of the placebo interventions. We use Monte Carlo simulations to investigate how well existing methods help solve this problem. Econometric corrections that place a specific parametric form on the time-series process do not perform well. Bootstrap (taking into account the auto-correlation of the data) works well when the number of states is large enough. Two corrections based on asymptotic approximation of the variance-covariance matrix work well for moderate numbers of states and one correction that collapses the time series information into a “pre” and “post” period and explicitly takes into account the effective sample size works well even for small numbers of states.
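The mechanics of the over-rejection are easy to reproduce. Below is a minimal re-creation of the paper's placebo exercise on synthetic data rather than CPS wages; all parameters (20 states, 20 years, AR(1) errors with ρ = 0.8, adoption at mid-sample) are my choices. A placebo "law" with zero true effect, estimated by OLS with state and year fixed effects and conventional standard errors, rejects far more often than the nominal 5%:

```python
import numpy as np

rng = np.random.default_rng(0)
S, T, rho, n_sims = 20, 20, 0.8, 200

# treatment dummy: half the states adopt the placebo law at mid-sample
D = np.zeros((S, T))
D[: S // 2, T // 2:] = 1.0

# design matrix: treatment + S state dummies + (T-1) year dummies
X = np.column_stack([
    D.ravel(),
    np.kron(np.eye(S), np.ones((T, 1))),      # state fixed effects
    np.kron(np.ones((S, 1)), np.eye(T))[:, 1:],  # year fixed effects
])
n, k = X.shape
XtX_inv = np.linalg.inv(X.T @ X)

rejections = 0
for _ in range(n_sims):
    # serially correlated state-level errors; no true treatment effect
    eps = np.zeros((S, T))
    eps[:, 0] = rng.standard_normal(S) / np.sqrt(1 - rho**2)
    for t in range(1, T):
        eps[:, t] = rho * eps[:, t - 1] + rng.standard_normal(S)
    y = eps.ravel()

    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - k)          # conventional (iid) variance
    se = np.sqrt(sigma2 * XtX_inv[0, 0])
    if abs(beta[0] / se) > 1.96:
        rejections += 1

print(f"nominal 5% test rejected in {rejections / n_sims:.0%} of placebo runs")
```

Clustering the standard errors by state, or collapsing to pre/post means as the paper recommends, brings the rejection rate back toward its nominal level.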

‘METRICS MONDAY: 2SLS–CHRONICLE OF A DEATH FORETOLD: http://marcfbellemare.com/wordpress/12733
As it turns out, Young finds that
1. Conventional tests tend to overreject the null hypothesis that the 2SLS coefficient is equal to zero.
2. 2SLS estimates are falsely declared significant one third to one half of the time, depending on the method used for bootstrapping.
3. The 99-percent confidence intervals (CIs) of those 2SLS estimates include the OLS point estimate over 90 percent of the time. They include the full OLS 99-percent CI over 75 percent of the time.
4. 2SLS estimates are extremely sensitive to outliers. Removing just one outlying cluster or observation makes almost half of 2SLS results insignificant; removing two makes over 60 percent of them insignificant.
5. Using a Durbin-Wu-Hausman test, less than 15 percent of regressions can reject the null that OLS estimates are unbiased at the 1-percent level.
6. 2SLS has considerably higher mean squared error than OLS.
7. In one third to one half of published results, the null that the IVs are totally irrelevant cannot be rejected, and so the correlation between the endogenous variable(s) and the IVs is due to finite sample correlation between them.
8. Finally, fewer than 10 percent of 2SLS estimates reject instrument irrelevance and the absence of OLS bias at the 1-percent level using a Durbin-Wu-Hausman test. It gets much worse–fewer than 5 percent–if you add in the requirement that the 2SLS CI exclude the OLS estimate.
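Several of these findings, especially 4 and 6, come down to the sampling behavior of 2SLS when instruments are weak. A self-contained simulation (my own toy DGP, not Young's data) in which OLS is badly biased by endogeneity but 2SLS, with a weak instrument, is still worse in mean squared error:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_sims = 500, 500
beta = 1.0        # true effect
pi = 0.05         # first-stage coefficient: a very weak instrument
rho = 0.8         # corr(endogeneity error, outcome error)

ols_est, tsls_est = [], []
for _ in range(n_sims):
    z = rng.standard_normal(n)                       # instrument
    u = rng.standard_normal(n)                       # endogenous disturbance
    e = rho * u + np.sqrt(1 - rho**2) * rng.standard_normal(n)
    x = pi * z + u                                   # endogenous regressor
    y = beta * x + e

    ols_est.append((x @ y) / (x @ x))                # OLS (no intercept, mean-zero data)
    tsls_est.append((z @ y) / (z @ x))               # just-identified IV estimator

ols_est, tsls_est = np.array(ols_est), np.array(tsls_est)
print(f"OLS : mean {ols_est.mean():.2f}  MSE {np.mean((ols_est - beta) ** 2):.2f}")
print(f"2SLS: mean {tsls_est.mean():.2f}  MSE {np.mean((tsls_est - beta) ** 2):.2f}")
```

OLS converges tightly to the wrong answer (roughly beta + rho here), while the weak-instrument 2SLS estimates occasionally explode when the sample first stage is near zero, which is exactly the bias-versus-variance trade that makes "2SLS has considerably higher mean squared error than OLS" possible.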

Methods Matter: P-Hacking and Causal Inference in Economics*: http://ftp.iza.org/dp11796.pdf
Applying multiple methods to 13,440 hypothesis tests reported in 25 top economics journals in 2015, we show that selective publication and p-hacking is a substantial problem in research employing DID and (in particular) IV. RCT and RDD are much less problematic. Almost 25% of claims of marginally significant results in IV papers are misleading.

https://twitter.com/NoamJStein/status/1040887307568664577
Ever since I learned social science is completely fake, I've had a lot more time to do stuff that matters, like deadlifting and reading about Mediterranean haplogroups
--
Wait, so, from fakest to realest IV>DD>RCT>RDD? That totally matches my impression.
org:junk  org:edu  economics  econometrics  methodology  realness  truth  science  social-science  accuracy  generalization  essay  article  hmm  multi  study  🎩  empirical  causation  error  critique  sociology  criminology  hypothesis-testing  econotariat  broad-econ  cliometrics  endo-exo  replication  incentives  academia  measurement  wire-guided  intricacy  twitter  social  discussion  pseudoE  effect-size  reflection  field-study  stat-power  piketty  marginal-rev  commentary  data-science  expert-experience  regression  gotchas  rant  map-territory  pdf  simulation  moments  confidence  bias-variance  stats  endogenous-exogenous  control  meta:science  meta-analysis  outliers  summary  sampling  ensembles  monte-carlo  theory-practice  applicability-prereqs  chart  comparison  shift  ratty  unaffiliated 
june 2017 by nhaliday
The Limits of Public Choice Theory – Jacobite
Many people believe that politics is difficult because of incentives: voters vote for their self interest; bureaucrats deliberately don’t solve problems to enlarge their departments; and elected officials maximize votes for power and sell out to lobbyists. But this cynical view is mostly wrong—politics, insofar as it has problems, has problems not because people are selfish—it has problems because people have wrong ideas. In fact, people mostly act surprisingly altruistically, motivated by trying to do good for their country.

...

I got into politics and ideas as a libertarian. I was attracted by the idea of public choice as a universal theory of politics. It’s intuitively appealing, methodologically individualist, and it supported all of the things I already believed. And it’s definitely true to some extent—there is a huge amount of evidence that it affects things somewhat. But it’s terrible as a general theory of politics in the developed world. Our policies are bad because voters are ignorant and politicians believe in things too much, not because everyone is irredeemably cynical and atavistic.

interesting take, HBD?: https://twitter.com/pseudoerasmus/status/869882831572434946

recommended by Garett Jones:
https://web.archive.org/web/20110517015819/http://reviewsindepth.com/2010/03/yes-prime-minister-the-most-cunning-political-propaganda-ever-conceived/
https://en.wikipedia.org/wiki/The_Thick_of_It
org:popup  albion  wonkish  econotariat  rhetoric  essay  contrarianism  methodology  economics  micro  social-choice  elections  government  politics  polisci  incentives  altruism  social-norms  democracy  cynicism-idealism  optimism  antidemos  morality  near-far  ethics  map-territory  models  cooperate-defect  anthropology  coordination  multi  twitter  social  commentary  pseudoE  broad-econ  wealth-of-nations  rent-seeking  leviathan  pop-diff  gnon  political-econ  public-goodish  tv  review  garett-jones  backup  recommendations  microfoundations  wiki  britain  organizing  interests  applicability-prereqs  the-watchers  noblesse-oblige  n-factor  self-interest  cohesion  EGT  world  guilt-shame  alignment 
may 2017 by nhaliday
Lucio Russo - Wikipedia
In The Forgotten Revolution: How Science Was Born in 300 BC and Why It Had to Be Reborn (Italian: La rivoluzione dimenticata), Russo promotes the belief that Hellenistic science in the period 320-144 BC reached heights not achieved by Classical age science, and proposes that it went further than ordinarily thought, in multiple fields not normally associated with ancient science.

La Rivoluzione Dimenticata (The Forgotten Revolution), Reviewed by Sandro Graffi: http://www.ams.org/notices/199805/review-graffi.pdf

Before turning to the question of the decline of Hellenistic science, I come back to the new light shed by the book on Euclid’s Elements and on pre-Ptolemaic astronomy. Euclid’s definitions of the elementary geometric entities—point, straight line, plane—at the beginning of the Elements have long presented a problem.7 Their nature is in sharp contrast with the approach taken in the rest of the book, and continued by mathematicians ever since, of refraining from defining the fundamental entities explicitly but limiting themselves to postulating the properties which they enjoy. Why should Euclid be so hopelessly obscure right at the beginning and so smooth just after? The answer is: the definitions are not Euclid’s. Toward the beginning of the second century A.D. Heron of Alexandria found it convenient to introduce definitions of the elementary objects (a sign of decadence!) in his commentary on Euclid’s Elements, which had been written at least 400 years before. All manuscripts of the Elements copied ever since included Heron’s definitions without mention, whence their attribution to Euclid himself. The philological evidence leading to this conclusion is quite convincing.8

...

What about the general and steady (on the average) impoverishment of Hellenistic science under the Roman empire? This is a major historical problem, strongly tied to the even bigger one of the decline and fall of the antique civilization itself. I would summarize the author’s argument by saying that it basically represents an application to science of a widely accepted general theory on decadence of antique civilization going back to Max Weber. Roman society, mainly based on slave labor, underwent an ultimately unrecoverable crisis as the traditional sources of that labor force, essentially wars, progressively dried up. To save basic farming, the remaining slaves were promoted to be serfs, and poor free peasants reduced to serfdom, but this made trade disappear. A society in which production is almost entirely based on serfdom and with no trade clearly has very little need of culture, including science and technology. As Max Weber pointed out, when trade vanished, so did the marble splendor of the ancient towns, as well as the spiritual assets that went with it: art, literature, science, and sophisticated commercial laws. The recovery of Hellenistic science then had to wait until the disappearance of serfdom at the end of the Middle Ages. To quote Max Weber: “Only then with renewed vigor did the old giant rise up again.”

...

The epilogue contains the (rather pessimistic) views of the author on the future of science, threatened by the apparent triumph of today’s vogue of irrationality even in leading institutions (e.g., an astrology professorship at the Sorbonne). He looks at today’s ever-increasing tendency to teach science more on a fideistic than on a deductive or experimental basis as the first sign of a decline which could be analogous to the post-Hellenistic one.

Praising Alexandrians to excess: https://sci-hub.tw/10.1088/2058-7058/17/4/35
The Economic Record review: https://sci-hub.tw/10.1111/j.1475-4932.2004.00203.x

listed here: https://pinboard.in/u:nhaliday/b:c5c09f2687c1

Was Roman Science in Decline? (Excerpt from My New Book): https://www.richardcarrier.info/archives/13477
people  trivia  cocktail  history  iron-age  mediterranean  the-classics  speculation  west-hunter  scitariat  knowledge  wiki  ideas  wild-ideas  technology  innovation  contrarianism  multi  pdf  org:mat  books  review  critique  regularizer  todo  piracy  physics  canon  science  the-trenches  the-great-west-whale  broad-econ  the-world-is-just-atoms  frontier  speedometer  🔬  conquest-empire  giants  economics  article  growth-econ  cjones-like  industrial-revolution  empirical  absolute-relative  truth  rot  zeitgeist  gibbon  big-peeps  civilization  malthus  roots  old-anglo  britain  early-modern  medieval  social-structure  limits  quantitative-qualitative  rigor  lens  systematic-ad-hoc  analytical-holistic  cycles  space  mechanics  math  geometry  gravity  revolution  novelty  meta:science  is-ought  flexibility  trends  reason  applicability-prereqs  theory-practice  traces  evidence 
may 2017 by nhaliday
Typos | West Hunter
In a simple model, a given mutant has an equilibrium frequency μ/s, where μ is the mutation rate from good to bad alleles and s is the size of the selective disadvantage. To estimate the total impact of mutation at that locus, you multiply the frequency by the expected harm, s, which means that the fitness decrease (from effects at that locus) is just μ, the mutation rate. If we assume that these fitness effects are multiplicative, the total fitness decrease (also called ‘mutational load’) is approximately 1 − exp(−U), where U = Σ2μ is the total number of new harmful mutations per diploid individual.
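The mutation–selection balance arithmetic above can be checked numerically; the parameter values here are illustrative, not taken from the post:

```python
import math

# Mutation-selection balance at a single locus (illustrative values)
mu = 1e-6   # mutation rate from good to bad alleles
s = 0.01    # selective disadvantage of the bad allele

# Equilibrium frequency of the deleterious allele
freq = mu / s

# Fitness decrease from this locus: frequency x harm = mu, independent of s
load_per_locus = freq * s

# With multiplicative effects across loci, total load ~ 1 - exp(-U),
# where U = sum of 2*mu over loci = new harmful mutations per diploid
n_loci = 1_000_000
U = n_loci * 2 * mu                  # here U = 2
total_load = 1 - math.exp(-U)        # ~0.865
print(freq, load_per_locus, round(total_load, 3))
```

Note the cancellation: per-locus load equals μ regardless of how harmful the allele is, because milder mutations reach proportionally higher equilibrium frequencies.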

https://westhunt.wordpress.com/2012/10/17/more-to-go-wrong/

https://westhunt.wordpress.com/2012/07/13/sanctuary/
interesting, suggestive comment on Africa:
https://westhunt.wordpress.com/2012/07/13/sanctuary/#comment-3671
https://westhunt.wordpress.com/2012/07/14/too-darn-hot/
http://infoproc.blogspot.com/2012/07/rare-variants-and-human-genetic.html
https://westhunt.wordpress.com/2012/07/18/changes-in-attitudes/
https://westhunt.wordpress.com/2012/08/24/men-and-macaques/
I have reason to believe that few people understand genetic load very well, probably for self-referential reasons, but better explanations are possible.

One key point is that the amount of neutral variation is determined by the long-term mutational rate and population history, while the amount of deleterious variation [genetic load] is set by the selective pressures and the prevailing mutation rate over a much shorter time scale. For example, if you consider the class of mutations that reduce fitness by 1%, what matters is the past few thousand years, not the past few tens or hundreds of thousands of years.

...

So, assuming that African populations have more neutral variation than non-African populations (which is well-established), what do we expect to see when we compare the levels of probably-damaging mutations in those two populations? If the Africans and non-Africans had experienced essentially similar mutation rates and selective pressures over the past few thousand years, we would expect to see the same levels of probably-damaging mutations. Bottlenecks that happened at the last glacial maximum or in the expansion out of Africa are irrelevant – too long ago to matter.

But we don’t. The amount of rare synonymous stuff is about 22% higher in Africans. The amount of rare nonsynonymous stuff (usually at least slightly deleterious) is 20.6% higher. The number of rare variants predicted to be more deleterious is ~21.6% higher. The amount of stuff predicted to be even more deleterious is ~27% higher. The number of harmful looking loss-of-function mutations (yet more deleterious) is 25% higher.

It looks as if the excess grows as the severity of the mutations increases. There is a scenario in which this is possible: the mutation rate in Africa has increased recently. Not yesterday, but, say, over the past few thousand years.

...

What is the most likely cause of such variations in the mutation rate? Right now, I’d say differences in average paternal age. We know that modest differences (~5 years) in average paternal age can easily generate ~20% differences in the mutation rate. Such between-population differences in mutation rates seem quite plausible, particularly since the Neolithic.
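The paternal-age arithmetic can be sketched with a crude linear model; the slope (roughly two extra de novo mutations per year of paternal age) is a commonly cited ballpark, and the intercept is made up for illustration, not taken from the post:

```python
# Crude linear sketch of the paternal-age effect on the mutation rate
# (illustrative parameters; not a fitted model)
def denovo_mutations(paternal_age):
    # ~2 extra de novo mutations per year of father's age, plus a baseline
    return 5.0 + 2.0 * paternal_age

young, old = denovo_mutations(25), denovo_mutations(30)
ratio = old / young   # effect of ~5 extra years of average paternal age
print(round(100 * (ratio - 1)))  # prints 18, i.e. ~20% more new mutations
```

So a modest between-population difference in average paternal age, sustained over many generations, plausibly shifts the per-generation mutation rate by the ~20% the post discusses.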
https://westhunt.wordpress.com/2016/04/10/bugs-versus-drift/
more recent: https://westhunt.wordpress.com/2017/06/06/happy-families-are-all-alike-every-unhappy-family-is-unhappy-in-its-own-way/#comment-92491
Probably not, but the question is complex: depends on the shape of the deleterious mutational spectrum [which we don’t know], ancient and recent demography, paternal age, and the extent of truncation selection in the population.
west-hunter  scitariat  discussion  bio  sapiens  biodet  evolution  mutation  genetics  genetic-load  population-genetics  nibble  stylized-facts  methodology  models  equilibrium  iq  neuro  neuro-nitgrit  epidemiology  selection  malthus  temperature  enhancement  CRISPR  genomics  behavioral-gen  multi  poast  africa  roots  pop-diff  ideas  gedanken  paternal-age  🌞  environment  speculation  gene-drift  longevity  immune  disease  parasites-microbiome  scifi-fantasy  europe  asia  race  migration  hsu  study  summary  commentary  shift  the-great-west-whale  nordic  intelligence  eden  long-short-run  debate  hmm  idk  explanans  comparison  structure  occident  mediterranean  geography  within-group  correlation  direction  volo-avolo  demographics  age-generation  measurement  data  applicability-prereqs  aging 
may 2017 by nhaliday
In a handbasket | West Hunter
It strikes me that in many ways, life was gradually getting harder in the Old World, especially in the cradles of civilization.

slavery and Rome/early US: https://westhunt.wordpress.com/2016/06/17/in-a-handbasket/#comment-80503
Rome and innovation: https://westhunt.wordpress.com/2016/06/17/in-a-handbasket/#comment-80505
"Cultures have flavors and the Roman flavor was unfavorable to being clever. The Greeks were clever but not interested in utility. While the central American civilizations liked to cut people’s hearts out and stick cactus spines through their penis in public. Let us all act according to national customs."
https://twitter.com/Evolving_Moloch/status/881652804900671489
https://en.wikipedia.org/wiki/Bloodletting_in_Mesoamerica

https://westhunt.wordpress.com/2014/07/05/let-no-new-thing-arise/
It helps to think about critical community size (CCS). Consider a disease like measles, one that doesn’t last long and confers lifelong immunity. The virus needs fresh, never-infected hosts (we call them children) all the time, else it will go extinct. The critical community size for measles is probably more than half a million – which means that before agriculture, measles as we know it today couldn’t and didn’t exist. In fact, it looks as if it split off from rinderpest within the last two thousand years. Mumps was around in Classical times (Hippocrates gives a good description), but it too has a large CCS and must be relatively new. Rubella can’t be ancient. Whooping cough has a smaller CCS, maybe only 100,000, but it too must postdate agriculture.
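The CCS logic above can be sketched with a back-of-envelope rule; every number below (birth rate, trough depth, fade-out cutoff) is illustrative, not a calibrated measles model:

```python
# Back-of-envelope critical community size (all parameters illustrative):
# at endemic equilibrium nearly every child is eventually infected, so
# long-run incidence tracks the birth rate; the transmission chain breaks
# when epidemic troughs dip to a handful of cases.
def endemic_weekly_cases(pop_size, birth_rate=0.02):
    return birth_rate * pop_size / 52

def likely_persists(pop_size, trough_factor=10, cutoff=5):
    # assume troughs run ~10x below the mean; fewer than ~5 cases in a
    # trough means stochastic fade-out is likely
    return endemic_weekly_cases(pop_size) / trough_factor >= cutoff

print(likely_persists(100_000), likely_persists(600_000))  # False True
```

The qualitative point survives the crude parameters: incidence scales with population size, so below some threshold the pathogen simply runs out of never-infected hosts between epidemics.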

"let no new thing arise":
http://www.theseeker.org/cgi-bin/bulletin/show.pl?Todd%20Collier/Que%20no%20hayan%20novedades.
http://itre.cis.upenn.edu/~myl/languagelog/archives/003347.html
http://www.bradwarthen.com/2010/02/que-no-haya-novedad-may-no-new-thing-arise/

https://westhunt.wordpress.com/2013/07/03/legionnaires-disease/
Before 1900, armies usually lost more men from infectious disease than combat, particularly in extended campaigns.  At least that seems to have been the case in modern Western history.

There are indications that infectious disease was qualitatively different – less important – in the Roman legions. For one thing, camps were placed near good supplies of fresh water. The legions had good camp sanitation, at least by the time of the Principate. They used latrines flushed with running water in permanent camps and deep slit trenches with wooden covers and removable buckets in the field. Using those latrines would have protected soldiers from diseases like typhoid and dysentery, major killers in recent armies. Roman armies were mobile, often shifting their camps. They seldom quartered their soldiers in urban areas – they feared that city luxuries would corrupt their men, but this habit helped them avoid infectious agents, regardless of their reasons.

They managed to avoid a lot of serious illnesses because the causative organisms  simply weren’t there yet. Smallpox, and maybe measles, didn’t show up until the middle Empire. Falciparum malaria was around, but hadn’t reached Rome itself, during the Republic. It definitely had by the time of the Empire. Bubonic plague doesn’t seem to have caused trouble before Justinian.  Syphilis for sure, and typhus probably,  originated in the Americas, while cholera didn’t arrive until after 1800.
west-hunter  scitariat  history  iron-age  medieval  early-modern  discussion  europe  civilization  technology  innovation  agriculture  energy-resources  disease  parasites-microbiome  recent-selection  lived-experience  multi  mediterranean  the-classics  economics  usa  age-of-discovery  poast  aphorism  latin-america  farmers-and-foragers  cultural-dynamics  social-norms  culture  wealth-of-nations  twitter  social  commentary  quotes  anthropology  nihil  martial  nietzschean  embodied  ritual  wiki  reference  ethnography  flux-stasis  language  jargon  foreign-lang  population  density  speculation  ideas  war  meta:war  military  red-queen  strategy  epidemiology  public-health  trends  zeitgeist  archaeology  novelty  spreading  cost-benefit  conquest-empire  malthus  pre-ww2  the-south  applicability-prereqs 
april 2017 by nhaliday
How Universal Is the Big Five? Testing the Five-Factor Model of Personality Variation Among Forager–Farmers in the Bolivian Amazon
We failed to find robust support for the FFM, based on tests of (a) internal consistency of items expected to segregate into the Big Five factors, (b) response stability of the Big Five, (c) external validity of the Big Five with respect to observed behavior, (d) factor structure according to exploratory and confirmatory factor analysis, and (e) similarity with a U.S. target structure based on Procrustes rotation analysis.

...

We argue that Tsimane personality variation displays 2 principal factors that may reflect socioecological characteristics common to small-scale societies. We offer evolutionary perspectives on why the structure of personality variation may not be invariant across human societies.
pdf  study  psychology  cog-psych  society  embedded-cognition  personality  metrics  generalization  methodology  farmers-and-foragers  latin-america  context  homo-hetero  info-dynamics  water  psychometrics  exploratory  things  phalanges  dimensionality  anthropology  universalism-particularism  applicability-prereqs 
february 2017 by nhaliday
The infinitesimal model | bioRxiv
Our focus here is on the infinitesimal model. In this model, one or several quantitative traits are described as the sum of a genetic and a non-genetic component, the first being distributed as a normal random variable centred at the average of the parental genetic components, and with a variance independent of the parental traits. We first review the long history of the infinitesimal model in quantitative genetics. Then we provide a definition of the model at the phenotypic level in terms of individual trait values and relationships between individuals, but including different evolutionary processes: genetic drift, recombination, selection, mutation, population structure, ... We give a range of examples of its application to evolutionary questions related to stabilising selection, assortative mating, effective population size and response to selection, habitat preference and speciation. We provide a mathematical justification of the model as the limit as the number M of underlying loci tends to infinity of a model with Mendelian inheritance, mutation and environmental noise, when the genetic component of the trait is purely additive. We also show how the model generalises to include epistatic effects. In each case, by conditioning on the pedigree relating individuals in the population, we incorporate arbitrary selection and population structure. We suppose that we can observe the pedigree up to the present generation, together with all the ancestral traits, and we show, in particular, that the genetic components of the individual trait values in the current generation are indeed normally distributed with a variance independent of ancestral traits, up to an error of order M^{-1/2}. Simulations suggest that in particular cases the convergence may be as fast as 1/M.

published version:
The infinitesimal model: Definition, derivation, and implications: https://sci-hub.tw/10.1016/j.tpb.2017.06.001

Commentary: Fisher’s infinitesimal model: A story for the ages: http://www.sciencedirect.com/science/article/pii/S0040580917301508?via%3Dihub
This commentary distinguishes three nested approximations, referred to as “infinitesimal genetics,” “Gaussian descendants” and “Gaussian population,” each plausibly called “the infinitesimal model.” The first and most basic is Fisher’s “infinitesimal” approximation of the underlying genetics – namely, many loci, each making a small contribution to the total variance. As Barton et al. (2017) show, in the limit as the number of loci increases (with enough additivity), the distribution of genotypic values for descendants approaches a multivariate Gaussian, whose variance–covariance structure depends only on the relatedness, not the phenotypes, of the parents (or whether their population experiences selection or other processes such as mutation and migration). Barton et al. (2017) call this rigorously defensible “Gaussian descendants” approximation “the infinitesimal model.” However, it is widely assumed that Fisher’s genetic assumptions yield another Gaussian approximation, in which the distribution of breeding values in a population follows a Gaussian — even if the population is subject to non-Gaussian selection. This third “Gaussian population” approximation, is also described as the “infinitesimal model.” Unlike the “Gaussian descendants” approximation, this third approximation cannot be rigorously justified, except in a weak-selection limit, even for a purely additive model. Nevertheless, it underlies the two most widely used descriptions of selection-induced changes in trait means and genetic variances, the “breeder’s equation” and the “Bulmer effect.” Future generations may understand why the “infinitesimal model” provides such useful approximations in the face of epistasis, linkage, linkage disequilibrium and strong selection.
study  exposition  bio  evolution  population-genetics  genetics  methodology  QTL  preprint  models  unit  len:long  nibble  linearity  nonlinearity  concentration-of-measure  limits  applications  🌞  biodet  oscillation  fisher  perturbation  stylized-facts  chart  ideas  article  pop-structure  multi  pdf  piracy  intricacy  map-territory  kinship  distribution  simulation  ground-up  linear-models  applicability-prereqs  bioinformatics 
january 2017 by nhaliday
Information Processing: Evidence for (very) recent natural selection in humans
height (+), infant head circumference (+), some biomolecular stuff, female hip size (+), male BMI (-), age of menarche (+, !!), and birth weight (+)

Strong selection in the recent past can cause allele frequencies to change significantly. Consider two different SNPs, which today have equal minor allele frequency (for simplicity, let this be equal to one half). Assume that one SNP was subject to strong recent selection, and another (neutral) has had approximately zero effect on fitness. The advantageous version of the first SNP was less common in the far past, and rose in frequency recently (e.g., over the last 2k years). In contrast, the two versions of the neutral SNP have been present in roughly the same proportion (up to fluctuations) for a long time. Consequently, in the total past breeding population (i.e., going back tens of thousands of years) there have been many more copies of the neutral alleles (and the chunks of DNA surrounding them) than of the positively selected allele. Each of the chunks of DNA around the SNPs we are considering is subject to a roughly constant rate of mutation.

Looking at the current population, one would then expect a larger variety of mutations in the DNA region surrounding the neutral allele (both versions) than near the favored selected allele (which was rarer in the population until very recently, and whose surrounding region had fewer chances to accumulate mutations). By comparing the difference in local mutational diversity between the two versions of the neutral allele (should be zero modulo fluctuations, for the case MAF = 0.5), and between the (+) and (-) versions of the selected allele (nonzero, due to relative change in frequency), one obtains a sensitive signal for recent selection. See figure at bottom for more detail. In the paper what I call mutational diversity is measured by looking at distance distribution of singletons, which are rare variants found in only one individual in the sample under study.
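The singleton-distance signal described above can be illustrated with a toy calculation; the positions below are made up, and this is only the intuition behind the statistic, not the published SDS implementation:

```python
# Toy version of the singleton-distance signal: a haplotype carrying a
# recently favored allele is "young," so singleton mutations near the test
# SNP are sparse, and the distance to the nearest singleton on each side
# (the summary statistic described above) is large.
def singleton_span(test_pos, singleton_positions, chrom_len):
    left = max((p for p in singleton_positions if p < test_pos), default=0)
    right = min((p for p in singleton_positions if p > test_pos),
                default=chrom_len)
    return (test_pos - left) + (right - test_pos)

# favored haplotype: few nearby singletons; neutral haplotype: many
favored = singleton_span(5_000, [1_000, 9_500], 10_000)
neutral = singleton_span(5_000, [4_200, 4_700, 5_600, 6_100], 10_000)
print(favored, neutral)  # 8500 900 -> larger span suggests recent selection
```

Averaged over many haplotypes, the contrast between spans around the favored and disfavored versions of an allele is what gives a sensitive signal of selection on the ~100-generation timescale.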

The 2,000 year selection of the British: http://www.unz.com/gnxp/the-2000-year-selection-of-the-british/

Detection of human adaptation during the past 2,000 years: http://www.biorxiv.org/content/early/2016/05/07/052084

The key idea is that recent selection distorts the ancestral genealogy of sampled haplotypes at a selected site. In particular, the terminal (tip) branches of the genealogy tend to be shorter for the favored allele than for the disfavored allele, and hence, haplotypes carrying the favored allele will tend to carry fewer singleton mutations (Fig. 1A-C and SOM).

To capture this effect, we use the sum of distances to the nearest singleton in each direction from a test SNP as a summary statistic (Fig. 1D).

Figure 1. Illustration of the SDS method.

Figure 2. Properties of SDS.

Based on a recent model of European demography [25], we estimate that the mean tip length for a neutral sample of 3,000 individuals is 75 generations, or roughly 2,000 years (Fig. 2A). Since SDS aims to measure changes in tip lengths of the genealogy, we conjectured that it would be most likely to detect selection approximately within this timeframe.

Indeed, in simulated sweep models with samples of 3,000 individuals (Fig. 2B,C and fig. S2), we find that SDS focuses specifically on very recent time scales, and has equal power for hard and soft sweeps within this timeframe. At individual loci, SDS is powered to detect ~2% selection over 100 generations. Moreover, SDS has essentially no power to detect older selection events that stopped >100 generations before the present. In contrast, a commonly-used test for hard sweeps, iHS [12], integrates signal over much longer timescales (>1,000 generations), has no specificity to the more recent history, and has essentially no power for the soft sweep scenarios.

Catching evolution in the act with the Singleton Density Score: http://www.molecularecologist.com/2016/05/catching-evolution-in-the-act-with-the-singleton-density-score/
The Singleton Density Score (SDS) is a measure based on the idea that changes in allele frequencies induced by recent selection can be observed in a sample’s genealogy as differences in the branch length distribution.

You don’t need a weatherman: https://westhunt.wordpress.com/2016/05/08/you-dont-need-a-weatherman/
You can do a million cool things with this method. Since the effective time scale goes inversely with sample size, you could look at evolution in England over the past 1000 years or the past 500. Differencing, over the period 1-1000 AD. Since you can look at polygenic traits, you can see whether the alleles favoring higher IQs have increased or decreased in frequency over various stretches of time. You can see if Greg Clark’s proposed mechanism really happened. You can (soon) tell if creeping Pinkerization is genetic, or partly genetic.

You could probably find out if the Middle Easterners really have gotten slower, and when it happened.

Looking at IQ alleles, you could not only show whether the Ashkenazi Jews really are biologically smarter but if so, when it happened, which would give you strong hints as to how it happened.

We know that IQ-favoring alleles are going down (slowly) right now (not counting immigration, which of course drastically speeds it up). Soon we will know if this was true while Russia was under the Mongol yoke – we’ll know how smart Periclean Athenians were and when that boost occurred. And so on. And on!

...

“The pace has been so rapid that humans have changed significantly in body and mind over recorded history.”

bicameral mind: https://westhunt.wordpress.com/2016/05/08/you-dont-need-a-weatherman/#comment-78934

https://westhunt.wordpress.com/2016/05/08/you-dont-need-a-weatherman/#comment-78939
Chinese, Koreans, Japanese and Ashkenazi Jews all have high levels of myopia. Australian Aborigines have almost none, I think.

https://westhunt.wordpress.com/2016/05/08/you-dont-need-a-weatherman/#comment-79094
I expect that the fall of all great empires is based on long term dysgenic trends. There is no logical reason why so many empires and civilizations throughout history could grow so big and then not simply keep growing, except for dysgenics.
--
I can think of about twenty other possible explanations off the top of my head, but dysgenics is a possible cause.
--
I agree with DataExplorer. The largest factor in the decay of civilizations is dysgenics. The discussion by R. A. Fisher 1930 p. 193 is very cogent on this matter. Soon we will know for sure.
--
Sometimes it can be rapid. Assume that the upper classes are mostly urban, and somewhat sharper than average. Then the Mongols arrive.
sapiens  study  genetics  evolution  hsu  trends  data  visualization  recent-selection  methodology  summary  GWAS  2016  scitariat  britain  commentary  embodied  biodet  todo  control  multi  gnxp  pop-diff  stat-power  mutation  hypothesis-testing  stats  age-generation  QTL  gene-drift  comparison  marginal  aDNA  simulation  trees  time  metrics  density  measurement  conquest-empire  pinker  population-genetics  aphorism  simler  dennett  👽  the-classics  iron-age  mediterranean  volo-avolo  alien-character  russia  medieval  spearhead  gregory-clark  bio  preprint  domestication  MENA  iq  islam  history  poast  west-hunter  scale  behavioral-gen  gotchas  cost-benefit  genomics  bioinformatics  stylized-facts  concept  levers  🌞  pop-structure  nibble  explanation  ideas  usa  dysgenics  list  applicability-prereqs  cohesion  judaism  visuo  correlation  china  asia  japan  korea  civilization  gibbon  rot  roots  fisher  giants  books  old-anglo  selection  agri-mindset  hari-seldon 
august 2016 by nhaliday
