
Advantages and disadvantages of building a single page web application - Software Engineering Stack Exchange
Advantages
- All data has to be available via some sort of API - this is a big advantage for my use case as I want an API to my application anyway. Right now about 60-70% of my calls to get/update data go through a REST API. Building a single-page application will let me test my REST API better, since the application itself will use it. It also means that as the application grows, the API grows with it, since the API is what the application uses; there is no need to maintain the API as an add-on to the application.
- More responsive application - since the data loaded after the initial page is kept to a minimum and transmitted in a compact format (like JSON), data requests should generally be faster, and the server does slightly less processing.

Disadvantages
- Duplication of code - for example, model code. I am going to have to create models on both the server side (PHP in this case) and the client side (JavaScript).
- Business logic in JavaScript - I can't give concrete examples of why this would be bad, but it just doesn't feel right to me to have business logic in JavaScript that anyone can read.
- JavaScript memory leaks - since the page never reloads, JavaScript memory leaks can happen, and I would not even know where to begin to debug them.

--

Disadvantages I often see with Single Page Web Applications:
- Inability to link to a specific part of the site; there's often only one entry point.
- Dysfunctional back and forward buttons.
- The use of tabs is limited or non-existent.
(especially on mobile:)
- Take very long to load.
- Don't function at all.
- Can't reload a page; a sudden loss of network takes you back to the start of the site.

This answer is outdated; most single-page application frameworks have a way to deal with the issues above – Luis May 27 '14 at 1:41
@Luis while the technology is there, too often it isn't used. – Pieter B Jun 12 '14 at 6:53

https://softwareengineering.stackexchange.com/questions/201838/building-a-web-application-that-is-almost-completely-rendered-by-javascript-whi

https://softwareengineering.stackexchange.com/questions/143194/what-advantages-are-conferred-by-using-server-side-page-rendering
Server-side HTML rendering:
- Fastest browser rendering
- Page caching is possible as a quick-and-dirty performance boost
- For "standard" apps, many UI features are pre-built
- Sometimes considered more stable because components are usually subject to compile-time validation
- Leans on backend expertise
- Sometimes faster to develop*
*When UI requirements fit the framework well.

Client-side HTML rendering:
- Lower bandwidth usage
- Slower initial page render. May not even be noticeable in modern desktop browsers. If you need to support IE6-7, or many mobile browsers (mobile webkit is not bad) you may encounter bottlenecks.
- Building API-first means the client can just as easily be a proprietary app, thin client, another web service, etc.
- Leans on JS expertise
- Sometimes faster to develop**
**When the UI is largely custom, with more interesting interactions. Also, I find coding in the browser with interpreted code noticeably speedier than waiting for compiles and server restarts.

https://softwareengineering.stackexchange.com/questions/237537/progressive-enhancement-vs-single-page-apps

https://stackoverflow.com/questions/21862054/single-page-application-advantages-and-disadvantages
=== ADVANTAGES ===
1. SPAs are extremely good for very responsive sites.
2. With a SPA we don't need extra requests to the server to download pages.
3. Maybe there are other advantages? I haven't heard of any others.

=== DISADVANTAGES ===
1. Client must enable JavaScript.
2. Only one entry point to the site.
3. Security.

https://softwareengineering.stackexchange.com/questions/287819/should-you-write-your-back-end-as-an-api
focused on .NET

https://softwareengineering.stackexchange.com/questions/337467/is-it-normal-design-to-completely-decouple-backend-and-frontend-web-applications
A SPA comes with a few issues associated with it. Here are just a few that pop in my mind now:
- it's mostly JavaScript. One error in one section of your application might prevent other sections from working.
- CORS.
- SEO.
- separate front-end application means separate projects, deployment pipelines, extra tooling, etc;
- security is harder to do when all the code is on the client;

And some advantages:
- the user interacts entirely with the front-end, with data loaded from the server only as needed, so better responsiveness and user experience;
- depending on the application, some processing done on the client means you spare the server those computations;
- more flexibility in evolving the back-end and front-end (you can do them separately);
- if your back-end is essentially an API, you can have other clients in front of it, like native Android/iPhone applications;
- the separation might make it easier for front-end developers to do CSS/HTML without needing to have a server application running on their machine.

Create your own dysfunctional single-page app: https://news.ycombinator.com/item?id=18341993
I think there are three broadly assumed user benefits of single-page apps:
1. Improved user experience.
2. Improved perceived performance.
3. It’s still the web.

5 mistakes to create a dysfunctional single-page app
Mistake 1: Under-estimate long-term development and maintenance costs
Mistake 2: Use the single-page app approach unilaterally
Mistake 3: Under-invest in front end capability
Mistake 4: Use naïve dev practices
Mistake 5: Surf the waves of framework hype

The disadvantages of single page applications: https://news.ycombinator.com/item?id=9879685
You probably don't need a single-page app: https://news.ycombinator.com/item?id=19184496
https://news.ycombinator.com/item?id=20384738
MPA advantages:
- Stateless requests
- The browser knows how to deal with a traditional architecture
- Fewer, more mature tools
- SEO for free

When to go for the single page app:
- Core functionality is real-time (e.g. Slack)
- Rich UI interactions are core to the product (e.g. Trello)
- Lots of state shared between screens (e.g. Spotify)

Hybrid solutions
...
GitHub uses this hybrid approach.
...

Ask HN: Is it ok to use traditional server-side rendering these days?: https://news.ycombinator.com/item?id=13212465

https://www.reddit.com/r/webdev/comments/cp9vb8/are_people_still_doing_ssr/
https://www.reddit.com/r/webdev/comments/93n60h/best_javascript_modern_approach_to_multi_page/
https://www.reddit.com/r/webdev/comments/aax4k5/do_you_develop_solely_using_spa_these_days/
The SEO issue with SPAs is a persistent concern you hear about a lot, yet nobody ever quantifies it. That is because search engines keep the operation of their crawler bots and indexing secret. I have read into it some, and it seems the problem used to exist, somewhat, but is more or less gone now. Bots can deal with SPAs fine.
--
I try to avoid building a SPA nowadays if possible. Not because of SEO (there are now server-side solutions to help with that), but because a SPA increases the complexity of the code base by an order of magnitude. State management with Redux... Async this and that... URL routing... And don't forget to manage page history.

How about just render pages with templates and be done?

If I need a highly dynamic UI for a particular feature, then I'd probably build an embeddable JS widget for it.
q-n-a  stackex  programming  engineering  tradeoffs  system-design  design  web  frontend  javascript  cost-benefit  analysis  security  state  performance  traces  measurement  intricacy  code-organizing  applicability-prereqs  multi  comparison  smoothness  shift  critique  techtariat  chart  ui  coupling-cohesion  interface-compatibility  hn  commentary  best-practices  discussion  trends  client-server  api  composition-decomposition  cycles  frameworks  ecosystem  degrees-of-freedom  dotnet  working-stiff  reddit  social 
23 days ago by nhaliday
Ask HN: Favorite note-taking software? | Hacker News
Ask HN: What is your ideal note-taking software and/or hardware?: https://news.ycombinator.com/item?id=13221158

my wishlist as of 2019:
- web + desktop macOS + mobile iOS (at least viewing on the last but ideally also editing)
- sync across all those
- open-source data format that's easy to manipulate for scripting purposes
- flexible organization: mostly tree hierarchical (subsuming linear/unorganized) but with the option for directed (acyclic) graph (possibly a second layer of structure/linking)
- can store plain text, LaTeX, diagrams, and raster/vector images (video prob not necessary except as links to elsewhere)
- full-text search
- somehow digest/import data from Pinboard, Workflowy, Papers 3/Bookends, and Skim, ideally absorbing most of their functionality
- so, eg, track notes/annotations side-by-side w/ original PDF/DjVu/ePub documents (to replace Papers3/Bookends/Skim), and maybe web pages too (to replace Pinboard)
- OCR of handwritten notes (how to handle equations/diagrams?)
- various forms of NLP analysis of everything (topic models, clustering, etc)
- maybe version control (less important than export)

candidates?:
- Evernote prob ruled out due to heavy use of proprietary data formats (unless I can find some way to export with tolerably clean output)
- Workflowy/Dynalist are good but only cover a subset of functionality I want
- org-mode doesn't interact w/ mobile well (and I haven't evaluated it in detail otherwise)
- TiddlyWiki/Zim are in the running, but not sure about mobile
- idk about vimwiki but I'm not that wedded to vim and it seems less widely used than org-mode/TiddlyWiki/Zim so prob pass on that
- Quiver/Joplin/Inkdrop look similar and cover a lot of bases, TODO: evaluate more
- Trilium looks especially promising, tho mobile is read-only, and for macOS desktop see this: https://github.com/zadam/trilium/issues/511
- RocketBook is interesting scanning/OCR solution but prob not sufficient due to proprietary data format
- TODO: many more candidates, eg, TreeSheets, Gingko, OneNote (macOS?...), Notion (proprietary data format...), Zotero, Nodebook (https://nodebook.io/landing), Polar (https://getpolarized.io), Roam (looks very promising)

Ask HN: What do you use for your personal note taking activity?: https://news.ycombinator.com/item?id=15736102

Ask HN: What are your note-taking techniques?: https://news.ycombinator.com/item?id=9976751

Ask HN: How do you take notes (useful note-taking strategies)?: https://news.ycombinator.com/item?id=13064215

Ask HN: How to get better at taking notes?: https://news.ycombinator.com/item?id=21419478

Ask HN: How did you build up your personal knowledge base?: https://news.ycombinator.com/item?id=21332957
nice comment from math guy on structure and difference between math and CS: https://news.ycombinator.com/item?id=21338628
useful comment collating related discussions: https://news.ycombinator.com/item?id=21333383
highlights:
Designing a Personal Knowledge base: https://news.ycombinator.com/item?id=8270759
Ask HN: How to organize personal knowledge?: https://news.ycombinator.com/item?id=17892731
Do you use a personal 'knowledge base'?: https://news.ycombinator.com/item?id=21108527
Ask HN: How do you share/organize knowledge at work and life?: https://news.ycombinator.com/item?id=21310030

other stuff:
https://www.getdnote.com/blog/how-i-built-personal-knowledge-base-for-myself/
Tiago Forte: https://www.buildingasecondbrain.com

hn search: https://hn.algolia.com/?query=notetaking&type=story

Slant comparison commentary: https://news.ycombinator.com/item?id=7011281

good comparison of options here in comments here (and Trilium itself looks good): https://news.ycombinator.com/item?id=18840990

https://en.wikipedia.org/wiki/Comparison_of_note-taking_software

wikis:
https://www.slant.co/versus/5116/8768/~tiddlywiki_vs_zim
https://www.wikimatrix.org/compare/tiddlywiki+zim
http://tiddlymap.org/
https://www.zim-wiki.org/manual/Plugins/BackLinks_Pane.html
https://zim-wiki.org/manual/Plugins/Link_Map.html

apps:
Roam: https://news.ycombinator.com/item?id=21440289

Inkdrop: https://news.ycombinator.com/item?id=20103589

Joplin: https://news.ycombinator.com/item?id=15815040

Frame: https://news.ycombinator.com/item?id=18760079

https://www.reddit.com/r/TheMotte/comments/cb18sy/anyone_use_a_personal_wiki_software_to_catalog/
Notion: https://news.ycombinator.com/item?id=18904648

Anki:
https://www.reddit.com/r/Anki/comments/as8i4t/use_anki_for_technical_books/
https://www.freecodecamp.org/news/how-anki-saved-my-engineering-career-293a90f70a73/
hn  discussion  recommendations  software  tools  desktop  app  notetaking  exocortex  wkfly  wiki  productivity  multi  comparison  crosstab  properties  applicability-prereqs  nlp  info-foraging  chart  webapp  reference  q-n-a  retention  workflow  reddit  social  ratty  ssc  learning  studying  commentary  structure  thinking  network-structure  things  collaboration  ocr  trees  graphs  LaTeX  search  todo  project  money-for-time  synchrony  pinboard  state  duplication  worrydream  simplification-normalization  links  minimalism  design  neurons  ai-control  openai  miri-cfar 
4 weeks ago by nhaliday
c++ - mmap() vs. reading blocks - Stack Overflow
The discussion of mmap/read reminds me of two other performance discussions:

Some Java programmers were shocked to discover that nonblocking I/O is often slower than blocking I/O, which made perfect sense if you know that nonblocking I/O requires making more syscalls.

Some other network programmers were shocked to learn that epoll is often slower than poll, which makes perfect sense if you know that managing epoll requires making more syscalls.

Conclusion: Use memory maps if you access data randomly, keep it around for a long time, or if you know you can share it with other processes (MAP_SHARED isn't very interesting if there is no actual sharing). Read files normally if you access data sequentially or discard it after reading. And if either method makes your program less complex, do that. For many real-world cases there's no sure way to show one is faster without testing your actual application and NOT a benchmark.
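A minimal sketch of the two access patterns being compared (my illustration, not from the answer; POSIX calls, error handling omitted):

```cpp
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>
#include <cstdint>
#include <vector>

// Sequential scan: plain read() into a reusable buffer is usually fine,
// since the kernel readahead keeps the disk busy and the copy is streamed.
uint64_t sum_sequential(const char* path) {
    int fd = open(path, O_RDONLY);
    std::vector<unsigned char> buf(1 << 20);   // 1 MiB chunks
    uint64_t sum = 0;
    ssize_t n;
    while ((n = read(fd, buf.data(), buf.size())) > 0)
        for (ssize_t i = 0; i < n; i++) sum += buf[i];
    close(fd);
    return sum;
}

// Random access: mmap avoids a syscall plus a copy per touched location;
// pages fault in on demand and stay cached across accesses.
uint64_t sum_random(const char* path, const std::vector<size_t>& offsets) {
    int fd = open(path, O_RDONLY);
    struct stat st;
    fstat(fd, &st);
    auto* p = static_cast<unsigned char*>(
        mmap(nullptr, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0));
    uint64_t sum = 0;
    for (size_t off : offsets) sum += p[off];
    munmap(p, st.st_size);
    close(fd);
    return sum;
}
```

As the answer says, the only trustworthy comparison is timing these against your actual workload, not a synthetic benchmark.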
q-n-a  stackex  programming  systems  performance  tradeoffs  objektbuch  stylized-facts  input-output  caching  computer-memory  sequential  applicability-prereqs 
july 2019 by nhaliday
data structures - Why are Red-Black trees so popular? - Computer Science Stack Exchange
- AVL trees have smaller average depth than red-black trees, and thus searching for a value in AVL tree is consistently faster.
- Red-black trees make fewer structural changes to balance themselves than AVL trees, which could make them potentially faster for insert/delete. I'm saying potentially, because this would depend on the cost of the structural change to the tree, as this will depend a lot on the runtime and implementation (it might also be completely different in a functional language where the tree is immutable?)

There are many benchmarks online that compare AVL and Red-black trees, but what struck me is that my professor basically said, that usually you'd do one of two things:
- Either you don't really care that much about performance, in which case the 10-20% difference of AVL vs Red-black in most cases won't matter at all.
- Or you really care about performance, in which case you'd ditch both AVL and red-black trees, and go with B-trees, which can be tweaked to work much better (or (a,b)-trees; I'm gonna put all of those in one basket.)

--

> For some kinds of binary search trees, including red-black trees but not AVL trees, the "fixes" to the tree can fairly easily be predicted on the way down and performed during a single top-down pass, making the second pass unnecessary. Such insertion algorithms are typically implemented with a loop rather than recursion, and often run slightly faster in practice than their two-pass counterparts.

So a red-black tree insert can be implemented without recursion; on some CPUs recursion is very expensive if you overrun the function call cache (e.g. SPARC, due to its use of register windows).

--

There are some cases where you can't use B-trees at all.

One prominent case is std::map from the C++ STL. The standard requires that insert does not invalidate existing iterators.
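A minimal illustration of that guarantee (my example, not from the answer): node-based red-black trees relink nodes during rebalancing but never move elements, so iterators stay valid. A B-tree shuffles elements between nodes on splits and merges and could not promise this.

```cpp
#include <cassert>
#include <map>

int main() {
    std::map<int, int> m{{1, 10}, {3, 30}};
    auto it = m.find(1);              // pins the node holding key 1
    for (int k = 0; k < 10000; k++)   // many inserts => many rebalances
        m.emplace(k + 100, k);
    assert(it->second == 10);         // still valid: rebalancing relinked
    return 0;                         // nodes but never moved the element
}
```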

...

I also believe that the "single pass tail recursive" implementation is not the reason for red-black tree popularity as a mutable data structure.

First of all, stack depth is irrelevant here, because (given log n height) you would run out of main memory before you run out of stack space. Jemalloc is happy with preallocating worst-case depth on the stack.
nibble  q-n-a  overflow  cs  algorithms  tcs  data-structures  functional  orders  trees  cost-benefit  tradeoffs  roots  explanans  impetus  performance  applicability-prereqs  programming  pls  c(pp)  ubiquity 
june 2019 by nhaliday
algorithm - Skip List vs. Binary Search Tree - Stack Overflow
Skip lists are more amenable to concurrent access/modification. Herb Sutter wrote an article about data structures in concurrent environments. It has more in-depth information.

The most frequently used implementation of a binary search tree is a red-black tree. The concurrency problems come in when the tree is modified: it often needs to rebalance. The rebalance operation can affect large portions of the tree, which would require a mutex lock on many of the tree nodes. Inserting a node into a skip list is far more localized; only nodes directly linked to the affected node need to be locked.
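A sketch of why the insert is so localized (my illustration, not from the answer; simplified and single-threaded, whereas a real concurrent version would lock or CAS only the update[] predecessors found below):

```cpp
#include <cstdlib>
#include <vector>

constexpr int MAX_LEVEL = 16;

struct Node {
    int key;
    std::vector<Node*> next;   // next[i] = successor at level i
    Node(int k, int lvl) : key(k), next(lvl, nullptr) {}
};

// head is a sentinel built as: new Node(INT_MIN, MAX_LEVEL)
void insert(Node* head, int key) {
    Node* update[MAX_LEVEL];   // the only nodes the insert touches
    Node* x = head;
    for (int i = MAX_LEVEL - 1; i >= 0; i--) {
        while (x->next[i] && x->next[i]->key < key) x = x->next[i];
        update[i] = x;         // predecessor of the new node at level i
    }
    int lvl = 1;
    while (lvl < MAX_LEVEL && std::rand() % 2) lvl++;  // coin-flip tower height
    Node* n = new Node(key, lvl);
    for (int i = 0; i < lvl; i++) {
        n->next[i] = update[i]->next[i];   // splice: a handful of pointer writes,
        update[i]->next[i] = n;            // all confined to update[0..lvl-1]
    }                                      // (contrast with a tree-wide rebalance)
}
```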
q-n-a  stackex  nibble  programming  tcs  data-structures  performance  concurrency  comparison  cost-benefit  applicability-prereqs  random  trees  tradeoffs 
may 2019 by nhaliday
its-not-software - steveyegge2
You don't work in the software industry.

...

So what's the software industry, and how do we differ from it?

Well, the software industry is what you learn about in school, and it's what you probably did at your previous company. The software industry produces software that runs on customers' machines — that is, software intended to run on a machine over which you have no control.

So it includes pretty much everything that Microsoft does: Windows and every application you download for it, including your browser.

It also includes everything that runs in the browser, including Flash applications, Java applets, and plug-ins like Adobe's Acrobat Reader. Their deployment model is a little different from the "classic" deployment models, but it's still software that you package up and release to some unknown client box.

...

Servware

Our industry is so different from the software industry, and it's so important to draw a clear distinction, that it needs a new name. I'll call it Servware for now, lacking anything better. Hardware, firmware, software, servware. It fits well enough.

Servware is stuff that lives on your own servers. I call it "stuff" advisedly, since it's more than just software; it includes configuration, monitoring systems, data, documentation, and everything else you've got there, all acting in concert to produce some observable user experience on the other side of a network connection.
techtariat  sv  tech  rhetoric  essay  software  saas  devops  engineering  programming  contrarianism  list  top-n  best-practices  applicability-prereqs  desktop  flux-stasis  homo-hetero  trends  games  thinking  checklists  dbs  models  communication  tutorial  wiki  integration-extension  frameworks  api  whole-partial-many  metrics  retrofit  c(pp)  pls  code-dive  planning  working-stiff  composition-decomposition  libraries  conceptual-vocab  amazon  system-design  cracker-prog  tech-infrastructure  blowhards  client-server 
may 2019 by nhaliday
Eliminative materialism - Wikipedia
Eliminative materialism (also called eliminativism) is the claim that people's common-sense understanding of the mind (or folk psychology) is false and that certain classes of mental states that most people believe in do not exist.[1] It is a materialist position in the philosophy of mind. Some supporters of eliminativism argue that no coherent neural basis will be found for many everyday psychological concepts such as belief or desire, since they are poorly defined. Rather, they argue that psychological concepts of behaviour and experience should be judged by how well they reduce to the biological level.[2] Other versions entail the non-existence of conscious mental states such as pain and visual perceptions.[3]

Eliminativism about a class of entities is the view that that class of entities does not exist.[4] For example, materialism tends to be eliminativist about the soul; modern chemists are eliminativist about phlogiston; and modern physicists are eliminativist about the existence of luminiferous aether. Eliminative materialism is the relatively new (1960s–1970s) idea that certain classes of mental entities that common sense takes for granted, such as beliefs, desires, and the subjective sensation of pain, do not exist.[5][6] The most common versions are eliminativism about propositional attitudes, as expressed by Paul and Patricia Churchland,[7] and eliminativism about qualia (subjective interpretations about particular instances of subjective experience), as expressed by Daniel Dennett and Georges Rey.[3] These philosophers often appeal to an introspection illusion.

In the context of materialist understandings of psychology, eliminativism stands in opposition to reductive materialism which argues that mental states as conventionally understood do exist, and that they directly correspond to the physical state of the nervous system.[8] An intermediate position is revisionary materialism, which will often argue that the mental state in question will prove to be somewhat reducible to physical phenomena—with some changes needed to the common sense concept.

Since eliminative materialism claims that future research will fail to find a neuronal basis for various mental phenomena, it must necessarily wait for science to progress further. One might question the position on these grounds, but other philosophers like Churchland argue that eliminativism is often necessary in order to open the minds of thinkers to new evidence and better explanations.[8]
concept  conceptual-vocab  philosophy  ideology  thinking  metameta  weird  realness  psychology  cog-psych  neurons  neuro  brain-scan  reduction  complex-systems  cybernetics  wiki  reference  parallax  truth  dennett  within-without  the-self  subjective-objective  absolute-relative  deep-materialism  new-religion  identity  analytical-holistic  systematic-ad-hoc  science  theory-practice  theory-of-mind  applicability-prereqs  nihil  lexical 
april 2018 by nhaliday
Prisoner's dilemma - Wikipedia
caveat to result below:
An extension of the IPD is an evolutionary stochastic IPD, in which the relative abundance of particular strategies is allowed to change, with more successful strategies relatively increasing. This process may be accomplished by having less successful players imitate the more successful strategies, or by eliminating less successful players from the game, while multiplying the more successful ones. It has been shown that unfair ZD strategies are not evolutionarily stable. The key intuition is that an evolutionarily stable strategy must not only be able to invade another population (which extortionary ZD strategies can do) but must also perform well against other players of the same type (which extortionary ZD players do poorly, because they reduce each other's surplus).[14]

Theory and simulations confirm that beyond a critical population size, ZD extortion loses out in evolutionary competition against more cooperative strategies, and as a result, the average payoff in the population increases when the population is bigger. In addition, there are some cases in which extortioners may even catalyze cooperation by helping to break out of a face-off between uniform defectors and win–stay, lose–switch agents.[8]
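A toy round-robin illustrating the underlying payoff logic (my sketch, not from the article; standard payoffs T=5, R=3, P=1, S=0):

```cpp
#include <cstdio>

enum Move { C, D };

// Payoff to the row player under the standard PD matrix.
int payoff(Move a, Move b) {
    if (a == C && b == C) return 3;   // mutual cooperation (R)
    if (a == C && b == D) return 0;   // sucker's payoff (S)
    if (a == D && b == C) return 5;   // temptation (T)
    return 1;                         // mutual defection (P)
}

// Play n rounds; strat 0 = tit-for-tat, 1 = always-defect.
long play(int sa, int sb, int n) {
    Move la = C, lb = C;              // TFT opens cooperatively
    long total = 0;                   // row player's score
    for (int i = 0; i < n; i++) {
        Move a = (sa == 0) ? lb : D;  // TFT copies opponent's last move
        Move b = (sb == 0) ? la : D;
        total += payoff(a, b);
        la = a; lb = b;
    }
    return total;
}

int main() {
    int n = 1000;
    // Each strategy's total across both pairings (vs. TFT and vs. ALLD):
    long tft  = play(0, 0, n) + play(0, 1, n);
    long alld = play(1, 1, n) + play(1, 0, n);
    printf("TFT total: %ld, ALLD total: %ld\n", tft, alld);   // 3999 vs. 2004
}
```

With these numbers, tit-for-tat's surplus against its own kind (3 per round vs. 1) outweighs the one-round exploitation it suffers against always-defect, the same "must also perform well against players of the same type" intuition quoted above.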

https://alfanl.com/2018/04/12/defection/
Nature boils down to a few simple concepts.

Haters will point out that I oversimplify. The haters are wrong. I am good at saying a lot with few words. Nature indeed boils down to a few simple concepts.

In life, you can either cooperate or defect.

Used to be that defection was the dominant strategy, say in the time when the Roman empire started to crumble. Everybody complained about everybody and in the end nothing got done. Then came Jesus, who told people to be loving and cooperative, and boom: 1800 years later we get the industrial revolution.

Because of Jesus we now find ourselves in a situation where cooperation is the dominant strategy. A normie engages in a ton of cooperation: with the tax collector who wants more and more of his money, with schools who want more and more of his kid’s time, with media who wants him to repeat more and more party lines, with the Zeitgeist of the Collective Spirit of the People’s Progress Towards a New Utopia. Essentially, our normie is cooperating himself into a crumbling Western empire.

Turns out that if everyone blindly cooperates, parasites sprout up like weeds until defection once again becomes the standard.

The point of a post-Christian religion is to once again create conditions for the kind of cooperation that led to the industrial revolution. This necessitates throwing out undead Christianity: you do not blindly cooperate. You cooperate with people that cooperate with you, you defect on people that defect on you. Christianity mixed with Darwinism. God and Gnon meet.

This also means we re-establish spiritual hierarchy, which, like regular hierarchy, is a prerequisite for cooperation. It is this hierarchical cooperation that turns a household into a force to be reckoned with, that allows a group of men to unite as a front against their enemies, that allows a tribe to conquer the world. Remember: Scientology bullied the Cathedral’s tax department into submission.

With a functioning hierarchy, men still gossip, lie and scheme, but they will do so in whispers behind closed doors. In your face they cooperate and contribute to the group’s wellbeing because incentives are thus that contributing to group wellbeing heightens status.

Without a functioning hierarchy, men gossip, lie and scheme, but they do so in your face, and they tell you that you are positively deluded for accusing them of gossiping, lying and scheming. Seeds will not sprout in such ground.

Spiritual dominance is established in the same way any sort of dominance is established: fought for, taken. But the fight is ritualistic. You can’t force spiritual dominance if no one listens, or if you are silenced the ritual is not allowed to happen.

If one of our priests is forbidden from establishing spiritual dominance, that is a sure sign an enemy priest is in better control and has a vested interest in preventing you from establishing spiritual dominance.

They defect on you, you defect on them. Let them suffer the consequences of enemy priesthood, among others characterized by the annoying tendency that very little is said with very many words.

https://contingentnotarbitrary.com/2018/04/14/rederiving-christianity/
To recap, we started with a secular definition of Logos and noted that its telos is existence. Given human nature, game theory and the power of cooperation, the highest expression of that telos is freely chosen universal love, tempered by constant vigilance against defection while maintaining compassion for the defectors and forgiving those who repent. In addition, we must know the telos in order to fulfill it.

In Christian terms, looks like we got over half of the Ten Commandments (know Logos for the First, don’t defect or tempt yourself to defect for the rest), the importance of free will, the indestructibility of evil (group cooperation vs individual defection), loving the sinner and hating the sin (with defection as the sin), forgiveness (with conditions), and love and compassion toward all, assuming only secular knowledge and that it’s good to exist.

Iterated Prisoner's Dilemma is an Ultimatum Game: http://infoproc.blogspot.com/2012/07/iterated-prisoners-dilemma-is-ultimatum.html
The history of IPD shows that bounded cognition prevented the dominant strategies from being discovered for over 60 years, despite significant attention from game theorists, computer scientists, economists, evolutionary biologists, etc. Press and Dyson have shown that IPD is effectively an ultimatum game, which is very different from the Tit for Tat stories told by generations of people who worked on IPD (Axelrod, Dawkins, etc., etc.).

...

For evolutionary biologists: Dyson clearly thinks this result has implications for multilevel (group vs individual selection):
... Cooperation loses and defection wins. The ZD strategies confirm this conclusion and make it sharper. ... The system evolved to give cooperative tribes an advantage over non-cooperative tribes, using punishment to give cooperation an evolutionary advantage within the tribe. This double selection of tribes and individuals goes way beyond the Prisoners' Dilemma model.

implications for fractionalized Europe vis-a-vis unified China?

and more broadly does this just imply we're doomed in the long run RE: cooperation, morality, the "good society", so on...? war and group-selection is the only way to get a non-crab bucket civilization?

Iterated Prisoner’s Dilemma contains strategies that dominate any evolutionary opponent:
http://www.pnas.org/content/109/26/10409.full
http://www.pnas.org/content/109/26/10409.full.pdf
https://www.edge.org/conversation/william_h_press-freeman_dyson-on-iterated-prisoners-dilemma-contains-strategies-that

https://en.wikipedia.org/wiki/Ultimatum_game

analogy for the ultimatum game: the state gives the demos a take-it-or-leave-it bargain, and...if the demos refuses...violence?

The nature of human altruism: http://sci-hub.tw/https://www.nature.com/articles/nature02043
- Ernst Fehr & Urs Fischbacher

Some of the most fundamental questions concerning our evolutionary origins, our social relations, and the organization of society are centred around issues of altruism and selfishness. Experimental evidence indicates that human altruism is a powerful force and is unique in the animal world. However, there is much individual heterogeneity and the interaction between altruists and selfish individuals is vital to human cooperation. Depending on the environment, a minority of altruists can force a majority of selfish individuals to cooperate or, conversely, a few egoists can induce a large number of altruists to defect. Current gene-based evolutionary theories cannot explain important patterns of human altruism, pointing towards the importance of both theories of cultural evolution as well as gene–culture co-evolution.

...

Why are humans so unusual among animals in this respect? We propose that quantitatively, and probably even qualitatively, unique patterns of human altruism provide the answer to this question. Human altruism goes far beyond that which has been observed in the animal world. Among animals, fitness-reducing acts that confer fitness benefits on other individuals are largely restricted to kin groups; despite several decades of research, evidence for reciprocal altruism in pair-wise repeated encounters [4,5] remains scarce [6–8]. Likewise, there is little evidence so far that individual reputation building affects cooperation in animals, which contrasts strongly with what we find in humans. If we randomly pick two human strangers from a modern society and give them the chance to engage in repeated anonymous exchanges in a laboratory experiment, there is a high probability that reciprocally altruistic behaviour will emerge spontaneously [9,10].

However, human altruism extends far beyond reciprocal altruism and reputation-based cooperation, taking the form of strong reciprocity [11,12]. Strong reciprocity is a combination of altruistic rewarding, which is a predisposition to reward others for cooperative, norm-abiding behaviours, and altruistic punishment, which is a propensity to impose sanctions on others for norm violations. Strong reciprocators bear the cost of rewarding or punishing even if they gain no individual economic benefit whatsoever from their acts. In contrast, reciprocal altruists, as they have been defined in the biological literature [4,5], reward and punish only if this is in their long-term self-interest. Strong reciprocity thus constitutes a powerful incentive for cooperation even in non-repeated interactions and when reputation gains are absent, because strong reciprocators will reward those who cooperate and punish those who defect.

...

We will show that the interaction between selfish and strongly reciprocal … [more]
concept  conceptual-vocab  wiki  reference  article  models  GT-101  game-theory  anthropology  cultural-dynamics  trust  cooperate-defect  coordination  iteration-recursion  sequential  axelrod  discrete  smoothness  evolution  evopsych  EGT  economics  behavioral-econ  sociology  new-religion  deep-materialism  volo-avolo  characterization  hsu  scitariat  altruism  justice  group-selection  decision-making  tribalism  organizing  hari-seldon  theory-practice  applicability-prereqs  bio  finiteness  multi  history  science  social-science  decision-theory  commentary  study  summary  giants  the-trenches  zero-positive-sum  🔬  bounded-cognition  info-dynamics  org:edge  explanation  exposition  org:nat  eden  retention  long-short-run  darwinian  markov  equilibrium  linear-algebra  nitty-gritty  competition  war  explanans  n-factor  europe  the-great-west-whale  occident  china  asia  sinosphere  orient  decentralized  markets  market-failure  cohesion  metabuch  stylized-facts  interdisciplinary  physics  pdf  pessimism  time  insight  the-basilisk  noblesse-oblige  the-watchers  ideas  l 
march 2018 by nhaliday
The weirdest people in the world?
Abstract: Behavioral scientists routinely publish broad claims about human psychology and behavior in the world’s top journals based on samples drawn entirely from Western, Educated, Industrialized, Rich, and Democratic (WEIRD) societies. Researchers – often implicitly – assume that either there is little variation across human populations, or that these “standard subjects” are as representative of the species as any other population. Are these assumptions justified? Here, our review of the comparative database from across the behavioral sciences suggests both that there is substantial variability in experimental results across populations and that WEIRD subjects are particularly unusual compared with the rest of the species – frequent outliers. The domains reviewed include visual perception, fairness, cooperation, spatial reasoning, categorization and inferential induction, moral reasoning, reasoning styles, self-concepts and related motivations, and the heritability of IQ. The findings suggest that members of WEIRD societies, including young children, are among the least representative populations one could find for generalizing about humans. Many of these findings involve domains that are associated with fundamental aspects of psychology, motivation, and behavior – hence, there are no obvious a priori grounds for claiming that a particular behavioral phenomenon is universal based on sampling from a single subpopulation. Overall, these empirical patterns suggest that we need to be less cavalier in addressing questions of human nature on the basis of data drawn from this particularly thin, and rather unusual, slice of humanity. We close by proposing ways to structurally re-organize the behavioral sciences to best tackle these challenges.
pdf  study  microfoundations  anthropology  cultural-dynamics  sociology  psychology  social-psych  cog-psych  iq  biodet  behavioral-gen  variance-components  psychometrics  psych-architecture  visuo  spatial  morality  individualism-collectivism  n-factor  justice  egalitarianism-hierarchy  cooperate-defect  outliers  homo-hetero  evopsych  generalization  henrich  europe  the-great-west-whale  occident  organizing  🌞  universalism-particularism  applicability-prereqs  hari-seldon  extrema  comparison  GT-101  ecology  EGT  reinforcement  anglo  language  gavisti  heavy-industry  marginal  absolute-relative  reason  stylized-facts  nature  systematic-ad-hoc  analytical-holistic  science  modernity  behavioral-econ  s:*  illusion  cool  hmm  coordination  self-interest  social-norms  population  density  humanity  sapiens  farmers-and-foragers  free-riding  anglosphere  cost-benefit  china  asia  sinosphere  MENA  world  developing-world  neurons  theory-of-mind  network-structure  nordic  orient  signum  biases  usa  optimism  hypocrisy  humility  within-without  volo-avolo  domes 
november 2017 by nhaliday
All models are wrong - Wikipedia
Box repeated the aphorism in a paper that was published in the proceedings of a 1978 statistics workshop.[2] The paper contains a section entitled "All models are wrong but some are useful". The section is copied below.

Now it would be very remarkable if any system existing in the real world could be exactly represented by any simple model. However, cunningly chosen parsimonious models often do provide remarkably useful approximations. For example, the law PV = RT relating pressure P, volume V and temperature T of an "ideal" gas via a constant R is not exactly true for any real gas, but it frequently provides a useful approximation and furthermore its structure is informative since it springs from a physical view of the behavior of gas molecules.

For such a model there is no need to ask the question "Is the model true?". If "truth" is to be the "whole truth" the answer must be "No". The only question of interest is "Is the model illuminating and useful?".
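A quick worked instance of Box's example (my arithmetic, not in the article): at T = 300 K and P = 1 atm, the molar form PV = RT predicts V = RT/P = (0.0821 L·atm·mol⁻¹·K⁻¹ × 300 K)/(1 atm) ≈ 24.6 L. For nitrogen under those conditions the measured molar volume agrees to within a fraction of a percent; near the condensation point the same formula fails badly. False in general, useful where its assumptions roughly hold.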
thinking  metabuch  metameta  map-territory  models  accuracy  wire-guided  truth  philosophy  stats  data-science  methodology  lens  wiki  reference  complex-systems  occam  parsimony  science  nibble  hi-order-bits  info-dynamics  the-trenches  meta:science  physics  fluid  thermo  stat-mech  applicability-prereqs  theory-practice  elegance  simplification-normalization 
august 2017 by nhaliday
Econometric Modeling as Junk Science
The Credibility Revolution in Empirical Economics: How Better Research Design Is Taking the Con out of Econometrics: https://www.aeaweb.org/articles?id=10.1257/jep.24.2.3

On data, experiments, incentives and highly unconvincing research – papers and hot beverages: https://papersandhotbeverages.wordpress.com/2015/10/31/on-data-experiments-incentives-and-highly-unconvincing-research/
In my view, it has just to do with the fact that academia is a peer monitored organization. In the case of (bad) data collection papers, issues related to measurement are typically boring. They are relegated to appendices, no one really has an incentive to monitor it seriously. The problem is similar in formal theory: no one really goes through the algebra in detail, but it is in principle feasible to do it, and, actually, sometimes these errors are detected. If discussing the algebra of a proof is almost unthinkable in a seminar, going into the details of data collection, measurement and aggregation is not only hard to imagine, but probably intrinsically infeasible.

Something different happens for the experimentalist people. As I was saying, I feel we have come to a point in which many papers are evaluated based on the cleverness and originality of the research design (“Using the World Cup qualifiers as an instrument for patriotism!? Woaw! how cool/crazy is that! I wish I had had that idea”). The sexiness of the identification strategy has too often become a goal in itself. When your peers monitor you paying more attention to the originality of the identification strategy than to the research question, you probably have an incentive to mine reality for ever crazier discontinuities. It is true methodologists have been criticized in the past for analogous reasons, such as being guided by the desire to increase mathematical complexity without a clear benefit. But, if you work with pure formal theory or statistical theory, your work is not meant to immediately answer question about the real world, but instead to serve other researchers in their quest. This is something that can, in general, not be said of applied CI work.

https://twitter.com/pseudoerasmus/status/662007951415238656
This post should have been entitled “Zombies who only think of their next cool IV fix”
https://twitter.com/pseudoerasmus/status/662692917069422592
massive lust for quasi-natural experiments, regression discontinuities
barely matters if the effects are not all that big
I suppose even the best of things must reach their decadent phase; methodological innov. to manias……

https://twitter.com/cblatts/status/920988530788130816
Following this "collapse of small-N social psych results" business, where do I predict econ will collapse? I see two main contenders.
One is lab studies. I dallied with these a few years ago in a Kenya lab. We ran several pilots of N=200 to figure out the best way to treat
and to measure the outcome. Every pilot gave us a different stat sig result. I could have written six papers concluding different things.
I gave up more skeptical of these lab studies than ever before. The second contender is the long run impacts literature in economic history
We should be very suspicious since we never see a paper showing that a historical event had no effect on modern day institutions or dvpt.
On the one hand I find these studies fun, fascinating, and probably true in a broad sense. They usually reinforce a widely believed history
argument with interesting data and a cute empirical strategy. But I don't think anyone believes the standard errors. There's probably a HUGE
problem of nonsignificant results staying in the file drawer. Also, there are probably data problems that don't get revealed, as we see with
the recent Piketty paper (http://marginalrevolution.com/marginalrevolution/2017/10/pikettys-data-reliable.html). So I take that literature with a vat of salt, even if I enjoy and admire the works
I used to think field experiments would show little consistency in results across place. That external validity concerns would be fatal.
In fact the results across different samples and places have proven surprisingly similar across places, and added a lot to general theory
Last, I've come to believe there is no such thing as a useful instrumental variable. The ones that actually meet the exclusion restriction
are so weird & particular that the local treatment effect is likely far different from the average treatment effect in non-transparent ways.
Most of the other IVs don't plausibly meet the exclusion restriction. I mean, we should be concerned when the IV estimate is always 10x
larger than the OLS coefficient. This I find myself much more persuaded by simple natural experiments that use OLS, diff in diff, or
discontinuities, alongside randomized trials.

What do others think are the cliffs in economics?
PS All of these apply to political science too. Though I have a special extra target in poli sci: survey experiments! A few are good. I like
Dan Corstange's work. But it feels like 60% of dissertations these days are experiments buried in a survey instrument that measure small
changes in response. These at least have large N. But these are just uncontrolled labs, with negligible external validity in my mind.
The good ones are good. This method has its uses. But it's being way over-applied. More people have to make big and risky investments in big
natural and field experiments. Time to raise expectations and ambitions. This expectation bar, not technical ability, is the big advantage
economists have over political scientists when they compete in the same space.
(Ok. So are there any friends and colleagues I haven't insulted this morning? Let me know and I'll try my best to fix it with a screed)

HOW MUCH SHOULD WE TRUST DIFFERENCES-IN-DIFFERENCES ESTIMATES?∗: https://economics.mit.edu/files/750
Most papers that employ Differences-in-Differences estimation (DD) use many years of data and focus on serially correlated outcomes but ignore that the resulting standard errors are inconsistent. To illustrate the severity of this issue, we randomly generate placebo laws in state-level data on female wages from the Current Population Survey. For each law, we use OLS to compute the DD estimate of its “effect” as well as the standard error of this estimate. These conventional DD standard errors severely understate the standard deviation of the estimators: we find an “effect” significant at the 5 percent level for up to 45 percent of the placebo interventions. We use Monte Carlo simulations to investigate how well existing methods help solve this problem. Econometric corrections that place a specific parametric form on the time-series process do not perform well. Bootstrap (taking into account the auto-correlation of the data) works well when the number of states is large enough. Two corrections based on asymptotic approximation of the variance-covariance matrix work well for moderate numbers of states and one correction that collapses the time series information into a “pre” and “post” period and explicitly takes into account the effective sample size works well even for small numbers of states.
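A self-contained toy version of that placebo exercise (my sketch, not the authors' code; a simple two-by-two DD on cell means instead of a full regression, with a naive iid standard error):

```cpp
#include <cmath>
#include <cstdio>
#include <random>

int main() {
    const int S = 50, T = 21, t0 = 11;   // states, years, placebo "law" year
    const double rho = 0.8;              // within-state serial correlation
    std::mt19937 rng(42);
    std::normal_distribution<double> z(0.0, 1.0);

    int reject = 0, trials = 2000;
    for (int trial = 0; trial < trials; trial++) {
        double y[S][T];                  // AR(1) outcome per state
        for (int s = 0; s < S; s++) {
            y[s][0] = z(rng);
            for (int t = 1; t < T; t++) y[s][t] = rho * y[s][t-1] + z(rng);
        }
        // First 25 states "treated" from year t0 on (a pure placebo).
        double cell[2][2] = {};          // [treated][post] sums -> means
        int cnt[2][2] = {};
        for (int s = 0; s < S; s++)
            for (int t = 0; t < T; t++) {
                int g = s < 25, p = t >= t0;
                cell[g][p] += y[s][t]; cnt[g][p]++;
            }
        for (int g = 0; g < 2; g++)
            for (int p = 0; p < 2; p++) cell[g][p] /= cnt[g][p];
        double dd = (cell[1][1] - cell[1][0]) - (cell[0][1] - cell[0][0]);

        // Naive SE: pretend the S*T state-year observations are independent.
        double ss = 0; int n = 0;
        for (int s = 0; s < S; s++)
            for (int t = 0; t < T; t++) {
                double r = y[s][t] - cell[s < 25][t >= t0];
                ss += r * r; n++;
            }
        double s2 = ss / (n - 4);        // 4 cell means estimated
        double se = std::sqrt(s2 * (1.0/cnt[1][1] + 1.0/cnt[1][0]
                                  + 1.0/cnt[0][1] + 1.0/cnt[0][0]));
        if (std::fabs(dd / se) > 1.96) reject++;
    }
    printf("placebo rejection rate: %.3f (nominal 0.05)\n",
           (double)reject / trials);
}
```

Because the naive t-test treats each serially correlated state-year as fresh information, the rejection rate at the nominal 5% level comes out well above 5%, which is the paper's point.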

‘METRICS MONDAY: 2SLS–CHRONICLE OF A DEATH FORETOLD: http://marcfbellemare.com/wordpress/12733
As it turns out, Young finds that
1. Conventional tests tend to overreject the null hypothesis that the 2SLS coefficient is equal to zero.
2. 2SLS estimates are falsely declared significant one third to one half of the time, depending on the method used for bootstrapping.
3. The 99-percent confidence intervals (CIs) of those 2SLS estimates include the OLS point estimate over 90 percent of the time. They include the full OLS 99-percent CI over 75 percent of the time.
4. 2SLS estimates are extremely sensitive to outliers. Simply removing one outlying cluster or observation renders almost half of 2SLS results insignificant. Things get worse when removing two outlying clusters or observations, as over 60 percent of 2SLS results then become insignificant.
5. Using a Durbin-Wu-Hausman test, less than 15 percent of regressions can reject the null that OLS estimates are unbiased at the 1-percent level.
6. 2SLS has considerably higher mean squared error than OLS.
7. In one third to one half of published results, the null that the IVs are totally irrelevant cannot be rejected, and so the correlation between the endogenous variable(s) and the IVs is due to finite sample correlation between them.
8. Finally, fewer than 10 percent of 2SLS estimates reject both instrument irrelevance and the absence of OLS bias at the 1-percent level using a Durbin-Wu-Hausman test. It gets much worse–fewer than 5 percent–if you add in the requirement that the 2SLS CI exclude the OLS estimate.

Methods Matter: P-Hacking and Causal Inference in Economics*: http://ftp.iza.org/dp11796.pdf
Applying multiple methods to 13,440 hypothesis tests reported in 25 top economics journals in 2015, we show that selective publication and p-hacking is a substantial problem in research employing DID and (in particular) IV. RCT and RDD are much less problematic. Almost 25% of claims of marginally significant results in IV papers are misleading.

https://twitter.com/NoamJStein/status/1040887307568664577
Ever since I learned social science is completely fake, I've had a lot more time to do stuff that matters, like deadlifting and reading about Mediterranean haplogroups
--
Wait, so, from fakest to realest IV>DD>RCT>RDD? That totally matches my impression.

https://twitter.com/wwwojtekk/status/1190731344336293889
https://archive.is/EZu0h
Great (not completely new but still good to have it in one place) discussion of RCTs and inference in economics by Deaton, my favorite sentences (more general than just about RCT) below
Randomization in the tropics revisited: a theme and eleven variations: https://scholar.princeton.edu/sites/default/files/deaton/files/deaton_randomization_revisited_v3_2019.pdf
org:junk  org:edu  economics  econometrics  methodology  realness  truth  science  social-science  accuracy  generalization  essay  article  hmm  multi  study  🎩  empirical  causation  error  critique  sociology  criminology  hypothesis-testing  econotariat  broad-econ  cliometrics  endo-exo  replication  incentives  academia  measurement  wire-guided  intricacy  twitter  social  discussion  pseudoE  effect-size  reflection  field-study  stat-power  piketty  marginal-rev  commentary  data-science  expert-experience  regression  gotchas  rant  map-territory  pdf  simulation  moments  confidence  bias-variance  stats  endogenous-exogenous  control  meta:science  meta-analysis  outliers  summary  sampling  ensembles  monte-carlo  theory-practice  applicability-prereqs  chart  comparison  shift  ratty  unaffiliated  garett-jones 
june 2017 by nhaliday
The Limits of Public Choice Theory – Jacobite
Many people believe that politics is difficult because of incentives: voters vote for their self-interest; bureaucrats deliberately don’t solve problems to enlarge their departments; and elected officials maximize votes for power and sell out to lobbyists. But this cynical view is mostly wrong—politics, insofar as it has problems, has problems not because people are selfish—it has problems because people have wrong ideas. In fact, people mostly act surprisingly altruistically, motivated by trying to do good for their country.

...

I got into politics and ideas as a libertarian. I was attracted by the idea of public choice as a universal theory of politics. It’s intuitively appealing, methodologically individualist, and it supported all of the things I already believed. And it’s definitely true to some extent—there is a huge amount of evidence that it affects things somewhat. But it’s terrible as a general theory of politics in the developed world. Our policies are bad because voters are ignorant and politicians believe in things too much, not because everyone is irredeemably cynical and atavistic.

interesting take, HBD?: https://twitter.com/pseudoerasmus/status/869882831572434946

recommended by Garett Jones:
https://web.archive.org/web/20110517015819/http://reviewsindepth.com/2010/03/yes-prime-minister-the-most-cunning-political-propaganda-ever-conceived/
https://en.wikipedia.org/wiki/The_Thick_of_It
org:popup  albion  wonkish  econotariat  rhetoric  essay  contrarianism  methodology  economics  micro  social-choice  elections  government  politics  polisci  incentives  altruism  social-norms  democracy  cynicism-idealism  optimism  antidemos  morality  near-far  ethics  map-territory  models  cooperate-defect  anthropology  coordination  multi  twitter  social  commentary  pseudoE  broad-econ  wealth-of-nations  rent-seeking  leviathan  pop-diff  gnon  political-econ  public-goodish  tv  review  garett-jones  backup  recommendations  microfoundations  wiki  britain  organizing  interests  applicability-prereqs  the-watchers  noblesse-oblige  n-factor  self-interest  cohesion  EGT  world  guilt-shame  alignment 
may 2017 by nhaliday
Lucio Russo - Wikipedia
In The Forgotten Revolution: How Science Was Born in 300 BC and Why It Had to Be Reborn (Italian: La rivoluzione dimenticata), Russo promotes the belief that Hellenistic science in the period 320-144 BC reached heights not achieved by Classical age science, and proposes that it went further than ordinarily thought, in multiple fields not normally associated with ancient science.

La Rivoluzione Dimenticata (The Forgotten Revolution), Reviewed by Sandro Graffi: http://www.ams.org/notices/199805/review-graffi.pdf

Before turning to the question of the decline of Hellenistic science, I come back to the new light shed by the book on Euclid’s Elements and on pre-Ptolemaic astronomy. Euclid’s definitions of the elementary geometric entities—point, straight line, plane—at the beginning of the Elements have long presented a problem.[7] Their nature is in sharp contrast with the approach taken in the rest of the book, and continued by mathematicians ever since, of refraining from defining the fundamental entities explicitly but limiting themselves to postulating the properties which they enjoy. Why should Euclid be so hopelessly obscure right at the beginning and so smooth just after? The answer is: the definitions are not Euclid’s. Toward the beginning of the second century A.D. Heron of Alexandria found it convenient to introduce definitions of the elementary objects (a sign of decadence!) in his commentary on Euclid’s Elements, which had been written at least 400 years before. All manuscripts of the Elements copied ever since included Heron’s definitions without mention, whence their attribution to Euclid himself. The philological evidence leading to this conclusion is quite convincing.[8]

...

What about the general and steady (on the average) impoverishment of Hellenistic science under the Roman empire? This is a major historical problem, strongly tied to the even bigger one of the decline and fall of the antique civilization itself. I would summarize the author’s argument by saying that it basically represents an application to science of a widely accepted general theory on decadence of antique civilization going back to Max Weber. Roman society, mainly based on slave labor, underwent an ultimately unrecoverable crisis as the traditional sources of that labor force, essentially wars, progressively dried up. To save basic farming, the remaining slaves were promoted to be serfs, and poor free peasants reduced to serfdom, but this made trade disappear. A society in which production is almost entirely based on serfdom and with no trade clearly has very little need of culture, including science and technology. As Max Weber pointed out, when trade vanished, so did the marble splendor of the ancient towns, as well as the spiritual assets that went with it: art, literature, science, and sophisticated commercial laws. The recovery of Hellenistic science then had to wait until the disappearance of serfdom at the end of the Middle Ages. To quote Max Weber: “Only then with renewed vigor did the old giant rise up again.”

...

The epilogue contains the (rather pessimistic) views of the author on the future of science, threatened by the apparent triumph of today’s vogue of irrationality even in leading institutions (e.g., an astrology professorship at the Sorbonne). He looks at today’s ever-increasing tendency to teach science more on a fideistic than on a deductive or experimental basis as the first sign of a decline which could be analogous to the post-Hellenistic one.

Praising Alexandrians to excess: https://sci-hub.tw/10.1088/2058-7058/17/4/35
The Economic Record review: https://sci-hub.tw/10.1111/j.1475-4932.2004.00203.x

listed here: https://pinboard.in/u:nhaliday/b:c5c09f2687c1

Was Roman Science in Decline? (Excerpt from My New Book): https://www.richardcarrier.info/archives/13477
people  trivia  cocktail  history  iron-age  mediterranean  the-classics  speculation  west-hunter  scitariat  knowledge  wiki  ideas  wild-ideas  technology  innovation  contrarianism  multi  pdf  org:mat  books  review  critique  regularizer  todo  piracy  physics  canon  science  the-trenches  the-great-west-whale  broad-econ  the-world-is-just-atoms  frontier  speedometer  🔬  conquest-empire  giants  economics  article  growth-econ  cjones-like  industrial-revolution  empirical  absolute-relative  truth  rot  zeitgeist  gibbon  big-peeps  civilization  malthus  roots  old-anglo  britain  early-modern  medieval  social-structure  limits  quantitative-qualitative  rigor  lens  systematic-ad-hoc  analytical-holistic  cycles  space  mechanics  math  geometry  gravity  revolution  novelty  meta:science  is-ought  flexibility  trends  reason  applicability-prereqs  theory-practice  traces  evidence  psycho-atoms 
may 2017 by nhaliday
Typos | West Hunter
In a simple model, a given mutant has an equilibrium frequency μ/s, where μ is the mutation rate from good to bad alleles and s is the size of the selective disadvantage. To estimate the total impact of mutation at that locus, you multiply the frequency by the expected harm, s: which means that the fitness decrease (from effects at that locus) is just μ, the mutation rate. If we assume that these fitness effects are multiplicative, the total fitness decrease (also called ‘mutational load’) is approximately 1 − exp(−U), where U = Σ 2μ is the total number of new harmful mutations per diploid individual.
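Plugging in numbers (my arithmetic, not from the post; estimates of the human deleterious mutation rate are often on the order of U ≈ 1 or more per diploid genome per generation):

load = 1 − exp(−U), so U = 1 gives a load of 1 − e⁻¹ ≈ 0.63, and U = 2 gives ≈ 0.86

i.e., under the multiplicative assumption, fitness would be reduced by well over half relative to a hypothetical mutation-free genome, which is why the exact size of U, and whether selection is truncation-like rather than multiplicative, matter so much in the posts linked below.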

https://westhunt.wordpress.com/2012/10/17/more-to-go-wrong/

https://westhunt.wordpress.com/2012/07/13/sanctuary/
interesting, suggestive comment on Africa:
https://westhunt.wordpress.com/2012/07/13/sanctuary/#comment-3671
https://westhunt.wordpress.com/2012/07/14/too-darn-hot/
http://infoproc.blogspot.com/2012/07/rare-variants-and-human-genetic.html
https://westhunt.wordpress.com/2012/07/18/changes-in-attitudes/
https://westhunt.wordpress.com/2012/08/24/men-and-macaques/
I have reason to believe that few people understand genetic load very well, probably for self-referential reasons, but better explanations are possible.

One key point is that the amount of neutral variation is determined by the long-term mutation rate and population history, while the amount of deleterious variation [genetic load] is set by the selective pressures and the prevailing mutation rate over a much shorter time scale. For example, if you consider the class of mutations that reduce fitness by 1%, what matters is the past few thousand years, not the past tens or hundreds of thousands of years.
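
A minimal sketch of why the window scales like 1/s generations: iterate the standard deterministic mutation-selection recursion (illustrative parameters; drift and dominance ignored):

# Deleterious allele under mutation-selection pressure:
# q' = q*(1 - s) + mu, with equilibrium q* = mu/s.
# Starting from q = 0, the frequency reaches a fraction 1 - (1-s)^t
# of equilibrium after t generations, so the relaxation time is
# ~1/s generations. For s = 1% that is ~100 generations -- a few
# thousand years, as stated above.
mu, s = 1e-6, 0.01
q = 0.0
for gen in range(1, 501):
    q = q * (1 - s) + mu
    if gen % 100 == 0:
        print(gen, round(q / (mu / s), 3))
# 100 0.634, 200 0.866, 300 0.951, 400 0.982, 500 0.993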

...

So, assuming that African populations have more neutral variation than non-African populations (which is well-established), what do we expect to see when we compare the levels of probably-damaging mutations in those two populations? If the Africans and non-Africans had experienced essentially similar mutation rates and selective pressures over the past few thousand years, we would expect to see the same levels of probably-damaging mutations. Bottlenecks that happened at the last glacial maximum or in the expansion out of Africa are irrelevant – too long ago to matter.

But we don’t. The amount of rare synonymous stuff is about 22% higher in Africans. The amount of rare nonsynonymous stuff (usually at least slightly deleterious) is 20.6% higher. The number of rare variants predicted to be more deleterious is ~21.6% higher. The amount of stuff predicted to be even more deleterious is ~27% higher. The number of harmful looking loss-of-function mutations (yet more deleterious) is 25% higher.

It looks as if the excess grows as the severity of the mutations increases. There is a scenario in which this is possible: the mutation rate in Africa has increased recently. Not yesterday, but, say, over the past few thousand years.

...

What is the most likely cause of such variations in the mutation rate? Right now, I’d say differences in average paternal age. We know that modest differences (~5 years) in average paternal age can easily generate ~20% differences in the mutation rate. Such between-population differences in mutation rates seem quite plausible, particularly since the Neolithic.
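Rough arithmetic behind the “~5 years of paternal age ≈ ~20% of the mutation rate” claim (a sketch; the baseline count and per-year increment are ballpark figures from the de novo sequencing literature, assumed here for illustration):

# De novo studies put the total at very roughly 50-70 new mutations
# per offspring, rising by roughly 1.5-2 per extra year of paternal
# age (ballpark assumptions, not numbers from the post).
baseline = 60.0     # de novo mutations at a reference paternal age
per_year = 2.0      # additional mutations per year of father's age
delta = 5.0         # between-population difference in mean paternal age

print(f"{per_year * delta / baseline:.0%}")  # ~17%; ~15-25% across plausible inputs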
https://westhunt.wordpress.com/2016/04/10/bugs-versus-drift/
more recent: https://westhunt.wordpress.com/2017/06/06/happy-families-are-all-alike-every-unhappy-family-is-unhappy-in-its-own-way/#comment-92491
Probably not, but the question is complex: depends on the shape of the deleterious mutational spectrum [which we don’t know], ancient and recent demography, paternal age, and the extent of truncation selection in the population.
west-hunter  scitariat  discussion  bio  sapiens  biodet  evolution  mutation  genetics  genetic-load  population-genetics  nibble  stylized-facts  methodology  models  equilibrium  iq  neuro  neuro-nitgrit  epidemiology  selection  malthus  temperature  enhancement  CRISPR  genomics  behavioral-gen  multi  poast  africa  roots  pop-diff  ideas  gedanken  paternal-age  🌞  environment  speculation  gene-drift  longevity  immune  disease  parasites-microbiome  scifi-fantasy  europe  asia  race  migration  hsu  study  summary  commentary  shift  the-great-west-whale  nordic  intelligence  eden  long-short-run  debate  hmm  idk  explanans  comparison  structure  occident  mediterranean  geography  within-group  correlation  direction  volo-avolo  demographics  age-generation  measurement  data  applicability-prereqs  aging 
may 2017 by nhaliday
In a handbasket | West Hunter
It strikes me that in many ways, life was gradually getting harder in the Old World, especially in the cradles of civilization.

slavery and Rome/early US: https://westhunt.wordpress.com/2016/06/17/in-a-handbasket/#comment-80503
Rome and innovation: https://westhunt.wordpress.com/2016/06/17/in-a-handbasket/#comment-80505
"Culture’s have flavors and the Roman flavor was unfavorable to being clever. The Greeks were clever but not interested in utility. While the central American civilizations liked to cut people’s hearts out and stick cactus spines through their penis in public. Let us all act according to national customs."
https://twitter.com/Evolving_Moloch/status/881652804900671489
https://en.wikipedia.org/wiki/Bloodletting_in_Mesoamerica

https://westhunt.wordpress.com/2014/07/05/let-no-new-thing-arise/
It helps to think about critical community size (CCS). Consider a disease like measles, one that doesn’t last long and confers lifelong immunity. The virus needs fresh, never-infected hosts (we call them children) all the time, else it will go extinct. The critical community size for measles is probably more than half a million – which means that before agriculture, measles as we know it today couldn’t and didn’t exist. In fact, it looks as if it split off from rinderpest within the last two thousand years. Mumps was around in Classical times (Hippocrates gives a good description), but it too has a large CCS and must be relatively new. Rubella can’t be ancient. Whooping cough has a smaller CCS, maybe only 100,000, but it too must postdate agriculture.
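
A toy stochastic model makes the CCS point concrete (a hedged sketch, not a calibrated epidemic model; the parameters are loosely measles-like and the threshold it produces shouldn't be taken literally):

import numpy as np

def persists(N, years=20, R0=15, infectious_days=8, seed=0):
    # Daily-step stochastic SIR with births, started near the endemic
    # equilibrium. Returns True if the chain of transmission survives.
    rng = np.random.default_rng(seed)
    gamma = 1.0 / infectious_days
    S = int(N / R0)                                    # endemic susceptible pool
    I = max(1, int(N * infectious_days / (70 * 365)))  # endemic infecteds
    for _ in range(years * 365):
        p = 1.0 - np.exp(-R0 * gamma * I / N)          # per-susceptible daily risk
        new_inf = rng.binomial(S, p)
        births = rng.poisson(N / (70 * 365))           # newborns are susceptible
        S += births - new_inf
        I += new_inf - rng.binomial(I, gamma)
        if I <= 0:
            return False                               # fade-out: virus extinct
    return True

for N in (50_000, 200_000, 1_000_000):
    ok = sum(persists(N, seed=k) for k in range(10))
    print(f"N={N:>9,}: persisted in {ok}/10 twenty-year runs")

Small populations lose the virus to chance fade-out between birth cohorts; only above some threshold does measles-like transmission persist, which is the CCS logic of the paragraph above.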

"let no new thing arise":
http://www.theseeker.org/cgi-bin/bulletin/show.pl?Todd%20Collier/Que%20no%20hayan%20novedades.
http://itre.cis.upenn.edu/~myl/languagelog/archives/003347.html
http://www.bradwarthen.com/2010/02/que-no-haya-novedad-may-no-new-thing-arise/

https://westhunt.wordpress.com/2013/07/03/legionnaires-disease/
Before 1900, armies usually lost more men from infectious disease than combat, particularly in extended campaigns. At least that seems to have been the case in modern Western history.

There are indications that infectious disease was qualitatively different – less important – in the Roman legions. For one thing, camps were placed near good supplies of fresh water. The legions had good camp sanitation, at least by the time of the Principate. They used latrines flushed with running water in permanent camps and deep slit trenches with wooden covers and removable buckets in the field. Using those latrines would have protected soldiers from diseases like typhoid and dysentery, major killers in recent armies. Roman armies were mobile, often shifting their camps. They seldom quartered their soldiers in urban areas – they feared that city luxuries would corrupt their men, but this habit helped them avoid infectious agents, regardless of their reasons.

They managed to avoid a lot of serious illnesses because the causative organisms simply weren’t there yet. Smallpox, and maybe measles, didn’t show up until the middle Empire. Falciparum malaria was around, but hadn’t reached Rome itself, during the Republic. It definitely had by the time of the Empire. Bubonic plague doesn’t seem to have caused trouble before Justinian. Syphilis for sure, and typhus probably, originated in the Americas, while cholera didn’t arrive until after 1800.
west-hunter  scitariat  history  iron-age  medieval  early-modern  discussion  europe  civilization  technology  innovation  agriculture  energy-resources  disease  parasites-microbiome  recent-selection  lived-experience  multi  mediterranean  the-classics  economics  usa  age-of-discovery  poast  aphorism  latin-america  farmers-and-foragers  cultural-dynamics  social-norms  culture  wealth-of-nations  twitter  social  commentary  quotes  anthropology  nihil  martial  nietzschean  embodied  ritual  wiki  reference  ethnography  flux-stasis  language  jargon  foreign-lang  population  density  speculation  ideas  war  meta:war  military  red-queen  strategy  epidemiology  public-health  trends  zeitgeist  archaeology  novelty  spreading  cost-benefit  conquest-empire  malthus  pre-ww2  the-south  applicability-prereqs  org:edu 
april 2017 by nhaliday
How Universal Is the Big Five? Testing the Five-Factor Model of Personality Variation Among Forager–Farmers in the Bolivian Amazon
We failed to find robust support for the FFM, based on tests of (a) internal consistency of items expected to segregate into the Big Five factors, (b) response stability of the Big Five, (c) external validity of the Big Five with respect to observed behavior, (d) factor structure according to exploratory and confirmatory factor analysis, and (e) similarity with a U.S. target structure based on Procrustes rotation analysis.

...

We argue that Tsimane personality variation displays 2 principal factors that may reflect socioecological characteristics common to small-scale societies. We offer evolutionary perspectives on why the structure of personality variation may not be invariant across human societies.

Niche diversity can explain cross-cultural differences in personality structure: https://www.nature.com/articles/s41562-019-0730-3.epdf?author_access_token=OePuGOtdzdnQNlUm-C2oidRgN0jAjWel9jnR3ZoTv0PAovoNXZmNaZE03-rNo0RKOI7i7PG10G8tISp-_6W5yDqI3sDx0WdZZuk2ekMJbzGZtJ7_XsMUy0k4UGpsNDt9NHMarkg3dmAWt-Ttawxu1g%3D%3D
Cross-cultural studies have challenged this view, finding that less-complex societies exhibit stronger covariation among behavioural characteristics, resulting in fewer derived personality factors. To explain these results, we propose the niche diversity hypothesis, in which a greater diversity of social and ecological niches elicits a broader range of multivariate behavioural profiles and, hence, lower trait covariance in a population.
...
This work provides a general explanation for population differences in personality structure in both humans and other animals and suggests a substantial reimagining of personality research: instead of reifying statistical descriptions of manifest personality structures, research should focus more on modelling their underlying causes.

sounds obvious but actually kinda interesting
pdf  study  psychology  cog-psych  society  embedded-cognition  personality  metrics  generalization  methodology  farmers-and-foragers  latin-america  context  homo-hetero  info-dynamics  water  psychometrics  exploratory  things  phalanges  dimensionality  anthropology  universalism-particularism  applicability-prereqs  multi  sapiens  cultural-dynamics  social-psych  evopsych  psych-architecture  org:nat  🌞  roots  explanans  causation  pop-diff  cybernetics  ecology  scale  moments  large-factor 
february 2017 by nhaliday
The infinitesimal model | bioRxiv
Our focus here is on the infinitesimal model. In this model, one or several quantitative traits are described as the sum of a genetic and a non-genetic component, the first being distributed as a normal random variable centred at the average of the parental genetic components, and with a variance independent of the parental traits. We first review the long history of the infinitesimal model in quantitative genetics. Then we provide a definition of the model at the phenotypic level in terms of individual trait values and relationships between individuals, but including different evolutionary processes: genetic drift, recombination, selection, mutation, population structure, ... We give a range of examples of its application to evolutionary questions related to stabilising selection, assortative mating, effective population size and response to selection, habitat preference and speciation. We provide a mathematical justification of the model as the limit as the number M of underlying loci tends to infinity of a model with Mendelian inheritance, mutation and environmental noise, when the genetic component of the trait is purely additive. We also show how the model generalises to include epistatic effects. In each case, by conditioning on the pedigree relating individuals in the population, we incorporate arbitrary selection and population structure. We suppose that we can observe the pedigree up to the present generation, together with all the ancestral traits, and we show, in particular, that the genetic components of the individual trait values in the current generation are indeed normally distributed with a variance independent of ancestral traits, up to an error of order M^{-1/2}. Simulations suggest that in particular cases the convergence may be as fast as 1/M.
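
A minimal phenotype-level simulation of the model as defined above (a sketch with arbitrary variances and random mating; not the authors' code):

import numpy as np

rng = np.random.default_rng(1)
N, GENS = 1000, 10
VG, VE = 1.0, 1.0    # genetic / environmental variance (illustrative)

g = rng.normal(0.0, np.sqrt(VG), N)       # founder genetic components
for _ in range(GENS):
    moms = rng.integers(0, N, N)          # random mating, for simplicity
    dads = rng.integers(0, N, N)
    mid = (g[moms] + g[dads]) / 2.0       # average of parental genetic values
    # Infinitesimal model: offspring genetic component is normal around
    # the parental average, with segregation variance VG/2 that does not
    # depend on the parental trait values.
    g = rng.normal(mid, np.sqrt(VG / 2.0))
z = g + rng.normal(0.0, np.sqrt(VE), N)   # trait = genetic + non-genetic
print(round(z.mean(), 3), round(z.var(), 3))  # variance stays ~VG + VE

Under random mating the genetic variance is conserved (Var of the parental average is VG/2, plus segregation variance VG/2); selection enters only by changing who the parents are, while the offspring distribution around the parental average is unchanged, which is the property the paper derives as the M → ∞ limit.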

published version:
The infinitesimal model: Definition, derivation, and implications: https://sci-hub.tw/10.1016/j.tpb.2017.06.001

Commentary: Fisher’s infinitesimal model: A story for the ages: http://www.sciencedirect.com/science/article/pii/S0040580917301508?via%3Dihub
This commentary distinguishes three nested approximations, referred to as “infinitesimal genetics,” “Gaussian descendants” and “Gaussian population,” each plausibly called “the infinitesimal model.” The first and most basic is Fisher’s “infinitesimal” approximation of the underlying genetics – namely, many loci, each making a small contribution to the total variance. As Barton et al. (2017) show, in the limit as the number of loci increases (with enough additivity), the distribution of genotypic values for descendants approaches a multivariate Gaussian, whose variance–covariance structure depends only on the relatedness, not the phenotypes, of the parents (or whether their population experiences selection or other processes such as mutation and migration). Barton et al. (2017) call this rigorously defensible “Gaussian descendants” approximation “the infinitesimal model.” However, it is widely assumed that Fisher’s genetic assumptions yield another Gaussian approximation, in which the distribution of breeding values in a population follows a Gaussian — even if the population is subject to non-Gaussian selection. This third “Gaussian population” approximation, is also described as the “infinitesimal model.” Unlike the “Gaussian descendants” approximation, this third approximation cannot be rigorously justified, except in a weak-selection limit, even for a purely additive model. Nevertheless, it underlies the two most widely used descriptions of selection-induced changes in trait means and genetic variances, the “breeder’s equation” and the “Bulmer effect.” Future generations may understand why the “infinitesimal model” provides such useful approximations in the face of epistasis, linkage, linkage disequilibrium and strong selection.
study  exposition  bio  evolution  population-genetics  genetics  methodology  QTL  preprint  models  unit  len:long  nibble  linearity  nonlinearity  concentration-of-measure  limits  applications  🌞  biodet  oscillation  fisher  perturbation  stylized-facts  chart  ideas  article  pop-structure  multi  pdf  piracy  intricacy  map-territory  kinship  distribution  simulation  ground-up  linear-models  applicability-prereqs  bioinformatics 
january 2017 by nhaliday
Not Final! | West Hunter
In mathematics we often prove that some proposition is true by showing that the alternative is false. The principle can sometimes work in other disciplines, but it’s tricky. You have to have a very good understanding to know that some things are impossible (or close enough to impossible). You can do it fairly often in physics, less often in biology.
west-hunter  science  history  reflection  epistemic  occam  contradiction  parsimony  noise-structure  scitariat  info-dynamics  hetero-advantage  sapiens  evolution  disease  sexuality  ideas  genetics  s:*  thinking  the-trenches  no-go  thick-thin  theory-practice  inference  apollonian-dionysian  elegance  applicability-prereqs  necessity-sufficiency 
november 2016 by nhaliday
Information Processing: Evidence for (very) recent natural selection in humans
height (+), infant head circumference (+), some biomolecular stuff, female hip size (+), male BMI (-), age of menarche (+, !!), and birth weight (+)

Strong selection in the recent past can cause allele frequencies to change significantly. Consider two different SNPs, which today have equal minor allele frequency (for simplicity, let this be equal to one half). Assume that one SNP was subject to strong recent selection, and another (neutral) has had approximately zero effect on fitness. The advantageous version of the first SNP was less common in the far past, and rose in frequency recently (e.g., over the last 2k years). In contrast, the two versions of the neutral SNP have been present in roughly the same proportion (up to fluctuations) for a long time. Consequently, in the total past breeding population (i.e., going back tens of thousands of years) there have been many more copies of the neutral alleles (and the chunks of DNA surrounding them) than of the positively selected allele. Each of the chunks of DNA around the SNPs we are considering is subject to a roughly constant rate of mutation.

Looking at the current population, one would then expect a larger variety of mutations in the DNA region surrounding the neutral allele (both versions) than near the favored selected allele (which was rarer in the population until very recently, and whose surrounding region had fewer chances to accumulate mutations). By comparing the difference in local mutational diversity between the two versions of the neutral allele (should be zero modulo fluctuations, for the case MAF = 0.5), and between the (+) and (-) versions of the selected allele (nonzero, due to relative change in frequency), one obtains a sensitive signal for recent selection. See figure at bottom for more detail. In the paper what I call mutational diversity is measured by looking at distance distribution of singletons, which are rare variants found in only one individual in the sample under study.
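
A cartoon of that signal (a hedged sketch, not the SDS implementation; the tip lengths and mutation rate are assumed for illustration):

import numpy as np

rng = np.random.default_rng(0)
n = 3000             # haplotypes per allele class
mu = 1e-8            # per-bp, per-generation mutation rate (typical human value)
tip_neutral = 75.0   # mean tip-branch length in generations (cf. the paper)
tip_favored = 40.0   # shorter tips under recent selection (assumed)

def mean_gap_to_singleton(tip_gens):
    # Singletons arise on a haplotype's own tip branch, so their per-bp
    # density is ~ mu * (tip length); the gap to the nearest singleton
    # is then roughly exponential with mean 1/density.
    density = mu * tip_gens
    return rng.exponential(1.0 / density, n).mean()

print(f"neutral allele: ~{mean_gap_to_singleton(tip_neutral):,.0f} bp to nearest singleton")
print(f"favored allele: ~{mean_gap_to_singleton(tip_favored):,.0f} bp to nearest singleton")
# Favored haplotypes carry fewer singletons, so the gaps are larger;
# SDS turns that difference into a test statistic.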

The 2,000 year selection of the British: http://www.unz.com/gnxp/the-2000-year-selection-of-the-british/

Detection of human adaptation during the past 2,000 years: http://www.biorxiv.org/content/early/2016/05/07/052084

The key idea is that recent selection distorts the ancestral genealogy of sampled haplotypes at a selected site. In particular, the terminal (tip) branches of the genealogy tend to be shorter for the favored allele than for the disfavored allele, and hence, haplotypes carrying the favored allele will tend to carry fewer singleton mutations (Fig. 1A-C and SOM).

To capture this effect, we use the sum of distances to the nearest singleton in each direction from a test SNP as a summary statistic (Fig. 1D).

Figure 1. Illustration of the SDS method.

Figure 2. Properties of SDS.

Based on a recent model of European demography [25], we estimate that the mean tip length for a neutral sample of 3,000 individuals is 75 generations, or roughly 2,000 years (Fig. 2A). Since SDS aims to measure changes in tip lengths of the genealogy, we conjectured that it would be most likely to detect selection approximately within this timeframe.

Indeed, in simulated sweep models with samples of 3,000 individuals (Fig. 2B,C and fig. S2), we find that SDS focuses specifically on very recent time scales, and has equal power for hard and soft sweeps within this timeframe. At individual loci, SDS is powered to detect ~2% selection over 100 generations. Moreover, SDS has essentially no power to detect older selection events that stopped >100 generations before the present. In contrast, a commonly-used test for hard sweeps, iHS [12], integrates signal over much longer timescales (>1,000 generations), has no specificity to the more recent history, and has essentially no power for the soft sweep scenarios.

Catching evolution in the act with the Singleton Density Score: http://www.molecularecologist.com/2016/05/catching-evolution-in-the-act-with-the-singleton-density-score/
The Singleton Density Score (SDS) is a measure based on the idea that changes in allele frequencies induced by recent selection can be observed in a sample’s genealogy as differences in the branch length distribution.

You don’t need a weatherman: https://westhunt.wordpress.com/2016/05/08/you-dont-need-a-weatherman/
You can do a million cool things with this method. Since the effective time scale goes inversely with sample size, you could look at evolution in England over the past 1000 years or the past 500. Differencing, over the period 1-1000 AD. Since you can look at polygenic traits, you can see whether the alleles favoring higher IQs have increased or decreased in frequency over various stretches of time. You can see if Greg Clark’s proposed mechanism really happened. You can (soon) tell if creeping Pinkerization is genetic, or partly genetic.

You could probably find out if the Middle Easterners really have gotten slower, and when it happened.

Looking at IQ alleles, you could not only show whether the Ashkenazi Jews really are biologically smarter but if so, when it happened, which would give you strong hints as to how it happened.

We know that IQ-favoring alleles are going down (slowly) right now (not counting immigration, which of course drastically speeds it up). Soon we will know if this was true while Russia was under the Mongol yoke – we’ll know how smart Periclean Athenians were and when that boost occurred. And so on. And on!

...

“The pace has been so rapid that humans have changed significantly in body and mind over recorded history.”

bicameral mind: https://westhunt.wordpress.com/2016/05/08/you-dont-need-a-weatherman/#comment-78934

https://westhunt.wordpress.com/2016/05/08/you-dont-need-a-weatherman/#comment-78939
Chinese, Koreans, Japanese and Ashkenazi Jews all have high levels of myopia. Australian Aborigines have almost none, I think.

https://westhunt.wordpress.com/2016/05/08/you-dont-need-a-weatherman/#comment-79094
I expect that the fall of all great empires is based on long term dysgenic trends. There is no logical reason why so many empires and civilizations throughout history could grow so big and then not simply keep growing, except for dysgenics.
--
I can think of about twenty other possible explanations off the top of my head, but dysgenics is a possible cause.
--
I agree with DataExplorer. The largest factor in the decay of civilizations is dysgenics. The discussion by R. A. Fisher 1930 p. 193 is very cogent on this matter. Soon we will know for sure.
--
Sometimes it can be rapid. Assume that the upper classes are mostly urban, and somewhat sharper than average. Then the Mongols arrive.
sapiens  study  genetics  evolution  hsu  trends  data  visualization  recent-selection  methodology  summary  GWAS  2016  scitariat  britain  commentary  embodied  biodet  todo  control  multi  gnxp  pop-diff  stat-power  mutation  hypothesis-testing  stats  age-generation  QTL  gene-drift  comparison  marginal  aDNA  simulation  trees  time  metrics  density  measurement  conquest-empire  pinker  population-genetics  aphorism  simler  dennett  👽  the-classics  iron-age  mediterranean  volo-avolo  alien-character  russia  medieval  spearhead  gregory-clark  bio  preprint  domestication  MENA  iq  islam  history  poast  west-hunter  scale  behavioral-gen  gotchas  cost-benefit  genomics  bioinformatics  stylized-facts  concept  levers  🌞  pop-structure  nibble  explanation  ideas  usa  dysgenics  list  applicability-prereqs  cohesion  judaism  visuo  correlation  china  asia  japan  korea  civilization  gibbon  rot  roots  fisher  giants  books  old-anglo  selection  agri-mindset  hari-seldon 
august 2016 by nhaliday
