nhaliday + duplication   72

REST is the new SOAP | Hacker News
hn  commentary  techtariat  org:ngo  programming  engineering  web  client-server  networking  rant  rhetoric  contrarianism  idk  org:med  best-practices  working-stiff  api  models  protocol-metadata  internet  state  structure  chart  multi  q-n-a  discussion  expert-experience  track-record  reflection  cost-benefit  design  system-design  comparison  code-organizing  flux-stasis  interface-compatibility  trends  gotchas  stackex  state-of-art  distributed  concurrency  abstraction  concept  conceptual-vocab  python  ubiquity  list  top-n  duplication  synchrony  performance  caching 
26 days ago by nhaliday
How is definiteness expressed in languages with no definite article, clitic or affix? - Linguistics Stack Exchange
All languages, as far as we know, do something to mark information status. Basically this means that when you refer to an X, you have to do something to indicate the answer to questions like:
1. Do you have a specific X in mind?
2. If so, do you think your hearer is familiar with the X you're talking about?
3. If so, have you already been discussing that X for a while, or is it new to the conversation?
4. If you've been discussing the X for a while, has it been the main topic of conversation?

Question #2 is more or less what we mean by "definiteness."

But there are lots of other information-status-marking strategies that don't directly involve definiteness marking. For example:
q-n-a  stackex  language  foreign-lang  linguistics  lexical  syntax  concept  conceptual-vocab  thinking  things  span-cover  direction  degrees-of-freedom  communication  anglo  japan  china  asia  russia  mediterranean  grokkability-clarity  intricacy  uniqueness  number  universalism-particularism  whole-partial-many  usa  latin-america  farmers-and-foragers  nordic  novelty  trivia  duplication  dependence-independence  spanish  context  orders  water  comparison 
6 weeks ago by nhaliday
Ask HN: Favorite note-taking software? | Hacker News
Ask HN: What is your ideal note-taking software and/or hardware?: https://news.ycombinator.com/item?id=13221158

my wishlist as of 2019:
- web + desktop macOS + mobile iOS (at least viewing on the last but ideally also editing)
- sync across all those
- open-source data format that's easy to manipulate for scripting purposes
- flexible organization: mostly tree hierarchical (subsuming linear/unorganized) but with the option for directed (acyclic) graph (possibly a second layer of structure/linking)
- can store plain text, LaTeX, diagrams, and raster/vector images (video prob not necessary except as links to elsewhere)
- full-text search
- somehow digest/import data from Pinboard, Workflowy, Papers 3/Bookends, and Skim, ideally absorbing most of their functionality
- so, eg, track notes/annotations side-by-side w/ original PDF/DjVu/ePub documents (to replace Papers3/Bookends/Skim), and maybe web pages too (to replace Pinboard)
- OCR of handwritten notes (how to handle equations/diagrams?)
- various forms of NLP analysis of everything (topic models, clustering, etc)
- maybe version control (less important than export)

- Evernote prob ruled out due to heavy use of proprietary data formats (unless I can find some way to export with tolerably clean output)
- Workflowy/Dynalist are good but only cover a subset of functionality I want
- org-mode doesn't interact w/ mobile well (and I haven't evaluated it in detail otherwise)
- TiddlyWiki/Zim are in the running, but not sure about mobile
- idk about vimwiki but I'm not that wedded to vim and it seems less widely used than org-mode/TiddlyWiki/Zim so prob pass on that
- Quiver/Joplin/Inkdrop look similar and cover a lot of bases, TODO: evaluate more
- Trilium looks especially promising, tho mobile is read-only, and for macOS desktop look at this: https://github.com/zadam/trilium/issues/511
- RocketBook is interesting scanning/OCR solution but prob not sufficient due to proprietary data format
- TODO: many more candidates, eg, TreeSheets, Gingko, OneNote (macOS?...), Notion (proprietary data format...), Zotero, Nodebook (https://nodebook.io/landing), Polar (https://getpolarized.io), Roam (looks very promising)
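
A concrete argument for the "open-source data format that's easy to manipulate for scripting purposes" criterion above: with notes kept as plain text, several wishlist items (full-text search, NLP analysis, version control) reduce to a few lines of scripting. A minimal sketch, assuming a hypothetical layout of Markdown files under one directory:

```python
import re
from pathlib import Path

def search_notes(root, pattern):
    """Return (filename, line number, line) for every note line matching pattern."""
    rx = re.compile(pattern, re.IGNORECASE)
    hits = []
    for path in sorted(Path(root).rglob("*.md")):
        for lineno, line in enumerate(path.read_text(encoding="utf-8").splitlines(), 1):
            if rx.search(line):
                hits.append((path.name, lineno, line.strip()))
    return hits
```

A proprietary format (the Evernote objection above) forces an export step before any of this is possible.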

Ask HN: What do you use for your personal note taking activity?: https://news.ycombinator.com/item?id=15736102

Ask HN: What are your note-taking techniques?: https://news.ycombinator.com/item?id=9976751

Ask HN: How do you take notes (useful note-taking strategies)?: https://news.ycombinator.com/item?id=13064215

Ask HN: How to get better at taking notes?: https://news.ycombinator.com/item?id=21419478

Ask HN: How did you build up your personal knowledge base?: https://news.ycombinator.com/item?id=21332957
nice comment from math guy on structure and difference between math and CS: https://news.ycombinator.com/item?id=21338628
useful comment collating related discussions: https://news.ycombinator.com/item?id=21333383
Designing a Personal Knowledge base: https://news.ycombinator.com/item?id=8270759
Ask HN: How to organize personal knowledge?: https://news.ycombinator.com/item?id=17892731
Do you use a personal 'knowledge base'?: https://news.ycombinator.com/item?id=21108527
Ask HN: How do you share/organize knowledge at work and life?: https://news.ycombinator.com/item?id=21310030

other stuff:
plain text: https://news.ycombinator.com/item?id=21685660

Tiago Forte: https://www.buildingasecondbrain.com

hn search: https://hn.algolia.com/?query=notetaking&type=story

Slant comparison commentary: https://news.ycombinator.com/item?id=7011281

good comparison of options here in comments here (and Trilium itself looks good): https://news.ycombinator.com/item?id=18840990

Roam: https://news.ycombinator.com/item?id=21440289

intriguing but probably not appropriate for my needs: https://www.sophya.ai/

Inkdrop: https://news.ycombinator.com/item?id=20103589

Joplin: https://news.ycombinator.com/item?id=15815040


Leo Editor (combines tree outlining w/ literate programming/scripting, I think?): https://news.ycombinator.com/item?id=17769892

Frame: https://news.ycombinator.com/item?id=18760079

Notion: https://news.ycombinator.com/item?id=18904648


maybe not the best source for a review/advice

interesting comment(s) about tree outliners and spreadsheets: https://news.ycombinator.com/item?id=21170434

hn  discussion  recommendations  software  tools  desktop  app  notetaking  exocortex  wkfly  wiki  productivity  multi  comparison  crosstab  properties  applicability-prereqs  nlp  info-foraging  chart  webapp  reference  q-n-a  retention  workflow  reddit  social  ratty  ssc  learning  studying  commentary  structure  thinking  network-structure  things  collaboration  ocr  trees  graphs  LaTeX  search  todo  project  money-for-time  synchrony  pinboard  state  duplication  worrydream  simplification-normalization  links  minimalism  design  neurons  ai-control  openai  miri-cfar  parsimony  intricacy 
8 weeks ago by nhaliday
Karol Kuczmarski's Blog – A Haskell retrospective
Even in this hypothetical scenario, I posit that the value proposition of Haskell would still be a tough sell.

There is this old quote from Bjarne Stroustrup (creator of C++) where he says that programming languages divide into those everyone complains about, and those that no one uses.
The first group consists of old, established technologies that managed to accrue significant complexity debt through years and decades of evolution. All the while, they’ve been adapting to the constantly shifting perspectives on what are the best industry practices. Traces of those adaptations can still be found today, sticking out like a leftover appendix or residual tail bone — or like the built-in support for XML in Java.

Languages that “no one uses”, on the other hand, haven’t yet passed the industry threshold of sufficient maturity and stability. Their ecosystems are still cutting edge, and their future is uncertain, but they sometimes champion some really compelling paradigm shifts. As long as you can bear with things that are rough around the edges, you can take advantage of their novel ideas.

Unfortunately for Haskell, it manages to combine the worst parts of both of these worlds.

On one hand, it is a surprisingly old language, clocking in at more than two decades of fruitful research around many innovative concepts. Yet on the other hand, it bears the signs of a fresh new technology, with relatively few production-grade libraries, scarce coverage of some domains (e.g. GUI programming), and not too many stories of commercial successes.

There are many ways to do it
String theory
Errors and how to handle them
Implicit is better than explicit
Leaky modules
Namespaces are apparently a bad idea
Wild records
Purity beats practicality
techtariat  reflection  functional  haskell  programming  pls  realness  facebook  pragmatic  cost-benefit  legacy  libraries  types  intricacy  engineering  tradeoffs  frontier  homo-hetero  duplication  strings  composition-decomposition  nitty-gritty  error  error-handling  coupling-cohesion  critique  ecosystem  c(pp)  aphorism 
august 2019 by nhaliday
An Efficiency Comparison of Document Preparation Systems Used in Academic Research and Development
The choice of an efficient document preparation system is an important decision for any academic researcher. To assist the research community, we report a software usability study in which 40 researchers across different disciplines prepared scholarly texts with either Microsoft Word or LaTeX. The probe texts included simple continuous text, text with tables and subheadings, and complex text with several mathematical equations. We show that LaTeX users were slower than Word users, wrote less text in the same amount of time, and produced more typesetting, orthographical, grammatical, and formatting errors. On most measures, expert LaTeX users performed even worse than novice Word users. LaTeX users, however, more often report enjoying using their respective software. We conclude that even experienced LaTeX users may suffer a loss in productivity when LaTeX is used, relative to other document preparation systems. Individuals, institutions, and journals should carefully consider the ramifications of this finding when choosing document preparation strategies, or requiring them of authors.


However, our study suggests that LaTeX should be used as a document preparation system only in cases in which a document is heavily loaded with mathematical equations. For all other types of documents, our results suggest that LaTeX reduces the user’s productivity and results in more orthographical, grammatical, and formatting errors, more typos, and less written text than Microsoft Word over the same duration of time. LaTeX users may argue that the overall quality of the text that is created with LaTeX is better than the text that is created with Microsoft Word. Although this argument may be true, the differences between text produced in more recent editions of Microsoft Word and text produced in LaTeX may be less obvious than it was in the past. Moreover, we believe that the appearance of text matters less than the scientific content and impact to the field. In particular, LaTeX is also used frequently for text that does not contain a significant amount of mathematical symbols and formula. We believe that the use of LaTeX under these circumstances is highly problematic and that researchers should reflect on the criteria that drive their preferences to use LaTeX over Microsoft Word for text that does not require significant mathematical representations.


A second decision criterion that factors into the choice to use a particular software system is reflection about what drives certain preferences. A striking result of our study is that LaTeX users are highly satisfied with their system despite reduced usability and productivity. From a psychological perspective, this finding may be related to motivational factors, i.e., the driving forces that compel or reinforce individuals to act in a certain way to achieve a desired goal. A vital motivational factor is the tendency to reduce cognitive dissonance. According to the theory of cognitive dissonance, each individual has a motivational drive to seek consonance between their beliefs and their actual actions. If a belief set does not concur with the individual’s actual behavior, then it is usually easier to change the belief rather than the behavior [6]. The results from many psychological studies in which people have been asked to choose between one of two items (e.g., products, objects, gifts, etc.) and then asked to rate the desirability, value, attractiveness, or usefulness of their choice, report that participants often reduce unpleasant feelings of cognitive dissonance by rationalizing the chosen alternative as more desirable than the unchosen alternative [6, 7]. This bias is usually unconscious and becomes stronger as the effort to reject the chosen alternative increases, which is similar in nature to the case of learning and using LaTeX.


Given these numbers it remains an open question to determine the amount of taxpayer money that is spent worldwide for researchers to use LaTeX over a more efficient document preparation system, which would free up their time to advance their respective field. Some publishers may save a significant amount of money by requesting or allowing LaTeX submissions because a well-formed LaTeX document complying with a well-designed class file (template) is much easier to bring into their publication workflow. However, this is at the expense of the researchers’ labor time and effort. We therefore suggest that leading scientific journals should consider accepting submissions in LaTeX only if this is justified by the level of mathematics presented in the paper. In all other cases, we think that scholarly journals should request authors to submit their documents in Word or PDF format. We believe that this would be a good policy for two reasons. First, we think that the appearance of the text is secondary to the scientific merit of an article and its impact to the field. And, second, preventing researchers from producing documents in LaTeX would save time and money to maximize the benefit of research and development for both the research team and the public.

[ed.: I sense some salt.

And basically no description of how "# errors" was calculated.]

I question the validity of their methodology.
At no point in the paper is exactly what is meant by a "formatting error" or a "typesetting error" defined. From what I gather, the participants in the study were required to reproduce the formatting and layout of the sample text. In theory, a LaTeX file should strictly be a semantic representation of the content of the document; while TeX may have been a raw typesetting language, this is most definitely not the intended use case of LaTeX and is overall a very poor test of its relative advantages and capabilities.
The separation of the semantic definition of the content from the rendering of the document is, in my opinion, the most important feature of LaTeX. Like CSS, this allows the actual formatting to be abstracted away, allowing plain (marked-up) content to be written without worrying about typesetting.
Word has some similar capabilities with styles, and can be used in a similar manner, though few Word users actually use the software properly. This may sound like a relatively insignificant point, but in practice, almost every Word document I have seen has some form of inconsistent formatting. If Word disallowed local formatting changes (including things such as relative spacing of nested bullet points), forcing all formatting changes to be done in document-global styles, it would be a far better typesetting system. Also, the users would be very unhappy.
Yes, LaTeX can undeniably be a pain in the arse, especially when it comes to trying to get figures in the right place; however the combination of a simple, semantic plain-text representation with a flexible and professional typesetting and rendering engine are undeniable and completely unaddressed by this study.
It seems that the test was heavily biased in favor of WYSIWYG.
Of course that approach makes it very simple to reproduce something, as has been tested here. Even simpler would be to scan the document and run OCR. The massive problem with both approaches (WYSIWYG and scanning) is that you can't generalize any of it. You're doomed to repeat it forever.
(I'll also note the other significant issue with this study: when the ratings provided by participants came out opposite of their test results, they attributed it to irrational bias.)
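
The semantic-vs-presentational separation praised in the commentary above can be shown in miniature (an illustrative sketch, not an excerpt from the study; the `\species` macro is hypothetical):

```latex
% Semantic markup: the source records *what* something is; the formatting
% is defined once, so restyling the whole document is a one-line change.
\newcommand{\species}[1]{\textit{#1}}

We observed \species{E. coli} in both samples.

% Presentational markup (the common Word habit): formatting is repeated
% inline at every occurrence, so a style change means hand-editing each one.
We observed \textit{E. coli} in both samples.
```

Word's styles feature offers the same discipline, but, as the comment notes, few Word users actually use it.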

Over the past few years however, the line between the tools has blurred. In 2017, Microsoft made it possible to use LaTeX’s equation-writing syntax directly in Word, and last year it scrapped Word’s own equation editor. Other text editors also support elements of LaTeX, allowing newcomers to use as much or as little of the language as they like.

study  hmm  academia  writing  publishing  yak-shaving  technical-writing  software  tools  comparison  latex  scholar  regularizer  idk  microsoft  evidence-based  science  desktop  time  efficiency  multi  hn  commentary  critique  news  org:sci  flux-stasis  duplication  metrics  biases 
june 2019 by nhaliday
Interview with Donald Knuth | Interview with Donald Knuth | InformIT
Andrew Binstock and Donald Knuth converse on the success of open source, the problem with multicore architecture, the disappointing lack of interest in literate programming, the menace of reusable code, and that urban legend about winning a programming contest with a single compilation.

Reusable vs. re-editable code: https://hal.archives-ouvertes.fr/hal-01966146/document
- Konrad Hinsen

I think whether code should be editable or in “an untouchable black box” depends on the number of developers involved, as well as their talent and motivation. Knuth is a highly motivated genius working in isolation. Most software is developed by large teams of programmers with varying degrees of motivation and talent. I think the further you move away from Knuth along these three axes the more important black boxes become.
nibble  interview  giants  expert-experience  programming  cs  software  contrarianism  carmack  oss  prediction  trends  linux  concurrency  desktop  comparison  checking  debugging  stories  engineering  hmm  idk  algorithms  books  debate  flux-stasis  duplication  parsimony  best-practices  writing  documentation  latex  intricacy  structure  hardware  caching  workflow  editors  composition-decomposition  coupling-cohesion  exposition  technical-writing  thinking  cracker-prog  code-organizing  grokkability  multi  techtariat  commentary  pdf  reflection  essay  examples  python  data-science  libraries  grokkability-clarity 
june 2019 by nhaliday
One week of bugs
If I had to guess, I'd say I probably work around hundreds of bugs in an average week, and thousands in a bad week. It's not unusual for me to run into a hundred new bugs in a single week. But I often get skepticism when I mention that I run into multiple new (to me) bugs per day, and that this is inevitable if we don't change how we write tests. Well, here's a log of one week of bugs, limited to bugs that were new to me that week. After a brief description of the bugs, I'll talk about what we can do to improve the situation. The obvious answer is to spend more effort on testing, but everyone already knows we should do that and no one does it. That doesn't mean it's hopeless, though.


Here's where I'm supposed to write an appeal to take testing more seriously and put real effort into it. But we all know that's not going to work. It would take 90k LOC of tests to get Julia to be as well tested as a poorly tested prototype (falsely assuming linear complexity in size). That's two person-years of work, not even including time to debug and fix bugs (which probably brings it closer to four or five years). Who's going to do that? No one. Writing tests is like writing documentation. Everyone already knows you should do it. Telling people they should do it adds zero information.

Given that people aren't going to put any effort into testing, what's the best way to do it?

Property-based testing. Generative testing. Random testing. Concolic Testing (which was done long before the term was coined). Static analysis. Fuzzing. Statistical bug finding. There are lots of options. Some of them are actually the same thing because the terminology we use is inconsistent and buggy. I'm going to arbitrarily pick one to talk about, but they're all worth looking into.


There are a lot of great resources out there, but if you're just getting started, I found this description of types of fuzzers to be one of the most helpful (and simplest) things I've read.

John Regehr has a udacity course on software testing. I haven't worked through it yet (Pablo Torres just pointed to it), but given the quality of Dr. Regehr's writing, I expect the course to be good.

For more on my perspective on testing, there's this.

Everything's broken and nobody's upset: https://www.hanselman.com/blog/EverythingsBrokenAndNobodysUpset.aspx

From the perspective of a user, the purpose of Hypothesis is to make it easier for you to write better tests.

From my perspective as the primary author, that is of course also a purpose of Hypothesis. I write a lot of code, it needs testing, and the idea of trying to do that without Hypothesis has become nearly unthinkable.

But, on a large scale, the true purpose of Hypothesis is to drag the world kicking and screaming into a new and terrifying age of high quality software.

Software is everywhere. We have built a civilization on it, and it’s only getting more prevalent as more services move online and embedded and “internet of things” devices become cheaper and more common.

Software is also terrible. It’s buggy, it’s insecure, and it’s rarely well thought out.

This combination is clearly a recipe for disaster.

The state of software testing is even worse. It’s uncontroversial at this point that you should be testing your code, but it’s a rare codebase whose authors could honestly claim that they feel its testing is sufficient.

Much of the problem here is that it’s too hard to write good tests. Tests take up a vast quantity of development time, but they mostly just laboriously encode exactly the same assumptions and fallacies that the authors had when they wrote the code, so they miss exactly the same bugs that the authors missed when they wrote the code.
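
The property-based testing idea mentioned above, in miniature: instead of hand-picking example inputs, generate random inputs and assert properties that must hold for any of them. This stdlib-only sketch is roughly what Hypothesis automates (the real library adds input shrinking, smarter generation, and a database of past failures); `my_sort` is a deliberately buggy stand-in:

```python
import random

def my_sort(xs):
    # Deliberately buggy "sort": deduplicates, so it breaks length preservation.
    return sorted(set(xs))

def check_sort_properties(sort_fn, trials=200, seed=0):
    """Random/property-based testing: check invariants over many random inputs."""
    rng = random.Random(seed)
    for _ in range(trials):
        xs = [rng.randint(-10, 10) for _ in range(rng.randint(0, 20))]
        out = sort_fn(xs)
        # Properties any correct sort must satisfy:
        if out != sorted(out) or len(out) != len(xs):
            return xs  # counterexample found
    return None

assert check_sort_properties(sorted) is None        # a correct sort passes
assert check_sort_properties(my_sort) is not None   # the bug is found automatically
```

The point stands either way: a generic invariant plus cheap random inputs catches bugs that no one would have hand-written an example test for.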

Preventing the Collapse of Civilization [video]: https://news.ycombinator.com/item?id=19945452
- Jonathan Blow

NB: DevGAMM is a game industry conference

- loss of technological knowledge (Antikythera mechanism, aqueducts, etc.)
- hardware driving most gains, not software
- software's actually less robust, often poorly designed and overengineered these days
- *list of bugs he's encountered recently*:
- knowledge of trivia becomes more than general, deep knowledge
- does at least acknowledge value of DRY, reusing code, abstraction saving dev time
techtariat  dan-luu  tech  software  error  list  debugging  linux  github  robust  checking  oss  troll  lol  aphorism  webapp  email  google  facebook  games  julia  pls  compilers  communication  mooc  browser  rust  programming  engineering  random  jargon  formal-methods  expert-experience  prof  c(pp)  course  correctness  hn  commentary  video  presentation  carmack  pragmatic  contrarianism  pessimism  sv  unix  rhetoric  critique  worrydream  hardware  performance  trends  multiplicative  roots  impact  comparison  history  iron-age  the-classics  mediterranean  conquest-empire  gibbon  technology  the-world-is-just-atoms  flux-stasis  increase-decrease  graphics  hmm  idk  systems  os  abstraction  intricacy  worse-is-better/the-right-thing  build-packaging  microsoft  osx  apple  reflection  assembly  things  knowledge  detail-architecture  thick-thin  trivia  info-dynamics  caching  frameworks  generalization  systematic-ad-hoc  universalism-particularism  analytical-holistic  structure  tainter  libraries  tradeoffs  prepping  threat-modeling  network-structure  writing  risk  local-glob 
may 2019 by nhaliday
What’s In A Name? Understanding Classical Music Titles | Parker Symphony Orchestra
Composition Type:
Symphony, sonata, piano quintet, concerto – these are all composition types. Classical music composers wrote works in many of these forms and often the same composer wrote multiple pieces in the same type. This is why saying you enjoy listening to “the Serenade” or “the Concerto” or “the Mazurka” is confusing. Even using the composer name often does not narrow down which piece you are referring to. For example, it is not enough to say “Beethoven Symphony”. He wrote 9 of them!

Generic Name:
Compositions often have a generic name that can describe the work’s composition type, key signature, featured instruments, etc. This could be something as simple as Symphony No. 2 (meaning the 2nd symphony written by that composer), Minuet in G major (minuet being a type of dance), or Concerto for Two Cellos (an orchestral work featuring two cellos as soloists). The problem with referring to a piece by the generic name, even along with the composer, is that, again, that may not be enough to identify the exact work. While Symphony No. 2 by Mahler is sufficient since it is his only 2nd symphony, Minuet by Bach is not since he wrote many minuets over his lifetime.

Non-Generic Names:
Non-generic names, or classical music nicknames and sub-titles, are often more well-known than generic names. They can even be so famous that the composer name is not necessary to clarify which piece you are referring to. Eine Kleine Nachtmusik, the Trout Quintet, and the Surprise Symphony are all examples of non-generic names.

Who gave classical music works their non-generic names? Sometimes the composer added a subsidiary name to a work. These are called sub-titles and are considered part of the work’s formal title. The sub-title for Tchaikovsky’s Symphony No. 6 in B minor is “Pathetique”.

A nickname, on the other hand, is not part of the official title and was not assigned by the composer. It is a name that has become associated with a work. For example, Bach’s “Six Concerts à plusieurs instruments” are commonly known as the Brandenburg Concertos because they were presented as a gift to the Margrave of Brandenburg. The name was given by Bach’s biographer, Philipp Spitta, and it stuck. Mozart’s Symphony No. 41 earned the nickname Jupiter most likely because of its exuberant energy and grand scale. Schubert’s Symphony No. 8 is known as the Unfinished Symphony because he died and left it with only 2 complete movements.

In many cases, referring to a work by its non-generic name, especially with the composer name, is enough to identify a piece. Most classical music fans know which work you are referring to when you say “Beethoven’s Eroica Symphony”.

Non-Numeric Titles:
Some classical compositions do not have a generic name, but rather a non-numeric title. These are formal titles given by the composer that do not follow a sequential numeric naming convention. Works that fall into this category include the Symphonie Fantastique by Berlioz, Handel’s Messiah, and Also Sprach Zarathustra by Richard Strauss.

Opus Number:
Opus numbers, abbreviated op., are used to distinguish compositions with similar titles and indicate the chronological order of production. Some composers assigned numbers to their own works, but many were inconsistent in their methods. As a result, some composers’ works are referred to with a catalogue number assigned by musicologists. The various catalogue-number systems commonly used include Köchel-Verzeichnis for Mozart (K) and Bach-Werke-Verzeichnis (BWV).

I was always curious why classical composers use names like the Étude in E-flat minor (Frédéric Chopin) or the Missa in G major (Johann Sebastian Bach). Is this from the scales of these songs? Weren't they barred from ever using this scale again? Why didn't they create unique titles?


Using a key did not prohibit a composer from using that key again (there are only thirty keys). Using a key did not prohibit them from using the same key on a work with the same form either. Bach wrote over thirty Preludes and Fugues. Four of these were Prelude and Fugue in A minor. They are now differentiated by their own BWV catalog numbers (assigned in 1950). Many pieces did have unique titles, but with the number of pieces these composers wrote, unique titles were difficult to come up with. Also, most pieces had no lyrics. It is much easier to come up with a title when there are lyrics. So, they turned to this technique. It was used frequently during the Common Practice Period.

explanation  music  classical  trivia  duplication  q-n-a  stackex  music-theory  init  notation  multi  jargon 
may 2019 by nhaliday
Citizendium, the Citizens' Compendium
That wikipedia alternative by Larry Sanger, Wikipedia's nerdy, spurned co-founder (alongside Jimmy Wales). Unfortunately it looks rather empty.
wiki  reference  database  search  comparison  organization  duplication  socs-and-mops  the-devil  god-man-beast-victim  guilt-shame 
november 2018 by nhaliday
Reconsidering epistemological scepticism – Dividuals
I blogged before about how I consider an epistemological scepticism fully compatible with being conservative/reactionary. By epistemological scepticism I mean the worldview where concepts, categories, names, classes aren’t considered real, just useful ways to categorize phenomena, but entirely mental constructs, basically just tools. I think you can call this nominalism as well. The nominalism-realism debate was certainly about this. What follows is the pro-empirical worldview where logic and reasoning are considered highly fallible: hence you don’t think and don’t argue too much, you actually look and check things instead. You rely on experience, not reasoning.


Anyhow, the argument is that there are classes, which are indeed artificial, and there are kinds, which are products of natural forces, products of causality.


And the deeper – Darwinian – argument, unspoken but obvious, is that any being with a model of reality that does not conform to such real clumps, gets eaten by a grue.

This is impressive. It seems I have to extend my one-variable epistemology to a two-variable epistemology.

My former epistemology was that we generally categorize things according to their uses or dangers for us. So “chair” is – very roughly – defined as “anything we can sit on”. Similarly, we can categorize “predator” as “something that eats us or the animals that are useful for us”.

The unspoken argument against this is that the universe or the biosphere exists neither for us nor against us. A fox can eat your rabbits and a lion can eat you, but they don’t exist just for the sake of making your life difficult.

Hence, if you interpret phenomena only from the viewpoint of their uses or dangers for humans, you get only half the picture right. The other half is what it really is and where it came from.

Copying is everything: https://dividuals.wordpress.com/2015/12/14/copying-is-everything/
Philosophy professor Ruth Millikan’s insight that everything that gets copied from an ancestor has a proper function or teleofunction: it is whatever feature or function that made it and its ancestor selected for copying, in competition with all the other similar copiable things. This would mean Aristotelean teleology is correct within the field of copyable things, replicators, i.e. within biology, although in physics still obviously incorrect.

Darwinian Reactionary drew attention to it two years ago and I still don’t understand why it didn’t generate a bigger buzz. It is an extremely important insight.

I mean, this is what we were waiting for, a proper synthesis of science and philosophy, and a proper way to rescue Aristotelean teleology, which leads to such excellent common-sense predictions that intuitively it cannot be very wrong, yet modern philosophy always denied it.

The result is the bridging of the fact-value gap and the burying of the naturalistic fallacy: we CAN derive values from facts. A thing is good if it is well suited for its natural purpose, teleofunction or proper function, i.e. the purpose it was selected and copied for: the purpose, and the suitability for that purpose, that made the ancestors of this thing get selected for copying instead of all the other potential, similar ancestors.


What was humankind selected for? I am afraid, the answer is kind of ugly.

Men were selected to compete between groups, to cooperate within groups largely to coordinate for the sake of this competition, and to carry on a low-key competition inside the groups as well for status and leadership. I am afraid intelligence is all about organizing elaborate tribal raids: “coalitionary arms races”. The most civilized, least brutal but still expensive case is arms races in prestige status, not dominance status: when Ancient Athens built pretty buildings, modern France built the TGV, and America sent a man to the Moon, it was in order to gain “gloire”, i.e. the prestige type of respect and status among the nations, the larger groups of mankind. If you are the type who doesn’t like blood, you should probably focus on these kinds of civilized, prestige-project competitions.

Women were selected for bearing children, for having strong and intelligent sons and therefore having these heritable traits themselves (HBD kind of contradicts the more radically anti-woman aspects of RedPillery: marry a weak and stupid but attractive silly-blondie type of woman and your sons won’t be that great either), for pleasuring men, and in some rarer but existing cases, to be true companions and helpers of their husbands.

- Matter: a change or movement's material cause, is the aspect of the change or movement which is determined by the material that composes the moving or changing things. For a table, that might be wood; for a statue, that might be bronze or marble.
- Form: a change or movement's formal cause, is a change or movement caused by the arrangement, shape or appearance of the thing changing or moving. Aristotle says for example that the ratio 2:1, and number in general, is the cause of the octave.
- Agent: a change or movement's efficient or moving cause, consists of things apart from the thing being changed or moved, which interact so as to be an agency of the change or movement. For example, the efficient cause of a table is a carpenter, or a person working as one, and according to Aristotle the efficient cause of a boy is a father.
- End or purpose: a change or movement's final cause, is that for the sake of which a thing is what it is. For a seed, it might be an adult plant. For a sailboat, it might be sailing. For a ball at the top of a ramp, it might be coming to rest at the bottom.

A proximate cause is an event which is closest to, or immediately responsible for causing, some observed result. This exists in contrast to a higher-level ultimate cause (or distal cause) which is usually thought of as the "real" reason something occurred.


- Ultimate causation explains traits in terms of evolutionary forces acting on them.
- Proximate causation explains biological function in terms of immediate physiological or environmental factors.
gnon  philosophy  ideology  thinking  conceptual-vocab  forms-instances  realness  analytical-holistic  bio  evolution  telos-atelos  distribution  nature  coarse-fine  epistemic  intricacy  is-ought  values  duplication  nihil  the-classics  big-peeps  darwinian  deep-materialism  selection  equilibrium  subjective-objective  models  classification  smoothness  discrete  schelling  optimization  approximation  comparison  multi  peace-violence  war  coalitions  status  s-factor  fashun  reputation  civilization  intelligence  competition  leadership  cooperate-defect  within-without  within-group  group-level  homo-hetero  new-religion  causation  direct-indirect  ends-means  metabuch  physics  axioms  skeleton  wiki  reference  concept  being-becoming  essence-existence  logos  real-nominal 
july 2018 by nhaliday
Why read old philosophy? | Meteuphoric
(This story would suggest that in physics students are maybe missing out on learning the styles of thought that produce progress in physics. My guess is that instead they learn them in grad school when they are doing research themselves, by emulating their supervisors, and that the helpfulness of this might partially explain why Nobel prizewinner advisors beget Nobel prizewinner students.)

The story I hear about philosophy—and I actually don’t know how much it is true—is that as bits of philosophy come to have any methodological tools other than ‘think about it’, they break off and become their own sciences. This would explain philosophy’s lone status in studying old thinkers rather than impersonal methods: philosophy is the lone ur-discipline with no impersonal methods, only thinking.

This suggests a research project: try summarizing what Aristotle is doing rather than Aristotle’s views. Then write a nice short textbook about it.
ratty  learning  reading  studying  prioritizing  history  letters  philosophy  science  comparison  the-classics  canon  speculation  reflection  big-peeps  iron-age  mediterranean  roots  lens  core-rats  thinking  methodology  grad-school  academia  physics  giants  problem-solving  meta:research  scholar  the-trenches  explanans  crux  metameta  duplication  sociality  innovation  quixotic  meta:reading  classic 
june 2018 by nhaliday
Cultural variation in cultural evolution | Proceedings of the Royal Society of London B: Biological Sciences
Cultural evolutionary models have identified a range of conditions under which social learning (copying others) is predicted to be adaptive relative to asocial learning (learning on one's own), particularly in humans where socially learned information can accumulate over successive generations. However, cultural evolution and behavioural economics experiments have consistently shown apparently maladaptive under-utilization of social information in Western populations. Here we provide experimental evidence of cultural variation in people's use of social learning, potentially explaining this mismatch. People in mainland China showed significantly more social learning than British people in an artefact-design task designed to assess the adaptiveness of social information use. People in Hong Kong, and Chinese immigrants in the UK, resembled British people in their social information use, suggesting a recent shift in these groups from social to asocial learning due to exposure to Western culture. Finally, Chinese mainland participants responded less than other participants to increased environmental change within the task. Our results suggest that learning strategies in humans are culturally variable and not genetically fixed, necessitating the study of the ‘social learning of social learning strategies' whereby the dynamics of cultural evolution are responsive to social processes, such as migration, education and globalization.


Western education emphasizes individual discovery and creativity, whereas East Asian education emphasizes rote learning from authority [25]. The adoption of consumer products shows less social influence in Western than East Asian countries [26]. Westerners are described as more individualistic/independent, while East Asians are described as more collectivistic/interdependent [27], dimensions which intuitively map on to asocial and social learning, respectively.

Societal background influences social learning in cooperative decision making: https://www.sciencedirect.com/science/article/pii/S1090513817303501
We demonstrate that Chinese participants base their cooperation decisions on information about their peers much more frequently than their British counterparts. Moreover, our results reveal remarkable societal differences in the type of peer information people consider. In contrast to the consensus view, Chinese participants tend to be substantially less majority-oriented than the British. While Chinese participants are inclined to adopt peer behavior that leads to higher payoffs, British participants tend to cooperate only if sufficiently many peers do so too. These results indicate that the basic processes underlying social transmission are not universal; rather, they vary with cultural conditions. As success-based learning is associated with selfish behavior and majority-based learning can help foster cooperation, our study suggests that in different societies social learning can play diverging roles in the emergence and maintenance of cooperation.
study  org:nat  anthropology  cultural-dynamics  sapiens  pop-diff  comparison  sociality  learning  duplication  individualism-collectivism  n-factor  europe  the-great-west-whale  china  asia  sinosphere  britain  anglosphere  strategy  environmental-effects  biodet  within-without  tribalism  things  broad-econ  psychology  cog-psych  social-psych  🎩  🌞  microfoundations  egalitarianism-hierarchy  innovation  creative  explanans  education  culture  curiosity  multi  occident  cooperate-defect  coordination  organizing  self-interest  altruism  patho-altruism  orient  ecology  axelrod  explore-exploit  cybernetics  info-dynamics  spreading 
may 2018 by nhaliday
Theory of Self-Reproducing Automata - John von Neumann

Comparisons between computing machines and the nervous systems. Estimates of size for computing machines, present and near future.

Estimates for size for the human central nervous system. Excursus about the “mixed” character of living organisms. Analog and digital elements. Observations about the “mixed” character of all componentry, artificial as well as natural. Interpretation of the position to be taken with respect to these.

Evaluation of the discrepancy in size between artificial and natural automata. Interpretation of this discrepancy in terms of physical factors. Nature of the materials used.

The probability of the presence of other intellectual factors. The role of complication and the theoretical penetration that it requires.

Questions of reliability and errors reconsidered. Probability of individual errors and length of procedure. Typical lengths of procedure for computing machines and for living organisms--that is, for artificial and for natural automata. Upper limits on acceptable probability of error in individual operations. Compensation by checking and self-correcting features.

Differences of principle in the way in which errors are dealt with in artificial and in natural automata. The “single error” principle in artificial automata. Crudeness of our approach in this case, due to the lack of adequate theory. More sophisticated treatment of this problem in natural automata: The role of the autonomy of parts. Connections between this autonomy and evolution.

- 10^10 neurons in brain, 10^4 vacuum tubes in largest computer at time
- machines faster: ~5 ms from neuron potential to neuron potential, vs ~10^-3 ms for vacuum tubes
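A quick sanity check of the ratios implied by the two bullet points above (the 10^10/10^4 component counts and the 5 ms/10^-3 ms cycle times are from the notes; the arithmetic below is just a back-of-the-envelope restatement):

```python
# Figures taken from the notes above; this only computes the implied ratios.
neurons = 1e10          # neurons in the human brain (von Neumann's estimate)
tubes = 1e4             # vacuum tubes in the largest computer of the era
neuron_cycle_ms = 5.0   # neuron potential to neuron potential
tube_cycle_ms = 1e-3    # vacuum tube switching time

size_ratio = neurons / tubes                    # brain has ~10^6x more components
speed_ratio = neuron_cycle_ms / tube_cycle_ms   # tubes ~5000x faster per step

print(f"components: brain/machine = {size_ratio:.0e}")
print(f"speed: machine/brain = {speed_ratio:.0f}x")
```

So the natural automaton wins on componentry by about six orders of magnitude while the artificial one wins on switching speed by three and a half, which is exactly the discrepancy the lectures set out to interpret.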

pdf  article  papers  essay  nibble  math  cs  computation  bio  neuro  neuro-nitgrit  scale  magnitude  comparison  acm  von-neumann  giants  thermo  phys-energy  speed  performance  time  density  frequency  hardware  ems  efficiency  dirty-hands  street-fighting  fermi  estimate  retention  physics  interdisciplinary  multi  wiki  links  people  🔬  atoms  duplication  iteration-recursion  turing  complexity  measure  nature  technology  complex-systems  bits  information-theory  circuits  robust  structure  composition-decomposition  evolution  mutation  axioms  analogy  thinking  input-output  hi-order-bits  coding-theory  flexibility  rigidity  automata-languages 
april 2018 by nhaliday
Eternity in six hours: intergalactic spreading of intelligent life and sharpening the Fermi paradox
We do this by demonstrating that traveling between galaxies – indeed even launching a colonisation project for the entire reachable universe – is a relatively simple task for a star-spanning civilization, requiring modest amounts of energy and resources. We start by demonstrating that humanity itself could likely accomplish such a colonisation project in the foreseeable future, should we want to, and then demonstrate that there are millions of galaxies that could have reached us by now, using similar methods. This results in a considerable sharpening of the Fermi paradox.
pdf  study  article  essay  anthropic  fermi  space  expansionism  bostrom  ratty  philosophy  xenobio  ideas  threat-modeling  intricacy  time  civilization  🔬  futurism  questions  paradox  risk  physics  engineering  interdisciplinary  frontier  technology  volo-avolo  dirty-hands  ai  automation  robotics  duplication  iteration-recursion  von-neumann  data  scale  magnitude  skunkworks  the-world-is-just-atoms  hard-tech  ems  bio  bits  speedometer  nature  model-organism  mechanics  phys-energy  relativity  electromag  analysis  spock  nitty-gritty  spreading  hanson  street-fighting  speed  gedanken  nibble 
march 2018 by nhaliday
Definite optimism as human capital | Dan Wang
I’ve come to the view that creativity and innovative capacity aren’t a fixed stock, coiled and waiting to be released by policy. Now, I know that a country will not do well if it has poor infrastructure, interest rate management, tax and regulation levels, and a whole host of other issues. But getting them right isn’t sufficient to promote innovation; past a certain margin, when they’re all at rational levels, we ought to focus on promoting creativity and drive as a means to propel growth.


When I say “positive” vision, I don’t mean that people must see the future as a cheerful one. Instead, I’m saying that people ought to have a vision at all: A clear sense of how the technological future will be different from today. To have a positive vision, people must first expand their imaginations. And I submit that an interest in science fiction, the material world, and proximity to industry all help to refine that optimism. I mean to promote imagination by direct injection.


If a state has lost most of its jobs for electrical engineers, or nuclear engineers, or mechanical engineers, then fewer young people in that state will study those practices, and technological development in related fields slows down a little further. When I bring up these thoughts on resisting industrial decline to economists, I’m unsatisfied with their responses. They tend to respond by tautology (“By definition, outsourcing improves on the status quo”) or arithmetic (see: gains from comparative advantage, Ricardo). These kinds of logical exercises are not enough. I would like for more economists to consider a human capital perspective for preserving manufacturing expertise (to some degree).

I wonder if the so-called developed countries should be careful of their own premature deindustrialization. The US industrial base has faltered, but there is still so much left to build. Until we’ve perfected asteroid mining and super-skyscrapers and fusion rockets and Jupiter colonies and matter compilers, we can’t be satisfied with innovation confined mostly to the digital world.

Those who don’t mind the decline of manufacturing employment like to say that people have moved on to higher-value work. But I’m not sure that this is usually the case. Even if there’s an endlessly capacious service sector to absorb job losses in manufacturing, it’s often the case that these new jobs feature lower productivity growth and involve greater rent-seeking. Not everyone is becoming hedge fund managers and machine learning engineers. According to BLS, the bulk of service jobs are in 1. government (22 million), 2. professional services (19m), 3. healthcare (18m), 4. retail (15m), and 5. leisure and hospitality (15m). In addition to being often low-paying but still competitive, a great deal of service sector jobs tend to stress capacity for emotional labor over capacity for manual labor. And it’s the latter that tends to be more present in fields involving technological upgrading.


Here’s a bit more skepticism of service jobs. In an excellent essay on declining productivity growth, Adair Turner makes the point that many service jobs are essentially zero-sum. I’d like to emphasize and elaborate on that idea here.


Call me a romantic, but I’d like everyone to think more about industrial lubricants, gas turbines, thorium reactors, wire production, ball bearings, underwater cables, and all the things that power our material world. I abide by a strict rule never to post or tweet about current political stuff; instead I try to draw more attention to the world of materials. And I’d like to remind people that there are many things more edifying than following White House scandals.


First, we can all try to engage more actively with the material world, not merely the digital or natural world. Go ahead and pick an industrial phenomenon and learn more about it. Learn more about the history of aviation, and what it took to break the sound barrier; gaze at the container ships as they sail into port, and keep in mind that they carry 90 percent of the goods you see around you; read about what we mold plastics to do; meditate on the importance of steel in civilization; figure out what’s driving the decline in the cost of solar energy production, or how we draw electricity from nuclear fission, or what it takes to extract petroleum or natural gas from the ground.


Here’s one more point that I’d like to add on Girard at college: I wonder if to some extent current dynamics are the result of the liberal arts approach of “college teaches you how to think, not what to think.” I’ve never seen much data to support this wonderful claim that college is good at teaching critical thinking skills. Instead, students spend most of their energies focused on raising or lowering the status of the works they study or the people around them, giving rise to the Girardian terror that has gripped so many campuses.

College as an incubator of Girardian terror: http://danwang.co/college-girardian-terror/
It’s hard to construct a more perfect incubator for mimetic contagion than the American college campus. Most 18-year-olds are not super differentiated from each other. By construction, whatever distinctions any does have are usually earned through brutal, zero-sum competitions. These tournament-type distinctions include: SAT scores at or near perfection; being a top player on a sports team; gaining master status from chess matches; playing first instrument in state orchestra; earning high rankings in Math Olympiad; and so on, culminating in gaining admission to a particular college.

Once people enter college, they get socialized into group environments that usually continue to operate in zero-sum competitive dynamics. These include orchestras and sports teams; fraternities and sororities; and many types of clubs. The biggest source of mimetic pressures are the classes. Everyone starts out by taking the same intro classes; those seeking distinction throw themselves into the hardest classes, or seek tutelage from star professors, and try to earn the highest grades.

Mimesis Machines and Millennials: http://quillette.com/2017/11/02/mimesis-machines-millennials/
In 1956, a young Liverpudlian named John Winston Lennon heard the mournful notes of Elvis Presley’s Heartbreak Hotel, and was transformed. He would later recall, “nothing really affected me until I heard Elvis. If there hadn’t been an Elvis, there wouldn’t have been the Beatles.” It is an ancient human story. An inspiring model, an inspired imitator, and a changed world.

Mimesis is the phenomenon of human mimicry. Humans see, and they strive to become what they see. The prolific Franco-Californian philosopher René Girard described the human hunger for imitation as mimetic desire. According to Girard, mimetic desire is a mighty psychosocial force that drives human behavior. When attempted imitation fails, (i.e. I want, but fail, to imitate my colleague’s promotion to VP of Business Development), mimetic rivalry arises. According to mimetic theory, periodic scapegoating—the ritualistic expelling of a member of the community—evolved as a way for archaic societies to diffuse rivalries and maintain the general peace.

As civilization matured, social institutions evolved to prevent conflict. To Girard, sacrificial religious ceremonies first arose as imitations of earlier scapegoating rituals. From the mimetic worldview, healthy social institutions perform two primary functions:

They satisfy mimetic desire and reduce mimetic rivalry by allowing imitation to take place.
They thereby reduce the need to diffuse mimetic rivalry through scapegoating.
Tranquil societies possess and value institutions that are mimesis tolerant. These institutions, such as religion and family, are Mimesis Machines. They enable millions to see, imitate, and become new versions of themselves. Mimesis Machines satiate the primal desire for imitation, and produce happy, contented people. Through Mimesis Machines, Elvis fans can become Beatles.

Volatile societies, on the other hand, possess and value mimesis resistant institutions that frustrate attempts at mimicry, and mass produce frustrated, resentful people. These institutions, such as capitalism and beauty hierarchies, are Mimesis Shredders. They stratify humanity, and block the ‘nots’ from imitating the ‘haves’.
techtariat  venture  commentary  reflection  innovation  definite-planning  thiel  barons  economics  growth-econ  optimism  creative  malaise  stagnation  higher-ed  status  error  the-world-is-just-atoms  heavy-industry  sv  zero-positive-sum  japan  flexibility  china  outcome-risk  uncertainty  long-short-run  debt  trump  entrepreneurialism  human-capital  flux-stasis  cjones-like  scifi-fantasy  labor  dirty-hands  engineering  usa  frontier  speedometer  rent-seeking  econ-productivity  government  healthcare  essay  rhetoric  contrarianism  nascent-state  unintended-consequences  volo-avolo  vitality  technology  tech  cs  cycles  energy-resources  biophysical-econ  trends  zeitgeist  rot  alt-inst  proposal  multi  news  org:mag  org:popup  philosophy  big-peeps  speculation  concept  religion  christianity  theos  buddhism  politics  polarization  identity-politics  egalitarianism-hierarchy  inequality  duplication  society  anthropology  culture-war  westminster  info-dynamics  tribalism  institutions  envy  age-generation  letters  noble-lie 
october 2017 by nhaliday
Gimbal lock - Wikipedia
Gimbal lock is the loss of one degree of freedom in a three-dimensional, three-gimbal mechanism that occurs when the axes of two of the three gimbals are driven into a parallel configuration, "locking" the system into rotation in a degenerate two-dimensional space.

The word lock is misleading: no gimbal is restrained. All three gimbals can still rotate freely about their respective axes of suspension. Nevertheless, because of the parallel orientation of two of the gimbals' axes there is no gimbal available to accommodate rotation along one axis.

Now this is where most people stop thinking about the issue and move on with their lives. They just conclude that Euler angles are somehow broken. This is also where a lot of misunderstandings happen, so it’s worth investigating the matter a little beyond just what causes gimbal lock.

It is important to understand that this is only problematic if you interpolate in Euler angles! In a real physical gimbal this is a given – you have no other choice. In computer graphics you have many other choices: normalized matrix, axis-angle or quaternion interpolation. Gimbal lock has a much more dramatic implication for designing control systems than it has for 3D graphics. Which is why a mechanical engineer, for example, will have a very different take on gimbal locking.

You don't have to give up using Euler angles to get rid of gimbal locking, just stop interpolating values in Euler angles. Of course, this means that you can now no longer drive a rotation by doing direct manipulation of one of the channels. But as long as you key the 3 angles simultaneously you have no problems and you can internally convert your interpolation target to something that has less problems.

Using Euler angles is simply more intuitive in most cases. And indeed Euler never claimed they were good for interpolating, just that they can model all possible spatial orientations. So Euler angles are just fine for setting orientations, as they were meant to do. Incidentally, Euler angles also have the benefit of being able to model multi-turn rotations, which will not happen sanely for the other representations.
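The degeneracy described above can be shown numerically. This is a minimal sketch, not from the thread itself: it assumes a ZYX (yaw-pitch-roll) convention, which is my choice, and shows that at pitch = 90° the composed rotation depends only on the difference yaw − roll, so two of the three angles have collapsed into one degree of freedom:

```python
import math

def rx(a):  # rotation about x (roll)
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def ry(a):  # rotation about y (pitch)
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rz(a):  # rotation about z (yaw)
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def euler_zyx(yaw, pitch, roll):
    # R = Rz(yaw) @ Ry(pitch) @ Rx(roll)
    return matmul(rz(yaw), matmul(ry(pitch), rx(roll)))

deg = math.radians
# Two different (yaw, roll) pairs with the same yaw - roll, both at pitch = 90:
a = euler_zyx(deg(30), deg(90), deg(10))
b = euler_zyx(deg(50), deg(90), deg(30))
# The resulting matrices coincide: yaw and roll now turn about the same axis.
same = all(abs(a[i][j] - b[i][j]) < 1e-12 for i in range(3) for j in range(3))
print(same)
```

Interpolating the three channels independently near pitch = 90° therefore wanders through redundant orientations, which is why the answer recommends converting the interpolation target to matrices, axis-angle or quaternions while still keying in Euler angles.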
nibble  dirty-hands  physics  mechanics  robotics  degrees-of-freedom  measurement  gotchas  volo-avolo  duplication  wiki  reference  multi  q-n-a  stackex  graphics  spatial  direction  dimensionality  sky 
september 2017 by nhaliday
Defection – quas lacrimas peperere minoribus nostris!

Kindness Against The Grain: https://srconstantin.wordpress.com/2017/06/08/kindness-against-the-grain/
I’ve heard from a number of secular-ish sources (Carse, Girard, Arendt) that the essential contribution of Christianity to human thought is the concept of forgiveness. (Ribbonfarm also has a recent post on the topic of forgiveness.)

I have never been a Christian and haven’t even read all of the New Testament, so I’ll leave it to commenters to recommend Christian sources on the topic.

What I want to explore is the notion of kindness without a smooth incentive gradient.

The Social Module: https://bloodyshovel.wordpress.com/2015/10/09/the-social-module/
Now one could propose that the basic principle of human behavior is to raise the SP number. Sure there’s survival and reproduction. Most people would forget all their socialization if left hungry and thirsty for days in the jungle. But more often than not, survival and reproduction depend on being high status; having a good name among your peers is the best way to get food, housing and hot mates.

The way to raise one’s SP number depends on thousands of different factors. We could grab most of them and call them “culture”. In China having 20 teenage mistresses as an old man raises your SP; in Western polite society it is social death. In the West making a fuss about disobeying one’s parents raises your SP, everywhere else it lowers it a great deal. People know that; which is why bureaucrats in China go to great lengths to acquire a stash of young women (who they seldom have time to actually enjoy), while teenagers in the West go to great lengths to be annoying to their parents for no good reason.


It thus shouldn’t surprise us that something as completely absurd as Progressivism is the law of the land in most of the world today, even though it denies obvious reality. It is not the case that most people know that progressive points are all bogus, but obey because of fear or cowardice. No, an average human brain has many more neurons being used to scan the social climate and see how SP are allotted, than neurons being used to analyze patterns in reality to ascertain the truth. Surely your brain does care a great deal about truth in some very narrow areas of concern to you. Remember Conquest’s first law: Everybody is Conservative about what he knows best. You have to know the truth about what you do, if you are to do it effectively.

But you don’t really care about truth anywhere else. And why would you? It takes time and effort you can’t really spare, and it’s not really necessary. As long as you have some area of specialization where you can make a living, all the rest you must do to achieve survival and reproduction is to raise your SP so you don’t get killed and your guts sacrificed to the mountain spirits.

SP theory (I accept suggestions for a better name) can also explains the behavior of leftists. Many conservatives of a medium level of enlightenment point out the paradox that leftists historically have held completely different ideas. Leftism used to be about the livelihood of industrial workers, now they agitate about the environment, or feminism, or foreigners. Some people would say that’s just historical change, or pull a No True Scotsman about this or that group not being really leftists. But that’s transparent bullshit; very often we see a single person shifting from agitating about Communism and worker rights, to agitate about global warming or rape culture.


The leftist strategy could be defined as “psychopathic SP maximization”. Leftists attempt to destroy social equilibrium so that they can raise their SP number. If humans are, in a sense, programmed to constantly raise their status, well, high status people by definition can’t raise it anymore (though they can squabble against each other for marginal gains), so their best strategy is to freeze society in place so that they can enjoy their superiority. High status people by definition have power, and thus social hierarchy during human history tends to be quite stable.

This goes against the interests of many. First of all the lower status people, who, well, want to raise their status, but can’t manage to do so. And it also goes against the interests of the particularly annoying members of the upper class who want to raise their status on the margin. Conservative people can be defined as those who, no matter the absolute level, are in general happy with it. This doesn’t mean they don’t want higher status (by definition all humans do), but the output of other brain modules may conclude that attempts to raise SP might threaten one’s survival and reproduction; or just that the chances of raising one’s individual SP are hopeless, so one might as well stay put.


You can’t blame people for being logically inconsistent; because they can’t possibly know anything about all these issues. Few have any experience or knowledge about evolution and human races, or about the history of black people to make an informed judgment on HBD. Few have time to learn about sex differences, and stuff like the climate is as close to unknowable as there is. Opinions about anything but a very narrow area of expertise are always output of your SP module, not any judgment of fact. People don’t know the facts. And even when they know; I mean most people have enough experience with sex differences and black dysfunction to be quite confident that progressive ideas are false. But you can never be sure. As Hume said, the laws of physics are a judgment of habit; who is to say that a genie isn’t going to change all you know the next morning? At any rate, you’re always better off toeing the line, following the conventional wisdom, and keeping your dear SP. Perhaps you can even raise them a bit. And that is very nice. It is niceness itself.

Leftism is just an easy excuse: https://bloodyshovel.wordpress.com/2015/03/01/leftism-is-just-an-easy-excuse/
Unless you’re not the only defector. You need a way to signal your intention to defect, so that other disloyal fucks such as yourself (and there are bound to be others) can join up, thus reducing the likely costs of defection. The way to signal your intention to defect is to come up with a good excuse. A good excuse to be disloyal becomes a rallying point through which other defectors can coordinate and cover their asses so that the ruling coalition doesn’t punish them. What is a good excuse?

Leftism is a great excuse. Claiming that the ruling coalition isn’t leftist enough, isn’t holy enough, not inclusive enough of women, of blacks, of gays, or gorillas, of pedophiles, of murderous Salafists, is the perfect way of signalling your disloyalty towards the existing power coalition. By using the existing ideology and pushing its logic just a little bit, you ensure that the powerful can’t punish you. At least not openly. And if you’re lucky, the mass of disloyal fucks in the ruling coalition might join your banner, and use your exact leftist point to jump ship and outflank the powerful.


The same dynamic fuels the flattery inflation one sees in monarchical or dictatorial systems. In Mao’s China, if you wanted to defect, you claimed to love Mao more than your boss did. In Nazi Germany, you proclaimed your love for Hitler and the great insight of his plan to take Stalingrad. In the Roman Empire, you claimed that Caesar was a god, son of Hercules, and that those who denied it were treacherous bastards. In ancient Persia you loudly proclaimed your faith in the Shah being the brother of the Sun and the Moon and King of all Kings on Earth. In Reformation Europe you proclaimed that you had discovered something new in the Bible and everybody else was damned to hell. Predestined by God!


And again: the precise content of the ideological point doesn’t matter. Your human brain doesn’t care about ideology. Humans didn’t evolve to care about Marxist theory of class struggle, or about LGBTQWERTY theories of social identity. You just don’t know what it means. It’s all abstract points you’ve been told in a classroom. It doesn’t actually compute. Nothing that anybody ever said in a political debate ever made any actual, concrete sense to a human being.

So why do we care so much about politics? What’s the point of ideology? Ideology is just the water you swim in. It is a structured database of excuses, to be used to signal your allegiance or defection to the existing ruling coalition. Ideology is just the feed of the rationalization Hamster that runs incessantly in that corner of your brain. But it is immaterial, and in most cases actually inaccessible to the logical modules in your brain.

Nobody ever acts on their overt ideological claims if they can get away with it. Liberals proclaim their faith in the potential of black children while clustering in all white suburbs. Communist party members loudly talk about the proletariat while being hedonistic spenders. Al Gore talks about Global Warming while living in a lavish mansion. Cognitive dissonance, you say? No; those cognitive systems are not connected in the first place.


And so, every little step of the way, power-seekers moved the consensus to the left. And open societies and democratic systems are, by their decentralized nature and by the size of their constituencies, much more vulnerable to this sort of signalling attack. It is all but impossible to appraise and enforce the loyalty of every single individual involved in a modern state. There are too many of them. A medieval king had a better chance of it; hence the slow pace of ideological innovation in those days. But the bigger the organization, the harder it is to gather accurate information about the loyalty of the whole coalition; and hence the ideological movement accelerates. And there is no stopping it.

Like the Ancients, We Have Gods. They’ll Get Greater: http://www.overcomingbias.com/2018/04/like-the-ancients-we-have-gods-they-may-get… [more]
gnon  commentary  critique  politics  polisci  strategy  tactics  thinking  GT-101  game-theory  cooperate-defect  hypocrisy  institutions  incentives  anthropology  morality  ethics  formal-values  ideology  schelling  equilibrium  multi  links  debate  ethnocentrism  cultural-dynamics  decision-making  socs-and-mops  anomie  power  info-dynamics  propaganda  signaling  axelrod  organizing  impetus  democracy  antidemos  duty  coalitions  kinship  religion  christianity  theos  n-factor  trust  altruism  noble-lie  japan  asia  cohesion  reason  scitariat  status  fashun  history  mostly-modern  world-war  west-hunter  sulla  unintended-consequences  iron-age  china  sinosphere  stories  leviathan  criminal-justice  peace-violence  nihil  wiki  authoritarianism  egalitarianism-hierarchy  cocktail  ssc  parable  open-closed  death  absolute-relative  justice  management  explanans  the-great-west-whale  occident  orient  courage  vitality  domestication  revolution  europe  pop-diff  alien-character  diversity  identity-politics  westminster  kumbaya-kult  cultu 
june 2017 by nhaliday
10 million DTC dense marker genotypes by end of 2017? – Gene Expression
Ultimately I do wonder if I was a bit too optimistic that 50% of the US population will be sequenced at 30x by 2025. But the dynamic is quite likely to change rapidly because of a technological shift as the sector goes through a productivity uptick. We’re talking about exponential growth, which humans have weak intuition about….
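The post's point about weak intuition for exponential growth can be made concrete with a toy calculation. The 10 million starting figure comes from the title; the 2-year doubling time is a purely hypothetical assumption for illustration:

```python
import math

def years_to_reach(start, target, doubling_time_years):
    """Years for an exponentially growing count to go from start to target."""
    return doubling_time_years * math.log2(target / start)

# Hypothetical: 10M genotyped at end of 2017, count doubling every 2 years.
# 50% of the US population is roughly 165M people.
print(round(years_to_reach(10e6, 165e6, 2.0), 1))  # ~8.1 years
```

Under these made-up parameters the 2025 target looks reachable, which is the counterintuitive part: a 16x shortfall is only four doublings.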
gnxp  scitariat  commentary  biotech  scaling-up  genetics  genomics  scale  bioinformatics  multi  toys  measurement  duplication  signal-noise  coding-theory 
june 2017 by nhaliday
spaceships - Can there be a space age without petroleum (crude oil)? - Worldbuilding Stack Exchange

What was really important to our development of technology was not oil, but coal. Access to large deposits of high-quality coal largely fueled the industrial revolution, and it was the industrial revolution that really got us on the first rungs of the technological ladder.

Oil is a fantastic fuel for an advanced civilisation, but it's not essential. Indeed, I would argue that our ability to dig oil out of the ground is a crutch, one that we should have discarded long ago. The reason oil is so essential to us today is that all our infrastructure is based on it, but if we'd never had oil we could still have built a similar infrastructure. Solar power was first displayed to the public in 1878. Wind power has been used for centuries. Hydroelectric power is just a modification of the same technology as wind power.

Without oil, a civilisation in the industrial age would certainly be able to progress and advance to the space age. Perhaps not as quickly as we did, but probably more sustainably.

Without coal, though... that's another matter.

What would the industrial age be like without oil and coal?: https://worldbuilding.stackexchange.com/questions/45919/what-would-the-industrial-age-be-like-without-oil-and-coal

Out of the ashes: https://aeon.co/essays/could-we-reboot-a-modern-civilisation-without-fossil-fuels
It took a lot of fossil fuels to forge our industrial world. Now they're almost gone. Could we do it again without them?

But charcoal-based industry didn’t die out altogether. In fact, it survived to flourish in Brazil. Because it has substantial iron deposits but few coalmines, Brazil is the largest charcoal producer in the world and the ninth biggest steel producer. We aren’t talking about a cottage industry here, and this makes Brazil a very encouraging example for our thought experiment.

The trees used in Brazil’s charcoal industry are mainly fast-growing eucalyptus, cultivated specifically for the purpose. The traditional method for creating charcoal is to pile chopped staves of air-dried timber into a great dome-shaped mound and then cover it with turf or soil to restrict airflow as the wood smoulders. The Brazilian enterprise has scaled up this traditional craft to an industrial operation. Dried timber is stacked into squat, cylindrical kilns, built of brick or masonry and arranged in long lines so that they can be easily filled and unloaded in sequence. The largest sites can sport hundreds of such kilns. Once filled, their entrances are sealed and a fire is lit from the top.
q-n-a  stackex  curiosity  gedanken  biophysical-econ  energy-resources  long-short-run  technology  civilization  industrial-revolution  heavy-industry  multi  modernity  frontier  allodium  the-world-is-just-atoms  big-picture  ideas  risk  volo-avolo  news  org:mag  org:popup  direct-indirect  retrofit  dirty-hands  threat-modeling  duplication  iteration-recursion  latin-america  track-record  trivia  cocktail  data 
june 2017 by nhaliday
One more time | West Hunter
One of our local error sources suggested that it would be impossible to rebuild technical civilization, once fallen. Now if every human were dead I’d agree, but in most other scenarios it wouldn’t be particularly difficult, assuming that the survivors were no more silly and fractious than people are today.  So assume a mild disaster, something like the effect of myxomatosis on the rabbits of Australia, or perhaps toe-to-toe nuclear combat with the Russkis – ~90%  casualties worldwide.

Books are everywhere. In the type of scenario I sketched out, almost no knowledge would be lost – so Neolithic tech is irrelevant. Look, if a single copy of the 1911 Britannica survived, all would be well.

You could of course harvest metals from the old cities. But even if you didn’t, the idea that there is no more copper or zinc or tin in the ground is just silly. “Recoverable ore” is mostly an economic concept.

Moreover, if we’re talking wiring and electrical uses, one can use aluminum, which makes up 8% of the Earth’s crust.

Some of those books tell you how to win.

Look, assume that some communities strive to relearn how to make automatic weapons and some don’t. How does that story end? Do I have to explain everything?

I guess so!

Well, perhaps having a zillion times more books around would make a difference. That and all the “X for Dummies” books, which I think the Romans didn’t have.

A lot of Classical civ wasn’t very useful: on the whole they didn’t invent much. On the whole, technology advanced quite a bit more rapidly in Medieval times.

How much coal and oil are in the ground that can still be extracted with 19th century tech? Honest question; I don’t know.
Lots of coal left. Not so much oil (using simple methods), but one could make it from low-grade coal, with the Fischer-Tropsch process. Sasol does this.

Then again, a recovering society wouldn’t need much at first.

reply to: https://westhunt.wordpress.com/2015/05/17/one-more-time/#comment-69220
That’s more like it.

#1. Consider Grand Coulee Dam. Gigawatts. Feeling of power!
#2. Of course.
#3. Might be easier to make superconducting logic circuits with MgB2, starting over.

Your typical biker guy is more mechanically minded than the average Joe. Welding, electrical stuff, this and that.

If fossil fuels were unavailable -or just uneconomical at first- we’d be back to charcoal for our Stanley Steamers and railroads. We’d still have both.

The French, and others, used wood-gasifier trucks during WWII.

Teslas are of course a joke.
west-hunter  scitariat  civilization  risk  nihil  gedanken  frontier  allodium  technology  energy-resources  knowledge  the-world-is-just-atoms  discussion  speculation  analysis  biophysical-econ  big-picture  🔬  ideas  multi  history  iron-age  the-classics  medieval  europe  poast  the-great-west-whale  the-trenches  optimism  volo-avolo  mostly-modern  world-war  gallic  track-record  musk  barons  transportation  driving  contrarianism  agriculture  retrofit  industrial-revolution  dirty-hands  books  competition  war  group-selection  comparison  mediterranean  conquest-empire  gibbon  speedometer  class  threat-modeling  duplication  iteration-recursion  trivia  cocktail  encyclopedic  definite-planning  embodied  gnosis-logos  kumbaya-kult 
may 2017 by nhaliday
Low-Hanging Fruit: Consider the Ant | West Hunter
Which ought to be a reminder that biomimetics is a useful approach to invention:  If you can’t think of anything yourself, steal from the products of evolution.  It’s like an an Edisonian approach, only on steroids.

Along those lines, it is well known, to about 0.1% of the population, that some ants have agriculture. Some protect and herd aphids: others gather leaves as the feedstock for an edible fungus. Those leaf-cutting ants also carry symbiotic fungicide-producing  bacteria that protect against weed fungi [ herbicides invented well before atrazine or 2-4D]  Speaking of, if you really, really want to cause trouble, introduce leaf-cutting ants to Africa.
west-hunter  scitariat  discussion  proposal  low-hanging  innovation  bio  nature  agriculture  technology  ideas  discovery  the-trenches  alt-inst  science  model-organism  track-record  judgement  duplication  analogy 
april 2017 by nhaliday
Reconstruction | West Hunter
Since power descended through the male line, you don’t expect to see the same thing happen with autosomal genes. Genghis accounts for about 25% of Mongolia’s Y-chromosomes, but the general ancestry fraction attributable to him must be a lot lower. Still, what if the average Mongol today is 0.5% Genghis? Upon sequencing lots of typical contemporary Mongols, you would notice certain chromosomal segments showing up again and again: not just in one family but in the whole country, and in other parts of inner Asia as well. If you started keeping track of those segments, you would eventually be able to make a partial reconstruction of Genghis’s genome. It would be incomplete, since any given region of the genome might have missed being transmitted to any of his four legitimate sons (Jochi, Chagatai, Ogedei, and Tolui). They certainly didn’t carry his X-chromosome. You might be able to distinguish the autosomal genes of Genghis and his wife Borte by looking at descendants of his by-blows, if you could find them. Still, even if you managed to retrieve 75% of his genome, that’s not enough to make a clone. It would, however, allow sure identification if we found his tomb.

And since he’s likely buried in permafrost, his DNA could be in good shape. Then we could clone him (assuming reasonable continuing progress in genetics) and of course some damn fool would. Will.
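The reconstruction procedure sketched above — sequence many present-day individuals and keep the segments that recur far more often than any ordinary family haplotype would — can be illustrated with a toy sketch. The data, segment representation, and threshold here are all hypothetical, not anything from the post:

```python
from collections import Counter

def reconstruct_shared_segments(genomes, min_carriers):
    """Toy segment tally: each genome is a set of (chromosome, start, end,
    allele) tuples; keep any segment carried by at least min_carriers
    individuals, i.e. recurring across unrelated samples."""
    counts = Counter(seg for g in genomes for seg in g)
    return {seg for seg, n in counts.items() if n >= min_carriers}

# Hypothetical data: three "Mongols" sharing one ancestral segment.
g1 = {("chr3", 100, 200, "A"), ("chr7", 50, 80, "T")}
g2 = {("chr3", 100, 200, "A"), ("chr1", 10, 30, "G")}
g3 = {("chr3", 100, 200, "A")}
print(reconstruct_shared_segments([g1, g2, g3], min_carriers=3))
# {('chr3', 100, 200, 'A')}
```

A real analysis would work with identity-by-descent segments and population-level expected frequencies rather than exact tuple matches, but the counting logic is the same.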
west-hunter  scitariat  speculation  discussion  proposal  genetics  genomics  sapiens  asia  aDNA  wild-ideas  history  medieval  ideas  archaeology  conquest-empire  search  duplication  gavisti  traces 
april 2017 by nhaliday
Mark Zuckerberg: Building Global Community | Hacker News
The view of human nature implied by these ideas is pretty dark. If all people want to do is go and look at other people so that they can compare themselves to them and copy what they want – if that is the final, deepest truth about humanity and its motivations – then Facebook doesn’t really have to take too much trouble over humanity’s welfare, since all the bad things that happen to us are things we are doing to ourselves. For all the corporate uplift of its mission statement, Facebook is a company whose essential premise is misanthropic. It is perhaps for that reason that Facebook, more than any other company of its size, has a thread of malignity running through its story. The high-profile, tabloid version of this has come in the form of incidents such as the live-streaming of rapes, suicides, murders and cop-killings. But this is one of the areas where Facebook seems to me relatively blameless. People live-stream these terrible things over the site because it has the biggest audience; if Snapchat or Periscope were bigger, they’d be doing it there instead.
This isn't about whether 'dangerous' speech should be suppressed, but whether to validate the tech industry's self-conception as educators of man.
hn  commentary  facebook  barons  internet  community  society  civic  diversity  media  multi  news  org:mag  org:biz  rhetoric  privacy  civil-liberty  org:med  announcement  technocracy  managerial-state  universalism-particularism  nationalism-globalism  vampire-squid  kumbaya-kult  org:rec  org:anglo  letters  books  review  critique  rant  backup  twitter  social  discussion  gnon  🐸  envy  thiel  duplication  utopia-dystopia 
february 2017 by nhaliday
What Chinese corner-cutting reveals about modernity | Aeon Essays
Your balcony fell off? Chabuduo. Vaccines are overheated? Chabuduo. How China became the land of disastrous corner-cutting

The copy is the original: https://aeon.co/essays/why-in-china-and-japan-a-copy-is-just-as-good-as-an-original
In China and Japan, temples may be rebuilt and ancient warriors cast again. There is nothing sacred about the 'original'.
news  org:mag  culture  china  business  institutions  asia  analytical-holistic  sinosphere  org:popup  n-factor  approximation  heavy-industry  speedometer  dirty-hands  quality  tightness  discipline  multi  japan  pop-diff  cultural-dynamics  innovation  creative  explanans  values  duplication  sanctity-degradation  europe  orient  occident  the-great-west-whale  religion  christianity  buddhism  morality  ethics  cycles  forms-instances  apollonian-dionysian  being-becoming  essence-existence 
december 2016 by nhaliday
- quantum supremacy [Scott Aaronson]
- gene drive
- gene editing/CRISPR
- carcinogen may be entropy
- differentiable programming
- quantitative biology
- antisocial punishment of pro-social cooperators
- "strongest prejudice" (politics) [Haidt]
- Europeans' origins [Cochran]
- "Anthropic Capitalism And The New Gimmick Economy" [Eric Weinstein]

There's an underdiscussed contradiction between the idea that our society would make almost all knowledge available freely and instantaneously to almost everyone and the idea that almost everyone would find gainful employment as knowledge workers. Value is in scarcity, not abundance.
You’d need to turn reputation-based systems into an income stream.
technology  discussion  trends  gavisti  west-hunter  aaronson  haidt  list  expert  science  biotech  geoengineering  top-n  org:edge  frontier  multi  CRISPR  2016  big-picture  links  the-world-is-just-atoms  quantum  quantum-info  computation  metameta  🔬  scitariat  q-n-a  zeitgeist  speedometer  cancer  random  epidemiology  mutation  GT-101  cooperate-defect  cultural-dynamics  anthropology  expert-experience  tcs  volo-avolo  questions  thiel  capitalism  labor  supply-demand  internet  tech  economics  broad-econ  prediction  automation  realness  gnosis-logos  iteration-recursion  similarity  uniqueness  homo-hetero  education  duplication  creative  software  programming  degrees-of-freedom  futurism  order-disorder  flux-stasis  public-goodish  markets  market-failure  piracy  property-rights  free-riding  twitter  social  backup  ratty  unaffiliated  gnon  contradiction  career  planning  hmm  idk  knowledge  higher-ed  pro-rata  sociality  reinforcement  tribalism  us-them  politics  coalitions  prejudice  altruism  human-capital  engineering  unintended-consequences 
november 2016 by nhaliday
What’s the catch? | West Hunter
Neanderthals and the Wrath of Khan

if someone were to try to create a Neanderthal a few years from now, starting with ancient DNA, they’d have to worry a lot about data errors, because such errors would translate into mutations, which might be harmful or even lethal. Assume that we have figured out how to get the gene expression right, have all the proper methylation etc.: we have modern humans as a template, and you know there isn’t that much difference.

They might try consensus averaging – take three high-quality Neanderthal genomes and make your synthetic genome by majority rule: we ignore a nucleotide change in one genome if it’s not there in the other two. ‘tell me three times’, a simple form of error-correcting code.

But doing this would cause a problem. Can you see what the problem is?
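The 'tell me three times' scheme is ordinary per-site majority voting; a minimal sketch, with hypothetical toy sequences standing in for genomes:

```python
from collections import Counter

def majority_consensus(seqs):
    """Per-position majority vote across equal-length aligned sequences."""
    assert len({len(s) for s in seqs}) == 1, "sequences must be equal length"
    return "".join(Counter(col).most_common(1)[0][0] for col in zip(*seqs))

# An isolated error ('G' in the third copy) is outvoted by the other two.
print(majority_consensus(["ACGT", "ACGT", "AGGT"]))  # ACGT
```

Note that this corrects a change only where it appears in just one of the three copies; what voting does at sites where the copies genuinely differ is exactly what the question above is probing.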
west-hunter  sapiens  speculation  enhancement  archaics  discussion  genetics  genetic-load  🌞  gedanken  unintended-consequences  cocktail  error  aDNA  signal-noise  coding-theory  scitariat  wild-ideas  ideas  archaeology  perturbation  iteration-recursion  duplication  forms-instances  traces 
november 2016 by nhaliday
Reproducing bugs is awful. You get an issue like “Problem with Sidebar” that vaguely describes some odd behavior. Now you must somehow reproduce it exactly. Was it the specific timing of events? Was it bad data from the server? Was it specific to a certain user? Was it a recently updated dependency? As you slog through all these possibilities, the most annoying thing is that the person who opened the bug report already had all this information! In an ideal world, you could just replay their exact session.

Elm 0.18 lets you do exactly that! In debug mode, Elm lets you import and export the exact sequence of events from a program. You get all the information necessary to reproduce the session exactly, from mouse clicks to HTTP requests.
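Elm's session exporter is language-specific, but the record-and-replay idea it rests on — serialize every input event, then feed the same sequence back through a pure update function — can be sketched in Python. All names here are illustrative, not Elm's actual API:

```python
import json

def update(state, event):
    """A pure update function: the new state depends only on (state, event)."""
    if event["type"] == "click":
        return {**state, "clicks": state.get("clicks", 0) + 1}
    return state

def record(events, path):
    with open(path, "w") as f:
        json.dump(events, f)  # the "exported session"

def replay(path, initial_state):
    with open(path) as f:
        events = json.load(f)
    state = initial_state
    for e in events:  # deterministic: same events in, same states out
        state = update(state, e)
    return state

record([{"type": "click"}, {"type": "click"}], "session.json")
print(replay("session.json", {}))  # {'clicks': 2}
```

The purity of `update` is what makes this work: if any step read a clock or made its own network call, replaying the event log would no longer reproduce the session.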
worrydream  functional  pls  announcement  debugging  frontend  web  javascript  time  traces  sequential  roots  explanans  replication  duplication  live-coding  state  direction 
november 2016 by nhaliday
Overcoming Bias : Two Kinds Of Status
prestige and dominance

More here. I was skeptical at first, but now am convinced: humans see two kinds of status, and approve of prestige-status much more than domination-status. I’ll have much more to say about this in the coming days, but it is far from clear to me that prestige-status is as much better than domination-status as people seem to think. Efforts to achieve prestige-status also have serious negative side-effects.

Two Ways to the Top: Evidence That Dominance and Prestige Are Distinct Yet Viable Avenues to Social Rank and Influence: https://henrich.fas.harvard.edu/files/henrich/files/cheng_et_al_2013.pdf
Dominance (the use of force and intimidation to induce fear) and Prestige (the sharing of expertise or know-how to gain respect)


According to the model, Dominance initially arose in evolutionary history as a result of agonistic contests for material resources and mates that were common among nonhuman species, but continues to exist in contemporary human societies, largely in the form of psychological intimidation, coercion, and wielded control over costs and benefits (e.g., access to resources, mates, and well-being). In both humans and nonhumans, Dominance hierarchies are thought to emerge to help maintain patterns of submission directed from subordinates to Dominants, thereby minimizing agonistic battles and incurred costs.

In contrast, Prestige is likely unique to humans, because it is thought to have emerged from selection pressures to preferentially attend to and acquire cultural knowledge from highly skilled or successful others, a capacity considered to be less developed in other animals (Boyd & Richerson, 1985; Laland & Galef, 2009). In this view, social learning (i.e., copying others) evolved in humans as a low-cost fitness-maximizing, information-gathering mechanism (Boyd & Richerson, 1985). Once it became adaptive to copy skilled others, a preference for social models with better than average information would have emerged. This would promote competition for access to the highest quality models, and deference toward these models in exchange for copying and learning opportunities. Consequently, selection likely favored Prestige differentiation, with individuals possessing high-quality information or skills elevated to the top of the hierarchy. Meanwhile, other individuals may reach the highest ranks of their group’s hierarchy by wielding threat of force, regardless of the quality of their knowledge or skills. Thus, Dominance and Prestige can be thought of as coexisting avenues to attaining rank and influence within social groups, despite being underpinned by distinct motivations and behavioral patterns, and resulting in distinct patterns of imitation and deference from subordinates.

Importantly, both Dominance and Prestige are best conceptualized as cognitive and behavioral strategies (i.e., suites of subjective feelings, cognitions, motivations, and behavioral patterns that together produce certain outcomes) deployed in certain situations, and can be used (with more or less success) by any individual within a group. They are not types of individuals, or even, necessarily, traits within individuals. Instead, we assume that all situated dyadic relationships contain differential degrees of both Dominance and Prestige, such that each person is simultaneously Dominant and Prestigious to some extent, to some other individual. Thus, it is possible that a high degree of Dominance and a high degree of Prestige may be found within the same individual, and may depend on who is doing the judging. For example, by controlling students’ access to rewards and punishments, school teachers may exert Dominance in their relationships with some students, but simultaneously enjoy Prestige with others, if they are respected and deferred to for their competence and wisdom. Indeed, previous studies have shown that based on both self- and peer ratings, Dominance and Prestige are largely independent (mean r = -.03; Cheng et al., 2010).

Status Hypocrisy: https://www.overcomingbias.com/2017/01/status-hypocrisy.html
Today we tend to say that our leaders have prestige, while their leaders have dominance. That is, their leaders hold power via personal connections and the threat and practice of violence, bribes, sex, gossip, and conformity pressures. Our leaders, instead, mainly just have whatever abilities follow from our deepest respect and admiration regarding their wisdom and efforts on serious topics that matter for us all. Their leaders more seek power, while ours more have leadership thrust upon them. Because of this us/them split, we tend to try to use persuasion on us, but force on them, when seeking to change behaviors.


Clearly, while there is some fact of the matter about how much a person gains their status via licit or illicit means, there is also a lot of impression management going on. We like to give others the impression that we personally mainly want prestige in ourselves and our associates, and that we only grant others status via the prestige they have earned. But let me suggest that, compared to this ideal, we actually want more dominance in ourselves and our associates than we like to admit, and we submit more often to dominance.

Cads, Dads, Doms: https://www.overcomingbias.com/2010/07/cads-dads-doms.html
"The proper dichotomy is not “virile vs. wimpy” as has been supposed, but “exciting vs. drab,” with the former having the two distinct sub-groups “macho man vs. pretty boy.” Another way to see that this is the right dichotomy is to look around the world: wherever girls really dig macho men, they also dig the peacocky musician type too, finding safe guys a bit boring. And conversely, where devoted dads do the best, it’s more difficult for macho men or in-town-for-a-day rockstars to make out like bandits. …

Whatever it is about high-pathogen-load areas that selects for greater polygynous behavior … will result in an increase in both gorilla-like and peacock-like males, since they’re two viable ways to pursue a polygynous mating strategy."

This fits with there being two kinds of status: dominance and prestige. Macho men, such as CEOs and athletes, have dominance, while musicians and artists have prestige. But women seek both short and long term mates. Since both kinds of status suggest good genes, both attract women seeking short term mates. This happens more when women are younger and richer, and when there is more disease. Foragers pretend they don’t respect dominance as much as they do, so prestigious men get more overt attention, while dominant men get more covert attention.

Women seeking long term mates also consider a man’s ability to supply resources, and may settle for poorer genes to get more resources. Dominant men tend to have more resources than prestigious men, so such men are more likely to fill both roles, being long term mates for some women and short term mates for others. Men who can offer only prestige must accept worse long term mates, while men who can offer only resources must accept few short term mates. Those low in prestige, resources, or dominance must accept no mates. A man who had prestige, dominance, and resources would get the best short and long term mates – what men are these?

Stories are biased toward dramatic events, and so are biased toward events with risky men; it is harder to tell a good story about the attraction of a resource-rich man. So stories naturally encourage short term mating. Shouldn’t this make long-term mates wary of strong mate attraction to dramatic stories?

Woman want three things: someone to fight for them (the Warrior), someone to provide for them (the Tycoon) and someone to excite their emotions or entertain them (the Wizard).

In this context,

Dad= Tycoon
Cad= Wizard

To repeat:

Dom (Cocky) + Dad (Generous) + Cad (Exciting/Funny) = Laid

There is an old distinction between "proximate" and "ultimate" causes. Evolution is an ultimate cause, physiology (and psychology, here) is a proximate cause. The flower bends to follow the sun because it gathers more light that way, but the immediate mechanism of the bending involves hormones called auxins. I see a lot of speculation about, say, sexual cognitive dimorphism whose ultimate cause is evolutionary, but not so much speculation about the proximate cause - the "how" of the difference, rather than the "why". And here I think a visit to an older mode of explanation like Marsden's - one which is psychological rather than genetic - can sensitize us to the fact that the proximate causes of a behavioral tendency need not be a straightforward matter of being hardwired differently.

This leads to my second point, which is just that we should remember that human beings actually possess consciousness. This means not only that the proximate cause of a behavior may deeply involve subjectivity, self-awareness, and an existential situation. It also means that all of these propositions about what people do are susceptible to change once they have been spelled out and become part of the culture. It is rather like the stock market: once everyone knows (or believes) something, then that information provides no advantage, creating an incentive for novelty.

Finally, the consequences of new beliefs about the how and the why of human nature and human behavior. Right or wrong, theories already begin to have consequences once they are taken up and incorporated into subjectivity. We really need a new Foucault to take on this topic.

The Economics of Social Status: http://www.meltingasphalt.com/the-economics-of-social-status/
Prestige vs. dominance. Joseph Henrich (of WEIRD fame) distinguishes two types of status. Prestige is the kind of status we get from being an impressive human specimen (think Meryl Streep), and it's governed by our 'approach' instincts. Dominance, on the other hand, is … [more]
things  status  hanson  thinking  comparison  len:short  anthropology  farmers-and-foragers  phalanges  ratty  duty  power  humility  hypocrisy  hari-seldon  multi  sex  gender  signaling  🐝  tradeoffs  evopsych  insight  models  sexuality  gender-diff  chart  postrat  yvain  ssc  simler  critique  essay  debate  paying-rent  gedanken  empirical  operational  vague  info-dynamics  len:long  community  henrich  long-short-run  rhetoric  contrarianism  coordination  social-structure  hidden-motives  politics  2016-election  rationality  links  study  summary  list  hive-mind  speculation  coalitions  values  🤖  metabuch  envy  universalism-particularism  egalitarianism-hierarchy  s-factor  unintended-consequences  tribalism  group-selection  justice  inequality  competition  cultural-dynamics  peace-violence  ranking  machiavelli  authoritarianism  strategy  tactics  organizing  leadership  management  n-factor  duplication  thiel  volo-avolo  todo  technocracy  rent-seeking  incentives  econotariat  marginal-rev  civilization  rot  gibbon 
september 2016 by nhaliday
Are You Living in a Computer Simulation?
Bostrom's anthropic arguments

In sum, if your descendants might make simulations of lives like yours, then you might be living in a simulation. And while you probably cannot learn much detail about the specific reasons for and nature of the simulation you live in, you can draw general conclusions by making analogies to the types and reasons of simulations today. If you might be living in a simulation then all else equal it seems that you should care less about others, live more for today, make your world look likely to become eventually rich, expect to and try to participate in pivotal events, be entertaining and praiseworthy, and keep the famous people around you happy and interested in you.
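The core of Bostrom's argument can be compressed into a single fraction (notation roughly following his 2003 paper; a sketch, not a full statement of the trilemma):

  f_sim = f_P · N̄ / (f_P · N̄ + 1)

where f_P is the fraction of human-level civilizations that reach a posthuman stage and run ancestor-simulations, and N̄ is the average number of such simulations each runs. If f_P · N̄ is large, almost all observers with human-type experiences are simulated; hence the trilemma: either f_P is near zero, or N̄ is near zero, or we almost certainly live in a simulation.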

Theological Implications of the Simulation Argument: https://www.tandfonline.com/doi/pdf/10.1080/15665399.2010.10820012
Nick Bostrom’s Simulation Argument (SA) has many intriguing theological implications. We work out some of them here. We show how the SA can be used to develop novel versions of the Cosmological and Design Arguments. We then develop some of the affinities between Bostrom’s naturalistic theogony and more traditional theological topics. We look at the resurrection of the body and at theodicy. We conclude with some reflections on the relations between the SA and Neoplatonism (friendly) and between the SA and theism (less friendly).

lesswrong  philosophy  weird  idk  thinking  insight  links  summary  rationality  ratty  bostrom  sampling-bias  anthropic  theos  simulation  hanson  decision-making  advice  mystic  time-preference  futurism  letters  entertainment  multi  morality  humility  hypocrisy  wealth  malthus  power  drama  gedanken  pdf  article  essay  religion  christianity  the-classics  big-peeps  iteration-recursion  aesthetics  nietzschean  axioms  gwern  analysis  realness  von-neumann  space  expansionism  duplication  spreading  sequential  cs  computation  outcome-risk  measurement  empirical  questions  bits  information-theory  efficiency  algorithms  physics  relativity  ems  neuro  data  scale  magnitude  complexity  risk  existence  threat-modeling  civilization  forms-instances 
september 2016 by nhaliday
Why Information Grows – Paul Romer
thinking like a physicist:

The key element in thinking like a physicist is being willing to push simultaneously to extreme levels of abstraction and specificity. This sounds paradoxical until you see it in action. Then it seems obvious. Abstraction means that you strip away inessential detail. Specificity means that you take very seriously the things that remain.

Abstraction vs. Radical Specificity: https://paulromer.net/abstraction-vs-radical-specificity/
books  summary  review  economics  growth-econ  interdisciplinary  hmm  physics  thinking  feynman  tradeoffs  paul-romer  econotariat  🎩  🎓  scholar  aphorism  lens  signal-noise  cartoons  skeleton  s:**  giants  electromag  mutation  genetics  genomics  bits  nibble  stories  models  metameta  metabuch  problem-solving  composition-decomposition  structure  abstraction  zooming  examples  knowledge  human-capital  behavioral-econ  network-structure  info-econ  communication  learning  information-theory  applications  volo-avolo  map-territory  externalities  duplication  spreading  property-rights  lattice  multi  government  polisci  policy  counterfactual  insight  paradox  parallax  reduction  empirical  detail-architecture  methodology  crux  visual-understanding  theory-practice  matching  analytical-holistic  branches  complement-substitute  local-global  internet  technology  cost-benefit  investing  micro  signaling  limits  public-goodish  interpretation  elegance  meta:reading  intellectual-property  writing 
september 2016 by nhaliday
The Future of Genetic Enhancement is Not in the West | Quillette

If it becomes possible to safely genetically increase babies’ IQ, it will become inevitable: https://www.washingtonpost.com/news/volokh-conspiracy/wp/2015/07/14/if-it-becomes-possible-to-safely-genetically-increase-babies-iq-it-will-become-inevitable/

Baby Genome Sequencing for Sale in China: https://www.technologyreview.com/s/608086/baby-genome-sequencing-for-sale-in-china/
Chinese parents can now decode the genomes of their healthy newborns, revealing disease risks as well as the likelihood of physical traits like male-pattern baldness.

China launches massive genome research initiative: https://news.cgtn.com/news/7767544e34637a6333566d54/share_p.html

research ethics:
First results of CRISPR gene editing of normal embryos released: https://www.newscientist.com/article/2123973-first-results-of-crispr-gene-editing-of-normal-embryos-released/
caveats: https://ipscell.com/2017/08/4-reasons-mitalipov-paper-doesnt-herald-safe-crispr-human-genetic-modification/

So this title is a bit misleading; something like, "cells edited with CRISPR injected into a person for the first time" would be better. While CRISPR is promising for topical treatments, that's not what happened here.
China sprints ahead in CRISPR therapy race: http://science.sciencemag.org/content/358/6359/20
China, Unhampered by Rules, Races Ahead in Gene-Editing Trials: https://www.wsj.com/articles/china-unhampered-by-rules-races-ahead-in-gene-editing-trials-1516562360
U.S. scientists helped devise the Crispr biotechnology tool. First to test it in humans are Chinese doctors



lol: http://www.theonion.com/infographic/pros-and-cons-gene-editing-56740

Japan set to allow gene editing in human embryos [ed.: (for research)]: https://www.nature.com/articles/d41586-018-06847-7
Draft guidelines permit gene-editing tools for research into early human development.
futurism  prediction  enhancement  biotech  essay  china  asia  culture  poll  len:short  new-religion  accelerationism  letters  news  org:mag  org:popup  🌞  sinosphere  🔬  sanctity-degradation  morality  values  democracy  authoritarianism  genetics  CRISPR  scaling-up  orient  multi  org:lite  india  competition  speedometer  org:rec  right-wing  rhetoric  slippery-slope  iq  usa  incentives  technology  org:nat  org:sci  org:biz  trends  current-events  genomics  gnxp  scitariat  commentary  hsu  org:foreign  volo-avolo  regulation  coordination  cooperate-defect  moloch  popsci  announcement  politics  government  policy  science  ethics  :/  org:anglo  cancer  medicine  hn  tech  immune  sapiens  study  summary  bio  disease  critique  regularizer  accuracy  lol  comedy  hard-tech  skunkworks  twitter  social  backup  gnon  🐸  randy-ayndy  civil-liberty  FDA  duplication  left-wing  chart  abortion-contraception-embryo 
august 2016 by nhaliday
Rob Pike: Notes on Programming in C
Issues of typography
Sometimes they care too much: pretty printers mechanically produce pretty output that accentuates irrelevant detail in the program, which is as sensible as putting all the prepositions in English text in bold font. Although many people think programs should look like the Algol-68 report (and some systems even require you to edit programs in that style), a clear program is not made any clearer by such presentation, and a bad program is only made laughable.
Typographic conventions consistently held are important to clear presentation, of course - indentation is probably the best known and most useful example - but when the ink obscures the intent, typography has taken over.


Finally, I prefer minimum-length but maximum-information names, and then let the context fill in the rest. Globals, for instance, typically have little context when they are used, so their names need to be relatively evocative. Thus I say maxphysaddr (not MaximumPhysicalAddress) for a global variable, but np not NodePointer for a pointer locally defined and used. This is largely a matter of taste, but taste is relevant to clarity.


C is unusual in that it allows pointers to point to anything. Pointers are sharp tools, and like any such tool, used well they can be delightfully productive, but used badly they can do great damage (I sunk a wood chisel into my thumb a few days before writing this). Pointers have a bad reputation in academia, because they are considered too dangerous, dirty somehow. But I think they are powerful notation, which means they can help us express ourselves clearly.
Consider: When you have a pointer to an object, it is a name for exactly that object and no other.


A delicate matter, requiring taste and judgement. I tend to err on the side of eliminating comments, for several reasons. First, if the code is clear, and uses good type names and variable names, it should explain itself. Second, comments aren't checked by the compiler, so there is no guarantee they're right, especially after the code is modified. A misleading comment can be very confusing. Third, the issue of typography: comments clutter code.
But I do comment sometimes. Almost exclusively, I use them as an introduction to what follows.


Most programs are too complicated - that is, more complex than they need to be to solve their problems efficiently. Why? Mostly it's because of bad design, but I will skip that issue here because it's a big one. But programs are often complicated at the microscopic level, and that is something I can address here.
Rule 1. You can't tell where a program is going to spend its time. Bottlenecks occur in surprising places, so don't try to second guess and put in a speed hack until you've proven that's where the bottleneck is.

Rule 2. Measure. Don't tune for speed until you've measured, and even then don't unless one part of the code overwhelms the rest.

Rule 3. Fancy algorithms are slow when n is small, and n is usually small. Fancy algorithms have big constants. Until you know that n is frequently going to be big, don't get fancy. (Even if n does get big, use Rule 2 first.) For example, binary trees are always faster than splay trees for workaday problems.

Rule 4. Fancy algorithms are buggier than simple ones, and they're much harder to implement. Use simple algorithms as well as simple data structures.

The following data structures are a complete list for almost all practical programs:

linked list
hash table
binary tree
Of course, you must also be prepared to collect these into compound data structures. For instance, a symbol table might be implemented as a hash table containing linked lists of arrays of characters.
Rule 5. Data dominates. If you've chosen the right data structures and organized things well, the algorithms will almost always be self-evident. Data structures, not algorithms, are central to programming. (See The Mythical Man-Month: Essays on Software Engineering by F. P. Brooks, page 102.)

Rule 6. There is no Rule 6.
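Pike's compound-structure example — a symbol table built as a hash table containing linked lists — can be sketched in C. The names (`Sym`, `lookup`, `NHASH`) and the hash function are illustrative, not from the note itself:

```c
#include <stdlib.h>
#include <string.h>

enum { NHASH = 128 };            /* number of hash buckets */

typedef struct Sym Sym;
struct Sym {                     /* one symbol-table entry */
    char *name;
    int   value;
    Sym  *next;                  /* chain within a bucket */
};

static Sym *tab[NHASH];          /* the hash table itself */

static unsigned hash(const char *s)
{
    unsigned h = 0;
    while (*s)
        h = 31 * h + (unsigned char)*s++;
    return h % NHASH;
}

/* lookup finds name; if absent and insert is nonzero, it adds
   a new entry with the given value at the head of its bucket. */
Sym *lookup(const char *name, int insert, int value)
{
    Sym *sp;

    for (sp = tab[hash(name)]; sp != NULL; sp = sp->next)
        if (strcmp(name, sp->name) == 0)
            return sp;
    if (!insert)
        return NULL;
    sp = malloc(sizeof *sp);
    sp->name = malloc(strlen(name) + 1);
    strcpy(sp->name, name);
    sp->value = value;
    sp->next = tab[hash(name)];  /* push onto the bucket's list */
    tab[hash(name)] = sp;
    return sp;
}
```

Once the structure is chosen, the algorithm (chained lookup with head insertion) is, as Rule 5 predicts, essentially self-evident.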

Programming with data.
One of the reasons data-driven programs are not common, at least among beginners, is the tyranny of Pascal. Pascal, like its creator, believes firmly in the separation of code and data. It therefore (at least in its original form) has no ability to create initialized data. This flies in the face of the theories of Turing and von Neumann, which define the basic principles of the stored-program computer. Code and data are the same, or at least they can be. How else can you explain how a compiler works? (Functional languages have a similar problem with I/O.)
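C's initialized data makes the table-driven style Pike contrasts with Pascal natural: the program's structure lives in a compile-time table rather than a chain of conditionals. The keyword table below is an invented illustration, not Pike's code:

```c
#include <string.h>

/* Data-driven dispatch: behavior is encoded as initialized data,
   so adding a keyword means adding a table row, not new code. */
typedef struct {
    const char *name;
    int         code;
} Keyword;

static const Keyword keywords[] = {
    { "if",    1 },
    { "else",  2 },
    { "while", 3 },
};

int keyword_code(const char *s)
{
    size_t i;
    for (i = 0; i < sizeof keywords / sizeof keywords[0]; i++)
        if (strcmp(s, keywords[i].name) == 0)
            return keywords[i].code;
    return 0;   /* not a keyword */
}
```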

Function pointers
Another result of the tyranny of Pascal is that beginners don't use function pointers. (You can't have function-valued variables in Pascal.) Using function pointers to encode complexity has some interesting properties.
Some of the complexity is passed to the routine pointed to. The routine must obey some standard protocol - it's one of a set of routines invoked identically - but beyond that, what it does is its business alone. The complexity is distributed.

There is this idea of a protocol, in that all functions used similarly must behave similarly. This makes for easy documentation, testing, growth and even making the program run distributed over a network - the protocol can be encoded as remote procedure calls.

I argue that clear use of function pointers is the heart of object-oriented programming. Given a set of operations you want to perform on data, and a set of data types you want to respond to those operations, the easiest way to put the program together is with a group of function pointers for each type. This, in a nutshell, defines class and method. The O-O languages give you more of course - prettier syntax, derived types and so on - but conceptually they provide little extra.
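The claim that a group of function pointers per type amounts to class-and-method can be sketched as follows; the `Shape`/`Ops` names and the area operation are invented for illustration:

```c
/* A "method table": one protocol all participating types implement. */
typedef struct {
    int (*area)(int w, int h);
} Ops;

/* Two "classes" obeying the same protocol. */
static int rect_area(int w, int h)     { return w * h; }
static int triangle_area(int w, int h) { return w * h / 2; }

static const Ops rect_ops     = { rect_area };
static const Ops triangle_ops = { triangle_area };

/* An "object": data plus a pointer to its type's operations. */
typedef struct {
    int w, h;
    const Ops *ops;
} Shape;

/* Callers dispatch through the pointer, never on the concrete type;
   the complexity of each case is passed to the routine pointed to. */
int shape_area(const Shape *s)
{
    return s->ops->area(s->w, s->h);
}
```

Each routine obeys the shared protocol but its internals are its business alone, which is exactly the distribution of complexity the excerpt describes.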


Include files
Simple rule: include files should never include include files. If instead they state (in comments or implicitly) what files they need to have included first, the problem of deciding which files to include is pushed to the user (programmer) but in a way that's easy to handle and that, by construction, avoids multiple inclusions. Multiple inclusions are a bane of systems programming. It's not rare to have files included five or more times to compile a single C source file. The Unix /usr/include/sys stuff is terrible this way.
There's a little dance involving #ifdef's that can prevent a file being read twice, but it's usually done wrong in practice - the #ifdef's are in the file itself, not the file that includes it. The result is often thousands of needless lines of code passing through the lexical analyzer, which is (in good compilers) the most expensive phase.

Just follow the simple rule.
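The two guard placements Pike contrasts look like this (file names are hypothetical); in the first, the preprocessor still opens and scans the header on every inclusion, while in the second the includer's own test skips the file entirely:

```c
/* The usual dance Pike calls "done wrong": the guard lives
   inside the header itself. */

/* foo.h */
#ifndef FOO_H
#define FOO_H
/* ...declarations... */
#endif

/* The placement the note prefers: the test sits in the file
   doing the including, so foo.h is never reopened once read. */

/* bar.c */
#ifndef FOO_H
#include "foo.h"
#endif
```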

cf https://stackoverflow.com/questions/1101267/where-does-the-compiler-spend-most-of-its-time-during-parsing
First, I don't think it actually is true: in many compilers, most time is not spent in lexing source code. For example, in C++ compilers (e.g. g++), most time is spent in semantic analysis, in particular in overload resolution (trying to find out what implicit template instantiations to perform). Also, in C and C++, most time is often spent in optimization (creating graph representations of individual functions or the whole translation unit, and then running long algorithms on these graphs).

When comparing lexical and syntactical analysis, it may indeed be the case that lexical analysis is more expensive. This is because both use state machines, i.e. there is a fixed number of actions per element, but the number of elements is much larger in lexical analysis (characters) than in syntactical analysis (tokens).

programming  systems  philosophy  c(pp)  summer-2014  intricacy  engineering  rhetoric  contrarianism  diogenes  parsimony  worse-is-better/the-right-thing  data-structures  list  algorithms  stylized-facts  essay  ideas  performance  functional  state  pls  oop  gotchas  blowhards  duplication  compilers  syntax  lexical  checklists  metabuch  lens  notation  thinking  neurons  guide  pareto  heuristic  time  cost-benefit  multi  q-n-a  stackex  plt  hn  commentary  minimalism  techtariat  rsc  writing  technical-writing  cracker-prog  code-organizing  grokkability  protocol-metadata  direct-indirect  grokkability-clarity  latency-throughput 
august 2014 by nhaliday

