nhaliday + tradeoffs   147

Advantages and disadvantages of building a single page web application - Software Engineering Stack Exchange
Advantages
- All data has to be available via some sort of API - this is a big advantage for my use case as I want to have an API to my application anyway. Right now about 60-70% of my calls to get/update data are done through a REST API. Doing a single page application will allow me to better test my REST API since the application itself will use it. It also means that as the application grows, the API itself will grow since that is what the application uses; no need to maintain the API as an add-on to the application.
- More responsive application - since all data loaded after the initial page is kept to a minimum and transmitted in a compact format (like JSON), data requests should generally be faster, and the server will do slightly less processing.
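The API-first point above can be illustrated with a minimal sketch (Python stdlib only; the `/api/items` endpoint and the data are invented for illustration): the backend serves JSON, and the SPA, tests, and any other client all consume the same endpoint.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Toy in-memory "model"; endpoint name and data are hypothetical.
ITEMS = [{"id": 1, "name": "widget"}]

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/items":
            body = json.dumps(ITEMS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

# Serve on an ephemeral port and exercise the API the same way a
# single-page client (or an API test) would.
server = HTTPServer(("127.0.0.1", 0), ApiHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/api/items"
data = json.loads(urllib.request.urlopen(url).read())
server.shutdown()
```

Because the application itself exercises the endpoint, every page load doubles as an informal API test.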

Disadvantages
- Duplication of code - for example, model code. I am going to have to create models both on the server side (PHP in this case) and the client side in Javascript.
- Business logic in Javascript - I can't give any concrete examples on why this would be bad but it just doesn't feel right to me having business logic in Javascript that anyone can read.
- Javascript memory leaks - since the page never reloads, Javascript memory leaks can happen, and I would not even know where to begin to debug them.

--

Disadvantages I often see with Single Page Web Applications:
- Inability to link to a specific part of the site, there's often only 1 entry point.
- Dysfunctional back and forward buttons.
- The use of tabs is limited or non-existent.
(especially mobile:)
- Take very long to load.
- Don't function at all.
- Can't reload a page, a sudden loss of network takes you back to the start of the site.

This answer is outdated; most single-page application frameworks have a way to deal with the issues above – Luis May 27 '14 at 1:41
@Luis while the technology is there, too often it isn't used. – Pieter B Jun 12 '14 at 6:53

https://softwareengineering.stackexchange.com/questions/201838/building-a-web-application-that-is-almost-completely-rendered-by-javascript-whi

https://softwareengineering.stackexchange.com/questions/143194/what-advantages-are-conferred-by-using-server-side-page-rendering
Server-side HTML rendering:
- Fastest browser rendering
- Page caching is possible as a quick-and-dirty performance boost
- For "standard" apps, many UI features are pre-built
- Sometimes considered more stable because components are usually subject to compile-time validation
- Leans on backend expertise
- Sometimes faster to develop*
*When UI requirements fit the framework well.

Client-side HTML rendering:
- Lower bandwidth usage
- Slower initial page render. May not even be noticeable in modern desktop browsers. If you need to support IE6-7, or many mobile browsers (mobile webkit is not bad) you may encounter bottlenecks.
- Building API-first means the client can just as easily be a proprietary app, thin client, another web service, etc.
- Leans on JS expertise
- Sometimes faster to develop**
**When the UI is largely custom, with more interesting interactions. Also, I find coding in the browser with interpreted code noticeably speedier than waiting for compiles and server restarts.

https://softwareengineering.stackexchange.com/questions/237537/progressive-enhancement-vs-single-page-apps

https://stackoverflow.com/questions/21862054/single-page-application-advantages-and-disadvantages
=== ADVANTAGES ===
1. SPAs are extremely good for very responsive sites.
2. With an SPA we don't need extra queries to the server to download pages.
3. Are there any other advantages? I haven't heard of any others.

=== DISADVANTAGES ===
1. Client must enable javascript.
2. Only one entry point to the site.
3. Security.

https://softwareengineering.stackexchange.com/questions/287819/should-you-write-your-back-end-as-an-api
focused on .NET

https://softwareengineering.stackexchange.com/questions/337467/is-it-normal-design-to-completely-decouple-backend-and-frontend-web-applications
A SPA comes with a few issues associated with it. Here are just a few that pop into my mind now:
- it's mostly JavaScript. One error in one section of your application might prevent other sections from working because of that JavaScript error.
- CORS.
- SEO.
- separate front-end application means separate projects, deployment pipelines, extra tooling, etc;
- security is harder to do when all the code is on the client;

- completely interact in the front-end with the user and only load data as needed from the server. So better responsiveness and user experience;
- depending on the application, some processing done on the client means you spare the server of those computations.
- have a better flexibility in evolving the back-end and front-end (you can do it separately);
- if your back-end is essentially an API, you can have other clients in front of it like native Android/iPhone applications;
- the separation might make it easier for front-end developers to do CSS/HTML without needing to have a server application running on their machine.

Create your own dysfunctional single-page app: https://news.ycombinator.com/item?id=18341993
I think there are three broadly assumed user benefits of single-page apps:
1. Improved user experience.
2. Improved perceived performance.
3. It’s still the web.

5 mistakes to create a dysfunctional single-page app
Mistake 1: Under-estimate long-term development and maintenance costs
Mistake 2: Use the single-page app approach unilaterally
Mistake 3: Under-invest in front end capability
Mistake 4: Use naïve dev practices
Mistake 5: Surf the waves of framework hype

The disadvantages of single page applications: https://news.ycombinator.com/item?id=9879685
You probably don't need a single-page app: https://news.ycombinator.com/item?id=19184496
https://news.ycombinator.com/item?id=20384738
MPA advantages:
- Stateless requests
- The browser knows how to deal with a traditional architecture
- Fewer, more mature tools
- SEO for free

When to go for the single page app:
- Core functionality is real-time (e.g Slack)
- Rich UI interactions are core to the product (e.g Trello)
- Lots of state shared between screens (e.g. Spotify)

Hybrid solutions
...
Github uses this hybrid approach.
...

Ask HN: Is it ok to use traditional server-side rendering these days?: https://news.ycombinator.com/item?id=13212465
q-n-a  stackex  programming  engineering  tradeoffs  system-design  design  web  frontend  javascript  cost-benefit  analysis  security  state  performance  traces  measurement  intricacy  code-organizing  applicability-prereqs  multi  comparison  smoothness  shift  critique  techtariat  chart  ui  coupling-cohesion  interface-compatibility  hn  commentary  best-practices  discussion  trends  client-server  api  composition-decomposition  cycles  frameworks  ecosystem  degrees-of-freedom  dotnet 
22 days ago by nhaliday
Sci-Hub | The Moral Machine experiment. Nature | 10.1038/s41586-018-0637-6
Preference for inaction
Sparing pedestrians
Sparing the lawful
Sparing females
Sparing the fit
Sparing higher status
Sparing more characters
Sparing the young
Sparing humans

We selected the 130 countries with at least 100 respondents (n range 101–448,125), standardized the nine target AMCEs of each country, and conducted a hierarchical clustering on these nine scores, using Euclidean distance and Ward’s minimum variance method [20]. This analysis identified three distinct ‘moral clusters’ of countries. These are shown in Fig. 3a, and are broadly consistent with both geographical and cultural proximity according to the Inglehart–Welzel Cultural Map 2010–2014 [21].
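The paper's pipeline (z-score the nine AMCEs per country, then cluster with Euclidean distance and Ward's method, e.g. via scipy.cluster.hierarchy) can be sketched at the standardization-and-distance step with stdlib Python alone. The country names and scores below are invented placeholders, not data from the study:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical AMCE scores for three countries on three (of nine)
# dimensions; A and B are constructed to be similar, C to differ.
scores = {
    "A": [0.30, 0.10, 0.50],
    "B": [0.28, 0.12, 0.48],
    "C": [0.05, 0.40, 0.10],
}

# Standardize each dimension across countries (z-scores), as in the paper.
cols = list(zip(*scores.values()))
z = {
    country: [(v - mean(col)) / stdev(col) for v, col in zip(vals, cols)]
    for country, vals in scores.items()
}

def euclidean(p, q):
    return sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Clustering then merges the closest countries first; here A and B
# should sit far closer to each other than either does to C.
d_ab = euclidean(z["A"], z["B"])
d_ac = euclidean(z["A"], z["C"])
```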

The first cluster (which we label the Western cluster) contains North America as well as many European countries of Protestant, Catholic, and Orthodox Christian cultural groups. The internal structure within this cluster also exhibits notable face validity, with a sub-cluster containing Scandinavian countries, and a sub-cluster containing Commonwealth countries.

The second cluster (which we call the Eastern cluster) contains many far eastern countries such as Japan and Taiwan that belong to the Confucianist cultural group, and Islamic countries such as Indonesia, Pakistan and Saudi Arabia.

The third cluster (a broadly Southern cluster) consists of the Latin American countries of Central and South America, in addition to some countries that are characterized in part by French influence (for example, metropolitan France, French overseas territories, and territories that were at some point under French leadership). Latin American countries are cleanly separated in their own sub-cluster within the Southern cluster.

...

Fig. 3 | Country-level clusters.

[ed.: I actually rather like how the values the West has compare w/ the global mean in this plot.]

...
Participants from individualistic cultures, which emphasize the distinctive value of each individual [23], show a stronger preference for sparing the greater number of characters (Fig. 4a). Furthermore, participants from collectivistic cultures, which emphasize the respect that is due to older members of the community [23], show a weaker preference for sparing younger characters (Fig. 4a, inset).
pdf  study  org:nat  psychology  social-psych  poll  values  data  experiment  empirical  morality  ethics  pop-diff  cultural-dynamics  tradeoffs  death  safety  ai  automation  things  world  gender  biases  status  class  egalitarianism-hierarchy  order-disorder  anarcho-tyranny  crime  age-generation  quantitative-qualitative  number  nature  piracy  exploratory  phalanges  n-factor  europe  the-great-west-whale  nordic  usa  anglo  anglosphere  sinosphere  asia  japan  china  islam  MENA  latin-america  gallic  wonkish  correlation  measure  similarity  dignity  universalism-particularism  law  leviathan  wealth  econ-metrics  institutions  demographics  religion  group-level  within-group  expression-survival  comparison  technocracy  visualization  trees  developing-world  regional-scatter-plots 
5 weeks ago by nhaliday
Two Performance Aesthetics: Never Miss a Frame and Do Almost Nothing - Tristan Hume
I’ve noticed when I think about performance nowadays that I think in terms of two different aesthetics. One aesthetic, which I’ll call Never Miss a Frame, comes from the world of game development and is focused on writing code that has good worst-case performance by making good use of the hardware. The other aesthetic, which I’ll call Do Almost Nothing, comes from a more academic world and is focused on algorithmically minimizing the work that needs to be done to the extent that there’s barely any work left, paying attention to the performance at all scales.

[ed.: Neither of these exactly matches TCS performance PoV but latter is closer (the focus on diffs is kinda weird).]

...

Never Miss a Frame

In game development the most important performance criterion is that your game doesn’t miss frame deadlines. You have a target frame rate, and if you miss the deadline for the screen to draw a new frame your users will notice the jank. This leads to focusing on the worst-case scenario and often having fixed maximum limits for various quantities. This property can also be important in areas other than game development, like other graphical applications, real-time audio, safety-critical systems and many embedded systems. A similar dynamic occurs in distributed systems where one server needs to query 100 others and combine the results: you’ll wait for the slowest of the 100 every time, so speeding up some of them doesn’t make the query faster, and queries occasionally taking longer (e.g. because of garbage collection) will impact almost every request!
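The fan-out arithmetic behind that last point is easy to check. Assuming, purely for illustration, that each of n servers independently has probability p of a slow response (say a GC pause), one straggler is enough to slow the whole query:

```python
# Probability that a fan-out query over n servers hits at least one
# slow server, given each is independently slow with probability p.
# The query waits for the slowest response, so any straggler counts.
def p_query_slow(n: int, p: float) -> float:
    return 1 - (1 - p) ** n

single = p_query_slow(1, 0.01)    # 1% of requests slow on one server
fanout = p_query_slow(100, 0.01)  # roughly 63% slow across a 100-way fan-out
```

So a 1-in-100 hiccup per server becomes the common case for the combined query, which is why worst-case (tail) latency dominates in these systems.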

...

In this kind of domain you’ll often run into situations where in the worst case you can’t avoid processing a huge number of things. This means you need to focus your effort on making the best use of the hardware by writing code at a low level and paying attention to properties like cache size and memory bandwidth.

Projects with inviolable deadlines need to adjust different factors than speed if the code runs too slow. For example a game might decrease the size of a level or use a more efficient but less pretty rendering technique.

Aesthetically: Data should be tightly packed, fixed size, and linear. Transcoding data to and from different formats is wasteful. Strings and their variable lengths and inefficient operations must be avoided. Only use tools that allow you to work at a low level, even if they’re annoying, because that’s the only way you can avoid piles of fixed costs making everything slow. Understand the machine and what your code does to it.

Personally I identify this aesthetic most with Jonathan Blow. He has a very strong personality and I’ve watched enough of videos of him that I find imagining “What would Jonathan Blow say?” as a good way to tap into this aesthetic. My favourite articles about designs following this aesthetic are on the Our Machinery Blog.

...

Do Almost Nothing

Sometimes, it’s important to be as fast as you can in all cases and not just orient around one deadline. The most common case is when you simply have to do something that’s going to take an amount of time noticeable to a human, and if you can make that time shorter in some situations that’s great. Alternatively each operation could be fast but you may run a server that runs tons of them and you’ll save on server costs if you can decrease the load of some requests. Another important case is when you care about power use, for example your text editor not rapidly draining a laptop’s battery, in this case you want to do the least work you possibly can.

A key technique for this approach is to never recompute something from scratch when it’s possible to re-use or patch an old result. This often involves caching: keeping a store of recent results in case the same computation is requested again.
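A minimal sketch of that caching idea in Python, where `render` stands in for any expensive computation (the function and its name are invented for illustration):

```python
from functools import lru_cache

calls = 0  # count how many times the expensive body actually runs

@lru_cache(maxsize=128)
def render(doc_id: int) -> str:
    global calls
    calls += 1
    return f"rendered:{doc_id}"  # imagine an expensive computation here

render(1)
render(1)  # repeat request: served from the cache, body does not run
render(2)
```

After the three calls above, the expensive body has run only twice; the repeated request did almost no work, which is the whole aesthetic in miniature.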

The ultimate realization of this aesthetic is for the entire system to deal only in differences between the new state and the previous state, updating data structures with only the newly needed data and discarding data that’s no longer needed. This way each part of the system does almost no work because ideally the difference from the previous state is very small.

Aesthetically: Data must be in whatever structure scales best for the way it is accessed, lots of trees and hash maps. Computations are graphs of inputs and results so we can use all our favourite graph algorithms to optimize them! Designing optimal systems is hard so you should use whatever tools you can to make it easier, any fixed cost they incur will be made negligible when you optimize away all the work they need to do.

Personally I identify this aesthetic most with my friend Raph Levien and his articles about the design of the Xi text editor, although Raph also appreciates the other aesthetic and taps into it himself sometimes.

...

_I’m conflating the axes of deadline-oriented vs time-oriented and low-level vs algorithmic optimization, but part of my point is that while they are different, I think these axes are highly correlated._

...

Text Editors

Sublime Text is a text editor that mostly follows the Never Miss a Frame approach. ...

The Xi Editor is designed to solve this problem by being designed from the ground up to grapple with the fact that some operations, especially those interacting with slow compilers written by other people, can’t be made instantaneous. It does this using a fancy asynchronous plugin model and lots of fancy data structures.
...

...

Compilers

Jonathan Blow’s Jai compiler is clearly designed with the Never Miss a Frame aesthetic. It’s written to be extremely fast at every level, and the language doesn’t have any features that necessarily lead to slow compiles. The LLVM backend wasn’t fast enough to hit his performance goals, so he wrote an alternative backend that directly writes x86 code to a buffer without doing any optimizations. Jai compiles something like 100,000 lines of code per second. Designing both the language and compiler to avoid doing anything slow led to clean-build performance 10-100x faster than other commonly-used compilers. Jai is so fast that its clean builds are faster than most compilers’ incremental builds on common project sizes, due to limitations in how incremental the other compilers are.

However, Jai’s compiler is still O(n) in the codebase size, whereas incremental compilers can be O(n) in the size of the change. Some compilers, like the work-in-progress rust-analyzer and (I think) also Roslyn for C#, take a different approach and focus incredibly hard on making everything fully incremental. For small changes (the common case) this can let them beat Jai and respond in milliseconds on arbitrarily large projects, even if they’re slower on clean builds.
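The O(codebase) vs O(change) distinction can be sketched with a toy model of incremental rebuilds: cache each unit's output keyed by its source, and recompile only units whose source changed. Everything here (unit names, the "compile" step) is a stand-in, not how any real compiler works:

```python
cache = {}     # unit name -> (source, compiled object)
compiles = 0   # how many units actually got (re)compiled

def compile_unit(unit: str, source: str) -> str:
    global compiles
    cached = cache.get(unit)
    if cached and cached[0] == source:
        return cached[1]            # unchanged unit: O(1), no work
    compiles += 1
    obj = f"obj({source})"          # pretend this step is expensive
    cache[unit] = (source, obj)
    return obj

def build(sources: dict) -> list:
    return [compile_unit(u, s) for u, s in sources.items()]

clean = build({"a": "fn a v1", "b": "fn b v1"})  # clean build: 2 compiles
incr = build({"a": "fn a v2", "b": "fn b v1"})   # edit "a": 1 more compile
```

A clean build still touches every unit (Jai's regime), while the incremental build does work proportional only to the edit (the rust-analyzer regime).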

Conclusion
I find both of these aesthetics appealing, but I also think there’s real trade-offs that incentivize leaning one way or the other for a given project. I think people having different performance aesthetics, often because one aesthetic really is better suited for their domain, is the source of a lot of online arguments about making fast systems. The different aesthetics also require different bases of knowledge to pursue, like knowledge of data-oriented programming in C++ vs knowledge of abstractions for incrementality like Adapton, so different people may find that one approach seems way easier and better for them than the other.

I try to choose how to dedicate my effort to pursuing each aesthetic on a per-project basis by trying to predict how effort in each direction would help. For some projects I know that if I code it efficiently it will always hit the performance deadline; for others I know a way to drastically cut down on work by investing time in algorithmic design; some projects need a mix of both. Personally I find it helpful to think of different programmers where I have a good sense of their aesthetic and ask myself how they’d solve the problem. One reason I like Rust is that it can do both low-level optimization and also has a good ecosystem and type system for algorithmic optimization, so I can more easily mix approaches in one project. In the end the best approach to follow depends not only on the task, but your skills or the skills of the team working on it, as well as how much time you have to work towards an ambitious design that may take longer for a better result.
techtariat  reflection  things  comparison  lens  programming  engineering  cracker-prog  carmack  games  performance  big-picture  system-design  constraint-satisfaction  metrics  telos-atelos  distributed  incentives  concurrency  cost-benefit  tradeoffs  systems  metal-to-virtual  latency-throughput  abstraction  marginal  caching  editors  strings  ideas  ui  common-case  examples  applications  flux-stasis  nitty-gritty  ends-means  thinking  summary  correlation  degrees-of-freedom  c(pp)  rust  interface  integration-extension  aesthetics  interface-compatibility  efficiency  adversarial 
10 weeks ago by nhaliday
python - Why do some languages like C++ and Java have a built-in LinkedList datastructure? - Stack Overflow
I searched through Guido's Python History blog, because I was sure he'd written about this, but apparently that's not where he did so. So, this is based on a combination of reasoning (aka educated guessing) and memory (possibly faulty).

Let's start from the end: Without knowing why Guido didn't add linked lists in Python 0.x, do we at least know why the core devs haven't added them since then, even though they've added a bunch of other types from OrderedDict to set?

Yes, we do. The short version is: Nobody has asked for it, in over two decades. Almost all of what's been added to builtins or the standard library over the years has been (a variation on) something that's proven to be useful and popular on PyPI or the ActiveState recipes. That's where OrderedDict and defaultdict came from, for example, and enum and dataclass (based on attrs). There are popular libraries for a few other container types—various permutations of sorted dict/set, OrderedSet, trees and tries, etc., and both SortedContainers and blist have been proposed, but rejected, for inclusion in the stdlib.

But there are no popular linked list libraries, and that's why they're never going to be added.

So, that brings the question back a step: Why are there no popular linked list libraries?
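One plausible part of the answer, sketched below: the stdlib's collections.deque already provides O(1) pushes and pops at both ends, which is the classic reason to reach for a linked list, so a hand-rolled node chain buys little in practice:

```python
from collections import deque

# A bare-bones singly linked list, the kind a library would offer.
class Node:
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

def to_list(head):
    out = []
    while head is not None:
        out.append(head.value)
        head = head.next
    return out

linked = Node(1, Node(2, Node(3)))

# deque covers the same headline operations without custom node code.
d = deque([1, 2, 3])
d.appendleft(0)   # O(1) at the left end, the usual linked-list selling point
d.append(4)       # O(1) at the right end
```

With deque (and plain list) in the stdlib, the niche left for a dedicated linked-list package is small, which is consistent with none ever becoming popular.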
q-n-a  stackex  impetus  roots  programming  pls  python  tradeoffs  cost-benefit  design  data-structures 
10 weeks ago by nhaliday
Karol Kuczmarski's Blog – A Haskell retrospective
Even in this hypothetical scenario, I posit that the value proposition of Haskell would still be a tough sell.

There is this old quote from Bjarne Stroustrup (creator of C++) where he says that programming languages divide into those everyone complains about, and those that no one uses.
The first group consists of old, established technologies that managed to accrue significant complexity debt through years and decades of evolution. All the while, they’ve been adapting to the constantly shifting perspectives on what are the best industry practices. Traces of those adaptations can still be found today, sticking out like a leftover appendix or residual tail bone — or like the built-in support for XML in Java.

Languages that “no one uses”, on the other hand, haven’t yet passed the industry threshold of sufficient maturity and stability. Their ecosystems are still cutting edge, and their future is uncertain, but they sometimes champion some really compelling paradigm shifts. As long as you can bear with things that are rough around the edges, you can take advantage of their novel ideas.

Unfortunately for Haskell, it manages to combine the worst parts of both of these worlds.

On one hand, it is a surprisingly old language, clocking more than two decades of fruitful research around many innovative concepts. Yet on the other hand, it bears the signs of a fresh new technology, with relatively few production-grade libraries, scarce coverage of some domains (e.g. GUI programming), and not too many stories of commercial successes.

There are many ways to do it
String theory
Errors and how to handle them
Implicit is better than explicit
Leaky modules
Namespaces are apparently a bad idea
Wild records
Purity beats practicality
techtariat  reflection  functional  haskell  programming  pls  realness  facebook  pragmatic  cost-benefit  legacy  libraries  types  intricacy  engineering  tradeoffs  frontier  homo-hetero  duplication  strings  composition-decomposition  nitty-gritty  error  error-handling  coupling-cohesion  critique  ecosystem  c(pp)  aphorism 
august 2019 by nhaliday
Foreign-Born Teaching Assistants and the Academic Performance of Undergraduates
The data suggest that foreign-born Teaching Assistants have an adverse impact on the class performance of undergraduate students.
study  economics  education  higher-ed  borjas  migration  labor  cost-benefit  tradeoffs  branches  language  foreign-lang  grad-school  teaching  attaq  wonkish  lol 
july 2019 by nhaliday
c++ - mmap() vs. reading blocks - Stack Overflow
The discussion of mmap/read reminds me of two other performance discussions:

Some Java programmers were shocked to discover that nonblocking I/O is often slower than blocking I/O, which made perfect sense if you know that nonblocking I/O requires making more syscalls.

Some other network programmers were shocked to learn that epoll is often slower than poll, which makes perfect sense if you know that managing epoll requires making more syscalls.

Conclusion: Use memory maps if you access data randomly, keep it around for a long time, or if you know you can share it with other processes (MAP_SHARED isn't very interesting if there is no actual sharing). Read files normally if you access data sequentially or discard it after reading. And if either method makes your program less complex, do that. For many real world cases there's no sure way to show one is faster without testing your actual application and NOT a benchmark.
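The two access styles from that conclusion can be sketched in Python's stdlib (the file and offsets are just test data): plain read() for sequential consumption, mmap for random access without seek/read pairs.

```python
import mmap
import os
import tempfile

# Create a throwaway 10,000-byte file of repeating digits.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"0123456789" * 1000)

# Sequential style: ordinary read() is simple and benefits from OS readahead.
with open(path, "rb") as f:
    first = f.read(10)

# Random-access style: mmap exposes the file as indexable memory, so you
# can jump to an arbitrary offset without issuing seek()/read() calls.
with open(path, "rb") as f, \
        mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
    tail = m[9995:10000]

os.remove(path)
```

As the quoted answer says, which is actually faster depends on the real access pattern, so measure the application, not a microbenchmark.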
q-n-a  stackex  programming  systems  performance  tradeoffs  objektbuch  stylized-facts  input-output  caching  computer-memory  sequential  applicability-prereqs 
july 2019 by nhaliday
Laurence Tratt: What Challenges and Trade-Offs do Optimising Compilers Face?
Summary
It's important to be realistic: most people don't care about program performance most of the time. Modern computers are so fast that most programs run fast enough even with very slow language implementations. In that sense, I agree with Daniel's premise: optimising compilers are often unimportant. But “often” is often unsatisfying, as it is here. Users find themselves transitioning from not caring at all about performance to suddenly really caring, often in the space of a single day.

This, to me, is where optimising compilers come into their own: they mean that even fewer people need care about program performance. And I don't mean that they get us from, say, 98 to 99 people out of 100 not needing to care: it's probably more like going from 80 to 99 people out of 100 not needing to care. This is, I suspect, more significant than it seems: it means that many people can go through an entire career without worrying about performance. Martin Berger reminded me of A N Whitehead’s wonderful line that “civilization advances by extending the number of important operations which we can perform without thinking about them” and this seems a classic example of that at work. Even better, optimising compilers are widely tested and thus generally much more reliable than the equivalent optimisations performed manually.

But I think that those of us who work on optimising compilers need to be honest with ourselves, and with users, about what performance improvement one can expect to see on a typical program. We have a tendency to pick the maximum possible improvement and talk about it as if it's the mean, when there's often a huge difference between the two. There are many good reasons for that gap, and I hope in this blog post I've at least made you think about some of the challenges and trade-offs that optimising compilers are subject to.

[1]
Most readers will be familiar with Knuth’s quip that “premature optimisation is the root of all evil.” However, I doubt that any of us have any real idea what proportion of time is spent in the average part of the average program. In such cases, I tend to assume that Pareto’s principle won't be too far wrong (i.e. that 80% of execution time is spent in 20% of code). In 1971 a study by Knuth and others of Fortran programs found that 50% of execution time was spent in 4% of code. I don't know of modern equivalents of this study, and for them to be truly useful, they'd have to be rather big. If anyone knows of something along these lines, please let me know!
techtariat  programming  compilers  performance  tradeoffs  cost-benefit  engineering  yak-shaving  pareto  plt  c(pp)  rust  golang  trivia  data  objektbuch  street-fighting  estimate  distribution  pro-rata 
july 2019 by nhaliday
The Existential Risk of Math Errors - Gwern.net
How big is this upper bound? Mathematicians have often made errors in proofs. But it’s rarer for ideas to be accepted for a long time and then rejected. But we can divide errors into 2 basic cases corresponding to type I and type II errors:

1. Mistakes where the theorem is still true, but the proof was incorrect (type I)
2. Mistakes where the theorem was false, and the proof was also necessarily incorrect (type II)

Before someone comes up with a final answer, a mathematician may have many levels of intuition in formulating & working on the problem, but we’ll consider the final end-product where the mathematician feels satisfied that he has solved it. Case 1 is perhaps the most common case, with innumerable examples; this is sometimes due to mistakes in the proof that anyone would accept is a mistake, but many of these cases are due to changing standards of proof. For example, when David Hilbert discovered errors in Euclid’s proofs which no one noticed before, the theorems were still true, and the gaps more due to Hilbert being a modern mathematician thinking in terms of formal systems (which of course Euclid did not think in). (David Hilbert himself turns out to be a useful example of the other kind of error: his famous list of 23 problems was accompanied by definite opinions on the outcome of each problem and sometimes timings, several of which were wrong or questionable [5].) Similarly, early calculus used ‘infinitesimals’ which were sometimes treated as being 0 and sometimes treated as an indefinitely small non-zero number; this was incoherent and strictly speaking, practically all of the calculus results were wrong because they relied on an incoherent concept - but of course the results were some of the greatest mathematical work ever conducted [6] and when later mathematicians put calculus on a more rigorous footing, they immediately re-derived those results (sometimes with important qualifications), and doubtless as modern math evolves other fields have sometimes needed to go back and clean up the foundations and will in the future [7].

...

Isaac Newton, incidentally, gave two proofs of the same solution to a problem in probability, one via enumeration and the other more abstract; the enumeration was correct, but the other proof totally wrong and this was not noticed for a long time, leading Stigler to remark:

...

TYPE I > TYPE II?
“Lefschetz was a purely intuitive mathematician. It was said of him that he had never given a completely correct proof, but had never made a wrong guess either.”
- Gian-Carlo Rota [13]

Case 2 is disturbing, since it is a case in which we wind up with false beliefs and also false beliefs about our beliefs (we no longer know that we don’t know). Case 2 could lead to extinction.

...

Except, errors do not seem to be evenly & randomly distributed between case 1 and case 2. There seem to be far more case 1s than case 2s, as already mentioned in the early calculus example: far more than 50% of the early calculus results were correct when checked more rigorously. Richard Hamming attributes to Ralph Boas a comment that, while editing Mathematical Reviews, “of the new results in the papers reviewed most are true but the corresponding proofs are perhaps half the time plain wrong”.

...

Gian-Carlo Rota gives us an example with Hilbert:

...

Olga labored for three years; it turned out that all mistakes could be corrected without any major changes in the statement of the theorems. There was one exception, a paper Hilbert wrote in his old age, which could not be fixed; it was a purported proof of the continuum hypothesis, you will find it in a volume of the Mathematische Annalen of the early thirties.

...

Leslie Lamport advocates for machine-checked proofs and a more rigorous style of proofs similar to natural deduction, noting a mathematician acquaintance guesses at a broad error rate of 1/3 [29] and that he routinely found mistakes in his own proofs and, worse, believed false conjectures [30].

[more on these "structured proofs":
https://academia.stackexchange.com/questions/52435/does-anyone-actually-publish-structured-proofs
https://mathoverflow.net/questions/35727/community-experiences-writing-lamports-structured-proofs
]

We can probably add software to that list: early software engineering work found that, dismayingly, bug rates seem to be simply a function of lines of code, and one would expect diseconomies of scale. So one would expect that in going from the ~4,000 lines of code of the Microsoft DOS operating system kernel to the ~50,000,000 lines of code in Windows Server 2003 (with full systems of applications and libraries being even larger: the comprehensive Debian repository in 2007 contained ~323,551,126 lines of code) that the number of active bugs at any time would be… fairly large. Mathematical software is hopefully better, but practitioners still run into issues (e.g. Durán et al 2014, Fonseca et al 2017) and I don’t know of any research pinning down how buggy key mathematical systems like Mathematica are or how much published mathematics may be erroneous due to bugs. This general problem led to predictions of doom and spurred much research into automated proof-checking, static analysis, and functional languages [31].

[related:
https://mathoverflow.net/questions/11517/computer-algebra-errors
I don't know any interesting bugs in symbolic algebra packages but I know a true, enlightening and entertaining story about something that looked like a bug but wasn't.

Define sinc(x) = (sin x)/x.

Someone found the following result in an algebra package: ∫₀^∞ sinc(x) dx = π/2
They then found the following results:

...

So of course when they got:

∫₀^∞ sinc(x) sinc(x/3) sinc(x/5) ⋯ sinc(x/15) dx = (467807924713440738696537864469 / 935615849440640907310521750000) π
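The coefficient really is just barely below 1/2, so this Borwein-style integral falls just short of π/2, which is why it looked like a bug. A quick exact check with Python's fractions module (a sketch, not from the original thread):

```python
from fractions import Fraction
import math

coeff = Fraction(467807924713440738696537864469,
                 935615849440640907310521750000)

assert coeff < Fraction(1, 2)       # strictly less than 1/2, not a bug
deficit = Fraction(1, 2) - coeff    # exact shortfall of the coefficient

print(float(deficit))               # on the order of 7e-12
print(float(deficit) * math.pi)     # the integral misses pi/2 by ~2e-11
```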

hmm:
Which means that nobody knows Fourier analysis nowadays. Very sad and discouraging story... – fedja Jan 29 '10 at 18:47

--

Because the most popular systems are all commercial, they tend to guard their bug database rather closely -- making them public would seriously cut their sales. For example, for the open source project Sage (which is quite young), you can get a list of all the known bugs from this page. 1582 known issues on Feb.16th 2010 (which includes feature requests, problems with documentation, etc).

That is an order of magnitude less than the commercial systems. And it's not because it is better, it is because it is younger and smaller. It might be better, but until SAGE does a lot of analysis (about 40% of CAS bugs are there) and a fancy user interface (another 40%), it is too hard to compare.

I once ran a graduate course whose core topic was studying the fundamental disconnect between the algebraic nature of CAS and the analytic nature of what it is mostly used for. There are issues of logic -- CASes work more or less in an intensional logic, while most of analysis is stated in a purely extensional fashion. There is no well-defined 'denotational semantics' for expressions-as-functions, which strongly contributes to the deeper bugs in CASes.]

...

Should such widely-believed conjectures as P≠NP or the Riemann hypothesis turn out to be false, then because they are assumed by so many existing proofs, a far larger math holocaust would ensue - and our previous estimates of error rates will turn out to have been substantial underestimates. But it may be a cloud with a silver lining, if it doesn’t come at a time of danger.

https://mathoverflow.net/questions/338607/why-doesnt-mathematics-collapse-down-even-though-humans-quite-often-make-mista

more on formal methods in programming:
https://www.quantamagazine.org/formal-verification-creates-hacker-proof-code-20160920/
https://intelligence.org/2014/03/02/bob-constable/

https://softwareengineering.stackexchange.com/questions/375342/what-are-the-barriers-that-prevent-widespread-adoption-of-formal-methods
Update: measured effort
In the October 2018 issue of Communications of the ACM there is an interesting article about Formally verified software in the real world with some estimates of the effort.

Interestingly (based on OS development for military equipment), it seems that producing formally proved software requires 3.3 times more effort than with traditional engineering techniques. So it's really costly.

On the other hand, it requires 2.3 times less effort to get high security software this way than with traditionally engineered software if you add the effort to make such software certified at a high security level (EAL 7). So if you have high reliability or security requirements there is definitely a business case for going formal.
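Plugging the article's two multipliers into a toy comparison (the baseline effort E is arbitrary; only the 3.3x and 2.3x ratios come from the article):

```python
E = 1.0                                # effort of traditional, uncertified development
formal = 3.3 * E                       # formally verified software: 3.3x traditional
certified_traditional = 2.3 * formal   # traditional dev + EAL7 certification
                                       # costs 2.3x more than going formal

print(f"formal: {formal:.2f}E, traditional+EAL7: {certified_traditional:.2f}E")
```

So formal methods cost roughly 3.3E up front, but hitting EAL 7 the traditional way implies roughly 7.6E, which is the business case the answer describes.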

WHY DON'T PEOPLE USE FORMAL METHODS?: https://www.hillelwayne.com/post/why-dont-people-use-formal-methods/
You can see examples of how all of these look at Let’s Prove Leftpad. HOL4 and Isabelle are good examples of “independent theorem” specs, SPARK and Dafny have “embedded assertion” specs, and Coq and Agda have “dependent type” specs.

If you squint a bit it looks like these three forms of code spec map to the three main domains of automated correctness checking: tests, contracts, and types. This is not a coincidence. Correctness is a spectrum, and formal verification is one extreme of that spectrum. As we reduce the rigour (and effort) of our verification we get simpler and narrower checks, whether that means limiting the explored state space, using weaker types, or pushing verification to the runtime. Any means of total specification then becomes a means of partial specification, and vice versa: many consider Cleanroom a formal verification technique, which primarily works by pushing code review far beyond what’s humanly possible.

...

The question, then: “is 90/95/99% correct significantly cheaper than 100% correct?” The answer is very yes. We all are comfortable saying that a codebase we’ve well-tested and well-typed is mostly correct modulo a few fixes in prod, and we’re even writing more than four lines of code a day. In fact, the vast… [more]
ratty  gwern  analysis  essay  realness  truth  correctness  reason  philosophy  math  proofs  formal-methods  cs  programming  engineering  worse-is-better/the-right-thing  intuition  giants  old-anglo  error  street-fighting  heuristic  zooming  risk  threat-modeling  software  lens  logic  inference  physics  differential  geometry  estimate  distribution  robust  speculation  nonlinearity  cost-benefit  convexity-curvature  measure  scale  trivia  cocktail  history  early-modern  europe  math.CA  rigor  news  org:mag  org:sci  miri-cfar  pdf  thesis  comparison  examples  org:junk  q-n-a  stackex  pragmatic  tradeoffs  cracker-prog  techtariat  invariance  DSL  chart  ecosystem  grokkability  heavyweights  CAS  static-dynamic  lower-bounds  complexity  tcs  open-problems  big-surf  ideas  certificates-recognition  proof-systems  PCP  mediterranean  SDP  meta:prediction  epistemic  questions  guessing  distributed  overflow  nibble  soft-question  track-record  big-list  hmm  frontier  state-of-art  move-fast-(and-break-things)  grokkability-clarity  technical-writing  trust 
july 2019 by nhaliday
The Law of Leaky Abstractions – Joel on Software
[TCP/IP example]

All non-trivial abstractions, to some degree, are leaky.

...

- Something as simple as iterating over a large two-dimensional array can have radically different performance if you do it horizontally rather than vertically, depending on the “grain of the wood” — one direction may result in vastly more page faults than the other direction, and page faults are slow. Even assembly programmers are supposed to be allowed to pretend that they have a big flat address space, but virtual memory means it’s really just an abstraction, which leaks when there’s a page fault and certain memory fetches take way more nanoseconds than other memory fetches.
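A sketch of the two traversal orders (in pure Python the interpreter overhead mostly hides the cache effect, so treat this as an illustration of the access patterns; in C-like languages with contiguous arrays the timing gap is dramatic):

```python
import time

N = 1500
grid = [[1] * N for _ in range(N)]

def sum_row_major(m):
    # inner loop walks along one row: consecutive elements, cache-friendly
    return sum(m[i][j] for i in range(len(m)) for j in range(len(m[0])))

def sum_col_major(m):
    # inner loop jumps between rows: strided access, cache-hostile
    # in languages where the 2D array is one contiguous block
    return sum(m[i][j] for j in range(len(m[0])) for i in range(len(m)))

t0 = time.perf_counter(); a = sum_row_major(grid); t1 = time.perf_counter()
b = sum_col_major(grid);  t2 = time.perf_counter()
assert a == b == N * N   # same answer either way; only the access pattern differs
print(f"row-major: {t1-t0:.3f}s, column-major: {t2-t1:.3f}s")
```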

- The SQL language is meant to abstract away the procedural steps that are needed to query a database, instead allowing you to define merely what you want and let the database figure out the procedural steps to query it. But in some cases, certain SQL queries are thousands of times slower than other logically equivalent queries. A famous example of this is that some SQL servers are dramatically faster if you specify “where a=b and b=c and a=c” than if you only specify “where a=b and b=c” even though the result set is the same. You’re not supposed to have to care about the procedure, only the specification. But sometimes the abstraction leaks and causes horrible performance and you have to break out the query plan analyzer and study what it did wrong, and figure out how to make your query run faster.
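A tiny sqlite3 sketch of the "logically equivalent queries" point (the schema and data are invented, and SQLite is just a stand-in; the dramatic plan differences Joel mentions show up on the big SQL servers):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE t (a INTEGER, b INTEGER, c INTEGER);
    INSERT INTO t VALUES (1, 1, 1), (1, 1, 2), (2, 2, 2), (3, 4, 5);
""")

q1 = "SELECT * FROM t WHERE a = b AND b = c"
q2 = "SELECT * FROM t WHERE a = b AND b = c AND a = c"  # redundant predicate

r1 = conn.execute(q1).fetchall()
r2 = conn.execute(q2).fetchall()
assert r1 == r2   # logically equivalent: identical result set...

# ...but the planner sees two different queries; on some servers the extra
# predicate unlocks a different (much faster) plan, and the query plan
# analyzer is how you find out.
for row in conn.execute("EXPLAIN QUERY PLAN " + q2):
    print(row)
```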

...

- C++ string classes are supposed to let you pretend that strings are first-class data. They try to abstract away the fact that strings are hard and let you act as if they were as easy as integers. Almost all C++ string classes overload the + operator so you can write s + “bar” to concatenate. But you know what? No matter how hard they try, there is no C++ string class on Earth that will let you type “foo” + “bar”, because string literals in C++ are always char*’s, never strings. The abstraction has sprung a leak that the language doesn’t let you plug. (Amusingly, the history of the evolution of C++ over time can be described as a history of trying to plug the leaks in the string abstraction. Why they couldn’t just add a native string class to the language itself eludes me at the moment.)

- And you can’t drive as fast when it’s raining, even though your car has windshield wipers and headlights and a roof and a heater, all of which protect you from caring about the fact that it’s raining (they abstract away the weather), but lo, you have to worry about hydroplaning (or aquaplaning in England) and sometimes the rain is so strong you can’t see very far ahead so you go slower in the rain, because the weather can never be completely abstracted away, because of the law of leaky abstractions.

One reason the law of leaky abstractions is problematic is that it means that abstractions do not really simplify our lives as much as they were meant to. When I’m training someone to be a C++ programmer, it would be nice if I never had to teach them about char*’s and pointer arithmetic. It would be nice if I could go straight to STL strings. But one day they’ll write the code “foo” + “bar”, and truly bizarre things will happen, and then I’ll have to stop and teach them all about char*’s anyway.

...

The law of leaky abstractions means that whenever somebody comes up with a wizzy new code-generation tool that is supposed to make us all ever-so-efficient, you hear a lot of people saying “learn how to do it manually first, then use the wizzy tool to save time.” Code generation tools which pretend to abstract out something, like all abstractions, leak, and the only way to deal with the leaks competently is to learn about how the abstractions work and what they are abstracting. So the abstractions save us time working, but they don’t save us time learning.
techtariat  org:com  working-stiff  essay  programming  cs  software  abstraction  worrydream  thinking  intricacy  degrees-of-freedom  networking  examples  traces  no-go  volo-avolo  tradeoffs  c(pp)  pls  strings  dbs  transportation  driving  analogy  aphorism  learning  paradox  systems  elegance  nitty-gritty  concrete  cracker-prog  metal-to-virtual  protocol-metadata  design  system-design 
july 2019 by nhaliday
Which of Haskell and OCaml is more practical? For example, in which aspect will each play a key role? - Quora
- Tikhon Jelvis,

Haskell.

This is a question I'm particularly well-placed to answer because I've spent quite a bit of time with both Haskell and OCaml, seeing both in the real world (including working at Jane Street for a bit). I've also seen the languages in academic settings and know many people at startups using both languages. This gives me a good perspective on both languages, with a fairly similar amount of experience in the two (admittedly biased towards Haskell).

And so, based on my own experience rather than the languages' reputations, I can confidently say it's Haskell.

Parallelism and Concurrency

...

Libraries

...

Typeclasses vs Modules

...

In some sense, OCaml modules are better behaved and founded on a sounder theory than Haskell typeclasses, which have some serious drawbacks. However, the fact that typeclasses can be reliably inferred whereas modules have to be explicitly used all the time more than makes up for this. Moreover, extensions to the typeclass system enable much of the power provided by OCaml modules.

...

Of course, OCaml has some advantages of its own as well. It has a performance profile that's much easier to predict. The module system is awesome and often missed in Haskell. Polymorphic variants can be very useful for neatly representing certain situations, and don't have an obvious Haskell analog.

While both languages have a reasonable C FFI, OCaml's seems a bit simpler. It's hard for me to say this with any certainty because I've only used the OCaml FFI myself, but it was quite easy to use—a hard bar for Haskell's to clear. One really nice use of modules in OCaml is to pass around values directly from C as abstract types, which can help avoid extra marshalling/unmarshalling; that seemed very nice in OCaml.

However, overall, I still think Haskell is the more practical choice. Apart from the reasoning above, I simply have my own observations: my Haskell code tends to be clearer, simpler and shorter than my OCaml code. I'm also more productive in Haskell. Part of this is certainly a matter of having more Haskell experience, but the delta is limited especially as I'm working at my third OCaml company. (Of course, the first two were just internships.)

Both Haskell and OCaml are unequivocally superb options—miles ahead of any other languages I know. While I do prefer Haskell, I'd choose either one in a pinch.

--
I've looked at F# a bit, but it feels like it makes too many tradeoffs to be on .NET. You lose the module system, which is probably OCaml's best feature, in return for an unfortunate, nominally typed OOP layer.

I'm also not invested in .NET at all: if anything, I'd prefer to avoid it in favor of simplicity. I exclusively use Linux and, from the outside, Mono doesn't look as good as it could be. I'm also far more likely to interoperate with a C library than a .NET library.

If I had some additional reason to use .NET, I'd definitely go for F#, but right now I don't.

https://www.reddit.com/r/haskell/comments/3huexy/what_are_haskellers_critiques_of_f_and_ocaml/
https://www.reddit.com/r/haskell/comments/3huexy/what_are_haskellers_critiques_of_f_and_ocaml/cub5mmb/
Thinking about it now, it boils down to a single word: expressiveness. When I'm writing OCaml, I feel more constrained than when I'm writing Haskell. And that's important: unlike so many others, what first attracted me to Haskell was expressiveness, not safety. It's easier for me to write code that looks how I want it to look in Haskell. The upper bound on code quality is higher.

...

Perhaps it all boils down to OCaml and its community feeling more "worse is better" than Haskell, something I highly disfavor.

...

Laziness or, more strictly, non-strictness is big. A controversial start, perhaps, but I stand by it. Unlike some, I do not see non-strictness as a design mistake but as a leap in abstraction. Perhaps a leap before its time, but a leap nonetheless. Haskell lets me program without constantly keeping the code's order in my head. Sure, it's not perfect and sometimes performance issues jar the illusion, but they are the exception not the norm. Coming from imperative languages where order is omnipresent (I can't even imagine not thinking about execution order as I write an imperative program!) it's incredibly liberating, even accounting for the weird issues and jinks I'd never see in a strict language.

This is what I imagine life felt like with the first garbage collectors: they may have been slow and awkward, the abstraction might have leaked here and there, but, for all that, it was an incredible advance. You didn't have to constantly think about memory allocation any more. It took a lot of effort to get where we are now and garbage collectors still aren't perfect and don't fit everywhere, but it's hard to imagine the world without them. Non-strictness feels like it has the same potential, without anywhere near the work garbage collection saw put into it.

...

The other big thing that stands out are typeclasses. OCaml might catch up on this front with implicit modules or it might not (Scala implicits are, by many reports, awkward at best—ask Edward Kmett about it, not me) but, as it stands, not having them is a major shortcoming. Not having inference is a bigger deal than it seems: it makes all sorts of idioms we take for granted in Haskell awkward in OCaml which means that people simply don't use them. Haskell's typeclasses, for all their shortcomings (some of which I find rather annoying), are incredibly expressive.

In Haskell, it's trivial to create your own numeric type and operators work as expected. In OCaml, while you can write code that's polymorphic over numeric types, people simply don't. Why not? Because you'd have to explicitly convert your literals and because you'd have to explicitly open a module with your operators—good luck using multiple numeric types in a single block of code! This means that everyone uses the default types: (63/31-bit) ints and doubles. If that doesn't scream "worse is better", I don't know what does.
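A rough Python analogy to the point above (Python's operator overloading, like Haskell's Num typeclass and unlike OCaml's modules, lets a user-defined numeric type mix with plain literals with no explicit conversions or opened modules; Vec2 is an invented example):

```python
class Vec2:
    """A toy numeric type: standard operators work with it like built-ins."""
    def __init__(self, x, y):
        self.x, self.y = x, y
    def __add__(self, other):
        return Vec2(self.x + other.x, self.y + other.y)
    def __mul__(self, k):          # scaling by a plain int/float literal
        return Vec2(self.x * k, self.y * k)
    __rmul__ = __mul__             # so `2 * v` works too, literal first
    def __repr__(self):
        return f"Vec2({self.x}, {self.y})"

v = Vec2(1, 2) + Vec2(3, 4)
w = 2 * v                          # the literal 2 needs no explicit conversion
print(v, w)
```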

...

There's more. Haskell's effect management, brought up elsewhere in this thread, is a big boon. It makes changing things more comfortable and makes informal reasoning much easier. Haskell is the only language where I consistently leave code I visit better than I found it. Even if I hadn't worked on the project in years. My Haskell code has better longevity than my OCaml code, much less other languages.

http://blog.ezyang.com/2011/02/ocaml-gotchas/
One observation about purity and randomness: I think one of the things people frequently find annoying in Haskell is the fact that randomness involves mutation of state, and thus be wrapped in a monad. This makes building probabilistic data structures a little clunkier, since you can no longer expose pure interfaces. OCaml is not pure, and as such you can query the random number generator whenever you want.

However, I think Haskell may get the last laugh in certain circumstances. In particular, if you are using a random number generator in order to generate random test cases for your code, you need to be able to reproduce a particular set of random tests. Usually, this is done by providing a seed which you can then feed back to the testing script, for deterministic behavior. But because OCaml's random number generator manipulates global state, it's very easy to accidentally break determinism by asking for a random number for something unrelated. You can work around it by manually bracketing the global state, but explicitly handling the randomness state means providing determinism is much more natural.
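Python's random module has the same global-state hazard the post describes for OCaml; a sketch of how one unrelated draw silently breaks a seeded test run, and how an explicit generator object ("manually bracketing the state") avoids it:

```python
import random

# Global RNG: one unrelated call in the middle shifts every draw after it.
random.seed(42)
expected = [random.random() for _ in range(3)]

random.seed(42)
random.random()                    # unrelated code asking for "just one" number
broken = [random.random() for _ in range(3)]
assert expected != broken          # determinism silently lost

# Explicit generator objects: the state is bracketed, nothing else touches it.
gen_a = random.Random(42)
gen_b = random.Random(42)
random.random()                    # unrelated global draw, now harmless
assert [gen_a.random() for _ in range(3)] == [gen_b.random() for _ in range(3)]
```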
q-n-a  qra  programming  pls  engineering  nitty-gritty  pragmatic  functional  haskell  ocaml-sml  dotnet  types  arrows  cost-benefit  tradeoffs  concurrency  libraries  performance  expert-experience  composition-decomposition  comparison  critique  multi  reddit  social  discussion  techtariat  reflection  review  random  data-structures  numerics  rand-approx  sublinear  syntax  volo-avolo  causation  scala  jvm  ecosystem  metal-to-virtual 
june 2019 by nhaliday
data structures - Why are Red-Black trees so popular? - Computer Science Stack Exchange
- AVL trees have smaller average depth than red-black trees, and thus searching for a value in AVL tree is consistently faster.
- Red-black trees make fewer structural changes to balance themselves than AVL trees, which could make them potentially faster for insert/delete. I'm saying potentially, because this would depend on the cost of the structural change to the tree, as this will depend a lot on the runtime and implementation (might also be completely different in a functional language when the tree is immutable?)

There are many benchmarks online that compare AVL and Red-black trees, but what struck me is that my professor basically said, that usually you'd do one of two things:
- Either you don't really care that much about performance, in which case the 10-20% difference of AVL vs Red-black in most cases won't matter at all.
- Or you really care about performance, in which case you'd ditch both AVL and Red-black trees, and go with B-trees, which can be tweaked to work much better (or (a,b)-trees, I'm gonna put all of those in one basket.)
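The modest AVL advantage tracks the classic worst-case height bounds (standard textbook results, not from the answer): AVL height is at most about 1.44·log₂ n versus 2·log₂ n for red-black trees, so AVL lookups touch fewer nodes in the worst case, while B-trees win by packing many keys per cache line or disk page. A quick sketch of the bounds:

```python
import math

def avl_height_bound(n):
    # classic AVL bound: height < 1.4405 * log2(n + 2) - 0.3277
    return 1.4405 * math.log2(n + 2) - 0.3277

def rb_height_bound(n):
    # red-black trees: height <= 2 * log2(n + 1)
    return 2 * math.log2(n + 1)

n = 1_000_000
print(f"AVL worst-case height:       {avl_height_bound(n):.1f}")  # ~28
print(f"red-black worst-case height: {rb_height_bound(n):.1f}")   # ~40
```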

--

> For some kinds of binary search trees, including red-black trees but not AVL trees, the "fixes" to the tree can fairly easily be predicted on the way down and performed during a single top-down pass, making the second pass unnecessary. Such insertion algorithms are typically implemented with a loop rather than recursion, and often run slightly faster in practice than their two-pass counterparts.

So a red-black tree insert can be implemented without recursion; on some CPUs recursion is very expensive if you overrun the function call cache (e.g., SPARC, due to its use of register windows).

--

There are some cases where you can't use B-trees at all.

One prominent case is std::map from C++ STL. The standard requires that insert does not invalidate existing iterators

...

I also believe that "single pass tail recursive" implementation is not the reason for red black tree popularity as a mutable data structure.

First of all, stack depth is irrelevant here, because (given log n height) you would run out of the main memory before you run out of stack space. Jemalloc is happy with preallocating worst case depth on the stack.
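The "run out of main memory first" claim checks out numerically (the node size and RAM figures below are invented for illustration):

```python
import math

bytes_per_node = 40        # key, two child pointers, color bit, payload (illustrative)
ram = 64 * 2**30           # 64 GiB of main memory
max_nodes = ram // bytes_per_node

# red-black worst-case height for the largest tree that fits in RAM
height = 2 * math.log2(max_nodes + 1)
print(f"{max_nodes:,} nodes -> recursion depth <= {height:.0f} frames")
```

Roughly 1.7 billion nodes exhaust the RAM while the recursion never needs more than about 62 stack frames, which is negligible on any reasonable stack.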
nibble  q-n-a  overflow  cs  algorithms  tcs  data-structures  functional  orders  trees  cost-benefit  tradeoffs  roots  explanans  impetus  performance  applicability-prereqs  programming  pls  c(pp)  ubiquity 
june 2019 by nhaliday
C++ Core Guidelines
This document is a set of guidelines for using C++ well. The aim of this document is to help people to use modern C++ effectively. By “modern C++” we mean effective use of the ISO C++ standard (currently C++17, but almost all of our recommendations also apply to C++14 and C++11). In other words, what would you like your code to look like in 5 years’ time, given that you can start now? In 10 years’ time?

https://isocpp.github.io/CppCoreGuidelines/
“Within C++ is a smaller, simpler, safer language struggling to get out.” – Bjarne Stroustrup

...

The guidelines are focused on relatively higher-level issues, such as interfaces, resource management, memory management, and concurrency. Such rules affect application architecture and library design. Following the rules will lead to code that is statically type safe, has no resource leaks, and catches many more programming logic errors than is common in code today. And it will run fast - you can afford to do things right.

We are less concerned with low-level issues, such as naming conventions and indentation style. However, no topic that can help a programmer is out of bounds.

Our initial set of rules emphasize safety (of various forms) and simplicity. They may very well be too strict. We expect to have to introduce more exceptions to better accommodate real-world needs. We also need more rules.

...

The rules are designed to be supported by an analysis tool. Violations of rules will be flagged with references (or links) to the relevant rule. We do not expect you to memorize all the rules before trying to write code.

contrary:
https://aras-p.info/blog/2018/12/28/Modern-C-Lamentations/
This will be a long wall of text, and kinda random! My main points are:
1. C++ compile times are important,
2. Non-optimized build performance is important,
3. Cognitive load is important. I don’t expand much on this here, but if a programming language or a library makes me feel stupid, then I’m less likely to use it or like it. C++ does that a lot :)
programming  engineering  pls  best-practices  systems  c(pp)  guide  metabuch  objektbuch  reference  cheatsheet  elegance  frontier  libraries  intricacy  advanced  advice  recommendations  big-picture  novelty  lens  philosophy  state  error  types  concurrency  memory-management  performance  abstraction  plt  compilers  expert-experience  multi  checking  devtools  flux-stasis  safety  system-design  techtariat  time  measure  dotnet  comparison  examples  build-packaging  thinking  worse-is-better/the-right-thing  cost-benefit  tradeoffs  essay  commentary  oop  correctness  computer-memory  error-handling  resources-effects  latency-throughput 
june 2019 by nhaliday
The End of the Editor Wars » Linux Magazine
Moreover, even if you assume a broad margin of error, the pollings aren't even close. With all the various text editors available today, Vi and Vim continue to be the choice of over a third of users, while Emacs is well back in the pack, no longer a competitor for the most popular text editor.

https://www.quora.com/Are-there-more-Emacs-or-Vim-users
I believe Vim is actually more popular, but it's hard to find any real data on it. The best source I've seen is the annual StackOverflow developer survey where 15.2% of developers used Vim compared to a mere 3.2% for Emacs.

Oddly enough, the report noted that "Data scientists and machine learning developers are about 3 times more likely to use Emacs than any other type of developer," which is not necessarily what I would have expected.

[ed. NB: Vim still dominates overall.]

https://pinboard.in/u:nhaliday/b:6adc1b1ef4dc

Time To End The vi/Emacs Debate: https://cacm.acm.org/blogs/blog-cacm/226034-time-to-end-the-vi-emacs-debate/fulltext

Vim, Emacs and their forever war. Does it even matter any more?: https://blog.sourcerer.io/vim-emacs-and-their-forever-war-does-it-even-matter-any-more-697b1322d510
Like an episode of “Silicon Valley”, a discussion of Emacs vs. Vim used to have a polarizing effect that would guarantee a stimulating conversation, regardless of an engineer’s actual alignment. But nowadays, diehard Emacs and Vim users are getting much harder to find. Maybe I’m in the wrong orbit, but looking around today, I see that engineers are equally or even more likely to choose any one of a number of great (for any given definition of ‘great’) modern editors or IDEs such as Sublime Text, Visual Studio Code, Atom, IntelliJ (… or one of its siblings), Brackets, Visual Studio or Xcode, to name a few. It’s not surprising really — many top engineers weren’t even born when these editors were at version 1.0, and GUIs (for better or worse) hadn’t been invented.

...

… both forums have high traffic and up-to-the-minute comment and discussion threads. Some of the available statistics paint a reasonably healthy picture — Stackoverflow’s 2016 developer survey ranks Vim 4th out of 24 with 26.1% of respondents in the development environments category claiming to use it. Emacs came 15th with 5.2%. In combination, over 30% is, actually, quite impressive considering they’ve been around for several decades.

What’s odd, however, is that if you ask someone — say a random developer — to express a preference, the likelihood is that they will favor one or the other even if they have used neither in anger. Maybe the meme has spread so widely that all responses are now predominantly ritualistic, and represent something more fundamental than peoples’ mere preference for an editor? There’s a rather obvious political hypothesis waiting to be made — that Emacs is the leftist, socialist, centralized state, while Vim represents the right and the free market, specialization and capitalism red in tooth and claw.

How is Emacs/Vim used in companies like Google, Facebook, or Quora? Are there any libraries or tools they share in public?: https://www.quora.com/How-is-Emacs-Vim-used-in-companies-like-Google-Facebook-or-Quora-Are-there-any-libraries-or-tools-they-share-in-public
In Google there's a fair amount of vim and emacs. I would say at least every other engineer uses one or another.

Among Software Engineers, emacs seems to be more popular, about 2:1. Among Site Reliability Engineers, vim is more popular, about 9:1.
--
People use both at Facebook, with (in my opinion) slightly better tooling for Emacs than Vim. We share a master.emacs and master.vimrc file, which contains the bare essentials (like syntactic highlighting for the Hack language). We also share a Ctags file that's updated nightly with a cron script.

Beyond the essentials, there's a group for Emacs users at Facebook that provides tips, tricks, and major-modes created by people at Facebook. That's where Adam Hupp first developed his excellent mural-mode (ahupp/mural), which does for Ctags what iDo did for file finding and buffer switching.
--
For emacs, it was very informal at Google. There wasn't a huge community of Emacs users at Google, so there wasn't much more than a wiki and a couple language styles matching Google's style guides.

https://trends.google.com/trends/explore?date=all&geo=US&q=%2Fm%2F07zh7,%2Fm%2F01yp0m

https://www.quora.com/Why-is-interest-in-Emacs-dropping
And it is still that. It’s just that emacs is no longer unique, and neither is Lisp.

Dynamically typed scripting languages with garbage collection are a dime a dozen now. Anybody in their right mind developing an extensible text editor today would just use python, ruby, lua, or JavaScript as the extension language and get all the power of Lisp combined with vibrant user communities and millions of lines of ready-made libraries that Stallman and Steele could only dream of in the 70s.

In fact, in many ways emacs and elisp have fallen behind: 40 years after Lambda, the Ultimate Imperative, elisp is still dynamically scoped, and it still doesn’t support multithreading — when I try to use dired to list the files on a slow NFS mount, the entire editor hangs just as thoroughly as it might have in the 1980s. And when I say “doesn’t support multithreading,” I don’t mean there is some other clever trick for continuing to do work while waiting on a system call, like asynchronous callbacks or something. There’s start-process which forks a whole new process, and that’s about it. It’s a concurrency model straight out of 1980s UNIX land.

But being essentially just a decent text editor has robbed emacs of much of its competitive advantage. In a world where every developer tool is scriptable with languages and libraries an order of magnitude more powerful than cranky old elisp, the reason to use emacs is not that it lets a programmer hit a button and evaluate the current expression interactively (which must have been absolutely amazing at one point in the past).

https://www.reddit.com/r/emacs/comments/bh5kk7/why_do_many_new_users_still_prefer_vim_over_emacs/

more general comparison, not just popularity:
Differences between Emacs and Vim: https://stackoverflow.com/questions/1430164/differences-between-Emacs-and-vim

https://www.reddit.com/r/emacs/comments/9hen7z/what_are_the_benefits_of_emacs_over_vim/

https://unix.stackexchange.com/questions/986/what-are-the-pros-and-cons-of-vim-and-emacs

https://www.quora.com/Why-is-Vim-the-programmers-favorite-editor
- Adrien Lucas Ecoffet,

Because it is hard to use. Really.

However, the second part of this sentence applies to just about every good editor out there: if you really learn Sublime Text, you will become super productive. If you really learn Emacs, you will become super productive. If you really learn Visual Studio… you get the idea.

Here’s the thing though, you never actually need to really learn your text editor… Unless you use vim.

...

For many people new to programming, this is the first time they have been a power user of… well, anything! And because they’ve been told how great Vim is, many of them will keep at it and actually become productive, not because Vim is particularly more productive than any other editor, but because it didn’t provide them with a way to not be productive.

They then go on to tell their friends how great Vim is, and their friends go on to become power users and tell their friends in turn, and so forth. All these people believe they became productive because they changed their text editor. Little do they realize that they became productive because their text editor changed them[1].

This is in no way a criticism of Vim. I myself was a beneficiary of such a phenomenon when I learned to type using the Dvorak layout: at that time, I believed that Dvorak would help you type faster. Now I realize the evidence is mixed and that Dvorak might not be much better than Qwerty. However, learning Dvorak forced me to develop good typing habits because I could no longer rely on looking at my keyboard (since I was still using a Qwerty physical keyboard), and this has made me a much more productive typist.

Technical Interview Performance by Editor/OS/Language: https://triplebyte.com/blog/technical-interview-performance-by-editor-os-language
[ed.: I'm guessing this is confounded to all hell.]

The #1 most common editor we see used in interviews is Sublime Text, with Vim close behind.

Emacs represents a fairly small market share today at just about a quarter the userbase of Vim in our interviews. This nicely matches the 4:1 ratio of Google Search Trends for the two editors.

...

Vim takes the prize here, but PyCharm and Emacs are close behind. We’ve found that users of these editors tend to pass our interview at an above-average rate.

On the other end of the spectrum is Eclipse: it appears that someone using either Vim or Emacs is more than twice as likely to pass our technical interview as an Eclipse user.

...

In this case, we find that the average Ruby, Swift, and C# users tend to be stronger, with Python and Javascript in the middle of the pack.

...

Here’s what happens after we select engineers to work with and send them to onsites:

[Python does best.]

There are no wild outliers here, but let’s look at the C++ segment. While C++ programmers have the most challenging time passing Triplebyte’s technical interview on average, the ones we choose to work with tend to have a relatively easier time getting offers at each onsite.

The Rise of Microsoft Visual Studio Code: https://triplebyte.com/blog/editor-report-the-rise-of-visual-studio-code
This chart shows the rates at which each editor's users pass our interview compared to the mean pass rate for all candidates. First, notice the preeminence of Emacs and Vim! Engineers who use these editors pass our interview at significantly higher rates than other engineers. And the effect size is not small. Emacs users pass our interview at a rate 50… [more]
news  linux  oss  tech  editors  devtools  tools  comparison  ranking  flux-stasis  trends  ubiquity  unix  increase-decrease  multi  q-n-a  qra  data  poll  stackex  sv  facebook  google  integration-extension  org:med  politics  stereotypes  coalitions  decentralized  left-wing  right-wing  chart  scale  time-series  distribution  top-n  list  discussion  ide  parsimony  intricacy  cost-benefit  tradeoffs  confounding  analysis  crosstab  pls  python  c(pp)  jvm  microsoft  golang  hmm  correlation  debate  critique  quora  contrarianism  ecosystem  DSL 
june 2019 by nhaliday
Should I go for TensorFlow or PyTorch?
Honestly, most experts that I know love Pytorch and detest TensorFlow. Karpathy and Justin from Stanford for example. You can see Karpathy's thoughts and I've asked Justin personally and the answer was sharp: PYTORCH!!! TF has lots of PR but its API and graph model are horrible and will waste lots of your research time.

--

...

Updated Mar 12
Update after 2019 TF summit:

TL/DR: previously I was in the pytorch camp but with TF 2.0 it’s clear that Google is really going to try to have parity with or be better than Pytorch in all aspects where people voiced concerns (ease of use/debugging/dynamic graphs). They seem to be allocating more resources to development than Facebook, so the longer term currently looks promising for Google. Prior to TF 2.0 I thought that the Pytorch team had more momentum. One area where FB/Pytorch is still stronger: Google is a bit more closed and doesn’t seem to release reproducible cutting-edge models such as AlphaGo, whereas FAIR released OpenGo, for instance. Generally you will end up running into models that are only implemented in one framework or the other, so chances are you might end up learning both.
q-n-a  qra  comparison  software  recommendations  cost-benefit  tradeoffs  python  libraries  machine-learning  deep-learning  data-science  sci-comp  tools  google  facebook  tech  competition  best-practices  trends  debugging  expert-experience  ecosystem  theory-practice  pragmatic  wire-guided  static-dynamic  state  academia  frameworks  open-closed 
may 2019 by nhaliday
One week of bugs
If I had to guess, I'd say I probably work around hundreds of bugs in an average week, and thousands in a bad week. It's not unusual for me to run into a hundred new bugs in a single week. But I often get skepticism when I mention that I run into multiple new (to me) bugs per day, and that this is inevitable if we don't change how we write tests. Well, here's a log of one week of bugs, limited to bugs that were new to me that week. After a brief description of the bugs, I'll talk about what we can do to improve the situation. The obvious answer is to spend more effort on testing, but everyone already knows we should do that and no one does it. That doesn't mean it's hopeless, though.

...

Here's where I'm supposed to write an appeal to take testing more seriously and put real effort into it. But we all know that's not going to work. It would take 90k LOC of tests to get Julia to be as well tested as a poorly tested prototype (falsely assuming linear complexity in size). That's two person-years of work, not even including time to debug and fix bugs (which probably brings it closer to four or five years). Who's going to do that? No one. Writing tests is like writing documentation. Everyone already knows you should do it. Telling people they should do it adds zero information[1].

Given that people aren't going to put any effort into testing, what's the best way to do it?

Property-based testing. Generative testing. Random testing. Concolic Testing (which was done long before the term was coined). Static analysis. Fuzzing. Statistical bug finding. There are lots of options. Some of them are actually the same thing because the terminology we use is inconsistent and buggy. I'm going to arbitrarily pick one to talk about, but they're all worth looking into.
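Random/property-based testing is easy to sketch even without a library. Below is a minimal, hypothetical harness (my illustration, not from the post; real tools like Hypothesis add input shrinking and smarter generation): it generates random lists and checks two properties any sort must satisfy, catching a deliberately buggy sort that drops duplicates.

```python
import random

def sort_badly(xs):
    # Deliberately buggy "sort": drops duplicates because it
    # round-trips the input through a set before sorting.
    return sorted(set(xs))

def check_sort(sort_fn, trials=1000, seed=0):
    """Randomly generate inputs and check two properties of a sort:
    the output is ordered, and it is a permutation of the input.
    Returns a counterexample input, or None if all trials pass."""
    rng = random.Random(seed)
    for _ in range(trials):
        xs = [rng.randint(-5, 5) for _ in range(rng.randint(0, 8))]
        ys = sort_fn(xs)
        if ys != sorted(ys) or sorted(xs) != list(ys):
            return xs  # property violated: counterexample found
    return None
```

Running `check_sort(sorted)` finds no counterexample, while `check_sort(sort_badly)` quickly produces a list with duplicates that exposes the bug.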

...

There are a lot of great resources out there, but if you're just getting started, I found this description of types of fuzzers to be one of the most helpful (and simplest) things I've read.

John Regehr has a Udacity course on software testing. I haven't worked through it yet (Pablo Torres just pointed me to it), but given the quality of Dr. Regehr's writing, I expect the course to be good.

For more on my perspective on testing, there's this.

Everything's broken and nobody's upset: https://www.hanselman.com/blog/EverythingsBrokenAndNobodysUpset.aspx
https://news.ycombinator.com/item?id=4531549

https://hypothesis.works/articles/the-purpose-of-hypothesis/
From the perspective of a user, the purpose of Hypothesis is to make it easier for you to write better tests.

From my perspective as the primary author, that is of course also a purpose of Hypothesis. I write a lot of code, it needs testing, and the idea of trying to do that without Hypothesis has become nearly unthinkable.

But, on a large scale, the true purpose of Hypothesis is to drag the world kicking and screaming into a new and terrifying age of high quality software.

Software is everywhere. We have built a civilization on it, and it’s only getting more prevalent as more services move online and embedded and “internet of things” devices become cheaper and more common.

Software is also terrible. It’s buggy, it’s insecure, and it’s rarely well thought out.

This combination is clearly a recipe for disaster.

The state of software testing is even worse. It’s uncontroversial at this point that you should be testing your code, but it’s a rare codebase whose authors could honestly claim that they feel its testing is sufficient.

Much of the problem here is that it’s too hard to write good tests. Tests take up a vast quantity of development time, but they mostly just laboriously encode exactly the same assumptions and fallacies that the authors had when they wrote the code, so they miss exactly the same bugs that the authors missed when they wrote the code.

Preventing the Collapse of Civilization [video]: https://news.ycombinator.com/item?id=19945452
- Jonathan Blow

NB: DevGAMM is a game industry conference

- loss of technological knowledge (Antikythera mechanism, aqueducts, etc.)
- hardware driving most gains, not software
- software's actually less robust, often poorly designed and overengineered these days
- *list of bugs he's encountered recently*:
https://youtu.be/pW-SOdj4Kkk?t=1387
- knowledge of trivia becomes more than general, deep knowledge
- does at least acknowledge value of DRY, reusing code, abstraction saving dev time
techtariat  dan-luu  tech  software  error  list  debugging  linux  github  robust  checking  oss  troll  lol  aphorism  webapp  email  google  facebook  games  julia  pls  compilers  communication  mooc  browser  rust  programming  engineering  random  jargon  formal-methods  expert-experience  prof  c(pp)  course  correctness  hn  commentary  video  presentation  carmack  pragmatic  contrarianism  pessimism  sv  unix  rhetoric  critique  worrydream  hardware  performance  trends  multiplicative  roots  impact  comparison  history  iron-age  the-classics  mediterranean  conquest-empire  gibbon  technology  the-world-is-just-atoms  flux-stasis  increase-decrease  graphics  hmm  idk  systems  os  abstraction  intricacy  worse-is-better/the-right-thing  build-packaging  microsoft  osx  apple  reflection  assembly  things  knowledge  detail-architecture  thick-thin  trivia  info-dynamics  caching  frameworks  generalization  systematic-ad-hoc  universalism-particularism  analytical-holistic  structure  tainter  libraries  tradeoffs  prepping  threat-modeling  network-structure  writing  risk  local-glob 
may 2019 by nhaliday
algorithm - Skip List vs. Binary Search Tree - Stack Overflow
Skip lists are more amenable to concurrent access/modification. Herb Sutter wrote an article about data structures in concurrent environments; it has more in-depth information.

The most frequently used implementation of a binary search tree is a red-black tree. The concurrent problems come in when the tree is modified it often needs to rebalance. The rebalance operation can affect large portions of the tree, which would require a mutex lock on many of the tree nodes. Inserting a node into a skip list is far more localized, only nodes directly linked to the affected node need to be locked.
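A minimal single-threaded skip list sketch (my illustration, not from the answer) makes the locality argument concrete: the `update` array collected during the search holds exactly the predecessor nodes a lock-based concurrent version would need to lock, while a red-black tree rebalance can touch nodes far from the insertion point.

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one next-pointer per level

class SkipList:
    MAX_LEVEL = 8

    def __init__(self, seed=0):
        self.head = Node(None, self.MAX_LEVEL)  # sentinel, never compared
        self.level = 0
        self.rng = random.Random(seed)

    def _random_level(self):
        # Coin-flip tower height: level i with probability 2^-(i+1).
        lvl = 0
        while self.rng.random() < 0.5 and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def insert(self, key):
        # `update[i]` is the last node before the insertion point at
        # level i -- the ONLY nodes a concurrent implementation would
        # need to lock; there is no global rebalancing step.
        update = [None] * (self.MAX_LEVEL + 1)
        x = self.head
        for i in range(self.level, -1, -1):
            while x.forward[i] is not None and x.forward[i].key < key:
                x = x.forward[i]
            update[i] = x
        lvl = self._random_level()
        if lvl > self.level:
            for i in range(self.level + 1, lvl + 1):
                update[i] = self.head
            self.level = lvl
        node = Node(key, lvl)
        for i in range(lvl + 1):
            node.forward[i] = update[i].forward[i]
            update[i].forward[i] = node

    def contains(self, key):
        x = self.head
        for i in range(self.level, -1, -1):
            while x.forward[i] is not None and x.forward[i].key < key:
                x = x.forward[i]
        x = x.forward[0]
        return x is not None and x.key == key
```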
q-n-a  stackex  nibble  programming  tcs  data-structures  performance  concurrency  comparison  cost-benefit  applicability-prereqs  random  trees  tradeoffs 
may 2019 by nhaliday
linux - What's the difference between .so, .la and .a library files? - Stack Overflow
.so files are dynamic libraries. The suffix stands for "shared object", because all the applications that are linked with the library use the same file, rather than making a copy in the resulting executable.

.a files are static libraries. The suffix stands for "archive", because they're actually just an archive (made with the ar command -- a predecessor of tar that's now just used for making libraries) of the original .o object files.
q-n-a  stackex  programming  engineering  build-packaging  tradeoffs  yak-shaving  nitty-gritty  best-practices  cost-benefit  worse-is-better/the-right-thing 
april 2019 by nhaliday
“Give Anything” | An Algorithmic Lucidity
As a freshman on my high school's cross country team, our captain told me that to be a good runner, you needed to love pain.

I objected: a great runner could love to race, I said, and endure the pain only for the sake of competing and winning.

It's only fifteen years later (practically one foot in the grave) that I now see that I was wrong and he was right.
ratty  techtariat  aphorism  running  fitness  stoic  impetus  ends-means  biases  emotion  endurance  cost-benefit  tradeoffs 
march 2019 by nhaliday
Which benchmark programs are faster? | Computer Language Benchmarks Game
old:
https://salsa.debian.org/benchmarksgame-team/archive-alioth-benchmarksgame
https://web.archive.org/web/20170331153459/http://benchmarksgame.alioth.debian.org/
includes Scala

very outdated but more languages: https://web.archive.org/web/20110401183159/http://shootout.alioth.debian.org:80/

OCaml seems to offer the best tradeoff of performance vs parsimony (Haskell not so much :/)
https://blog.chewxy.com/2019/02/20/go-is-average/
http://blog.gmarceau.qc.ca/2009/05/speed-size-and-dependability-of.html
old official: https://web.archive.org/web/20130731195711/http://benchmarksgame.alioth.debian.org/u64q/code-used-time-used-shapes.php
https://web.archive.org/web/20121125103010/http://shootout.alioth.debian.org/u64q/code-used-time-used-shapes.php
Haskell does better here

other PL benchmarks:
https://github.com/kostya/benchmarks
BF 2.0:
Kotlin, C++ (GCC), Rust < Nim, D (GDC,LDC), Go, MLton < Crystal, Go (GCC), C# (.NET Core), Scala, Java, OCaml < D (DMD) < C# Mono < Javascript V8 < F# Mono, Javascript Node, Haskell (MArray) << LuaJIT << Python PyPy < Haskell < Racket <<< Python << Python3
mandel.b:
C++ (GCC) << Crystal < Rust, D (GDC), Go (GCC) < Nim, D (LDC) << C# (.NET Core) < MLton << Kotlin << OCaml << Scala, Java << D (DMD) << Go << C# Mono << Javascript Node << Haskell (MArray) << LuaJIT < Python PyPy << F# Mono <<< Racket
https://github.com/famzah/langs-performance
C++, Rust, Java w/ custom non-stdlib code < Python PyPy < C# .Net Core < Javascript Node < Go, unoptimized C++ (no -O2) << PHP << Java << Python3 << Python
comparison  pls  programming  performance  benchmarks  list  top-n  ranking  systems  time  multi  🖥  cost-benefit  tradeoffs  data  analysis  plots  visualization  measure  intricacy  parsimony  ocaml-sml  golang  rust  jvm  javascript  c(pp)  functional  haskell  backup  scala  realness  generalization  accuracy  techtariat  crosstab  database  repo  objektbuch  static-dynamic  gnu 
december 2018 by nhaliday
WHO | Priority environment and health risks
also: http://www.who.int/heli/risks/vectors/vector/en/

Environmental factors are a root cause of a significant disease burden, particularly in developing countries. An estimated 25% of death and disease globally, and nearly 35% in regions such as sub-Saharan Africa, is linked to environmental hazards. Some key areas of risk include the following:

- Unsafe water, poor sanitation and hygiene kill an estimated 1.7 million people annually, particularly as a result of diarrhoeal disease.
- Indoor smoke from solid fuels kills an estimated 1.6 million people annually due to respiratory diseases.
- Malaria kills over 1.2 million people annually, mostly African children under the age of five. Poorly designed irrigation and water systems, inadequate housing, poor waste disposal and water storage, deforestation and loss of biodiversity, all may be contributing factors to the most common vector-borne diseases including malaria, dengue and leishmaniasis.
- Urban air pollution generated by vehicles, industries and energy production kills approximately 800 000 people annually.
- Unintentional acute poisonings kill 355 000 people globally each year. In developing countries, where two-thirds of these deaths occur, such poisonings are associated strongly with excessive exposure to, and inappropriate use of, toxic chemicals and pesticides present in occupational and/or domestic environments.
- Climate change impacts including more extreme weather events, changed patterns of disease and effects on agricultural production, are estimated to cause over 150 000 deaths annually.

ed.:
Note the high point at human origin (Africa, Middle East) and Asia. Low points in New World and Europe/Russia. Probably key factor in explaining human psychological variation (Haidt axes, individualism-collectivism, kinship structure, etc.). E.g., compare Islam/Judaism (circumcision, food preparation/hygiene rules) and Christianity (orthodoxy more than orthopraxy, no arbitrary practices for group-marking).

I wonder if the dietary and hygiene laws of Christianity get up-regulated in higher parasite load places (the US South, Middle Eastern Christianity, etc.)?

Also, the reason for this variation probably boils down to how long local microbes have had time to adapt to the human immune system.

obv. correlation: https://pinboard.in/u:nhaliday/b:074ecdf30c50

Tropical disease: https://en.wikipedia.org/wiki/Tropical_disease
Tropical diseases are diseases that are prevalent in or unique to tropical and subtropical regions.[1] The diseases are less prevalent in temperate climates, due in part to the occurrence of a cold season, which controls the insect population by forcing hibernation. However, many were present in northern Europe and northern America in the 17th and 18th centuries before modern understanding of disease causation. The initial impetus for tropical medicine was to protect the health of colonialists, notably in India under the British Raj.[2] Insects such as mosquitoes and flies are by far the most common disease carrier, or vector. These insects may carry a parasite, bacterium or virus that is infectious to humans and animals. Most often disease is transmitted by an insect "bite", which causes transmission of the infectious agent through subcutaneous blood exchange. Vaccines are not available for most of the diseases listed here, and many do not have cures.

cf. Galton: https://pinboard.in/u:nhaliday/b:f72f8e03e729
org:gov  org:ngo  trivia  maps  data  visualization  pro-rata  demographics  death  disease  spreading  parasites-microbiome  world  developing-world  africa  MENA  asia  china  sinosphere  orient  europe  the-great-west-whale  occident  explanans  individualism-collectivism  n-factor  things  phalanges  roots  values  anthropology  cultural-dynamics  haidt  scitariat  morality  correlation  causation  migration  sapiens  history  antiquity  time  bio  EEA  eden-heaven  religion  christianity  islam  judaism  theos  ideology  database  list  tribalism  us-them  archaeology  environment  nature  climate-change  atmosphere  health  fluid  farmers-and-foragers  age-of-discovery  usa  the-south  speculation  questions  flexibility  epigenetics  diet  food  sanctity-degradation  multi  henrich  kinship  gnon  temperature  immune  investing  cost-benefit  tradeoffs 
july 2018 by nhaliday
Antinomia Imediata – experiments in a reaction from the left
https://antinomiaimediata.wordpress.com/lrx/
So, what is the Left Reaction? First of all, it’s reaction: opposition to the modern rationalist establishment, the Cathedral. It opposes the universalist Jacobin program of global government, favoring a fractured geopolitics organized through long-evolved complex systems. It’s profoundly anti-socialist and anti-communist, favoring market economy and individualism. It abhors tribalism and seeks a realistic plan for dismantling it (primarily informed by HBD and HBE). It looks at modernity as a degenerative ratchet, whose only way out is intensification (hence clinging to crypto-marxist market-driven acceleration).

How can any of this still be on the *Left*? It defends equality of power, i.e. freedom. This radical understanding of liberty is deeply rooted in leftist tradition and has been consistently abhorred by the Right. LRx is not democratic, not socialist, not progressive, and not even liberal (in the current, American sense). But it defends equality of power. Its utopia is individual sovereignty. Its method is paleo-agorism. The anti-hierarchy of hunter-gatherer nomads is its understanding of the only realistic objective of equality.

...

In more cosmic terms, it seeks only to fulfill the Revolution’s side in the left-right intelligence pump: mutation or creation of paths. Proudhon’s antinomy is essentially about this: the collective force of the socius, evinced in moral standards and social organization vs the creative force of the individuals, that constantly revolutionize and disrupt the social body. The interplay of these forces create reality (it’s a metaphysics indeed): the Absolute (socius) builds so that the (individualistic) Revolution can destroy so that the Absolute may adapt, and then repeat. The good old formula of ‘solve et coagula’.

Ultimately, if the Neoreaction promises eternal hell, the LRx sneers “but Satan is with us”.

https://antinomiaimediata.wordpress.com/2016/12/16/a-statement-of-principles/
Liberty is to be understood as the ability and right of all sentient beings to dispose of their persons and the fruits of their labor, and nothing else, as they see fit. This stems from their self-awareness and their ability to control and choose the content of their actions.

...

Equality is to be understood as the state of no imbalance of power, that is, of no subjection to another sentient being. This stems from their universal ability for empathy, and from their equal ability for reason.

...

It is important to notice that, contrary to usual statements of these two principles, my standpoint is that Liberty and Equality here are not merely compatible, meaning they could coexist in some possible universe, but rather they are two sides of the same coin, complementary and interdependent. There can be NO Liberty where there is no Equality, for the imbalance of power, the state of subjection, will render sentient beings unable to dispose of their persons and the fruits of their labor[1], and it will limit their ability to choose over their rightful jurisdiction. Likewise, there can be NO Equality without Liberty, for restraining sentient beings’ ability to choose and dispose of their persons and fruits of labor will render some more powerful than the rest, and establish a state of subjection.

https://antinomiaimediata.wordpress.com/2017/04/18/flatness/
equality is the founding principle of (and is ultimately indistinguishable from) freedom. of course, it’s only in one specific sense of “equality” that this sentence is true.

to try and eliminate the bullshit, let’s turn to networks again:

any node’s degrees of freedom is the number of nodes it is connected to in a network. freedom is maximum when the network is symmetrically connected, i.e., when all nodes are connected to each other and thus there is no topographical hierarchy (middlemen) – in other words, flatness.

in this understanding, the maximization of freedom is the maximization of entropy production, that is, of intelligence. As Land puts it:

https://antinomiaimediata.wordpress.com/category/philosophy/mutualism/
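Reading "degrees of freedom" as plain graph degree, the flat-vs-hierarchical contrast above can be shown in a few lines (a toy sketch of my own, not the author's):

```python
from itertools import combinations

def degrees(n, edges):
    """Degree of each node in an undirected graph on nodes 0..n-1."""
    deg = [0] * n
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return deg

n = 5
flat = list(combinations(range(n), 2))  # complete graph: no middlemen
hub = [(0, i) for i in range(1, n)]     # star graph: node 0 is the middleman

assert degrees(n, flat) == [n - 1] * n                 # every node maximally connected
assert degrees(n, hub) == [n - 1] + [1] * (n - 1)      # hierarchy: hub vs leaves
```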
gnon  blog  stream  politics  polisci  ideology  philosophy  land  accelerationism  left-wing  right-wing  paradox  egalitarianism-hierarchy  civil-liberty  power  hmm  revolution  analytical-holistic  mutation  selection  individualism-collectivism  tribalism  us-them  modernity  multi  tradeoffs  network-structure  complex-systems  cybernetics  randy-ayndy  insight  contrarianism  metameta  metabuch  characterization  cooperate-defect  n-factor  altruism  list  coordination  graphs  visual-understanding  cartoons  intelligence  entropy-like  thermo  information-theory  order-disorder  decentralized  distribution  degrees-of-freedom  analogy  graph-theory  extrema  evolution  interdisciplinary  bio  differential  geometry  anglosphere  optimate  nascent-state  deep-materialism  new-religion  cool  mystic  the-classics  self-interest  interests  reason  volo-avolo  flux-stasis  invariance  government  markets  paying-rent  cost-benefit  peace-violence  frontier  exit-voice  nl-and-so-can-you  war  track-record  usa  history  mostly-modern  world-war  military  justice  protestant-cathol 
march 2018 by nhaliday
The Space Trilogy - Wikipedia
Out of the Silent Planet:

Weston makes a long speech justifying his proposed invasion of Malacandra on "progressive" and evolutionary grounds, which Ransom attempts to translate into Malacandrian, thus laying bare the brutality and crudity of Weston's ambitions.

Oyarsa listens carefully to Weston's speech and acknowledges that the scientist is acting out of a sense of duty to his species, and not mere greed. This renders him more mercifully disposed towards the scientist, who accepts that he may die while giving Man the means to continue. However, on closer examination Oyarsa points out that Weston's loyalty is not to Man's mind – or he would equally value the intelligent alien minds already inhabiting Malacandra, instead of seeking to displace them in favour of humanity; nor to Man's body – since, as Weston is well aware of and at ease with, Man's physical form will alter over time, and indeed would have to in order to adapt to Weston's programme of space exploration and colonisation. It seems then that Weston is loyal only to "the seed" – Man's genome – which he seeks to propagate. When Oyarsa questions why this is an intelligible motivation for action, Weston's eloquence fails him and he can only articulate that if Oyarsa does not understand Man's basic loyalty to Man then he, Weston, cannot possibly instruct him.

...

Perelandra:

The rafts or floating islands are indeed Paradise, not only in the sense that they provide a pleasant and care-free life (until the arrival of Weston) but also in the sense that Ransom is for weeks and months naked in the presence of a beautiful naked woman without once lusting after her or being tempted to seduce her. This is because of the perfection in that world.

The plot thickens when Professor Weston arrives in a spaceship and lands in a part of the ocean quite close to the Fixed Land. He at first announces to Ransom that he is a reformed man, but appears to still be in search of power. Instead of the strictly materialist attitude he displayed when first meeting Ransom, he asserts he had become aware of the existence of spiritual beings and pledges allegiance to what he calls the "Life-Force." Ransom, however, disagrees with Weston's position that the spiritual is inherently good, and indeed Weston soon shows signs of demonic possession.

In this state, the possessed Weston finds the Queen and tries to tempt her into defying Maleldil's orders by spending a night on the Fixed Land. Ransom, perceiving this, believes that he must act as a counter-tempter. Well versed in the Bible and Christian theology, Ransom realises that if the pristine Queen, who has never heard of Evil, succumbs to the tempter's arguments, the Fall of Man will be re-enacted on Perelandra. He struggles through day after day of lengthy arguments illustrating various approaches to temptation, but the demonic Weston shows super-human brilliance in debate (though when "off-duty" he displays moronic, asinine behaviour and small-minded viciousness) and moreover appears never to need sleep.

With the demonic Weston on the verge of winning, the desperate Ransom hears in the night what he gradually realises is a Divine voice, commanding him to physically attack the Tempter. Ransom is reluctant, and debates with the divine (inner) voice for the entire duration of the night. A curious twist is introduced here; whereas the name "Ransom" is said to be derived from the title "Ranolf's Son", it can also refer to a reward given in exchange for a treasured life. Recalling this, and recalling that his God would (and has) sacrificed Himself in a similar situation, Ransom decides to confront the Tempter outright.

Ransom attacks his opponent bare-handed, using only physical force. Weston's body is unable to withstand this despite the Tempter's superior abilities of rhetoric, and so the Tempter flees. Ultimately Ransom chases him over the ocean, Weston fleeing and Ransom chasing on the backs of giant and friendly fish. During a fleeting truce, the "real" Weston appears to momentarily re-inhabit his body, and recount his experience of Hell, wherein the damned soul is not consigned to pain or fire, as supposed by popular eschatology, but is absorbed into the Devil, losing all independent existence.
fiction  scifi-fantasy  tip-of-tongue  literature  big-peeps  religion  christianity  theos  space  xenobio  analogy  myth  eden  deep-materialism  new-religion  sanctity-degradation  civil-liberty  exit-voice  speaking  truth  realness  embodied  fighting  old-anglo  group-selection  war  paying-rent  counter-revolution  morality  parable  competition  the-basilisk  gnosis-logos  individualism-collectivism  language  physics  science  evolution  conquest-empire  self-interest  hmm  intricacy  analytical-holistic  tradeoffs  paradox  heterodox  narrative  philosophy  expansionism  genetics  duty  us-them  interests  nietzschean  parallax  the-devil  the-self 
january 2018 by nhaliday
Quis custodiet ipsos custodes? - Wikipedia
Quis custodiet ipsos custodes? is a Latin phrase found in the work of the Roman poet Juvenal from his Satires (Satire VI, lines 347–348). It is literally translated as "Who will guard the guards themselves?", though it is also known by variant translations.

The original context deals with the problem of ensuring marital fidelity, though it is now commonly used more generally to refer to the problem of controlling the actions of persons in positions of power, an issue discussed by Plato in the Republic. It is not clear whether the phrase was written by Juvenal, or whether the passage in which it appears was interpolated into his works.

...

This phrase is used generally to consider the embodiment of the philosophical question as to how power can be held to account. It is sometimes incorrectly attributed as a direct quotation from Plato's Republic in both popular media and academic contexts.[3] There is no exact parallel in the Republic, but it is used by modern authors to express Socrates' concerns about the guardians, _the solution to which is to properly train their souls_. Several 19th century examples of the association with Plato can be found, often dropping "ipsos".[4][5] John Stuart Mill quotes it thus in Considerations on Representative Government (1861), though without reference to Plato. Plato's Republic though was hardly ever referenced by classical Latin authors like Juvenal, and it has been noted that it simply disappeared from literary awareness for a thousand years except for traces in the writings of Cicero and St. Augustine.[6] In the Republic, a putatively perfect society is described by Socrates, the main character in this Socratic dialogue. Socrates proposed a guardian class to protect that society, and the custodes (watchmen) from the Satires are often interpreted as being parallel to the Platonic guardians (phylakes in Greek). Socrates' answer to the problem is, in essence, that _the guardians will be manipulated to guard themselves against themselves via a deception often called the "noble lie" in English_.[7] As Leonid Hurwicz pointed out in his 2007 lecture on accepting the Nobel Memorial Prize in Economic Sciences, one of Socrates' interlocutors in the Republic, Glaucon, even goes so far as to say "it would be absurd that a guardian should need a guard."[8] But Socrates returns to this point at 590d, where he says that _the best person "has a divine ruler within himself," and that "it is better for everyone to be ruled by divine reason, preferably within himself and his own, otherwise imposed from without."_[9]
wiki  reference  aphorism  quotes  canon  literature  big-peeps  the-classics  philosophy  polisci  politics  government  institutions  leviathan  paradox  egalitarianism-hierarchy  n-factor  trust  organizing  power  questions  cynicism-idealism  gender  nascent-state  religion  theos  noble-lie  intel  privacy  managerial-state  explanans  the-great-west-whale  occident  sinosphere  orient  courage  vitality  vampire-squid  axelrod  cooperate-defect  coordination  ideas  democracy  foreign-lang  mediterranean  poetry  insight  virtu  decentralized  tradeoffs  analytical-holistic  ethical-algorithms  new-religion  the-watchers  interests  hypocrisy  madisonian  hari-seldon  wisdom  noblesse-oblige  illusion  comics  christianity  europe  china  asia  janus  guilt-shame  responsibility  volo-avolo  telos-atelos  parallax  alignment  whole-partial-many 
january 2018 by nhaliday
The idea of empire in the "Aeneid" on JSTOR
http://latindiscussion.com/forum/latin/to-rule-mankind-and-make-the-world-obey.11016/
Let's see...Aeneid, Book VI, ll. 851-853:

tu regere imperio populos, Romane, memento
(hae tibi erunt artes), pacique imponere morem,
parcere subiectis et debellare superbos.'

Which Dryden translated as:
To rule mankind, and make the world obey,
Disposing peace and war by thy own majestic way;
To tame the proud, the fetter'd slave to free:
These are imperial arts, and worthy thee."

If you wanted a literal translation,
"You, Roman, remember to rule people by command
(these were arts to you), and impose the custom to peace,
to spare the subjected and to vanquish the proud."

I don't want to derail your thread but pacique imponere morem -- "to impose the custom to peace"
Does it mean "be the toughest kid on the block," as in Pax Romana?

...

That 17th century one is a loose translation indeed. Myself I'd put it as

"Remember to rule over (all) the (world's) races by means of your sovereignty, oh Roman, (for indeed) you (alone) shall have the means (to do so), and to inculcate the habit of peace, and to have mercy on the enslaved and to destroy the arrogant."

http://classics.mit.edu/Virgil/aeneid.6.vi.html
And thou, great hero, greatest of thy name,
Ordain'd in war to save the sinking state,
And, by delays, to put a stop to fate!
Let others better mold the running mass
Of metals, and inform the breathing brass,
And soften into flesh a marble face;
Plead better at the bar; describe the skies,
And when the stars descend, and when they rise.
But, Rome, 't is thine alone, with awful sway,
To rule mankind, and make the world obey,
Disposing peace and war by thy own majestic way;
To tame the proud, the fetter'd slave to free:
These are imperial arts, and worthy thee."
study  article  letters  essay  pdf  piracy  history  iron-age  mediterranean  the-classics  big-peeps  literature  aphorism  quotes  classic  alien-character  sulla  poetry  conquest-empire  civilization  martial  vitality  peace-violence  order-disorder  domestication  courage  multi  poast  universalism-particularism  world  leviathan  foreign-lang  nascent-state  canon  org:junk  org:edu  tradeoffs  checklists  power  strategy  tactics  paradox  analytical-holistic  hari-seldon  aristos  wisdom  janus  parallax 
january 2018 by nhaliday
Christianity in China | Council on Foreign Relations
projected to outpace CCP membership soon

This fascinating map shows the new religious breakdown in China: http://www.businessinsider.com/new-religious-breakdown-in-china-14

Map Showing the Distribution of Christians in China: http://www.epm.org/resources/2010/Oct/18/map-showing-distribution-christians-china/

Christianity in China: https://en.wikipedia.org/wiki/Christianity_in_China
Accurate data on Chinese Christians is hard to access. According to the most recent internal surveys, there are approximately 31 million Christians in China today (2.3% of the total population).[5] On the other hand, some international Christian organizations estimate there are tens of millions more, who choose not to publicly identify as such.[6] The practice of religion continues to be tightly controlled by government authorities.[7] Chinese over the age of 18 are only permitted to join officially sanctioned Christian groups registered with the government-approved Protestant Three-Self Church and China Christian Council and the Chinese Patriotic Catholic Church.[8]

In Xi we trust - Is China cracking down on Christianity?: http://www.dw.com/en/in-xi-we-trust-is-china-cracking-down-on-christianity/a-42224752A

In China, Unregistered Churches Are Driving a Religious Revolution: https://www.theatlantic.com/international/archive/2017/04/china-unregistered-churches-driving-religious-revolution/521544/

Cracks in the atheist edifice: https://www.economist.com/news/briefing/21629218-rapid-spread-christianity-forcing-official-rethink-religion-cracks

Jesus won’t save you — President Xi Jinping will, Chinese Christians told: https://www.washingtonpost.com/news/worldviews/wp/2017/11/14/jesus-wont-save-you-president-xi-jinping-will-chinese-christians-told/

http://www.sixthtone.com/news/1001611/noodles-for-the-messiah-chinas-creative-christian-hymns

https://www.reuters.com/article/us-pope-china-exclusive/exclusive-china-vatican-deal-on-bishops-ready-for-signing-source-idUSKBN1FL67U
Catholics in China are split between those in “underground” communities that recognize the pope and those belonging to a state-controlled Catholic Patriotic Association where bishops are appointed by the government in collaboration with local Church communities.

http://www.bbc.com/news/world-asia-china-42914029
The underground churches recognise only the Vatican's authority, whereas the Chinese state churches refuse to accept the authority of the Pope.

There are currently about 100 Catholic bishops in China, with some approved by Beijing, some approved by the Vatican and, informally, many now approved by both.

...

Under the agreement, the Vatican would be given a say in the appointment of future bishops in China, a Vatican source told news agency Reuters.

For Beijing, an agreement with the Vatican could allow them more control over the country's underground churches.

Globally, it would also enhance China's prestige - to have the world's rising superpower engaging with one of the world's major religions.

Symbolically, it would be the first sign of rapprochement between China and the Catholic church in more than half a century.

The Vatican is the only European state that maintains formal diplomatic relations with Taiwan. It is currently unclear if an agreement between China and the Vatican would affect this in any way.

What will this mean for the country's Catholics?

There are currently around 10 million Roman Catholics in China.

https://www.washingtonpost.com/world/asia_pacific/china-vatican-deal-on-bishops-reportedly-ready-for-signing/2018/02/01/2adfc6b2-0786-11e8-b48c-b07fea957bd5_story.html

http://www.catholicherald.co.uk/news/2018/02/06/china-is-the-best-implementer-of-catholic-social-doctrine-says-vatican-bishop/
The chancellor of the Pontifical Academy of Social Sciences praised the 'extraordinary' Communist state

“Right now, those who are best implementing the social doctrine of the Church are the Chinese,” a senior Vatican official has said.

Bishop Marcelo Sánchez Sorondo, chancellor of the Pontifical Academy of Social Sciences, praised the Communist state as “extraordinary”, saying: “You do not have shantytowns, you do not have drugs, young people do not take drugs”. Instead, there is a “positive national conscience”.

The bishop told the Spanish-language edition of Vatican Insider that in China “the economy does not dominate politics, as happens in the United States, something Americans themselves would say.”

Bishop Sánchez Sorondo said that China was implementing Pope Francis’s encyclical Laudato Si’ better than many other countries and praised it for defending Paris Climate Accord. “In that, it is assuming a moral leadership that others have abandoned”, he added.

...

As part of the diplomacy efforts, Bishop Sánchez Sorondo visited the country. “What I found was an extraordinary China,” he said. “What people don’t realise is that the central value in China is work, work, work. There’s no other way, fundamentally it is like St Paul said: he who doesn’t work, doesn’t eat.”

China reveals plan to remove ‘foreign influence’ from Catholic Church: http://catholicherald.co.uk/news/2018/06/02/china-reveals-plan-to-remove-foreign-influence-from-catholic-church1/

China, A Fourth Rome?: http://thermidormag.com/china-a-fourth-rome/
As a Chinaman born in the United States, I find myself able to speak to both places and neither. By accidents of fortune, however – or of providence, rather – I have identified more with China even as I have lived my whole life in the West. English is my third language, after Cantonese and Mandarin, even if I use it to express my intellectually most complex thoughts; and though my best of the three in writing, trained by the use of Latin, it is the vehicle of a Chinese soul. So it is in English that for the past year I have memed an idea as unconventional as it is ambitious, unto the Europæans a stumbling-block, and unto the Chinese foolishness: #China4thRome.

This idea I do not attempt to defend rigorously, between various powers’ conflicting claims to carrying on the Roman heritage; neither do I intend to claim that Moscow, which has seen itself as a Third Rome after the original Rome and then Constantinople, is fallen. Instead, I think back to the division of the Roman empire, first under Diocletian’s Tetrarchy and then at the death of Theodosius I, the last ruler of the undivided Roman empire. In the second partition, at the death of Theodosius, Arcadius became emperor of the East, with his capital in Constantinople, and Honorius emperor of the West, with his capital in Milan and then Ravenna. That the Roman empire did not stay uniformly strong under a plurality of emperors is not the point. What is significant about the administrative division of the Roman empire among several emperors is that the idea of Rome can be one even while its administration is diverse.

By divine providence, the Christian religion – and through it, Rome – has spread even through the bourgeois imperialism of the 19th and 20th centuries. Across the world, the civil calendar of common use is that of Rome, reckoned from 1 January; few places has Roman law left wholly untouched. Nevertheless, never have we observed in the world of Roman culture an ethnogenetic pattern like that of the Chinese empire as described by the prologue of Luo Guanzhong’s Romance of the Three Kingdoms 三國演義: ‘The empire, long divided, must unite; long united, must divide. Thus it has ever been.’1 According to classical Chinese cosmology, the phrase rendered the empire is more literally all under heaven 天下, the Chinese œcumene being its ‘all under heaven’ much as a Persian proverb speaks of the old Persian capital of Isfahan: ‘Esfahān nesf-e jahān ast,’ Isfahan is half the world. As sociologist Fei Xiaotong describes it in his 1988 Tanner Lecture ‘Plurality and Unity in the Configuration of the Chinese People’,

...

And this Chinese œcumene has united and divided for centuries, even as those who live in it have recognized a fundamental unity. But Rome, unlike the Chinese empire, has lived on in multiple successor polities, sometimes several at once, without ever coming back together as one empire administered as one. Perhaps something of its character has instead uniquely suited it to being the spirit of a kind of broader world empire. As Dante says in De Monarchia, ‘As the human race, then, has an end, and this end is a means necessary to the universal end of nature, it follows that nature must have the means in view.’ He continues,

If these things are true, there is no doubt but that nature set apart in the world a place and a people for universal sovereignty; otherwise she would be deficient in herself, which is impossible. What was this place, and who this people, moreover, is sufficiently obvious in what has been said above, and in what shall be added further on. They were Rome and her citizens or people. On this subject our Poet [Vergil] has touched very subtly in his sixth book [of the Æneid], where he brings forward Anchises prophesying in these words to Aeneas, father of the Romans: ‘Verily, that others shall beat out the breathing bronze more finely, I grant you; they shall carve the living feature in the marble, plead causes with more eloquence, and trace the movements of the heavens with a rod, and name the rising stars: thine, O Roman, be the care to rule the peoples with authority; be thy arts these, to teach men the way of peace, to show mercy to the subject, and to overcome the proud.’ And the disposition of place he touches upon lightly in the fourth book, when he introduces Jupiter speaking of Aeneas to Mercury in this fashion: ‘Not such a one did his most beautiful mother promise to us, nor for this twice rescue him from Grecian arms; rather was he to be the man to govern Italy teeming with empire and tumultuous with war.’ Proof enough has been given that the Romans were by nature ordained for sovereignty. Therefore the Roman … [more]
org:ngo  trends  foreign-policy  china  asia  hmm  idk  religion  christianity  theos  anomie  meaningness  community  egalitarianism-hierarchy  protestant-catholic  demographics  time-series  government  leadership  nationalism-globalism  org:data  comparison  sinosphere  civic  the-bones  power  great-powers  thucydides  multi  maps  data  visualization  pro-rata  distribution  geography  within-group  wiki  reference  article  news  org:lite  org:biz  islam  buddhism  org:euro  authoritarianism  antidemos  leviathan  regulation  civil-liberty  chart  absolute-relative  org:mag  org:rec  org:anglo  org:foreign  music  culture  gnon  org:popup  🐸  memes(ew)  essay  rhetoric  conquest-empire  flux-stasis  spreading  paradox  analytical-holistic  tradeoffs  solzhenitsyn  spengler  nietzschean  europe  the-great-west-whale  occident  orient  literature  big-peeps  history  medieval  mediterranean  enlightenment-renaissance-restoration-reformation  expansionism  early-modern  society  civilization  world  MENA  capital  capitalism  innovation  race  alien-character  optimat 
january 2018 by nhaliday
Books 2017 | West Hunter
Arabian Sands
The Aryans
The Big Show
The Camel and the Wheel
Civil War on Western Waters
Company Commander
Double-edged Secrets
The Forgotten Soldier
Genes in Conflict
Hive Mind
The horse, the wheel, and language
The Penguin Atlas of Medieval History
Habitable Planets for Man
The genetical theory of natural selection
The Rise of the Greeks
To Lose a Battle
The Jewish War
Tropical Gangsters
The Forgotten Revolution
Egil’s Saga
Shapers
Time Patrol

Russo: https://westhunt.wordpress.com/2017/12/14/books-2017/#comment-98568
west-hunter  scitariat  books  recommendations  list  top-n  confluence  2017  info-foraging  canon  🔬  ideas  s:*  history  mostly-modern  world-war  britain  old-anglo  travel  MENA  frontier  reflection  europe  gallic  war  sapiens  antiquity  archaeology  technology  divergence  the-great-west-whale  transportation  nature  long-short-run  intel  tradecraft  japan  asia  usa  spearhead  garett-jones  hive-mind  economics  broad-econ  giants  fisher  space  iron-age  medieval  the-classics  civilization  judaism  conquest-empire  africa  developing-world  institutions  science  industrial-revolution  the-trenches  wild-ideas  innovation  speedometer  nordic  mediterranean  speculation  fiction  scifi-fantasy  time  encyclopedic  multi  poast  critique  cost-benefit  tradeoffs  quixotic 
december 2017 by nhaliday
What Does a “Normal” Human Genome Look Like? | Science
So, what have our first glimpses of variation in the genomes of generally healthy people taught us? First, balancing selection, the evolutionary process that favors genetic diversification rather than the fixation of a single “best” variant, appears to play a minor role outside the immune system. Local adaptation, which accounts for variation in traits such as pigmentation, dietary specialization, and susceptibility to particular pathogens, is also a second-tier player. What is on the top tier? Increasingly, the answer appears to be mutations that are “deleterious” by biochemical or standard evolutionary criteria. These mutations, as has long been appreciated, overwhelmingly make up the most abundant form of nonneutral variation in all genomes. A model for human genetic individuality is emerging in which there actually is a “wild-type” human genome—one in which most genes exist in an evolutionarily optimized form. There just are no “wild-type” humans: We each fall short of this Platonic ideal in our own distinctive ways.
article  essay  org:nat  🌞  bio  biodet  genetics  genomics  mutation  genetic-load  QTL  evolution  sapiens  survey  summary  coding-theory  enhancement  signal-noise  egalitarianism-hierarchy  selection  tradeoffs  immune  recent-selection  perturbation  nibble  ideas  forms-instances 
november 2017 by nhaliday
Bouncing Off the Bottom | West Hunter
Actually going extinct would seem to be a bad thing, but a close call can, in principle, be a good thing.

Pathogens can be a heavy burden on a species, worse than a 50-lb sack of cement. Lifting that burden can have a big effect: we know that many species flourish madly once they escape their typical parasites. That’s often the case with invasive species. It’s also a major strategy in agriculture: crops often do best in a country far away from their place of origin – where the climate is familiar, but most parasites have been left behind. For example, rubber trees originated in South America, but they’re a lot easier to grow in Liberia or Malaysia.

Consider a situation with a really burdensome pathogen – one that specializes in and depends on a single host species. That pathogen has to find new host individuals every so often in order to survive, and in order for that to happen, the host population has to exceed a certain number, usually called the critical community size. That size depends on the parasite’s persistence and mode of propagation: it can vary over a huge range. CCS is something like a quarter of a million for measles, ~300 for chickenpox, surely smaller than that for Epstein-Barr.

A brush with extinction – say from an asteroid strike – might well take a species below the CCS for a number of its pathogens. If those pathogens were limited to that species, they’d go extinct: no more burden. That alone might be enough to generate a rapid recovery from the population bottleneck. Or a single, highly virulent pathogen might cause a population crash that resulted in the extinction of several of that species’s major pathogens – quite possibly including the virulent pathogen itself. It’s a bottleneck in time, rather than one in space as you often see in colonization.

Such positive effects could last a long time – things need not go back to the old normal. The flea-unbitten species might be able to survive and prosper in ecological niches that it couldn’t before. You might see a range expansion. New evolutionary paths could open up. That brush with extinction could be the making of them.

When you add it all up, you begin to wonder if a population crash isn’t just what the doctor ordered. Sure, it wouldn’t be fun to be one of the billions of casualties, but just think how much better off the billions living after the bottleneck will be. Don’t be selfish.
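The critical-community-size logic above can be sketched with a toy stochastic SIR model. This is only an illustration: the transmission rate, recovery rate, birth rate, and population sizes below are assumed round numbers, not figures from the post.

```python
import math, random

def poisson(rng, lam):
    """Poisson sample; normal approximation for large means."""
    if lam <= 0:
        return 0
    if lam > 30:
        return max(0, round(rng.gauss(lam, math.sqrt(lam))))
    L, k, p = math.exp(-lam), 0, 1.0  # Knuth's algorithm for small means
    while p > L:
        p *= rng.random()
        k += 1
    return k - 1

def pathogen_persists(pop, beta=0.3, gamma=1/7, mu=1/3650,
                      days=3000, seed=1):
    """Toy stochastic SIR with a trickle of new susceptibles (births).
    Returns True if the infection is still circulating after `days`;
    below the critical community size it tends to fade out between
    epidemics, because too few new hosts arrive to bridge the troughs."""
    rng = random.Random(seed)
    i = max(1, pop // 50)          # seed infections
    s = pop - i
    for _ in range(days):
        n = s + i
        new_inf = min(s, poisson(rng, beta * s * i / n))
        recov = min(i, poisson(rng, gamma * i))
        births = poisson(rng, mu * n)   # susceptible replenishment
        s += births - new_inf
        i += new_inf - recov
        if i <= 0:
            return False           # fade-out: pathogen extinct
    return True

print(pathogen_persists(200), pathogen_persists(50000))
```

With these toy parameters the 200-person population usually loses the pathogen between epidemics, while the 50,000-person one is more likely to sustain it — though any single run is stochastic, which is exactly the point of the CCS concept.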
west-hunter  scitariat  ideas  speculation  discussion  parasites-microbiome  spreading  disease  scale  population  density  bio  nature  long-short-run  nihil  equilibrium  death  unintended-consequences  red-queen  tradeoffs  cost-benefit  gedanken 
november 2017 by nhaliday
The Rise and Fall of Cognitive Control - Behavioral Scientist
The results highlight the downsides of controlled processing. Within a population, controlled processing may—rather than ensuring undeterred progress—usher in short-sighted, irrational, and detrimental behavior, ultimately leading to population collapse. This is because the innovations produced by controlled processing benefit everyone, even those who do not act with control. Thus, by making non-controlled agents better off, these innovations erode the initial advantage of controlled behavior. This results in the demise of control and the rise of lack-of-control. In turn, this eventually leads to a return to poor decision making and the breakdown of the welfare-enhancing innovations, possibly accelerated and exacerbated by the presence of the enabling technologies themselves. Our models therefore help to explain societal cycles whereby periods of rationality and forethought are followed by plunges back into irrationality and short-sightedness.

https://static1.squarespace.com/static/51ed234ae4b0867e2385d879/t/595fac998419c208a6d99796/1499442499093/Cyclical-Population-Dynamics.pdf
Psychologists, neuroscientists, and economists often conceptualize decisions as arising from processes that lie along a continuum from automatic (i.e., “hardwired” or overlearned, but relatively inflexible) to controlled (less efficient and effortful, but more flexible). Control is central to human cognition, and plays a key role in our ability to modify the world to suit our needs. Given its advantages, reliance on controlled processing may seem predestined to increase within the population over time. Here, we examine whether this is so by introducing an evolutionary game theoretic model of agents that vary in their use of automatic versus controlled processes, and in which cognitive processing modifies the environment in which the agents interact. We find that, under a wide range of parameters and model assumptions, cycles emerge in which the prevalence of each type of processing in the population oscillates between 2 extremes. Rather than inexorably increasing, the emergence of control often creates conditions that lead to its own demise by allowing automaticity to also flourish, thereby undermining the progress made by the initial emergence of controlled processing. We speculate that this observation may have relevance for understanding similar cycles across human history, and may lend insight into some of the circumstances and challenges currently faced by our species.
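The boom-bust cycle the abstract describes can be caricatured with simple replicator-style dynamics. This is my own minimal sketch, not the authors' model: `x` is the share of controlled agents, `E` a stock of innovations they produce, and the payoffs, `b`, `c`, and `k` are all illustrative assumptions. Controlled agents pay a cost `c` for a benefit `b`; automatic agents free-ride on `E`, so control's advantage erodes as its own innovations accumulate.

```python
def simulate_control_cycle(b=2.0, c=1.0, k=0.2, h=0.1, steps=2000,
                           x0=0.9, e0=0.1):
    """Euler-integrate a toy two-variable system:
        dx/dt = x*(1-x) * (b*(1-E) - c)   # replicator: control pays when E is low
        dE/dt = k * (x - E)               # innovations track the controlled share
    Returns the trajectory of x, which overshoots and oscillates around
    the interior equilibrium x* = E* = (b-c)/b."""
    xs, x, e = [], x0, e0
    for _ in range(steps):
        dx = x * (1 - x) * (b * (1 - e) - c)
        de = k * (x - e)
        x += h * dx
        e += h * de
        xs.append(x)
    return xs

traj = simulate_control_cycle()
```

With the defaults, `x` repeatedly crosses the equilibrium 0.5: control flourishes, its innovations let automaticity catch up, control's share collapses, the innovation stock decays, and control becomes advantageous again — a cartoon of the paper's oscillation, though the real model is a full evolutionary game, not these two ODEs.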
econotariat  economics  political-econ  policy  decision-making  behavioral-econ  psychology  cog-psych  cycles  oscillation  unintended-consequences  anthropology  broad-econ  cultural-dynamics  tradeoffs  cost-benefit  rot  dysgenics  study  summary  multi  EGT  dynamical  volo-avolo  self-control  discipline  the-monster  pdf  error  rationality  info-dynamics  bounded-cognition  hive-mind  iq  intelligence  order-disorder  risk  microfoundations  science-anxiety  big-picture  hari-seldon  cybernetics 
july 2017 by nhaliday
Hubris - Wikipedia
Hubris (/ˈhjuːbrɪs/, also hybris, from ancient Greek ὕβρις) describes a personality quality of extreme or foolish pride or dangerous overconfidence.[1] In its ancient Greek context, it typically describes behavior that defies the norms of behavior or challenges the gods, and which in turn brings about the downfall, or nemesis, of the perpetrator of hubris.

...

In ancient Greek, hubris referred to actions that shamed and humiliated the victim for the pleasure or gratification of the abuser.[3] The term had a strong sexual connotation, and the shame reflected upon the perpetrator as well.[4]

Violations of the law against hubris included what might today be termed assault and battery; sexual crimes; or the theft of public or sacred property. Two well-known cases are found in the speeches of Demosthenes, a prominent statesman and orator in ancient Greece. These two examples occurred when first Midias punched Demosthenes in the face in the theatre (Against Midias), and second when (in Against Conon) a defendant allegedly assaulted a man and crowed over the victim. Yet another example of hubris appears in Aeschines' Against Timarchus, where the defendant, Timarchus, is accused of breaking the law of hubris by submitting himself to prostitution and anal intercourse. Aeschines brought this suit against Timarchus to bar him from the rights of political office and his case succeeded.[5]

In ancient Athens, hubris was defined as the use of violence to shame the victim (this sense of hubris could also characterize rape[6]). Aristotle defined hubris as shaming the victim, not because of anything that happened to the committer or might happen to the committer, but merely for that committer's own gratification:

to cause shame to the victim, not in order that anything may happen to you, nor because anything has happened to you, but merely for your own gratification. Hubris is not the requital of past injuries; this is revenge. As for the pleasure in hubris, its cause is this: naive men think that by ill-treating others they make their own superiority the greater.[7][8][9]

Crucial to this definition are the ancient Greek concepts of honour (τιμή, timē) and shame (αἰδώς, aidōs). The concept of honour included not only the exaltation of the one receiving honour, but also the shaming of the one overcome by the act of hubris. This concept of honour is akin to a zero-sum game. Rush Rehm simplifies this definition of hubris to the contemporary concept of "insolence, contempt, and excessive violence".[citation needed]

...

In its modern usage, hubris denotes overconfident pride combined with arrogance.[10] Hubris is often associated with a lack of humility. Sometimes a person's hubris is also associated with ignorance. The accusation of hubris often implies that suffering or punishment will follow, similar to the occasional pairing of hubris and nemesis in Greek mythology. The proverb "pride goeth (goes) before destruction, a haughty spirit before a fall" (from the biblical Book of Proverbs, 16:18) is thought to sum up the modern use of hubris. Hubris is also referred to as "pride that blinds" because it often causes a committer of hubris to act in foolish ways that belie common sense.[11] In other words, the modern definition may be thought of as, "that pride that goes just before the fall."

Examples of hubris are often found in literature, most famously in John Milton's Paradise Lost, where Lucifer attempts to force the other angels to worship him, but is cast into hell by God and the innocent angels, and proclaims: "Better to reign in hell than serve in heaven." Victor in Mary Shelley's Frankenstein manifests hubris in his attempt to become a great scientist by creating life through technological means, but eventually regrets this previous desire. Marlowe's play Doctor Faustus portrays the eponymous character as a scholar whose arrogance and pride compel him to sign a deal with the Devil, and retain his haughtiness until his death and damnation, despite the fact that he could easily have repented had he chosen to do so.

One notable example is the Battle of Little Big Horn, as General George Armstrong Custer was apocryphally reputed to have said there: "Where did all those damned Indians come from?"[12]
virtu  humility  things  history  iron-age  mediterranean  the-classics  big-peeps  old-anglo  aristos  wiki  reference  stories  literature  morality  values  alien-character  honor  foreign-lang  language  emotion  courage  wisdom  egalitarianism-hierarchy  eden-heaven  analytical-holistic  tradeoffs  paradox  religion  theos  zero-positive-sum  social-norms  reinforcement  guilt-shame  good-evil  confidence  benevolence  lexical 
june 2017 by nhaliday
Genomic analysis of family data reveals additional genetic effects on intelligence and personality | bioRxiv
methodology:
Using Extended Genealogy to Estimate Components of Heritability for 23 Quantitative and Dichotomous Traits: http://journals.plos.org/plosgenetics/article?id=10.1371/journal.pgen.1003520
Pedigree- and SNP-Associated Genetics and Recent Environment are the Major Contributors to Anthropometric and Cardiometabolic Trait Variation: http://journals.plos.org/plosgenetics/article?id=10.1371/journal.pgen.1005804

Missing Heritability – found?: https://westhunt.wordpress.com/2017/02/09/missing-heritability-found/
There is an interesting new paper out on genetics and IQ. The claim is that they have found the missing heritability – in rare variants, generally different in each family.

Some of the variants, the ones we find with GWAS, are fairly common and fitness-neutral: the variant that slightly increases IQ confers the same fitness (or very close to the same) as the one that slightly decreases IQ – presumably because of other effects it has. If this weren’t the case, it would be impossible for both of the variants to remain common.

The rare variants that affect IQ will generally decrease IQ – and since pleiotropy is the norm, usually they’ll be deleterious in other ways as well. Genetic load.

Happy families are all alike; every unhappy family is unhappy in its own way.: https://westhunt.wordpress.com/2017/06/06/happy-families-are-all-alike-every-unhappy-family-is-unhappy-in-its-own-way/
It now looks as if the majority of the genetic variance in IQ is the product of mutational load, and the same may be true for many psychological traits. To the extent this is the case, a lot of human psychological variation must be non-adaptive. Maybe some personality variation fulfills an evolutionary function, but a lot does not. Being a dumb asshole may be a bug, rather than a feature. More generally, this kind of analysis could show us whether particular low-fitness syndromes, like autism, were ever strategies – I suspect not.

It’s bad news for medicine and psychiatry, though. It would suggest that what we call a given type of mental illness, like schizophrenia, is really a grab-bag of many different syndromes. The ultimate causes are extremely varied: at best, there may be shared intermediate causal factors. Not good news for drug development: individualized medicine is a threat, not a promise.

see also comment at: https://pinboard.in/u:nhaliday/b:a6ab4034b0d0

https://www.reddit.com/r/slatestarcodex/comments/5sldfa/genomic_analysis_of_family_data_reveals/
So the big implication here is that it's better than I had dared hope - like Yang/Visscher/Hsu have argued, the old GCTA estimate of ~0.3 is indeed a rather loose lower bound on additive genetic variants, and the rest of the missing heritability is just the relatively uncommon additive variants (ie <1% frequency), and so, like Yang demonstrated with height, using much more comprehensive imputation of SNP scores or using whole-genomes will be able to explain almost all of the genetic contribution. In other words, with better imputation panels, we can go back and squeeze out better polygenic scores from old GWASes, new GWASes will be able to reach and break the 0.3 upper bound, and eventually we can feasibly predict 0.5-0.8. Between the expanding sample sizes from biobanks, the still-falling price of whole genomes, the gradual development of better regression methods (informative priors, biological annotation information, networks, genetic correlations), and better imputation, the future of GWAS polygenic scores is bright. Which obviously will be extremely helpful for embryo selection/genome synthesis.

The argument that this supports mutation-selection balance is weaker but plausible. I hope that it's true, because if that's why there is so much genetic variation in intelligence, then that strongly encourages genetic engineering - there is no good reason or Chesterton fence for intelligence variants being non-fixed, it's just that evolution is too slow to purge the constantly-accumulating bad variants. And we can do better.
https://rubenarslan.github.io/generation_scotland_pedigree_gcta/
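The mutation-selection balance invoked above can be illustrated with the textbook deterministic recursion for a partially dominant deleterious allele (fitnesses 1, 1−hs, 1−s, recurrent mutation at rate μ). The parameter values below are illustrative assumptions, not estimates from the paper.

```python
def msb_equilibrium(mu, h, s, gens=20000, q0=0.0):
    """Iterate one generation of selection then mutation under random
    mating; q converges to roughly q* ~ mu/(h*s) while q stays small."""
    q = q0
    for _ in range(gens):
        p = 1.0 - q
        w_bar = 1.0 - 2.0*p*q*h*s - q*q*s                    # mean fitness
        q_sel = (p*q*(1.0 - h*s) + q*q*(1.0 - s)) / w_bar    # after selection
        q = q_sel + (1.0 - q_sel) * mu                       # A -> a mutation
    return q

q_hat = msb_equilibrium(mu=1e-5, h=0.5, s=0.01)
# q_hat ~= mu/(h*s) = 2e-3: each such variant stays rare, yet thousands of
# loci of small effect can jointly contribute a large share of additive
# variance -- rare enough to be missed by common-SNP tagging, which is the
# "missing heritability" story gwern is summarizing.
```

This is the standard population-genetics result, not anything specific to the IQ study; it just makes concrete why selection never fixes the "good" allele everywhere — mutation keeps resupplying deleterious variants faster than weak selection can purge them.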

The surprising implications of familial association in disease risk: https://arxiv.org/abs/1707.00014
https://spottedtoad.wordpress.com/2017/06/09/personalized-medicine-wont-work-but-race-based-medicine-probably-will/
As Greg Cochran has pointed out, this probably isn’t going to work. There are a few genes like BRCA1 (which makes you more likely to get breast and ovarian cancer) that we can detect and might affect treatment, but an awful lot of disease turns out to be just the result of random chance and deleterious mutation. This means that you can’t easily tailor disease treatment to people’s genes, because everybody is fucked up in their own special way. If Johnny is schizophrenic because of 100 random errors in the genes that code for his neurons, and Jack is schizophrenic because of 100 other random errors, there’s very little way to test a drug to work for either of them – they’re the only one in the world, most likely, with that specific pattern of errors. This is, presumably, why the incidence of schizophrenia and autism rises in populations when dads get older – more random errors in sperm formation mean more random errors in the baby’s genes, and more things that go wrong down the line.

The looming crisis in human genetics: http://www.economist.com/node/14742737
Some awkward news ahead
- Geoffrey Miller

Human geneticists have reached a private crisis of conscience, and it will become public knowledge in 2010. The crisis has depressing health implications and alarming political ones. In a nutshell: the new genetics will reveal much less than hoped about how to cure disease, and much more than feared about human evolution and inequality, including genetic differences between classes, ethnicities and races.

2009!
study  preprint  bio  biodet  behavioral-gen  GWAS  missing-heritability  QTL  🌞  scaling-up  replication  iq  education  spearhead  sib-study  multi  west-hunter  scitariat  genetic-load  mutation  medicine  meta:medicine  stylized-facts  ratty  unaffiliated  commentary  rhetoric  wonkish  genetics  genomics  race  pop-structure  poast  population-genetics  psychiatry  aphorism  homo-hetero  generalization  scale  state-of-art  ssc  reddit  social  summary  gwern  methodology  personality  britain  anglo  enhancement  roots  s:*  2017  data  visualization  database  let-me-see  bioinformatics  news  org:rec  org:anglo  org:biz  track-record  prediction  identity-politics  pop-diff  recent-selection  westminster  inequality  egalitarianism-hierarchy  high-dimension  applications  dimensionality  ideas  no-go  volo-avolo  magnitude  variance-components  GCTA  tradeoffs  counter-revolution  org:mat  dysgenics  paternal-age  distribution  chart  abortion-contraception-embryo 
june 2017 by nhaliday
Is Pharma Research Worse Than Chance? | Slate Star Codex
Here’s one hypothesis: at the highest level, the brain doesn’t have that many variables to affect, or all the variables are connected. If you smack the brain really really hard in some direction or other, you will probably treat some psychiatric disease. Drugs of abuse are ones that smack the brain really hard in some direction or other. They do something. So find the psychiatric illness that’s treated by smacking the brain in that direction, and you’re good.

Actual carefully-researched psychiatric drugs are exquisitely selected for having few side effects. The goal is something like an SSRI – mild stomach discomfort, some problems having sex, but overall you can be on them forever and barely notice their existence. In the grand scheme of things their side effects are tiny – in most placebo-controlled studies, people have a really hard time telling whether they’re in the experimental or the placebo group.

...

But given that we’re all very excited to learn about ketamine and MDMA, and given that if their original promise survives further testing we will consider them great discoveries, it suggests we chose the wrong part of the tradeoff curve. Or at least it suggests a different way of framing that tradeoff curve. A drug that makes you feel extreme side effects for a few hours – but also has very strong and lasting treatment effects – is better than a drug with few side effects and weaker treatment effects. That suggests a new direction pharmaceutical companies might take: look for the chemicals that have the strongest and wackiest effects on the human mind. Then see if any of them also treat some disease.

I think this is impossible with current incentives. There’s too little risk-tolerance at every stage in the system. But if everyone rallied around the idea, it might be that trying the top hundred craziest things Alexander Shulgin dreamed up on whatever your rat model is would be orders of magnitude more productive than whatever people are doing now.
ratty  yvain  ssc  reflection  psychiatry  medicine  pharma  drugs  error  efficiency  random  meta:medicine  flexibility  outcome-risk  incentives  stagnation  innovation  low-hanging  tradeoffs  realness  perturbation  degrees-of-freedom  volo-avolo  null-result 
june 2017 by nhaliday
Living with Inequality - Reason.com
That's why I propose the creation of the Tenth Commandment Club. The tenth commandment—"You shall not covet"—is a foundation of social peace. The Nobel Laureate economist Vernon Smith noted the tenth commandment along with the eighth (you shall not steal) in his Nobel toast, saying that they "provide the property right foundations for markets, and warned that petty distributional jealousy must not be allowed to destroy" those foundations. If academics, pundits, and columnists would avowedly reject covetousness, would openly reject comparisons between the average (extremely fortunate) American and the average billionaire, would mock people who claimed that frugal billionaires are a systematic threat to modern life, then soon our time could be spent discussing policy issues that really matter.

Enlightenment -> social justice: https://twitter.com/GarettJones/status/866448789825105920
US reconquista: https://twitter.com/AngloRemnant/status/865980569397731329
https://archive.is/SR8OI
envy and psychology textbooks: https://twitter.com/tcjfs/status/887115182257917952

various Twitter threads: https://twitter.com/search?q=GarettJones+inequality

http://www.npr.org/sections/goatsandsoda/2017/09/13/542261863/cash-aid-changed-this-family-s-life-so-why-is-their-government-skeptical

Civilization means saying no to the poor: https://bonald.wordpress.com/2017/11/18/civilization-means-saying-no-to-the-poor/
Although I instinctively dislike him, I do agree with Professor Scott on one point: “exploitation” really is the essence of civilization, whether by exploitation one simply means authority as described by those insensible to its moral force or more simply the refusal of elites to divulge their resources to the poor.

In fact, no human creation of lasting worth could ever be made without a willingness to tell the poor to *** off. If we really listened to the demands of social justice, if we really let compassion be our guide, we could have no art, no music, no science, no religion, no philosophy, no architecture beyond the crudest shelters. The poor are before us, their need perpetually urgent. It is inexcusable for us ever to build a sculpture, a cathedral, a particle accelerator. And the poor, we have it on two good authorities (the other being common sense), will be with us always. What we give for their needs today will have disappeared tomorrow, and they will be hungry again. Imagine if some Savonarola had come to Florence a century or two earlier and convinced the Florentine elite to open their hearts and their wallets to the poor in preference for worldly vanities. All that wealth would have been squandered on the poor and would have disappeared without a trace. Instead, we got the Renaissance.

https://twitter.com/tcjfs/status/904169207293730816
https://archive.is/tYZAi
Reward the lawless; punish the law abiding. Complete inversion which will eventually drive us back to the 3rd world darkness whence we came.

https://twitter.com/tcjfs/status/917492530308112384
https://archive.is/AeXEs
This idea that a group is only honorable in virtue of their victimization is such a pernicious one.
for efficiency, just have "Victims of WASPs Day." A kind of All Victims' Day. Otherwise U.S. calendar will be nothing but days of grievance.
Bonald had a good bit on this (of course).
https://bonald.wordpress.com/2016/08/05/catholics-must-resist-cosmopolitan-universalism/
Steve King is supposedly stupid for claiming that Western Civilization is second to none. One might have supposed that Catholics would take some pride as Catholics in Western civilization, a thing that was in no small part our creation. Instead, the only history American Catholics are to remember is being poor and poorly regarded recent immigrants in America.

https://twitter.com/AngloRemnant/status/917612415243706368
https://archive.is/NDjwK
Don't even bother with the rat race if you value big family. I won the race, & would've been better off as a dentist in Peoria.
.. College prof in Athens, OH. Anesthesiologist in Knoxville. State govt bureaucrat in Helena.
.. This is the formula: Middle America + regulatory capture white-collar job. anyone attempting real work in 2017 america is a RETARD.
.. Also unclear is why anyone in the US would get married. knock your girl up and put that litter on Welfare.
You: keep 50% of your earnings after taxes. 25% is eaten by cost of living. save the last 25%, hope our bankrupt gov doesn't expropriate l8r
The main difference in this country between welfare and 7-figure income is the quality of your kitchen cabinets.

wtf: https://www.bls.gov/ooh/healthcare/dentists.htm
- Median Pay: $159,770 per year ($76.81 per hour)
- Job Outlook: 18% (much faster than average)

http://study.com/how_long_does_it_take_to_be_a_dentist.html
Admission into dental school is highly competitive. Along with undergraduate performance, students are evaluated for their Dental Admissions Test (DAT) scores. Students have the opportunity to take this test before graduating college. After gaining admission into dental school, students can go on to complete four years of full-time study to earn the Doctor of Dental Surgery or Doctor of Dental Medicine. Students typically spend the first two years learning general and dental science in classroom and laboratory settings. They may take courses like oral anatomy, histology and pathology. In the final years, dental students participate in clinical practicums, gaining supervised, hands-on experience in dental clinics.

https://twitter.com/AngloRemnant/status/985935089250062337
https://archive.is/yIXfk
https://archive.is/Qscq7
https://archive.is/IQQhU
Career ideas for the minimally ambitious dissident who wants to coast, shitpost, & live well:
- econ phd -> business school prof
- dentistry
- 2 years of banking/consulting -> F500 corp dev or strategy
- gov't bureaucrat in a state capital
--
Bad career ideas, for contrast:
- law
- humanities prof
- IT
- anything 'creative'

[ed.: Personally, I'd also throw in 'actuary' (though keep in mind ~20% risk of automation).]

https://twitter.com/DividualsTweet/status/1143214978142527488
https://archive.is/yzgVA
Best life advice: try getting a boring, not very high status but decently paying job. Like programming payroll software. SJWs are uninterested.
news  org:mag  rhetoric  contrarianism  econotariat  garett-jones  economics  growth-econ  piketty  inequality  winner-take-all  morality  values  critique  capital  capitalism  class  envy  property-rights  justice  religion  christianity  theos  aphorism  egalitarianism-hierarchy  randy-ayndy  aristos  farmers-and-foragers  redistribution  right-wing  peace-violence  🎩  multi  twitter  social  discussion  reflection  ideology  democracy  civil-liberty  welfare-state  history  early-modern  mostly-modern  politics  polisci  government  enlightenment-renaissance-restoration-reformation  counter-revolution  unaffiliated  gnon  modernity  commentary  psychology  cog-psych  social-psych  academia  westminster  social-science  biases  bootstraps  search  left-wing  discrimination  order-disorder  civilization  current-events  race  identity-politics  incentives  law  leviathan  social-norms  rot  fertility  strategy  planning  hmm  long-term  career  s-factor  regulation  managerial-state  dental  supply-demand  progression  org:gov 
june 2017 by nhaliday
Columbia | West Hunter
I remember this all pretty well: I’d still welcome the chance to strangle the key NASA players. I remember how they forbade lower-level people at NASA to talk to the Air Force and ask for recon assets – how they peddled ass-covering bullshit about how nothing could possibly have been done. A lie.

One of the dogs that didn’t bark was the fact that NASA acted as if relevant DOD assets did not exist. For example, if you could have put a package into a matching low orbit with those consumables in shortest supply, say CO2 absorbers and/or cheeseburgers, there would have been considerably more time available to assemble a rescue mission. For some forgotten reason the Air Force has hundreds of missiles (Minuteman-IIIs) that can be launched on a moment’s notice – it wouldn’t be that hard to replace a warhead with a consumables package. A moment’s thought tells you that some such capability is likely to exist – one intended to rapidly replace destroyed recon sats, for example. Certainly worth considering, worth checking, before giving up on the crew. Just as the Air Force has recon assets that could have been most helpful in diagnosing the state of the ship – but NASA would rather die than expose itself to Air Force cooties. Not that the Air Force doesn’t have cooties, but NASA has quite a few of its own already.

If we ever had a real reason for manned space travel – I can imagine some – the first thing you’d need to do is kill everyone in the NASA manned space program. JPL you could keep.

usefulness of LEO:
https://westhunt.wordpress.com/2016/02/01/columbia/#comment-75883
https://westhunt.wordpress.com/2016/02/01/columbia/#comment-75891

hmm:
Book Review: Whitey On the Moon: http://www.henrydampier.com/2015/02/book-review-whitey-moon/

https://twitter.com/AngloRemnant/status/960997033053171712
https://archive.is/DTyGN
Homicidal stat of the day: The US spends more in 1 year of providing Medicaid to hispanics than the entire inflation-adjusted cost of the Apollo program.
west-hunter  scitariat  speculation  rant  stories  error  management  space  the-trenches  usa  government  ideas  discussion  multi  poast  dirty-hands  the-world-is-just-atoms  cost-benefit  track-record  gnon  right-wing  books  history  mostly-modern  cold-war  rot  institutions  race  africa  identity-politics  diversity  ability-competence  twitter  social  data  analysis  backup  🐸  monetary-fiscal  money  scale  counter-revolution  nascent-state  attaq  healthcare  redistribution  welfare-state  civilization  gibbon  vampire-squid  egalitarianism-hierarchy  tradeoffs  virtu 
may 2017 by nhaliday
Interview: Mostly Sealing Wax | West Hunter
https://soundcloud.com/user-519115521/greg-cochran-part-2
https://medium.com/@houstoneuler/annotating-part-2-of-the-greg-cochran-interview-with-james-miller-678ba33f74fc

- conformity and Google, defense and spying (China knows prob almost all our "secrets")
- in the past you could just find new things faster than people could reverse-engineer. part of the problem is that innovation is slowing down today (part of the reason for convergence by China/developing world).
- introgression from archaics of various kinds
- mutational load and IQ, wrath of khan neanderthal
- trade and antiquity (not that useful besides ideas tbh), Roman empire, disease, smallpox
- spices needed to be grown elsewhere, but besides that...
- analogy: caste system in India (why no Brahmin car repairmen?), slavery in Greco-Roman times, more water mills in medieval times (rivers better in north, but still could have done it), new elite not liking getting hands dirty, low status of engineers, rise of finance
- crookery in finance, hedge fund edge might be substantially insider trading
- long-term wisdom of moving all manufacturing to China...?
- economic myopia: British financialization before WW1 vis-a-vis Germany. North vs. South and cotton/industry, camels in Middle East vs. wagons in Europe
- Western medicine easier to convert to science than Eastern, pseudoscience and wrong theories better than bag of recipes
- Greeks definitely knew some things that were lost (eg, line in Pliny makes reference to combinatorics calculation rediscovered by German dude much later. think he's referring to Catalan numbers?), Lucio Russo book
- Indo-Europeans, Western Europe, Amerindians, India, British Isles, gender, disease, and conquest
- no farming (Dark Age), then why were people still farming on Shetland Islands north of Scotland?
- "symbolic" walls, bodies with arrows
- family stuff, children learning, talking dog, memory and aging
- Chinese/Japanese writing difficulty and children learning to read
- Hatfield-McCoy feud: the McCoy family was actually a case study in a neurological journal. they had anger management issues because of cancers of their adrenal gland (!!).

the Chinese know...: https://macropolo.org/casting-off-real-beijings-cryptic-warnings-finance-taking-economy/
Over the last couple of years, a cryptic idiom has crept into the way China’s top leaders talk about risks in the country’s financial system: tuo shi xiang xu (脱实向虚), which loosely translates as “casting off the real for the empty.” Premier Li Keqiang warned against it at his press conference at the end of the 2016 National People’s Congress (NPC). At this year’s NPC, Li inserted this very expression into his annual work report. And in April, while on an inspection tour of Guangxi, President Xi Jinping used the term, saying that China must “unceasingly promote industrial modernization, raise the level of manufacturing, and not allow the real to be cast off for the empty.”

Such an odd turn of phrase is easy to overlook, but it belies concerns about a significant shift in the way that China’s economy works. What Xi and Li were warning against is typically called financialization in developed economies. It’s when “real” companies—industrial firms, manufacturers, utility companies, property developers, and anyone else that produces a tangible product or service—take their money and, rather than put it back into their businesses, invest it in “empty”, or speculative, assets. It occurs when the returns on financial investments outstrip those in the real economy, leading to a disproportionate amount of money being routed into the financial system.

https://twitter.com/gcochran99/status/1160589827651203073
https://archive.is/Yzjyv
Bad day for Lehman Bros.
--
Good day for everyone else, then.
west-hunter  interview  audio  podcast  econotariat  cracker-econ  westminster  culture-war  polarization  tech  sv  google  info-dynamics  business  multi  military  security  scitariat  intel  error  government  defense  critique  rant  race  clown-world  patho-altruism  history  mostly-modern  cold-war  russia  technology  innovation  stagnation  being-right  archaics  gene-flow  sapiens  genetics  the-trenches  thinking  sequential  similarity  genomics  bioinformatics  explanation  europe  asia  china  migration  evolution  recent-selection  immune  atmosphere  latin-america  ideas  sky  developing-world  embodied  africa  MENA  genetic-load  unintended-consequences  iq  enhancement  aDNA  gedanken  mutation  QTL  missing-heritability  tradeoffs  behavioral-gen  biodet  iron-age  mediterranean  the-classics  trade  gibbon  disease  parasites-microbiome  demographics  population  urban  transportation  efficiency  cost-benefit  india  agriculture  impact  status  class  elite  vampire-squid  analogy  finance  higher-ed  trends  rot  zeitgeist  🔬  hsu  stories  aphorism  crooked  realne 
may 2017 by nhaliday
None So Blind | West Hunter
There have been several articles in the literature claiming that the gene frequency of the 35delG allele of connexin-26, the most common allele causing deafness in Europeans, has doubled in the past 200 years, as a result of relaxed selection and assortative mating over that period.

That’s fucking ridiculous. I see people referencing this in journal articles and books. It’s mentioned in OMIM. But it’s pure nonsense.

https://westhunt.wordpress.com/2013/03/05/none-so-blind/#comment-10483
The only way you’re going to see such a high frequency of an effectively lethal recessive in a continental population is if it conferred a reproductive advantage in heterozygotes. The required advantage must have been as large as its gene frequency, something around 1-2%.

So it’s like sickle-cell.

Now, if you decreased the bad reproductive consequences of deafness, what would you expect to happen? Gradual increase, at around 1 or 2% a generation, if the carrier advantage held – but it probably didn’t. It was probably a defense against some infectious disease, and those have become much less important. If there was no longer any carrier advantage, the frequency wouldn’t change at all.

In order to double in 200 years, you would need a carrier advantage > 9%.

Assortative mating, deaf people marrying other deaf people, would not make much difference. Even if deaf people substantially out-reproduced normals, which they don’t, only ~1-2% of the copies of 35delG reside in deaf people.
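[ed.: The ">9% carrier advantage" figure above checks out as a quick back-of-the-envelope. A minimal sketch, assuming ~25-year generations and treating frequency growth of a rare allele as roughly a factor of (1 + s) per generation under heterozygote advantage s (those assumptions are mine, not spelled out in the comment):]

```python
# A rare allele with heterozygote advantage s grows in frequency by roughly
# a factor of (1 + s) per generation. Doubling over 200 years (~8 generations
# at ~25 years each) therefore requires (1 + s)^8 >= 2.
generations = 200 / 25                   # ~8 generations in 200 years
s_required = 2 ** (1 / generations) - 1  # solve (1 + s)^g = 2 for s
print(f"required carrier advantage: {s_required:.1%}")  # → 9.1%
```

[ed.: The same arithmetic gives the earlier figures too: the equilibrium logic implies s on the order of the ~1-2% gene frequency, and a held advantage of that size yields the "1 or 2% a generation" drift upward, far short of doubling in 8 generations.]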
west-hunter  scitariat  rant  critique  thinking  realness  being-right  clarity  evolution  genetics  population-genetics  recent-selection  null-result  sapiens  tradeoffs  disease  parasites-microbiome  embodied  multi  poast  ideas  grokkability-clarity 
may 2017 by nhaliday
Unenumerated: Why the industrial revolution?
unaffiliated  szabo  history  early-modern  spearhead  gregory-clark  roots  industrial-revolution  divergence  culture  society  anthropology  age-of-discovery  developing-world  protestant-catholic  embedded-cognition  commentary  chart  multi  institutions  agriculture  europe  asia  the-great-west-whale  britain  anglosphere  values  china  hanson  gwern  econotariat  marginal-rev  debate  economics  growth-econ  speculation  sinosphere  oceans  capital  capitalism  🎩  transportation  law  iron-age  mediterranean  the-classics  broad-econ  pseudoE  cultural-dynamics  galor-like  tradeoffs  parenting  developmental  life-history  malthus  zeitgeist  wealth-of-nations  enlightenment-renaissance-restoration-reformation  property-rights  conquest-empire  modernity  political-econ  microfoundations  ideas  network-structure  meta:reading  writing 
may 2017 by nhaliday
The Roman State and Genetic Pacification - Peter Frost, 2010
- Table 1 is a good summary, but various interesting tidbits throughout
main points:
- latrones reminds me of bandit-states, Big Men in anthropology, and Rome's Indo-European past
- started having trouble recruiting soldiers, population less martial
- Church opposition to State violence, preferred to 'convert enemies by prayer'
- a Christian could use violence 'only to defend others and not for self-defense'
- Altar of Victory was more metaphorical than idolatrous, makes its removal even more egregious

http://evoandproud.blogspot.com/2010/07/roman-state-and-genetic-pacification.html

should read:
BANDITS IN THE ROMAN EMPIRE: http://sci-hub.tw/http://academic.oup.com/past/article-abstract/105/1/3/1442375/BANDITS-IN-THE-ROMAN-EMPIRE
Bandits in the Roman Empire: Myth and reality: https://historicalunderbelly.files.wordpress.com/2012/12/thoma-grunewald-bandits-in-the-roman-empire-myth-and-reality-2004.pdf

What Difference Did Christianity Make?: http://sci-hub.tw/https://www.jstor.org/stable/4435970
Author(s): Ramsay MacMullen

The extent of this impact I test in five areas. The first two have to do with domestic relations: sexual norms and slavery. The latter three have to do with matters in which public authorities were more involved: gladiatorial shows, judicial penalties, and corruption.

Clark/Frost Domestication: https://westhunt.wordpress.com/2013/05/14/clarkfrost-domestication/
Thinking about the response of the pacified and submissive Roman population to barbarian invaders immediately brings to mind the response of contemporary North Americans and Atlantic Europeans to barbarian invaders. It reads just the same: “welcome new neighbor!”

What about the Eastern empire? They kept the barbarians out for a few centuries longer in the European half, but accounts of the loss of the Asian provinces show the Clark/Frost pattern, a pacified submissive population hardly contesting the invasion of Islam (Jenkins 2008, 2010). The new neighbors simply walked in and took over. The downfall of the Western Roman empire reads much like the downfall of the Asian and North African parts of the empire. It is certainly no accident that the Asian provinces were the heartland of Christianity.

This all brings up an interesting question: what happened in East Asia over the same period? No one to my knowledge has traced parallels with the European and Roman experience in Japan or China. Is the different East Asian trajectory related to the East Asian reluctance to roll over, wag their tails, and welcome new barbarian neighbors?

gwern in da comments
“empires domesticate their people”
Greg said in our book something like “for the same reason that farmers castrate their bulls”
study  evopsych  sociology  biodet  sapiens  recent-selection  history  iron-age  mediterranean  the-classics  gibbon  religion  christianity  war  order-disorder  nihil  leviathan  domestication  gnon  lived-experience  roots  speculation  theos  madisonian  cultural-dynamics  behavioral-gen  zeitgeist  great-powers  peace-violence  us-them  hate  conquest-empire  multi  broad-econ  piracy  pdf  microfoundations  alien-character  prejudice  rot  variance-components  spearhead  gregory-clark  west-hunter  scitariat  north-weingast-like  government  institutions  foreign-lang  language  property-rights  books  gavisti  pop-diff  martial  prudence  self-interest  patho-altruism  anthropology  honor  unintended-consequences  biophysical-econ  gene-flow  status  migration  demographics  population  scale  emotion  self-control  environment  universalism-particularism  homo-hetero  egalitarianism-hierarchy  justice  morality  philosophy  courage  agri-mindset  ideas  explanans  feudal  tradeoffs  sex  sexuality  social-norms  corruption  crooked 
may 2017 by nhaliday
Readings: The Gods of the Copybook Headings
When the Cambrian measures were forming, They promised perpetual peace.
They swore, if we gave them our weapons, that the wars of the tribes would cease.
But when we disarmed They sold us and delivered us bound to our foe,
And the Gods of the Copybook Headings said: "Stick to the Devil you know."

On the first Feminian Sandstones we were promised the Fuller Life
(Which started by loving our neighbour and ended by loving his wife)
Till our women had no more children and the men lost reason and faith,
And the Gods of the Copybook Headings said: "The Wages of Sin is Death."

In the Carboniferous Epoch we were promised abundance for all,
By robbing selected Peter to pay for collective Paul;
But, though we had plenty of money, there was nothing our money could buy,
And the Gods of the Copybook Headings said: "If you don't work you die."

Then the Gods of the Market tumbled, and their smooth-tongued wizards withdrew
And the hearts of the meanest were humbled and began to believe it was true
That All is not Gold that Glitters, and Two and Two make Four —
And the Gods of the Copybook Headings limped up to explain it once more.

. . . . . . . . . . . . . . . . . .

As it will be in the future, it was at the birth of Man —
There are only four things certain since Social Progress began: —
That the Dog returns to his Vomit and the Sow returns to her Mire,
And the burnt Fool's bandaged finger goes wabbling back to the Fire;

And that after this is accomplished, and the brave new world begins
When all men are paid for existing and no man must pay for his sins,
As surely as Water will wet us, as surely as Fire will burn,
The Gods of the Copybook Headings with terror and slaughter return!
gnon  isteveish  commentary  big-peeps  literature  poetry  values  virtu  britain  anglosphere  optimate  aristos  org:junk  prudence  paleocon  old-anglo  albion  hate  darwinian  tradition  pre-ww2  prejudice  morality  gender  sex  sexuality  fertility  demographic-transition  rot  aphorism  communism  labor  egalitarianism-hierarchy  no-go  volo-avolo  war  peace-violence  tribalism  universalism-particularism  us-them  life-history  capitalism  redistribution  flux-stasis  reason  pessimism  markets  unintended-consequences  religion  christianity  theos  nascent-state  envy  civil-liberty  sanctity-degradation  yarvin  degrees-of-freedom  civilization  paying-rent  realness  truth  westminster  duty  responsibility  cynicism-idealism  tradeoffs  s:**  new-religion  deep-materialism  2018  the-basilisk  order-disorder  eden-heaven  janus  utopia-dystopia  love-hate  afterlife  judgement 
april 2017 by nhaliday
matrix-factorization  maxim-gun  meaningness  measure  measurement  media  medicine  medieval  mediterranean  memes(ew)  memory-management  MENA  meta-analysis  meta:math  meta:medicine  meta:prediction  meta:reading  meta:research  meta:rhetoric  meta:science  meta:war  metabolic  metabuch  metal-to-virtual  metameta  methodology  metrics  micro  microbiz  microfoundations  microsoft  migrant-crisis  migration  military  minimalism  minimum-viable  miri-cfar  missing-heritability  mobile  mobility  model-organism  models  modernity  mokyr-allen-mccloskey  moloch  moments  monetary-fiscal  money  mooc  morality  mostly-modern  move-fast-(and-break-things)  msr  multi  multiplicative  music  musk  mutation  mystic  myth  n-factor  narrative  nascent-state  nationalism-globalism  natural-experiment  nature  near-far  neocons  network-structure  networking  neuro  neuro-nitgrit  neurons  new-religion  news  nibble  nietzschean  nihil  nitty-gritty  nl-and-so-can-you  nlp  no-go  noble-lie  noblesse-oblige  noise-structure  nonlinearity  nordic  north-weingast-like  northeast  novelty  nuclear  null-result  number  numerics  nutrition  nyc  obama  objektbuch  ocaml-sml  occam  occident  oceans  offense-defense  old-anglo  oly  oop  open-closed  open-problems  operational  optimate  optimism  optimization  order-disorder  orders  org:anglo  org:biz  org:bleg  org:com  org:data  org:econlib  org:edu  org:euro  org:foreign  org:gov  org:health  org:junk  org:lite  org:local  org:mag  org:mat  org:med  org:nat  org:ngo  org:popup  org:rec  org:sci  organization  organizing  orient  orwellian  os  oscillation  oss  osx  other-xtian  outcome-risk  outliers  overflow  p:whenever  paganism  paleocon  papers  parable  paradox  parallax  parasites-microbiome  parenting  pareto  parsimony  paternal-age  path-dependence  patho-altruism  patience  paul-romer  paulg  paying-rent  PCP  pdf  peace-violence  pennsylvania  people  performance  personal-finance  personality  perturbation 
 pessimism  phalanges  pharma  phd  philosophy  phys-energy  physics  pic  piketty  pinker  piracy  planning  plots  pls  plt  poast  podcast  poetry  polanyi-marx  polarization  policy  polis  polisci  political-econ  politics  poll  polynomials  pop-diff  pop-structure  population  population-genetics  populism  postmortem  postrat  power  power-law  practice  pragmatic  pre-2013  pre-ww2  prediction  preference-falsification  prejudice  prepping  preprint  presentation  primitivism  princeton  prioritizing  privacy  pro-rata  probability  problem-solving  product-management  productivity  prof  programming  progression  project  proof-systems  proofs  propaganda  properties  property-rights  protestant-catholic  protocol-metadata  prudence  pseudoE  psych-architecture  psychiatry  psycho-atoms  psychology  psychometrics  public-goodish  public-health  putnam-like  python  q-n-a  qra  QTL  quality  quantitative-qualitative  quantum  questions  quixotic  quiz  quora  quotes  r-lang  race  rand-approx  random  randy-ayndy  ranking  rant  rat-pack  rationality  ratty  realness  realpolitik  reason  recent-selection  recommendations  recruiting  red-queen  reddit  redistribution  reduction  reference  reflection  regional-scatter-plots  regression-to-mean  regularizer  regulation  reinforcement  religion  rent-seeking  replication  repo  reputation  research  research-program  resources-effects  responsibility  retention  review  revolution  rhetoric  rhythm  right-wing  rigidity  rigor  risk  ritual  robotics  robust  rock  roots  rot  rsc  running  russia  rust  s-factor  s:*  s:**  s:***  saas  safety  sales  sampling-bias  sanctity-degradation  sapiens  scala  scale  scaling-tech  scaling-up  scholar  scholar-pack  sci-comp  science  science-anxiety  scifi-fantasy  scitariat  SDP  search  securities  security  selection  self-control  self-interest  sequential  serene  sex  sexuality  shakespeare  shannon  shift  shipping  short-circuit  sib-study  SIGGRAPH  
signal-noise  signaling  similarity  simler  sinosphere  skeleton  skunkworks  sky  slides  slippery-slope  smoothness  social  social-capital  social-choice  social-norms  social-psych  social-science  social-structure  sociality  society  sociology  socs-and-mops  soft-question  software  solzhenitsyn  space  span-cover  spatial  speaking  spearhead  speculation  speed  speedometer  spengler  spock  sports  spreading  ssc  stackex  stagnation  stamina  stanford  startups  state  state-of-art  statesmen  static-dynamic  stats  status  stereotypes  stochastic-processes  stock-flow  stoic