nhaliday + ioannidis   20

No, science’s reproducibility problem is not limited to psychology - The Washington Post
But now then: Are psychology experiments more likely than, say, chemistry experiments or physics experiments to have issues with reproducibility? Ioannidis told me yes, probably so.

“I think on average physics and chemistry would do better. I don’t know how much better,” he said.

Maybe someone should try to quantify the differences between the physical sciences and the social sciences. Perhaps physics and chemistry will run their own version of the reproducibility study?
news  org:rec  ioannidis  replication  science  meta:science  social-science  psychology  social-psych 
september 2017 by nhaliday
Meta-assessment of bias in science
Science is said to be suffering a reproducibility crisis caused by many biases. How common are these problems, across the wide diversity of research fields? We probed for multiple bias-related patterns in a large random sample of meta-analyses taken from all disciplines. The magnitude of these biases varied widely across fields and was on average relatively small. However, we consistently observed that small, early, highly cited studies published in peer-reviewed journals were likely to overestimate effects. We found little evidence that these biases were related to scientific productivity, and we found no difference between biases in male and female researchers. However, a scientist’s early-career status, isolation, and lack of scientific integrity might be significant risk factors for producing unreliable results.
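
The small-study pattern the authors probe for can be checked mechanically. Below is a minimal sketch of one standard diagnostic, an Egger-style regression of standardized effects on precision; this is a generic illustration, not the paper's actual method, and the study data are invented.

```python
# A generic Egger-style small-study check: regress standardized effect
# (z = effect / SE) on precision (1 / SE). A clearly nonzero intercept
# signals funnel-plot asymmetry, i.e. small imprecise studies reporting
# systematically larger effects. All study data below are invented.
import numpy as np
import statsmodels.api as sm

effects = np.array([0.82, 0.61, 0.45, 0.38, 0.30, 0.24, 0.21])  # hypothetical effect sizes
ses     = np.array([0.40, 0.28, 0.20, 0.15, 0.11, 0.08, 0.06])  # their standard errors

z = effects / ses          # standardized effects
precision = 1.0 / ses      # inverse standard error

fit = sm.OLS(z, sm.add_constant(precision)).fit()
print(f"Egger intercept = {fit.params[0]:.2f} (p = {fit.pvalues[0]:.2g})")
# Here the intercept comes out strongly positive: the hallmark of the
# "small, early studies overestimate effects" pattern the paper reports.
```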
study  academia  science  meta:science  metabuch  stylized-facts  ioannidis  replication  error  incentives  integrity  trends  social-science  meta-analysis  🔬  hypothesis-testing  effect-size  usa  biases  org:nat  info-dynamics 
march 2017 by nhaliday
Information Processing: Is science self-correcting?
A toy model of the dynamics of scientific research (with probability distributions for the accuracy of experimental results, mechanisms for updating of beliefs by individual scientists, crowd behavior, bounded cognition, etc.) can easily exhibit parameter regions where progress is limited; one can even find equilibria in which most beliefs held by individual scientists are false! Obviously the complexity of the systems under study and the quality of human capital in a particular field are important determinants of the rate of progress and its character.
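
The post gives no code, so here is a minimal sketch of one such toy dynamic, entirely my own construction: the community updates a shared belief about a claim that is actually false, but only published results feed the update, and null results are rarely published. All parameters are assumptions chosen for illustration.

```python
# Toy model (my own construction, in the spirit of the post): a community
# updates a shared belief about a claim that is in fact FALSE, but only
# published experiments feed the update. Positive results always get
# published; null results rarely do. Naive Bayesian updating on this
# censored literature can converge on the false claim.
import math
import random

random.seed(1)
POWER, ALPHA = 0.80, 0.05   # P(positive | true), P(positive | false)
P_PUBLISH_NEG = 0.05        # chance a null result is published at all

log_odds = 0.0              # community belief, starting at 50/50
for _ in range(500):
    positive = random.random() < ALPHA              # the claim is false
    if positive or random.random() < P_PUBLISH_NEG:
        # agents update as if the literature were uncensored (bounded cognition)
        lr = POWER / ALPHA if positive else (1 - POWER) / (1 - ALPHA)
        log_odds += math.log(lr)

belief = 1.0 / (1.0 + math.exp(-log_odds))
print(f"community P(false claim is true) = {belief:.3f}")
# With these parameters the expected drift per experiment is positive, so
# belief in the false claim ratchets toward certainty; push P_PUBLISH_NEG
# toward 1 and the drift flips negative -- a region where the field does
# self-correct. Which regime a field sits in is an empirical question.
```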
hsu  scitariat  ioannidis  science  meta:science  error  commentary  physics  limits  oscillation  models  equilibrium  bounded-cognition  complex-systems  being-right  info-dynamics  the-trenches  truth 
january 2017 by nhaliday
The Mass Production of Redundant, Misleading, and Conflicted Systematic Reviews and Meta-analyses - IOANNIDIS - 2016 - The Milbank Quarterly - Wiley Online Library
Currently, _probably more systematic reviews of trials than new randomized trials are published annually_. Most topics addressed by meta-analyses of randomized trials have overlapping, redundant meta-analyses; the number of same-topic meta-analyses sometimes exceeds 20. Some fields produce massive numbers of meta-analyses; for example, 185 meta-analyses of antidepressants for depression were published between 2007 and 2014. These meta-analyses are often produced either by industry employees or by authors with industry ties, and their results are aligned with sponsor interests. _China has rapidly become the most prolific producer of English-language, PubMed-indexed meta-analyses_. The most massive presence of Chinese meta-analyses is in genetic associations (63% of global production in 2014), where almost all results are misleading since they combine fragmented information from the mostly abandoned era of candidate genes. Furthermore, many contracting companies working on evidence synthesis receive industry contracts to produce meta-analyses, many of which probably remain unpublished. Many other meta-analyses have serious flaws. Of the remainder, most have weak or insufficient evidence to inform decision making. Few systematic reviews and meta-analyses are both non-misleading and useful.
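
For context on what these reviews actually compute: a fixed-effect meta-analysis is just inverse-variance weighting, which is why pooling many fragmented, similarly biased estimates (as in the candidate-gene literature) yields a precise-looking but still misleading summary. A toy illustration with invented numbers:

```python
# Inverse-variance (fixed-effect) pooling, the arithmetic at the core of a
# meta-analysis. If every input shares the same bias, e.g. fragmented
# candidate-gene results, pooling shrinks the standard error but not the
# bias: the summary just looks more confidently wrong. Numbers invented.
import numpy as np

est = np.array([0.35, 0.28, 0.41, 0.33, 0.30])  # hypothetical per-study estimates
se  = np.array([0.15, 0.18, 0.20, 0.12, 0.16])  # their standard errors

w = 1.0 / se**2                                 # inverse-variance weights
pooled = np.sum(w * est) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
print(f"pooled estimate = {pooled:.3f} +/- {pooled_se:.3f}")
# Five mediocre studies yield a tight-looking interval whether or not the
# shared effect is real: precision is not validity.
```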
study  ioannidis  science  medicine  replication  methodology  meta:science  critique  evidence-based  meta-analysis  china  asia  genetics  anomie  cochrane  candidate-gene  info-dynamics  sinosphere 
january 2017 by nhaliday
WHAT'S TO KNOW ABOUT THE CREDIBILITY OF EMPIRICAL ECONOMICS? - Ioannidis - 2013 - Journal of Economic Surveys - Wiley Online Library
Abstract. The scientific credibility of economics is itself a scientific question that can be addressed with both theoretical speculations and empirical data. In this review, we examine the major parameters that are expected to affect the credibility of empirical economics: sample size, magnitude of pursued effects, number and pre-selection of tested relationships, flexibility and lack of standardization in designs, definitions, outcomes and analyses, financial and other interests and prejudices, and the multiplicity and fragmentation of efforts. We summarize and discuss the empirical evidence on the lack of a robust reproducibility culture in economics and business research, the prevalence of potential publication and other selective reporting biases, and other failures and biases in the market of scientific information. Overall, the credibility of the economics literature is likely to be modest or even low.

The Power of Bias in Economics Research: http://onlinelibrary.wiley.com/doi/10.1111/ecoj.12461/full
We investigate two critical dimensions of the credibility of empirical economics research: statistical power and bias. We survey 159 empirical economics literatures that draw upon 64,076 estimates of economic parameters reported in more than 6,700 empirical studies. Half of the research areas have nearly 90% of their results under-powered. The median statistical power is 18%, or less. A simple weighted average of those reported results that are adequately powered (power ≥ 80%) reveals that nearly 80% of the reported effects in these empirical economics literatures are exaggerated; typically, by a factor of two and with one-third inflated by a factor of four or more.
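
The mechanism behind the exaggeration is the winner's curse: when power is low and only significant estimates get reported, the survivors must overshoot the truth. A quick simulation makes the arithmetic concrete; the effect size and standard error are my own choices, picked to land near the paper's 18% median power.

```python
# Winner's-curse simulation: at low power, estimates that clear the
# significance bar must be large, so the published record exaggerates the
# true effect. The effect size and SE here are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
true_effect, se = 0.20, 0.19    # chosen to give power near the 18% median
z_crit = 1.96

draws = rng.normal(true_effect, se, 100_000)      # repeated study estimates
significant = draws[np.abs(draws / se) > z_crit]  # what gets reported
power = len(significant) / len(draws)
exaggeration = np.abs(significant).mean() / true_effect
print(f"power = {power:.2f}, mean reported / true effect = {exaggeration:.1f}x")
# At ~18% power the significant estimates average more than twice the true
# effect, in line with the factor-of-two-plus inflation the survey finds.
```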

Economics isn't a bogus science — we just don't use it correctly: http://www.latimes.com/opinion/op-ed/la-oe-ioannidis-economics-is-a-science-20171114-story.html
https://archive.is/AU7Xm
study  ioannidis  social-science  meta:science  economics  methodology  critique  replication  bounded-cognition  error  stat-power  🎩  🔬  info-dynamics  piracy  empirical  biases  econometrics  effect-size  network-structure  realness  paying-rent  incentives  academia  multi  evidence-based  news  org:rec  rhetoric  contrarianism  backup  cycles  finance  huge-data-the-biggest  org:local 
january 2017 by nhaliday
Information Processing: What is medicine’s 5 sigma?
I'm not aware of this history you reference, but I am only a recent entrant into this field. On the other hand, Ioannidis is both a long-time genomics researcher and someone who does meta-research on science, so he should know. He may even have written a paper on this subject -- I seem to recall he had hard numbers on the replication rate of candidate-gene studies and claimed it was in the low percents. BTW, this result shows that the vaunted intuition of biomedical types about "how things really work" in the human body is worth very little. We are much better off, in my opinion, relying on machine learning methods and brute-force statistical power than on priors based on, e.g., knowledge of biochemical pathways or cartoon models of cell function. (Even though such things are sometimes deemed sufficient to raise ~$100m in biotech investment!) This situation may change in the future, but the record from the first decade of the 21st century is there for any serious scholar of the scientific method to study.

Both Ioannidis and I (through separate and independent analyses) feel that modern genomics is a good example of biomedical science that (now) actually works and produces results that replicate with relatively high confidence. It should be a model for other areas ...
hsu  replication  science  medicine  scitariat  meta:science  evidence-based  ioannidis  video  interview  bio  genomics  lens  methodology  thick-thin  candidate-gene  hypothesis-testing  complex-systems  stat-power  bounded-cognition  postmortem  info-dynamics  stats 
november 2016 by nhaliday

