nhaliday + stat-mech   47

Equilibrium thermodynamics - Wikipedia
Equilibrium thermodynamics is the systematic study of transformations of matter and energy in systems in terms of a concept called thermodynamic equilibrium. The word equilibrium implies a state of balance. Equilibrium thermodynamics, in origin, derives from analysis of the Carnot cycle. Here, typically a system, such as a cylinder of gas, initially in its own state of internal thermodynamic equilibrium, is set out of balance via heat input from a combustion reaction. Then, through a series of steps, as the system settles into its final equilibrium state, work is extracted.

In an equilibrium state the potentials, or driving forces, within the system are in exact balance. A central aim in equilibrium thermodynamics is: given a system in a well-defined initial state of thermodynamic equilibrium, subject to accurately specified constraints, to calculate what the state of the system will be once it has reached a new equilibrium after the constraints are changed by an externally imposed intervention. An equilibrium state is mathematically ascertained by seeking the extrema of a thermodynamic potential function, whose nature depends on the constraints imposed on the system. For example, a chemical reaction at constant temperature and pressure will reach equilibrium at a minimum of its components' Gibbs free energy, whereas an isolated system equilibrates at a maximum of its entropy.
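A minimal Python sketch of the extremum principle, for an ideal A ⇌ B isomerization at constant T and P (the standard free energy ΔG° = −5 kJ/mol is an assumed illustrative value, not from the article): the equilibrium composition is found by scanning for the minimum of the molar Gibbs free energy, and it matches the analytic condition dG/dx = 0.

```python
import math

R = 8.314      # gas constant, J/(mol K)
T = 298.15     # temperature, K
dG0 = -5000.0  # assumed standard free energy of A -> B, J/mol

def gibbs(x):
    """Molar Gibbs free energy of an ideal A/B mixture, x = mole fraction of B."""
    return x * dG0 + R * T * (x * math.log(x) + (1 - x) * math.log(1 - x))

# locate the minimum by a fine grid scan over the open interval (0, 1)
xs = [i / 100000 for i in range(1, 100000)]
x_eq = min(xs, key=gibbs)

# analytic check: dG/dx = 0 gives x/(1-x) = exp(-dG0/(R T))
K = math.exp(-dG0 / (R * T))
print(x_eq, K / (1 + K))  # both close to 0.88
```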

Equilibrium thermodynamics differs from non-equilibrium thermodynamics in that, with the latter, the state of the system under investigation will typically not be uniform but will vary locally in quantities such as energy, entropy, and temperature, as gradients are imposed by dissipative thermodynamic fluxes. In equilibrium thermodynamics, by contrast, the state of the system is considered uniform throughout, defined macroscopically by such quantities as temperature, pressure, or volume. Systems are studied in terms of change from one equilibrium state to another; such a change is called a thermodynamic process.
nibble  wiki  reference  concept  conceptual-vocab  explanation  definition  physics  stat-mech  thermo  equilibrium  summary
november 2017 by nhaliday
Drude model - Wikipedia
The Drude model of electrical conduction was proposed in 1900[1][2] by Paul Drude to explain the transport properties of electrons in materials (especially metals). The model, which is an application of kinetic theory, assumes that the microscopic behavior of electrons in a solid may be treated classically and looks much like _a pinball machine_, with a sea of constantly jittering electrons bouncing and re-bouncing off heavier, relatively immobile positive ions.

The two most significant results of the Drude model are an electronic equation of motion,

d<p(t)>/dt = q(E + 1/m <p(t)> x B) - <p(t)>/τ

and a linear relationship between current density J and electric field E,

J = (nq^2τ/m) E

the latter is Ohm's law, with DC conductivity σ = nq^2τ/m
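Plugging illustrative numbers into σ = nq²τ/m for copper (the relaxation time τ ≈ 2.5e-14 s is an assumed order-of-magnitude value) recovers a conductivity close to the measured one, a minimal Python sketch:

```python
# Drude DC conductivity sigma = n q^2 tau / m, copper-like parameters
n   = 8.5e28     # carrier density, m^-3 (copper, ~one conduction electron per atom)
q   = 1.602e-19  # electron charge, C
m   = 9.109e-31  # electron mass, kg
tau = 2.5e-14    # assumed mean free time between collisions, s

sigma = n * q**2 * tau / m   # conductivity, S/m
rho = 1 / sigma              # resistivity, ohm m
print(sigma, rho)            # ~6e7 S/m, near copper's measured ~5.96e7 S/m
```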
nibble  physics  electromag  models  local-global  stat-mech  identity  atoms  wiki  reference  ground-up  cartoons
september 2017 by nhaliday
Flows With Friction
To see how the no-slip condition arises, and how the no-slip condition and the fluid viscosity lead to frictional stresses, we can examine the conditions at a solid surface on a molecular scale. When a fluid is stationary, its molecules are in a constant state of motion with a random velocity v. For a gas, v is comparable to the speed of sound. When a fluid is in motion, there is superimposed on this random velocity a mean velocity V, sometimes called the bulk velocity, which is the velocity at which fluid travels from one place to another. At the interface between the fluid and the surface, there exists an attraction between the molecules or atoms that make up the fluid and those that make up the solid. This attractive force is strong enough to reduce the bulk velocity of the fluid to zero. So the bulk velocity of the fluid must change from whatever its value is far away from the wall to a value of zero at the wall (figure 7). This is called the no-slip condition.
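The frictional stress the no-slip condition sets up can be made concrete with a minimal sketch, assuming the simplest case of a linear (Couette-like) velocity profile between wall and free stream; the water viscosity, gap, and speed below are illustrative assumptions:

```python
# Newton's law of viscosity: tau = mu * dV/dy; for a linear profile dV/dy = U / h
mu = 1.0e-3   # dynamic viscosity of water at ~20 C, Pa s
U  = 1.0      # bulk velocity far from the wall, m/s
h  = 1.0e-3   # distance over which the bulk velocity falls to zero at the wall, m

dVdy = U / h              # velocity gradient at the wall, 1/s
tau_wall = mu * dVdy      # frictional (shear) stress on the wall, Pa
print(tau_wall)           # 1.0 Pa
```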

http://www.engineeringarchives.com/les_fm_noslip.html
The fluid property responsible for the no-slip condition and the development of the boundary layer is viscosity.
https://www.quora.com/What-is-the-physics-behind-no-slip-condition-in-fluid-mechanics
https://www.researchgate.net/post/Can_someone_explain_what_exactly_no_slip_condition_or_slip_condition_means_in_terms_of_momentum_transfer_of_the_molecules
https://en.wikipedia.org/wiki/Boundary_layer_thickness
http://www.fkm.utm.my/~ummi/SME1313/Chapter%201.pdf
org:junk  org:edu  physics  mechanics  h2o  identity  atoms  constraint-satisfaction  volo-avolo  flux-stasis  chemistry  stat-mech  nibble  multi  q-n-a  reddit  social  discussion  dirty-hands  pdf  slides  lectures  qra  fluid  local-global  explanation
september 2017 by nhaliday
All models are wrong - Wikipedia
Box repeated the aphorism in a paper that was published in the proceedings of a 1978 statistics workshop.[2] The paper contains a section entitled "All models are wrong but some are useful". The section is copied below.

Now it would be very remarkable if any system existing in the real world could be exactly represented by any simple model. However, cunningly chosen parsimonious models often do provide remarkably useful approximations. For example, the law PV = RT relating pressure P, volume V and temperature T of an "ideal" gas via a constant R is not exactly true for any real gas, but it frequently provides a useful approximation and furthermore its structure is informative since it springs from a physical view of the behavior of gas molecules.

For such a model there is no need to ask the question "Is the model true?". If "truth" is to be the "whole truth" the answer must be "No". The only question of interest is "Is the model illuminating and useful?".
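Box's PV = RT example can be quantified with a minimal Python sketch: comparing the ideal-gas pressure against the van der Waals equation for a mole of CO2 in a litre at 300 K (the van der Waals constants are standard literature values used here illustratively), the ideal law is off by roughly 10%, wrong but still a serviceable approximation.

```python
# ideal gas vs. van der Waals pressure for 1 mol of CO2 in 1 L at 300 K
R = 8.314     # gas constant, J/(mol K)
T = 300.0     # temperature, K
V = 1.0e-3    # volume, m^3 (1 litre)
n = 1.0       # amount, mol
a = 0.3640    # CO2 attraction constant, Pa m^6 / mol^2
b = 4.267e-5  # CO2 excluded volume, m^3 / mol

p_ideal = n * R * T / V
p_vdw = n * R * T / (V - n * b) - a * n**2 / V**2
print(p_ideal, p_vdw, 1 - p_vdw / p_ideal)  # ~2.49e6 Pa, ~2.24e6 Pa, ~10% off
```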
thinking  metabuch  metameta  map-territory  models  accuracy  wire-guided  truth  philosophy  stats  data-science  methodology  lens  wiki  reference  complex-systems  occam  parsimony  science  nibble  hi-order-bits  info-dynamics  the-trenches  meta:science  physics  fluid  thermo  stat-mech  applicability-prereqs  theory-practice  elegance  simplification-normalization
august 2017 by nhaliday
Edge.org: 2017 : WHAT SCIENTIFIC TERM OR CONCEPT OUGHT TO BE MORE WIDELY KNOWN?
highlights:
- the genetic book of the dead [Dawkins]
- complementarity [Frank Wilczek]
- relative information
- effective theory [Lisa Randall]
- affordances [Dennett]
- spontaneous symmetry breaking
- relatedly, equipoise [Nicholas Christakis]
- case-based reasoning
- population reasoning (eg, common law)
- criticality [Cesar Hidalgo]
- Haldane's law of the right size (!SCALE!)
- polygenic scores
- non-ergodic
- ansatz
- state [Aaronson]: http://www.scottaaronson.com/blog/?p=3075
- transfer learning
- effect size
- satisficing
- scaling
- the breeder's equation [Greg Cochran]
- impedance matching

soft:
- reciprocal altruism
- life history [Plomin]
- intellectual honesty [Sam Harris]
- coalitional instinct (interesting claim: building coalitions around "rationality" actually makes it more difficult to update on new evidence as it makes you look like a bad person, eg, the Cathedral)

more: https://www.edge.org/conversation/john_tooby-coalitional-instincts

interesting timing. how woke is this dude?
org:edge  2017  technology  discussion  trends  list  expert  science  top-n  frontier  multi  big-picture  links  the-world-is-just-atoms  metameta  🔬  scitariat  conceptual-vocab  coalitions  q-n-a  psychology  social-psych  anthropology  instinct  coordination  duty  power  status  info-dynamics  cultural-dynamics  being-right  realness  cooperate-defect  westminster  chart  zeitgeist  rot  roots  epistemic  rationality  meta:science  analogy  physics  electromag  geoengineering  environment  atmosphere  climate-change  waves  information-theory  bits  marginal  quantum  metabuch  homo-hetero  thinking  sapiens  genetics  genomics  evolution  bio  GT-101  low-hanging  minimum-viable  dennett  philosophy  cog-psych  neurons  symmetry  humility  life-history  social-structure  GWAS  behavioral-gen  biodet  missing-heritability  ergodic  machine-learning  generalization  west-hunter  population-genetics  methodology  blowhards  spearhead  group-level  scale  magnitude  business  scaling-tech  tech  business-models  optimization  effect-size  aaronson  state  bare-hands  problem-solving  politics
may 2017 by nhaliday
Mean field theory - Wikipedia
In physics and probability theory, mean field theory (MFT also known as self-consistent field theory) studies the behavior of large and complex stochastic models by studying a simpler model. Such models consider a large number of small individual components which interact with each other. The effect of all the other individuals on any given individual is approximated by a single averaged effect, thus reducing a many-body problem to a one-body problem.
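The classic instance is the mean-field Ising ferromagnet, where each spin feels only the average magnetization of its neighbors; a minimal sketch solving the resulting self-consistency equation m = tanh(m/t), with t = T/T_c in units of the mean-field critical temperature, by fixed-point iteration:

```python
import math

def mean_field_magnetization(t, iters=500):
    """Fixed-point iteration of the self-consistency equation m = tanh(m / t)."""
    m = 1.0  # start fully magnetized
    for _ in range(iters):
        m = math.tanh(m / t)
    return m

print(mean_field_magnetization(0.5))  # ~0.96: spontaneously ordered below T_c
print(mean_field_magnetization(1.5))  # ~0: disordered above T_c
```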
concept  atoms  models  physics  stat-mech  ising  approximation  parsimony  wiki  reference  nibble
march 2017 by nhaliday
Evolution of sexual asymmetry | BMC Evolutionary Biology | Full Text
Background
The clear dominance of two-gender sex in recent species is a notorious puzzle of evolutionary theory. It has at least two layers: besides the most fundamental and challenging question why sex exists at all, the other part of the problem is equally perplexing but much less studied. Why do most sexual organisms use a binary mating system? Even if sex confers an evolutionary advantage (through whatever genetic mechanism), why does it manifest that advantage in two, and exactly two, genders (or mating types)? Why not just one, and why not more than two?

Results
Assuming that sex carries an inherent fitness advantage over pure clonal multiplication, we attempt to give a feasible solution to the problem of the evolution of dimorphic sexual asymmetry as opposed to monomorphic symmetry by using a spatial (cellular automaton) model and its non-spatial (mean-field) approximation. Based on a comparison of the spatial model to the mean-field approximation we suggest that spatial population structure must have played a significant role in the evolution of mating types, due to the largely clonal (self-aggregated) spatial distribution of gamete types, which is plausible in aquatic habitats for physical reasons, and appears to facilitate the evolution of a binary mating system.

Conclusions
Under broad ecological and genetic conditions the cellular automaton predicts selective removal from the population of supposedly primitive gametes that are able to mate with their own type, whereas the non-spatial model admits coexistence of the primitive type and the mating types. Thus we offer a basically ecological solution to a theoretical problem that earlier models based on random gamete encounters had failed to resolve.

Having sex, yes, but with whom? Inferences from fungi on the evolution of anisogamy and mating types: http://onlinelibrary.wiley.com/doi/10.1111/j.1469-185X.2010.00153.x/full
study  bio  evolution  sex  gender  roots  eden  EGT  dynamical  GT-101  🌞  symmetry  oceans  models  stat-mech  deep-materialism  speculation  gender-diff  explanans  multi  model-organism  empirical  ecology  automata-languages
march 2017 by nhaliday
Orthogonal — Greg Egan
In Yalda’s universe, light has no universal speed and its creation generates energy.

On Yalda’s world, plants make food by emitting their own light into the dark night sky.
greg-egan  fiction  gedanken  physics  electromag  differential  geometry  thermo  space  cool  curiosity  reading  exposition  init  stat-mech  waves  relativity  positivity  unit  wild-ideas  speed  gravity  big-picture  🔬  xenobio  ideas  scifi-fantasy  signum
february 2017 by nhaliday
On epistasis: why it is unimportant in polygenic directional selection: http://rstb.royalsocietypublishing.org/content/365/1544/1241.short
- James F. Crow

The Evolution of Multilocus Systems Under Weak Selection: http://www.genetics.org/content/genetics/134/2/627.full.pdf
- Thomas Nagylaki

Data and Theory Point to Mainly Additive Genetic Variance for Complex Traits: http://journals.plos.org/plosgenetics/article?id=10.1371/journal.pgen.1000008
The relative proportion of additive and non-additive variation for complex traits is important in evolutionary biology, medicine, and agriculture. We address a long-standing controversy and paradox about the contribution of non-additive genetic variation, namely that knowledge about biological pathways and gene networks imply that epistasis is important. Yet empirical data across a range of traits and species imply that most genetic variance is additive. We evaluate the evidence from empirical studies of genetic variance components and find that additive variance typically accounts for over half, and often close to 100%, of the total genetic variance. We present new theoretical results, based upon the distribution of allele frequencies under neutral and other population genetic models, that show why this is the case even if there are non-additive effects at the level of gene action. We conclude that interactions at the level of genes are not likely to generate much interaction at the level of variance.
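The single-locus case already shows the effect: even with complete dominance, a thoroughly non-additive mode of gene action, the standard Falconer variance decomposition puts most of the genetic variance in the additive component, especially at the skewed allele frequencies neutral models predict. A minimal sketch with assumed illustrative effect sizes (a = d = 1):

```python
# Falconer's single-locus decomposition, genotypic values -a, d, +a:
#   V_A = 2 p q [a + d (q - p)]^2   (additive variance)
#   V_D = (2 p q d)^2               (dominance variance)
# complete dominance means d = a
def additive_fraction(p, a=1.0, d=1.0):
    q = 1 - p
    va = 2 * p * q * (a + d * (q - p)) ** 2
    vd = (2 * p * q * d) ** 2
    return va / (va + vd)

for p in (0.1, 0.3, 0.5):
    print(p, additive_fraction(p))
# additive share ~0.95 at p=0.1, falling to ~0.67 at p=0.5:
# "over half, and often close to 100%" even under full dominance
```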
hsu  scitariat  commentary  links  study  list  evolution  population-genetics  genetics  methodology  linearity  nonlinearity  comparison  scaling-up  nibble  lens  bounded-cognition  ideas  bio  occam  parsimony  🌞  summary  quotes  multi  org:nat  QTL  stylized-facts  article  explanans  sapiens  biodet  selection  variance-components  metabuch  thinking  models  data  deep-materialism  chart  behavioral-gen  evidence-based  empirical  mutation  spearhead  model-organism  bioinformatics  linear-models  math  magnitude  limits  physics  interdisciplinary  stat-mech
february 2017 by nhaliday
Edge.org: Q-Bio, the most interesting recent [scientific] news
Applied mathematicians and theoretical physicists are rushing to develop new sophisticated tools that can capture the other, non-genomic challenges posed in trying to quantify biology. One of these challenges is that the number of individuals in a community may be large, but not as large as there are molecules of gas in your lungs, for example. So the traditional tools of physics based on statistical modeling have to be upgraded to deal with the large fluctuations encountered, such as in the number of proteins in a cell or individuals in an ecosystem. Another fundamental challenge is that living systems need an energy source.

They are inherently out of thermodynamic equilibrium, and so cannot be described by the century-old tools of statistical thermodynamics developed by Einstein, Boltzmann and Gibbs. Stanislaw Ulam, a mathematician who helped originate the basic principle behind the hydrogen bomb, once quipped, “Ask not what physics can do for biology. Ask what biology can do for physics.” Today, the answer is clear: biology is forcing physicists to develop new experimental and theoretical tools to explore living cells in action.
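The point about fluctuations is square-root-of-N scaling: for Poisson-like number statistics the relative spread is 1/sqrt(N), negligible for a lungful of gas but very visible at cellular copy numbers. A minimal sketch with illustrative counts:

```python
import math

# relative fluctuation sqrt(Var)/mean = 1/sqrt(N) for Poisson number statistics
for label, N in [("gas molecules in lungs", 1e22),
                 ("proteins in a cell", 1e4),
                 ("low-copy proteins", 1e2)]:
    print(label, 1 / math.sqrt(N))
# ~1e-11 vs ~1e-2 vs ~1e-1: cell-scale counts fluctuate by percents,
# not parts per trillion, so the usual large-N statistical tools need upgrading
```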
bio  trends  science  interdisciplinary  physics  thermo  org:edge  giants  einstein  boltzmann  stat-mech  equilibrium  complex-systems  cybernetics
november 2016 by nhaliday