stochastic-processes   90

[1802.09679] A guide to Brownian motion and related stochastic processes
This is a guide to the mathematical theory of Brownian motion and related stochastic processes, with indications of how this theory is related to other branches of mathematics, most notably the classical theory of partial differential equations associated with the Laplace and heat operators, and various generalizations thereof. As a typical reader, we have in mind a student, familiar with the basic concepts of probability based on measure theory, at the level of the graduate texts of Billingsley and Durrett, and who wants a broader perspective on the theory of Brownian motion and related stochastic processes than can be found in these texts.
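
A minimal illustration of the Brownian-motion/heat-operator connection the guide covers (my own sketch, not from the paper; the function f and the parameter values are made up): the transition density of Brownian motion is the heat kernel, so u(t, x) = E[f(x + W_t)] solves the heat equation ∂u/∂t = (1/2) ∂²u/∂x² with u(0, ·) = f. A quick Monte Carlo check against the closed form for a Gaussian initial condition:

import numpy as np

rng = np.random.default_rng(0)

# u(t, x) = E[f(x + W_t)] with W_t ~ N(0, t) solves du/dt = (1/2) d^2u/dx^2, u(0, .) = f.
def u_mc(t, x, f, n_paths=200_000):
    w_t = rng.normal(0.0, np.sqrt(t), size=n_paths)   # Brownian motion sampled at time t
    return f(x + w_t).mean()

# For f(x) = exp(-x^2/2) the exact solution is exp(-x^2 / (2*(1+t))) / sqrt(1+t).
f = lambda x: np.exp(-x**2 / 2)
t, x = 0.7, 0.5
print("Monte Carlo: ", u_mc(t, x, f))
print("closed form: ", np.exp(-x**2 / (2 * (1 + t))) / np.sqrt(1 + t))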
brownian-motion  stochastic-processes  via:rvenkat 
july 2018 by arsyed
'P' Versus 'Q': Differences and Commonalities between the Two Areas of Quantitative Finance by Attilio Meucci :: SSRN
There exist two separate branches of finance that require advanced quantitative techniques: the "Q" area of derivatives pricing, whose task is to "extrapolate the present"; and the "P" area of quantitative risk and portfolio management, whose task is to "model the future."

We briefly trace the history of these two branches of quantitative finance, highlighting their different goals and challenges. Then we provide an overview of their areas of intersection: the notion of risk premium; the stochastic processes used, often under different names and assumptions in the Q and in the P world; the numerical methods utilized to simulate those processes; hedging; and statistical arbitrage.
study  essay  survey  ORFE  finance  investing  probability  measure  stochastic-processes  outcome-risk 
december 2017 by nhaliday
Lecture 14: When's that meteor arriving?
- Meteors as a random process
- Limiting approximations
- Derivation of the Exponential distribution
- Derivation of the Poisson distribution
- A "Poisson process"
nibble  org:junk  org:edu  exposition  lecture-notes  physics  mechanics  space  earth  probability  stats  distribution  stochastic-processes  closure  additive  limits  approximation  tidbits  acm  binomial  multiplicative 
september 2017 by nhaliday
[1502.05274] How predictable is technological progress?
Recently it has become clear that many technologies follow a generalized version of Moore's law, i.e. costs tend to drop exponentially, at different rates that depend on the technology. Here we formulate Moore's law as a correlated geometric random walk with drift, and apply it to historical data on 53 technologies. We derive a closed form expression approximating the distribution of forecast errors as a function of time. Based on hind-casting experiments we show that this works well, making it possible to collapse the forecast errors for many different technologies at different time horizons onto the same universal distribution. This is valuable because it allows us to make forecasts for any given technology with a clear understanding of the quality of the forecasts. As a practical demonstration we make distributional forecasts at different time horizons for solar photovoltaic modules, and show how our method can be used to estimate the probability that a given technology will outperform another technology at a given point in the future.

model:
- p_t = unit price of tech
- log(p_t) = y_0 - μt + ∑_{i <= t} n_i
- n_t iid noise process
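
A minimal sketch of this model with i.i.d. Gaussian noise, the simplest case (the paper also allows autocorrelated noise; all parameter values below are made up, not fitted to any technology):

import numpy as np

rng = np.random.default_rng(0)

y0 = np.log(100.0)  # log of the initial unit price
mu = 0.10           # drift: average exponential rate of cost decline per year
sigma = 0.15        # noise scale
T = 30              # horizon in years

# log(p_t) = y_0 - mu*t + sum_{i <= t} n_i, with n_i i.i.d. N(0, sigma^2).
n = rng.normal(0.0, sigma, size=T)
log_p = y0 - mu * np.arange(1, T + 1) + np.cumsum(n)
p = np.exp(log_p)

print("unit price after %d years: %.2f (started at %.2f)" % (T, p[-1], np.exp(y0)))

With i.i.d. noise the variance of log(p_t) around the trend grows linearly in t, which is why the forecast-error distribution widens with the horizon.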
preprint  study  economics  growth-econ  innovation  discovery  technology  frontier  tetlock  meta:prediction  models  time  definite-planning  stylized-facts  regression  econometrics  magnitude  energy-resources  phys-energy  money  cost-benefit  stats  data-science  🔬  ideas  speedometer  multiplicative  methodology  stochastic-processes  time-series  stock-flow  iteration-recursion  org:mat 
april 2017 by nhaliday
Mixing (mathematics) - Wikipedia
One way to describe this is that strong mixing implies that for any two possible states of the system (realizations of the random variable), when given a sufficient amount of time between the two states, the occurrence of the states is independent.

Mixing coefficient is
α(n) = sup{|P(A ∩ B) - P(A)P(B)| : A in σ(X_0, ..., X_{t-1}), B in σ(X_{t+n}, ...), t >= 0}
for σ(...) the sigma algebra generated by those r.v.s.

So it's in the spirit of a total variation distance between the joint distribution and the product of the marginals, except that the supremum runs only over pairs of events from the two sigma algebras (the full total-variation version is the β-mixing coefficient).
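
The α coefficient itself is a supremum over events and is rarely computed directly; as a rough illustration of dependence decaying with the gap n, here is a sketch with an AR(1) chain (which is geometrically α-mixing for |φ| < 1), tracking the lag-n correlation as a proxy for, not the value of, α(n):

import numpy as np

rng = np.random.default_rng(0)

# AR(1): X_{t+1} = phi * X_t + eps_t is geometrically alpha-mixing when |phi| < 1.
phi, T = 0.8, 100_000
eps = rng.normal(size=T)
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + eps[t]

# The correlation between X_t and X_{t+n} decays like phi**n as the gap n grows.
for n in (1, 5, 10, 20):
    corr = np.corrcoef(x[:-n], x[n:])[0, 1]
    print(f"gap {n:2d}: empirical corr {corr:+.3f}   phi**n {phi ** n:+.3f}")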
concept  math  acm  physics  probability  stochastic-processes  definition  mixing  iidness  wiki  reference  nibble  limits  ergodic  math.DS  measure  dependence-independence 
february 2017 by nhaliday
