markov   1580


Optimal Bayesian estimation of the state of a probabilistically mapped memory-conditional Markov process with application to manual Morse decoding : Bell, Edison Lee : Free Download, Borrow, and Streaming : Internet Archive
This dissertation investigates the problem of automatic transcription of the hand-keyed Morse signal. A unified model for this signal process transmitted over a noisy channel is shown to be a system in which the state of the Morse process evolves as a memory-conditioned probabilistic mapping of a conditional Markov process, with the state of this process playing the role of a parameter vector of the channel model. The decoding problem is then posed as finding an optimal estimate of the state of the Morse process, given a sequence of measurements of the detected signal. The Bayesian solution to this nonlinear estimation problem is obtained explicitly for the parameter-conditional linear-Gaussian channel, and the resulting optimal decoder is shown to consist of a denumerable but exponentially expanding set of linear Kalman filters operating on a dynamically evolving trellis. Decoder performance is obtained by computer simulation for the case of random-letter message texts. For nonrandom texts, further research is indicated to specify linguistic and format-dependent models consistent with the model structure developed herein. (A minimal illustrative sketch of a Kalman-filter bank on a pruned trellis follows this entry.)
markov  cw  bayesian  machine-learning  morse-code 
23 days ago by mwishek
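A minimal, hypothetical sketch (Python) of the decoder structure described in the entry above: a bank of scalar Kalman filters, one per surviving trellis path, with beam pruning to keep the exponentially expanding hypothesis set tractable. The element set, the scalar "dit-length" state, the noise levels Q and R, and the beam width are all invented for illustration; they are not taken from the dissertation.

import math

# Hypothetical Morse elements and their nominal lengths in "dit" units.
ELEMENTS = {"dit": 1.0, "dah": 3.0, "gap": 1.0}

Q = 1e-4   # assumed process noise on the dit-length parameter (s^2)
R = 2e-3   # assumed measurement noise on observed durations (s^2)

def kalman_update(x, p, z, h):
    """One scalar Kalman step: state x is the dit length (s), p its variance,
    z a measured element duration, h the hypothesized length in dit units."""
    p = p + Q                      # predict (random-walk keying-speed model)
    s = h * h * p + R              # innovation variance
    k = p * h / s                  # Kalman gain
    innov = z - h * x
    x = x + k * innov              # corrected dit-length estimate
    p = (1.0 - k * h) * p
    loglik = -0.5 * (math.log(2 * math.pi * s) + innov * innov / s)
    return x, p, loglik

def decode(durations, beam=8, x0=0.1, p0=1e-2):
    """Expand one trellis branch per element for each measured duration and
    keep only the `beam` most likely paths (hypotheses) at every step."""
    hyps = [([], x0, p0, 0.0)]     # (element sequence, x, p, log-likelihood)
    for z in durations:
        expanded = []
        for seq, x, p, ll in hyps:
            for name, h in ELEMENTS.items():
                x2, p2, dll = kalman_update(x, p, z, h)
                expanded.append((seq + [name], x2, p2, ll + dll))
        hyps = sorted(expanded, key=lambda t: -t[3])[:beam]
    return hyps[0]

if __name__ == "__main__":
    # Noisy durations for "dit dah dit" keyed at roughly 0.1 s per dit.
    seq, x, p, ll = decode([0.11, 0.29, 0.10])
    print(seq, "estimated dit length = %.3f s" % x)

The beam pruning here is a practical substitute for the dissertation's full, denumerable and exponentially expanding filter bank.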
Conditional Markov Chains: Properties, Construction and Structured Dependence
In this paper we contribute to the theory of conditional Markov chains (CMCs) that take finitely many values and that admit intensity. We provide a method for constructing a CMC with given intensity and with given conditional initial law, which is also a doubly stochastic Markov chain. We provide a martingale characterization for such a process, and we discuss other useful properties. We define and give sufficient and necessary conditions for strong Markovian consistency and weak Markovian consistency of a multivariate CMC. We use these results to model structured dependence between univariate CMCs, that is, to model a multivariate CMC whose components are univariate CMCs with given laws. An example of a potential application of our theory is presented. (A simplified CTMC-simulation sketch follows this entry.)
markov  condition-markov-chain  machine-learning 
23 days ago by mwishek
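A short, hypothetical sketch (Python) to fix notation for the intensity matrix discussed above: it simulates an ordinary finite-state continuous-time Markov chain from a given intensity (generator) matrix via the standard competing-exponentials construction. The 3-state matrix and its rates are invented for illustration; the paper's actual object, a conditional Markov chain whose intensity is driven by a reference filtration, is not implemented here.

import random

# Hypothetical 3-state intensity (generator) matrix: rows sum to zero and
# the off-diagonal entry q[i][j] is the jump rate from state i to state j.
GEN = [
    [-0.5,  0.3,  0.2],
    [ 0.1, -0.4,  0.3],
    [ 0.2,  0.2, -0.4],
]

def simulate_ctmc(q, x0, horizon, rng=random.Random(0)):
    """Return the list of (jump time, new state) pairs on [0, horizon]."""
    t, x, path = 0.0, x0, [(0.0, x0)]
    while True:
        rate = -q[x][x]                 # total exit rate of state x
        if rate <= 0.0:
            break                       # absorbing state: no more jumps
        t += rng.expovariate(rate)      # exponential holding time
        if t > horizon:
            break
        # choose the next state j != x with probability q[x][j] / rate
        u, acc = rng.random() * rate, 0.0
        for j, qxj in enumerate(q[x]):
            if j == x:
                continue
            acc += qxj
            if u <= acc:
                x = j
                break
        path.append((t, x))
    return path

if __name__ == "__main__":
    for t, x in simulate_ctmc(GEN, x0=0, horizon=10.0):
        print("t = %5.2f  state = %d" % (t, x))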


