lstm   725

Detecting spacecraft anomalies using LSTMs and nonparametric dynamic thresholding
A framework for using LSTMs to detect anomalies in multivariate time series data. Includes spacecraft anomaly data and experiments from the Mars Science Laboratory and SMAP missions
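The framework's core idea (predict telemetry with an LSTM, then flag points whose prediction errors exceed a dynamically chosen threshold) can be sketched without the model itself. Below is a simplified, numpy-only illustration using a smoothed-error, mean-plus-z-sigma cutoff; the paper's actual nonparametric thresholding is more involved, and the function name and parameters here are our own, not from the repo.

```python
import numpy as np

def flag_anomalies(y_true, y_pred, window=5, z=3.0):
    """Flag indices whose smoothed prediction error exceeds
    mean + z * std of the smoothed errors. A simplified stand-in
    for the paper's nonparametric dynamic thresholding."""
    errors = np.abs(y_true - y_pred)
    # smooth the raw errors with a simple moving average
    kernel = np.ones(window) / window
    smoothed = np.convolve(errors, kernel, mode="same")
    threshold = smoothed.mean() + z * smoothed.std()
    return np.where(smoothed > threshold)[0]

# toy telemetry: predictions track the signal except for one spike
t = np.linspace(0, 10, 200)
y_pred = np.sin(t)
y_true = np.sin(t)
y_true[100] += 5.0  # injected anomaly
anomalies = flag_anomalies(y_true, y_pred)
```

Because the smoothing window spreads the spike's error over neighboring points, a few indices around the injected anomaly are flagged together, which mirrors how the paper groups contiguous flagged points into anomalous sequences.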
LSTM  Anomaly  Detection 
yesterday by FredericJacobs
Understanding LSTM Networks -- colah's blog
Humans don’t start their thinking from scratch every second. As you read this essay, you understand each word based on your understanding of previous words. You don’t throw everything away and start thinking from scratch again. Your thoughts have persistence.

Traditional neural networks can’t do this, and it seems like a major shortcoming. For example, imagine you want to classify what kind of event is happening at every point in a movie. It’s unclear how a traditional neural network could use its reasoning about previous events in the film to inform later ones.

Recurrent neural networks address this issue. They are networks with loops in them, allowing information to persist.
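The "loop" is just a hidden state fed back in at every step. A minimal vanilla RNN cell in numpy makes this concrete (this is our illustrative sketch, not code from colah's post; weights are random and untrained):

```python
import numpy as np

rng = np.random.default_rng(0)

# input -> hidden and hidden -> hidden weights; W_hh is the "loop"
W_xh = rng.normal(size=(3, 4)) * 0.1
W_hh = rng.normal(size=(4, 4)) * 0.1
b_h = np.zeros(4)

def rnn_step(x, h):
    """One recurrent step: the new hidden state depends on both
    the current input x and the previous hidden state h."""
    return np.tanh(x @ W_xh + h @ W_hh + b_h)

h = np.zeros(4)                    # state starts empty
for x in rng.normal(size=(6, 3)):  # a sequence of 6 inputs
    h = rnn_step(x, h)             # h persists across steps
```

After the loop, `h` summarizes the whole sequence seen so far — the persistence that feedforward networks lack.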
ai  ml  machinelearning  rnn  LSTM 
4 weeks ago by euler
jaungiers/LSTM-Neural-Network-for-Time-Series-Prediction: LSTM built using Keras Python package to predict time series steps and sequences. Includes sin wave and stock market data
LSTM built using Keras Python package to predict time series steps and sequences. Includes sin wave and stock market data - jaungiers/LSTM-Neural-Network-for-Time-Series-Prediction
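Repos like this one first reframe a raw series as supervised (window, next-value) pairs before any LSTM is involved. A numpy sketch of that windowing step, using the repo's sin-wave toy data (the function name and window size are our assumptions, not the repo's API):

```python
import numpy as np

def make_windows(series, window):
    """Turn a 1-D series into (input window, next value) pairs —
    the supervised framing an LSTM regressor is trained on."""
    X = np.array([series[i:i + window]
                  for i in range(len(series) - window)])
    y = series[window:]
    return X, y

series = np.sin(np.linspace(0, 4 * np.pi, 100))  # sin-wave toy data
X, y = make_windows(series, window=10)
# X has shape (90, 10): 90 samples of 10 past values each;
# y[i] is the value immediately following window X[i]
```

In Keras these windows would then be reshaped to `(samples, timesteps, features)` and fed to an `LSTM` layer.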
lstm  time-series  keras  deep-learning  github 
6 weeks ago by nharbour
Time Series Prediction Using LSTM Deep Neural Networks
CONCLUSION
Whilst this article aims to give a working example of LSTM deep neural networks in practice, it has only scratched the surface of their potential and application in sequential and temporal problems.

As of writing, LSTMs have been successfully applied to a multitude of real-world problems, from classical time series forecasting as described here, to text auto-correction, anomaly detection and fraud detection, and as a core component of self-driving car technologies under development.

There are currently some limitations to the vanilla LSTMs described above, particularly for financial time series: the series itself has non-stationary properties that are very hard to model (although advances have been made in using Bayesian deep neural network methods to tackle non-stationarity). For some applications it has also been found that newer attention-based mechanisms have outperformed LSTMs (and LSTMs coupled with attention-based mechanisms have outperformed either on its own).
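A classical (pre-deep-learning) remedy for the non-stationarity mentioned above is first-differencing: modeling changes (returns) rather than levels. A toy sketch, not from the article, showing that differencing removes a linear trend entirely:

```python
import numpy as np

# a toy trending "price" series: pure linear drift
t = np.arange(100, dtype=float)
prices = 0.5 * t + 10.0

# first difference: each step's change; the trend becomes a constant
returns = np.diff(prices)
```

Real financial series are noisy and only approximately de-trended this way, which is part of why the article points toward Bayesian and attention-based approaches instead.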
deeplearning  timeseries  AI  ML  LSTM 
6 weeks ago by euler

