ann   902


linkedin/scanns: A scalable nearest neighbor search library in Apache Spark
ScANNS is a nearest neighbor search library for Apache Spark, originally developed by Namit Katariya of the LinkedIn Machine Learning Algorithms team. It enables nearest neighbor search in a batch, offline context within the cosine, Jaccard, and Euclidean distance spaces.
ANN  spark 
11 days ago by foodbaby
[1806.09823] Approximate Nearest Neighbor Search in High Dimensions
The nearest neighbor problem is defined as follows: given a set P of n points in some metric space (X, D), build a data structure that, given any point q, returns a point in P that is closest to q (its "nearest neighbor" in P). The data structure stores additional information about the set P, which is then used to find the nearest neighbor without computing all distances from q to the points of P. The problem has a wide range of applications in machine learning, computer vision, databases and other fields.
To reduce the time needed to find nearest neighbors and the amount of memory used by the data structure, one can formulate the approximate nearest neighbor problem, where the goal is to return any point p′ ∈ P such that the distance from q to p′ is at most c · min_{p ∈ P} D(q, p), for some c ≥ 1. Over the last two decades, many efficient solutions to this problem were developed. In this article we survey these developments, as well as their connections to questions in geometric functional analysis and combinatorial geometry.
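The definitions in the abstract above can be sketched in a few lines of Python. This is illustrative only: `nearest_neighbor` and `is_c_approximate` are hypothetical names, and the brute-force linear scan shown is exactly the cost that ANN data structures are built to avoid.

```python
import math

def nearest_neighbor(P, q, dist=math.dist):
    """Exact nearest neighbor: scan every point in P (O(n) distance computations)."""
    return min(P, key=lambda p: dist(q, p))

def is_c_approximate(p_prime, P, q, c, dist=math.dist):
    """Check the c-ANN guarantee: D(q, p') <= c * min_{p in P} D(q, p)."""
    best = dist(q, nearest_neighbor(P, q, dist))
    return dist(q, p_prime) <= c * best
```

Any point satisfying the second predicate is an acceptable answer to the c-approximate query, which is what lets ANN structures trade exactness for sublinear query time.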
nearest-neighbor  ANN  survey  papers 
28 days ago by foodbaby
Abstract for An Investigation of Practical Approximate Nearest Neighbor Algorithms - Semantic Scholar
This paper concerns approximate nearest neighbor searching algorithms, which have become increasingly important, especially in high-dimensional perception areas such as computer vision, with dozens of publications in recent years. Much of this enthusiasm is due to a successful new approximate nearest neighbor approach called Locality Sensitive Hashing (LSH). In this paper we ask the question: can earlier spatial data structure approaches to exact nearest neighbor, such as metric trees, be altered to provide approximate answers to proximity queries, and if so, how? We introduce a new kind of metric tree that allows overlap: certain datapoints may appear in both children of a parent. We also introduce new approximate k-NN search algorithms on this structure. We show why these structures should be able to exploit the same random-projection-based approximations that LSH enjoys, but with a simpler algorithm and perhaps with greater efficiency. We then provide a detailed empirical evaluation on five large, high-dimensional datasets, which shows up to 31-fold accelerations over LSH. This result holds true throughout the spectrum of approximation levels.
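The random-projection approximation the abstract alludes to can be illustrated with a minimal SimHash-style sketch. This is one of several random-projection LSH variants for cosine similarity, not the paper's metric-tree algorithm, and the helper names are hypothetical.

```python
import random

def random_planes(k, dim, seed=0):
    """Draw k random Gaussian hyperplanes in R^dim."""
    rng = random.Random(seed)
    return [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(k)]

def simhash_signature(v, planes):
    """One bit per hyperplane: the sign of the dot product.
    Vectors with small cosine distance tend to share signature bits,
    so equal signatures make good candidate buckets."""
    return tuple(1 if sum(a * b for a, b in zip(p, v)) >= 0 else 0
                 for p in planes)
```

Because the bits depend only on the sign of the projections, a vector and any positive scaling of it hash to the same signature, which matches cosine similarity's scale invariance.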
ANN  LSH  random-projection  metric  tree 
january 2019 by foodbaby
Vectors in Search – Towards More Semantic Matching - Simon Hughes, Di…
Approximate Nearest Neighbor Search
• Faster than full k-NN, with some loss in accuracy
• Approaches can be either:
  • Data Dependent: learns and adjusts from the data; makes indexing new documents hard
  • Data Independent
• Some approaches:
  • KD Tree
  • LSH
  • Heuristic methods: K-Means Tree, Randomized KD Forest (Paper:)
  • HNSW (Hierarchical Navigable Small World Graphs – Top on) (Paper:)
  • Vector Thresholding
• Choice of similarity metric is important in choosing an algorithm
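The greedy routing at the heart of the HNSW-style graph search mentioned in the slides can be sketched as follows. This is a simplified single-layer version with a hypothetical `greedy_search` name, not the full multi-layer HNSW algorithm.

```python
def greedy_search(graph, dist_to_q, start):
    """Greedy routing on a proximity graph (the core idea behind NSW/HNSW):
    from the current node, hop to the neighbor closest to the query,
    stopping at a local minimum of the distance function."""
    cur = start
    while True:
        # Best neighbor of the current node (fall back to cur if isolated).
        nxt = min(graph[cur], key=dist_to_q, default=cur)
        if dist_to_q(nxt) >= dist_to_q(cur):
            return cur  # no neighbor improves: local minimum reached
        cur = nxt
```

Full HNSW runs this routine on a hierarchy of progressively denser layers and keeps a candidate beam rather than a single node, which is what makes the local minimum a good approximate nearest neighbor in practice.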
ANN  vectors  search 
december 2018 by foodbaby
The Unreasonable Effectiveness of Recurrent Neural Networks
An amazing article showing the generation of new text from trained RNNs. Includes one of my favourite examples: machine-generated Shakespeare.
rnn  recurrentneuralnetwork  ann  artificialneuralnetwork  ai  machinelearning  deeplearning 
december 2018 by ids


