k-means   247


K-Means Clustering: Diving into Unsupervised Learning | Data Stuff
Unsupervised Learning has been called the closest thing we have to "actual" Artificial Intelligence, in the sense of General AI. Let's see it in action.
clustering  ml  dev  k-means  machine-learning  toread 
april 2019 by xer0x
Parameter Estimation for von Mises–Fisher Mixture Model via Gaussian Distribution
A variant of "Clustering on the Unit Hypersphere using von Mises-Fisher Distributions" that uses a variational Bayes technique (ouch).
coink  clustering  mixture  von-mises-fisher  spherical  k-means  machine  learning 
january 2019 by paunit
Clustering on the Unit Hypersphere using von Mises-Fisher Distributions
Everything is in the title -- the vMF distribution on the hypersphere is somehow the "minimal-information" parametric distribution targeting a given mean. Nice derivation of approximate ML estimates for mixtures of vMF distributions and an application in high dimensions (> 100). (A spherical k-means sketch follows this entry.)
coink  clustering  mixture  von-mises-fisher  spherical  k-means  machine  learning 
january 2019 by paunit
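
The two vMF papers above fit mixtures of von Mises-Fisher distributions; when every component shares one large concentration parameter, the model reduces to spherical k-means, i.e. cosine-similarity k-means on unit-normalized vectors. A minimal NumPy sketch of that special case (not the papers' variational-Bayes or ML estimation):

    # Spherical k-means sketch: cluster unit vectors by cosine similarity.
    # Special case of a vMF mixture with one shared, large concentration.
    import numpy as np

    def spherical_kmeans(X, k, n_iter=50, seed=0):
        rng = np.random.default_rng(seed)
        X = X / np.linalg.norm(X, axis=1, keepdims=True)         # project rows onto the unit sphere
        centers = X[rng.choice(len(X), size=k, replace=False)]   # random initialization
        for _ in range(n_iter):
            labels = np.argmax(X @ centers.T, axis=1)            # assign by largest cosine similarity
            for j in range(k):
                members = X[labels == j]
                if len(members):
                    m = members.sum(axis=0)
                    centers[j] = m / np.linalg.norm(m)           # renormalized mean direction
        return labels, centers
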
[1612.07545] A Revisit of Hashing Algorithms for Approximate Nearest Neighbor Search
"Approximate Nearest Neighbor Search (ANNS) is a fundamental problem in many areas of machine learning and data mining. During the past decade, numerous hashing algorithms are proposed to solve this problem. Every proposed algorithm claims outperform other state-of-the-art hashing methods. However, the evaluation of these hashing papers was not thorough enough, and those claims should be re-examined. The ultimate goal of an ANNS method is returning the most accurate answers (nearest neighbors) in the shortest time. If implemented correctly, almost all the hashing methods will have their performance improved as the code length increases. However, many existing hashing papers only report the performance with the code length shorter than 128. In this paper, we carefully revisit the problem of search with a hash index, and analyze the pros and cons of two popular hash index search procedures. Then we proposed a very simple but effective two level index structures and make a thorough comparison of eleven popular hashing algorithms. Surprisingly, the random-projection-based Locality Sensitive Hashing (LSH) is the best performed algorithm, which is in contradiction to the claims in all the other ten hashing papers. Despite the extreme simplicity of random-projection-based LSH, our results show that the capability of this algorithm has been far underestimated. For the sake of reproducibility, all the codes used in the paper are released on GitHub, which can be used as a testing platform for a fair comparison between various hashing algorithms."
to:NB  data_mining  approximation  nearest_neighbors  locality-sensitive_hashing  hashing  have_read  via:vaguery  random_projections  k-means  databases 
january 2019 by cshalizi
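
The abstract's headline result is that plain random-projection LSH wins. A minimal sign-random-projection hashing sketch in NumPy; the class and parameter names are illustrative, and this is not the paper's two-level index (that code is on the authors' GitHub):

    # Sign-random-projection LSH sketch: each bit records which side of a
    # random hyperplane a vector falls on; similar vectors share most bits.
    import numpy as np

    class RandomProjectionLSH:
        def __init__(self, dim, n_bits=32, seed=0):
            rng = np.random.default_rng(seed)
            self.planes = rng.standard_normal((n_bits, dim))   # random hyperplanes

        def hash(self, x):
            return ((self.planes @ x) >= 0).astype(np.uint8)   # n_bits binary code

        def hamming(self, a, b):
            return int(np.count_nonzero(a != b))

    # Usage: bucket vectors by their codes, then rank candidates from nearby
    # buckets by Hamming distance (or exact distance) to the query.
    lsh = RandomProjectionLSH(dim=128, n_bits=32)
    code = lsh.hash(np.random.default_rng(1).standard_normal(128))
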
K-means Loss Calculation - vision - PyTorch Forums
Can someone give an idea of how to implement a k-means clustering loss in PyTorch?

Also, I am using PyTorch's nn.MSELoss(). Is there a way to add L2 regularization to this term? In short, how do I use an L2-regularized loss? (A minimal sketch follows this entry.)
k-means  k-nearest-neighbours  pytorch  deep-learning 
august 2018 by nharbour
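
A minimal sketch addressing both questions, not taken from the thread's answers: a k-means style loss in which each point is charged its squared distance to the nearest centroid, and L2 regularization obtained the usual way via the optimizer's weight_decay argument:

    # k-means style loss in PyTorch, optimized by gradient descent on the
    # centroids; weight_decay adds L2 regularization on the optimized tensors.
    import torch

    def kmeans_loss(x, centroids):
        d2 = torch.cdist(x, centroids, p=2) ** 2        # (N, K) squared distances
        return d2.min(dim=1).values.mean()              # charge each point its nearest centroid

    x = torch.randn(256, 16)                            # toy data
    centroids = torch.randn(8, 16, requires_grad=True)
    opt = torch.optim.SGD([centroids], lr=0.1, weight_decay=1e-4)

    for _ in range(100):
        opt.zero_grad()
        loss = kmeans_loss(x, centroids)
        loss.backward()
        opt.step()
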
K-Means Clustering in R Tutorial (article) - DataCamp
Learn all about clustering and, more specifically, k-means in this R Tutorial, where you'll focus on a case study with Uber data.
k-means  r-statistical 
march 2018 by dougleigh
K-Means Clustering in Python · Mubaris NK
Clustering is a type of unsupervised learning, very often used when you don't have labeled data. K-means clustering is one of the most popular clustering algorithms. The goal of this algorithm is to find groups (clusters) in the given data. In this post we will implement the k-means algorithm in Python from scratch. (A minimal sketch follows this entry.)
python  k-means  clustering  tutorial 
october 2017 by morganwatch
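
A minimal from-scratch k-means sketch in NumPy along the lines the post describes (assign each point to the nearest center, recompute the means, repeat); this is not the post's exact code:

    # Lloyd's algorithm: alternate between assigning points to their nearest
    # center and moving each center to the mean of its assigned points.
    import numpy as np

    def kmeans(X, k, n_iter=100, seed=0):
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]   # random initialization
        for _ in range(n_iter):
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            labels = d2.argmin(axis=1)                           # assignment step
            new_centers = np.array([
                X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
                for j in range(k)                                # update step; keep empty clusters' centers
            ])
            if np.allclose(new_centers, centers):                # converged
                break
            centers = new_centers
        return labels, centers
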
k_means_clustering/kmeans.py.ipynb at master · llSourcell/k_means_clustering
k_means_clustering - This is the code for "K-Means Clustering - The Math of Intelligence (Week 3)" by Siraj Raval on YouTube.
k-means  machine-learning  deep-learning 
september 2017 by nharbour


related tags

a_bendaizer  active-learning  adaboost  algorithm  algorithms  analysis  analytics  approximation  apriori  artificial-intelligence  author-identification  autoencoder  bayesian  big-data  blog  c4.5  cart  clojure  cluster-analysis  cluster  clusteranalysis  clustering  clusters  code  coink  color  coloring  colors  colourquantisation  comparison  computer  computerscience  convnet  coreset  countries  cuda  cv  d3  dask  data-analysis  data-mining  data-science  data-stream  data  data_mining  data_science  dataanalysis  databases  datamining  datascience  dataviz  dbscan  deep-learning  density  dev  development  dimensionality-reduction  diversity  dominant  dp-means  dynamic-programming  economics  em  exploration  fast.ai  fast  feature-learning  featurelearning  free  gap  gdp  gem  gep_gep  ggplot2  gotchas  gpu  growth  hadoop  hashing  have_read  hdbscan  how-to  howto  image-processing  image  images  impl  imported  information_theory  ipython  jump_method  k-mc2  k-nearest-neighbours  kifi  kmeans  knn  lda  learning  library  libs  linguistics  links  locality-sensitive_hashing  lunch  machine-learning  machine  machine_learning  machinelearning  map  master_class_data_science  mathematics  matlab  mean-shift  means  mini-batch  minibatch  mining  mixture  ml  mllib  modeling  naive-bayes  nearest  nearest_neighbors  neighbor  neural-net  neuralnetworks  nlp  nonparametrics  noverlap  open  opencv  optdigits  pagerank  papers  pca  performance  postgres  primary  private_discussions  processing  py  pyimage  python  pytorch  r-project  r-statistical  r  rails  random-projections  random_projections  rchart  recommender-system  reference  representative  research-article  rjdbc  ruby  sampling  scaling  science  scikit  scipy  sckikit-learn  scotch  seed  similiarity  sklearn  slides  social  sofia  software  spark  spherical  statistics  stats.stackexchange  stats  summarization  supervised  svm  tensorflow  text-mining  textbook  thesis  to:nb  topic  toread  trading  traps  tutorial  twitter  unsupervised-learning  unsupervised  validation  vector  vision  visual  visualization  von-mises-fisher  web  weka  whisky  word2vec 
