Machine-Learning   23162


graph2vec: Learning Distributed Representations of Graphs
Pointed out by a coworker. The goal is to create a vector embedding (“distributed representation”) for graphs: you start with a collection of graphs, which need not be the same size, and end with a matrix of vectors such that distances between the vectors are related in some way to similarity between the graphs. The approach, as best I can tell, is to literally take `doc2vec` and replace “documents” with “graphs” and “words” with “rooted subgraphs.” (Their rooted subgraphs are the subgraphs reachable within k hops of the root; I thought there was a standard name for this but can’t find it right now.) They have some benchmark data sets and tasks, which generally show `graph2vec` performing about as well as something called the “Deep WL kernel,” although in their tests `graph2vec` takes less time to train.
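The “words” in that analogy come from Weisfeiler-Lehman relabeling: each WL iteration assigns every node a token that stands for its rooted subgraph of that depth, and the multiset of tokens plays the role of a graph’s bag of words. A minimal stdlib-only sketch of that tokenization (my own reconstruction, not the paper’s code; graphs are plain adjacency dicts and initial labels are node degrees):

```python
from collections import Counter
import hashlib

def wl_subgraph_tokens(adj, iterations=2):
    """Weisfeiler-Lehman relabeling: each iteration gives every node a
    token standing for its rooted subgraph of that depth. Returns the
    multiset of tokens -- the graph's 'bag of words' for doc2vec."""
    # initial labels: node degree (as graph2vec apparently does)
    labels = {v: str(len(nbrs)) for v, nbrs in adj.items()}
    tokens = Counter(labels.values())
    for _ in range(iterations):
        new_labels = {}
        for v, nbrs in adj.items():
            # combine own label with sorted neighbor labels, then hash
            # so the compressed label is order-independent
            sig = labels[v] + "|" + ",".join(sorted(labels[u] for u in nbrs))
            new_labels[v] = hashlib.md5(sig.encode()).hexdigest()[:8]
        labels = new_labels
        tokens.update(labels.values())
    return tokens
```

Isomorphic graphs produce identical token multisets, which is exactly what makes these usable as shared vocabulary across a corpus of graphs.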

My comments:

The comparisons are not very useful. They include a comparison to `node2vec` in their benchmarks, but in order to get a graph representation from `node2vec` they just average the vectors for all the nodes in the graph. They also don’t discuss their methodology for training `node2vec` given a corpus of graphs; `node2vec` (I think) was originally specified for a single graph.
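To be concrete about how crude that baseline is, the graph representation they extract from `node2vec` is just the coordinate-wise mean of the node embeddings, something like:

```python
def graph_vector_by_averaging(node_vectors):
    """Collapse per-node embeddings into a single graph embedding by
    coordinate-wise averaging -- the baseline treatment of node2vec
    described above. node_vectors: dict mapping node -> list of floats."""
    dim = len(next(iter(node_vectors.values())))
    avg = [0.0] * dim
    for vec in node_vectors.values():
        for i, x in enumerate(vec):
            avg[i] += x / len(node_vectors)
    return avg
```

Averaging throws away all structural information about how the nodes relate, so it is not surprising it makes a weak comparison point.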

The timing information is also quite telling — `node2vec` and the Deep WL kernel take orders of magnitude longer than the other techniques, yet improve accuracy over the plain WL kernel by barely measurable, possibly not even statistically significant, amounts.

Interesting follow-ups:

The Weisfeiler-Lehman graph kernel — what is this and why are they so into it?

The WL relabeling technique — their method needs node labels, which are apparently supplied just by using the node degree? How does this affect realistic applications?

The `word2vec` skipgram model and negative sampling — gosh I should really actually learn these
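As a note-to-self on that last one: a single stochastic update of skip-gram with negative sampling is small enough to write out directly. A stdlib-only sketch (my own, not from the paper — `lr` and the vector representation are arbitrary choices for illustration):

```python
import math

def sgns_step(center_vec, context_vec, negative_vecs, lr=0.025):
    """One SGD update of skip-gram with negative sampling: push the true
    (center, context) pair's score toward 1 and each sampled negative's
    score toward 0, under the logistic loss. Vectors are mutated in place."""
    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    # positive pair: gradient coefficient (1 - sigma(u.v)) pulls the pair together
    g = (1.0 - sigmoid(dot(center_vec, context_vec))) * lr
    grad_center = [g * x for x in context_vec]
    context_vec[:] = [c + g * x for c, x in zip(context_vec, center_vec)]

    # negatives: coefficient -sigma(u.v) pushes each sampled pair apart
    for neg in negative_vecs:
        g = -sigmoid(dot(center_vec, neg)) * lr
        grad_center = [gc + g * x for gc, x in zip(grad_center, neg)]
        neg[:] = [n + g * x for n, x in zip(neg, center_vec)]

    # apply the accumulated center-word gradient last
    center_vec[:] = [c + gc for c, gc in zip(center_vec, grad_center)]
```

After a step, the dot product with the true context goes up and the dot products with the negatives go down — that’s the whole trick that lets `word2vec` avoid the full softmax over the vocabulary.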
research  networks  machine-learning  vector-embedding 
21 hours ago by DGrady
spaCy - Industrial-strength Natural Language Processing in Python
spaCy excels at large-scale information extraction tasks. It's written from the ground up in carefully memory-managed Cython. Independent research has confirmed that spaCy is the fastest in the world. If your application needs to process entire web dumps, spaCy is the library you want to be using.
machine-learning  programming  nlp  libraries  python 
yesterday by casey.chow

