neural-networks   1845


[1812.04948] A Style-Based Generator Architecture for Generative Adversarial Networks
We propose an alternative generator architecture for generative adversarial networks, borrowing from style transfer literature. The new architecture leads to an automatically learned, unsupervised separation of high-level attributes (e.g., pose and identity when trained on human faces) and stochastic variation in the generated images (e.g., freckles, hair), and it enables intuitive, scale-specific control of the synthesis. The new generator improves the state-of-the-art in terms of traditional distribution quality metrics, leads to demonstrably better interpolation properties, and also better disentangles the latent factors of variation. To quantify interpolation quality and disentanglement, we propose two new, automated methods that are applicable to any generator architecture. Finally, we introduce a new, highly varied and high-quality dataset of human faces.
generative-art  generative-models  neural-networks  multiscale  very-impressive  to-write-about  consider:performance-measures 
2 days ago by Vaguery
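The core mechanism is easy to sketch: a mapping network transforms the latent z into an intermediate style code w, which then modulates each synthesis scale through adaptive instance normalization (AdaIN). Below is a minimal PyTorch illustration of that idea, not the paper's implementation; all shapes and layer sizes are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaIN(nn.Module):
    """Adaptive instance norm: per-channel scale/bias predicted from the style w."""
    def __init__(self, w_dim, channels):
        super().__init__()
        self.affine = nn.Linear(w_dim, channels * 2)

    def forward(self, x, w):
        scale, bias = self.affine(w).chunk(2, dim=1)
        x = F.instance_norm(x)  # wipe out the per-map statistics first
        return x * (1 + scale[:, :, None, None]) + bias[:, :, None, None]

# mapping network: latent z -> intermediate style code w
mapping = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 512))
z = torch.randn(4, 512)
w = mapping(z)
features = torch.randn(4, 64, 16, 16)   # feature maps at one synthesis scale
styled = AdaIN(512, 64)(features, w)    # scale-specific style injection
```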
[1810.00845] CHET: Compiler and Runtime for Homomorphic Evaluation of Tensor Programs
Fully Homomorphic Encryption (FHE) refers to a set of encryption schemes that allow computations to be applied directly on encrypted data without requiring a secret key. This enables novel application scenarios where a client can safely offload storage and computation to a third-party cloud provider without having to trust the software and the hardware vendors with the decryption keys. Recent advances in both FHE schemes and implementations have moved such applications from theoretical possibilities into the realm of practicalities.
This paper proposes a compact and well-reasoned interface called the Homomorphic Instruction Set Architecture (HISA) for developing FHE applications. Just as the hardware ISA interface enabled hardware advances to proceed independent of software advances in the compiler and language runtimes, HISA decouples compiler optimizations and runtimes for supporting FHE applications from advancements in the underlying FHE schemes.
This paper demonstrates the capabilities of HISA by building an end-to-end software stack for evaluating neural network models on encrypted data. Our stack includes an end-to-end compiler, runtime, and a set of optimizations. We show that, on a set of popular neural network architectures, the generated code is faster than hand-optimized implementations.
to-understand  machine-learning  algorithms  distributed-processing  seems-important  languages  neural-networks 
4 days ago by Vaguery
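The ISA analogy suggests what such an interface boundary might look like. The sketch below is purely illustrative (the paper defines HISA precisely; this hypothetical Python protocol is not it): a fixed set of ciphertext instructions behind which FHE schemes can evolve, and above which a compiler can lower tensor operations.

```python
from abc import ABC, abstractmethod

class HISABackend(ABC):
    """Hypothetical FHE backend hidden behind a fixed instruction interface."""
    @abstractmethod
    def encrypt(self, plaintext): ...
    @abstractmethod
    def add(self, ct1, ct2): ...
    @abstractmethod
    def multiply(self, ct1, ct2): ...
    @abstractmethod
    def rotate(self, ct, k): ...  # slot rotation, the workhorse of tensor reductions

def dot(backend, ct_x, ct_w, n):
    """How a compiler could lower a dot product to instruction-level calls."""
    acc = backend.multiply(ct_x, ct_w)   # elementwise product in SIMD slots
    step = 1
    while step < n:                      # log(n) rotate-and-add reduction
        acc = backend.add(acc, backend.rotate(acc, step))
        step *= 2
    return acc
```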
[1811.09620] TimbreTron: A WaveNet(CycleGAN(CQT(Audio))) Pipeline for Musical Timbre Transfer
In this work, we address the problem of musical timbre transfer, where the goal is to manipulate the timbre of a sound sample from one instrument to match another instrument while preserving other musical content, such as pitch, rhythm, and loudness. In principle, one could apply image-based style transfer techniques to a time-frequency representation of an audio signal, but this depends on having a representation that allows independent manipulation of timbre as well as high-quality waveform generation. We introduce TimbreTron, a method for musical timbre transfer which applies "image" domain style transfer to a time-frequency representation of the audio signal, and then produces a high-quality waveform using a conditional WaveNet synthesizer. We show that the Constant Q Transform (CQT) representation is particularly well-suited to convolutional architectures due to its approximate pitch equivariance. Based on human perceptual evaluations, we confirmed that TimbreTron recognizably transferred the timbre while otherwise preserving the musical content, for both monophonic and polyphonic samples.
style-transfer  neural-networks  feature-extraction  signal-processing  audio  to-write-about  consider:performance-measures 
5 days ago by Vaguery
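The front end of the pipeline can be sketched with librosa (an assumption; the paper's CycleGAN and conditional WaveNet stages are not reproduced here): log-magnitude CQT frames provide the roughly pitch-equivariant "image" that the style-transfer stage manipulates.

```python
import librosa
import numpy as np

# librosa.example downloads a short demo clip the first time it runs
y, sr = librosa.load(librosa.example("trumpet"))
C = librosa.cqt(y, sr=sr, hop_length=512, n_bins=84, bins_per_octave=12)
log_mag = np.log1p(np.abs(C))   # time-frequency "image" for the style-transfer stage
print(log_mag.shape)            # (n_bins, n_frames)
```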
[1803.09473] code2vec: Learning Distributed Representations of Code
We present a neural model for representing snippets of code as continuous distributed vectors ("code embeddings"). The main idea is to represent a code snippet as a single fixed-length code vector, which can be used to predict semantic properties of the snippet. This is performed by decomposing code into a collection of paths in its abstract syntax tree, and learning the atomic representation of each path simultaneously with learning how to aggregate a set of them. We demonstrate the effectiveness of our approach by using it to predict a method's name from the vector representation of its body. We evaluate our approach by training a model on a dataset of 14M methods. We show that code vectors trained on this dataset can predict method names from files that were completely unobserved during training. Furthermore, we show that our model learns useful method name vectors that capture semantic similarities, combinations, and analogies. Compared with previous techniques over the same dataset, our approach obtains a relative improvement of over 75%, and is the first to successfully predict method names based on a large, cross-project corpus. Our trained model, visualizations and vector similarities are available as an interactive online demo at this http URL. The code, data, and trained models are available at this https URL.
representation  genetic-programming  (it-ain't)  deep-learning  neural-networks  feature-construction  to-write-about  discrete-and-continuous-sittin-in-a-tree 
6 days ago by Vaguery
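The path-based representation is straightforward to sketch. Below is a rough Python illustration using the standard ast module; note the paper works on Java and pairs terminal token values (not node-type names) with a learned, attention-weighted aggregation, neither of which is shown here.

```python
import ast
from itertools import combinations

def root_paths(node, prefix=()):
    """Yield the root-to-leaf node sequence for every leaf in the AST."""
    prefix = prefix + (node,)
    children = list(ast.iter_child_nodes(node))
    if not children:
        yield prefix
    for child in children:
        yield from root_paths(child, prefix)

def path_context(p1, p2):
    """Join two root paths at their lowest common ancestor: up, across, down."""
    i = 0
    while i < min(len(p1), len(p2)) and p1[i] is p2[i]:
        i += 1
    def names(nodes):
        return [type(n).__name__ for n in nodes]
    return names(reversed(p1[i:])) + names([p1[i - 1]]) + names(p2[i:])

tree = ast.parse("def add(a, b):\n    return a + b")
leaves = list(root_paths(tree))
contexts = [path_context(a, b) for a, b in combinations(leaves, 2)]
print(contexts[0])  # e.g. ['arg', 'arguments', 'arg'], a leaf-to-leaf path
```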
[1808.09357] Rational Recurrences
Despite the tremendous empirical success of neural models in natural language processing, many of them lack the strong intuitions that accompany classical machine learning approaches. Recently, connections have been shown between convolutional neural networks (CNNs) and weighted finite state automata (WFSAs), leading to new interpretations and insights. In this work, we show that some recurrent neural networks also share this connection to WFSAs. We characterize this connection formally, defining rational recurrences to be recurrent hidden state update functions that can be written as the Forward calculation of a finite set of WFSAs. We show that several recent neural models use rational recurrences. Our analysis provides a fresh view of these models and facilitates devising new neural architectures that draw inspiration from WFSAs. We present one such model, which performs better than two recent baselines on language modeling and text classification. Our results demonstrate that transferring intuitions from classical models like WFSAs can be an effective approach to designing and understanding neural models.
automata  representation  neural-networks  recurrent-networks  architecture  rather-interesting  ReQ  to-write-about 
7 days ago by Vaguery
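The Forward calculation the paper identifies with rational recurrences is short enough to show directly. A minimal sketch, assuming a toy 2-state WFSA over a small vocabulary: the hidden-state update is just a vector-matrix product through a transition matrix selected by each input symbol.

```python
import numpy as np

n_states, vocab = 2, 5
rng = np.random.default_rng(0)
T = rng.uniform(size=(vocab, n_states, n_states))  # one transition matrix per symbol
start = np.array([1.0, 0.0])                       # initial state weights
final = np.array([0.0, 1.0])                       # final state weights

def forward(symbols):
    h = start
    for s in symbols:          # rational recurrence: h_t = h_{t-1} @ T[x_t]
        h = h @ T[s]
    return h @ final           # total weight of all accepting paths

print(forward([0, 3, 1, 4]))
```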
[1810.06758] Discriminator Rejection Sampling
We propose a rejection sampling scheme using the discriminator of a GAN to approximately correct errors in the GAN generator distribution. We show that under quite strict assumptions, this will allow us to recover the data distribution exactly. We then examine where those strict assumptions break down and design a practical algorithm - called Discriminator Rejection Sampling (DRS) - that can be used on real datasets. Finally, we demonstrate the efficacy of DRS on a mixture of Gaussians and on the SAGAN model, state-of-the-art in the image generation task at the time of developing this work. On ImageNet, we train an improved baseline that increases the Inception Score from 52.52 to 62.36 and reduces the Fréchet Inception Distance from 18.65 to 14.79. We then use DRS to further improve on this baseline, improving the Inception Score to 76.08 and the FID to 13.75.
neural-networks  machine-learning  algorithms  generative-models  schemes  rather-interesting  to-understand 
7 days ago by Vaguery
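The core loop is simple to sketch. In the following, `generate()` and `logit()` are hypothetical stand-ins for a trained GAN's sampler and its discriminator's pre-sigmoid output; the paper adds a gamma shift and other practical corrections that this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)

def generate():                 # hypothetical: draw one sample from the generator
    return rng.normal(size=8)

def logit(x):                   # hypothetical: discriminator logit D(x)
    return float(x.sum()) * 0.1

# estimate the maximum logit D_M from a burn-in batch of generator samples
D_M = max(logit(generate()) for _ in range(1000))

def drs_sample():
    while True:
        x = generate()
        # e^{D(x)} estimates the density ratio p_data/p_g up to a constant,
        # so e^{D(x) - D_M} is a valid rejection-sampling acceptance probability
        p_accept = min(1.0, np.exp(logit(x) - D_M))
        if rng.uniform() < p_accept:
            return x

print(drs_sample()[:3])
```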
Random Forests Classifiers in Python (article) - DataCamp
Learn about Random Forests and build your own model in Python, for both classification and regression.
machine-learning  neural-networks  ai 
10 days ago by hay
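In the spirit of the tutorial, here is a short scikit-learn sketch on a built-in dataset (the article's own data and code are not reproduced here):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))  # mean accuracy on held-out data
```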
Predicting the Survival of Titanic Passengers – Towards Data Science
In this blog post, I will go through the whole process of creating a machine learning model on the famous Titanic dataset, which is used by many people all over the world. It provides information on…
ai  neural-networks  statistics  dataset 
10 days ago by hay
Simple Housing Price Prediction Using Neural Networks with TensorFlow
A GitHub repository with a simple housing-price prediction example using neural networks in TensorFlow.
neural-networks  machine-learning  ai 
10 days ago by hay
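A minimal sketch of the kind of model the repository title describes, using synthetic data (an assumption; the repo's actual code is not reproduced): a small feed-forward network regressing prices with tf.keras.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 3)).astype("float32")  # e.g. area, rooms, age
y = X @ np.array([3.0, 2.0, -1.0], dtype="float32") + 0.1 * rng.normal(size=500)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),                     # linear output for regression
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, verbose=0)
print(model.evaluate(X, y, verbose=0))            # final mean-squared error
```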
wondonghyeon/face-classification: Face model to classify gender and race. Trained on LFWA+ Dataset.
A face model that classifies gender and race, trained on the LFWA+ dataset.
face  neural-networks  machine-learning  ai 
10 days ago by hay
Neural Network Architectures – Towards Data Science
Deep neural networks and Deep Learning are powerful and popular algorithms, and a lot of their success lies in the careful design of the neural network architecture. I wanted to revisit the history…
deep_learning  cnn  data_science  machine_learning  neural-networks  NeuralNetworks 
24 days ago by tranqy
Let the AI Do the Talk: Adventures with Natural Language Generation - Speaker Deck
Slides of @marcobonzanini at #pyparis: Let the AI Do the Talk: Adventures with Natural Language Generation
nlp  generation  python  neural-networks 
4 weeks ago by ronnix

