Highlights from ICML 2016

There were walls and walls of multi-armed bandit posters (among other things), but I mostly checked out the neural net and deep learning work.

ResNets and their cousins (like highway networks) succeed essentially because wiggling a little tends to be better than going somewhere entirely different.
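To make that concrete, here's a minimal sketch of the residual idea in Python (numpy only; the block and weights are illustrative, not from any particular paper): the block computes x + f(x), so the learned part f only has to nudge the identity mapping rather than replace it.

    import numpy as np

    def residual_block(x, f):
        # Output is the input plus a learned perturbation: x + f(x).
        # If f outputs ~0, the block is just the identity.
        return x + f(x)

    rng = np.random.default_rng(0)
    W = 0.01 * rng.standard_normal((4, 4))   # small weights: f barely "wiggles" x
    f = lambda x: np.maximum(0.0, x @ W)     # a tiny ReLU layer standing in for f
    x = rng.standard_normal((2, 4))
    print(np.allclose(residual_block(x, f), x, atol=0.1))  # True: output stays near x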

Two papers, A Deep Learning Approach to Unsupervised Ensemble Learning and Discrete Deep Feature Extraction: A Theory and New Architectures, are a little math-heavy but close to interesting ideas about understanding and using the internals of neural networks.

Autoencoding beyond pixels using a learned similarity metric does neat things using the internals of a network for more network work - it might be metaphorically 'reflective' - and comes up with some flashy visual results, too.
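The core trick, roughly: instead of scoring a reconstruction pixel by pixel, score it by how far apart the two images land in the feature space of another network (in the paper, a layer of the GAN discriminator). A toy sketch, with a random frozen feature map standing in for that discriminator layer:

    import numpy as np

    def pixel_loss(x, x_rec):
        return np.mean((x - x_rec) ** 2)          # plain pixel-space distance

    def learned_similarity_loss(x, x_rec, features):
        # Distance measured in another network's feature space instead.
        return np.mean((features(x) - features(x_rec)) ** 2)

    rng = np.random.default_rng(0)
    W = rng.standard_normal((64, 16))
    features = lambda x: np.maximum(0.0, x @ W)   # stand-in for a discriminator layer

    x = rng.standard_normal((1, 64))                # "image"
    x_rec = x + 0.1 * rng.standard_normal((1, 64))  # "reconstruction"
    print(pixel_loss(x, x_rec), learned_similarity_loss(x, x_rec, features))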

Semi-Supervised Learning with Generative Adversarial Networks shows that a fairly natural idea for GANs does in fact work: you can use labels like 'real_a, real_b, real_c, fake' instead of just 'real, fake'.

From the paper: "We show that this method can be used to create a more data-efficient classifier and that it allows for generating higher quality samples than a regular GAN."
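Concretely, the discriminator becomes a (K+1)-way classifier rather than a binary one. A minimal sketch of the labeling scheme (the class names and logits here are made up for illustration):

    import numpy as np

    CLASSES = ["real_a", "real_b", "real_c", "fake"]  # K real classes + 1 fake

    def softmax(z):
        z = z - z.max()
        e = np.exp(z)
        return e / e.sum()

    def discriminator_loss(logits, label):
        # Cross-entropy over K+1 classes: labeled real data targets its own
        # class; samples from the generator target the extra "fake" class.
        return -np.log(softmax(logits)[CLASSES.index(label)])

    logits = np.array([2.0, 0.1, -1.0, 0.3])      # hypothetical discriminator output
    print(discriminator_loss(logits, "real_a"))   # labeled real example
    print(discriminator_loss(logits, "fake"))     # generated example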

Simard showed a quote from a paper: "Convergence may no longer be guaranteed, learning may become prohibitively slow, and final performance after learning may be too poor to be interesting." He said that quote isn't really true.

This is a quote I took down as he was speaking: "Gradient descent is so robust to errors in the gradient, your code probably has bugs, but you don't need to fix them."
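A toy illustration of that sentiment (mine, not from the talk): minimize f(w) = ||w||^2 with a gradient that is both mis-scaled and noisy, as if the gradient code had bugs, and it still heads to the optimum.

    import numpy as np

    rng = np.random.default_rng(0)
    w = rng.standard_normal(5)
    for step in range(500):
        # The correct gradient is 2*w; this one has a wrong constant plus noise.
        buggy_grad = 3.1 * w + 0.05 * rng.standard_normal(5)
        w -= 0.05 * buggy_grad
    print(np.linalg.norm(w))  # ends up near the optimum at w = 0 anyway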

During a panel discussion, Yann LeCun said max pooling makes adversarial training unstable, and so it's better to use strided convolution if you want to downsample.

Later I noticed related sentiment in Karpathy's notes: "Getting rid of pooling. Many people dislike the pooling operation and think that we can get away without it. ... Discarding pooling layers has also been found to be important in training good generative models, such as variational autoencoders (VAEs) or generative adversarial networks (GANs)."
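Both operations halve each spatial dimension; the difference is that max pooling is fixed while a strided convolution has weights the network can learn. A minimal 2D sketch (single channel, 2x2 windows, kernel values illustrative):

    import numpy as np

    def max_pool_2x2(x):
        # Fixed downsampling: take the max of each 2x2 window.
        h, w = x.shape
        return x[:h//2*2, :w//2*2].reshape(h//2, 2, w//2, 2).max(axis=(1, 3))

    def strided_conv_2x2(x, k):
        # Learned downsampling: a 2x2 convolution applied with stride 2.
        h, w = x.shape
        out = np.empty((h // 2, w // 2))
        for i in range(h // 2):
            for j in range(w // 2):
                out[i, j] = np.sum(x[2*i:2*i+2, 2*j:2*j+2] * k)
        return out

    x = np.arange(16.0).reshape(4, 4)
    k = np.full((2, 2), 0.25)        # in a real network this kernel is learned
    print(max_pool_2x2(x))           # 2x2 output
    print(strided_conv_2x2(x, k))    # 2x2 output, but differentiable in k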

Barak Pearlmutter gave a talk about automatic differentiation and a really fast system called VLAD, which I think runs on Stalin (a Scheme compiler).

The next day, Ryan Adams gave a talk on the same topic at the AutoML workshop; it featured a Python autograd package that seemed to have some of the same features (closures, for example) as VLAD.
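For flavor, here's roughly what using the HIPS autograd package looks like, including taking a gradient through a closure (the loss function is my own toy example):

    import autograd.numpy as np   # thinly wrapped numpy
    from autograd import grad

    def make_loss(data):
        def loss(w):
            return np.sum(np.dot(data, w) ** 2)  # closes over `data`
        return loss

    data = np.array([[1.0, 2.0], [3.0, 4.0]])
    loss = make_loss(data)
    print(grad(loss)(np.array([0.5, -0.5])))  # gradient with respect to w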
