AI News, NIPS Proceedings

Large-scale L-BFGS using MapReduce

Part of: Advances in Neural Information Processing Systems 27 (NIPS 2014). L-BFGS has been applied as an effective parameter estimation method for various machine learning algorithms since the 1980s.

Second, we propose a new L-BFGS algorithm, called Vector-free L-BFGS, which avoids the expensive dot product operations in the two-loop recursion and greatly improves computational efficiency while exposing a high degree of parallelism.
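
For reference, the classic serial two-loop recursion that the abstract refers to looks roughly like the sketch below; the repeated dot products over full-length vectors are exactly the operations the Vector-free variant reorganizes. This is a generic NumPy sketch of the textbook algorithm, not the paper's code, and all names are illustrative.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Classic L-BFGS two-loop recursion (textbook form).

    s_list[i] = x_{i+1} - x_i and y_list[i] = g_{i+1} - g_i are the m most
    recent curvature pairs, oldest first.  Note how many dot products over
    full-length vectors appear; these are the operations the Vector-free
    variant described above avoids.
    """
    q = np.asarray(grad, dtype=float).copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []

    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        q -= alpha * y
        alphas.append(alpha)

    # Scale by an initial Hessian guess H_0 = gamma * I.
    gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    r = gamma * q

    # Second loop: oldest pair to newest.
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s

    return -r  # quasi-Newton descent direction, -H_k * grad
```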

Lecture 7 | Training Neural Networks II

Lecture 7 continues our discussion of practical issues for training neural networks. We discuss different update rules commonly used to optimize neural networks ...
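
For concreteness, here is a minimal sketch of two update rules typically covered in such a lecture (SGD with momentum, and Adam); the function names and hyperparameter defaults are illustrative, not taken from the lecture.

```python
import numpy as np

def sgd_momentum(w, grad, v, lr=1e-2, mu=0.9):
    """SGD with momentum: accumulate a velocity, then step along it."""
    v = mu * v - lr * grad
    return w + v, v

def adam(w, grad, m, s, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: per-parameter step sizes from first/second moment estimates."""
    m = beta1 * m + (1 - beta1) * grad
    s = beta2 * s + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)               # bias correction, t >= 1
    s_hat = s / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(s_hat) + eps), m, s
```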

6. Advanced Optimization

Video from Coursera - Stanford University - Course: Machine Learning.
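
The idea in that lecture is to hand a cost function and its gradient to an off-the-shelf optimizer rather than hand-tuning gradient descent. A rough Python analogue of the idea, assuming a logistic-regression cost and using SciPy's L-BFGS-B purely as an illustrative stand-in:

```python
import numpy as np
from scipy.optimize import minimize

def cost_and_grad(theta, X, y):
    """Logistic-regression cost and gradient (assumed example problem)."""
    h = 1.0 / (1.0 + np.exp(-(X @ theta)))
    h = np.clip(h, 1e-12, 1 - 1e-12)           # guard the logs
    cost = -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))
    grad = X.T @ (h - y) / len(y)
    return cost, grad

rng = np.random.default_rng(0)
X = np.hstack([np.ones((100, 1)), rng.standard_normal((100, 2))])
y = (X[:, 1] + X[:, 2] > 0).astype(float)

# Hand the cost/gradient pair to a library optimizer instead of tuning
# a learning rate by hand; L-BFGS-B here is an illustrative choice.
res = minimize(cost_and_grad, np.zeros(3), args=(X, y), jac=True, method="L-BFGS-B")
print(res.x, res.fun)
```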

Efficient Second-order Optimization for Machine Learning

Stochastic gradient-based methods are the state-of-the-art in large-scale machine learning optimization due to their extremely efficient per-iteration ...
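
As a generic illustration of the per-iteration contrast (not the algorithm from this talk): a plain Newton step solves a d x d linear system with the Hessian, whereas a stochastic gradient step only touches one sampled gradient. The quadratic below is assumed toy data.

```python
import numpy as np

def newton_step(w, grad_fn, hess_fn):
    """One undamped Newton step: w <- w - H(w)^{-1} g(w)."""
    g = grad_fn(w)
    H = hess_fn(w)
    return w - np.linalg.solve(H, g)           # the d x d solve dominates the cost

# On a convex quadratic f(w) = 0.5 w^T A w - b^T w (assumed toy data),
# a single Newton step already reaches the minimizer A^{-1} b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
w_star = newton_step(np.zeros(2), lambda w: A @ w - b, lambda w: A)
```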

Build an AI Artist - Machine Learning for Hackers #5

Only a few days left to sign up for my Decentralized Applications course! This video will get you up and running with your first AI Artist ...

A Communication-Efficient Parallel Algorithm for Decision Tree

NIPS 2016 Spotlight: A Communication-Efficient Parallel Algorithm for Decision Tree.

UW Allen School Colloquium: David Knowles (Stanford University)

Abstract: Splicing, the cellular process by which "junk" intronic regions are removed from precursor messenger RNA, is tightly regulated in healthy human ...

Stochastic gradient descent

Stochastic gradient descent is a gradient descent optimization method for minimizing an objective function that is written as a sum of differentiable functions.
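
A minimal sketch of that update rule: for an objective F(w) = (1/n) * sum_i f_i(w), each step uses the gradient of a single randomly chosen term instead of the full sum. The callback grad_fi below is an assumed user-supplied function, not part of any particular library.

```python
import numpy as np

def sgd(grad_fi, w0, n, lr=0.1, steps=1000, seed=0):
    """grad_fi(w, i) returns the gradient of the i-th summand f_i at w."""
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        i = rng.integers(n)            # pick one component uniformly at random
        w = w - lr * grad_fi(w, i)     # w <- w - eta * grad f_i(w)
    return w
```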

Mod-06 Lec-16 Quasi-Newton Methods - Rank One Correction, DFP Method

Numerical Optimization by Dr. Shirish K. Shevade, Department of Computer Science and Engineering, IISc Bangalore. For more details on NPTEL visit ...
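
For reference, the DFP method named in the lecture title maintains an approximation H of the inverse Hessian and revises it from the step s = x_{k+1} - x_k and gradient change y = g_{k+1} - g_k. A small NumPy sketch of that standard formula (illustrative, not the lecture's code):

```python
import numpy as np

def dfp_update(H, s, y):
    """DFP update of the inverse-Hessian approximation:
    H+ = H + s s^T / (s^T y) - (H y)(H y)^T / (y^T H y)."""
    Hy = H @ y
    return H + np.outer(s, s) / (s @ y) - np.outer(Hy, Hy) / (y @ Hy)
```

As long as the curvature condition s^T y > 0 holds (for example, under a suitable line search), this update keeps H symmetric positive definite.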