AI News, alexandraj777/machine-learning-samples forked from aws-samples/machine-learning-samples
- On Tuesday, June 5, 2018
This sample code builds a hyperparameter optimization pipeline for Amazon Machine Learning using the latest AWS SDK for Python (Boto 3).
If you want to use SigOpt to optimize your hyperparameters faster and better than tuning by hand, sign up for a free trial on our website and grab your API token from your user profile.
This script tunes a linear binary classification model by searching over a manually specified list of hyperparameter values.
Many machine learning models have exposed parameters, commonly known as hyperparameters (Amazon ML sometimes calls them training parameters), that you choose values for before model training begins.
You'll notice that the learning rate is also a hyperparameter of your model, but Amazon ML automatically selects a value for it based on your data, so we can't tune it in this example.
In keeping with best practices for hyperparameter optimization, and to prevent over-fitting of the model, this example actually maximizes the average of k-fold cross validated AUC metrics.
This description skips over the details of how cross validation is performed with Amazon ML, because it is described much better in the README for the K-Fold Cross Validation example that formed the basis for this code sample.
To perform hyperparameter optimization, the script iteratively chooses new values of regularization_type and regularization_amount, evaluates a model with these new hyperparameters on every fold of the data, averages the AUC metrics, and records the performance of each assignment.
Each time the script evaluates a model on a new set of hyperparameters it creates k machine learning models, one for each train datasource, and k evaluations, one for each evaluate datasource, on Amazon ML.
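The tuning loop described above can be sketched in plain Python. This is a simplified illustration, not the sample's actual code: `evaluate_fold` is a hypothetical stand-in for training a model on one fold's training datasource and reading back the AUC of its Amazon ML evaluation.

```python
import statistics

def tune(candidates, evaluate_fold, k):
    """Search over candidate (regularization_type, regularization_amount)
    pairs, scoring each by its k-fold cross-validated average AUC.

    evaluate_fold(reg_type, reg_amount, fold) is assumed to train and
    evaluate one fold on Amazon ML and return that fold's AUC.
    """
    best_params, best_auc = None, float("-inf")
    history = []
    for reg_type, reg_amount in candidates:
        # One model + one evaluation per fold, k in total per assignment.
        fold_aucs = [evaluate_fold(reg_type, reg_amount, fold)
                     for fold in range(k)]
        mean_auc = statistics.mean(fold_aucs)  # cross-validated score
        history.append(((reg_type, reg_amount), mean_auc))
        if mean_auc > best_auc:
            best_params, best_auc = (reg_type, reg_amount), mean_auc
    return best_params, best_auc, history
```

Maximizing the fold-averaged AUC, rather than a single train/test split's AUC, is what guards against overfitting the hyperparameters to one particular split.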
API calls via the Python SDK return quickly, so you can request a datasource, a machine learning model, and an evaluation in sequence while the datasource is still pending!
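Because the create_* calls are asynchronous, one fold's whole pipeline can be requested back-to-back and Amazon ML resolves the dependencies itself. A minimal sketch, assuming `ml` is a Boto 3 `machinelearning` client (the IDs, specs, and helper function here are hypothetical, not the sample's names):

```python
def launch_fold_pipeline(ml, fold_id, train_spec, eval_spec, params):
    """Kick off datasources, model, and evaluation for one fold without
    waiting for any of them to finish. `ml` is assumed to be a Boto 3
    'machinelearning' client (or any stub exposing the same methods)."""
    ml.create_data_source_from_s3(DataSourceId=f"ds-train-{fold_id}",
                                  DataSpec=train_spec,
                                  ComputeStatistics=True)
    ml.create_data_source_from_s3(DataSourceId=f"ds-eval-{fold_id}",
                                  DataSpec=eval_spec,
                                  ComputeStatistics=False)
    # The model can reference the still-pending training datasource.
    ml.create_ml_model(MLModelId=f"ml-{fold_id}",
                       MLModelType="BINARY",
                       Parameters=params,
                       TrainingDataSourceId=f"ds-train-{fold_id}")
    ml.create_evaluation(EvaluationId=f"ev-{fold_id}",
                         MLModelId=f"ml-{fold_id}",
                         EvaluationDataSourceId=f"ds-eval-{fold_id}")
    return f"ev-{fold_id}"
```

Once all k evaluations exist, the script only has to poll for their completion and average the AUCs, rather than serializing k full train/evaluate cycles.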
- On Tuesday, May 26, 2020
Second Order Optimization - The Math of Intelligence #2
Gradient Descent and its variants are very useful, but there exists an entire other class of optimization techniques that aren't as widely understood. We'll learn ...
Lecture 16 | Adversarial Examples and Adversarial Training
In Lecture 16, guest lecturer Ian Goodfellow discusses adversarial examples in deep learning. We discuss why deep networks and other machine learning ...
What's new with Azure Machine Learning : Build 2018
In September, we launched a huge set of updates for Azure Machine Learning to let you manage the end to end lifecycle of Machine Learning Development.
Lecture 8 | Deep Learning Software
In Lecture 8 we discuss the use of different software packages for deep learning, focusing on TensorFlow and PyTorch. We also discuss some differences ...
AI @ Microsoft, How we do it and how you can too! : Build 2018
For the last 3 decades, Microsoft has been powered by Machine Learning. Come to this session for a first time ever, under the hood look at how we use ML to ...
Technology Keynote: Microsoft Azure : Build 2018
Recurrent Neural Network - The Math of Intelligence (Week 5)
Recurrent neural networks let us learn from sequential data (time series, music, audio, video frames, etc.). We're going to build one from scratch in numpy ...
Small Deep Neural Networks - Their Advantages, and Their Design
Deep neural networks (DNNs) have led to significant improvements to the accuracy of machine-learning applications. For many problems, such as object ...
Mozilla's DeepSpeech and Common Voice projects
Open and offline-capable voice recognition for everyone Presented by Tilman Kamp. First presented at FOSDEM, Feb 3, 2018.
On Characterizing the Capacity of Neural Networks using Algebraic Topology
The learnability of different neural architectures can be characterized directly by computable measures of data complexity. In this talk, we reframe the problem of ...