
Modern Machine Learning Algorithms: Strengths and Weaknesses

In this guide, we’ll take a practical, concise tour through modern machine learning algorithms.

For example, Scikit-Learn’s documentation groups algorithms by their learning mechanism, producing categories such as linear models, support vector machines, nearest neighbors, decision trees, and neural networks. However, from our experience, this isn’t always the most practical way to group algorithms.

That’s because for applied machine learning, you’re usually not thinking, “boy do I want to train a support vector machine today!”

Of course, the algorithms you try must be appropriate for your problem, which is where picking the right machine learning task comes in.

As an analogy, if you need to clean your house, you might use a vacuum, a broom, or a mop, but you wouldn't bust out a shovel and start digging.

Instead, we’ll group algorithms by machine learning task. In Part 2, we will cover dimensionality reduction.

Regression is the supervised learning task for modeling and predicting continuous, numeric variables. Examples include predicting real-estate prices, stock price movements, or student test scores.
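To make the regression task concrete, here is a minimal sketch using scikit-learn's LinearRegression on synthetic data; the features and coefficients below are placeholders we made up for illustration, not examples from this guide:

```python
# Minimal regression sketch on synthetic data (illustrative only).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                      # e.g. size, rooms, age of a property
y = X @ np.array([50.0, 10.0, -5.0]) + rng.normal(scale=5.0, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print(model.predict(X_test[:5]))                   # continuous, numeric predictions
```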

Tree-based methods (e.g. decision trees) learn in a hierarchical fashion by repeatedly splitting your dataset into separate branches that maximize the information gain of each split.

We won't go into their underlying mechanics here, but in practice, random forests (RFs) often perform very well out-of-the-box, while gradient boosted machines (GBMs) are harder to tune but tend to have higher performance ceilings.
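As a rough illustration (with settings we chose ourselves, not a recommended configuration), trying both ensembles side by side in scikit-learn could look like this:

```python
# Compare a random forest and a gradient boosted ensemble on synthetic data.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)

rf = RandomForestRegressor(n_estimators=200, random_state=0)        # strong out-of-the-box
gbm = GradientBoostingRegressor(n_estimators=200, learning_rate=0.1,
                                random_state=0)                     # usually needs more tuning

for name, model in [("RF", rf), ("GBM", gbm)]:
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean R^2 = {score:.3f}")
```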

Deep learning models use 'hidden layers' between inputs and outputs in order to model intermediary representations of the data that other algorithms cannot easily learn.

However, deep learning still requires much more data to train than other algorithms because the models have orders of magnitude more parameters to estimate.
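For a small, hedged sketch of the 'hidden layers' idea, scikit-learn's MLPRegressor works on toy data; real deep learning projects would typically reach for a dedicated framework and far more data:

```python
# A tiny multi-layer perceptron: two hidden layers between inputs and outputs.
from sklearn.datasets import make_regression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_regression(n_samples=1000, n_features=20, noise=5.0, random_state=0)

mlp = make_pipeline(
    StandardScaler(),                                  # neural nets train better on scaled inputs
    MLPRegressor(hidden_layer_sizes=(64, 32),          # the 'hidden layers' of intermediary features
                 max_iter=2000, random_state=0),
)
mlp.fit(X, y)
print(mlp.predict(X[:3]))
```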

Nearest-neighbors algorithms are memory-intensive, perform poorly for high-dimensional data, and require a meaningful distance function to calculate similarity.
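Here is a minimal sketch of the nearest-neighbors idea, assuming a k-nearest-neighbors regressor with Euclidean distance (both choices are ours, for illustration only):

```python
# k-nearest neighbors: predictions come from averaging the closest training points.
from sklearn.datasets import make_regression
from sklearn.neighbors import KNeighborsRegressor

X, y = make_regression(n_samples=300, n_features=5, noise=5.0, random_state=0)

# The whole training set stays in memory, and 'metric' is the distance function.
knn = KNeighborsRegressor(n_neighbors=5, metric="euclidean")
knn.fit(X, y)
print(knn.predict(X[:3]))
```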

Classification is the supervised learning task for modeling and predicting categorical variables. Examples include predicting employee churn, email spam, financial fraud, or student letter grades.

In logistic regression, predictions are mapped to be between 0 and 1 through the logistic function, which means they can be interpreted as class probabilities.

The models themselves are still 'linear,' so they work well when your classes are linearly separable (i.e. they can be separated by a single decision surface).
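As a short, illustrative sketch (the synthetic data and settings are our own), here is a logistic regression in scikit-learn, including the class probabilities mentioned above:

```python
# Logistic regression: a linear decision boundary with probabilistic outputs.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=4, n_informative=3,
                           n_redundant=0, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict_proba(X[:3]))   # class probabilities in [0, 1] from the logistic function
print(clf.predict(X[:3]))         # hard labels obtained by thresholding those probabilities
```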

Naive Bayes keeps what is essentially a 'probability table' estimated from your training data; to predict a new observation, you'd simply 'look up' the class probabilities in that table based on its feature values.
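To illustrate that 'look up' behavior, here is a sketch using Gaussian Naive Bayes in scikit-learn; the Gaussian variant is just one convenient choice for numeric features, not something this guide prescribes:

```python
# Naive Bayes: per-class feature statistics act like a probability table.
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=400, n_features=6, random_state=0)

nb = GaussianNB().fit(X, y)      # training amounts to estimating per-class statistics
print(nb.predict_proba(X[:3]))   # 'looked up' class probabilities for new observations
print(nb.predict(X[:3]))
```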

If you'd like to learn more about the applied machine learning workflow and how to efficiently train professional-grade models, we invite you to check out our Data Science Primer.

For more over-the-shoulder guidance, we also offer a comprehensive masterclass that further explains the intuition behind many of these algorithms and teaches you how to apply them to real-world problems.
