AI News, The one machine learning concept you need to know
- On Sunday, June 3, 2018
In order to learn a technical subject, it pays off to have a solid understanding of the conceptual framework that underlies that subject.
What I’m suggesting is that before you really dive into the details, you need a solid, intuitive understanding of what you’re actually trying to do when you apply a machine learning algorithm.
We’re going to perform some basic machine learning with this data, and in doing this, I want to help you understand an important concept.
In machine learning, the variables that we use as inputs to our algorithms are commonly called input variables, but they are also frequently referred to as predictors or features (the terms are more or less interchangeable).
In ML, the variable that we’re trying to predict is commonly called the target variable, but it is also called an output variable or a response variable (again, the terms are largely interchangeable).
As I just mentioned, input_var is the independent input variable and target_var is the dependent target variable that we’re going to predict.
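The post’s data-generating code isn’t reproduced in this excerpt, and the true underlying function is kept hidden until the end. Purely so the later code has something to run on, here is a hypothetical stand-in (the noisy sine wave is an assumption, not the post’s actual function):

```r
# Hypothetical stand-in for the post's hidden data-generating process.
# The real underlying function is not shown in this excerpt; a noisy
# sine wave is used here only as an illustration.
set.seed(1)

input_var  <- runif(28, min = 0, max = 6)            # 28 observations
target_var <- sin(input_var) + rnorm(28, sd = 0.2)   # hidden f() plus noise

df.unknown_fxn_data <- data.frame(input_var, target_var)
```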
(Note: if you’re not familiar with how ggplot works, and how to create scatterplots in ggplot, you can find more information in a separate blog post. To be honest, knowledge of data visualization and exploratory data analysis is a prerequisite for doing machine learning; if you don’t know data visualization and EDA, you might want to start there.) Ok, here’s the code to create a scatterplot of our data using ggplot:
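The original code block didn’t survive in this excerpt; a minimal version, assuming the dataset name df.unknown_fxn_data and the variable names input_var and target_var mentioned in the text:

```r
library(ggplot2)

# Scatterplot of the raw training data: input on x, target on y.
ggplot(data = df.unknown_fxn_data, aes(x = input_var, y = target_var)) +
  geom_point()
```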
I want to pause here for a moment and bring us back to our original question: “what is the core task of machine learning?”
data = indicates that we’re going to build the model from the training observations in the df.unknown_fxn_data dataset.
(If you had a dataset with several predictor variables, you could use several of them in the formula by separating each predictor with a + sign.) Finally, method = “lm” tells train() to fit a linear model.
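Putting those arguments together, the train() call being described would look roughly like this (the exact code isn’t shown in this excerpt, and model.lm is an assumed object name):

```r
library(caret)

# Fit a linear model: target_var modeled as a function of input_var.
# method = "lm" selects ordinary least-squares linear regression.
model.lm <- train(target_var ~ input_var,
                  data = df.unknown_fxn_data,
                  method = "lm")
```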
That is, we’re going to fit a straight line to these data of the form y = β₁x + β₀, where β₁ is the slope and β₀ is the intercept.
Now that we’ve built a simple linear model using the train() function, let’s plot the data and plot the linear model on top of it.
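A sketch of that plot, again assuming the names above; geom_smooth(method = “lm”) refits the same straight line and layers it over the points:

```r
library(ggplot2)

# Training data plus the fitted straight line on top.
ggplot(data = df.unknown_fxn_data, aes(x = input_var, y = target_var)) +
  geom_point() +
  geom_smooth(method = "lm", se = FALSE)
```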
So ultimately, when we began this exercise, there was a hidden function that generated the data, and we used machine learning to estimate that function.
The machine learning method used only the 28 data points in our dataset to select an estimated function that approximates the underlying function.
When you’re doing machine learning (specifically, supervised learning), you’re essentially using computational techniques to reverse engineer the underlying function from the data points alone.
In the exercise in this blog post, I intentionally kept the underlying function hidden from you, because we never know the underlying function (I only revealed it at the end to help drive home the concept).
As I mentioned in the beginning of the blog post, the real secret to mastering a technical subject is developing intuition for each major concept.
- On Monday, September 23, 2019
Gradient descent, how neural networks learn | Chapter 2, deep learning
But what *is* a Neural Network? | Chapter 1, deep learning
Machine Learning Audiobook | Ethem Alpaydi
Narrated by Steven Menasche. Duration: 4 hours 25 minutes 30 seconds. Today, machine learning ..
Alan Turing: Crash Course Computer Science #15
Today we're going to take a step back from programming and discuss the person who formulated many of the theoretical concepts that underlie modern ...
Avi Goldfarb & Ajay Agrawal: "Prediction Machines: The Simple Economics of AI" | Talks at Google
The idea of artificial intelligence--job-killing robots, self-driving cars, and self-managing organizations--captures the imagination, evoking a combination of ...
How Computers Calculate - the ALU: Crash Course Computer Science #5
Today we're going to talk about a fundamental part of all modern computers.
Intro - The Math of Intelligence
Welcome to The Math of Intelligence! In this 3 month course, we'll cover the most fundamental math concepts in Machine Learning. In this first lesson, we'll go ...
Compression: Crash Course Computer Science #21
So last episode we ..
7. ChIP-seq Analysis; DNA-protein Interactions
MIT 7.91J Foundations of Computational and Systems Biology, Spring 2014. Instructor: David Gifford. In ..
Seminar 9: Surya Ganguli - Statistical Physics of Deep Learning
MIT RES.9-003 Brains, Minds and Machines Summer Course, Summer 2015. Instructor: Surya ..