Have You Tried Using a 'Nearest Neighbor Search'?


Roughly a year and a half ago, I had the privilege of taking a graduate 'Introduction to Machine Learning' course under the tutelage of the fantastic Professor Leslie Kaelbling.

While I learned a great deal over the course of the semester, there was one minor point that she made to the class which stuck with me more than I expected it to at the time: before using a really fancy or sophisticated or 'in-vogue' machine learning algorithm to solve your problem, try a simple Nearest Neighbor Search first.
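That advice is easy to act on because the whole algorithm fits in a few lines. The sketch below is a minimal 1-nearest-neighbor classifier over a toy 2-D data set (the data and function names are illustrative, not from the course): to classify a query point, simply return the label of the closest training example under Euclidean distance.

```python
import math

def nearest_neighbor(query, examples):
    """Return the label of the training example closest to `query`.

    `examples` is a list of (point, label) pairs; distance is Euclidean.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # min() scans every example, so this is O(n) per query -- fine for
    # small data sets, which is exactly the regime where 1-NN shines.
    _, label = min(examples, key=lambda ex: dist(query, ex[0]))
    return label

# Toy 2-D training set with two classes.
training = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"),
            ((1.0, 1.0), "B"), ((0.9, 1.1), "B")]

print(nearest_neighbor((0.2, 0.1), training))  # prints "A"
print(nearest_neighbor((1.1, 0.9), training))  # prints "B"
```

There is nothing to train and only one real design choice (the distance function), which is precisely why it makes a good first baseline.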

In addition, if you don't have very many points in your initial data set, the performance of this approach is questionable. (It's worth noting that having few data points in one's training set is already enough to give most machine learning researchers pause.)

Machine learning is, in many ways, a science of comparison. There are often theoretical reasons to prefer one technique over another, but sometimes effectiveness or popularity is reason enough on its own, as has become the case for neural networks.

However, there's certainly a more general lesson to be learned here: in the midst of an age characterized by algorithms of famously 'unreasonable effectiveness', it's important to remember that simpler techniques are still powerful enough to solve many problems.

My job was to detect a vehicle — not a specific type of vehicle and not a make or model of car — a single, specific vehicle that we have in our garage.

My first instinct was to pull up the cutting-edge work in real-time object detection (which I did), get it to work on my own machine (which I also did), and train it on a massive number of images of our particular vehicle (which I was unable to do). Collecting thousands of images on one's own is difficult enough without having to vary backgrounds and lighting conditions to build an effective training set.
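The nearest-neighbor framing sidesteps that data problem: instead of training a detector, you can compare a candidate image's feature vector against a handful of reference images of the one vehicle. The sketch below is hypothetical (the article doesn't describe the actual system), using a coarse color histogram as the feature and L1 distance for the comparison; a real system would use a richer feature, but the structure would be the same.

```python
def color_histogram(pixels, bins=4):
    """Bin a list of (r, g, b) tuples (values 0-255) into a normalized
    flat histogram of bins**3 entries."""
    hist = [0] * (bins ** 3)
    step = 256 // bins
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = sum(hist) or 1
    return [h / total for h in hist]

def l1_distance(h1, h2):
    """Sum of absolute per-bin differences between two histograms."""
    return sum(abs(a - b) for a, b in zip(h1, h2))

def matches_vehicle(candidate, references, threshold=0.5):
    """True if the candidate histogram is within `threshold` of the
    nearest reference histogram -- a 1-NN test against the references."""
    return min(l1_distance(candidate, ref) for ref in references) < threshold

# A few reference histograms from photos of the garage vehicle would be
# precomputed; here a synthetic mostly-red "image" stands in for one.
reference = color_histogram([(200, 10, 10)] * 100)
print(matches_vehicle(color_histogram([(210, 20, 15)] * 100), [reference]))
```

With only a handful of reference images needed, this is exactly the small-data regime where a nearest-neighbor search is viable and a deep detector is not.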

The takeaway here is that although the simpler algorithms may not perform quite as well as the state of the art, the savings in both time and computational cost often outweigh the difficulties associated with more sophisticated solutions.
