# AI News, Machine Learning FAQ

- On Thursday, June 7, 2018

## Machine Learning FAQ

There are three main strategies for reducing the number of features, if necessary, to avoid overfitting (due to the curse of dimensionality) and/or to reduce the computational complexity (i.e., increase the computational efficiency).

Let’s assume the blue samples belong to one class, and the red circles belong to a second class.

Furthermore, we assume that this dataset has too many dimensions (okay, we only have 2 features here, but we need to keep it “simple” for visualization purposes).
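As a minimal sketch of one such strategy (feature extraction), the snippet below builds a hypothetical two-class, two-feature toy dataset standing in for the blue/red samples described above, then projects it onto its first principal component, shrinking two features down to one. The data and class locations are made up for illustration; this is not the original figure's dataset.

```python
import numpy as np

# Hypothetical toy data standing in for the blue/red samples:
# two features, two classes.
rng = np.random.default_rng(0)
blue = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(20, 2))
red = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(20, 2))
X = np.vstack([blue, red])

# Feature-extraction sketch: project onto the top principal
# component, reducing 2 features to 1.
X_centered = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
X_1d = X_centered @ Vt[0]  # scores along the first principal component

print(X.shape, X_1d.shape)  # → (40, 2) (40,)
```

Because the two class clusters are well separated in this toy setup, most of the class information survives the projection to a single dimension.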

- On Thursday, February 21, 2019

**Lecture 15 - Kernel Methods**

Kernel Methods - Extending SVM to infinite-dimensional spaces using the kernel trick, and to non-separable data using soft margins. Lecture 15 of 18 of ...
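The kernel trick mentioned in this lecture can be sketched in a few lines: rather than explicitly mapping points into an infinite-dimensional feature space, one computes inner products there directly via a kernel such as the RBF kernel k(x, z) = exp(-gamma * ||x - z||^2). The example data and the `rbf_gram` helper below are illustrative, not from the lecture.

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    """Pairwise RBF kernel values: inner products in an
    infinite-dimensional feature space, computed implicitly."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T  # pairwise squared distances
    return np.exp(-gamma * d2)

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
K = rbf_gram(X)
# K is symmetric, has ones on the diagonal (k(x, x) = 1), and is
# positive semidefinite -- the defining properties of a valid kernel.
```

An SVM trained with this Gram matrix behaves as if the data had been mapped into that feature space, without ever constructing the mapping.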

**Lecture 04 - Error and Noise**

Error and Noise - The principled choice of error measures. What happens when the target we want to learn is noisy. Lecture 4 of 18 of Caltech's Machine ...

**Lecture 13 - Validation**

Validation - Taking a peek out of sample. Model selection and data contamination. Cross validation. Lecture 13 of 18 of Caltech's Machine Learning Course - CS ...
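The k-fold cross-validation idea from this lecture can be sketched in plain Python: every sample is held out exactly once, so validation scores "peek out of sample" without contaminating the data used to train each model. The `k_fold_indices` helper is an illustrative implementation, not code from the course.

```python
def k_fold_indices(n_samples, k):
    """Split range(n_samples) into k (train, validation) index pairs;
    each index appears in exactly one validation fold."""
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        val = list(range(start, start + size))
        train = [i for i in range(n_samples)
                 if i < start or i >= start + size]
        folds.append((train, val))
        start += size
    return folds

splits = k_fold_indices(10, 3)
# Fold sizes are 4, 3, 3; train and validation sets never overlap.
```

Averaging the k validation scores gives a less noisy out-of-sample estimate than a single held-out split, at the cost of training k models.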

**Mod-09 Lec-32 SVM formulation with slack variables; nonlinear SVM classifiers**

Pattern Recognition by Prof. P.S. Sastry, Department of Electronics & Communication Engineering, IISc Bangalore. For more details on NPTEL visit ...

**Anomaly Detection: Algorithms, Explanations, Applications**

Anomaly detection is important for data cleaning, cybersecurity, and robust AI systems. This talk will review recent work in our group on (a) benchmarking ...

**Mod-09 Lec-36 Positive Definite Kernels; RKHS; Representer Theorem**

Pattern Recognition by Prof. P.S. Sastry, Department of Electronics & Communication Engineering, IISc Bangalore. For more details on NPTEL visit ...

**Lecture 18 - Epilogue**

Epilogue - The map of machine learning. Brief views of Bayesian learning and aggregation methods. Lecture 18 of 18 of Caltech's Machine Learning Course ...

**Mod-02 Lec-22 Fisher’s LDA**

Pattern Recognition by Prof. C.A. Murthy & Prof. Sukhendu Das, Department of Computer Science and Engineering, IIT Madras. For more details on NPTEL visit ...
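Fisher's LDA for two classes, the subject of this lecture, can be sketched directly: project onto w = S_w⁻¹(m1 − m2), the direction that maximizes between-class separation relative to within-class scatter. The two Gaussian clusters below are hypothetical illustration data, not from the course.

```python
import numpy as np

# Hypothetical two-class data for illustration.
rng = np.random.default_rng(1)
X1 = rng.normal([0.0, 0.0], 0.5, size=(30, 2))  # class 1
X2 = rng.normal([2.0, 1.0], 0.5, size=(30, 2))  # class 2

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
# Within-class scatter matrix S_w (sum of per-class scatter).
S_w = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
# Fisher direction: w = S_w^{-1} (m1 - m2).
w = np.linalg.solve(S_w, m1 - m2)

# Projections X1 @ w and X2 @ w are well separated along this direction.
```

Unlike PCA, this projection uses the class labels: it picks the one direction along which the two classes are most distinguishable, not the one with the most variance.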

**Principal component analysis**

Principal component analysis (PCA) is a statistical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated ...
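The decorrelation property stated above can be checked numerically: after PCA's orthogonal transformation, the covariance between components vanishes. The correlated toy data below is fabricated for illustration.

```python
import numpy as np

# Build a deliberately correlated pair of features.
rng = np.random.default_rng(2)
x = rng.normal(size=200)
X = np.column_stack([x, 0.8 * x + 0.2 * rng.normal(size=200)])

# PCA via SVD of the centered data: scores = projections onto the
# orthonormal principal directions (rows of Vt).
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T

cov = np.cov(scores, rowvar=False)
# Off-diagonal covariance of the components is ~0: uncorrelated,
# and the first component carries the largest variance.
```

This is exactly the "possibly correlated in, linearly uncorrelated out" conversion the definition describes.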

**Mod-10 Lec-39 Assessing Learnt Classifiers; Cross Validation**

Pattern Recognition by Prof. P.S. Sastry, Department of Electronics & Communication Engineering, IISc Bangalore. For more details on NPTEL visit ...