Week 8: Support vector machines, nearest neighbours and regularisation
We will cover: support vector machines, nearest neighbours and regularisation.
Where would you put the boundary to classify these two groups?
Here’s where LDA puts the boundary. What’s wrong with it?
Why is this the better fit?
LDA is an example of a classifier that generates a separating hyperplane.
The vector of LDA coefficients defines a line orthogonal to the 1D separating hyperplane, with slope 0.81.
The equation for the separating hyperplane is \(x_2 = mx_1+b\), where \(m = -1.24\), and \(b\) can be solved for by substituting in the point \(\left((\bar{x}_{A1}+\bar{x}_{B1})/2,\ (\bar{x}_{A2}+\bar{x}_{B2})/2\right)\). (The separating hyperplane must pass through the midpoint of the two class means when the prior probabilities of the classes are equal.)
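Not from the slides themselves: a minimal sketch, assuming a synthetic 2D two-class dataset rather than the one plotted above, of how the boundary slope \(m\), the intercept \(b\), and the orthogonal discriminant direction could be recovered from a fitted scikit-learn LDA model. The numbers it prints will differ from the 0.81 and \(-1.24\) quoted above.

```python
# Sketch: recover the LDA boundary slope/intercept and the discriminant direction.
# Assumes synthetic 2D data (columns x1, x2) with binary labels; not the slide's dataset.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X_A = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], size=50)
X_B = rng.multivariate_normal([2, 2], [[1, 0.3], [0.3, 1]], size=50)
X = np.vstack([X_A, X_B])
y = np.array([0] * 50 + [1] * 50)

lda = LinearDiscriminantAnalysis().fit(X, y)

# Discriminant direction (orthogonal to the boundary) and its slope:
w1, w2 = lda.coef_[0]
slope_direction = w2 / w1        # plays the role of 0.81 on the slide

# Boundary slope is the negative reciprocal (orthogonal line):
m = -w1 / w2                     # plays the role of -1.24 on the slide

# With equal priors the boundary passes through the midpoint of the class means,
# so b follows by substituting that point into x2 = m*x1 + b:
midpoint = (X_A.mean(axis=0) + X_B.mean(axis=0)) / 2
b = midpoint[1] - m * midpoint[0]

print(f"direction slope = {slope_direction:.2f}, boundary: x2 = {m:.2f}*x1 + {b:.2f}")
```

Note that the two slopes multiply to \(-1\) (here, \(0.81 \times -1.24 \approx -1\)), which is exactly the orthogonality between the discriminant direction and the separating hyperplane.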
Separating hyperplane produced by LDA on the 4D penguins data, for Gentoo vs Adelie. (The hyperplane itself is 3D.)
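As an illustration (not the slides' own code), here is a sketch of fitting a Gentoo-vs-Adelie LDA on the four numeric penguin measurements, using seaborn's copy of the penguins data; the exact column names and fitted values are assumptions about that copy, not the model behind the figure above.

```python
# Sketch: LDA on the four numeric penguin measurements, Gentoo vs Adelie.
# Uses seaborn's copy of the penguins data, which may differ from the slides' version.
import seaborn as sns
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

penguins = sns.load_dataset("penguins").dropna()
two = penguins[penguins["species"].isin(["Adelie", "Gentoo"])]

features = ["bill_length_mm", "bill_depth_mm", "flipper_length_mm", "body_mass_g"]
X = two[features].to_numpy()
y = (two["species"] == "Gentoo").astype(int)

lda = LinearDiscriminantAnalysis().fit(X, y)

# The four coefficients (plus the intercept) define the separating hyperplane
# w . x + c = 0: a 3D hyperplane sitting inside the 4D feature space.
print(dict(zip(features, lda.coef_[0])))
print("intercept:", lda.intercept_[0])
```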