Support Vector Machines
- AKA Maximal Margin Classifier
- Support vectors are the examples / points on the plot that lie on the edge of their class group, meaning they sit at the border: no instance of the same class occurs beyond them
  - They are the points of each class closest to the hyperplane
- Find the hyperplane that maximizes the distance / margin from the support vectors of each class label
  - This hyperplane will act as our final decision boundary
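A minimal sketch of this idea, assuming scikit-learn is available (the toy data and the large-C approximation of a hard margin are illustrative choices, not part of the notes):

```python
import numpy as np
from sklearn.svm import SVC  # assumed dependency: scikit-learn

# Two small, linearly separable clusters (toy data)
X = np.array([[1.0, 1.0], [1.5, 1.2], [1.2, 1.8],
              [4.0, 4.0], [4.5, 4.2], [4.2, 3.8]])
y = np.array([-1, -1, -1, 1, 1, 1])

# A very large C approximates a hard-margin (maximal margin) SVM
clf = SVC(kernel="linear", C=1e6).fit(X, y)

print(clf.support_vectors_)       # the border points of each class
print(clf.coef_, clf.intercept_)  # w and b of the separating hyperplane
```

The fitted `support_vectors_` are exactly the border examples described above, and `coef_` / `intercept_` define the decision boundary.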
Higher Dimension Problems
- When the data is not linearly separable in its original space, we transform it into a higher-dimensional space where it becomes linearly separable
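A tiny NumPy sketch of this transformation, using a made-up feature map $\phi(x) = (x, x^2)$: 1-D data that no single threshold can separate becomes linearly separable in 2-D:

```python
import numpy as np

# 1-D points whose labels no single threshold can separate
x = np.array([-3.0, -2.0, -0.5, 0.5, 2.0, 3.0])
y = np.array([-1, -1, 1, 1, -1, -1])

# Lift to 2-D with the (illustrative) feature map phi(x) = (x, x^2)
phi = np.column_stack([x, x ** 2])

# In the lifted space, the horizontal line x2 = 2 separates the classes
separable = (phi[:, 1] < 2) == (y == 1)
print(separable.all())  # True
```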
Mathematical Intuition
Equation of Hyperplane
Points that are on the hyperplane satisfy $w \cdot x + b = 0$
Points that are above the hyperplane, AKA belonging to the +ve class, satisfy $w \cdot x + b > 0$
Points that are below the hyperplane, AKA belonging to the -ve class, satisfy $w \cdot x + b < 0$
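The sign check above can be sketched in NumPy; the hyperplane $w = (1, 1)$, $b = -4$ is a made-up example, not derived from any dataset:

```python
import numpy as np

# Made-up example hyperplane: w = (1, 1), b = -4, i.e. x1 + x2 - 4 = 0
w = np.array([1.0, 1.0])
b = -4.0

def side(x):
    """Sign of w.x + b: +1 above the hyperplane, -1 below, 0 on it."""
    return int(np.sign(w @ x + b))

print(side(np.array([3.0, 3.0])))  # +1: w.x + b = 2 > 0
print(side(np.array([1.0, 1.0])))  # -1: w.x + b = -2 < 0
print(side(np.array([2.0, 2.0])))  #  0: exactly on the hyperplane
```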
Distance between two Support Vectors / Edge Points
$d = \frac{2}{\|w\|}$, where $\|w\|$ is the magnitude of the weight vector
Distance of a point from the Hyperplane
$d = \frac{y \, (w \cdot x + b)}{\|w\|}$, where $y$ is the actual label value, $\|w\|$ the weight vector magnitude, and $w \cdot x + b$ the value of the hyperplane equation for point $x$
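A small NumPy sketch of this distance formula, using a made-up hyperplane $w = (1, 1)$, $b = -4$ for illustration:

```python
import numpy as np

# Made-up hyperplane for illustration: w = (1, 1), b = -4
w = np.array([1.0, 1.0])
b = -4.0

def distance(x, label):
    """y * (w.x + b) / ||w||: positive when x lies on its label's side."""
    return label * (w @ x + b) / np.linalg.norm(w)

# (3, 3) with label +1 lies at (3 + 3 - 4) / sqrt(2) = sqrt(2) from the plane
print(distance(np.array([3.0, 3.0]), +1))
```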
For a point lying in Origin
$d = \frac{|b|}{\|w\|}$, where $b$ is the value of the intercept in $w \cdot x + b$, since substituting $x = 0$ gives $w \cdot x + b = b$
Canonical Hyperplane
We need to find a scaling factor $s$ for the points that are support vectors, such that for them the equation $s \, y_i (w \cdot x_i + b)$ will always give $1$
$y (w \cdot x + b)$ is the absolute distance without any normalisation, since we are not dividing by $\|w\|$
Our goal is to make this absolute distance equal to $1$ between the hyperplane and the support vectors
We then multiply $s$ with our default hyperplane equation $w \cdot x + b$. Thus points that are support vectors will have absolute distance $1$ from the hyperplane
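As a check of this canonical scaling, a sketch (assuming scikit-learn is available; the four toy points are illustrative) that fits a near hard-margin linear SVM and inspects $y_i (w \cdot x_i + b)$ at the support vectors:

```python
import numpy as np
from sklearn.svm import SVC  # assumed dependency: scikit-learn

# Toy separable data
X = np.array([[1.0, 1.0], [2.0, 0.0], [4.0, 4.0], [5.0, 5.0]])
y = np.array([-1, -1, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # large C ~ hard margin
w, b = clf.coef_[0], clf.intercept_[0]

# Under the canonical scaling, y_i (w . x_i + b) is ~1 at each support vector
for i in clf.support_:
    print(y[i] * (w @ X[i] + b))

print(2 / np.linalg.norm(w))  # margin width 2 / ||w||
```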