Sunday, January 8, 2017

Eigenfaces

In this post we’ll talk about the application of principal component analysis (PCA) to face recognition.

Eigenvectors as directions of variation

Given a $d$-dimensional dataset of $n$ samples $x_1, \dots, x_n \in \mathbb{R}^d$, each sample being a flattened face photo, we would like to find the unit vectors $u \in \mathbb{R}^d$ along which the dataset varies the most around the mean $\mu = \frac{1}{n}\sum_{i=1}^n x_i$.

To simplify the matter, let’s assume that the dataset has been standardized as $x_i \leftarrow x_i - \mu$ and stacked as the rows of a matrix $X \in \mathbb{R}^{n \times d}$.

Suppose we have such a unit vector $u$; then the projection of a sample $x$ on $u$ would be $(x^\top u)\,u$, so the coefficient is $x^\top u$.

The variance along $u$:

$$\sigma^2(u) = \frac{1}{n}\sum_{i=1}^n (x_i^\top u)^2 = u^\top \left(\frac{1}{n} X^\top X\right) u = u^\top S u,$$

where $S = \frac{1}{n} X^\top X$ is the covariance matrix of the dataset.

To maximize $u^\top S u$ subject to $\|u\| = 1$ (a Rayleigh quotient), $u$ needs to be the eigenvector that corresponds to the largest eigenvalue of $S$; the next directions of variation are the eigenvectors of the next-largest eigenvalues.
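
As a sanity check, here is a minimal sketch in NumPy (the data and shapes are hypothetical stand-ins, not real face photos):

```python
import numpy as np

# Hypothetical stand-in data: 100 "photos" of dimension 50.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 50))
X = X - X.mean(axis=0)                 # standardize: subtract the mean

S = (X.T @ X) / len(X)                 # covariance matrix, (d, d)
eigvals, eigvecs = np.linalg.eigh(S)   # eigh: ascending eigenvalues of a symmetric matrix
u = eigvecs[:, -1]                     # unit eigenvector of the largest eigenvalue

# The variance along u is u^T S u, which equals the largest eigenvalue.
assert np.isclose(u @ S @ u, eigvals[-1])
```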

Dimensionality trick

With large images $d$ is going to be large (a $256 \times 256$ photo already gives $d = 65{,}536$), so $S$ is a huge $d \times d$ matrix, which poses a numerical difficulty when you solve for the eigenvectors. There is a neat trick to overcome this.

We have $S = \frac{1}{n} X^\top X$ and want to find the eigenvectors of this $d \times d$ matrix; considering that $n \ll d$, we try to find the eigenvectors of the much smaller $n \times n$ matrix $X X^\top$ first:

$$X X^\top v = \lambda v \implies X^\top X \,(X^\top v) = \lambda\, (X^\top v).$$

Thus we find an eigenvector $v$ of $X X^\top$, transform it by $X^\top$, and get $u = X^\top v$, an eigenvector of $X^\top X$ (and therefore of $S$); renormalize it to unit length.
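
A minimal sketch of the trick, again with hypothetical stand-in data where $n \ll d$:

```python
import numpy as np

# Hypothetical: 20 photos of 64x64 pixels, so n=20 and d=4096.
rng = np.random.default_rng(0)
n, d = 20, 4096
X = rng.standard_normal((n, d))
X = X - X.mean(axis=0)

small = X @ X.T                         # (n, n) instead of (d, d)
eigvals, V = np.linalg.eigh(small)      # cheap to diagonalize

U = X.T @ V                             # columns are eigenvectors of X^T X
U = U / np.linalg.norm(U, axis=0)       # renormalize each column to unit length

# Check: the last column is an eigenvector of X^T X with the largest eigenvalue.
u, lam = U[:, -1], eigvals[-1]
assert np.allclose((X.T @ X) @ u, lam * u)
```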

Eigenfaces

The eigenvectors thus obtained can themselves be rendered as face-like photos:

*(image: the leading eigenfaces rendered as grayscale photos)*

Face reconstruction from eigenfaces

To reconstruct the face photos (approximately) from the top $k$ eigenvectors, do this:

$$X \approx (X U)\, U^\top,$$

where each column in $U \in \mathbb{R}^{d \times k}$ is an eigenface and $XU$ gives you the coefficients.
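
A minimal sketch of the reconstruction; here the eigenfaces are computed via SVD for brevity (the right singular vectors of $X$ are the eigenvectors of $X^\top X$), and the data is again a hypothetical stand-in:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 20, 4096, 10
X = rng.standard_normal((n, d))
X = X - X.mean(axis=0)

_, _, Vt = np.linalg.svd(X, full_matrices=False)
U = Vt[:k].T                   # (d, k): top-k eigenfaces as columns

coeffs = X @ U                 # (n, k): each row holds one photo's coefficients
X_hat = coeffs @ U.T           # (n, d): approximate reconstruction

print(np.linalg.norm(X - X_hat) / np.linalg.norm(X))  # relative reconstruction error
```

With real face photos the top eigenfaces capture most of the variance, so the error is far smaller than it is for this random data; remember to add $\mu$ back before viewing the reconstructed images.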

Face space

But we don’t have to reconstruct the faces in order to analyze them: $Y = XU$ gives us a new dataset with its dimensionality reduced from $d$ to $k$; this $k$-dimensional space is called the face space.

Given a new sample $x$, a simplified face recognition procedure would be (see the sketch after the list):

  • Project $x$ into the face space: $y = (x - \mu)^\top U$.
  • Find the most similar row to $y$ in $Y$, the reduced training data (one nearest neighbor).
  • That’s it!
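
Putting it together, a minimal end-to-end sketch (the photos, labels, and `recognize` helper are hypothetical stand-ins):

```python
import numpy as np

# Hypothetical training data: 20 flattened photos, one identity each.
rng = np.random.default_rng(0)
n, d, k = 20, 4096, 10
faces = rng.standard_normal((n, d))
labels = np.arange(n)

mu = faces.mean(axis=0)
X = faces - mu                               # standardize
_, _, Vt = np.linalg.svd(X, full_matrices=False)
U = Vt[:k].T                                 # (d, k) eigenfaces
Y = X @ U                                    # training data in face space, (n, k)

def recognize(x):
    """Project a new photo into the face space and return the 1-NN label."""
    y = (x - mu) @ U                         # (k,) coefficients
    dists = np.linalg.norm(Y - y, axis=1)    # distance to every training row
    return labels[np.argmin(dists)]

print(recognize(faces[3]))                   # prints 3: it finds itself
```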
