eigenSPACE


MAS ETH ARCH/CAAD
Prof. Dr. Ludger Hovestadt
Dr. phil. Vera Bühlmann
MAS Thesis eigenSPACE
ALEKSANDAR LALOVIC




Contents:

Introduction
PCA - principal component analysis
Programming
input DATA
eigenIMAGE 0
eigenIMAGE 1
eigenIMAGE 2
eigenIMAGE 3
eigenIMAGE 4
eigenIMAGE 5
Resources




Introduction

PCA (principal component analysis) has been used extensively in facial recognition, and although the procedure has been well defined and familiar for some time, the images generated along the way have mostly been a by-product rather than the focus. The goal of this research is to use PCA to synthesize images of space or objects that could be of use for analytical or design purposes. Since the algorithm for such a procedure is more or less defined, the key issues for synthesizing such images become optimizing the procedure to produce high-resolution output, and shaping the input data stream. By performing PCA on a data set consisting only of elements of the same or a similar type, the resulting eigenimages could be said to represent the predominant character, or quality, of the data set in question. So instead of describing a notion of style as a set of individual characteristic features (elements, proportions, colours etc.), a single impression-image could be created which can be assumed to contain all of the relevant qualities of the chosen data set. On the other hand, instead of analysing, PCA could potentially be used with the intention of synthesizing new content. In this approach it would be important to form a data set out of elements belonging to different types of data. The eigenimages resulting from such a PCA would be hybrids of the chosen input types.

Figure 1 Set of eigenFaces http://blog.finalevil.com/2008/07/2.html



The current case study deals with corridors, trying to capture and compress the diversity in character of all of these elements belonging to the same type into a single element, an eigenCoridor. It starts by performing a spatial compression of three dimensions into two, volume into plane, space into an image. Based upon these images, eigenImages are calculated, which could be perceived as a new generation of hybrid images, where every eigenIMAGE contains a certain percentage of every input image. These eigenImages are then animated into a sequence simulating an eigenCoridor. Familiar data compression and encoding procedures (images, video, sound etc.) are conducted with the intention of preserving the same or a similar meaning while saving resources. During such procedures data usually appears unreadable in its encoded form, and needs to be decoded before it can be used. EigenImages, although they appear encoded or compressed as a by-product of PCA, are actually readable, and as such represent a new value. Such a procedure could be considered an image synthesizing tool, and these eigenIMAGEs could potentially be used to create a space fundamentally different from its already familiar input elements. One of the qualities of PCA is that it is lossless, as long as all principal components are retained, meaning that all of the processed data is preserved and can easily be retrieved. It is also automated, and doesn't rely on our ability to comprehend or manage data manually. Therefore the amount of data (images) we can input is limited only by our memory and computation resources. Such an amount of images would be very difficult to combine manually, and doing so would usually result in loss of data.
The following example is pixel based, meaning that the calculation is performed on grayscale values of every pixel, ranging from 0 to 255. But since it is becoming simpler and more affordable to collect and store 3D data (for example with Kinect, a motion sensing input device by Microsoft), it is easy to imagine processing 3D point coordinates with PCA, with an actual 3D model as the final result. Such a calculation would not be significantly more demanding in terms of computation time or memory.
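The per-pixel reading mentioned above can be sketched as a small stand-alone helper (the class and method names here are hypothetical, but the luminance weights 0.3, 0.59 and 0.11 are the ones used later in the Data class):

```java
// Hypothetical sketch: packing an 8-bit RGB pixel into a single
// grayscale value in the 0-255 range, as described in the text.
public class GraySketch {

    // Weighted luminance conversion of 8-bit RGB components.
    public static int toGray(int r, int g, int b) {
        return (int) (0.3 * r + 0.59 * g + 0.11 * b);
    }

    public static void main(String[] args) {
        System.out.println(toGray(0, 0, 0));   // pure black maps to 0
        System.out.println(toGray(255, 0, 0)); // red alone contributes roughly 30 percent
    }
}
```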



PCA - principal component analysis

PCA1 is referred to as a procedure for finding patterns in data of high dimension and expressing them in such a way as to highlight their similarities and differences. It is mathematically defined as an orthogonal linear transformation that transforms the data to a new coordinate system such that the greatest variance by any projection of the data comes to lie on the first coordinate (the first principal component), the second greatest variance on the second coordinate, and so on.

Data set

In general, for PCA to work, input samples have to be of the same size. Since in this case images are used as input, this means that:
• image resolution should be the same
• image perspectives should be as closely aligned as possible
• lighting conditions should match as closely as possible
Once these criteria have been met, grayscale values in a 0 to 255 range are read from each pixel within each image and combined into a single data matrix. This matrix is then used to calculate covariance values between dimensions (images).

Covariance

Covariance is measured between two dimensions and is used to find out how much the dimensions vary from the mean with respect to each other. Since covariance is always measured between two dimensions, a data set with more than two dimensions allows more than one covariance measurement to be calculated. In that case we calculate all of the covariance values and put them into a matrix, where each entry is the result of calculating the covariance between two separate dimensions. For an n-dimensional data set the matrix has n rows and n columns (a square matrix). This is called the covariance matrix. Based upon these covariance matrices, eigenvectors are calculated. There are many programming libraries available on-line which are able to perform this calculation efficiently; the one used in this particular case was JMathTools2.

1 For a more detailed explanation of PCA refer to: Lindsay I Smith, A tutorial on Principal Components Analysis (February 26, 2002) http://www.cs.otago.ac.nz/cosc453/student_tutorials/principal_components.pdf
2 http://jmathtools.berlios.de/
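The covariance step described above can be sketched in plain Java. This is an illustrative stand-alone version, not the thesis code itself, which delegates the calculation to JMathTools; each row of `data` stands for one dimension (one image's pixel stream):

```java
// Illustrative sketch of the covariance matrix construction described in the text.
public class CovarianceSketch {

    // Sample covariance between two equally long value series.
    public static double cov(double[] a, double[] b) {
        double meanA = 0, meanB = 0;
        for (int i = 0; i < a.length; i++) {
            meanA += a[i];
            meanB += b[i];
        }
        meanA /= a.length;
        meanB /= b.length;
        double sum = 0;
        for (int i = 0; i < a.length; i++) {
            sum += (a[i] - meanA) * (b[i] - meanB);
        }
        return sum / (a.length - 1); // sample covariance (n - 1 denominator)
    }

    // n x n covariance matrix for n dimensions stored as the rows of data.
    public static double[][] covMatrix(double[][] data) {
        int n = data.length;
        double[][] c = new double[n][n];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                c[i][j] = cov(data[i], data[j]);
        return c;
    }
}
```

The diagonal entries are the variances of the individual dimensions, and the matrix is symmetric, since cov(a, b) = cov(b, a).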



EigenVector

EigenVectors at this point represent the origin and orientation of the new transformed coordinate system. For an n-dimensional data set, n eigenvectors with n corresponding eigenvalues can be calculated. Each eigenvector has a corresponding eigenvalue. Eigenvalues sort the eigenvectors by importance and determine which vectors describe the data set most closely. The eigenvector with the highest eigenvalue is the principal component of the data set.

Eigenimage

Eigenimages represent the original image pixel data read from the new transformed coordinate system. This data is calculated by multiplying two matrices, one containing the original data set and the other the eigenvectors. Figure 2 shows PCA performed on a two-dimensional data set where the x and y dimensions are two input images. Every nth point within the graph represents the nth pixel of both images, where its x and y values are in fact the grayscale values of the nth pixel in the first and the second image respectively.
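For the two-image case of Figure 2, the eigen-decomposition can even be written in closed form: a symmetric 2x2 covariance matrix [[a, b], [b, c]] has eigenvalues (a+c)/2 ± sqrt(((a-c)/2)² + b²). The sketch below is a hypothetical illustration of what the decomposition returns; the thesis itself delegates this step to Jama via JMathTools:

```java
// Closed-form eigen-decomposition of a symmetric 2x2 matrix [[a, b], [b, c]].
public class EigenSketch2D {

    // Returns {lambda1, lambda2} with lambda1 >= lambda2.
    public static double[] eigenvalues(double a, double b, double c) {
        double m = (a + c) / 2.0;                              // mean of the diagonal
        double d = Math.sqrt((a - c) * (a - c) / 4.0 + b * b); // half-distance between eigenvalues
        return new double[]{m + d, m - d};
    }

    // Unit eigenvector for a given eigenvalue lambda.
    public static double[] eigenvector(double a, double b, double c, double lambda) {
        double x, y;
        if (b != 0) {
            x = b;
            y = lambda - a;
        } else { // already diagonal: eigenvectors are the coordinate axes
            x = (a == lambda) ? 1 : 0;
            y = (a == lambda) ? 0 : 1;
        }
        double len = Math.sqrt(x * x + y * y);
        return new double[]{x / len, y / len};
    }

    public static void main(String[] args) {
        double[] ev = eigenvalues(2, 1, 2);
        System.out.println(ev[0]); // prints 3.0
        System.out.println(ev[1]); // prints 1.0
    }
}
```

The eigenvector belonging to the larger eigenvalue is the red principal axis in Figure 2; the second, orthogonal one is the green axis.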


Figure 2 Two-dimensional PCA with eigenVectors in red and green



Programming

The PCA simulation has been programmed in Java using two additional libraries:
• JMathTools1, a collection of independent packages containing linear algebra methods as well as basic matrix operations.
• Processing2, an open source programming language, used for operations with images and pixel based calculations.

The actual Java code consists of 3 classes:
• MainApplet - the main function body, containing calls to the other classes; it also draws the eigenImages from the calculated data.
• Data - Input: a folder containing consecutively named (0, 1, 2, 3, 4 … n) JPEG images of the same resolution. Output: a two-dimensional array of data, with the nth column containing all of the pixels of the nth image, and each row containing the nth pixel of every image.
• PCA - Input: the two-dimensional array of data. Output: eigenvectors and eigenvalues.

1 http://jmathtools.berlios.de/
2 http://www.processing.org/


MainApplet class

package eigen_Image;

import static org.math.array.DoubleArray.transpose;
import static org.math.array.LinearAlgebra.times;
import processing.core.PApplet;

public class MainApplet extends PApplet {

    int col;
    int row;
    int eigenNo = 4;
    int imgWidth = 300;
    int numberOfImages = 4;
    double[][] finalData;

    public void setup() {
        size(1200, 400);

        //_______________________________________________________ load data
        Data rawImages = new Data(numberOfImages);
        rawImages.loadData();
        double[][] rawData = rawImages.getPixelValues();
        double[][] pixData = transpose(rawData);

        //__________________________________________________ perform PCA
        PCA pca = new PCA(pixData);
        pca.showPCA();

        //______________________________________ multiply data by eigenVECTOR
        double[][] fin = pca.out;
        finalData = times(fin, rawData);

        //________________________________________________ draw eigenIMAGE
        int mov = 0;
        for (int j = 0; j < eigenNo; j++) {
            col = 0;
            row = 0;
            pushMatrix();
            translate(mov, 0);
            for (int i = 0; i < finalData[0].length; i++) {
                int str = (int) Math.abs(finalData[j][i]); // grayscale intensity
                stroke(str);
                point(col, row);
                col = col + 1;
                if (col == imgWidth) { // wrap to the next pixel row
                    col = 0;
                    row = row + 1;
                }
            }
            mov = mov + imgWidth;
            popMatrix();
        }
    }
}
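The call times(fin, rawData) above projects the raw pixel data onto the eigenvectors. Because the eigenvector matrix of a covariance matrix is orthonormal, multiplying the projected data back by the transpose recovers the original pixels, which is the sense in which the procedure is lossless. A minimal stand-alone sketch of this round trip, using plain arrays and illustrative names rather than the thesis classes:

```java
// Demonstrates losslessness: project data onto an orthonormal matrix U,
// then multiply by its transpose to recover the original values.
public class RoundTripSketch {

    // Naive matrix product a (n x k) times b (k x m).
    public static double[][] times(double[][] a, double[][] b) {
        int n = a.length, m = b[0].length, k = b.length;
        double[][] r = new double[n][m];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < m; j++)
                for (int p = 0; p < k; p++)
                    r[i][j] += a[i][p] * b[p][j];
        return r;
    }

    public static double[][] transpose(double[][] a) {
        double[][] t = new double[a[0].length][a.length];
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < a[0].length; j++)
                t[j][i] = a[i][j];
        return t;
    }

    // Project X onto the rows of U, then invert the projection.
    public static double[][] roundTrip(double[][] U, double[][] X) {
        return times(transpose(U), times(U, X));
    }

    public static void main(String[] args) {
        double s = Math.sqrt(0.5);
        double[][] U = {{s, s}, {s, -s}};           // orthonormal 2x2, rows act as eigenvectors
        double[][] X = {{10, 20, 30}, {5, 15, 25}}; // toy "pixel" data: 2 dimensions x 3 pixels
        double[][] back = roundTrip(U, X);
        System.out.println(back[0][0]); // within floating-point error of the original 10.0
    }
}
```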



Data class

package eigen_Image;

import java.awt.Color;
import processing.core.PApplet;
import processing.core.PImage;

public class Data extends PApplet {

    int imgNo;
    PImage curent;
    PImage[] images;
    double[][] pixelValues;

    //__________________________________________________________ class constructor
    Data(int imageNo) {
        imgNo = imageNo;
    }

    //_________________________________________________________________ load data
    public void loadData() {
        images = new PImage[imgNo];
        PImage ct = loadImage("0.jpg"); // first image determines the pixel count
        pixelValues = new double[imgNo][ct.pixels.length];
        for (int i = 0; i < images.length; i++) {
            curent = loadImage(i + ".jpg");
            images[i] = curent;
            for (int j = 0; j < curent.pixels.length; j++) {
                Color col = new Color(curent.pixels[j]);
                // weighted luminance conversion to a 0-255 grayscale value
                int gray = (int) (0.3 * col.getRed() + 0.59 * col.getGreen() + 0.11 * col.getBlue());
                pixelValues[i][j] = gray;
            }
        }
    }

    //___________________________________________________________ getters and setters
    public double[][] getPixelValues() {
        return pixelValues;
    }
}



PCA class

package eigen_Image;

import static org.math.array.LinearAlgebra.*;
import static org.math.array.StatisticSample.*;
import org.math.array.DoubleArray;
import Jama.EigenvalueDecomposition;

public class PCA {

    double[][] X; // initial data: lines = events, columns = variables
    double[] meanX, stdevX;
    double[][] Z; // X centered and reduced
    double[][] cov; // Z covariance matrix
    double[][] U; // projection matrix
    double[] info; // information (eigenvalue) vector
    static double[][] out;

    public PCA(double[][] _X) {
        X = _X;
        stdevX = stddeviation(X);
        meanX = mean(X);
        Z = center_reduce(X);
        cov = covariance(Z);
        EigenvalueDecomposition e = eigen(cov);
        U = transpose(e.getV().getArrayCopy());
        info = e.getRealEigenvalues();
    }

    // ________________ normalization of x relative to the mean and standard deviation of X
    public double[][] center_reduce(double[][] x) {
        double[][] y = new double[x.length][x[0].length];
        for (int i = 0; i < y.length; i++)
            for (int j = 0; j < y[i].length; j++)
                y[i][j] = (x[i][j] - meanX[j]) / stdevX[j];
        return y;
    }

    // _______________________________________________ command line display of results
    public void showPCA() {
        System.out.println("Projection vectors\n" + DoubleArray.toString(transpose(U)));
        System.out.println("Information per projection vector\n" + DoubleArray.toString(info));
        out = U;
    }
}



input DATA




eigenIMAGE 0


eigenIMAGE 1


eigenIMAGE 2


eigenIMAGE 3


eigenIMAGE 4


eigenIMAGE 5


BIBLIOGRAPHY
• Richard Szeliski, Computer Vision: Algorithms and Applications (Springer-Verlag London, 2011)
• Howard Anton, Elementary Linear Algebra (John Wiley & Sons Inc.)
• Lindsay I Smith, A tutorial on Principal Components Analysis (February 26, 2002) http://www.cs.otago.ac.nz/cosc453/student_tutorials/principal_components.pdf

IMAGES
• Figure 1 Set of eigenFaces http://blog.finalevil.com/2008/07/2.html

RESOURCES
• Eclipse, IDE for Java Developers http://eclipse.org/downloads/moreinfo/java.php
• JMathTools, a collection of independent Java packages containing simple linear algebra methods http://jmathtools.berlios.de
• Processing, an open source programming language http://www.processing.org/





ALEKSANDAR LALOVIC
aleksandar_lalovic@yahoo.com
ETH Zürich
CAAD Department
HPZ Floor F
Schafmattstr. 32
8093 Zurich
Switzerland

