International Journal of Advanced Biological and Biomedical Research (ISSN 2383-2762), Sami Publishing Company. Compression of Breast Cancer Images by Principal Component Analysis. Monika Saraswat, A. K. Wadhwani, Manish Dubey — Electrical, RGPV/MITS, Gwalior, M.P., India. Journal Article, 2014-08-17. https://www.ijabbr.com/article_7803_c1bfed921b1cb6eddb3d3d9a00f75143.pdf

The principle of dimensionality reduction with PCA is the representation of the dataset <em>X</em> in terms of the eigenvectors e<sub>i</sub> ∈ R<sup>N</sup> of its covariance matrix. The eigenvectors oriented in the directions of maximum variance of <em>X</em> in <em>R<sup>N</sup></em> carry the most relevant information of <em>X</em>; these eigenvectors are called principal components [8]. Assume that the <em>n</em> images in a set are originally represented in matrix form as U<sub>i</sub> ∈ R<sup>r×c</sup>, i = 1, ..., <em>n</em>, where <em>r</em> and <em>c</em> are, respectively, the number of rows and columns of the matrix. In the vectorized representation (matrix-to-vector alignment), each <em>U<sub>i</sub></em> becomes an N = r × c dimensional vector <em>a<sub>i</sub></em>, computed by sequentially concatenating all of the rows of the matrix <em>U<sub>i</sub></em>. To compute the principal components, the covariance matrix of <em>U</em> is formed and its eigenvalues, with the corresponding eigenvectors, are evaluated. The eigenvectors form a set of linearly independent vectors, i.e., the basis {φ<sub>i</sub>}<sup>n</sup><sub>i=1</sub>, which constitutes a new axis system [10].
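The procedure described above can be sketched in NumPy as follows. This is a minimal illustration, not the authors' implementation: the function names (`pca_basis`, `compress`, `reconstruct`) are hypothetical, and row-major flattening is assumed for the matrix-to-vector alignment, matching the row-concatenation described in the text.

```python
import numpy as np

def pca_basis(images):
    """Compute the principal components of a set of images.

    images: list of n matrices U_i, each r x c. Each U_i is flattened
    row-major into an N = r*c dimensional vector a_i.
    """
    # Stack the vectorized images as rows of an n x N data matrix.
    A = np.stack([U.ravel() for U in images])
    mean = A.mean(axis=0)
    centered = A - mean
    # N x N covariance matrix of the dataset.
    cov = np.cov(centered, rowvar=False)
    # eigh returns eigenvalues in ascending order for a symmetric matrix;
    # reorder so the leading eigenvectors carry the maximum variance.
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]
    return eigvals[order], eigvecs[:, order], mean

def compress(image, eigvecs, mean, k):
    """Project one image onto the first k principal components."""
    return eigvecs[:, :k].T @ (image.ravel() - mean)

def reconstruct(coeffs, eigvecs, mean, shape):
    """Approximate the image from its k projection coefficients."""
    k = coeffs.shape[0]
    return (mean + eigvecs[:, :k] @ coeffs).reshape(shape)
```

Keeping only the first k ≪ N coefficients per image is what yields compression; retaining all N components reconstructs each image exactly, since the eigenvectors form a complete orthonormal basis of R<sup>N</sup>.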