Research Journal of Social Science & Management - RJSSM - The ...


Cov(ε, F) = E(εF′) = 0 (p×m)

Orthogonal Factor Model with m Common Factors

X (p×1) = µ (p×1) + L (p×m) F (m×1) + ε (p×1) .................................................................. (3)

µi = mean of variable i

εi = ith specific factor

Fj = jth common factor

lij = loading of the ith variable on the jth factor

The orthogonal factor model provides the covariance structure for X. By definition and using the model equation (3) we have

Σ = Cov(X) = E(X − µ)(X − µ)′

= L E(FF′) L′ + E(εF′) L′ + L E(Fε′) + E(εε′)

= LL′ + Ψ

Also, by independence, Cov(ε, F) = E(εF′) = 0.
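The covariance structure Σ = LL′ + Ψ can be checked with a quick simulation of the model equation (3). The loading matrix L, specific variances Ψ, and dimensions below are made-up illustrative values, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions and parameters (illustrative values only)
p, m, n = 3, 2, 200_000
mu = np.array([1.0, -2.0, 0.5])
L = np.array([[0.9, 0.1],
              [0.7, 0.5],
              [0.2, 0.8]])
psi = np.array([0.18, 0.26, 0.32])  # specific variances (diagonal of Psi)

# Generate the orthogonal factor model X = mu + L F + eps,
# with F ~ N(0, I_m) and eps ~ N(0, Psi) independent of F
F = rng.standard_normal((n, m))
eps = rng.standard_normal((n, p)) * np.sqrt(psi)
X = mu + F @ L.T + eps

# The sample covariance of X should approximate Sigma = L L' + Psi
Sigma = L @ L.T + np.diag(psi)
assert np.allclose(np.cov(X, rowvar=False), Sigma, atol=0.02)
```

With n = 200,000 draws the sample covariance matches LL′ + Ψ to within sampling error, which is the model's implied covariance structure.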

Communalities<br />

The important assumption of linearity is inherent in the formulation of the factor model. The portion of the variance of the ith variable, Var(Xi) = δii, contributed by the m common factors is called the ith communality; the remaining portion, due to the ith specific factor, is the ith specific variance Ψi. Thus δii is composed of two components, as shown below.

The ith communality, denoted by h²i, is given as

h²i = l²i1 + l²i2 + … + l²im

and δii = h²i + Ψi, where i = 1, 2, …, p.
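The decomposition δii = h²i + Ψi is straightforward to compute from a loading matrix; the 3×2 loadings and specific variances below are hypothetical values chosen for illustration:

```python
import numpy as np

# Hypothetical 3x2 loading matrix and specific variances (not from the source)
L = np.array([[0.9, 0.1],
              [0.7, 0.5],
              [0.2, 0.8]])
psi = np.array([0.18, 0.26, 0.32])

# i-th communality: h_i^2 = l_i1^2 + ... + l_im^2 (row sums of squared loadings)
h2 = (L ** 2).sum(axis=1)

# Variance decomposition: delta_ii = h_i^2 + psi_i
delta = h2 + psi

print(h2)     # communalities
print(delta)  # total variances (equal to 1 here by construction, as for
              # standardized variables)
```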

Principal Component Solution for the Factor Model

Principal Component Analysis (PCA)

Principal component analysis (PCA) is commonly thought of as a statistical technique for data reduction. It helps to reduce the number of variables in an analysis by describing a series of uncorrelated linear combinations of the variables that contain most of the variance. PCA was introduced by Pearson (1901) and later developed by Hotelling (1933), who described the variation in a set of multivariate data in terms of a set of uncorrelated variables.

The objective of PCA is to find unit-length linear combinations of the variables with the greatest variance. The first principal component has maximal overall variance. The second principal component has maximal variance among all unit-length linear combinations that are uncorrelated with the first principal component, and so on. The last principal component has the smallest variance among all unit-length linear combinations of the variables. All principal components combined contain the same information as the original variables, but the information is partitioned across the components in such a way that the components are orthogonal, and earlier components carry more of the variance than later ones. PCA thus conceived is just a linear transformation of the data.
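The properties just described — orthogonal components, variances ordered from largest to smallest, total variance preserved — can be sketched via the eigendecomposition of a sample covariance matrix. The data here are simulated for illustration, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated correlated data: 500 observations on 4 variables
X = rng.standard_normal((500, 4)) @ rng.standard_normal((4, 4))

# Center the data, then eigendecompose the sample covariance matrix
Xc = X - X.mean(axis=0)
S = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(S)          # eigh returns ascending order
order = np.argsort(eigvals)[::-1]             # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Component scores: projections onto the unit-length eigenvectors
Z = Xc @ eigvecs

# The components are uncorrelated, with variances equal to the eigenvalues,
# and together they carry the same total variance as the original variables
assert np.allclose(np.cov(Z, rowvar=False), np.diag(eigvals))
assert np.isclose(eigvals.sum(), np.trace(S))
```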

The purpose of factor analysis is to describe the covariance relationships among many variables in terms of a few underlying unobservable random quantities called factors. In the principal component method, the factor analysis of the sample covariance matrix S is specified in terms of its eigenvalue–eigenvector pairs (λ1, e1), (λ2, e2), …, (λp, ep), where λ1 ≥ λ2 ≥ … ≥ λp. Let m < p be the number of common factors. The principal component factor analysis of the sample correlation matrix is obtained by starting with R, the sample correlation matrix, in place of S, where R is assumed to be of full rank. For the principal component solution, the estimated loadings for a given factor do not change as the number of factors is increased.
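A minimal sketch of the principal component solution, using a hypothetical 3×3 correlation matrix R (not from the paper): the jth column of the loading matrix is √λj ej, and the specific variances are taken from the diagonal of R − LL′. The final assertion checks the property just stated, that loadings on an earlier factor are unchanged when the number of factors is increased:

```python
import numpy as np

# Hypothetical sample correlation matrix for p = 3 variables (illustrative)
R = np.array([[1.00, 0.63, 0.45],
              [0.63, 1.00, 0.35],
              [0.45, 0.35, 1.00]])

# Eigenpairs of R, sorted so that lambda_1 >= lambda_2 >= ... >= lambda_p
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

m = 1  # number of common factors retained
# Principal component solution: j-th column of L is sqrt(lambda_j) * e_j
L = eigvecs[:, :m] * np.sqrt(eigvals[:m])
# Specific variances from the diagonal of the residual R - L L'
Psi = np.diag(np.diag(R - L @ L.T))

# Increasing m leaves the first-factor loadings unchanged
L2 = eigvecs[:, :2] * np.sqrt(eigvals[:2])
assert np.allclose(L2[:, 0], L[:, 0])

print(L.ravel())  # estimated loadings on the single retained factor
```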

Rotating Factor Loadings<br />

The factor loadings are obtained from the initial loadings by an orthogonal transformation while retaining the same ability to reproduce the covariance (or correlation) matrix. An orthogonal

www.theinternationaljournal.org > RJSSM: Volume: 03, Number: 03, July-2013
