
Load forecasting for practical power systems by…

III. ESTIMATION OF PARAMETERS IN LINEAR REGRESSION MODELS

SIMPLE REGRESSION:

In the simple regression model, X is the independent variable and Y is the dependent variable. The intercept "a" is given by

$$a = \bar{y} - b\bar{x}$$

and the slope "b" is given by

$$b = \frac{\sum xy - n\bar{x}\bar{y}}{\sum x^{2} - n\bar{x}^{2}} \quad \text{(slope)}, \qquad \text{where } \bar{x} = \frac{\sum x}{n}, \ \bar{y} = \frac{\sum y}{n} \qquad ……………..(4)
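As a concrete check of these formulas, here is a minimal Python sketch; the temperature/load figures are invented for illustration, and the function name is ours rather than the paper's.

```python
import numpy as np

def simple_regression(x, y):
    """Estimate intercept a and slope b of y = a + b*x via equation (4)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    x_bar = x.sum() / n          # x-bar = (sum x) / n
    y_bar = y.sum() / n          # y-bar = (sum y) / n
    # b = (sum(xy) - n*x_bar*y_bar) / (sum(x^2) - n*x_bar^2)
    b = (np.sum(x * y) - n * x_bar * y_bar) / (np.sum(x ** 2) - n * x_bar ** 2)
    a = y_bar - b * x_bar        # intercept: a = y-bar - b*x-bar
    return a, b

# Hypothetical daily peak loads (MW) against temperature (deg C).
temps = [18, 21, 24, 27, 30, 33]
loads = [310, 330, 365, 390, 420, 455]
a, b = simple_regression(temps, loads)
print(f"intercept a = {a:.2f}, slope b = {b:.2f}")
```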

Multiple Linear Regression

The method of least squares is typically used to estimate the regression coefficients in a multiple linear regression model. Suppose that n > k observations on the response variable are available, say y₁, y₂, …, yₙ. Along with each observed response yᵢ we have an observation on each regressor variable; let xᵢⱼ denote the ith level of variable xⱼ. We assume that the error term ε in the model has E(ε) = 0 and V(ε) = σ², and that the {εᵢ} are uncorrelated random variables. We may write the model equation

$$y_i = \beta_0 + \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_k x_{ik} + \varepsilon_i \qquad ……………..(5)$$

or, more compactly,

$$y_i = \beta_0 + \sum_{j=1}^{k} \beta_j x_{ij} + \varepsilon_i, \qquad i = 1, 2, \ldots, n \qquad …………………………..(6)$$

The method of least squares chooses the β's in equation (6) so that the sum of squares of the errors εᵢ is minimized. The least squares function is

$$L = \sum_{i=1}^{n} \varepsilon_i^{2} = \sum_{i=1}^{n} \Bigl( y_i - \beta_0 - \sum_{j=1}^{k} \beta_j x_{ij} \Bigr)^{2} \qquad ……………..(7)$$

The function L is to be minimized with respect to β₀, β₁, …, βₖ. The least squares estimators, say β̂₀, β̂₁, …, β̂ₖ, must satisfy

$$\frac{\partial L}{\partial \beta_0} = -2 \sum_{i=1}^{n} \Bigl( y_i - \hat{\beta}_0 - \sum_{j=1}^{k} \hat{\beta}_j x_{ij} \Bigr) = 0 \qquad ……………..(8)$$

$$\frac{\partial L}{\partial \beta_j} = -2 \sum_{i=1}^{n} \Bigl( y_i - \hat{\beta}_0 - \sum_{j=1}^{k} \hat{\beta}_j x_{ij} \Bigr) x_{ij} = 0, \qquad j = 1, 2, \ldots, k \qquad ……………..(9)$$

Simplifying equation (9), we obtain

$$n\hat{\beta}_0 + \hat{\beta}_1 \sum_{i=1}^{n} x_{i1} + \hat{\beta}_2 \sum_{i=1}^{n} x_{i2} + \cdots + \hat{\beta}_k \sum_{i=1}^{n} x_{ik} = \sum_{i=1}^{n} y_i \qquad ……………..(10)$$

$$\hat{\beta}_0 \sum_{i=1}^{n} x_{i1} + \hat{\beta}_1 \sum_{i=1}^{n} x_{i1}^{2} + \hat{\beta}_2 \sum_{i=1}^{n} x_{i1} x_{i2} + \cdots + \hat{\beta}_k \sum_{i=1}^{n} x_{i1} x_{ik} = \sum_{i=1}^{n} x_{i1} y_i \qquad ……………..(11)$$

$$\vdots$$

$$\hat{\beta}_0 \sum_{i=1}^{n} x_{ik} + \hat{\beta}_1 \sum_{i=1}^{n} x_{ik} x_{i1} + \hat{\beta}_2 \sum_{i=1}^{n} x_{ik} x_{i2} + \cdots + \hat{\beta}_k \sum_{i=1}^{n} x_{ik}^{2} = \sum_{i=1}^{n} x_{ik} y_i \qquad ……………..(12)$$

These equations are called the least squares normal equations. Note that there are p = k + 1 normal equations, one for each of the unknown regression coefficients. The solution to the normal equations gives the least squares estimators of the regression coefficients, β̂₀, β̂₁, …, β̂ₖ.
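The system (10)-(12) can be solved directly. Below is a minimal NumPy sketch, assuming the regressor levels arrive as an n × k array; the function name and data layout are illustrative, not from the paper. The entries of A are exactly the sums Σxᵢⱼ and Σxᵢⱼxᵢₗ appearing in (10)-(12).

```python
import numpy as np

def solve_normal_equations(X, y):
    """Solve the p = k + 1 normal equations (10)-(12) for beta-hat.

    X is an (n, k) array of regressor levels x_ij; y holds the n responses.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n = X.shape[0]
    Xd = np.column_stack([np.ones(n), X])  # leading 1s produce the n*beta0 term
    A = Xd.T @ Xd   # entries are exactly the sums sum(x_ij), sum(x_ij * x_il)
    c = Xd.T @ y    # right-hand sides: sum(y_i), sum(x_ij * y_i)
    return np.linalg.solve(A, c)           # [beta0, beta1, ..., betak]
```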

Defining the Problem:

The problem of load forecasting is approached by making the forecast for one whole day at a time. The approach is static in the sense that the forecast is not updated during the day. Forecasting the load on a daily basis with neural network techniques has been reported in many variations. A single feed-forward artificial neural network architecture is used to forecast the daily peak, valley, and average loads, as sketched below.
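The paper does not spell out the network's inputs, layer sizes, or activations at this point, so the following NumPy sketch is only a plausible shape for such a model: one hidden layer mapping a day's feature vector to the three targets (peak, valley, average). Every dimension and the tanh activation are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed layout: previous day's 24 hourly loads plus one temperature
# forecast as inputs; 3 outputs for the daily peak, valley, and average.
n_in, n_hidden, n_out = 25, 10, 3

W1 = rng.normal(scale=0.1, size=(n_hidden, n_in))   # input -> hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_out, n_hidden))  # hidden -> output weights
b2 = np.zeros(n_out)

def forward(x):
    """Single feed-forward pass: feature vector -> (peak, valley, average)."""
    h = np.tanh(W1 @ x + b1)   # hidden layer (tanh is an assumed activation)
    return W2 @ h + b2         # linear output layer

x = rng.random(n_in)           # stand-in for one day's normalized features
peak, valley, average = forward(x)
```

In practice the weights would be trained by backpropagation on historical daily load data; the sketch shows only the forward structure the text describes.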

It is simpler to solve the normal equations if they are expressed in matrix notation. We now give a matrix development of the normal equations that parallels the development of equations (10)-(12). The model, in terms of the observations, may be written in matrix notation as

$$Y = X\beta + \varepsilon$$

where Y is an (n × 1) vector of the observations, X is an (n × p) matrix of the levels of the regressor variables, β is a (p × 1) vector of the regression coefficients, and ε is an (n × 1) vector of random errors.
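In this matrix form the normal equations read XᵀXβ̂ = XᵀY, giving the standard closed form β̂ = (XᵀX)⁻¹XᵀY. A minimal sketch follows (fit_ols is an illustrative name; a least-squares solver is used rather than forming the inverse explicitly, which is numerically safer).

```python
import numpy as np

def fit_ols(X, y):
    """Least squares fit of Y = X*beta + eps.

    X has shape (n, p) with the leading column of ones already included;
    np.linalg.lstsq solves the normal equations without inverting X'X.
    """
    beta_hat, *_ = np.linalg.lstsq(np.asarray(X, float),
                                   np.asarray(y, float), rcond=None)
    return beta_hat
```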
