
Fault Detection and Diagnostics for Rooftop Air Conditioners


3.2 Comparisons

Four black-box modeling approaches (polynomials, GRNN, RBF, and BP neural networks) were investigated using the laboratory data. Seven characteristic parameters were modeled for FDD: evaporating temperature T_evap, condensing temperature T_cond, compressor discharge temperature T_dis, suction line superheat T_sh, liquid line subcooling T_sc, condenser air temperature difference ∆T_ca, and evaporator air temperature difference ∆T_ea.

Since the gathered data are very limited (94 points in the large data set and 40 points in the small data set), interpolation was tested by training the models on the large data set and testing them on both the small and large data sets. Extrapolation was tested by training the models on the core of the total data (T_ra from 73 to 79 F, T_amb from 70 to 90 F, T_wb from 58 to 64 F) and testing them on the remaining data.
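The core-versus-boundary split used for the extrapolation test can be sketched as follows. This is an illustrative sketch only: the column layout and the helper name `split_core_boundary` are assumptions, not the thesis's actual data format, though the range constants match the ones stated above.

```python
import numpy as np

def split_core_boundary(data, t_ra, t_amb, t_wb):
    """Split samples into a 'core' training set and a 'boundary' test set.

    Points whose driving conditions fall inside the core ranges
    (T_ra 73-79 F, T_amb 70-90 F, T_wb 58-64 F) train the models;
    the remaining points test extrapolation.
    Column indices (t_ra, t_amb, t_wb) are an assumed layout.
    """
    core = (
        (data[:, t_ra] >= 73) & (data[:, t_ra] <= 79)
        & (data[:, t_amb] >= 70) & (data[:, t_amb] <= 90)
        & (data[:, t_wb] >= 58) & (data[:, t_wb] <= 64)
    )
    return data[core], data[~core]

# Tiny synthetic example: columns are [T_ra, T_amb, T_wb]
samples = np.array([
    [75.0, 80.0, 60.0],   # inside all core ranges -> train
    [82.0, 80.0, 60.0],   # T_ra outside -> extrapolation test
    [76.0, 95.0, 61.0],   # T_amb outside -> extrapolation test
])
train, test = split_core_boundary(samples, 0, 1, 2)
print(len(train), len(test))  # -> 1 2
```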

To give a visual sense of the modeling performance, figures 3.3 and 3.4 show the training and testing performance for evaporating temperature. Table 3.2 and figure 3.5 show the RMS error for the polynomial models. Polynomial models have good interpolating ability when the order is high enough (e.g., third order), and interpolating performance improves as the order increases. However, low-order polynomial models have good extrapolating performance, while extrapolating performance becomes very poor when the polynomial order is too high (e.g., third order). So there is a conflict between interpolating and extrapolating performance.
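The order trade-off can be illustrated on synthetic data (not the laboratory data); a minimal sketch using NumPy's least-squares polynomial fit, where a 1-D curve stands in for one characteristic parameter:

```python
import numpy as np

# Synthetic stand-in for one characteristic parameter: fit inside a
# "core" range, then evaluate outside it to probe extrapolation.
x_train = np.linspace(-1.0, 1.0, 50)      # training (core) range
y_train = np.sin(2.0 * x_train)
x_extra = np.linspace(1.0, 1.5, 20)       # outside the training range
y_extra = np.sin(2.0 * x_extra)

def rms(pred, truth):
    return float(np.sqrt(np.mean((pred - truth) ** 2)))

rms_by_order = {}
for order in (1, 3, 9):
    coeffs = np.polyfit(x_train, y_train, order)
    rms_by_order[order] = (
        rms(np.polyval(coeffs, x_train), y_train),   # interpolation error
        rms(np.polyval(coeffs, x_extra), y_extra),   # extrapolation error
    )
    print(order, rms_by_order[order])
```

Training (interpolation) RMS necessarily shrinks as the order grows, while the fit outside the training range is not similarly controlled, which is the conflict described above.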

Table 3.2 and figure 3.6 show that the GRNN models have very good interpolating ability but poor extrapolating performance. The spread has a significant influence on interpolating performance but little influence on extrapolating performance: the smaller the spread, the better the interpolation. Another advantage of GRNN is that training is very fast; its disadvantage is that a large amount of memory is required to store the nodes when their number is large.
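These properties fall out of the GRNN's structure. A minimal sketch in the standard Nadaraya-Watson form (an assumption about the variant used, not the thesis's implementation):

```python
import numpy as np

class GRNN:
    """Minimal general regression neural network (Nadaraya-Watson form).

    Each training point becomes a pattern node, so 'training' is just
    storing the data (fast, but memory grows with the node count).
    The spread sigma controls smoothing: a smaller spread reproduces
    the training targets more tightly (better interpolation).
    """
    def __init__(self, spread):
        self.spread = spread

    def fit(self, x, y):
        self.x = np.asarray(x, float)
        self.y = np.asarray(y, float)
        return self

    def predict(self, xq):
        xq = np.atleast_2d(np.asarray(xq, float))
        # Squared distances from each query to each stored pattern node.
        d2 = ((xq[:, None, :] - self.x[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / (2.0 * self.spread ** 2))
        # Kernel-weighted average of the stored targets.
        return (w @ self.y) / w.sum(axis=1)

x = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 1.0, 4.0, 9.0])            # samples of y = x**2
model = GRNN(spread=0.1).fit(x, y)
pred = model.predict([[2.0]])
print(pred)   # a small spread nearly reproduces the stored target 4.0
```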

From table 3.3, it can be seen that the BP neural network has very good interpolating ability when the number of neurons is appropriate, but its extrapolating performance is poor. The performance is also somewhat random, because the initial conditions are random. The weakness of the BP neural network is that training takes a relatively long time.
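A toy backprop-trained network written from scratch in NumPy illustrates both points: training is an iterative gradient loop (slow compared with GRNN's one-shot storage), and the result depends on the random initial weights. The network size, learning rate, and target function here are illustrative assumptions, not the thesis's settings.

```python
import numpy as np

rng = np.random.default_rng(42)   # results vary with the random init

# One-hidden-layer network trained by backpropagation to fit y = x**2
# on [-1, 1]; a toy stand-in for the characteristic-parameter models.
x = np.linspace(-1.0, 1.0, 40).reshape(-1, 1)
y = x ** 2

n_hidden = 8
w1 = rng.normal(scale=0.5, size=(1, n_hidden))
b1 = np.zeros(n_hidden)
w2 = rng.normal(scale=0.5, size=(n_hidden, 1))
b2 = np.zeros(1)
lr = 0.1

for _ in range(10000):                      # iterative training loop
    h = np.tanh(x @ w1 + b1)                # hidden layer
    pred = h @ w2 + b2                      # linear output
    err = pred - y
    # Backpropagate the squared-error gradient through both layers.
    dh = (err @ w2.T) * (1.0 - h ** 2)      # tanh' = 1 - tanh**2
    w2 -= lr * (h.T @ err) / len(x)
    b2 -= lr * err.mean(axis=0)
    w1 -= lr * (x.T @ dh) / len(x)
    b1 -= lr * dh.mean(axis=0)

h = np.tanh(x @ w1 + b1)
train_rms = float(np.sqrt(np.mean((h @ w2 + b2 - y) ** 2)))
print(round(train_rms, 4))   # small training RMS after many iterations
```

Changing the seed changes the final error, which mirrors the "a little random" behavior noted above.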


