
ISBN 978-952-5726-09-1 (Print)
Proceedings of the Second International Symposium on Networking and Network Security (ISNNS ’10)
Jinggangshan, P. R. China, 2-4 April 2010, pp. 186-189

An SVM Model for Water Quality Monitoring Using Remote Sensing Image

Wei Huang 1, Fengchen Huang 2, and Jing Song 2

1 College of Hydrology and Water Resources, Hohai University, Nanjing, China
Email: david_morika@hhu.edu.cn

2 College of Computer and Information, Hohai University, Nanjing, China
Email: hhice@126.com, songjing5996@yahoo.com.cn

Abstract—The accuracy of traditional remote sensing monitoring methods is low because of the limited number of monitoring points on Tai Lake. This paper proposes using the Least Squares Support Vector Machine (LS-SVM), which is well suited to small-sample fitting, to improve the accuracy of water quality retrieval. The LS-SVM model was used to monitor the concentration of suspended matter. The Radial Basis Function (RBF) was chosen as the kernel function of the retrieval model, and grid searching and k-fold cross validation were used to select and optimize the parameters. The experimental results show that the proposed method performs well while keeping the model complexity low and the modeling speed fast.

Index Terms—LS-SVM, water quality retrieval, grid searching, remote sensing

I. INTRODUCTION

The use of remote sensing for water quality monitoring is an uncertain problem [1]. Establishing an appropriate model for the retrieval of water quality parameters is difficult, whether for different water regions or for different water characteristics.

Because linear regression cannot produce accurate retrieval results for water quality parameters, nonlinear algorithms such as the BP neural network and the SVM have been used in the field of water monitoring. Unfortunately, the BP algorithm easily falls into local minima, the number of hidden nodes is hard to determine, and the training speed is slow, so the BP neural network method limits the accuracy of the retrieved results. The drawback of the SVM is that the optimum kernel transfer function and its corresponding parameters are difficult to set for each data set [2]; this reduces the generalization ability and increases the complexity of the algorithm.

LS-SVM is a reformulation of the standard SVM [3]. LS-SVM estimates the function by solving a least-squares linear system instead of a quadratic programming problem, so it has good generalization ability. Given the limited number of water monitoring sample points and the requirement of good generalization ability, the LS-SVM algorithm and theory was proposed to retrieve the water quality of Tai Lake.

II. THE MODEL OF LS-SVM

The LS-SVM uses a squared-error cost function instead of the $\varepsilon$-insensitive loss function, so the inequality constraints are transformed into equality constraints. Compared with the SVM, the LS-SVM algorithm therefore greatly reduces the complexity of the algorithm, making the training speed very fast and increasing the retrieval precision.

Consider the training sample set $D = \{(x_k, y_k) \mid x_k \in \mathbb{R}^{n},\ y_k \in \mathbb{R},\ k = 1, 2, \ldots, N\}$, where $x_k$ is the input data and $y_k$ is the output data. According to the Mercer theorem, the kernel function $K(\cdot,\cdot)$ and the mapping function $\varphi(\cdot)$ satisfy:

$$K(x_i, x_j) = \varphi(x_i)^{T} \varphi(x_j) \qquad (1)$$
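The abstract names the Radial Basis Function (RBF) as the kernel chosen for the retrieval model. As a minimal illustration, a kernel matrix of the standard Gaussian form $K(x_i, x_j) = \exp(-\lVert x_i - x_j \rVert^{2} / 2\sigma^{2})$ can be computed as follows (a Python/NumPy sketch; the paper does not give its implementation, and the sample data below are synthetic):

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    """Pairwise RBF (Gaussian) kernel matrix: K[i, j] = exp(-||A[i]-B[j]||^2 / (2*sigma^2))."""
    # Squared Euclidean distances via broadcasting: (n, 1, d) - (1, m, d) -> (n, m, d)
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

# Illustrative use: 5 samples with 3 spectral features each (synthetic values).
X = np.random.rand(5, 3)
K = rbf_kernel(X, X, sigma=0.5)   # symmetric 5x5 matrix with ones on the diagonal
```

Because any kernel satisfying the Mercer condition can stand in for the inner product in (1), the mapping $\varphi(\cdot)$ itself never has to be computed explicitly.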

LS-SVM is solved in the original space, so the following optimization problem is obtained:

$$\min_{\omega, b, \xi} J(\omega, b, \xi) = \frac{1}{2}\omega^{T}\omega + \frac{1}{2}\gamma \sum_{k=1}^{N} \xi_k^{2} \qquad (2)$$

$$\text{s.t.}\quad y_k = \omega^{T}\varphi(x_k) + b + \xi_k, \quad k = 1, 2, \ldots, N$$

Here $\gamma$ is an adjustable regularization parameter that controls the compromise between the training error and the model complexity, so that the desired function has good generalization ability. The greater the value of $\gamma$, the smaller the regression error of the model [4]. The variable $\omega$ reflects the complexity of the function, which is a linear combination of the non-linear mapping function $\varphi(\cdot)$.
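Since $\gamma$ (and, with an RBF kernel, the width $\sigma$) must be supplied by the user, the abstract states that grid searching and k-fold cross validation are used to choose and optimize these parameters. The sketch below, continuing the Python example above (so `np` and `rbf_kernel` are already available), shows one way such a search could be organized; the inner regressor uses the standard LS-SVM closed-form solution, i.e. the linear system that the KKT conditions derived below lead to, and is an illustrative assumption rather than the paper's exact implementation:

```python
def lssvm_fit_predict(X_tr, y_tr, X_te, gamma, sigma):
    """Train a standard LS-SVM regressor on (X_tr, y_tr) and predict on X_te.

    Solves [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y] (standard LS-SVM form,
    assumed here for illustration), then predicts f(x) = sum_k a_k K(x, x_k) + b.
    """
    n = len(y_tr)
    K = rbf_kernel(X_tr, X_tr, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y_tr)))
    b, a = sol[0], sol[1:]
    return rbf_kernel(X_te, X_tr, sigma) @ a + b

def grid_search_cv(X, y, gammas, sigmas, k=5, seed=0):
    """Return the (gamma, sigma) pair with the lowest mean validation MSE over k folds."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), k)
    best_params, best_mse = None, np.inf
    for gamma in gammas:
        for sigma in sigmas:
            fold_mse = []
            for i in range(k):
                test_idx = folds[i]
                train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
                pred = lssvm_fit_predict(X[train_idx], y[train_idx], X[test_idx], gamma, sigma)
                fold_mse.append(np.mean((pred - y[test_idx]) ** 2))
            if np.mean(fold_mse) < best_mse:
                best_params, best_mse = (gamma, sigma), float(np.mean(fold_mse))
    return best_params, best_mse
```

A logarithmically spaced grid, e.g. `np.logspace(-2, 3, 6)` for both parameters, is a common starting point for such a search.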

LS-SVM defines a different loss function compared with the standard SVM, and it transforms the inequality constraints into equality constraints. The Lagrangian is constructed as:

$$L(\omega, b, \xi, a) = J(\omega, b, \xi) - \sum_{k=1}^{N} a_k \left[ \omega^{T}\varphi(x_k) + b + \xi_k - y_k \right] \qquad (3)$$

where the $a_k \in \mathbb{R}$ are Lagrange multipliers. The optimal $a$ and $b$ can be obtained through the KKT conditions:


$$\frac{\partial L}{\partial \omega} = 0, \quad \frac{\partial L}{\partial b} = 0, \quad \frac{\partial L}{\partial \xi_k} = 0, \quad \frac{\partial L}{\partial a_k} = 0 \qquad (4)$$
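In the standard LS-SVM derivation, eliminating $\omega$ and $\xi_k$ from the conditions in (4) leaves a linear system in $a$ and $b$ alone; writing $\Omega_{ij} = K(x_i, x_j)$ and $\mathbf{1} = (1, \ldots, 1)^{T}$, the standard result is:

$$\begin{bmatrix} 0 & \mathbf{1}^{T} \\ \mathbf{1} & \Omega + \gamma^{-1} I \end{bmatrix} \begin{bmatrix} b \\ a \end{bmatrix} = \begin{bmatrix} 0 \\ y \end{bmatrix}$$

and the resulting regression function is $f(x) = \sum_{k=1}^{N} a_k K(x, x_k) + b$. This is the system solved in the `lssvm_fit_predict` sketch above, which is why training an LS-SVM amounts to solving one linear system rather than a quadratic program.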
