of experiments is presented in Section 4. Conclusions are finally drawn in Section 5.

A Brief Description of Entropy

This section presents a brief overview of some concepts of information theory [5], [6] that will be useful to describe the method proposed in this paper.

The discrete entropy evaluation

Computing the entropy of a discrete random variable X requires the amount of information I revealed by the occurrence of the event X = x_i, where I is associated with the rarity of that occurrence. Observation of an expected event brings little information; a rare event, on the other hand, is surrounded by a very specific circumstance, which brings new information.

Considering p_i as the probability of occurrence of the event X = x_i, the amount of information is defined as:

I(x_i) = \log\left(\frac{1}{p_i}\right) = -\log p_i    (1)

Note that the inverse relationship with the probability captures the notion of rarity. Since the scale is logarithmic, if p_i = 1 then I(x_i) = 0; events that are 100% predictable therefore contain no new information.

Considering the N possible values x_i that X can assume, the entropy is computed as:

H(X) = E[I(x_i)] = -\sum_{i=1}^{N} p_i \log p_i    (2)

where E[\cdot] is the statistical expectation operator.

The discrete conditional entropy

The conditional entropy of X given the event Y = y_j is:

H(X \mid y_j) = -\sum_{i=1}^{N} P(x_i \mid y_j) \log P(x_i \mid y_j)    (3)
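To make equations (1)-(3) concrete, the sketch below computes the information content of a single event, the entropy of a discrete distribution, and the conditional entropy H(X | y_j) obtained from a joint probability table. The function names and the example joint distribution are illustrative assumptions, not taken from the paper; natural logarithms are used, so the results are in nats (replace with log base 2 for bits).

```python
import math

def information(p):
    """Amount of information I(x_i) = -log p_i for an event of probability p (eq. 1)."""
    return -math.log(p)

def entropy(probs):
    """Discrete entropy H(X) = -sum_i p_i log p_i (eq. 2).
    Zero-probability outcomes contribute nothing to the sum."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def conditional_entropy(joint, y_index):
    """Conditional entropy H(X | y_j) = -sum_i P(x_i|y_j) log P(x_i|y_j) (eq. 3).

    `joint` is a matrix with joint[i][j] = P(X = x_i, Y = y_j); conditioning on
    Y = y_j normalizes column j by the marginal P(Y = y_j)."""
    p_y = sum(row[y_index] for row in joint)
    cond = [row[y_index] / p_y for row in joint]
    return entropy(cond)

if __name__ == "__main__":
    # Hypothetical joint distribution P(X, Y) with N = 3 values of X and 2 of Y.
    joint = [
        [0.20, 0.10],
        [0.25, 0.05],
        [0.15, 0.25],
    ]
    p_x = [sum(row) for row in joint]                    # marginal P(X = x_i)
    print("I(x_0)   =", information(p_x[0]))             # rarer events yield larger I
    print("H(X)     =", entropy(p_x))                    # eq. (2)
    print("H(X|y_0) =", conditional_entropy(joint, 0))   # eq. (3)
```

Because p_i = 1 gives I(x_i) = 0, a distribution concentrated on one outcome has zero entropy, while the uniform distribution over N outcomes maximizes H(X) at log N, consistent with the rarity interpretation above.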
