1.2. Probability Theory

Figure 1.13: Plot of the univariate Gaussian showing the mean $\mu$ and the standard deviation $\sigma$.

\[
\int_{-\infty}^{\infty} \mathcal{N}\!\left(x \mid \mu, \sigma^2\right) \mathrm{d}x = 1. \tag{1.48}
\]

Thus (1.46) satisfies the two requirements for a valid probability density (Exercise 1.8).

We can readily find expectations of functions of $x$ under the Gaussian distribution. In particular, the average value of $x$ is given by

\[
\mathbb{E}[x] = \int_{-\infty}^{\infty} \mathcal{N}\!\left(x \mid \mu, \sigma^2\right) x \,\mathrm{d}x = \mu. \tag{1.49}
\]

Because the parameter $\mu$ represents the average value of $x$ under the distribution, it is referred to as the mean. Similarly, for the second order moment

\[
\mathbb{E}[x^2] = \int_{-\infty}^{\infty} \mathcal{N}\!\left(x \mid \mu, \sigma^2\right) x^2 \,\mathrm{d}x = \mu^2 + \sigma^2. \tag{1.50}
\]

From (1.49) and (1.50), it follows that the variance of $x$ is given by

\[
\operatorname{var}[x] = \mathbb{E}[x^2] - \mathbb{E}[x]^2 = \sigma^2 \tag{1.51}
\]

and hence $\sigma^2$ is referred to as the variance parameter. The maximum of a distribution is known as its mode. For a Gaussian, the mode coincides with the mean (Exercise 1.9).

We are also interested in the Gaussian distribution defined over a $D$-dimensional vector $\mathbf{x}$ of continuous variables, which is given by

\[
\mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}, \boldsymbol{\Sigma}) = \frac{1}{(2\pi)^{D/2}} \frac{1}{|\boldsymbol{\Sigma}|^{1/2}} \exp\!\left\{ -\frac{1}{2} (\mathbf{x} - \boldsymbol{\mu})^{\mathrm{T}} \boldsymbol{\Sigma}^{-1} (\mathbf{x} - \boldsymbol{\mu}) \right\} \tag{1.52}
\]

where the $D$-dimensional vector $\boldsymbol{\mu}$ is called the mean, the $D \times D$ matrix $\boldsymbol{\Sigma}$ is called the covariance, and $|\boldsymbol{\Sigma}|$ denotes the determinant of $\boldsymbol{\Sigma}$. We shall make use of the multivariate Gaussian distribution briefly in this chapter, although its properties will be studied in detail in Section 2.3.
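The normalization (1.48), the moment identities (1.49)–(1.51), and the multivariate density (1.52) can be checked numerically. The sketch below assumes NumPy is available; the function names `gaussian_pdf` and `multivariate_gaussian_pdf` are illustrative, not from the text.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma2):
    """Univariate Gaussian density N(x | mu, sigma^2)."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

def multivariate_gaussian_pdf(x, mu, Sigma):
    """D-dimensional Gaussian density N(x | mu, Sigma), equation (1.52)."""
    D = len(mu)
    diff = x - mu
    quad = diff @ np.linalg.solve(Sigma, diff)  # (x - mu)^T Sigma^{-1} (x - mu)
    return np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** D * np.linalg.det(Sigma))

mu, sigma2 = 1.5, 4.0
sigma = np.sqrt(sigma2)

# Check (1.48): the density integrates to 1 (Riemann sum over +/- 10 sigma).
xs = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 100_001)
dx = xs[1] - xs[0]
norm = gaussian_pdf(xs, mu, sigma2).sum() * dx  # should be close to 1

# Check (1.49)-(1.51) by Monte Carlo: E[x] = mu, E[x^2] = mu^2 + sigma^2.
rng = np.random.default_rng(0)
samples = mu + sigma * rng.standard_normal(1_000_000)
mean_est = samples.mean()                     # approaches mu
second_moment = (samples ** 2).mean()         # approaches mu^2 + sigma^2
var_est = second_moment - mean_est ** 2       # approaches sigma^2
```

For the multivariate case, evaluating the density at its mode $\mathbf{x} = \boldsymbol{\mu}$ with $\boldsymbol{\Sigma} = \mathbf{I}$ in $D = 2$ dimensions gives $1/(2\pi)$, since the exponential term is 1.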
