Lectures on Elementary Probability

3.3 Joint probability mass functions

C<strong>on</strong>sider discrete random variables X, Y . Their joint probabilities are the<br />

numbers P [X = x, Y = y]. Here we are using the comma to indicate and or<br />

intersecti<strong>on</strong>. Thus the event indicated by X = x, Y = y is the event that X = x<br />

and Y = y. The resulting functi<strong>on</strong> of two variables x, y is called the joint<br />

probability mass functi<strong>on</strong> of X, Y . In a similar way, we can talk about the joint<br />

probability mass functi<strong>on</strong> of three or more random variables.<br />
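
As a running illustration (not from the lectures), such a joint probability mass function can be tabulated explicitly. The following minimal Python sketch stores a hypothetical joint pmf of two {0, 1}-valued random variables as a dictionary; the particular probabilities are made up for the example and need only sum to one.

    from fractions import Fraction as F

    # Hypothetical joint pmf of two {0,1}-valued random variables X, Y:
    # the key (x, y) maps to P[X = x, Y = y]. Exact fractions avoid
    # floating-point rounding in the checks below.
    joint = {(0, 0): F(3, 10), (0, 1): F(1, 5),
             (1, 0): F(1, 10), (1, 1): F(2, 5)}
    assert sum(joint.values()) == 1  # a pmf must sum to one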

Given the joint probability mass function of X, Y, it is easy to compute the probability mass function of X, and it is equally easy to compute the probability mass function of Y. The first formula is

P[X = x] = ∑_y P[X = x, Y = y],   (3.12)

where the sum is over all possible values of the discrete random variable Y. Similarly

P[Y = y] = ∑_x P[X = x, Y = y],   (3.13)

where the sum is over all possible values of the discrete random variable X.
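
Formulas (3.12) and (3.13) translate directly into code. Here is a sketch using the illustrative dictionary representation above; marginals accumulates each joint probability into the appropriate row and column sums.

    from collections import defaultdict
    from fractions import Fraction as F

    def marginals(joint):
        """Marginal pmfs of X and Y from their joint pmf, per (3.12) and (3.13)."""
        px, py = defaultdict(F), defaultdict(F)
        for (x, y), p in joint.items():
            px[x] += p  # (3.12): for fixed x, sum over all values y
            py[y] += p  # (3.13): for fixed y, sum over all values x
        return dict(px), dict(py)

    joint = {(0, 0): F(3, 10), (0, 1): F(1, 5),
             (1, 0): F(1, 10), (1, 1): F(2, 5)}
    px, py = marginals(joint)
    print(px)  # {0: Fraction(1, 2), 1: Fraction(1, 2)}
    print(py)  # {0: Fraction(2, 5), 1: Fraction(3, 5)}

Note that the passage goes one way only: the two marginal pmfs do not in general determine the joint pmf.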

Theorem 3.5 The probability mass function of a random variable Z = g(X, Y) that is a function of random variables X, Y may be computed in terms of the joint probability mass function of X, Y by

P[g(X, Y) = z] = ∑_{x,y : g(x,y)=z} P[X = x, Y = y].   (3.14)
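
Equation (3.14) amounts to grouping the joint probabilities by the value of g(x, y). A sketch, again with the illustrative pmf and with g(x, y) = x + y chosen purely as an example:

    from collections import defaultdict
    from fractions import Fraction as F

    def pmf_of_g(joint, g):
        """pmf of Z = g(X, Y), per (3.14): sum P[X = x, Y = y] over g(x, y) = z."""
        pz = defaultdict(F)
        for (x, y), p in joint.items():
            pz[g(x, y)] += p  # each (x, y) contributes to the z it maps to
        return dict(pz)

    joint = {(0, 0): F(3, 10), (0, 1): F(1, 5),
             (1, 0): F(1, 10), (1, 1): F(2, 5)}
    print(pmf_of_g(joint, lambda x, y: x + y))
    # {0: Fraction(3, 10), 1: Fraction(3, 10), 2: Fraction(2, 5)}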

Theorem 3.6 The expectation of a random variable Z = g(X, Y) that is a function of random variables X, Y may be computed in terms of the joint probability mass function of X, Y by

E[g(X, Y)] = ∑_x ∑_y g(x, y) P[X = x, Y = y].   (3.15)
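
In code, (3.15) computes the expectation directly from the joint pmf, without first finding the pmf of Z. A sketch with the same illustrative numbers; it agrees with ∑_z z P[Z = z] = 0·(3/10) + 1·(3/10) + 2·(2/5) = 11/10 from the previous sketch, as the theorem promises.

    from fractions import Fraction as F

    def expectation_of_g(joint, g):
        """E[g(X, Y)] per (3.15): a double sum over the joint pmf."""
        return sum(g(x, y) * p for (x, y), p in joint.items())

    joint = {(0, 0): F(3, 10), (0, 1): F(1, 5),
             (1, 0): F(1, 10), (1, 1): F(2, 5)}
    print(expectation_of_g(joint, lambda x, y: x + y))  # Fraction(11, 10)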

Proof:

E[g(X, Y)] = ∑_z z P[g(X, Y) = z] = ∑_z z ∑_{x,y : g(x,y)=z} P[X = x, Y = y].   (3.16)

Since g(x, y) = z in every term of the inner sum, this is equal to

∑_z ∑_{x,y : g(x,y)=z} g(x, y) P[X = x, Y = y] = ∑_{x,y} g(x, y) P[X = x, Y = y].   (3.17)

The last equality holds because each pair (x, y) lies in exactly one of the sets where g(x, y) = z.

This theorem has a very important consequence for expectations: the property called additivity. It says that the expectation of a sum is the sum of the expectations, E[X + Y] = E[X] + E[Y]. We state it for two random variables, but again the idea extends to the sum of three or more random variables.
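
As a numerical check, the sketch below verifies additivity for the illustrative joint pmf, in which X and Y happen to be dependent; no independence is needed.

    from fractions import Fraction as F

    joint = {(0, 0): F(3, 10), (0, 1): F(1, 5),
             (1, 0): F(1, 10), (1, 1): F(2, 5)}

    e_sum = sum((x + y) * p for (x, y), p in joint.items())  # E[X + Y] via (3.15)
    e_x = sum(x * p for (x, y), p in joint.items())          # E[X], taking g(x, y) = x
    e_y = sum(y * p for (x, y), p in joint.items())          # E[Y], taking g(x, y) = y
    assert e_sum == e_x + e_y  # 11/10 = 1/2 + 3/5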
