
Random Variables and Probability Distributions

2.1 Concept of a Random Variable

The field of statistics is concerned with making inferences about populations and population characteristics. Experiments are conducted with results that are subject to chance. The testing of a number of electronic components is an example of a statistical experiment, a term that is used to describe any process by which several chance observations are generated. It is often very important to allocate a numerical description to the outcome. For example, the sample space giving a detailed description of each possible outcome when three electronic components are tested may be written

S = {NNN, NND, NDN, DNN, NDD, DND, DDN, DDD},

where N denotes "nondefective" and D denotes "defective." One is naturally concerned with the number of defectives that occur. Thus each point in the sample space will be assigned a numerical value of 0, 1, 2, or 3. These values are, of course, random quantities determined by the outcome of the experiment. They may be viewed as values assumed by the random variable X, the number of defective items when three electronic components are tested.


Definition 2.1
A random variable is a function that associates a real number with each element in the sample space.

We shall use a capital letter, say X, to denote a random variable and its corresponding small letter, x in this case, for one of its values. In the electronic component testing illustration above, we notice that the random variable X assumes the value 2 for all elements in the subset

E = {DDN, DND, NDD}

of the sample space S. That is, each possible value of X represents an event that is a subset of the sample space for the given experiment.

Example 2.1
Two balls are drawn in succession without replacement from an urn containing 4 red balls and 3 black balls. The possible outcomes and the values y of the random variable Y, where Y is the number of red balls, are

Sample Space    y
RR              2
RB              1
BR              1
BB              0

Example 2.2
A stockroom clerk returns three safety helmets at random to three steel mill employees who had previously checked them. If Smith, Jones, and Brown, in that order, receive one of the three hats, list the sample points for the possible orders of returning the helmets and find the values m of the random variable M that represents the number of correct matches.

Solution. If S, J, and B stand for Smith's, Jones's, and Brown's helmets, respectively, then the possible arrangements in which the helmets may be returned and the number of correct matches are

Sample Space    m
SJB             3
SBJ             1
JSB             1
JBS             0
BSJ             0
BJS             1
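The idea that a random variable is a function defined on the sample space can be made concrete with a short program. The sketch below (the names checked_in and matches are illustrative, not from the text) enumerates the six equally likely orders in which the helmets can be returned and evaluates M for each, reproducing the table above.

```python
from itertools import permutations

checked_in = ("S", "J", "B")  # helmets owned by Smith, Jones, Brown, in that order

def matches(returned):
    """Value of the random variable M: number of employees who get their own helmet."""
    return sum(own == got for own, got in zip(checked_in, returned))

for outcome in permutations(checked_in):
    print("".join(outcome), matches(outcome))
# SJB 3, SBJ 1, JSB 1, JBS 0, BSJ 0, BJS 1, matching the table in Example 2.2
```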


In each of the two preceding examples the sample space contains a finite number of elements. On the other hand, when a die is thrown until a 5 occurs, we obtain a sample space with an unending sequence of elements,

S = {F, NF, NNF, NNNF, ...},

where F and N represent, respectively, the occurrence and nonoccurrence of a 5. But even in this experiment the number of elements can be equated to the number of whole numbers, so that there is a first element, a second element, a third element, and so on, and in this sense the elements can be counted.

Definition 2.2
If a sample space contains a finite number of possibilities or an unending sequence with as many elements as there are whole numbers, it is called a discrete sample space.

The outcomes of some statistical experiments may be neither finite nor countable. Such is the case, for example, when one conducts an investigation measuring the distances that a certain make of automobile will travel over a prescribed test course on 5 liters of gasoline. Assuming distance to be a variable measured to any degree of accuracy, we clearly have an infinite number of possible distances in the sample space that cannot be equated to the number of whole numbers. Also, if one were to record the length of time for a chemical reaction to take place, once again the possible time intervals making up our sample space are infinite in number and uncountable. We see now that all sample spaces need not be discrete.

Definition 2.3
If a sample space contains an infinite number of possibilities equal to the number of points on a line segment, it is called a continuous sample space.

A random variable is called a discrete random variable if its set of possible outcomes is countable. Since the possible values of Y in Example 2.1 are 0, 1, and 2, and the possible values of M in Example 2.2 are 0, 1, and 3, it follows that Y and M are discrete random variables. When a random variable can take on values on a continuous scale, it is called a continuous random variable. Often the possible values of a continuous random variable are precisely the same values that are contained in the continuous sample space. Such is the case when the random variable represents the measured distance that a certain make of automobile will travel over a test course on 5 liters of gasoline.

In most practical problems, continuous random variables represent measured data, such as all possible heights, weights, temperatures, distances, or life periods, whereas discrete random variables represent count data, such as the number of defectives in a sample of k items or the number of highway fatalities per year in a given state. Note that the random variables Y and M of Examples 2.1 and 2.2 both represent count data, Y the number of red balls and M the number of correct hat matches.



2.2 Discrete Probability Distributions


A discrete random variable assumes each of its values with a certain probability. In the case of tossing a coin three times, the variable X, representing the number of heads, assumes the value 2 with probability 3/8, since 3 of the 8 equally likely sample points result in two heads and one tail. If one assumes equal weights for the simple events in Example 2.2, the probability that no employee gets back his right helmet, that is, the probability that M assumes the value zero, is 1/3. The possible values m of M and their probabilities are given by

m          0    1    3
P(M = m)  1/3  1/2  1/6

Note that the values of m exhaust all possible cases and hence the probabilities add to 1.

Frequently, it is convenient to represent all the probabilities of a random variable X by a formula. Such a formula would necessarily be a function of the numerical values x that we shall denote by f(x), g(x), r(x), and so forth. Therefore, we write f(x) = P(X = x); that is, f(3) = P(X = 3). The set of ordered pairs (x, f(x)) is called the probability function or probability distribution of the discrete random variable X.

Definition 2.4
The set of ordered pairs (x, f(x)) is a probability function, probability mass function, or probability distribution of the discrete random variable X if, for each possible outcome x,

1. f(x) ≥ 0.
2. Σ_x f(x) = 1.
3. P(X = x) = f(x).
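For a small discrete distribution the conditions of Definition 2.4 are easy to check by machine. The sketch below (illustrative; the dictionary name is not from the text) encodes the distribution of M from Example 2.2 with exact fractions and verifies both conditions.

```python
from fractions import Fraction as F

# Probability distribution of M, the number of correct helmet matches (Example 2.2)
pmf_M = {0: F(1, 3), 1: F(1, 2), 3: F(1, 6)}

assert all(p >= 0 for p in pmf_M.values())   # condition 1: f(m) >= 0
assert sum(pmf_M.values()) == 1              # condition 2: probabilities sum to 1
print(pmf_M[0])                              # P(M = 0) = 1/3, as stated in the text
```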

Example 2.3
A shipment of 8 similar microcomputers to a retail outlet contains 3 that are defective. If a school makes a random purchase of 2 of these computers, find the probability distribution for the number of defectives.

Solution. Let X be a random variable whose values x are the possible numbers of defective computers purchased by the school. Then x can be any of the numbers 0, 1, and 2. Now,

f(0) = P(X = 0) = C(3,0)C(5,2)/C(8,2) = 10/28 = 5/14,
f(1) = P(X = 1) = C(3,1)C(5,1)/C(8,2) = 15/28,
f(2) = P(X = 2) = C(3,2)C(5,0)/C(8,2) = 3/28.

Thus the probability distribution of X is

x      0      1      2
f(x)  5/14  15/28  3/28

Example 2.4
If 50% of the automobiles sold by an agency for a certain foreign car are equipped with diesel engines, find a formula for the probability distribution of the number of diesel models among the next 4 cars sold by this agency.

Solution. Since the probability of selling a diesel model or a gasoline model is 0.5, the 2^4 = 16 points in the sample space are equally likely to occur. Therefore, the denominator for all probabilities, and also for our function, will be 16. To obtain the number of ways of selling 3 diesel models, we need to consider the number of ways of partitioning 4 outcomes into two cells, with 3 diesel models assigned to one cell and 1 gasoline model assigned to the other. This can be done in C(4,3) = 4 ways. In general, the event of selling x diesel models and 4 - x gasoline models can occur in C(4,x) ways, where x can be 0, 1, 2, 3, or 4. Thus the probability distribution f(x) = P(X = x) is

f(x) = (1/16) C(4, x),   for x = 0, 1, 2, 3, 4.
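Evaluating the formula for each x gives the symmetric distribution discussed in the following paragraphs. A quick sketch (variable names are illustrative):

```python
from math import comb
from fractions import Fraction as F

diesel_pmf = {x: F(comb(4, x), 16) for x in range(5)}
print(diesel_pmf)                    # f(0) = 1/16, f(1) = 4/16, f(2) = 6/16, f(3) = 4/16, f(4) = 1/16
assert sum(diesel_pmf.values()) == 1
```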


Definition 2.5
The cumulative distribution F(x) of a discrete random variable X with probability distribution f(x) is given by

F(x) = P(X ≤ x) = Σ_{t ≤ x} f(t),   for -∞ < x < ∞.
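For a discrete random variable the cumulative distribution is a running sum of the probability function. The sketch below (illustrative) builds F(x) for the distribution of Example 2.4 and shows that F behaves as a step function between the possible values.

```python
from math import comb
from fractions import Fraction as F

f = {x: F(comb(4, x), 16) for x in range(5)}     # probability function of Example 2.4

def cdf(x):
    """F(x) = P(X <= x): sum of f(t) over all possible values t <= x."""
    return sum(p for t, p in f.items() if t <= x)

print([cdf(x) for x in range(5)])   # 1/16, 5/16, 11/16, 15/16, 1
print(cdf(2.3))                     # F is constant between jumps: F(2.3) = F(2) = 11/16
```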


It is often helpful to look at a probability distribution in graphic form. One might plot the points (x, f(x)) of Example 2.4 to obtain Figure 2.1. By joining the points to the x axis either with a dashed or with a solid line, we obtain what is commonly called a bar chart. Figure 2.1 makes it very easy to see what values of X are most likely to occur, and it also indicates a perfectly symmetric situation in this case.

Figure 2.1 Bar chart.

Instead of plotting the points (x, f(x)), we more frequently construct rectangles, as in Figure 2.2. Here the rectangles are constructed so that their bases of equal width are centered at each value x and their heights are equal to the corresponding probabilities given by f(x). The bases are constructed so as to leave no space between the rectangles. Figure 2.2 is called a probability histogram.

Since each base in Figure 2.2 has unit width, P(X = x) is equal to the area of the rectangle centered at x. Even if the bases were not of unit width, we could adjust the heights of the rectangles to give areas that would still equal the probabilities of X assuming any of its values x. This concept of using areas to represent probabilities is necessary for our consideration of the probability distribution of a continuous random variable.

The graph of the cumulative distribution of Example 2.5, which appears as a step function in Figure 2.3, is obtained by plotting the points (x, F(x)).

Figure 2.2 Probability histogram.
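A probability histogram like Figure 2.2 can be drawn in a few lines. A sketch using matplotlib (assumed to be available; it is not part of the text) for the distribution of Example 2.4:

```python
import matplotlib.pyplot as plt
from math import comb

xs = range(5)
probs = [comb(4, x) / 16 for x in xs]             # f(x) from Example 2.4

plt.bar(xs, probs, width=1.0, edgecolor="black")  # unit-width bars, so area = probability
plt.xlabel("x")
plt.ylabel("f(x)")
plt.title("Probability histogram for Example 2.4")
plt.show()
```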


Figure 2.3 Discrete cumulative distribution.

Certain probability distributions are applicable to more than one physical situation. The probability distribution of Example 2.4, for example, also applies to the random variable Y, where Y is the number of heads when a coin is tossed 4 times, or to the random variable W, where W is the number of red cards that occur when 4 cards are drawn at random from a deck in succession with each card replaced and the deck shuffled before the next drawing. Special discrete distributions that can be applied to many different experimental situations will be considered in Chapter 4.

2.3 Continuous Probability Distributions

A continuous random variable has a probability of zero of assuming exactly any of its values. Consequently, its probability distribution cannot be given in tabular form. At first this may seem startling, but it becomes more plausible when we consider a particular example. Let us discuss a random variable whose values are the heights of all people over 21 years of age. Between any two values, say 163.5 and 164.5 centimeters, or even 163.99 and 164.01 centimeters, there are an infinite number of heights, one of which is 164 centimeters. The probability of selecting a person at random who is exactly 164 centimeters tall, and not one of the infinitely large set of heights so close to 164 centimeters that you cannot humanly measure the difference, is remote, and thus we assign a probability of zero to the event. This is not the case, however, if we talk about the probability of selecting a person who is at least 163 centimeters but not more than 165 centimeters tall. Now we are dealing with an interval rather than a point value of our random variable.

We shall concern ourselves with computing probabilities for various intervals of continuous random variables such as P(a < X < b), P(W > c), and so forth. Note that when X is continuous,

P(a < X ≤ b) = P(a < X < b) + P(X = b) = P(a < X < b).


That is, it does not matter whether we include an end point of the interval or not. This is not true, though, when X is discrete.

Although the probability distribution of a continuous random variable cannot be presented in tabular form, it can have a formula. Such a formula would necessarily be a function of the numerical values of the continuous variable X and as such will be represented by the functional notation f(x). In dealing with continuous variables, f(x) is usually called the probability density function, or simply the density function, of X. Since X is defined over a continuous sample space, it is possible for f(x) to have a finite number of discontinuities. However, most density functions that have practical applications in the analysis of statistical data are continuous, and their graphs may take any of several forms, some of which are shown in Figure 2.4. Because areas will be used to represent probabilities and probabilities are positive numerical values, the density function must lie entirely above the x axis.

Figure 2.4 Typical density functions.

A probability density function is constructed so that the area under its curve bounded by the x axis is equal to 1 when computed over the range of X for which f(x) is defined. Should this range of X be a finite interval, it is always possible to extend the interval to include the entire set of real numbers by defining f(x) to be zero at all points in the extended portions of the interval. In Figure 2.5, the probability that X assumes a value between a and b is equal to the shaded area under the density function between the ordinates at x = a and x = b, and from integral calculus is given by

P(a < X < b) = ∫_a^b f(x) dx.
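Both properties, the total area of 1 and probabilities as areas, are easy to check numerically for a candidate density. The sketch below is an illustration only (the density f(x) = 2x on 0 < x < 1 is chosen for the example and is not taken from the text); it uses scipy's quadrature routine.

```python
from scipy.integrate import quad

def f(x):
    """An illustrative density: f(x) = 2x on 0 < x < 1, and 0 elsewhere."""
    return 2 * x if 0 < x < 1 else 0.0

total, _ = quad(f, 0, 1)          # area under the whole curve
p, _ = quad(f, 0.25, 0.75)        # P(0.25 < X < 0.75) as an area

print(round(total, 6))            # 1.0, so the density integrates to 1
print(round(p, 6))                # 0.5  (= 0.75**2 - 0.25**2)
```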


Figure 2.5 P(a < X < b).


Definition 2.7
The cumulative distribution F(x) of a continuous random variable X with density function f(x) is given by

F(x) = P(X ≤ x) = ∫_{-∞}^{x} f(t) dt,   for -∞ < x < ∞.
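It follows from this definition that P(a < X < b) = F(b) - F(a) and that f(x) = dF(x)/dx wherever the derivative exists. A short symbolic check with sympy, using the same illustrative density f(x) = 2x on 0 < x < 1 as above (not from the text):

```python
import sympy as sp

x, t = sp.symbols("x t", positive=True)
f = 2 * t                                    # illustrative density on 0 < t < 1

F = sp.integrate(f, (t, 0, x))               # F(x) = x**2 for 0 <= x <= 1
print(F)                                     # x**2
print(F.subs(x, sp.Rational(3, 4)) - F.subs(x, sp.Rational(1, 4)))   # P(1/4 < X < 3/4) = 1/2
print(sp.diff(F, x))                         # recovers the density: 2*x
```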


The cumulative distribution F(x) is expressed graphically in Figure 2.6. Now,

P(0 < X < 1) = F(1) - F(0),

which agrees with the result obtained by using the density function in Example 2.6.

Exercises

1. Classify the following random variables as discrete or continuous:

X: the number of automobile accidents per year in Virginia.
Y: the length of time to play 18 holes of golf.
M: the amount of milk produced yearly by a particular cow.
N: the number of eggs laid each month by a hen.
P: the number of building permits issued each month in a certain city.
Q: the weight of grain produced per acre.

2. An overseas shipment of 5 foreign automobiles contains 2 that have slight paint blemishes. If an agency receives 3 of these automobiles at random, list the elements of the sample space S using the letters B and N for "blemished" and "nonblemished," respectively; then to each sample point assign a value x of the random variable X representing the number of automobiles purchased by the agency with paint blemishes.

3. Let W be a random variable giving the number of heads minus the number of tails in three tosses of a coin. List the elements of the sample space S for the three tosses of the coin and to each sample point assign a value w of W.

4. A coin is flipped until 3 heads in succession occur. List only those elements of the sample space that require 6 or less tosses. Is this a discrete sample space? Explain.

5. Determine the value c so that each of the following functions can serve as a probability distribution of the discrete random variable X:
(a) f(x) = c(x² + 4) for x = 0, 1, 2, 3;
(b) f(x) = c C(2, x) C(3, 3 - x) for x = 0, 1, 2.

6. From a box containing 4 dimes and 2 nickels, 3 coins are selected at random without replacement. Find the probability distribution for the total T of the 3 coins. Express the probability distribution graphically as a probability histogram.

7. From a box containing 4 black balls and 2 green balls, 3 balls are drawn in succession, each ball being replaced in the box before the next draw is made. Find the probability distribution for the number of green balls.

8. Find the probability distribution of the random variable W in Exercise 3, assuming that the coin is biased so that a head is twice as likely to occur as a tail.

9. Find the probability distribution for the number of jazz records when 4 records are selected at random from a collection consisting of 5 jazz records, 2 classical records, and 3 polka records. Express your results by means of a formula.

10. Find a formula for the probability distribution of the random variable X representing the outcome when a single die is rolled once.

11. A shipment of 7 television sets contains 2 defective sets. A hotel makes a random purchase of 3 of the sets. If X is the number of defective sets purchased by the hotel, find the probability distribution of X. Express the results graphically as a probability histogram.

12. Three cards are drawn in succession from a deck without replacement. Find the probability distribution for the number of spades.

13. Find the cumulative distribution of the random variable W in Exercise 8. Using F(w), find
(a) P(W > 0);
(b) P(-1 ≤ W < 3).


decimal point, each repeated five times such that the double-digit leaves 00 through 19 are associated with stems coded by the letter a; leaves 20 through 39 are associated with stems coded by the letter b; and so forth. Thus a number such as 1.29 has a stem value of 1b and a leaf equal to 29.
(b) Set up a relative frequency distribution.

2.5 Joint Probability Distributions

Our study of random variables and their probability distributions in the preceding sections was restricted to one-dimensional sample spaces, in that we recorded outcomes of an experiment as values assumed by a single random variable. There will be situations, however, where we may find it desirable to record the simultaneous outcomes of several random variables. For example, we might measure the amount of precipitate P and volume V of gas released from a controlled chemical experiment, giving rise to a two-dimensional sample space consisting of the outcomes (p, v); or we might be interested in the hardness H and tensile strength T of cold-drawn copper, resulting in the outcomes (h, t). In a study to determine the likelihood of success in college based on high school data, one might use a three-dimensional sample space and record for each individual his or her aptitude test score, high school rank in class, and grade-point average at the end of the freshman year in college.

If X and Y are two discrete random variables, the probability distribution for their simultaneous occurrence can be represented by a function with values f(x, y) for any pair of values (x, y) within the range of the random variables X and Y. It is customary to refer to this function as the joint probability distribution of X and Y. Hence, in the discrete case,

f(x, y) = P(X = x, Y = y);

that is, the values f(x, y) give the probability that outcomes x and y occur at the same time. For example, if a television set is to be serviced and X represents the age to the nearest year of the set and Y represents the number of defective tubes in the set, then f(5, 3) is the probability that the television set is 5 years old and needs 3 new tubes.

Definition 2.8
The function f(x, y) is a joint probability distribution or probability mass function of the discrete random variables X and Y if

1. f(x, y) ≥ 0 for all (x, y).
2. Σ_x Σ_y f(x, y) = 1.
3. P(X = x, Y = y) = f(x, y).

For any region A in the xy plane, P[(X, Y) ∈ A] = Σ Σ_A f(x, y).


Example 2.8
Two refills for a ballpoint pen are selected at random from a box that contains 3 blue refills, 2 red refills, and 3 green refills. If X is the number of blue refills and Y is the number of red refills selected, find (a) the joint probability function f(x, y), and (b) P[(X, Y) ∈ A], where A is the region {(x, y) | x + y ≤ 1}.

Solution
(a) The possible pairs of values (x, y) are (0, 0), (0, 1), (1, 0), (1, 1), (0, 2), and (2, 0). Now, f(0, 1), for example, represents the probability that a red and a green refill are selected. The total number of equally likely ways of selecting any 2 refills from the 8 is C(8,2) = 28. The number of ways of selecting 1 red from 2 red refills and 1 green from 3 green refills is C(2,1)C(3,1) = 6. Hence f(0, 1) = 6/28 = 3/14. Similar calculations yield the probabilities for the other cases, which are presented in Table 2.6. Note that the probabilities sum to 1. In Chapter 3 it will become clear that the joint probability distribution of Table 2.6 can be represented by the formula

f(x, y) = C(3, x) C(2, y) C(3, 2 - x - y) / C(8, 2),

for x = 0, 1, 2; y = 0, 1, 2; 0 ≤ x + y ≤ 2.

(b) P[(X, Y) ∈ A] = P(X + Y ≤ 1) = f(0, 0) + f(0, 1) + f(1, 0) = 3/28 + 3/14 + 9/28 = 9/14.

Table 2.6 Joint Probability Distribution for Example 2.8

                      x                 Row
f(x, y)      0       1       2        Totals
y = 0       3/28    9/28    3/28      15/28
y = 1       3/14    3/14    0          3/7
y = 2       1/28    0       0          1/28
Column
Totals      5/14    15/28   3/28       1
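Table 2.6 can be generated directly from the combinatorial formula above. A short sketch assuming only the Python standard library (the helper name refill_pmf is illustrative):

```python
from math import comb
from fractions import Fraction as F

def refill_pmf(x, y):
    """Joint probability of drawing x blue and y red refills in 2 draws from 3 blue, 2 red, 3 green."""
    if x < 0 or y < 0 or x + y > 2:
        return F(0)
    return F(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))

table = {(x, y): refill_pmf(x, y) for x in range(3) for y in range(3)}
assert sum(table.values()) == 1                                 # condition 2 of Definition 2.8
print(sum(table[x, y] for (x, y) in table if x + y <= 1))       # P(X + Y <= 1) = 9/14
```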


When X and Y are continuous random variables, the joint density function f(x, y) is a surface lying above the xy plane, and P[(X, Y) ∈ A], where A is any region in the xy plane, is equal to the volume of the right cylinder bounded by the base A and the surface.

Definition 2.9
The function f(x, y) is a joint density function of the continuous random variables X and Y if

1. f(x, y) ≥ 0 for all (x, y).
2. ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1.
3. P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy for any region A in the xy plane.

Example 2.9
A candy company distributes boxes of chocolates with a mixture of creams, toffees, and nuts coated in both light and dark chocolate. For a randomly selected box, let X and Y, respectively, be the proportions of the light and dark chocolates that are creams, and suppose that the joint density function is given by

f(x, y) = (2/5)(2x + 3y),   0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
f(x, y) = 0,   elsewhere.

(a) Verify condition 2 of Definition 2.9.
(b) Find P[(X, Y) ∈ A], where A is the region {(x, y) | 0 < x < 1/2, 1/4 < y < 1/2}.
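Both parts of the example reduce to double integrals over rectangles, which can be checked numerically before working them by hand. A sketch using scipy's dblquad (an illustration, not the text's solution; dblquad integrates a function written as f(y, x), with the inner variable listed first):

```python
from scipy.integrate import dblquad

# Joint density from Example 2.9 on the unit square
f = lambda y, x: 0.4 * (2 * x + 3 * y)

total, _ = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)        # x from 0..1, y from 0..1
p, _ = dblquad(f, 0, 0.5, lambda x: 0.25, lambda x: 0.5)     # 0 < x < 1/2, 1/4 < y < 1/2

print(round(total, 6))   # 1.0, so condition 2 of Definition 2.9 holds
print(round(p, 6))       # 0.08125, i.e. P[(X, Y) in A] = 13/160
```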


(b) P[(X, Y) ∈ A] = P(0 < X < 1/2, 1/4 < Y < 1/2) = ∫_{1/4}^{1/2} ∫_{0}^{1/2} (2/5)(2x + 3y) dx dy = 13/160.


Definition 2.10
The marginal distributions of X alone and of Y alone are

g(x) = Σ_y f(x, y)   and   h(y) = Σ_x f(x, y)

for the discrete case, and

g(x) = ∫_{-∞}^{∞} f(x, y) dy   and   h(y) = ∫_{-∞}^{∞} f(x, y) dx

for the continuous case.

Example 2.10
Show that the column and row totals of Table 2.6 give the marginal distributions of X alone and of Y alone.

Solution. For the random variable X, we see that

P(X = 0) = g(0) = Σ_{y=0}^{2} f(0, y) = f(0, 0) + f(0, 1) + f(0, 2) = 3/28 + 3/14 + 1/28 = 5/14,

P(X = 1) = g(1) = Σ_{y=0}^{2} f(1, y) = f(1, 0) + f(1, 1) + f(1, 2) = 9/28 + 3/14 + 0 = 15/28,

and

P(X = 2) = g(2) = Σ_{y=0}^{2} f(2, y) = f(2, 0) + f(2, 1) + f(2, 2) = 3/28 + 0 + 0 = 3/28,

which are just the column totals of Table 2.6. In a similar manner we could show that the values of h(y) are given by the row totals. In tabular form, these marginal distributions may be written as follows:

x      0      1      2
g(x)  5/14  15/28  3/28

y      0      1     2
h(y)  15/28  3/7   1/28

Example 2.11
Find g(x) and h(y) for the joint density function of Example 2.9.

Solution. By definition,

g(x) = ∫_{-∞}^{∞} f(x, y) dy = ∫_0^1 (2/5)(2x + 3y) dy = [4xy/5 + 6y²/10]_{y=0}^{y=1} = (4x + 3)/5

for 0 ≤ x ≤ 1, and g(x) = 0 elsewhere. Similarly,

h(y) = ∫_{-∞}^{∞} f(x, y) dx = ∫_0^1 (2/5)(2x + 3y) dx = 2(1 + 3y)/5

for 0 ≤ y ≤ 1, and h(y) = 0 elsewhere.
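The same marginal densities can be obtained symbolically. A sympy sketch for the density of Example 2.9 (illustrative, not part of the text):

```python
import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)
f = sp.Rational(2, 5) * (2 * x + 3 * y)        # joint density of Example 2.9 on the unit square

g = sp.integrate(f, (y, 0, 1))                 # marginal of X: (4*x + 3)/5
h = sp.integrate(f, (x, 0, 1))                 # marginal of Y: (6*y + 2)/5
print(sp.simplify(g), sp.simplify(h))
print(sp.integrate(g, (x, 0, 1)))              # each marginal integrates to 1
```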

The fact that the marginal distributions g(x) and h(y) are indeed the probability distributions of the individual variables X and Y alone can easily be verified by showing that the conditions of Definition 2.4 or Definition 2.6 are satisfied. For example, in the continuous case

∫_{-∞}^{∞} g(x) dx = ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dy dx = 1


and

P(a < X < b) = P(a < X < b, -∞ < Y < ∞) = ∫_a^b ∫_{-∞}^{∞} f(x, y) dy dx = ∫_a^b g(x) dx.

In Section 2.1 we stated that the value x of the random variable X represents an event that is a subset of the sample space. If we use the definition of conditional probability as given in Chapter 1,

P(B | A) = P(A ∩ B) / P(A),   P(A) > 0,

where A and B are now the events defined by X = x and Y = y, respectively, then

P(Y = y | X = x) = P(X = x, Y = y) / P(X = x) = f(x, y) / g(x),   g(x) > 0,

when X and Y are discrete random variables.

It is not difficult to show that the function f(x, y)/g(x), which is strictly a function of y with x fixed, satisfies all the conditions of a probability distribution. This is also true when f(x, y) and g(x) are the joint density and marginal distribution of continuous random variables. Expressing such a probability distribution by the symbol f(y | x), we have the following definition.

Definition 2.11
Let X and Y be two random variables, discrete or continuous. The conditional distribution of the random variable Y, given that X = x, is given by

f(y | x) = f(x, y) / g(x),   g(x) > 0.

Similarly, the conditional distribution of the random variable X, given that Y = y, is given by

f(x | y) = f(x, y) / h(y),   h(y) > 0.

If one wished to find the probability that the discrete random variable X falls between a and b when it is known that the discrete variable Y = y, we evaluate

P(a < X < b | Y = y) = Σ_{a < x < b} f(x | y),


where the summation extends over all values of X between a and b. When X and Y are continuous, we evaluate

P(a < X < b | Y = y) = ∫_a^b f(x | y) dx.

Example 2.12
Referring to Example 2.8, find the conditional distribution of X, given that Y = 1, and use it to determine P(X = 0 | Y = 1).

Solution. We need to find f(x | y), where y = 1. First we find that

h(1) = Σ_{x=0}^{2} f(x, 1) = 3/14 + 3/14 + 0 = 3/7.

Now

f(x | 1) = f(x, 1)/h(1) = (7/3) f(x, 1),   x = 0, 1, 2.

Therefore,

f(0 | 1) = (7/3) f(0, 1) = (7/3)(3/14) = 1/2,
f(1 | 1) = (7/3) f(1, 1) = (7/3)(3/14) = 1/2,
f(2 | 1) = (7/3) f(2, 1) = (7/3)(0) = 0,

and the conditional distribution of X, given that Y = 1, is

x          0    1    2
f(x | 1)  1/2  1/2   0

Finally,

P(X = 0 | Y = 1) = f(0 | 1) = 1/2.

Therefore, if it is known that 1 of the 2 pen refills selected is red, we have a probability equal to 1/2 that the other refill is not blue.

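Conditioning on Y = 1 amounts to renormalizing one row of Table 2.6. The check below reuses the illustrative refill_pmf helper from the sketch after Table 2.6 (redefined here so the snippet is self-contained):

```python
from math import comb
from fractions import Fraction as F

def refill_pmf(x, y):
    if x < 0 or y < 0 or x + y > 2:
        return F(0)
    return F(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))

h1 = sum(refill_pmf(x, 1) for x in range(3))          # h(1) = 3/7
cond = {x: refill_pmf(x, 1) / h1 for x in range(3)}   # conditional distribution f(x | 1)
print(cond)                                           # f(0|1) = 1/2, f(1|1) = 1/2, f(2|1) = 0
```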
Example 2.13
Suppose that the fraction X of male runners and the fraction Y of female runners who compete in marathon races is described by the joint density function

f(x, y) = 8xy,   0 ≤ y ≤ x ≤ 1,
f(x, y) = 0,   elsewhere.

Find g(x), h(y), and f(y | x), and determine the probability that fewer than 1/8 of the women entered in a particular marathon actually finished if it is known that exactly 1/2 of the male runners completed the race.


Solution. By definition,

g(x) = ∫_{-∞}^{∞} f(x, y) dy = ∫_0^x 8xy dy = 4xy² |_{y=0}^{y=x} = 4x³

for 0 ≤ x ≤ 1, and g(x) = 0 elsewhere. Similarly,

h(y) = ∫_y^1 8xy dx = 4y(1 - y²)

for 0 ≤ y ≤ 1, and h(y) = 0 elsewhere. Now,

f(y | x) = f(x, y)/g(x) = 8xy/4x³ = 2y/x²,   0 ≤ y ≤ x,

and therefore

P(Y < 1/8 | X = 1/2) = ∫_0^{1/8} 8y dy = 1/16.


Example 2.14
Given the joint density function

f(x, y) = x(1 + 3y²)/4,   0 < x < 2, 0 < y < 1,
f(x, y) = 0,   elsewhere,

find g(x), h(y), and f(x | y), and evaluate P(1/4 < X < 1/2 | Y = 1/3).

Solution. By definition,

g(x) = ∫_0^1 x(1 + 3y²)/4 dy = x/2,   0 < x < 2,

and

h(y) = ∫_0^2 x(1 + 3y²)/4 dx = (1 + 3y²)/2,   0 < y < 1.

Therefore,

f(x | y) = f(x, y)/h(y) = [x(1 + 3y²)/4] / [(1 + 3y²)/2] = x/2,   0 < x < 2,

and

P(1/4 < X < 1/2 | Y = 1/3) = ∫_{1/4}^{1/2} (x/2) dx = 3/64.

Statistical Independence


The continuous random variables of Example 2.14 are statistically independent, since the product of the two marginal distributions gives the joint density function. This is obviously not the case, however, for the continuous variables of Example 2.13. Checking for statistical independence of discrete random variables requires a more thorough investigation, since it is possible to have the product of the marginal distributions equal to the joint probability distribution for some but not all combinations of (x, y). If you can find any point (x, y) for which f(x, y) is defined such that f(x, y) ≠ g(x)h(y), the discrete variables X and Y are not statistically independent.

Example 2.15
Show that the random variables of Example 2.8 are not statistically independent.

Solution. Let us consider the point (0, 1). From Table 2.6 we find the three probabilities f(0, 1), g(0), and h(1) to be

f(0, 1) = 3/14,
g(0) = Σ_{y=0}^{2} f(0, y) = 3/28 + 3/14 + 1/28 = 5/14,
h(1) = Σ_{x=0}^{2} f(x, 1) = 3/14 + 3/14 + 0 = 3/7.

Clearly,

f(0, 1) ≠ g(0)h(1),

and therefore X and Y are not statistically independent.
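Rather than checking a single point, the same comparison can be run over every cell of Table 2.6. A sketch reusing the illustrative refill_pmf helper:

```python
from math import comb
from fractions import Fraction as F

def refill_pmf(x, y):
    if x < 0 or y < 0 or x + y > 2:
        return F(0)
    return F(comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))

g = {x: sum(refill_pmf(x, y) for y in range(3)) for x in range(3)}   # column totals
h = {y: sum(refill_pmf(x, y) for x in range(3)) for y in range(3)}   # row totals

independent = all(refill_pmf(x, y) == g[x] * h[y] for x in range(3) for y in range(3))
print(independent)   # False: e.g. f(0, 1) = 3/14 but g(0)*h(1) = (5/14)*(3/7)
```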

All the preceding definitions concerning two random variables can be generalized to the case of n random variables. Let f(x1, x2, ..., xn) be the joint probability function of the random variables X1, X2, ..., Xn. The marginal distribution of X1, for example, is given by

g(x1) = Σ_{x2} ··· Σ_{xn} f(x1, x2, ..., xn)

for the discrete case and by

g(x1) = ∫_{-∞}^{∞} ··· ∫_{-∞}^{∞} f(x1, x2, ..., xn) dx2 dx3 ··· dxn

for the continuous case. We can now obtain joint marginal distributions such as φ(x1, x2), where

φ(x1, x2) = Σ_{x3} ··· Σ_{xn} f(x1, x2, ..., xn)   (discrete case)

φ(x1, x2) = ∫_{-∞}^{∞} ··· ∫_{-∞}^{∞} f(x1, x2, ..., xn) dx3 dx4 ··· dxn   (continuous case).


One could consider numerous conditional distributions. For example, the joint conditional distribution of X1, X2, and X3, given that X4 = x4, X5 = x5, ..., Xn = xn, is written

f(x1, x2, x3 | x4, x5, ..., xn) = f(x1, x2, ..., xn) / g(x4, x5, ..., xn),

where g(x4, x5, ..., xn) is the joint marginal distribution of the random variables X4, X5, ..., Xn.

A generalization of Definition 2.12 leads to the following definition for the mutual statistical independence of the variables X1, X2, ..., Xn.

Definition 2.13
Let X1, X2, ..., Xn be n random variables, discrete or continuous, with joint probability distribution f(x1, x2, ..., xn) and marginal distributions f1(x1), f2(x2), ..., fn(xn), respectively. The random variables X1, X2, ..., Xn are said to be mutually statistically independent if and only if

f(x1, x2, ..., xn) = f1(x1) f2(x2) ··· fn(xn)

for all (x1, x2, ..., xn) within their range.

Example 2.16
Suppose that the shelf life, in years, of a certain perishable food product packaged in cardboard containers is a random variable whose probability density function is given by

f(x) = e^{-x},   x > 0,
f(x) = 0,   elsewhere.

Let X1, X2, and X3 represent the shelf lives for three of these containers selected independently and find P(X1 < 2, 1 < X2 < 3, X3 > 2).

Solution. Since the containers were selected independently, we can assume that the random variables X1, X2, and X3 are statistically independent, having the joint probability density

f(x1, x2, x3) = f(x1) f(x2) f(x3) = e^{-x1} e^{-x2} e^{-x3} = e^{-x1 - x2 - x3}

for x1 > 0, x2 > 0, x3 > 0, and f(x1, x2, x3) = 0 elsewhere. Hence

P(X1 < 2, 1 < X2 < 3, X3 > 2) = ∫_2^∞ ∫_1^3 ∫_0^2 e^{-x1 - x2 - x3} dx1 dx2 dx3
= (1 - e^{-2})(e^{-1} - e^{-3})e^{-2}
= 0.0372.
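Because the three shelf lives are independent, the triple integral factors into a product of three one-dimensional integrals, which is easy to confirm numerically. A sketch using scipy (illustrative only):

```python
from math import exp
from scipy.integrate import quad

f = lambda x: exp(-x)          # shelf-life density for x > 0

p1, _ = quad(f, 0, 2)          # P(X1 < 2)     = 1 - e**-2
p2, _ = quad(f, 1, 3)          # P(1 < X2 < 3) = e**-1 - e**-3
p3 = exp(-2)                   # P(X3 > 2)     = e**-2 (tail integral in closed form)

print(round(p1 * p2 * p3, 4))  # 0.0372, matching Example 2.16
```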
