
1 Probability Theory

Given a sequence of events A1, A2, . . ., which we may write as {An}, in which every finite subcollection is independent, we say the sequence is a sequence of independent events. We also may abuse the terminology slightly and say that “the sequence is independent”. Similarly, we speak of independent sequences of collections of events or of Borel functions.

Notice that each pair of events within a collection of events may be independent while the collection itself is not independent.

Example 1.1 (pairwise independence)
Consider an experiment of tossing a coin twice. Let

A be “heads on the first toss”
B be “heads on the second toss”
C be “exactly one head and one tail on the two tosses”

We see immediately that any pair is independent (each event has probability 1/2 and each pairwise intersection has probability 1/4), but the three events are not independent; in fact, A ∩ B ∩ C = ∅.
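
The arithmetic is easy to verify by direct enumeration. Below is a minimal Python sketch (not from the text; the names omega, prob, A, B, and C are ours, chosen to mirror the example) that checks pairwise independence and the failure of three-way independence:

```python
from itertools import product
from fractions import Fraction

# Sample space for two tosses of a fair coin; each outcome has probability 1/4.
omega = list(product("HT", repeat=2))
prob = lambda E: Fraction(len(E), len(omega))

A = {w for w in omega if w[0] == "H"}         # heads on the first toss
B = {w for w in omega if w[1] == "H"}         # heads on the second toss
C = {w for w in omega if w.count("H") == 1}   # exactly one head and one tail

# Every pair is independent: P(X ∩ Y) = P(X)P(Y) = 1/4.
for X, Y in [(A, B), (A, C), (B, C)]:
    assert prob(X & Y) == prob(X) * prob(Y)

# But the three events are not independent: A ∩ B ∩ C = ∅,
# so P(A ∩ B ∩ C) = 0, while P(A)P(B)P(C) = 1/8.
assert A & B & C == set()
assert prob(A & B & C) != prob(A) * prob(B) * prob(C)
```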

We refer to this situation as “pairwise independent”. The phrase “mutually independent” is ambiguous and hence is best avoided. Sometimes people use the phrase “mutually independent” to try to emphasize that we are referring to independence of all events, but the phrase can also be interpreted as “pairwise independent”.

Notice that an event A is independent of itself if and only if its probability is 0 or 1: independence of A with itself requires P(A) = P(A ∩ A) = P(A)², which holds exactly when P(A) is 0 or 1.

If collections of sets that are independent are closed with respect to intersection, then the σ-fields generated by those collections are independent, as the following theorem asserts.

Theorem 1.1
Let (Ω, F, P) be a probability space and suppose Ci ⊆ F for i ∈ I are independent collections of events. If ∀i ∈ I, A, B ∈ Ci ⇒ A ∩ B ∈ Ci, then σ(Ci) for i ∈ I are independent.

Proof. Exercise.

Independence also applies to the complement of a set, as we see next.

Theorem 1.2
Let (Ω, F, P) be a probability space. Suppose A, B ∈ F are independent. Then A and Bᶜ are independent.

Proof. We have

P(A) = P(A ∩ B) + P(A ∩ Bᶜ),

and, since A and B are independent, P(A ∩ B) = P(A)P(B); hence,

P(A ∩ Bᶜ) = P(A) − P(A)P(B)
= P(A)(1 − P(B))
= P(A)P(Bᶜ),

and so A and Bᶜ are independent.
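
As an illustration (again a Python sketch using the two-coin sample space of Example 1.1, with names of our choosing, not part of the text), one can check the theorem's conclusion in that space:

```python
from itertools import product
from fractions import Fraction

omega = set(product("HT", repeat=2))          # two fair coin tosses
prob = lambda E: Fraction(len(E), len(omega))

A = {w for w in omega if w[0] == "H"}         # heads on the first toss
B = {w for w in omega if w[1] == "H"}         # heads on the second toss
Bc = omega - B                                # the complement of B

# A and B are independent, and, as Theorem 1.2 asserts, so are A and Bᶜ.
assert prob(A & B) == prob(A) * prob(B)
assert prob(A & Bc) == prob(A) * prob(Bc)
```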

In the interesting cases in which the events have equal probability, a concept closely related to independence is exchangeability. We define exchangeability in a probability space in three steps, similar to those in the definition of independence.

