Pairwise independent but not mutually independent
Certainly! Let's take the example of flipping a fair coin three times.
Let A be the event that the first flip is a head, B be the event that the second flip is a head, and C be the event that the number of heads in the three flips is odd.
Now, let's verify the pairwise independence:
The probability that the first and second flips are both heads is 1/4. Similarly, the probability of getting a head on the first flip is 1/2 and on the second flip is 1/2. Their product is 1/4. Therefore, P(A ∩ B) = P(A)P(B).
For A and C: Given that the first flip is a head, for the total number of heads to be odd, the second and third flips must together contribute an even number of heads, i.e., be both heads or both tails. This occurs with a probability of 1/2. So, P(A ∩ C) = 1/2 × 1/2 = 1/4. This is equal to P(A)P(C).
Similarly, for B and C: Given that the second flip is a head, the first and third flips must together contribute an even number of heads. The calculations will be the same as for A and C, thus showing P(B ∩ C) = P(B)P(C).
However, the three events A, B, and C are not mutually independent. To see why, consider:
Given that the first two flips are heads (events A and B), for C to occur (odd number of heads), the third flip must also be heads. This has a probability of 1/2. So, P(A ∩ B ∩ C) = 1/8. But, P(A)P(B)P(C) = 1/2 × 1/2 × 1/2 = 1/8 as well.
Oops, it seems the three events are actually mutually independent based on the above verification! My apologies. Let me correct this:
For C to occur (an odd number of heads in three flips), we can have:
- H T T
- T H T
- T T H
- H H H
So, P(C) = 4/8 = 1/2.
Now, given A and B both occur (both first and second flips are heads), the only sequence that satisfies this and C is H H H, which has a probability of 1/8.
So P(A ∩ B ∩ C) = 1/8.
However, P(A)P(B)P(C) = 1/2 × 1/2 × 1/2 = 1/8 as well.
So, A, B, and C are actually mutually independent. I made an error in the initial analysis.
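This corrected conclusion can be double-checked by brute force. Here is a minimal sketch (plain Python, enumerating the 8 equally likely outcomes of three fair flips; all names are illustrative) that verifies every pairwise product and the triple product:

```python
from itertools import product

# Enumerate the 8 equally likely outcomes of three fair coin flips.
outcomes = list(product("HT", repeat=3))

def prob(event):
    """Probability of an event (a predicate on outcomes) under the uniform measure."""
    return sum(1 for w in outcomes if event(w)) / len(outcomes)

A = lambda w: w[0] == "H"            # first flip is a head
B = lambda w: w[1] == "H"            # second flip is a head
C = lambda w: w.count("H") % 2 == 1  # odd number of heads

def both(e, f):
    return lambda w: e(w) and f(w)

all3 = lambda w: A(w) and B(w) and C(w)

# Pairwise products: each side equals 1/4.
print(prob(both(A, B)), prob(A) * prob(B))   # 0.25 0.25
print(prob(both(A, C)), prob(A) * prob(C))   # 0.25 0.25
print(prob(both(B, C)), prob(B) * prob(C))   # 0.25 0.25

# Triple product: 1/8 on both sides, so A, B, C really are mutually independent.
print(prob(all3), prob(A) * prob(B) * prob(C))  # 0.125 0.125
```

Exhaustive enumeration is exact here, so the equalities hold with no floating-point slack.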
Here's an actual example that fits the requirement:
Consider a standard deck of cards from which we draw two cards with replacement, so that the two draws are independent.
Event A: Drawing a red card on the first draw. Event B: Drawing a red card on the second draw. Event C: The two drawn cards have the same color.
Now, let's verify pairwise independence:
P(A ∩ B) is the probability that both the first and second cards drawn are red. This is 1/2 × 1/2 = 1/4. On the other hand, P(A)P(B) is 1/2 × 1/2 = 1/4. These two values are the same, so A and B are independent.
P(A ∩ C) is the probability that the first card is red and both cards are the same color, which means both cards are red. This is 1/4. P(A)P(C) is 1/2 × 1/2 = 1/4, since P(C) = P(both red) + P(both black) = 1/4 + 1/4 = 1/2. Again, these two values are the same, so A and C are independent.
B and C will have the same relationship as A and C, so P(B ∩ C) = P(B)P(C).
However, for mutual independence:
P(A ∩ B ∩ C) is the probability that the first card is red, the second card is red, and both cards are the same color. This is just P(both red) = 1/4.
But P(A)P(B)P(C) is 1/2 × 1/2 × 1/2 = 1/8, which is different from the above.
So, the events A, B, and C are pairwise independent but not mutually independent.
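As a sanity check on these numbers, the following sketch (assuming the with-replacement model, where each draw is red or black with probability 1/2) enumerates the four equally likely color pairs:

```python
from itertools import product

# With replacement, each draw is red ("R") or black ("B") with probability 1/2,
# independently, so the four color pairs are equally likely.
outcomes = list(product("RB", repeat=2))

def prob(event):
    return sum(1 for w in outcomes if event(w)) / len(outcomes)

A = lambda w: w[0] == "R"   # first card red
B = lambda w: w[1] == "R"   # second card red
C = lambda w: w[0] == w[1]  # both cards the same color

# Pairwise: every intersection has probability 1/4 = 1/2 * 1/2.
print(prob(lambda w: A(w) and B(w)), prob(A) * prob(B))  # 0.25 0.25
print(prob(lambda w: A(w) and C(w)), prob(A) * prob(C))  # 0.25 0.25
print(prob(lambda w: B(w) and C(w)), prob(B) * prob(C))  # 0.25 0.25

# Triple: P(A ∩ B ∩ C) = 1/4, but P(A)P(B)P(C) = 1/8.
print(prob(lambda w: A(w) and B(w) and C(w)),
      prob(A) * prob(B) * prob(C))  # 0.25 0.125
```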
To begin, let's clarify what an indicator function is. If A is an event in a probability space, then the indicator function 1_A of the event is defined as: 1_A(ω) = 1 if ω ∈ A, and 1_A(ω) = 0 otherwise.
We want to prove that the indicator functions of events A, B, and C are pairwise independent, but not mutually independent. We'll use the example of drawing two cards (with replacement) from a deck as mentioned previously.
Let A be the event that a red card is drawn first. Let B be the event that a red card is drawn second. Let C be the event that two cards of the same color are drawn.
Pairwise Independence:
a. 1_A and 1_B: We can see that E[1_A · 1_B] = P(A ∩ B) = 1/4 = E[1_A] · E[1_B]. (For 0/1-valued random variables, this equality is exactly pairwise independence.)
b. 1_A and 1_C: Again, E[1_A · 1_C] = P(A ∩ C) = 1/4 = E[1_A] · E[1_C].
c. 1_B and 1_C will have a similar relationship to (b): E[1_B · 1_C] = P(B ∩ C) = 1/4 = E[1_B] · E[1_C].
Mutual Independence:
Now, let's consider all three: E[1_A · 1_B · 1_C] = P(A ∩ B ∩ C) = 1/4, while E[1_A] · E[1_B] · E[1_C] = 1/2 × 1/2 × 1/2 = 1/8. These two quantities are not equal.
Thus, 1_A, 1_B, and 1_C are pairwise independent but not mutually independent.
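The expectation calculations above can be reproduced directly. This sketch builds the three indicator functions as 0/1-valued maps on the four equally likely color pairs and compares expectations of products against products of expectations (names like `ind_A` and `E` are just illustrative choices):

```python
from itertools import product
from math import prod

# The four equally likely color pairs from two with-replacement draws.
outcomes = list(product("RB", repeat=2))

# Indicator functions as 0/1-valued maps on the sample space.
ind_A = {w: int(w[0] == "R") for w in outcomes}
ind_B = {w: int(w[1] == "R") for w in outcomes}
ind_C = {w: int(w[0] == w[1]) for w in outcomes}

def E(*fs):
    """Expectation of the product of the given indicators (uniform measure)."""
    return sum(prod(f[w] for f in fs) for w in outcomes) / len(outcomes)

# Pairwise: E[1_A 1_B] = E[1_A] E[1_B], and likewise for the other two pairs.
print(E(ind_A, ind_B), E(ind_A) * E(ind_B))  # 0.25 0.25
print(E(ind_A, ind_C), E(ind_A) * E(ind_C))  # 0.25 0.25
print(E(ind_B, ind_C), E(ind_B) * E(ind_C))  # 0.25 0.25

# All three: E[1_A 1_B 1_C] = 1/4, but E[1_A] E[1_B] E[1_C] = 1/8.
print(E(ind_A, ind_B, ind_C), E(ind_A) * E(ind_B) * E(ind_C))  # 0.25 0.125
```

Note that `math.prod` requires Python 3.8 or later.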
Absolutely! Let's use two coin flips to define events A, B, and C that are pairwise independent but not mutually independent.
Consider flipping two fair coins, and let the sample space be given by {HH, HT, TH, TT}.
Let A be the event that the first coin is a Head.
Let B be the event that the second coin is a Head.
Let C be the event that the two coin flips have the same result (either both Heads or both Tails).
Pairwise Independence:
a. A and B: P(A ∩ B) = P({HH}) = 1/4 and P(A)P(B) = 1/2 × 1/2 = 1/4. So, P(A ∩ B) = P(A)P(B).
b. A and C: P(A ∩ C) = P({HH}) = 1/4 and P(A)P(C) = 1/2 × 1/2 = 1/4, since P(C) = P({HH, TT}) = 1/2. So, P(A ∩ C) = P(A)P(C).
c. B and C: P(B ∩ C) = P({HH}) = 1/4 and P(B)P(C) = 1/2 × 1/2 = 1/4. So, P(B ∩ C) = P(B)P(C).
From the above, it's clear that the events A, B, and C are pairwise independent.
Mutual Independence:
Clearly, P(A ∩ B ∩ C) = P({HH}) = 1/4, but P(A)P(B)P(C) = 1/2 × 1/2 × 1/2 = 1/8.
Therefore, while A, B, and C are pairwise independent, they are not mutually independent.
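The same conclusion can be reached by simulation rather than enumeration. The following Monte Carlo sketch (seeded for reproducibility; the sample size and variable names are arbitrary choices) estimates the relevant probabilities from 100,000 simulated pairs of flips:

```python
import random

random.seed(0)   # fixed seed so the run is reproducible
N = 100_000

counts = {"A": 0, "B": 0, "C": 0, "AB": 0, "ABC": 0}
for _ in range(N):
    a = random.random() < 0.5  # first coin is Heads
    b = random.random() < 0.5  # second coin is Heads
    c = (a == b)               # the two flips agree
    counts["A"] += a
    counts["B"] += b
    counts["C"] += c
    counts["AB"] += a and b
    counts["ABC"] += a and b and c

est = {k: v / N for k, v in counts.items()}
# P(A ∩ B) and P(A)P(B) should both come out near 1/4,
# while the estimate of P(A ∩ B ∩ C) also sits near 1/4,
# well above P(A)P(B)P(C) ≈ 1/8.
print(est["AB"], est["A"] * est["B"])
print(est["ABC"], est["A"] * est["B"] * est["C"])
```

With 100,000 samples the estimates are within about a percentage point of the exact values, enough to see the 1/4 vs. 1/8 gap clearly.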
Yes, events A_1, A_2, …, A_n (for n ≥ 2) are mutually independent if and only if their indicator functions 1_{A_1}, 1_{A_2}, …, 1_{A_n} are mutually independent. Let's break this down:
Direction (Events are independent implies their indicator functions are independent):
Suppose the events are mutually independent. This means for any subset of these events, say A_{i_1}, …, A_{i_k}, we have: P(A_{i_1} ∩ ⋯ ∩ A_{i_k}) = P(A_{i_1}) ⋯ P(A_{i_k}).
Now, consider their indicator functions, 1_{A_1}, …, 1_{A_n}. For these indicator functions to be mutually independent, the following should hold for every subset and every choice of values x_{i_1}, …, x_{i_k} ∈ {0, 1}: P(1_{A_{i_1}} = x_{i_1}, …, 1_{A_{i_k}} = x_{i_k}) = P(1_{A_{i_1}} = x_{i_1}) ⋯ P(1_{A_{i_k}} = x_{i_k}).
Since {1_{A_i} = 1} = A_i and {1_{A_i} = 0} = A_i^c (the complement of A_i), and mutual independence of events is preserved when any of them are replaced by their complements, the above equation is equivalent to the mutual independence of the events. Thus, if the events are mutually independent, then their indicator functions are also mutually independent.
Direction (Indicator functions are independent implies the events are independent):
If the indicator functions are mutually independent, then for any subset of these indicator functions: P(1_{A_{i_1}} = 1, …, 1_{A_{i_k}} = 1) = P(1_{A_{i_1}} = 1) ⋯ P(1_{A_{i_k}} = 1).
This translates to: P(A_{i_1} ∩ ⋯ ∩ A_{i_k}) = P(A_{i_1}) ⋯ P(A_{i_k}),
which is exactly the factorization required for mutual independence.
Since the above is true for any subset of the events, all the events are mutually independent.
Thus, the statement is true: events A_1, …, A_n are mutually independent if and only if their indicator functions are mutually independent.
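To see the equivalence in action on a concrete space, this sketch takes the earlier three-flip events (which turned out to be mutually independent) and checks that the joint distribution of the indicator vector (1_A, 1_B, 1_C) factorizes into its marginals at every 0/1 value combination, which is precisely the mutual-independence condition for the indicators:

```python
from itertools import product

# Three fair flips; A, B, C (C = odd number of heads) are mutually independent.
outcomes = list(product("HT", repeat=3))

A = lambda w: w[0] == "H"
B = lambda w: w[1] == "H"
C = lambda w: w.count("H") % 2 == 1

def prob(pred):
    return sum(1 for w in outcomes if pred(w)) / len(outcomes)

# Compare the joint pmf of the indicator vector (1_A, 1_B, 1_C) with the
# product of its marginals at every 0/1 value combination.
for a, b, c in product((0, 1), repeat=3):
    joint = prob(lambda w: (int(A(w)), int(B(w)), int(C(w))) == (a, b, c))
    marginals = (prob(lambda w: int(A(w)) == a)
                 * prob(lambda w: int(B(w)) == b)
                 * prob(lambda w: int(C(w)) == c))
    assert joint == marginals  # every combination factorizes: both sides are 1/8

print("joint pmf of (1_A, 1_B, 1_C) factorizes at all 8 value combinations")
```

Each of the 8 indicator-value combinations pins down a unique outcome (the first two flips directly, the third via the parity constraint), so both sides equal 1/8 every time.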