# Coin Toss Probability Setup

Let's break this down step by step.

**1. Probability Space:**

A probability space consists of three main components:

- A sample space $\Omega$,
- A sigma-algebra $\mathcal{F}$,
- A probability measure $P$.

For a biased coin, the sample space consists of the possible outcomes of a single coin toss. So, $\Omega = \{H, T\}$ where $H$ is heads and $T$ is tails.

Given that we're tossing the coin $n$ times, the sample space for $n$ tosses would be the set of all $n$-length sequences of $H$ and $T$. So, the sample space for $n$ tosses is: $\Omega^n = \{ (x_1, x_2, \ldots, x_n) \mid x_i \in \{H, T\} \ \forall \ i \in \{1, 2, \ldots, n\} \}$
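As a quick sketch (in Python, with an illustrative $n = 3$), the sample space $\Omega^n$ can be enumerated directly:

```python
from itertools import product

# Enumerate the sample space Omega^n for n = 3 tosses.
n = 3
omega_n = list(product("HT", repeat=n))

print(len(omega_n))  # 2**3 = 8 sequences
print(omega_n[0])    # ('H', 'H', 'H')
```

Each element of `omega_n` is one $n$-length sequence of `H` and `T`, and there are $2^n$ of them in total.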

The sigma-algebra $\mathcal{F}$ on $\Omega$ is the power set of $\Omega$, i.e., the set of all subsets of $\Omega$. Since $\Omega$ has only two elements, this is simply $\mathcal{F} = \{\emptyset, \{H\}, \{T\}, \Omega\}$. For $n$ tosses, $\mathcal{F}$ is the power set of $\Omega^n$.

Lastly, $P$ is the probability measure. Since the coin is biased, let $p$ be the probability that the coin shows heads, so $1-p$ is the probability that it shows tails. For a single toss:

$P(H) = p, \qquad P(T) = 1 - p.$

For $n$ tosses, assuming the tosses are independent, the probability of a specific sequence is the product of the probabilities of each toss outcome. For instance, the probability of the sequence $(H, T, H)$ in 3 tosses is $p(1-p)p = p^2(1-p)$.
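A minimal sketch of this product rule, using an illustrative bias of $p = 0.7$ (the function name `seq_prob` is just a label for this example):

```python
p = 0.7  # illustrative bias toward heads

def seq_prob(seq, p):
    """Product of per-toss probabilities: p for each H, 1 - p for each T."""
    prob = 1.0
    for x in seq:
        prob *= p if x == "H" else 1 - p
    return prob

# The sequence (H, T, H) has probability p * (1-p) * p.
print(seq_prob(("H", "T", "H"), p))  # ≈ 0.147
```

With $p = 0.7$ this gives $0.7 \times 0.3 \times 0.7 = 0.147$, matching the hand computation above.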

**2. Random Variable for the $i$-th toss:**

A random variable is a function that assigns a value (conventionally, a real number) to each outcome in a sample space.

Let's define $X_i: \Omega^n \to \{H, T\}$ to be the random variable that gives the result of the $i$-th toss. For any sequence $\omega = (x_1, x_2, \ldots, x_n)$ in $\Omega^n$, the value of the random variable $X_i$ at $\omega$ is $x_i$.

Formally, for $\omega = (x_1, x_2, \ldots, x_n)$ in $\Omega^n$, $X_i(\omega) = x_i$ for $i \in \{1, 2, \ldots, n\}$.

It's worth noting that if you'd like the random variable to map to real numbers (which is the usual convention), you can define $X_i$ as an indicator:

$X_i(\omega) = \begin{cases} 1 & \text{if } x_i = H, \\ 0 & \text{if } x_i = T, \end{cases}$

where 1 represents heads and 0 represents tails.
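The coordinate projection $X_i$ and its real-valued 0/1 version can be sketched as follows (the helper names `X` and `X_indicator` are hypothetical, chosen for this illustration):

```python
def X(i, omega):
    """Result of the i-th toss (1-indexed) for the outcome sequence omega."""
    return omega[i - 1]

def X_indicator(i, omega):
    """Real-valued convention: 1 if the i-th toss is heads, 0 if tails."""
    return 1 if omega[i - 1] == "H" else 0

omega = ("H", "T", "H")
print(X(2, omega))            # 'T'
print(X_indicator(2, omega))  # 0
```

Both functions simply pick out the $i$-th coordinate of the sequence; the indicator version just re-labels the two outcomes as numbers.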

**3. Probability Measure for $n$ Tosses:**

Given a biased coin with the probability of heads being $p$ and tails being $1-p$, let's define the probability measure $P$ on the sample space $\Omega^n$ for $n$ coin tosses.

For any specific sequence $\omega = (x_1, x_2, \ldots, x_n)$ in $\Omega^n$, where each $x_i$ can be either $H$ (heads) or $T$ (tails), the probability measure $P$ is given by:

$P(\omega) = p^{\text{number of H in } \omega} \times (1-p)^{\text{number of T in } \omega}$

Or, in a more concise mathematical notation:

$P(\omega) = p^{k} \times (1-p)^{n-k}$

where $k$ is the number of occurrences of $H$ in the sequence $\omega$.
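As a sanity check, these probabilities should sum to 1 over all of $\Omega^n$. A short sketch with illustrative values ($p = 0.7$, $n = 4$):

```python
from itertools import product

# Sanity check: P(omega) = p**k * (1-p)**(n-k) sums to 1 over Omega^n.
p, n = 0.7, 4  # illustrative values

total = 0.0
for omega in product("HT", repeat=n):
    k = omega.count("H")              # number of heads in the sequence
    total += p**k * (1 - p)**(n - k)

print(total)  # 1.0 (up to floating-point error)
```

This confirms that $P$ as defined is indeed a probability measure on the finite sample space $\Omega^n$.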

Given that $X_i$ is the random variable for the outcome of the $i$-th toss in the real-valued convention, where $X_i = 1$ if the $i$-th toss is heads and $X_i = 0$ otherwise, the sum $S = X_1 + X_2 + \ldots + X_n$ counts the total number of heads in $n$ coin tosses. Because the tosses are independent and identically distributed, $S$ follows a binomial distribution.

The binomial distribution gives the probability of having $k$ successes (in this context, $k$ heads) in $n$ independent Bernoulli trials (coin tosses), each with the same probability $p$ of success.

Thus, the distribution of $S$ is $\text{Binomial}(n, p)$, and the probability mass function (PMF) of $S$ is given by:

$P(S=k) = \binom{n}{k} p^k (1-p)^{n-k}$

Where:

- $\binom{n}{k}$ is the binomial coefficient, representing the number of ways to choose $k$ successes from $n$ trials.
- $p^k$ is the probability of getting $k$ heads.
- $(1-p)^{n-k}$ is the probability of getting $n-k$ tails.

Here, $k$ can range from 0 to $n$, inclusive.
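The PMF can be cross-checked against brute-force enumeration of $\Omega^n$ (illustrative values $p = 0.7$, $n = 4$; the name `binom_pmf` is just a label for this sketch):

```python
from itertools import product
from math import comb

p, n = 0.7, 4  # illustrative values

def binom_pmf(k, n, p):
    """P(S = k) = C(n, k) * p**k * (1-p)**(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# For each k, sum P(omega) over all sequences with exactly k heads
# and compare with the closed-form binomial PMF.
for k in range(n + 1):
    brute = sum(
        p**seq.count("H") * (1 - p)**(n - seq.count("H"))
        for seq in product("HT", repeat=n)
        if seq.count("H") == k
    )
    assert abs(binom_pmf(k, n, p) - brute) < 1e-12
```

The brute-force sum groups the $\binom{n}{k}$ equally likely sequences with exactly $k$ heads, which is precisely where the binomial coefficient in the PMF comes from.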
