# Computing PGF from PMF

Certainly! The probability generating function (PGF) $G(z)$ of a discrete random variable $X$ with probability mass function (pmf) $p(k)$ is given by: $G(z) = E(z^X) = \sum_{k=0}^{\infty} p(k)z^k$

Let's compute the PGFs for a couple of well-known discrete probability distributions:

**Bernoulli Distribution**: If $X$ is a Bernoulli random variable with success probability $p$, then its pmf is: $p(0) = 1-p$ and $p(1) = p$.

The PGF is: $G(z) = p(0)z^0 + p(1)z^1 = (1-p) + pz$

**Binomial Distribution**: If $X$ follows a binomial distribution with parameters $n$ (number of trials) and $p$ (success probability), then its pmf is: $p(k) = \binom{n}{k} p^k (1-p)^{n-k}$ for $k = 0, 1, 2, \ldots, n$.

The PGF is: $G(z) = \sum_{k=0}^{n} \binom{n}{k} p^k (1-p)^{n-k} z^k = \sum_{k=0}^{n} \binom{n}{k} (pz)^k (1-p)^{n-k}$

By the binomial theorem, $\sum_{k=0}^{n} \binom{n}{k} (pz)^k (1-p)^{n-k} = ((1-p) + pz)^n$, so the PGF for a binomial random variable is: $G(z) = ((1-p) + pz)^n$

**Geometric Distribution**: If $X$ follows a geometric distribution with success probability $p$, its pmf is: $p(k) = (1-p)^{k-1} p$ for $k = 1, 2, \ldots$

The PGF is: $G(z) = \sum_{k=1}^{\infty} (1-p)^{k-1} p z^k$

To find this sum, factor out $pz$ and recognize a geometric series: $G(z) = pz \sum_{k=1}^{\infty} ((1-p)z)^{k-1} = pz \sum_{j=0}^{\infty} ((1-p)z)^j$

Using the formula for the sum of an infinite geometric series: $G(z) = \frac{pz}{1-(1-p)z}$, valid when $|(1-p)z| < 1$.
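These closed forms can be sanity-checked numerically. Here is a minimal sketch (the function names are illustrative, not from any library): we sum each pmf series directly and compare against the closed forms $((1-p)+pz)^n$ and $\frac{pz}{1-(1-p)z}$.

```python
from math import comb

def binomial_pgf_series(n, p, z):
    # G(z) = sum_{k=0}^{n} C(n,k) p^k (1-p)^(n-k) z^k  (finite sum, exact)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) * z**k for k in range(n + 1))

def geometric_pgf_series(p, z, terms=200):
    # G(z) = sum_{k=1}^{inf} (1-p)^(k-1) p z^k, truncated; fine for |(1-p)z| < 1
    return sum((1 - p)**(k - 1) * p * z**k for k in range(1, terms + 1))

n, p, z = 5, 0.3, 0.7
assert abs(binomial_pgf_series(n, p, z) - ((1 - p) + p * z)**n) < 1e-12
assert abs(geometric_pgf_series(p, z) - p * z / (1 - (1 - p) * z)) < 1e-9
```

Here $|(1-p)z| = 0.49$, so the truncation error of the geometric series is negligible after 200 terms.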

These are just a few examples. Many other discrete probability distributions have their corresponding PGFs derived from their pmfs.

Certainly! Let's consider the Negative Binomial distribution.

The Negative Binomial distribution describes the number of failures before the $r^{th}$ success in a series of Bernoulli trials. The pmf is given by: $p(k) = \binom{k+r-1}{k} p^r (1-p)^k$ where $k = 0, 1, 2, \ldots$ is the number of failures, $r$ is the required number of successes, and $p$ is the probability of success.

The probability generating function (PGF) for the Negative Binomial distribution is: $G(z) = \sum_{k=0}^{\infty} \binom{k+r-1}{k} p^r (1-p)^k z^k$

To derive this PGF, we'll use a useful trick. The number of failures before the first success follows a geometric distribution on $k = 0, 1, 2, \ldots$ with pmf $p(1-p)^k$; note that this variant starts at $0$, unlike the trial-counting version above. Its PGF is: $G_1(z) = \sum_{k=0}^{\infty} p(1-p)^k z^k = \frac{p}{1-(1-p)z}$

Now, the Negative Binomial can be thought of as the sum of $r$ independent geometrically distributed random variables (each describing the number of failures before a success). So, the PGF for the Negative Binomial distribution is just the product of $r$ such PGFs: $G(z) = \left(\frac{p}{1-(1-p)z}\right)^r$

That's the PGF for the Negative Binomial distribution!
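The product formula can be checked numerically, a minimal sketch (the helper name is illustrative): the truncated pmf series should match $\left(\frac{p}{1-(1-p)z}\right)^r$ whenever $|(1-p)z| < 1$, and evaluating at $z = 1$ should give $1$ since the pmf sums to one.

```python
from math import comb

def negbin_pgf_series(r, p, z, terms=500):
    # G(z) = sum_{k>=0} C(k+r-1, k) p^r (1-p)^k z^k, truncated
    return sum(comb(k + r - 1, k) * p**r * (1 - p)**k * z**k for k in range(terms))

r, p, z = 3, 0.4, 0.8
assert abs(negbin_pgf_series(r, p, z) - (p / (1 - (1 - p) * z))**r) < 1e-9
assert abs(negbin_pgf_series(r, p, 1.0) - 1.0) < 1e-9  # probabilities sum to 1
```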

To obtain the pmf from a given PGF, you differentiate the PGF and then evaluate the derivatives at $z = 0$.

Given the probability generating function $G(z)$ of a discrete random variable $X$, the pmf $p(k)$ is given by: $p(k) = \frac{1}{k!}\frac{d^k}{dz^k} G(z) \Bigg|_{z=0}$

This is because the PGF is the power series $G(z) = \sum_{k=0}^{\infty} p(k)z^k$, so $p(k)$ is exactly its $k^{th}$ Taylor coefficient about $z = 0$.

Let's look at an example to understand this:

### Example: Geometric Distribution

Recall the PGF for a geometric distribution counting the number of failures before the first success (success probability $p$, support $k = 0, 1, 2, \ldots$) is: $G(z) = \frac{p}{1-(1-p)z}$

To find the pmf $p(k)$, we need the $k^{th}$ derivative evaluated at $z = 0$.

The first derivative is: $G'(z) = \frac{p(1-p)}{(1-(1-p)z)^2}$

The second derivative is: $G''(z) = \frac{2p(1-p)^2}{(1-(1-p)z)^3}$

In general, recognizing the pattern, we can express the $k^{th}$ derivative as: $\frac{d^k}{dz^k} G(z) = \frac{k!\, p (1-p)^k}{(1-(1-p)z)^{k+1}}$

Evaluating the $k^{th}$ derivative at $z = 0$ gives: $p(k) = \frac{1}{k!} \cdot \frac{k!\, p (1-p)^k}{1^{k+1}} = p(1-p)^k$

which is indeed the pmf for this geometric distribution!
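The Taylor coefficients can also be extracted without any symbolic differentiation. A minimal sketch (the function name is illustrative): multiplying through by the denominator, $(1-(1-p)z)\,G(z) = p$, and matching powers of $z$ gives the recurrence $c_0 = p$, $c_k = (1-p)c_{k-1}$, which should reproduce the pmf $p(1-p)^k$.

```python
def geometric_pmf_from_pgf(p, n_terms=6):
    # Coefficients of G(z) = p / (1 - (1-p) z) from (1 - (1-p) z) G(z) = p:
    # constant term gives c_0 = p; power z^k gives c_k - (1-p) c_{k-1} = 0.
    coeffs = [p]
    for _ in range(1, n_terms):
        coeffs.append((1 - p) * coeffs[-1])
    return coeffs

p = 0.3
for k, c in enumerate(geometric_pmf_from_pgf(p)):
    assert abs(c - p * (1 - p)**k) < 1e-12
```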

### Notes:

- Differentiating to obtain the pmf from the PGF is especially useful when the PGF is available in a neat closed form.
- This method can be used for other distributions, but keep in mind that sometimes differentiating explicitly can become cumbersome. Some distributions might require using the general formula or recognizing the pattern of the derivatives as we did with the geometric distribution.

A common pitfall: the coefficient extraction for the pmf from the PGF is done by evaluating the $k^{th}$ derivative of the PGF at $z = 0$, not $z = 1$. Evaluating derivatives at $z = 1$ instead yields the factorial moments; for instance, $G'(1) = E(X)$.

Given a PGF $G(z)$, the pmf $p(k)$ is: $p(k) = \frac{1}{k!} \frac{d^k}{dz^k} G(z) \Bigg|_{z=0}$

To see this with the geometric distribution above, whose PGF is $G(z) = \frac{p}{1-(1-p)z}$:

The first derivative is: $G'(z) = \frac{p(1-p)}{(1-(1-p)z)^2}$

Evaluating $G'(0)$ gives $p(1-p)$, which is $p(1)$ for this geometric distribution.

Similarly, $\frac{1}{2!} G''(0)$ gives $p(2)$, and so on.
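The contrast between the two evaluation points can be seen in a short sketch (helper name is illustrative), using the failures-counting geometric with pmf $p(k) = p(1-p)^k$: the first derivative at $z = 0$ recovers $p(1)$, while at $z = 1$ it gives the mean $E(X) = \frac{1-p}{p}$.

```python
def gprime(p, z):
    # G'(z) = p (1-p) / (1 - (1-p) z)^2 for the failures-counting geometric
    return p * (1 - p) / (1 - (1 - p) * z)**2

p = 0.25
assert abs(gprime(p, 0) - p * (1 - p)) < 1e-12   # G'(0) = p(1)
assert abs(gprime(p, 1) - (1 - p) / p) < 1e-12   # G'(1) = E(X), a moment, not a probability
```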

Certainly, here's a basic structure for LaTeX Beamer slides using the `Hannover` theme:

```latex
\documentclass{beamer}
\usetheme{Hannover}
\usecolortheme{orchid}
\usefonttheme{serif}

\title{From PGF to PMF}
\author{ChatGPT}
\date{\today}

\begin{document}

\begin{frame}
  \titlepage
\end{frame}

\begin{frame}{Outline}
  \tableofcontents
\end{frame}

\section{Introduction}
\begin{frame}{Introduction to PGF and PMF}
  \begin{itemize}
    \item PGF (Probability Generating Function) represents the probabilities of a discrete random variable in a generating series.
    \item PMF (Probability Mass Function) gives the probability that a discrete random variable is equal to a particular value.
    \item We can move from PGF to PMF through differentiation.
  \end{itemize}
\end{frame}

\section{Geometric Distribution}
\begin{frame}{PGF of Geometric Distribution}
  For a geometric distribution with success probability \( p \):
  \[ G(z) = \frac{p}{1-(1-p)z} \]
\end{frame}

\begin{frame}{Deriving the PMF}
  To find \( p(k) \) from the PGF:
  \[ p(k) = \frac{1}{k!} \frac{d^k}{dz^k} G(z) \Bigg|_{z=0} \]
\end{frame}

\begin{frame}{PMF of Geometric Distribution}
  For the geometric distribution:
  \begin{align*}
    G'(z) &= \frac{p(1-p)}{(1-(1-p)z)^2} \\
    p(1) &= G'(0) = p(1-p)
  \end{align*}
  Higher derivatives give \( p(2) \), \( p(3) \), and so on.
\end{frame}

\section{Conclusion}
\begin{frame}{Conclusion}
  \begin{itemize}
    \item By using differentiation, we can extract the PMF from the PGF.
    \item This technique provides a powerful tool for understanding the behavior of discrete random variables.
  \end{itemize}
\end{frame}

\end{document}
```

You can copy this code into a LaTeX editor that supports Beamer (e.g., Overleaf) and compile to produce the slides. Adjustments, including adding more content or tweaking the styling, can be made as per your needs.