#analysis/probability-statistics
Consider a [[binomial experiment]]. Define the [[random variable]] $X$ to be the number of successes of this [[experiment]] in $n$ trials.
## Probability of exactly $k$ successful [[experiment]]s
**Which** $k$ experiments? There are $\binom{n}{k}$ ways to choose them. Then by [[independent|independence]] of the trials, $P(X=k) = \binom{n}{k}p^k(1-p)^{n-k}$.
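As a quick sanity check of the formula (a sketch using only the Python standard library; the function name `binomial_pmf` is my own):

```python
from math import comb

def binomial_pmf(n: int, k: int, p: float) -> float:
    """P(X = k) for X ~ Binomial(n, p): choose which k trials succeed,
    then multiply the independent per-trial probabilities."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# The probabilities over k = 0..n form a distribution, so they sum to 1.
n, p = 10, 0.3
total = sum(binomial_pmf(n, k, p) for k in range(n + 1))
```

The sum-to-one check works because $\sum_k \binom{n}{k} p^k (1-p)^{n-k} = (p + 1 - p)^n = 1$ by the binomial theorem.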
## [[Expectation]] of a [[binomial random variable]]
Decompose $X$ into [[indicator random variable]]s ([[bernoulli random variable|Bernoulli random variables]]): $X = X_1 + \dots + X_n$, where $X_i = \begin{cases} 1 & \text{experiment $i$ succeeds} \\ 0 & \text{otherwise} \end{cases}.$
Then by [[linear map|linearity]] of [[expectation]], $E(X) = \sum_{i=1}^n E(X_i) = \sum_{i=1}^n P(X_i=1) = \sum_{i=1}^n p = np.$
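The indicator decomposition can be mirrored in a short simulation (a sketch; parameters and tolerances are my own choices):

```python
import random

def simulate_binomial_mean(n: int, p: float, trials: int = 100_000) -> float:
    """Estimate E(X) by building each draw of X as a sum of n Bernoulli
    indicators X_1 + ... + X_n, exactly as in the decomposition above."""
    random.seed(0)  # fixed seed for reproducibility
    total = 0
    for _ in range(trials):
        total += sum(1 for _ in range(n) if random.random() < p)
    return total / trials

n, p = 10, 0.3
estimate = simulate_binomial_mean(n, p)  # should land near n*p = 3.0
```

The empirical mean converges to $np$, matching the linearity argument without ever using the binomial pmf directly.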
# Example with Generating Functions
### Consider the binomial probability distribution
$
p_k = \binom{n}{k} p^k (1-p)^{n-k}
$
#### a) Show that the probability generating function for this distribution is
$
g(z) = (pz + 1 - p)^n
$
Let $X$ be the number of successful experiments in $n$ trials; then $X=\sum_{i}X_{i}$ where the $X_{i}$ are [[bernoulli random variable]]s. The [[probability generating function]] of a sum of independent [[random variable]]s is the product of their generating functions, so it suffices to compute the PGF of a single $X_{i}$. This is $\sum_{k}\mathbb{P}(X_{i}=k)z^{k}=\mathbb{P}(X_{i}=0)z^{0} + \mathbb{P}(X_{i}=1)z^{1}=1-p+pz.$
Taking the product of the $n$ identical generating functions of the $X_{i}$ then yields $g(z)=(pz+1-p)^{n}$, as required.
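The product-of-PGFs argument can be verified numerically by multiplying out the per-trial polynomial $(1-p)+pz$ with itself $n$ times and comparing coefficients to the binomial pmf (a sketch; `pgf_coeffs` is a hypothetical helper name):

```python
from math import comb

def pgf_coeffs(n: int, p: float) -> list[float]:
    """Coefficients of g(z) = (pz + 1 - p)^n, obtained by repeatedly
    multiplying by the Bernoulli PGF (1 - p) + p*z."""
    coeffs = [1.0]
    for _ in range(n):
        nxt = [0.0] * (len(coeffs) + 1)
        for i, c in enumerate(coeffs):
            nxt[i] += c * (1 - p)   # contribution of the z^0 term
            nxt[i + 1] += c * p     # contribution of the z^1 term
        coeffs = nxt
    return coeffs

n, p = 8, 0.4
coeffs = pgf_coeffs(n, p)
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
```

The coefficient of $z^k$ in the expanded product is exactly $\binom{n}{k}p^k(1-p)^{n-k}$, i.e. $p_k$.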
#### b) Find the first and second moments of the distribution from Eqs. (12.7) and (12.89) and hence show that the variance of the distribution is
$
\sigma^2 = np(1-p)
$
The first moment is given by $g'(1)=[np(pz+1-p)^{n-1}]|_{z=1}=np$.
The second moment, and hence the [[variance]], is calculated as ![[CleanShot 2023-11-07 at 10.13.57.jpg|400]]
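Written out (the standard PGF calculation, independent of the embedded image):

```latex
g''(z) = n(n-1)p^{2}(pz+1-p)^{n-2}
  \quad\Longrightarrow\quad
  g''(1) = n(n-1)p^{2} = E\bigl[X(X-1)\bigr],
```
so the second moment is $E(X^{2}) = g''(1) + g'(1) = n(n-1)p^{2} + np$, and
```latex
\sigma^{2} = E(X^{2}) - \bigl[E(X)\bigr]^{2}
  = n(n-1)p^{2} + np - n^{2}p^{2}
  = np - np^{2} = np(1-p).
```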
#### c) Show that the sum of two numbers drawn independently from the same binomial distribution is distributed according to
$
\binom{2n}{k} p^k (1-p)^{2n-k}.
$
We can find the [[probability generating function]] and then take [[derivative]]s, with manipulations as necessary, to obtain $\mathbb{P}(X=k)$, where $X=X_{1}+X_{2}$. Because the draws are [[independent random variables|independent]] and identically [[probability distribution|distributed]], $g_{X}(z)=g_{X_{1}+X_{2}}(z)=[g_{X_{1}}(z)]^{2} = (pz+1-p)^{2n}$ generates our desired [[distribution]]. Now to get $\mathbb{P}(X=k)$ we take $k$ [[derivative]]s and divide by $k!$, thus: $\frac{g_{X}^{(k)}(z)}{k!}=\frac{p^{k}\,2n(2n-1)\cdots(2n-k+1)\,(pz+1-p)^{2n-k}}{k!}$
and then evaluate at $z=0$ to obtain $\frac{g_{X}^{(k)}(0)}{k!}=\frac{(2n)!}{k!\,(2n-k)!}\,p^{k}(1-p)^{2n-k}$
as requested.
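The same identity can be checked directly by convolving two copies of the Binomial$(n,p)$ pmf and comparing against Binomial$(2n,p)$ (a sketch with parameters of my own choosing):

```python
from math import comb

def binom_pmf(n: int, k: int, p: float) -> float:
    return comb(n, k) * p**k * (1 - p)**(n - k)

def conv_pmf(n: int, k: int, p: float) -> float:
    """P(X1 + X2 = k) by direct convolution of two Binomial(n, p) pmfs."""
    return sum(binom_pmf(n, j, p) * binom_pmf(n, k - j, p)
               for j in range(max(0, k - n), min(k, n) + 1))

n, p = 6, 0.35
convolved = [conv_pmf(n, k, p) for k in range(2 * n + 1)]
direct = [binom_pmf(2 * n, k, p) for k in range(2 * n + 1)]
```

Term by term, the agreement reflects the Vandermonde identity $\sum_{j}\binom{n}{j}\binom{n}{k-j}=\binom{2n}{k}$, which is the combinatorial core of the PGF argument above.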
#notFormatted