----

> [!definition] Definition. ([[independent random variables]])
> Suppose $(\Omega, \mathcal{F}, \mathbb{P})$ is a [[probability|probability space]].
> We call a family $(X_{i})_{i \in I}$ of random variables **independent** if their [[σ-algebra generated by a set collection|generated]] [[σ-algebra|σ-algebras]] $\big( \sigma(X_{i}) \big)_{i \in I}$ [[independent|are independent]].

> [!equivalence] Equivalence.
> [[σ-algebra generated by a set collection#^properties|Recalling the identity]] $\sigma(X_{i})=X_{i}^{-1}(\mathcal{B})$, it is clear that $(X_{i})_{i \in I}$ are independent if and only if for any finite subset $J \subset I$ one has
> $$\mathbb{P}(X_{i} \in B_{i} \text{ for all } i \in J)=\prod_{i \in J} \mathbb{P}(X_{i} \in B_{i})$$
> for [[Borel set|any]] $B_{i} \in \mathcal{B}$.
>
> [[σ-algebra generated by a set collection#^properties|Recalling the further identity]] $\sigma(X_{i})=\sigma\big(X_{i}^{-1}(\mathscr{A})\big)$ for $\mathscr{A}$ [[σ-algebra generated by a set collection|generating]] $\mathcal{B}$, it is further clear that we can equivalently consider $A_{i} \in \mathscr{A}$ in place of $B_{i} \in \mathcal{B}$. For example,
> - Two [[random variable|random variables]] $X,Y$ are independent iff $\{ X \in U \}$ and $\{ Y \in V \}$ are [[independent]] [[probability|events]] for all open sets $U, V \subseteq \mathbb{R}$.
> - Two [[random variable|random variables]] $X,Y$ are independent iff $\{ X \leq t \}$ and $\{ Y \leq s \}$ are [[independent]] [[probability|events]] for all $t,s \in \mathbb{R}$.

> [!equivalence] Product equivalence.
> $X_{1},\dots,X_{n}$ are independent if and only if
> $$\mu_{(X_{1},\dots,X_{n})}=\mu_{X_{1}} \times \dots \times \mu_{X_{n}},$$
> where the LHS is the [[joint probability distribution]] $\mu_{\boldsymbol X}$ of $X_{1},\dots,X_{n}$ and the RHS is the [[product measure|product]] of their [[random vector|marginal]] [[probability distribution|distributions]] $\mu_{X_{i}}$.
> In other words, $X_{1},\dots,X_{n}$ are independent iff products and [[pushforward measure|pushforwards]] commute for them.

^equivalence

> [!definition] Definition. ([[independent random variables]])
> Suppose $(\Omega, \mathcal{F}, \mathbb{P})$ is a [[probability|probability space]].
>
> - Two [[random variable|random variables]] $X,Y$ are called **independent** if $\{ X \in U \}$ and $\{ Y \in V \}$ are [[independent]] [[probability|events]] for all [[Borel set|Borel sets]] $U,V$ in $\mathbb{R}$.
> - More generally, a family of [[random variable|random variables]] $\{ X_{k} \}_{k \in \Gamma}$ is called **independent** if the family of events $\{ X_{k} \in U_{k} \}_{k \in \Gamma}$ is [[independent]] for all families of [[Borel set|Borel sets]] $\{ U_{k} \}_{k \in \Gamma}$ in $\mathbb{R}$.

> [!basicproperties]
> - *(Functions of independent random variables are independent)* If $X,Y$ are independent random variables and $f,g:\mathbb{R} \to \mathbb{R}$ are [[measurable function|Borel measurable]] functions, then $f \circ X$ and $g \circ Y$ are again independent random variables.
>
> > [!proof]- Proof.
> >
> > Suppose $U,V$ are [[Borel set|Borel subsets]] of $\mathbb{R}$. Then
> > $$\begin{align}
> > \mathbb{P}(\{ f \circ X \in U \} \cap \{ g \circ Y \in V \}) &= \mathbb{P}(\{ X \in f^{-1}(U) \} \cap \{ Y \in g^{-1}(V) \}) \\
> > &= \mathbb{P}(\{ X \in f^{-1}(U) \}) \, \mathbb{P}(\{ Y \in g^{-1}(V) \}) \\
> > &= \mathbb{P}(\{ f \circ X \in U \}) \, \mathbb{P}(\{ g \circ Y \in V \}),
> > \end{align}$$
> > where the second equality holds because $X,Y$ are independent random variables.

> [!intuition]
> $X_i$ take their values "independently", in that knowing something about the values one is taking tells nothing about the values another might be taking.

^intuition

> [!basicexample]
> Let $A, B \in \mathcal{F}$. The [[indicator random variable|indicator random variables]] $1_{A}$ and $1_{B}$ are independent if and only if $A$ and $B$ are [[independent|independent events]].
> > [!proof]- Proof.
> >
> > $\implies.$ If $1_{A}$ and $1_{B}$ are independent, take $U=V=\{ 1 \}$. Then $\{ 1_{A} \in U \}=A$ and $\{ 1_{B} \in V \}=B$ are independent events as claimed.
> >
> > $\impliedby.$ Suppose $A$ and $B$ are independent events, $\mathbb{P}(A \cap B)=\mathbb{P}(A)\mathbb{P}(B)$. [[independent|Then also]] each of $\{ A^{c}, B^{c} \}$, $\{ A^{c}, B \}$, $\{ A, B^{c} \}$ is independent. Let $U,V$ be [[Borel set|Borel sets]] in $\mathbb{R}$. There are 16 (easy) cases, according to which of $0,1$ each of $U,V$ contains.
> >
> > **1. $0 \in U, 1 \in U, 0 \in V, 1 \in V$.** In this case $\{ 1_{A} \in U \}=\Omega=\{ 1_{B} \in V \}$, and from $\mathbb{P}(\Omega)=1$ the result is clear.
> >
> > **2-5. $\{ 0,1 \}$ is contained in one set and exactly one of $0,1$ is in the other.** For example, consider $0 \in U, 1 \in U, 0 \in V, 1 \notin V$. In this case $\{ 1_{A} \in U \}=\Omega$ and $\{ 1_{B} \in V \}=B^{c}$, so
> > $$\mathbb{P}(\underbrace{ \{ 1_{A} \in U \} }_{ \Omega } \cap \underbrace{ \{ 1_{B} \in V \} }_{ B^{c} })=\mathbb{P}(B^{c})=\mathbb{P}(\Omega)\, \mathbb{P}(B^{c})$$
> > as required. Symmetric reasoning goes through for the other three cases of this kind.
> >
> > **6-12. One (or both) of the sets contains neither $0$ nor $1$.** Suppose it is $U$. Then $\{ 1_{A} \in U \}=\emptyset$ and $\mathbb{P}(\{ 1_{A} \in U \})=0$, so both sides of the desired equality are zero.
> >
> > **13-16. Each set contains exactly one of $0,1$.** In this case the result follows from the fact that each of $\{ A, B \}$, $\{ A^{c}, B^{c} \}$, $\{ A^{c}, B \}$, $\{ A, B^{c} \}$ is independent.
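The indicator equivalence above lends itself to a brute-force sanity check on a small finite probability space. The following Python sketch is illustrative only (the space $\Omega=\{1,2,3,4\}$ and the events $A$, $B$ are made up for the example); since $1_A$ and $1_B$ take values in $\{0,1\}$, it suffices to let $U,V$ range over subsets of $\{0,1\}$ rather than all Borel sets.

```python
from fractions import Fraction
from itertools import chain, combinations

# Uniform probability space on four outcomes (illustrative choice).
omega = [1, 2, 3, 4]
P = {w: Fraction(1, 4) for w in omega}
A, B = {1, 2}, {1, 3}  # independent events: P(A ∩ B) = 1/4 = P(A)·P(B)

ind_A = {w: int(w in A) for w in omega}  # the indicator 1_A
ind_B = {w: int(w in B) for w in omega}  # the indicator 1_B

def prob(event):
    """Probability of a set of outcomes."""
    return sum(P[w] for w in event)

def subsets(s):
    """All subsets of s, as tuples."""
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

# Check P({1_A ∈ U} ∩ {1_B ∈ V}) = P({1_A ∈ U}) · P({1_B ∈ V})
# for every choice of U, V ⊆ {0, 1}.
for U in subsets({0, 1}):
    for V in subsets({0, 1}):
        EU = {w for w in omega if ind_A[w] in U}
        EV = {w for w in omega if ind_B[w] in V}
        assert prob(EU & EV) == prob(EU) * prob(EV)
```

Using exact `Fraction` arithmetic avoids any floating-point comparison issues, so the assertions check the equalities exactly.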
----

#### References

> [!backlink]
> ```dataview
> TABLE rows.file.link as "Further Reading"
> FROM [[]]
> FLATTEN file.tags as Tag
> WHERE Tag = "#definition" OR Tag = "#theorem" OR Tag = "#MOC" OR Tag = "#proposition" OR Tag = "#axiom"
> GROUP BY Tag
> ```

> [!frontlink]
> ```dataview
> TABLE rows.file.link as "Further Reading"
> FROM outgoing([[]])
> FLATTEN file.tags as Tag
> WHERE Tag = "#definition" OR Tag = "#theorem" OR Tag = "#MOC" OR Tag = "#proposition" OR Tag = "#axiom"
> GROUP BY Tag
> ```

#analysis/probability-statistics

# Definition

Define a family of [[random variable]]s on the same [[probability space]] $\Omega$,
$$X_i: \Omega \to \mathbb{R}, \quad i \in I.$$
We say that $(X_i)_{i \in I}$ are **independent random variables** if for any choice of [[real numbers]] $a_i$, $i \in I$, the [[event]]s consisting of the outcomes where the $i^{th}$ [[random variable]] does not exceed $a_i$, i.e. the [[event]]s
$$A_i = \{\omega \in \Omega : X_i(\omega) \leq a_i\},$$
are [[independent]].

# Property

If each $X_i$ takes on [[countably infinite|countably]] many values, then we have
$$X_i \text{ are ind.} \iff \text{the events } A_i=\{\omega \in \Omega : X_i(\omega)=a_i\} \text{ are ind. for any } a_i \in \mathbb{R}.$$
**This is not true if the $X_i$ take on [[uncountably infinite|uncountably]] many values!**

# Example

Roll two fair dice. Let $X_1$ be the number shown by die 1 and $X_2$ the number shown by die 2. We have
$$\Omega = \{(i,j) : 1 \leq i \leq 6, \ 1 \leq j \leq 6\},$$
and
$$X_1(i,j) = i \quad \text{while} \quad X_2(i,j) = j,$$
meaning that each $X_k$ depends only on its own (the $k^{th}$) coordinate.

#notFormatted
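The dice example can be verified by direct enumeration. The Python sketch below (illustrative, not part of the note) checks the CDF criterion $\mathbb{P}(X_1 \leq t, X_2 \leq s)=\mathbb{P}(X_1 \leq t)\,\mathbb{P}(X_2 \leq s)$ exactly; for dice only the integer thresholds $t,s \in \{1,\dots,6\}$ matter.

```python
from fractions import Fraction
from itertools import product

# Sample space for two fair dice, with the uniform measure.
omega = list(product(range(1, 7), repeat=2))
p = Fraction(1, 36)  # probability of each outcome

def prob(event):
    """Probability of the set of outcomes satisfying the predicate."""
    return sum(p for w in omega if event(w))

X1 = lambda w: w[0]  # number shown by die 1
X2 = lambda w: w[1]  # number shown by die 2

# CDF criterion: {X1 <= t} and {X2 <= s} are independent events.
for t in range(1, 7):
    for s in range(1, 7):
        lhs = prob(lambda w: X1(w) <= t and X2(w) <= s)
        rhs = prob(lambda w: X1(w) <= t) * prob(lambda w: X2(w) <= s)
        assert lhs == rhs
```

Since each $X_k$ reads off only its own coordinate and the measure on $\Omega$ is the product of the two uniform marginals, every assertion reduces to $\frac{ts}{36}=\frac{t}{6}\cdot\frac{s}{6}$.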