Examples:: *[[Examples]]*
Nonexamples:: *[[Nonexamples]]*
Constructions:: *[[Constructions|Used in the construction of...]]*
Specializations:: *[[Specializations]]*
Generalizations:: [[Euclidean diffeomorphism]]
Justifications and Intuition:: *[[Justifications and Intuition]]*
Properties:: [[derivatives are unique]], [[existence of derivative guarantees existence of all directional derivatives]], [[derivative of vector-valued function is vector of component derivatives]], [[total derivative equals Jacobian]]
Sufficiencies:: *[[Sufficiencies]]*
Equivalences:: *[[Equivalences]]*

- Let $\Omega$ be an [[open set]] in $\mathbb{R}^{m}$; let $f:\mathbb{R}^{m} \to \mathbb{R}^{n}$.

> [!definition] Definition. (Derivative of $f:\Omega \subset \mathbb{R}^{m} \to \mathbb{R}^{n}$)
> Suppose $\vec f: \Omega \subset \mathbb{R}^m \to \mathbb{R}^n$ with $\Omega$ an [[open set]] in $\mathbb{R}^m$. Let $\vec p \in \Omega$. We say **"$\vec f$ is differentiable at $\vec p$"** if there exists $B \in \mathbb{R}^{n \times m}$ such that $\lim_{ \vec h \to \vec 0 } \frac{ \big\vert \vec f(\vec p+ \vec h) - \vec f(\vec p) - B\vec h \big\vert}{\big\vert \vec h \big\vert}=0.$ If such a $B$ exists, $B$ is called the [[derivative]] of $\vec f$ at $\vec p$, written $B=D \vec f (\vec p)$.

> [!intuition] Motivation.
> # Single-variable definition from calculus
> Let $\phi : \Omega \subset \mathbb{R} \to \mathbb{R}$ with $\Omega$ open in $\mathbb{R}$, and let $p \in \Omega$.
> To say "$\phi$ is **differentiable** at $p$" is to assert that the following [[function limit]] exists: $\lim_{ t \to 0 } \frac{\phi(p+t) - \phi(p)}{(p+t) - p},$ and we call that limit the [[derivative]] of $\phi$ at $p$. Unfortunately, this definition *breaks down* if we attempt to naively generalize it to functions of several variables: the difference quotient would require dividing by the vector increment, which is undefined.
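The defining limit in the callout above can be probed numerically. Below is a minimal sketch (not from the note): a sample map $\vec f:\mathbb{R}^2 \to \mathbb{R}^2$ and a hand-computed candidate matrix $B$, with the quotient $\vert \vec f(\vec p + \vec h) - \vec f(\vec p) - B\vec h\vert / \vert\vec h\vert$ (sup norm) evaluated at shrinking $\vec h$ to watch it tend to 0. The function and point are arbitrary choices for illustration.

```python
import numpy as np

def f(v):
    """Sample map f: R^2 -> R^2, f(x, y) = (x^2 * y, sin(x))."""
    x, y = v
    return np.array([x**2 * y, np.sin(x)])

def candidate_B(v):
    """Hand-computed candidate for B = Df(p) for this particular f."""
    x, y = v
    return np.array([[2 * x * y, x**2],
                     [np.cos(x), 0.0]])

p = np.array([1.0, 2.0])
B = candidate_B(p)

# The defining quotient |f(p+h) - f(p) - Bh| / |h| (sup norm, as in the
# note) should tend to 0 as h -> 0.
rng = np.random.default_rng(0)
for scale in (1e-1, 1e-3, 1e-5):
    h = scale * rng.standard_normal(2)
    quotient = np.max(np.abs(f(p + h) - f(p) - B @ h)) / np.max(np.abs(h))
    print(f"|h| ~ {scale:.0e}: quotient = {quotient:.2e}")
```

The printed quotients shrink roughly in proportion to $\vert\vec h\vert$, consistent with $B$ being the derivative at $\vec p$; a wrong matrix would leave the quotient bounded away from 0.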
> # Towards a [[vector]]-valued definition
> *(must have read the above section)* *So what do we do?* First, rewrite the single-variable definition from calculus: $\lim_{ t \to 0 } \frac{\phi(p+t) - \phi(p)}{t}$ exists if and only if *there exists $b \in \mathbb{R}$ such that* $\lim_{ t \to 0 } \big\vert \frac{\phi(p+t) - \phi(p)}{t} - b \big\vert = 0,$ which (after making a common denominator) is equivalent to saying that *there exists $b \in \mathbb{R}$ such that* $\lim_{ t \to 0 } \big\vert \frac{\phi(p+t) - \phi(p) - bt}{t} \big\vert = 0,$ i.e., *there exists $b \in \mathbb{R}$ such that* $\lim_{ t \to 0 } \frac{ \big\vert \phi(p+t) - \phi(p) - bt \big\vert}{\big\vert t \big\vert} = 0.$ Now **this** expression is one we can better generalize, replacing [[absolute value]]s with [[metric]]s and the like...

> # Derivative of a [[vector]]-valued function
> **Note. From here on, $\vert \cdot \vert$ denotes the [[sup norm]], NOT [[absolute value]]!**
> #### Remark.
> We can actually remove the [[sup norm]] from the numerator of the defining limit and keep an equivalent statement; that is, the definition holds **iff** $\lim_{ \vec h \to \vec 0 } \frac{ \vec f(\vec p+ \vec h) - \vec f(\vec p) - B\vec h }{\big\vert \vec h \big\vert}=\vec 0.$

### Intuition
Don't forget that, in the single-variable case, the $b$ from which $B$ came in the above 'derivation' was originally the value which, when subtracted from $\lim_{ t \to 0 } \frac{\phi(p+t) - \phi(p)}{t}$, yielded 0, i.e., $\lim_{ t \to 0 } \big\vert \frac{\textcolor{Skyblue}{\phi(p+t) - \phi(p)} - \textcolor{Thistle}{bt}}{t} \big\vert=0$. We called $b$ the derivative of $\phi$ at $p$, since it is the value that makes the equality to 0 hold. But really we're looking at the notion of a *best linear approximation* $\textcolor{Thistle}{L(t) = bt}$ to the 'incrementer function' $\textcolor{Skyblue}{\phi(p+t) - \phi(p)}$, also called the *best first-order approximation*.
But a *best linear function* is an idea that comes straight from linear algebra: [[linear map]]s and [[matrix|matrices]]! That's why we generalize $b$ to $B$. Another way to look at things: in the single-variable case, $b$ also happens to be the [[function limit]] of the difference quotient. Therefore, it should make sense that we require $B$ to yield equality to 0 in a similar way; $B$ is not itself defined as the limit of a difference quotient (no such quotient exists, since we cannot divide by the vector $\vec h$), but it behaves the same way as the single-variable derivative in a way that we've decided is more useful to care about!

# How do we find $B$?
- [[total derivative equals Jacobian]]

> [!backlink]
> ```dataview
> TABLE rows.file.link as "Further Reading"
> FROM [[]]
> FLATTEN file.tags as Tag
> WHERE Tag = "#definition" OR Tag = "#theorem" OR Tag = "#MOC" OR Tag = "#proposition" OR Tag = "#axiom"
> GROUP BY Tag
> ```

> [!frontlink]
> ```dataview
> TABLE rows.file.link as "Further Reading"
> FROM outgoing([[]])
> FLATTEN file.tags as Tag
> WHERE Tag = "#definition" OR Tag = "#theorem" OR Tag = "#MOC" OR Tag = "#proposition" OR Tag = "#axiom"
> GROUP BY Tag
> ```
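As a numerical companion to [[total derivative equals Jacobian]]: the linked result says $B$ is the matrix of partial derivatives $\partial f_i / \partial x_j$. Here is a hedged sketch (sample function and point are illustrative choices, not from the note) that builds $B$ column by column with central differences and checks it against the hand-computed Jacobian.

```python
import numpy as np

def f(v):
    # Sample map f: R^2 -> R^3, f(x, y) = (x*y, x + y, x^2).
    x, y = v
    return np.array([x * y, x + y, x**2])

def numerical_jacobian(f, p, eps=1e-6):
    """Approximate Df(p) column by column: entry (i, j) is the
    central-difference estimate of d f_i / d x_j at p."""
    p = np.asarray(p, dtype=float)
    fp = f(p)
    J = np.zeros((fp.size, p.size))
    for j in range(p.size):
        e = np.zeros_like(p)
        e[j] = eps
        J[:, j] = (f(p + e) - f(p - e)) / (2 * eps)
    return J

p = np.array([2.0, 3.0])
# Hand-computed Jacobian of the sample f at p = (2, 3):
# rows are (y, x), (1, 1), (2x, 0).
exact = np.array([[3.0, 2.0],
                  [1.0, 1.0],
                  [4.0, 0.0]])
print(np.allclose(numerical_jacobian(f, p), exact, atol=1e-5))  # True
```

This is only a finite-difference approximation of $B$; the linked note is what justifies that the Jacobian, when $\vec f$ is differentiable, *is* the matrix appearing in the limit definition.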