-----

> [!proposition] Proposition. ([[derivative of the dot product]])
> We have $D(u(x) \cdot v(x)) = u(x) \cdot Dv(x) + v(x) \cdot Du(x)$ for $u$ and $v$ vectors depending smoothly on the variable $x$.

> [!proposition] Corollary.
> Letting $u=v$, we obtain $D(\|v(x)\|_{2}^{2})=D(v(x) \cdot v(x))=2Dv(x)\cdot v(x).$

> [!proof]- Proof. ([[derivative of the dot product]])
> Since each term in the [[dot product]] of two vector-valued functions is a single-variable function, the [[single-variable product rule]] applies termwise, and we have $\begin{align}D\big(u(x) \cdot v(x)\big) = & D\left( \sum_{j=1}^{n} u_{j}(x)v_{j}(x) \right) \\= & \sum_{j=1}^{n} D\big(u_{j}(x) v_{j}(x)\big) \\ = & \sum_{j=1}^{n} u_{j}(x) Dv_{j}(x) + v_{j}(x)Du_{j}(x) \\ = & \sum_{j=1}^{n} u_{j}(x)Dv_{j}(x) + \sum_{j=1}^{n} v_{j}(x)Du_{j}(x) \\ = & u(x) \cdot Dv(x) + v(x) \cdot D{u}(x). \end{align}$

-----

#### References

> [!backlink]
> ```dataview
> TABLE rows.file.link as "Further Reading"
> FROM [[]]
> FLATTEN file.tags as Tag
> WHERE Tag = "#definition" OR Tag = "#theorem" OR Tag = "#MOC" OR Tag = "#proposition" OR Tag = "#axiom"
> GROUP BY Tag
> ```

> [!frontlink]
> ```dataview
> TABLE rows.file.link as "Further Reading"
> FROM outgoing([[]])
> FLATTEN file.tags as Tag
> WHERE Tag = "#definition" OR Tag = "#theorem" OR Tag = "#MOC" OR Tag = "#proposition" OR Tag = "#axiom"
> GROUP BY Tag
> ```

Consider the matrix
$2 \lambda I + \sum_{i=1}^{n} \frac{y_{i}^{2}\tilde{x}_{i}\tilde{x}_{i}^{\top}}{(1+e^{y_{i}\theta^{\top}\tilde{x}_{i}})^{2}} \ \ (*)$

Let $i \in [n]$ and $\theta$ be arbitrary. By [[the self-outer product is positive semidefinite]], $\tilde{x}_{i} \tilde{x}_{i}^{\top}$ is a [[positive semidefinite matrix]], [[characterization of positive semidefinite operators|hence has nonnegative eigenvalues]].
Scaling a [[matrix]] by a nonnegative scalar scales its [[eigenvalue]]s by the same factor, so, since $\frac{y_{i}^{2}}{(1+e^{y_{i} \theta^{\top} \tilde{x}_{i}})^{2}}$ is nonnegative, the [[matrix]] $\frac{y_{i}^{2}\tilde{x}_{i}\tilde{x}_{i}^{\top}}{(1+e^{y_{i}\theta^{\top}\tilde{x}_{i}})^{2}}$ has nonnegative eigenvalues and is therefore [[positive semidefinite matrix|PSD]] by the [[characterization of positive semidefinite operators]]. Since $i$ and $\theta$ were arbitrary, [[sum of positive semidefinite matrices is positive semidefinite]] gives that $\sum_{i=1}^{n} \frac{y_{i}^{2}\tilde{x}_{i}\tilde{x}_{i}^{\top}}{(1+e^{y_{i}\theta^{\top}\tilde{x}_{i}})^{2}}$ is [[positive semidefinite matrix|PSD]] for all $\theta$. If $\lambda \geq 0$, then the [[eigenvalue]]s of $2 \lambda I$ are all equal to $2 \lambda$ and hence nonnegative, so $2 \lambda I$ is [[positive semidefinite matrix|PSD]]. By [[sum of positive semidefinite matrices is positive semidefinite]], $(*)$ is then [[positive semidefinite matrix|PSD]], and [[C2 function is convex iff its Hessian is everywhere PSD]] finishes the proof. If $\lambda > 0$, then the [[eigenvalue]]s of $2 \lambda I$ are all equal to $2 \lambda$ and hence positive, so $2 \lambda I$ is a [[positive definite matrix]]. By [[sum of positive definite and positive semidefinite matrix is positive definite]], $(*)$ is then [[positive definite matrix|PD]], and [[C2 function is strictly convex if its hessian is everywhere positive definite]] finishes the proof.
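As a numerical sanity check of the argument above, we can build $(*)$ for a random instance and verify that $z^{\top} H z \geq 0$ along random directions $z$. This is a sketch, not part of the proof: the data `X`, labels `y`, weight `theta`, and dimensions are arbitrary stand-ins for $\tilde{x}_{i}$, $y_{i}$, $\theta$.

```python
import math
import random

random.seed(0)

def hessian(theta, X, y, lam):
    """Build H = 2*lam*I + sum_i y_i^2 * x_i x_i^T / (1 + exp(y_i * theta.x_i))^2."""
    d = len(theta)
    H = [[2.0 * lam if r == c else 0.0 for c in range(d)] for r in range(d)]
    for xi, yi in zip(X, y):
        s = sum(t * x for t, x in zip(theta, xi))      # theta^T x_i
        w = yi ** 2 / (1.0 + math.exp(yi * s)) ** 2    # nonnegative scalar weight
        for r in range(d):
            for c in range(d):
                H[r][c] += w * xi[r] * xi[c]           # w * x_i x_i^T term
    return H

def quad_form(H, z):
    """z^T H z."""
    d = len(z)
    return sum(z[r] * H[r][c] * z[c] for r in range(d) for c in range(d))

# random instance: n = 20 points in d = 4 dimensions, labels in {-1, +1}
d, n, lam = 4, 20, 0.1
X = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n)]
y = [random.choice([-1.0, 1.0]) for _ in range(n)]
theta = [random.gauss(0, 1) for _ in range(d)]

H = hessian(theta, X, y, lam)
# PSD check: z^T H z >= 0 for many random directions z
assert all(quad_form(H, [random.gauss(0, 1) for _ in range(d)]) >= 0
           for _ in range(1000))
```

Since each summand contributes $w_{i}(\tilde{x}_{i}^{\top} z)^{2} \geq 0$ and the identity term contributes $2\lambda \|z\|^{2}$, the quadratic form is nonnegative exactly as the proof predicts.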
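The proposition on the derivative of the dot product at the top of this note can likewise be checked by finite differences. Again a sketch: the functions `u` and `v` below are arbitrary smooth examples, not from the note.

```python
import math

def u(x):
    # arbitrary smooth vector-valued function (illustrative choice)
    return [math.sin(x), x ** 2, math.exp(x)]

def Du(x):
    return [math.cos(x), 2 * x, math.exp(x)]

def v(x):
    # another arbitrary smooth vector-valued function
    return [math.cos(x), x, 1.0]

def Dv(x):
    return [-math.sin(x), 1.0, 0.0]

def dot(a, b):
    return sum(p * q for p, q in zip(a, b))

x, h = 0.7, 1e-6
# central finite difference of d/dx [u(x) . v(x)]
numeric = (dot(u(x + h), v(x + h)) - dot(u(x - h), v(x - h))) / (2 * h)
# right-hand side of the proposition
exact = dot(u(x), Dv(x)) + dot(v(x), Du(x))
assert abs(numeric - exact) < 1e-6
```

The same scheme verifies the corollary, since $D(\|v(x)\|_{2}^{2})$ differenced numerically agrees with $2\,Dv(x) \cdot v(x)$.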