Linear transformations

  A mapping $L$ from a vector space $V$ into a vector space $W$ is said to be a linear transformation if

$$ \begin{equation} \begin{aligned} L(\alpha \textbf{v}_{1}+\beta \textbf{v}_{2})= \alpha L(\textbf{v}_{1})+\beta L(\textbf{v}_{2}) \end{aligned}\end{equation} $$

for all $\textbf{v}_{1},\textbf{v}_{2}\in V$ and for all scalars $\alpha$ and $\beta$.
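
Any map of the form $\textbf{v}\mapsto M\textbf{v}$ for a matrix $M$ satisfies this identity. A quick numerical check (the matrix and vectors below are illustrative choices, not from the text):

```python
import numpy as np

# Any map v -> M v given by a matrix is linear; verify the defining
# identity L(a*v1 + b*v2) = a*L(v1) + b*L(v2) for one illustrative matrix.
M = np.array([[2.0, -1.0],
              [0.0,  3.0]])
L = lambda v: M @ v

v1, v2 = np.array([1.0, 2.0]), np.array([-3.0, 0.5])
alpha, beta = 2.5, -1.5

assert np.allclose(L(alpha * v1 + beta * v2), alpha * L(v1) + beta * L(v2))
```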

  In fact, once ordered bases are chosen, every linear transformation between finite-dimensional vector spaces has a matrix corresponding to it.

  Consider a linear transformation $L$ from $V$ into $W$, where $V$ and $W$ are vector spaces of dimension $n$ and $m$, respectively. Let $E=[\textbf{v}_{1},\textbf{v}_{2},\dots,\textbf{v}_{n}]$ be an ordered basis for $V$ and $F=[\textbf{w}_{1},\textbf{w}_{2},\dots,\textbf{w}_{m}]$ be an ordered basis for $W$. If $\textbf{v}$ is any vector in $V$, then we can express $\textbf{v}$ in terms of the basis $E$:

$$ \begin{equation} \begin{aligned} \textbf{v}=\sum_{j=1}^{n}x_{j} \textbf{v}_{j}. \end{aligned}\end{equation} $$

  For $j=1,2,\dots,n$, let $\textbf{a}_{j}=(a_{1j},a_{2j},\dots,a_{mj})^{T}$ be the coordinate vector of $L(\textbf{v}_{j})$ with respect to $F$; i.e.,

$$ \begin{equation} \begin{aligned} L(\textbf{v}_{j})=\sum_{k=1}^{m}a_{kj} \textbf{w}_{k}. \end{aligned}\end{equation} $$

Then,

$$ \begin{equation} \begin{aligned} L(\textbf{v})&=L(\sum_{j=1}^{n}x_{j} \textbf{v}_{j}) \\ &=\sum_{j=1}^{n}x_{j}L(\textbf{v}_{j}) \\ &=\sum_{j=1}^{n}x_{j}(\sum_{k=1}^{m}a_{kj} \textbf{w}_{k}) \\ &=\sum_{k=1}^{m}(\sum_{j=1}^{n}x_{j}a_{kj}) \textbf{w}_{k}. \end{aligned}\end{equation} $$

  If we let $A=(a_{kj})$ be the $m\times n$ matrix whose $j$th column is $\textbf{a}_{j}$ and let $\textbf{x}=(x_{1},x_{2},\dots,x_{n})^{T}$, then

$$ \begin{equation} \begin{aligned} \textbf{y}=(y_{1},y_{2},\dots,y_{m})^{T}=A\textbf{x} \end{aligned}\end{equation} $$

is the coordinate vector of $L(\textbf{v})$ with respect to $F$. We have therefore found a matrix $A$ corresponding to $L$.
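
The construction can be checked numerically. Below is a minimal sketch, assuming the standard ordered bases of $\mathbb{R}^{3}$ and $\mathbb{R}^{2}$ (so coordinate vectors coincide with the vectors themselves) and a hypothetical map `L` chosen for illustration: the $j$th column of $A$ is the coordinate vector of $L(\textbf{v}_{j})$, and applying $A$ to the coordinates of $\textbf{v}$ gives the coordinates of $L(\textbf{v})$.

```python
import numpy as np

# A hypothetical linear transformation L: R^3 -> R^2 (illustrative choice).
def L(v):
    x1, x2, x3 = v
    return np.array([x1 + 2.0 * x2, 3.0 * x3 - x1])

# Standard ordered basis for R^3: the columns of the identity matrix.
basis_V = np.eye(3)

# Column j of A is the coordinate vector of L(v_j) with respect to the
# (standard) basis of W.
A = np.column_stack([L(basis_V[:, j]) for j in range(3)])

# For any v, the coordinates of L(v) are A @ x, where x holds the
# coordinates of v.
v = np.array([1.0, -2.0, 0.5])
assert np.allclose(L(v), A @ v)
```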

Derivatives

  If $f$ is a real function with domain $(a,b)\subset\mathbb{R}^{1}$ and if $x\in(a,b)$, then $f'(x)$ is usually defined to be the real number

$$ \begin{equation} \begin{aligned} \lim_{h\rightarrow0} \frac{f(x+h)-f(x)}{h}, \end{aligned}\end{equation} $$

provided, of course, that the limit exists. Thus

$$ \begin{equation} \begin{aligned} f(x+h)-f(x)=f'(x)h+r(h) \end{aligned}\end{equation} $$

where the “remainder” $r(h)$ is small, in the sense that

$$ \begin{equation} \begin{aligned} \lim_{h\rightarrow0} \frac{r(h)}{h}=0. \end{aligned}\end{equation} $$
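
This behavior is easy to observe numerically. A quick sanity check, using $f(x)=\sin x$ as an illustrative choice (not from the text): the remainder $r(h)=f(x+h)-f(x)-f'(x)h$ shrinks faster than $h$ itself.

```python
import math

# Illustrative choice: f(x) = sin(x), so f'(x) = cos(x).
f, df = math.sin, math.cos
x = 0.7

def remainder(h):
    # r(h) = f(x+h) - f(x) - f'(x) h
    return f(x + h) - f(x) - df(x) * h

# r(h)/h should tend to 0 as h -> 0: each ratio is about 10x smaller
# than the last as h shrinks by a factor of 10.
ratios = [abs(remainder(10.0 ** -k) / 10.0 ** -k) for k in range(1, 6)]
assert all(a > b for a, b in zip(ratios, ratios[1:]))
assert ratios[-1] < 1e-4
```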

  Note that the equality above indicates that the difference $f(x+h)-f(x)$ can be approximated by the linear function $h\mapsto f'(x)h$. The point is that we could also regard the derivative of $f$ at $x$ as this linear function, a linear operator on $\mathbb{R}^{1}$ which takes $h$ to $f'(x)h$.

  We can then naturally extend the concept of the derivative to higher-dimensional spaces. In this case the derivative of a function should be a linear transformation or, equivalently, a matrix.

9.11 Definition Suppose $E$ is an open set in $\mathbb{R}^{n}$, $\textbf{f}$ maps $E$ into $\mathbb{R}^{m}$, and $\textbf{x}\in E$. If there exists a linear transformation $A$ of $\mathbb{R}^{n}$ into $\mathbb{R}^{m}$ such that

$$ \begin{equation} \begin{aligned} \lim_{\textbf{h}\rightarrow0} \frac{\vert \textbf{f}(\textbf{x}+\textbf{h})- \textbf{f}(\textbf{x})- A\textbf{h}\vert} {\vert\textbf{h}\vert}=0, \end{aligned}\end{equation} $$

then we say that $\textbf{f}$ is differentiable at $\textbf{x}$, and we write

$$ \begin{equation} \begin{aligned} \textbf{f}'(\textbf{x})=A. \end{aligned}\end{equation} $$
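
The definition can be illustrated numerically. In the sketch below, the map $\textbf{f}:\mathbb{R}^{2}\rightarrow\mathbb{R}^{2}$ and the point $\textbf{x}$ are illustrative choices, not from the text; taking $A$ to be the matrix of partial derivatives (the Jacobian), the quotient $\vert\textbf{f}(\textbf{x}+\textbf{h})-\textbf{f}(\textbf{x})-A\textbf{h}\vert/\vert\textbf{h}\vert$ shrinks as $\vert\textbf{h}\vert\rightarrow0$.

```python
import numpy as np

# Illustrative map f: R^2 -> R^2.
def f(x):
    return np.array([x[0] ** 2 + x[1], np.sin(x[0]) * x[1]])

# Its matrix of partial derivatives at x, computed by hand: the
# candidate linear transformation A in Definition 9.11.
def jacobian(x):
    return np.array([
        [2.0 * x[0], 1.0],
        [np.cos(x[0]) * x[1], np.sin(x[0])],
    ])

x = np.array([0.5, -1.0])
A = jacobian(x)
u = np.array([0.6, 0.8])  # fixed unit direction; h = t*u with t -> 0

# |f(x + h) - f(x) - A h| / |h| along shrinking h.
quotients = [
    np.linalg.norm(f(x + t * u) - f(x) - A @ (t * u)) / t
    for t in (1e-1, 1e-2, 1e-3, 1e-4)
]
assert all(a > b for a, b in zip(quotients, quotients[1:]))
assert quotients[-1] < 1e-3
```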