Consider the function $f(x,y) = x\cos y + y\sin x$. Find the linear approximation of $f$ around $(0,0)$. Find a bounding factor for the error in the region $[-1,1]\times[-1,1]$.
I think that by "linear" it means only the first-order Taylor approximation, which is
$$f(x+p) = f(x) + p^T\nabla f(x+tp)$$
for some $t\in (0,1)$.
Or should I use the second order? There's no clear explanation.
I see that $\nabla f(x) = (\cos y + y \cos x, -x\sin y + \sin x) $. We're supposed to evaluate at $(0,0)$ so
$$f(0+p) = f(0) + (p_1,p_2)\cdot \bigl(\cos(tp_2) + tp_2 \cos(tp_1),\ -tp_1\sin(tp_2) + \sin(tp_1)\bigr) = p_1\bigl(\cos(tp_2) + tp_2 \cos(tp_1)\bigr) + p_2 \bigl(-tp_1\sin(tp_2) + \sin(tp_1)\bigr)$$
for some $t\in (0,1)$, using $f(0)=0$.
First of all, this doesn't look linear. Take $p_1\cos(tp_2)$ for some fixed $t$: if we move $p$, and therefore $p_2$, by a little, the term does not in general change linearly (unless we consider small $p$, but I don't know if that's the assumption here). Also $-p_1p_2t\sin(tp_2)$ is not linear at all.
I think that for the error I can bound the norm of the gradient over all possible $t$. But what role does the square $[-1,1]\times[-1,1]$ play? I'm trying to picture this expansion inside it, but I can't.
You should not use $x$ as a coordinate variable and as a vector variable at the same time. You may, e.g., write ${\bf z}=(x,y)$. The linear Taylor approximation to $f$ at ${\bf 0}$ is a clear-cut polynomial of degree $1$ in the coordinate variables, namely $$j^1\!f({\bf z})=f({\bf 0})+\nabla f({\bf 0})\cdot(x,y)=0+(1,0)\cdot(x,y)=x\ ,$$ period. There is no obscure $t$ here. Note that at the origin the increment variable, $p$ in the question, coincides with the basic variable ${\bf z}$. (There is no universally accepted notation for the $n^{\rm th}$-order Taylor polynomial; many authors use $j^n\!f$ for "$n$-jet".)
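A quick numerical sanity check (a Python sketch, not part of the original argument): the gradient of $f$ at the origin should come out as $(1,0)$, which is what makes $j^1\!f({\bf z})=x$.

```python
import math

def f(x, y):
    return x * math.cos(y) + y * math.sin(x)

# Central finite differences for the gradient at the origin
h = 1e-6
fx = (f(h, 0) - f(-h, 0)) / (2 * h)   # analytically: cos(0) + 0*cos(0) = 1
fy = (f(0, h) - f(0, -h)) / (2 * h)   # analytically: -0*sin(0) + sin(0) = 0

print(fx, fy)  # approximately 1.0 and 0.0
# Hence j^1 f(x, y) = f(0,0) + 1*x + 0*y = x
```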
In order to obtain a bound for the error $$ R({\bf z}):=f({\bf z})-j^1\!f({\bf z})$$ we have to use Taylor's theorem with some form of the remainder. One possible form is $$f({\bf z})=j^1\!f({\bf z})+{1\over2}\sum_{i,k}f_{.ik}(\tau{\bf z})\,z_i z_k\ ,\tag{1}$$ meaning that there is a number $\tau$ at an unknown location in $(0,1)$ such that $(1)$ holds. Hereby I have written $(z_1,z_2)$ for $(x,y)$, and $f_{.ik}({\bf z}):={\partial^2 f\over\partial z_i\partial z_k}\biggr|_{\bf z}$. The second partials of $f$ compute to $$f_{xx}=-y\sin x,\quad f_{xy}=-\sin y+\cos x,\quad f_{yy}=-x\cos y\ .$$

From $(1)$ we therefore obtain $$R({\bf z})={1\over2}\sum_{i,k}f_{.ik}(\tau{\bf z})\,z_i z_k\ ,$$ and for ${\bf z}\in Q:=[{-1},1]^2$, where $|z_i|\leq1$, this gives the error bound $$|R({\bf z})|\leq{1\over2}\sup_{{\bf w}\in Q}\bigl(|f_{xx}({\bf w})|+2|f_{xy}({\bf w})|+|f_{yy}({\bf w})|\bigr)\leq {3\over2}(1+\sin 1)\ ,$$ since on $Q$ one has $|f_{xx}|\leq\sin 1$, $|f_{xy}|\leq 1+\sin 1$, and $|f_{yy}|\leq 1$. This is a very bad bound, because $Q$ is quite large: the linear approximation to $f$ at ${\bf 0}$ cannot capture all features of $f$ present in $Q$. At the bottom vertices of $Q$ the true error of the linear Taylor approximation is about $1.3$.
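One can check this numerically (a Python sketch, assuming a grid fine enough to locate the maximum): the worst true error over $Q$ is about $1.30$, attained at the bottom vertices $(\pm1,-1)$, which sits comfortably below the bound ${3\over2}(1+\sin 1)\approx 2.76$.

```python
import math

def f(x, y):
    return x * math.cos(y) + y * math.sin(x)

def R(x, y):
    return f(x, y) - x          # error of the linear approximation j^1 f = x

bound = 1.5 * (1 + math.sin(1))  # the theoretical bound derived above

# Scan Q = [-1, 1]^2 on a grid for the largest |R|
n = 201
grid = [-1 + 2 * i / (n - 1) for i in range(n)]
max_err = max(abs(R(x, y)) for x in grid for y in grid)

print(max_err, bound)  # approximately 1.301 and 2.762
```

At $(\pm1,-1)$ the exact error is $1+\sin 1-\cos 1\approx 1.301$, consistent with the grid search.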