I am self-studying vector calculus and trying to differentiate the following function $f$
$$ f(\textbf{t}) = \sin(\log(\textbf{t}^T\textbf{t})) $$
with respect to $\textbf{t}$ where $\textbf{t} \in \mathbb{R}^D$.
Using the chain rule twice, I computed:
$$ \frac{df}{d\textbf{t}}= \cos(\log(\textbf{t}^T\textbf{t})) \cdot \frac{1}{\textbf{t}^T\textbf{t}} \cdot \dots $$
$\cos$ is the derivative of the outermost function ($\sin$). To differentiate the inner function ($\log(\textbf{t}^T\textbf{t})$), I applied the chain rule again and obtained $\frac{1}{\textbf{t}^T\textbf{t}}$ as the derivative of the outer function ($\log$); now I need to multiply this by the derivative of the innermost function ($\textbf{t}^T\textbf{t}$). But how exactly can I compute this derivative?
My first idea was that this derivative has to be scalar-valued since $\textbf{t}^T\textbf{t}$ is just a scalar, but one (unofficial) solution I found online was this:
$$ \frac{df}{d\textbf{t}} = \cos(\log(\textbf{t}^T\textbf{t})) \cdot \frac{1}{\textbf{t}^T\textbf{t}} \cdot [2t_1, 2t_2, \dots, 2t_D] = \cos(\log(\textbf{t}^T\textbf{t})) \cdot \frac{2\textbf{t}^T}{\textbf{t}^T\textbf{t}} $$
Here, the derivative of $\textbf{t}^T\textbf{t}$ is a vector. I don't have an official solution for this exercise, and right now I am struggling a bit because I don't fully understand how to differentiate this last component, $\textbf{t}^T\textbf{t}$. I am new to vector calculus and would truly appreciate any help or insight.
Notice that you are taking the derivative of a function $f:\mathbb{R}^D\to \mathbb{R}$ with respect to a vector $\mathbf{t}\in \mathbb{R}^D$. Finding the derivative of a scalar field with respect to a vector is what we mean by "computing the gradient." This means that differentiating correctly will yield a "gradient vector." Everything you have up to
$$\frac{df}{d\textbf{t}}= \cos(\log(\mathbf{t^T}\mathbf{t})) \cdot \frac{1}{\textbf{t}^T\textbf{t}} \cdot \frac{\partial (\mathbf{t^Tt})}{\partial \mathbf{t}}$$
is exactly the right idea. Evaluating $\frac{\partial (\mathbf{t^Tt})}{\partial \mathbf{t}}$ can be done in a few different ways. Appealing to the intuition you provided above: $\mathbf{t^Tt}$ is a scalar, so the gradient of $\mathbf{t^Tt}$ is given by
$$\frac{\partial (\mathbf{t^Tt})}{\partial \mathbf{t}} = \nabla(t_1^2 + t_2^2 + \dots + t_D^2) = [2t_1, 2t_2, \dots, 2t_D] \text{ .}$$
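To see where each entry comes from, it may help to write out a single component. Differentiating the sum $t_1^2 + \dots + t_D^2$ with respect to one coordinate $t_k$ kills every term except the $k$-th, by the ordinary power rule:
$$\frac{\partial (\mathbf{t^Tt})}{\partial t_k} = \frac{\partial}{\partial t_k}\sum_{i=1}^{D} t_i^2 = 2t_k, \qquad k = 1, \dots, D,$$
and collecting these $D$ partial derivatives into a row gives exactly $[2t_1, 2t_2, \dots, 2t_D]$.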
If you interpret $\mathbf{t}$ as a column and the gradient as a row, then certainly
$$\frac{\partial (\mathbf{t^Tt})}{\partial \mathbf{t}} = 2\mathbf{t^T} .$$
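If you want to convince yourself the final formula is right, a quick numerical sanity check works well: compare the analytic gradient $\cos(\log(\mathbf{t^Tt})) \cdot \frac{2\mathbf{t}^T}{\mathbf{t^Tt}}$ against central finite differences. This is just my own sketch in NumPy (the function names `f` and `grad_f` are mine, not from the exercise):

```python
import numpy as np

def f(t):
    # f(t) = sin(log(t^T t)), a scalar field on R^D
    return np.sin(np.log(t @ t))

def grad_f(t):
    # analytic gradient: cos(log(t^T t)) * 2t / (t^T t)
    return np.cos(np.log(t @ t)) * 2 * t / (t @ t)

D = 5
rng = np.random.default_rng(0)
t = rng.standard_normal(D)  # t^T t > 0 almost surely, so log is defined

# Central finite differences, one coordinate at a time
eps = 1e-6
numeric = np.array([
    (f(t + eps * np.eye(D)[k]) - f(t - eps * np.eye(D)[k])) / (2 * eps)
    for k in range(D)
])

# The discrepancy should be tiny (on the order of eps^2)
print(np.max(np.abs(grad_f(t) - numeric)))
```

Whether you print the result as a row ($2\mathbf{t}^T$) or a column ($2\mathbf{t}$) is purely a layout convention; NumPy's 1-D arrays don't distinguish the two.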