Consider the region $A = \{(s,t) \in \mathbb{R}^2 : s \geq 0, t \geq 0, s^2+t^2 \leq 1 \}$. Let $X = (X_1, X_2)$ be a random vector that is uniformly distributed on $A$.
Now, how can I calculate the marginal density $f_{X_1}$?
My first idea regarding the joint pdf is that since the vector has a uniform distribution on $A$, the density function is constant for all $(s,t)$. Thus, $f_{X_1,X_2}(s,t) = \frac{1}{A}$.
$A$ is a set, not a number, so $\frac{1}{A}$ does not make sense. You are almost correct that the density function is constant, but it is only constant for all $(s,t)$ that lie in $A$. Outside $A$, the density is zero. Can you find the correct joint density now?
Then, to compute the marginal density of $X_1$, you do the usual thing: integrate the joint density with respect to its second argument. $$f_{X_1}(s) = \int_{-\infty}^\infty f_{X_1, X_2}(s,t) \, dt.$$
The tricky part is to figure out for which values of $t$ the joint density is nonzero, and for which values of $t$ it is zero; that determines the actual limits of integration.
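Once you have a candidate answer, you can sanity-check it numerically. The sketch below (a Monte Carlo check, not part of the derivation) assumes you concluded that the joint density is $\frac{4}{\pi}$ on $A$, which would give the marginal $f_{X_1}(s) = \frac{4}{\pi}\sqrt{1-s^2}$ for $0 \leq s \leq 1$; if your answer differs, adjust the `theory` line accordingly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample uniformly from the quarter disk A by rejection:
# draw points in the unit square and keep those inside the unit circle.
n = 200_000
pts = rng.random((n, 2))
inside = pts[pts[:, 0]**2 + pts[:, 1]**2 <= 1]

# Compare the empirical density of X_1 against the candidate
# closed form f(s) = (4/pi) * sqrt(1 - s^2)  (assumed answer).
s = inside[:, 0]
hist, edges = np.histogram(s, bins=20, range=(0, 1), density=True)
centers = (edges[:-1] + edges[1:]) / 2
theory = (4 / np.pi) * np.sqrt(1 - centers**2)
print(np.max(np.abs(hist - theory)))  # should be close to 0
```

If the histogram and the formula disagree noticeably (beyond sampling noise), the joint density or the integration limits are likely off.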