Let $X_1,X_2$ be independent random variables, each discrete uniform distributed on $\{1,\dots,N\}$.
We define:
$A:= X_1+X_2$
$B:= \min(X_1,X_2)$
Find the joint distribution of $A$ and $B$, and calculate the covariance of $A$ and $B$.
I don't really have a clue how to find the joint distribution of $A$ and $B$, so I could use some help with this task. Maybe you can give me a hint and I will try to solve it from there. I will edit this question with my attempts until I find the joint distribution.
Edit: Let us start with the joint distribution. $P(A=a,B=b) =P(B=b|A=a)\cdot P(A=a)$.
In the answers below I saw that: $ P(A=a)=\frac{1}{N^2}\left\{ \begin{array}{lr}a-1& 2\leq a \leq N+1 \\ 2N+1-a & N+2\leq a \leq 2N \end{array} \right. $
I understand why this formula holds for $P(A=a)$, but only from the example; I don't know how to prove it in general. Moreover, we have to find $P(B=b\mid A=a)$. I think a) is clear now. I will try to edit in my attempt in a few days.
Edit for the second part: I will write $Cov$ for covariance. So we have to calculate $Cov(A,B)$.
We already know that $Cov(A,B) = E(AB) - E(A)E(B)$, where $E$ denotes the expected value.
All we have to compute are the expected values of $AB$, $A$, and $B$. Thanks to the user "mathemagical", I already know the values of $E(A)$ and $E(B)$. I even understand the rest of the answer, except how we can calculate $E(Z)$.
I am assuming that you mean the discrete uniform distribution that gives probability $\frac{1}{N}$ to each point. As the joint distribution is a bit gnarly, it seems easier to compute the covariance by computing the various pieces of $E(AB)-E(A)E(B)$.
Hints/Outline: (For easy typing, I have changed $X_1$ to $X$ and $X_2$ to $Y$).
For the covariance of $A$ and $B$, first find the distribution and expectation of $A$ and of $B$; then do the same for $AB$.
From independence, the joint distribution of $(X,Y)$ is easy, with the probability associated with any $(x,y)$ pair given by $\frac{1}{N^2}$.
Compute $E(A)$ and $E(B)$
Note that $A=X+Y$ takes values from $2$ to $2N$ with probabilities as follows $$P(A=k)=\frac{1}{N^2}\left\{ \begin{array}{lr}k-1& 2\leq k \leq N+1 \\ 2N+1-k & N+2\leq k \leq 2N \end{array} \right.$$ (do you see why? The values taken by $A$ for each $(X,Y)$ pair when $N=6$ are shown in the body of the table below, with $X$ along the bottom margin and $Y$ down the left margin) $$\begin{array}{c|cccccc} 6&7&8&9&10&11&12\\ 5&6&7&8&9&10&11\\ 4&5&6&7&8&9&10\\ 3&4&5&6&7&8&9\\ 2&3&4&5&6&7&8\\ 1&2&3&4&5&6&7\\ \hline A&1&2&3&4&5&6 \end{array}$$
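If you want a sanity check on this formula for $P(A=k)$ before proving it, a brute-force count over all $N^2$ equally likely pairs (a short Python sketch of my own, using $N=6$ as in the table) confirms it:

```python
from collections import Counter

N = 6  # same N as the table above; any positive integer works
# count how many (x, y) pairs produce each value of A = x + y
counts = Counter(x + y for x in range(1, N + 1) for y in range(1, N + 1))

def p_A(k):
    # the claimed formula for P(A = k)
    return (k - 1) / N**2 if k <= N + 1 else (2 * N + 1 - k) / N**2

# empirical counts / N^2 must match the formula for every attainable k
assert all(abs(counts[k] / N**2 - p_A(k)) < 1e-12 for k in range(2, 2 * N + 1))
```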
This gives (derive) $$E(A)=N+1$$ Note that $B=X$ when $X \le Y$ and $B=Y$ when $X\ge Y$, as in the example table below for $N=6$ (again with $X$ along the bottom and $Y$ down the left margin). $$\begin{array}{c|cccccc} 6&1&2&3&4&5&6\\ 5&1&2&3&4&5&5\\ 4&1&2&3&4&4&4\\ 3&1&2&3&3&3&3\\ 2&1&2&2&2&2&2\\ 1&1&1&1&1&1&1\\ \hline B&1&2&3&4&5&6 \end{array}$$ So (derive) $$P(B=k)=\frac{2N-2k+1}{N^2}$$ and $$E(B)=\frac{N(N+1)(2N+1)}{6N^2}=\frac{(N+1)(2N+1)}{6N}$$
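The distribution and expectation of $B$ can be checked the same way, by counting pairs (again a brute-force Python sketch of my own, not part of the derivation):

```python
N = 6  # same N as the table; any positive integer works
# count pairs (x, y) with min(x, y) = k by brute force
counts = {k: 0 for k in range(1, N + 1)}
for x in range(1, N + 1):
    for y in range(1, N + 1):
        counts[min(x, y)] += 1

# P(B = k) = (2N - 2k + 1) / N^2, so the counts must be 2N - 2k + 1
for k in range(1, N + 1):
    assert counts[k] == 2 * N - 2 * k + 1

# E(B) = N(N+1)(2N+1) / (6 N^2)
EB = sum(k * counts[k] for k in counts) / N**2
assert abs(EB - N * (N + 1) * (2 * N + 1) / (6 * N**2)) < 1e-12
```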
Compute $E(AB)$
Note that $$AB=\left\{ \begin{array}{lr} XY + Y^2& Y\le X\\XY+X^2&Y>X\end{array}\right.$$ so that
$$AB=XY +\min(X^2,Y^2)=XY+Z,$$ where $Z:=\min(X,Y)^2$.
Now note that $E(XY)=E(X)E(Y)=\left(\frac{N+1}{2}\right)^2$ by independence. You can now finish the computation of $Cov(A,B)=E(XY)+E(Z)-E(A)E(B)$ by finding $E(Z)$. (You can recycle the distribution of $B$ for this: note that $P(Z=k^2)=P(B=k)$.)
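The whole decomposition can be verified numerically at once. Here is a brute-force Python check (all names are mine, and $N=6$ is just an illustration):

```python
from itertools import product

N = 6  # illustration; any positive integer works
pairs = list(product(range(1, N + 1), repeat=2))  # all N^2 equally likely (X, Y)

def E(f):
    # expectation of f(X, Y) under the uniform joint distribution
    return sum(f(x, y) for x, y in pairs) / N**2

EA  = E(lambda x, y: x + y)                # E(A)
EB  = E(lambda x, y: min(x, y))            # E(B)
EXY = E(lambda x, y: x * y)                # E(XY) = ((N+1)/2)^2 by independence
EZ  = E(lambda x, y: min(x, y) ** 2)       # E(Z), with Z = min(X, Y)^2
EAB = E(lambda x, y: (x + y) * min(x, y))  # E(AB)

assert abs(EAB - (EXY + EZ)) < 1e-12       # confirms AB = XY + Z
cov = EXY + EZ - EA * EB                   # Cov(A, B)
```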
That then leads you to the covariance of A and B since you already have $E(A), E(B)$.
For the joint cumulative distribution of $A$ and $B$, use the geometry of the tables for $A$ and $B$ to note that there are 3 cases (and some meticulous computation for the probabilities in each of the 3 cases).
$$F(a,b)\equiv P(A\le a, B\le b)=\frac{1}{N^2} \left \{\begin{array}{lr} 2Nb-b^2& a\ge N+b\\ N^2-\frac{(2N-a)(2N-a+1)}{2}&a\le 2b\\ N^2-\frac{(2N-a)(2N-a+1)}{2} - \frac{(a-2b)(a-2b+1)}{2}& \mbox{otherwise} \end{array}\right.$$
If you want to find the probability mass function, then $$P(A=a,B=b)=F(a,b)-F(a-1,b)-F(a,b-1)+F(a-1,b-1)$$
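As a check on this inclusion-exclusion step, here is a short Python sketch (mine, not part of the answer) that computes $F$ by brute force rather than via the closed form, and confirms that the formula recovers the joint pmf:

```python
from itertools import product

N = 6  # small example; any positive integer works
pairs = list(product(range(1, N + 1), repeat=2))

def F(a, b):
    # brute-force CDF P(A <= a, B <= b); stands in for the closed form above
    return sum(1 for x, y in pairs if x + y <= a and min(x, y) <= b) / N**2

# inclusion-exclusion on the CDF must reproduce the joint pmf at every point
for a in range(2, 2 * N + 1):
    for b in range(1, N + 1):
        pmf = sum(1 for x, y in pairs if x + y == a and min(x, y) == b) / N**2
        assert abs(pmf - (F(a, b) - F(a - 1, b) - F(a, b - 1) + F(a - 1, b - 1))) < 1e-12
```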