Let $X,Y$ be Bernoulli random variables with parameter $p$. My question is: is it possible to compute the variance of $X+Y$ when the variables are not independent? My attempt was \begin{equation} \begin{split} Var(X+Y)&=E((X+Y)^2)-(E(X+Y))^2\\ &=p+2E(XY)+p-4p^2\\ &=2p-4p^2+2E(XY) \end{split} \end{equation} using $E(X^2)=E(X)=p$ since $X$ is $\{0,1\}$-valued (likewise for $Y$). But I don't know how to compute $E(XY)$ without a joint probability function, because $E(XY)=P(X=1,Y=1)$, and I'm not sure that this equals $p^2$ in general.
Any advice is welcome. Thank you!
You have to specify how $X$ and $Y$ are dependent in order to answer the question. Just saying they are "not independent" does not furnish enough information to compute the variance of their sum.
For instance, fix a constant $k$ with $\max(0,\, 2 - 1/p) \le k \le 1$ (the lower bound ensures every entry below is nonnegative) and construct the general joint PMF
$$\begin{array}{|c|c|c|c|} \hline & \Pr[X = 1] & \Pr[X = 0] & \\ \hline \Pr[Y = 1] & kp & (1-k)p & p \\ \hline \Pr[Y = 0] & (1-k)p & 1 + (k-2)p & 1-p \\ \hline & p & 1-p & 1 \\ \hline \end{array}$$
You can verify that the marginal probabilities meet the conditions of your problem, while the constant $k$ controls the strength of dependence between $X$ and $Y$. For instance, if $k = p$, then $X$ and $Y$ are independent; if $k = 1$, then $X = Y$ almost surely; and if $k = 0$, then the event $\{X = 1, Y = 1\}$ is impossible (maximal negative dependence) — in particular, when $p = 1/2$ this forces $X = 1 - Y$ almost surely.
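Since the table is a concrete construction, these claims can be checked numerically. Here is a short Python sketch (the helper names `joint_pmf` and `var_sum` are mine, not standard) that enumerates the four cells of the table above, confirms a marginal, and evaluates $\operatorname{Var}[X+Y]$ for a few values of $k$:

```python
def joint_pmf(p, k):
    """Joint PMF from the table above, keyed by (x, y).
    Valid when every entry is nonnegative, i.e. 1 + (k - 2)*p >= 0."""
    return {
        (1, 1): k * p,
        (0, 1): (1 - k) * p,
        (1, 0): (1 - k) * p,
        (0, 0): 1 + (k - 2) * p,
    }

def var_sum(p, k):
    """Var[X + Y] by direct enumeration over the four cells."""
    pmf = joint_pmf(p, k)
    mean = sum(pr * (x + y) for (x, y), pr in pmf.items())
    return sum(pr * (x + y - mean) ** 2 for (x, y), pr in pmf.items())

p = 0.3
# marginal check: Pr[X = 1] should equal p for any valid k
px1 = sum(pr for (x, _), pr in joint_pmf(p, 0.5).items() if x == 1)
print(px1)                # 0.3 (up to floating point)
print(var_sum(p, p))      # k = p: independence, so 2p(1-p) = 0.42
print(var_sum(p, 1.0))    # k = 1: X = Y, so 4p(1-p) = 0.84
print(var_sum(0.5, 0.0))  # k = 0, p = 1/2: X + Y = 1, so variance 0
```

Enumerating the joint PMF directly sidesteps any simulation noise, since $X+Y$ takes only the values $0$, $1$, $2$.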
Indeed, when $X = Y$ (case $k = 1$), then $\operatorname{Var}[X+Y] = 4 \operatorname{Var}[X] = 4p(1-p)$; and when $X = 1 - Y$ (case $k = 0$ with $p = 1/2$), then $X + Y = 1$ is constant and $\operatorname{Var}[X+Y] = 0$.
I leave it as an exercise for you to compute $\operatorname{Var}[X+Y]$ as a function of this "dependence parameter" $k$.
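If you want to sanity-check your closed-form answer without being told what it is, here is a Monte Carlo sketch (the function name `sample_xy` is mine) that draws pairs from the joint table above by inverse-transform sampling over the four cells and estimates the variance of the sum empirically:

```python
import random

def sample_xy(p, k, rng):
    """Draw one (X, Y) pair from the joint table above.
    Requires 1 + (k - 2)*p >= 0 so all cell probabilities are valid."""
    u = rng.random()
    if u < k * p:            # cell (1, 1), mass k*p
        return 1, 1
    if u < p:                # cell (0, 1), mass (1-k)*p
        return 0, 1
    if u < (2 - k) * p:      # cell (1, 0), mass (1-k)*p
        return 1, 0
    return 0, 0              # cell (0, 0), remaining mass

rng = random.Random(0)       # fixed seed for reproducibility
p, k = 0.3, 1.0              # k = 1: X = Y almost surely
n = 200_000
sums = [sum(sample_xy(p, k, rng)) for _ in range(n)]
mean = sum(sums) / n
var = sum((s - mean) ** 2 for s in sums) / n
print(var)  # should be close to 4p(1-p) = 0.84
```

Varying $k$ between its lower bound and $1$ and comparing the estimate against your formula is a quick way to confirm the exercise.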