Let $A\in \mathbb{R}^{m\times n}$ be a matrix and denote by $A_S$ the submatrix of $A$ whose columns are restricted to a set $S\subset [n]:=\{1,2,\ldots, n\}$. One says that $A$ satisfies the restricted isometry property (RIP) of order $K$ with constant $\delta_K$, where $$\delta_K=\max_{S\subset[n]:\,|S|\le K}\|A_S^TA_S-I\|_{2\to 2}.$$
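For small instances, $\delta_K$ can be computed by brute force directly from this definition. A minimal sketch in Python/NumPy (the random Gaussian matrix and the helper name `rip_constant` are illustrative assumptions; the enumeration is exponential in $n$, so this is only feasible for toy sizes):

```python
import itertools

import numpy as np

def rip_constant(A, K):
    """Brute-force delta_K = max over |S| <= K of ||A_S^T A_S - I||_2.
    Exponential in n; only feasible for toy sizes."""
    n = A.shape[1]
    delta = 0.0
    for k in range(1, K + 1):
        for S in itertools.combinations(range(n), k):
            cols = list(S)
            G = A[:, cols].T @ A[:, cols] - np.eye(k)
            delta = max(delta, np.linalg.norm(G, 2))  # spectral norm
    return delta

rng = np.random.default_rng(0)
m, n = 20, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)  # columns have unit norm in expectation
print(rip_constant(A, 3))
```

Note that $\delta_K$ is non-decreasing in $K$ by definition, since the maximum runs over a larger family of subsets.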
There is a very nice Cauchy–Schwarz-like inequality involving this constant, which can be stated as follows.
If $x,y\in \mathbb{R}^n$ with $\operatorname{supp}(x)\subset S_1$, $\operatorname{supp}(y)\subset S_2$, and $S_1\cap S_2=\emptyset$, then $$|\langle Ax,Ay\rangle |\le \delta_{|S_1|+|S_2|} \|x\|_2\|y\|_2.$$
Emmanuel Candès proved this inequality here. The proof is rather simple.
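For intuition, the inequality is easy to check numerically. A hedged sketch (the random Gaussian $A$ and the fixed support choice are assumptions; `delta4` is computed by brute force directly from the definition):

```python
import itertools

import numpy as np

rng = np.random.default_rng(1)
m, n = 30, 6
A = rng.standard_normal((m, n)) / np.sqrt(m)

# delta_4: brute force over all column subsets of size <= 4
delta4 = max(
    np.linalg.norm(A[:, list(S)].T @ A[:, list(S)] - np.eye(k), 2)
    for k in range(1, 5)
    for S in itertools.combinations(range(n), k)
)

# random x, y supported on the disjoint sets S1 = {0, 1} and S2 = {2, 3}
ok = True
for _ in range(100):
    x = np.zeros(n); x[[0, 1]] = rng.standard_normal(2)
    y = np.zeros(n); y[[2, 3]] = rng.standard_normal(2)
    lhs = abs(np.dot(A @ x, A @ y))
    rhs = delta4 * np.linalg.norm(x) * np.linalg.norm(y)
    ok = ok and lhs <= rhs + 1e-12
print(ok)
```

Here $|S_1| + |S_2| = 4$, so the bound uses $\delta_4$.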
I am trying to find a similar inequality for the quantity on the left-hand side, but now a lower bound, and with the constraint $S_1\cap S_2\ne \emptyset$ instead. I do not know whether the best such bound is trivially $0$, but I hope it is not. I have not yet succeeded in finding one, and a search of the literature has turned up nothing.
So it would be really helpful if anyone could either direct me to some relevant literature or give some tips for coming up with a non-trivial lower bound. Thanks in advance.
I don't know if this is sufficiently interesting, but it may be the kind of thing that you are driving at when you say that the supports should overlap.
For $x, y$ write $\Delta(x, y) := \delta_{\#(\text{supp} x \cup \text{supp} y)} \|x\|_2 \|y\|_2$.
Take $x, y$. Let $\bar{x}, \bar{y}$ denote their restrictions to the common support $S_c$, set $\hat{x} := x - \bar{x}$ and $\hat{y} := y - \bar{y}$.
Then $$ \langle A x, A y \rangle = \langle A \bar{x}, A \bar{y} \rangle + \langle A \hat{x}, A \hat{y} \rangle + \langle A \bar{x}, A \hat{y} \rangle + \langle A \hat{x}, A \bar{y} \rangle \geq \langle A \bar{x}, A \bar{y} \rangle - \Delta_3, $$ where $\Delta_3(x, y) := \Delta(\hat{x}, \hat{y}) + \Delta(\bar{x}, \hat{y}) + \Delta(\hat{x}, \bar{y})$; note that those 3 pairs have disjoint support.
Using $\|A_{S_c}^T A_{S_c} - I\| \leq \delta_{|S_c|}$ then $$ \langle A x, A y \rangle \geq \langle \bar{x}, \bar{y} \rangle - \Delta_4(x,y), $$ where $\Delta_4(x,y) := \Delta_3(x,y) + \Delta(\bar{x}, \bar{y})$.
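The resulting lower bound $\langle Ax, Ay\rangle \ge \langle \bar x, \bar y\rangle - \Delta_4(x,y)$ can be sanity-checked numerically. A sketch, assuming a small random Gaussian matrix with the $\delta_k$ computed by brute force (the support choices are arbitrary illustrations):

```python
import itertools

import numpy as np

rng = np.random.default_rng(3)
m, n = 40, 6
A = rng.standard_normal((m, n)) / np.sqrt(m)

# brute-force delta_k for k = 0..n (non-decreasing in k by definition)
deltas = [0.0]
for k in range(1, n + 1):
    worst = max(
        np.linalg.norm(A[:, list(S)].T @ A[:, list(S)] - np.eye(k), 2)
        for S in itertools.combinations(range(n), k)
    )
    deltas.append(max(deltas[-1], worst))

def Delta(u, v):
    """Delta(u, v) = delta_{#(supp u union supp v)} * ||u||_2 * ||v||_2."""
    supp = np.union1d(np.flatnonzero(u), np.flatnonzero(v))
    return deltas[len(supp)] * np.linalg.norm(u) * np.linalg.norm(v)

# x supported on {0,1,2}, y on {1,2,3}; common support S_c = {1,2}
x = np.zeros(n); x[[0, 1, 2]] = rng.standard_normal(3)
y = np.zeros(n); y[[1, 2, 3]] = rng.standard_normal(3)
Sc = [1, 2]
xbar = np.zeros(n); xbar[Sc] = x[Sc]; xhat = x - xbar
ybar = np.zeros(n); ybar[Sc] = y[Sc]; yhat = y - ybar

Delta4 = Delta(xhat, yhat) + Delta(xbar, yhat) + Delta(xhat, ybar) + Delta(xbar, ybar)
lhs = np.dot(A @ x, A @ y)
lower = np.dot(xbar, ybar) - Delta4
print(lhs >= lower)
```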
It is clear that you need something involving $\langle \bar{x}, \bar{y} \rangle$: for the $2\times 2$ identity matrix and $x = (1, -1)$, $y = (1, 1)$, the quantity you are interested in is zero, despite the nontrivial common support.
Addendum:
If you start with $\operatorname{supp} x = \operatorname{supp} y$, then the following can be said (this, however, has nothing to do with compressed sensing). Suppose $S_c = [n]$ for ease of notation; otherwise everything has to be restricted to $S_c$. Since $A^T A$ is a symmetric positive (possibly semi-)definite matrix, there is an orthonormal basis of its eigenvectors for $\mathbb{R}^n$, say $v_1, \ldots, v_n$, with corresponding eigenvalues $\lambda_i \geq 0$. Expand $x = \sum_i x_i v_i$ and $y = \sum_j y_j v_j$. Then, since the $v_i$ are orthonormal, $$ \langle A x, A y \rangle = x^T A^T A y = \sum_{i,j} x_i y_j \lambda_j v_i^T v_j = \sum_i \lambda_i x_i y_i \stackrel{{!}}{\color{red}\geq} (\min_k \lambda_k) \sum_i x_i y_i = m \langle x, y \rangle $$ with $m := \min_k \lambda_k$. The marked inequality is not valid in general, as the signs of the summands can differ (the OP pointed this out). But on the other hand such a bound cannot hold in general: take $A = \operatorname{diag}(\sqrt{1.2}, 1)$, so that $A^T A = \operatorname{diag}(1.2, 1)$, and $x = (1 / \sqrt{1.2}, 1)$, $y = (-1 / \sqrt{1.2}, 1)$. Then $\langle A x, A y \rangle = 0$ while $\langle x, y \rangle > 0$.
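A quick numeric check of this counterexample, taking $A = \operatorname{diag}(\sqrt{1.2}, 1)$ so that $A^T A = \operatorname{diag}(1.2, 1)$:

```python
import numpy as np

A = np.diag([np.sqrt(1.2), 1.0])
x = np.array([1 / np.sqrt(1.2), 1.0])
y = np.array([-1 / np.sqrt(1.2), 1.0])

# <Ax, Ay> = x^T (A^T A) y = 1.2 * (-1/1.2) + 1 = 0,
# while <x, y> = -1/1.2 + 1 = 1/6 > 0
print(np.dot(A @ x, A @ y))
print(np.dot(x, y))
```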
From the inequality $\|A_{S_c}^T A_{S_c} - I\| \leq \delta_{|S_c|}$ we get $|\lambda_k - 1| \leq \delta_{|S_c|}$ for each $k$, since the spectral norm of the symmetric matrix $A_{S_c}^T A_{S_c} - I$ equals the largest eigenvalue in absolute value. Therefore $m \geq 1 - \delta_{|S_c|}$. This is positive if $\delta_{|S_c|} < 1$, which is usually the case in the context of the RIP.
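This eigenvalue bound is again easy to verify numerically: for the symmetric matrix $G = A_{S_c}^T A_{S_c}$, the quantity $\|G - I\|_2$ is exactly $\max_k |\lambda_k - 1|$, so all eigenvalues lie in $[1-\delta, 1+\delta]$. A sketch with an assumed random Gaussian submatrix standing in for $A_{S_c}$:

```python
import numpy as np

rng = np.random.default_rng(2)
m, s = 50, 4
A_Sc = rng.standard_normal((m, s)) / np.sqrt(m)  # stands in for the columns of A on S_c

G = A_Sc.T @ A_Sc
delta = np.linalg.norm(G - np.eye(s), 2)  # = max_k |lambda_k - 1|, since G is symmetric
eigs = np.linalg.eigvalsh(G)
print(eigs.min(), 1 - delta)
print(eigs.max(), 1 + delta)
```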