In a linked post, @Ben Grossmann mentioned that if one of the matrices has rank $1$ and no zero entries, then $\operatorname{rank}(A\circ B)= \operatorname{rank}(A)\operatorname{rank}(B)$, where $\circ$ denotes the Hadamard (entrywise) product.
My progress is as follows, where the superscript $(\cdot)^H$ indicates conjugate transposition.
Let $r_A=1$ and $r_B$ be the ranks of $A$ and $B$, respectively. Using, e.g., the SVD, $A$ and $B$ can be written as $$ A=p_1q_1^H, \quad B=\sum_{j=1}^{r_B}s_jt_j^H, $$ for some vectors $p_1$, $q_1$, $s_j$, $t_j$, $j=1,\ldots,r_B$. Further, we have $$ A\circ B=\sum_{j=1}^{r_B}(p_1\circ s_j)(q_1\circ t_j)^H, $$ so $A\circ B$ is a sum of $r_B$ rank-one matrices, and thus its rank cannot exceed $r_B$.
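As a sanity check, the rank-one expansion of $A\circ B$ above can be verified numerically. This is a quick sketch with NumPy, using random complex vectors (the dimensions $m$, $n$ and the rank $r_B$ are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, rB = 5, 4, 2

# Rank-1 matrix A = p1 q1^H and a rank-rB matrix B = sum_j s_j t_j^H.
p1 = rng.standard_normal(m) + 1j * rng.standard_normal(m)
q1 = rng.standard_normal(n) + 1j * rng.standard_normal(n)
s = rng.standard_normal((rB, m)) + 1j * rng.standard_normal((rB, m))
t = rng.standard_normal((rB, n)) + 1j * rng.standard_normal((rB, n))

A = np.outer(p1, q1.conj())
B = sum(np.outer(s[j], t[j].conj()) for j in range(rB))

# Entrywise (Hadamard) product vs. the claimed sum of rank-one terms.
lhs = A * B
rhs = sum(np.outer(p1 * s[j], (q1 * t[j]).conj()) for j in range(rB))

assert np.allclose(lhs, rhs)
```

The identity follows entrywise: $(A\circ B)_{ik} = p_{1,i}\,\bar q_{1,k}\sum_j s_{j,i}\,\bar t_{j,k} = \sum_j (p_1\circ s_j)_i\,\overline{(q_1\circ t_j)}_k$.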
The next step is to show $$ \operatorname{rank}\left[\sum_{j=1}^{r_B}(p_1\circ s_j)(q_1\circ t_j)^H\right]=r_B, $$ which does not seem trivial.
Here's a quick proof. Suppose (without loss of generality) that $A$ is rank $1$ with all non-zero entries. Let $u,v$ be column vectors such that $A = uv^T$; notably, $u,v$ also have only non-zero entries. It follows that $D_u = \operatorname{diag}(u)$ and $D_v = \operatorname{diag}(v)$ are invertible matrices. Since $(A \circ B)_{ij} = u_i v_j B_{ij} = (D_u B D_v)_{ij}$, we have $$ \operatorname{rank}(A \circ B) = \operatorname{rank}(D_u BD_v) = \operatorname{rank}(B) = \operatorname{rank}(A) \cdot \operatorname{rank}(B), $$ which was what we wanted.
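The key identity $A \circ B = D_u B D_v$ and the resulting rank equality are easy to check numerically. A minimal sketch (dimensions and the rank of $B$ are arbitrary; $u,v$ are drawn away from zero so $A$ has no zero entries):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, rB = 6, 5, 3

# Rank-1 matrix A = u v^T whose entries are all nonzero.
u = rng.uniform(1.0, 2.0, m)
v = rng.uniform(1.0, 2.0, n)
A = np.outer(u, v)

# A generic matrix B of rank rB.
B = rng.standard_normal((m, rB)) @ rng.standard_normal((rB, n))

# A ∘ B equals D_u B D_v, so multiplying by the invertible
# diagonal factors leaves the rank unchanged.
Du, Dv = np.diag(u), np.diag(v)
assert np.allclose(A * B, Du @ B @ Dv)
assert np.linalg.matrix_rank(A * B) == np.linalg.matrix_rank(B) == rB
```

Since a generic product of an $m\times r_B$ and an $r_B\times n$ Gaussian matrix has rank $r_B$, the final assertion confirms $\operatorname{rank}(A\circ B)=\operatorname{rank}(B)$.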