Let $\rho$ and $\phi$ be any two distinct density matrices on $H_A \otimes H_B$ such that $Tr_B (\rho) = Tr_B (\phi)$.
Does there always exist a unitary $U$ on $H_A \otimes H_B$ such that, writing $\rho^{\prime} = U\rho U^{\dagger}$ and $\phi^{\prime} = U\phi U^{\dagger}$, we have $Tr_B (\rho^{\prime}) \neq Tr_B (\phi^{\prime})$?
Here's a detailed look at the case where $\mathcal H_A = \mathcal H_B = \Bbb C^2$. A similar line of reasoning is applicable whenever both $\mathcal H_A,\mathcal H_B$ have dimension greater than $1$.
Let $\sigma = \rho - \phi \neq 0$. For a proof by contrapositive, we will show that if $\operatorname{tr}_B(U\sigma U^\dagger) = 0$ for all $U$, then $\sigma = 0$ (and hence $\rho = \phi$). Note that $\sigma$ is Hermitian, and hence unitarily diagonalizable. That is, there exists a $U$ such that $$ U\sigma U^\dagger = \pmatrix{\lambda_1 \\ & \lambda_2 \\ & &\lambda_3 \\ &&& \lambda_4}. $$ The partial trace of this diagonal matrix is $\operatorname{diag}(\lambda_1 + \lambda_2,\ \lambda_3 + \lambda_4)$, so our hypothesis yields $\lambda_1 + \lambda_2 = \lambda_3 + \lambda_4 = 0$. However, left-multiplying $U$ by a permutation matrix $P$ yields diagonal matrices $(PU)\sigma(PU)^\dagger$ with the eigenvalues permuted arbitrarily. Thus, we have $\lambda_i + \lambda_j = 0$ for all $i \neq j$.
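The two computations above (the partial trace of a diagonal matrix, and the permutation trick exposing a nonzero block sum) can be checked numerically. This is only a sketch: `partial_trace_B` is a helper written here for the index bookkeeping, not a library function.

```python
import numpy as np

def partial_trace_B(M, dim_A, dim_B):
    # View M as M[a, b, a', b'] and sum over b = b' to trace out the
    # second tensor factor.
    return np.trace(M.reshape(dim_A, dim_B, dim_A, dim_B), axis1=1, axis2=3)

# A diagonal sigma whose blocks of two eigenvalues each sum to zero:
sigma = np.diag([1.0, -1.0, 2.0, -2.0])
print(partial_trace_B(sigma, 2, 2))  # diag(l1 + l2, l3 + l4) = 0

# Permuting the eigenvalues (swap positions 1 and 2) breaks the blocks,
# so the partial trace is no longer zero:
P = np.eye(4)[[0, 2, 1, 3]]
print(partial_trace_B(P @ sigma @ P.T, 2, 2))  # diag(1 + 2, -1 - 2)
```

So the condition "$\operatorname{tr}_B(U\sigma U^\dagger) = 0$ for every $U$" really does impose the pairwise constraints $\lambda_i + \lambda_j = 0$, not just the two block constraints from one ordering.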
To conclude that the eigenvalues are all zero, note for instance that $$ \lambda_1 = -\lambda_2 = \lambda_3 = -\lambda_1 \implies \lambda_1 = 0, $$ and similarly for the remaining eigenvalues. Thus, $\sigma$ is a Hermitian matrix all of whose eigenvalues are zero, so $\sigma = 0$, which is what we wanted.
So, in the case that $\mathcal H_A = \mathcal H_B = \Bbb C^2$, we indeed find that such a $U$ must exist whenever $\rho \neq \phi$.
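For a concrete instance of this conclusion, take $\rho = |00\rangle\langle 00|$ and $\phi = |01\rangle\langle 01|$: both reduce to $|0\rangle\langle 0|$ on $\mathcal H_A$, yet the SWAP unitary separates their partial traces. A minimal numerical sketch (again with a hand-rolled `partial_trace_B` helper, not a library call):

```python
import numpy as np

def partial_trace_B(M, dim_A, dim_B):
    # Trace out the second tensor factor.
    return np.trace(M.reshape(dim_A, dim_B, dim_A, dim_B), axis1=1, axis2=3)

P0 = np.array([[1.0, 0.0], [0.0, 0.0]])  # |0><0|
P1 = np.array([[0.0, 0.0], [0.0, 1.0]])  # |1><1|

rho = np.kron(P0, P0)  # |00><00|
phi = np.kron(P0, P1)  # |01><01|
# tr_B(rho) = tr_B(phi) = |0><0|, even though rho != phi.

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=float)
rho_p = SWAP @ rho @ SWAP.T
phi_p = SWAP @ phi @ SWAP.T
# Now tr_B(rho') = |0><0| while tr_B(phi') = |1><1|.
```

Here $U = \mathrm{SWAP}$ exchanges the two factors, so it moves the difference between $\rho$ and $\phi$ from the $B$ register (where the partial trace discards it) to the $A$ register (where it becomes visible).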
This argument extends easily to the case where $\dim(\mathcal H_B) = 2$ and $\dim(\mathcal H_A) \geq 2$ is arbitrary.
In fact, the above argument can be extended to handle the general case where $\dim(\mathcal H_A) > 1$. Let $m,n$ denote the dimensions of $\mathcal H_A, \mathcal H_B$ respectively. The matrix $\sigma = \rho - \phi$ has eigenvalues $\lambda_1,\lambda_2,\dots,\lambda_{mn}$. Since the partial trace of a diagonal matrix sums its eigenvalues in blocks of $n$, and a permutation can place any $n$ of them into a single block, suitable matrices $U$ (a diagonalizing unitary followed by a permutation) let us conclude that for any distinct indices $j_1,\dots,j_n$, we have $$ \sum_{k=1}^n \lambda_{j_k} = 0. $$ With that, we have $$ \lambda_1 + \sum_{k=3}^{n+1} \lambda_k = \lambda_2 + \sum_{k=3}^{n+1} \lambda_k = 0 \implies\\ \left(\lambda_1 + \sum_{k=3}^{n+1} \lambda_k\right) - \left(\lambda_2 + \sum_{k=3}^{n+1} \lambda_k\right) = 0 \implies\\ \lambda_1 - \lambda_2 = 0. $$ That is, we have $\lambda_1 = \lambda_2$. By a similar argument, $\lambda_j = \lambda_k$ for all $1 \leq j,k \leq mn$. On the other hand, the fact that $\operatorname{tr}(\sigma) = 0$ implies that $$ \lambda_1 + \cdots + \lambda_{mn} = mn \lambda_1 = 0 \implies \lambda_1 = 0, $$ so that all eigenvalues of $\sigma$ are equal to zero. Since $\sigma$ is a Hermitian matrix all of whose eigenvalues are zero, we conclude that $\sigma = 0$, as desired.
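The combinatorial core of the general argument is that a vector of $mn$ eigenvalues, every $n$-element subset of which sums to zero, must be the zero vector. One way to sanity-check this is to verify that the matrix whose rows are the indicator vectors of all $n$-element subsets has full column rank, so the only solution of the constraint system is $\lambda = 0$. A sketch with the illustrative choice $m = 2$, $n = 3$:

```python
from itertools import combinations
import numpy as np

m, n = 2, 3  # example dimensions: dim H_A = 2, dim H_B = 3
N = m * n

# One row per n-element subset S: the constraint sum_{k in S} lambda_k = 0.
A = np.array([[1.0 if i in S else 0.0 for i in range(N)]
              for S in combinations(range(N), n)])

# Full column rank (rank = m*n) means lambda = 0 is the only solution.
print(np.linalg.matrix_rank(A))
```

The rank is $6 = mn$, confirming that the subset-sum constraints alone force every eigenvalue to vanish, in agreement with the pairwise-difference argument above.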