So I am trying to find the value of $t$ that maximizes each of the following expressions. I am not trying to solve for an actual value of $t$; rather, I want to know whether one maximizer is larger than the other. My conjecture is that the maximizing $t$ is larger for the second expression. $$ \max_{t>0}e^{-rt}B(t) + \int_{0}^{t}e^{-rt}A(x)\,dx$$
$$\max_{t>0}e^{-r't}B(t) + \int_{0}^{t}e^{-r't}A(x)\,dx$$ with the following conditions: $B(t)>0$, $t>0$, $A(x)>0$, $A(t)>0$, and $r'>r$ (note that $r'$ simply denotes a value larger than $r$, not a derivative).
I tried to solve this using supermodularity properties (via Topkis's theorem) but have not made much progress. For example, if the integral terms were not in the equation, I could use a $\log$ transformation to make the problem tractable. Unless I am missing something, it does not seem possible to solve for $t$ as a function of $r$ directly. I am not interested in the value of the maximum, only in the $\arg\max$.
If I understand your statement correctly, the conjecture is true, but in the opposite direction (?)
I assume here that the exponential factor in the integral is $e^{-rx}$ and not $e^{-rt}$; otherwise the problem reduces to the one without the integral term (replace $B(t)$ by $B(t)+\int_0^t A(x)\,dx$). We assume everything is differentiable, and we want to study the monotonicity of the arg max of:
$$ f_r(t) = e^{-rt} B(t) + \int_0^t e^{-rx} A(x) dx = B_r(t) + \int_0^t A_r(x)dx$$ where we have introduced $B_r(t) = e^{-rt} B(t)$ and $A_r(x)=e^{-rx}A(x)$.
Consider what happens when replacing $r$ by $r+h$. We have: $$ f_{r+h}(t) = e^{-ht} B_r(t) + \int_0^t e^{-hx} A_r(x) dx$$ which has as derivative: $$ f_{r+h}'(t) = e^{-ht} (B_r'(t) - h B_r(t) + A_r(t))= e^{-ht} (f_r'(t)-h B_r(t))$$
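As a sanity check, the identity $f_{r+h}'(t) = e^{-ht}(f_r'(t) - h B_r(t))$ can be verified numerically. Here is a minimal sketch with illustrative choices (not from the question): $B(t) = 1+t$ and $A(x) = 1$, so that $f_r(t) = e^{-rt}(1+t) + (1 - e^{-rt})/r$.

```python
import numpy as np

# Illustrative (assumed) choices: B(t) = 1 + t, A(x) = 1,
# giving f_r(t) = e^{-rt}(1+t) + (1 - e^{-rt})/r.

def f(t, r):
    return np.exp(-r * t) * (1 + t) + (1 - np.exp(-r * t)) / r

def fprime(t, r, eps=1e-6):
    # derivative in t by central finite difference
    return (f(t + eps, r) - f(t - eps, r)) / (2 * eps)

r, h, t = 0.5, 0.3, 2.0
lhs = fprime(t, r + h)                                   # f'_{r+h}(t)
rhs = np.exp(-h * t) * (fprime(t, r) - h * np.exp(-r * t) * (1 + t))
print(lhs, rhs)  # agree to finite-difference accuracy
```

The two sides agree up to the finite-difference error, for any $t$, $r$, $h$ one cares to try.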
Suppose for a given $r>0$ that $f_r$ has a nice global (concave) maximum at $t=t^*_r$, so that in particular $f_r'(t^*_r) =0$ and $f_r''(t^*_r) < 0$.
Setting $t=t_r^*+u$, a Taylor expansion of $f'_{r+h}(t)$ in $t$ (noting that $f_r'(t^*_r)=0$) gives: $$ f_{r+h}'(t_r^* + u) = e^{-h(t_r^*+u)} \left( f_r''(t^*_r)\, u - h B_r(t^*_r) + o(h,u)\right).$$ Here $f_r''<0$ and $B_r>0$, so there is an extremum at $u = \lambda h + o(h)$ where $\lambda = B_r(t^*_r)/f''_r(t^*_r) < 0$. Under the above assumptions, $t^*_r$ is therefore decreasing in $r$, and we have a formula for its derivative: $$ \frac{d t^*_r}{dr} = \frac{B_r(t^*_r)}{f''_r(t^*_r)}<0.$$

The above gives the behaviour of a local max under the concavity assumption. To get a global result, note that the derivative with respect to $r$ is given by $$ \frac{\partial f_r}{\partial r}(t) = - t B_r(t) - \int_0^t x A_r(x)\, dx .$$ Thus at two values $0\leq t_1 < t_2<+\infty$ we have (writing e.g. $B_1=B_r(t_1)$, etc.): $$ \frac{\partial f_r}{\partial r}(t_1) - \frac{\partial f_r}{\partial r}(t_2) = t_2 B_2 - t_1 B_1 + \int_{t_1}^{t_2} x A_r(x)\,dx = t_2 B_2 - t_1 B_1 + \xi \int_{t_1}^{t_2} A_r(x)\, dx$$ for some $\xi \in (t_1,t_2)$ (by the mean value theorem). Now, since $f_r(t)=B_r(t)+ \int_0^t A_r(x)\,dx$, we get: $$ \frac{\partial f_r}{\partial r}(t_1) - \frac{\partial f_r}{\partial r}(t_2) = (t_2-\xi)B_2 + (\xi-t_1)B_1 + \xi \left(f_r(t_2)-f_r(t_1)\right).$$

In particular, since $(t_2-\xi)B_2 + (\xi-t_1)B_1 >0$, we may conclude that if $f_r(t_2)\leq f_r(t_1)$ then for every $r'>r$ we must have $f_{r'}(t_2) < f_{r'}(t_1)$. This implies that the arg max (whatever choice you make among possible maximizers) is always a decreasing function of $r$.
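Both the monotonicity and the derivative formula can be checked numerically. A minimal sketch, again with the illustrative (assumed) choices $B(t)=1+t$ and $A(x)=1$: here a short computation gives the closed form $t^*_r = 2/r - 1$, so $dt^*_r/dr = -2/r^2$, which the formula $B_r(t^*_r)/f''_r(t^*_r)$ should reproduce.

```python
import numpy as np

# Illustrative (assumed) choices: B(t) = 1 + t, A(x) = 1,
# so f_r(t) = e^{-rt}(1+t) + (1 - e^{-rt})/r and t*_r = 2/r - 1.

def f(t, r):
    return np.exp(-r * t) * (1 + t) + (1 - np.exp(-r * t)) / r

def argmax_t(r):
    # brute-force maximization of f(., r) on a fine grid over t > 0
    ts = np.linspace(1e-3, 20, 200001)
    return ts[np.argmax(f(ts, r))]

for r in (0.5, 0.75, 1.0, 1.5):
    print(r, argmax_t(r))  # t* = 2/r - 1: decreasing in r

# Check dt*/dr = B_r(t*) / f''_r(t*) at r = 0.5 (should be -2/r^2 = -8):
r = 0.5
t_star = argmax_t(r)
B_r = np.exp(-r * t_star) * (1 + t_star)
h = 1e-4
f_pp = (f(t_star + h, r) - 2 * f(t_star, r) + f(t_star - h, r)) / h**2
print(B_r / f_pp, -2 / r**2)  # both approximately -8
```

Of course this is only a spot check of one concave example, not a substitute for the argument above, but it is a quick way to convince oneself of the direction of the result.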