In one of my classes, our teacher introduced us to a piece of modeling software. Using it, we had to model a lake with fish in it. The fish population would be governed by two things: natural logistic growth and fishing. The task was then to maximize the amount fished over a set period of time. The exercise was meant to teach us the software, but just for fun, I wanted to find the absolute maximum.
Let $r$ be the logistic growth rate, $k$ be the carrying capacity, $T$ be the total time, and $P_0$ be the starting fish population with $P_0 < k$. All four of these values are fixed. Let $P(t)$ be the fish population at time $t$ and $f(t)$ be the fishing rate. Then I got that the differential equation for $P$ would be $$\frac{dP}{dt} = \underbrace{rP\left(1 - \frac{P}{k}\right)}_{\text{logistic growth}} - f \tag 1$$
while the amount fished (which is the value to be maximized) is $$\int_0^T f(t)dt \tag 2$$
The main condition is that if $P$ drops to $0$, then $f$ would also have to drop to $0$. You can't get fish if there's no fish left! Otherwise, you could make $f$ arbitrarily large. Also, $f$ must be nonnegative: no adding fish to the lake.
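Equation $(1)$ is easy to explore numerically. Below is a minimal sketch (forward Euler, with made-up parameter values) that also enforces the "no fish, no fishing" rule:

```python
# Forward-Euler simulation of dP/dt = r*P*(1 - P/k) - f(t).
# All parameter values here are hypothetical, chosen only for illustration.

def simulate(r, k, P0, T, f, n=10_000):
    """Return (P(T), total harvest). Fishing stops whenever P hits zero."""
    dt = T / n
    P, harvest = P0, 0.0
    for i in range(n):
        rate = f(i * dt) if P > 0 else 0.0   # no fish left -> no fishing
        P = max(P + dt * (r * P * (1 - P / k) - rate), 0.0)
        harvest += rate * dt
    return P, harvest

# Sanity check: with no fishing the population approaches the carrying capacity k.
P_end, H = simulate(r=1.0, k=100.0, P0=10.0, T=20.0, f=lambda t: 0.0)
print(P_end, H)   # P_end ≈ 100, H = 0
```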
Trying to solve $(1)$ for an explicit equation for $P$ seems impossible, but I'll show what I tried. Setting it up as $Mdt + NdP = 0$ makes it $$\left( rP \left( 1-\frac{P}{k} \right) - f \right) dt + (-1) dP = 0$$
Multiplying by an integrating factor $u(t,P)$ and imposing exactness, $\frac{\partial (uM)}{\partial P} = \frac{\partial (uN)}{\partial t}$, I get that $$r \left( 1- \frac{2P}{k} \right)u = -\frac{\partial u}{\partial t} - \left( rP\left(1-\frac{P}{k}\right)-f \right) \frac{\partial u}{\partial P}$$
I'm pretty new to differential equations, and I don't know how to solve this (or if it even is solvable).
Rearranging $(1)$, I get that $f = rP\left(1-\frac{P}{k}\right) - \frac{dP}{dt}$, which means that $(2)$ is $$\int_0^T \left( rP\left(1-\frac{P}{k}\right) - \frac{dP}{dt} \right) dt = r\int_0^T P\left(1-\frac{P}{k}\right)dt - P(T)+P_0$$
Intuitively, the optimum should have $P(T) = 0$: if it were greater than $0$, more fish could have been taken. I'm not sure what to do from here, though. One of the challenges with this problem is that fishing as hard as possible at each instant isn't optimal: it might pay off to let the fish grow at that instant so that you can fish more later.
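The rearranged identity can be sanity-checked numerically: along any trajectory of $(1)$, the total catch should equal $r\int_0^T P\left(1-\frac{P}{k}\right)dt - P(T) + P_0$. A quick sketch (forward Euler, arbitrary illustrative parameters):

```python
# Check that  ∫ f dt  =  r ∫ P(1 - P/k) dt - P(T) + P_0  along a trajectory.

def check_identity(r, k, P0, T, f, n=200_000):
    dt = T / n
    P = P0
    catch = growth = 0.0
    for i in range(n):
        rate = f(i * dt) if P > 0 else 0.0
        catch += rate * dt                      # left-hand side, ∫ f dt
        growth += r * P * (1 - P / k) * dt      # ∫ r P (1 - P/k) dt
        P += dt * (r * P * (1 - P / k) - rate)
    return catch, growth - P + P0

lhs, rhs = check_identity(r=1.0, k=1.0, P0=0.5, T=1.0, f=lambda t: 0.2)
print(lhs, rhs)   # the two sides agree
```

For the Euler scheme the identity even holds exactly (it telescopes), so the two printed values match up to floating-point error.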
My questions:
$1. $ What is the $f(t)$ that maximizes $(2)$ given that $P$ changes according to $(1)$, $f(t) \ge 0$, and if $P = 0$, then $f = 0$?
$2. $ Can the problem be modified in some way to get rid of $r, k, T,$ or $P_0$ (i.e. can it be transformed so that one of those can be fixed at $1$ [or any other value])? I think a linear transformation might be useful here, but I don't know.
$3. $ I'm doubtful of an exact closed-form solution, so are there approximations of $f(t)$?
We can rescale the function $P$. Let $P(t)=k \tilde P(rt)$, so in the new variable $w=rt$ we get the equation $$\frac{d}{dw}\tilde P(w) = \tilde P(w)\left(1-\tilde P(w)\right) - \tilde f(w) \quad\text{in}\quad [0, rT],\quad \text{where}\quad \tilde f(w)= \frac{f(w/r)}{rk}.$$ In this way $\tilde P(0)=k^{-1}P_0 \in (0,1)$. For the amount fished, the change of variable $t=w/r$ gives $$\int_0^T f(t) \,dt = \frac1r\int_0^{rT} f(w/r) \,dw = k \int_0^{rT} \tilde f(w) \,dw.$$ Hence we can always reduce to the case $P'=P(1-P)-f(t)$ with $P_0 \in (0,1)$: the parameters $r$ and $k$ disappear, and only the rescaled horizon $rT$ remains. For the rest of the answer I take $rT=1$, i.e. I work on $[0,1]$.
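As a sanity check on the rescaling, one can simulate both problems and compare (a sketch with arbitrary parameter values; `euler` is a hypothetical helper): starting from a profile $\tilde f(w)$ and setting $f(t)=rk\,\tilde f(rt)$, the trajectories should satisfy $P(t)=k\tilde P(rt)$ and the catches should match up to the factor $k$.

```python
# Compare the original problem (r, k) with its rescaled version.

def euler(rhs, y0, t_end, n=100_000):
    """Integrate y' = rhs(t, y)[0]; also accumulate ∫ rhs(t, y)[1] dt."""
    dt = t_end / n
    y, integral_f = y0, 0.0
    for i in range(n):
        dy, fval = rhs(i * dt, y)
        y += dt * dy
        integral_f += fval * dt
    return y, integral_f

r, k, P0, T = 2.0, 50.0, 20.0, 3.0
f_tilde = lambda w: 0.1                      # a constant rescaled fishing rate

# Original variables: dP/dt = r P (1 - P/k) - f(t), with f(t) = r k f~(rt)
P_T, catch = euler(lambda t, P: (r*P*(1 - P/k) - r*k*f_tilde(r*t),
                                 r*k*f_tilde(r*t)), P0, T)
# Rescaled variables: dP~/dw = P~(1 - P~) - f~(w) on [0, rT]
Pt_end, catch_t = euler(lambda w, Q: (Q*(1 - Q) - f_tilde(w), f_tilde(w)),
                        P0/k, r*T)
print(catch, k * catch_t)    # total catches: equal
print(P_T, k * Pt_end)       # final populations: equal
```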
We can think of the problem as a maximization over $P$: indeed, $f$ can be recovered from $P$ via the equation, and vice versa.
We need to maximize $$\int_0^1 f_P(t)\, dt = \int_0^1 \left(P(t)-P^2(t)\right) dt - P(1)+P_0$$ over $P\in S=\{P \in C^1[0,1]: P(0)=P_0\}$. We write $h(t)=P(t)+\delta g(t)$ with $$ g\in S_0:=\{ g \in C^1[0,1] : g(0)=0\} $$ and first check whether there are critical points. We have $$I[\delta]:=\int_0^1 \left( \bigl(P(t)+\delta g(t)\bigr)-\bigl(P(t)+ \delta g(t)\bigr)^2 \right) dt +\bigl(P(0)+\delta g(0)\bigr)- \bigl(P(1)+\delta g(1)\bigr)$$ and, since $g(0)=0$, we compute $$0\overset{\text{E.L.}}{=}I'[0]=\int_0^1 \bigl(g(t)-2P(t) g(t)\bigr)\,dt-g(1) =\int_0^1 g(t)\bigl(1-2P(t)\bigr)\,dt-g(1).$$
This can be written in the distributional sense as $\int_0^1 g(t) \left[1-2P(t)-\delta_1(t)\right]dt=0$ (here $\delta_1$ is the Dirac delta centred at $1$), and the Fundamental Lemma of the Calculus of Variations gives $1-2P(t)-\delta_1(t)=0$, that is, $P(t)=\frac{1+\delta_1(t)}{2}$. Informally: $P$ equals $1/2$ on all of $(0,1)$, together with a Dirac mass at $t=1$ (harvest everything remaining at the final instant), which no function with $P(0)=P_0$ can realize in $C^1$. This proves that there are no $C^1$ maximizers.
If we impose both boundary values, $P(0)=P_0=1/2$ and $P(1)=0$, we restrict the admissible variations by requiring $g(1)=0$; the same computation then forces $P\equiv 1/2$ on $(0,1)$, which is incompatible with $P(1)=0$, so we still get non-existence.
Only in the case $P_0=1/2$ do we get a critical point, namely the stationary solution $P(t)\equiv 1/2$, with $f(t)\equiv 1/4$ and total catch $\int_0^1 f(t) \,dt = 1/4$. Computing the second variation, $I''[0]=-2\int_0^1 g^2(t)\,dt<0$, shows this is indeed a maximizer (among strategies with the endpoint fixed at $P(1)=1/2$, i.e. $g(1)=0$; note also that $P-P^2\le 1/4$ pointwise).
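The stationary strategy is easy to verify numerically (a sketch in the rescaled variables $P'=P(1-P)-f$ on $[0,1]$): with $P_0=1/2$, the constant control $f\equiv 1/4$ holds the population exactly at $1/2$ and harvests $1/4$ in total.

```python
# Verify the stationary solution P ≡ 1/2 under the constant control f ≡ 1/4
# for the rescaled equation P' = P(1 - P) - f on [0, 1].

def run(P0, f, n=100_000):
    dt = 1.0 / n
    P, harvest = P0, 0.0
    for i in range(n):
        rate = f(i * dt)
        harvest += rate * dt
        P += dt * (P * (1 - P) - rate)   # drift is exactly zero at P = 1/2
    return P, harvest

P_end, H = run(0.5, lambda t: 0.25)
print(P_end, H)   # 0.5 and ≈ 0.25: the population never moves
```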
We could approximate the stationary solution $P(t)\equiv 1/2$ (and hence the control $f(t)=1/4$) by taking $P_n(t)=\frac{1}{2}\chi_{[1/n,1-1/n]}(t)+g_n(t)$, where $g_n$ makes $P_n$ a $C^1$ continuation on $[0,1/n]$ and $[1-1/n,1]$. In this way $f(t)$ is very well approximated on most of the interval but badly near the boundary (since $|P_n'(t)|$ grows as $n \to \infty$).
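This approximating sequence can also be sketched numerically. Here I choose hypothetical cosine ramps for $g_n$ (with, for concreteness, $P_0=1/2$ on the left and a continuation down to $P_n(1)=0$ on the right); the induced control $f_n = P_n(1-P_n) - P_n'$ sits at $1/4$ in the interior but blows up near the endpoint, exactly as described:

```python
import math

def P_n(t, n, P0=0.5):
    """1/2 on [1/n, 1-1/n], with C^1 cosine ramps at both ends (assumed g_n)."""
    a = 1.0 / n
    if t <= a:                       # left ramp: blend P0 -> 1/2
        s = t / a
        return P0 + (0.5 - P0) * (1 - math.cos(math.pi * s)) / 2
    if t >= 1 - a:                   # right ramp: blend 1/2 -> 0
        s = (t - (1 - a)) / a
        return 0.5 * (1 + math.cos(math.pi * s)) / 2
    return 0.5

def f_n(t, n, h=1e-6):
    """Induced control f_n = P_n (1 - P_n) - P_n' (central difference for P_n')."""
    dP = (P_n(t + h, n) - P_n(t - h, n)) / (2 * h)
    P = P_n(t, n)
    return P * (1 - P) - dP

n = 100
print(f_n(0.5, n))    # = 0.25 in the interior
print(f_n(0.995, n))  # large near t = 1, where |P_n'| ~ n
```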