On the ground, I have a winch driven by a synchronous motor to which I can command a torque input $u$. The parameters of this winch, such as the friction $d$ and the moment of inertia $J$, are well understood and identified.
The motor drives a drum that spools a tether on or off. The tether is connected to a flying surf-kite. The dynamics of the kite are not well understood at all, but let's assume they change relatively slowly and smoothly.
I want to control the force in the tether that runs from the drum to the kite.
The problem is that a PID controller that takes the error $(F_{setpoint} - F_{measured})$ as input to compute $u$ sometimes lets the line go slack, suddenly making $F_{measured} = 0$. Of course, the proportional part of the PID controller then sees a large error and drives the winch hard to restore $F_{setpoint}$, resulting in a huge overshoot once the line snaps taut again.
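To make the failure mode concrete, here is a minimal simulation sketch of the setup. All numbers (inertia, stiffness, gains) and the plant model itself are hypothetical placeholders, not my identified parameters; the only point is the unilateral constraint $F = \max(0, k \cdot \text{stretch})$, which is zero whenever the line is slack.

```python
# Hypothetical 1-DOF sketch of the slack-line failure mode:
# a drum with a PID torque command, where the tether can only pull,
# never push: F = max(0, k * stretch).

import math

# --- hypothetical plant parameters (placeholders, not identified values) ---
J = 0.05     # drum moment of inertia [kg m^2]
d = 0.01     # viscous friction [N m s/rad]
r = 0.1      # drum radius [m]
k = 500.0    # tether stiffness [N/m]
dt = 1e-3    # integration step [s]

# --- illustrative PID gains ---
Kp, Ki, Kd = 2.0, 5.0, 0.01

def simulate(F_setpoint, T=2.0):
    """Simulate a force-setpoint step; return the peak measured force."""
    omega = 0.0        # drum angular velocity [rad/s], >0 = reeling in
    stretch = 0.05     # initial tether stretch [m] -> line starts taut
    integ = 0.0
    prev_err = None
    peak_force = 0.0
    for _ in range(int(T / dt)):
        # unilateral constraint: a slack line carries no force
        F = max(0.0, k * stretch)
        err = F_setpoint - F
        integ += err * dt
        deriv = 0.0 if prev_err is None else (err - prev_err) / dt
        prev_err = err
        u = Kp * err + Ki * integ + Kd * deriv   # commanded torque

        # drum dynamics; tether tension resists reeling in via the radius r
        omega += (u - d * omega - F * r) / J * dt
        # reeling in increases stretch (kite side held fixed in this sketch)
        stretch += r * omega * dt
        peak_force = max(peak_force, F)
    return peak_force
```

Stepping the setpoint down from the initial force makes the controller pay out line; if it overshoots into slack, $F$ clamps at zero, the error saturates at $+F_{setpoint}$, and the integrator keeps winding up until the line snaps taut, which is exactly the overshoot described above.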
Is there a body of theory that treats such "unilateral" control problems, i.e. where tracking an increasing $F_{setpoint}$ is much easier than tracking a decreasing one, because the process variable is in danger of suddenly dropping to zero?