Given an algorithmic problem, theoretical computer science has powerful tools to prove lower bounds on the number of required computation steps, based on the Strong Exponential Time Hypothesis (SETH). For instance, assuming SETH, we need at least $\Omega(n^2)$ computation steps to compute the Fréchet distance of two curves (each with $n$ points) in the plane; more precisely, no $O(n^{2-\varepsilon})$-time algorithm exists for any $\varepsilon > 0$.
Are there similar conjectures that can help me prove space lower bounds? For instance, a conjecture implying that we need at least $\Omega(\sqrt{n})$ space to compute the Fréchet distance if we are restricted to $O(n^2)$ time.
Lower bounds based on satisfiability (SAT) seem inappropriate, as SAT can be solved in linear space. Even quantified Boolean formula satisfiability (QBF) can be solved in polynomial space. So we might want to start from a problem that requires exponential space.
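To see the linear-space claim for SAT concretely, here is a minimal brute-force sketch (the CNF encoding as lists of signed integers is my own convention, not from the question): it enumerates all $2^n$ assignments but only ever stores the current one, so the working space is $O(n)$.

```python
from itertools import product

def sat_linear_space(clauses, n_vars):
    """Decide CNF satisfiability in O(n_vars) working space.

    clauses: list of clauses; a clause is a list of signed ints,
    where literal k means variable k is true and -k means false.
    Only the current assignment (n_vars booleans) is kept in
    memory, at the cost of 2**n_vars running time.
    """
    for assignment in product([False, True], repeat=n_vars):
        if all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False
```

For example, `sat_linear_space([[1, -2], [2]], 2)` is satisfiable (set both variables to true), while `sat_linear_space([[1], [-1]], 1)` is not.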
I am aware of so-called cell-probe lower bounds, but those only apply to data structure problems. I have also heard of lower bounds from communication complexity.
OK, here I have a partial answer. If we want to show space lower bounds, in particular sublinear ones, then so-called transducer reductions seem handy. The idea is as follows. For concreteness, say we have the hypothesis that Problem A requires at least $\sqrt{n}$ bits of working memory, and we want to conclude that Problem B requires at least the same amount. Furthermore, assume we have a transducer T that transforms an instance $I$ of Problem A into an instance $J$ of Problem B in linear time. Additionally, T requires only, say, $O(\log n)$ space, and T does not have to output the entire instance $J$, but only any single requested bit of it.
Then we can argue that any algorithm X for Problem B also requires at least $\sqrt{n}$ bits of memory: otherwise, we could combine Algorithm X with the transducer T to solve Problem A with $o(\sqrt{n})$ bits, contradicting the hypothesis. Whenever X wants to read a bit of its input $J$, we rerun T on $I$ to produce just that bit, so $J$ is never stored explicitly.
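The composition can be sketched in code. This is a toy illustration with made-up interfaces: `transducer_bit` plays the role of T restricted to emitting one bit of $J$, and Algorithm X sees $J$ only through an oracle, so the combined working memory is space(X) plus space(T) rather than $|J|$.

```python
def compose(transducer_bit, algorithm_x, instance_I, n_J):
    """Run algorithm_x on J = T(I) without ever materializing J:
    each bit of J is recomputed on demand by the transducer."""
    return algorithm_x(lambda j: transducer_bit(instance_I, j), n_J)

# Toy instantiation: T reverses the input bit string (each output
# bit is computable in O(log n) space), and X computes the parity
# of J by streaming its bits through the oracle.
def reverse_bit(I, j):
    return I[len(I) - 1 - j]

def parity(oracle, n):
    acc = 0
    for j in range(n):
        acc ^= oracle(j)  # reads J bit by bit, stores O(1) bits
    return acc
```

Here `compose(reverse_bit, parity, [1, 0, 1, 1], 4)` computes the parity of the reversed string without ever building it.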
Some work on transducers in the context of complexity classes can be found here: https://arxiv.org/abs/2201.13119
Note that NP-complete problems are not good candidates as a starting problem. For instance, we can solve SAT in linear space, and we cannot solve it with much less memory, as the running time is already at least exponential (assuming, e.g., ETH): whenever the running time of an algorithm is $t$, it needs at least $\Omega(\log t)$ bits of memory. Otherwise, the Turing machine would be in the same configuration twice, which cannot happen, because then it would loop forever, returning to that configuration again and again.
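The counting argument can be made explicit. A machine with state set $Q$, a read-only input of length $n$, and $s$ work-tape cells over alphabet $\Gamma$ has at most

$$|Q| \cdot n \cdot s \cdot |\Gamma|^{s} \;=\; 2^{O(s + \log n)}$$

distinct configurations (state, input-head position, work-head position, work-tape contents). A terminating computation never repeats a configuration, so $t \le 2^{O(s + \log n)}$, i.e., $s \ge \Omega(\log t) - O(\log n)$. In particular, running time $2^{\Omega(n)}$ forces space $\Omega(n)$.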
As a starting problem, Jesper Nederlof suggested the directed reachability problem: given a directed graph $G$ and two vertices $s,t$, is there a directed path from $s$ to $t$? If we restrict ourselves to polynomial-time algorithms, then the best-known space complexity is linear. (Savitch's theorem gives $O(\log^2 n)$ space, but requires quasipolynomial running time.)
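For reference, the recursion behind Savitch's theorem fits in a few lines; the adjacency-dict representation here is my own choice for illustration. The recursion depth is $O(\log n)$ and each frame stores $O(\log n)$ bits, giving $O(\log^2 n)$ space, but trying all midpoints at every level yields $n^{O(\log n)}$ time.

```python
def savitch_reach(adj, s, t, k):
    """Is there a path from s to t of length at most 2**k?
    adj: dict mapping each vertex to a set of out-neighbors."""
    if k == 0:
        return s == t or t in adj[s]
    # Guess a midpoint m and check both halves recursively; only
    # the current chain of (s, t, k) triples lives on the stack.
    return any(savitch_reach(adj, s, m, k - 1) and
               savitch_reach(adj, m, t, k - 1)
               for m in adj)

def reachable(adj, s, t):
    n = len(adj)
    k = max(1, (n - 1).bit_length())  # any path has length <= n-1 <= 2**k
    return savitch_reach(adj, s, t, k)
```

For instance, on `adj = {0: {1}, 1: {2}, 2: set(), 3: set()}`, vertex 2 is reachable from 0 but vertex 3 is not.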