A (possible) generic spectral property in one dimensional dynamics


This question was previously posted on MathOverflow.

Context and Definitions

Consider the interval $I=[0,1]$. We say that $T:I\to I$ satisfies Axiom A (I am following [MvS]) if:

  1. $T$ has a finite number of hyperbolic periodic attractors; and
  2. defining $B(T)$ as the union of the basins of the hyperbolic attractors of $T$, the set $\Lambda = I \setminus B(T)$ is a hyperbolic set, i.e. there exist $C >0$ and $\lambda>1$ such that $$ |(T^n)'(x)| > C \lambda^n\ \text{for every }n\ge 1\ \text{and every }x\in \Lambda.$$

It is well known that the set $\mathcal A = \{T: I\to I\in\mathcal C^1; \ T\ \text{satisfies Axiom A}\}$ is open and dense in the $\mathcal C^1$-topology (see [MvS, Chapter IV, Section 1]).


Defining $R = \overline{\{p\in\Lambda;\ p\ \text{is a periodic point of }T\}}$, we know from the dynamical decomposition [OV, Theorem 11.2.15] that there exists a (finite) partition of $R$ into non-empty compact sets $R^{1},R^2,\ldots, R^k$ such that:

  1. $R^{i}$ is a $T$-invariant set for every $i$;
  2. $T:R^i\to R^i$ is uniformly hyperbolic and topologically transitive.

Also, from the thermodynamic formalism for expanding maps, each of the transfer operators \begin{align*}\mathcal L_i : \mathcal C^0(R^i) &\to \mathcal C^0(R^i)\\ f&\mapsto \left(x\mapsto\sum_{T(y)=x}\frac{f(y)}{|T'(y)|}\right), \end{align*} satisfies $\mathrm{dim}\,\mathrm{ker}(\mathcal L_i - r(\mathcal L_i)\,\mathrm{Id}) = 1,$ where $r(\mathcal L_i)$ denotes the spectral radius of $\mathcal L_i$.
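As a sanity check on the definition, here is a minimal numerical sketch (my own illustration, not taken from [MvS] or [OV]) of such a transfer operator for the simplest uniformly expanding example, the doubling map $T(x)=2x \bmod 1$, where $|T'|\equiv 2$ and every point $x$ has exactly the two preimages $x/2$ and $(x+1)/2$:

```python
import numpy as np

# Transfer (Ruelle) operator for the doubling map T(x) = 2x mod 1.
# Since |T'| = 2 and the preimages of x are x/2 and (x+1)/2, we have
#     (L f)(x) = ( f(x/2) + f((x+1)/2) ) / 2.

def transfer(f, xs):
    """Apply L to a function f (given as a callable), sampled at points xs."""
    return (f(xs / 2) + f((xs + 1) / 2)) / 2

xs = np.linspace(0, 1, 201, endpoint=False)

# The constant function 1 is a fixed point of L, so r(L) = 1 and the
# leading eigenspace is one-dimensional, spanned by the constant density
# (Lebesgue measure is the invariant density of the doubling map).
ones = transfer(lambda x: np.ones_like(x), xs)
print(np.allclose(ones, 1.0))  # True
```

Of course this only illustrates the one-dimensionality of the leading eigenspace for a single transitive piece; the question below is about what happens across several pieces.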


My Question:

Let us now consider the operator \begin{align*}\mathcal L : \mathcal C^0(R) &\to \mathcal C^0(R)\\ f&\mapsto \left(x\mapsto\sum_{T(y)=x}\frac{f(y)}{|T'(y)|}\right). \end{align*} Since the pieces $R^i$ are pairwise disjoint, compact and $T$-invariant, we have $\mathcal L(f) = \sum_{i=1}^k \mathcal L_i (\mathbf 1_{R^i} f)$, and therefore $r(\mathcal L) = \max\{r(\mathcal L_1),\dots, r(\mathcal L_k)\}.$

It seems to me that (generically) we have $\mathrm{dim}\, \mathrm{ker}(\mathcal L - r(\mathcal L)\,\mathrm{Id}) = 1.$ Equivalently, the maximum of $\{r(\mathcal L_1), r(\mathcal L_2), \ldots, r(\mathcal L_k)\}\subset \mathbb R$ is (generically) attained by exactly one index $i$.
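Since each $\mathcal L_i$ acts on its own piece $R^i$, a finite-dimensional analogue of this situation is a block-diagonal non-negative matrix whose blocks each have a simple Perron eigenvalue. The following sketch (matrices chosen purely for illustration, not tied to any particular map) shows that the leading eigenvalue of the direct sum is simple exactly when a single block attains the maximal spectral radius:

```python
import numpy as np

# Finite-dimensional analogue: L acts block-diagonally on the pieces,
# so its spectrum is the union of the spectra of the blocks L_i.
# Two primitive non-negative blocks stand in for the transitive pieces;
# each has a simple Perron eigenvalue r(L_i).

A = np.array([[0.5, 0.5],
              [0.3, 0.7]])          # row-stochastic, so r(A) = 1
B = np.array([[0.2, 0.1],
              [0.1, 0.2]])          # symmetric, eigenvalues 0.3 and 0.1

L = np.block([[A, np.zeros((2, 2))],
              [np.zeros((2, 2)), B]])

eig = np.linalg.eigvals(L)
r = max(abs(eig))
# dim ker(L - r Id) = 1 precisely when the maximum of {r(A), r(B)} is
# attained by a single block; here r(A) = 1 > r(B) = 0.3, so the top
# eigenvalue of L is simple.
multiplicity = int(np.sum(np.isclose(abs(eig), r)))
print(r, multiplicity)
```

A tie $r(A) = r(B)$ would give multiplicity $2$, which is exactly the non-generic situation the question asks to perturb away.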

In this way, I would like to prove the following theorem:

Possible Theorem: The set $$\mathcal B = \{T:I\to I \in \mathcal C^1;\ T \text{ satisfies Axiom A} \ \text{and }\mathrm{dim}\,\mathrm{ker}(\mathcal L - r(\mathcal L)\,\mathrm{Id}) =1 \}$$ is open and dense (or at least residual) in the $\mathcal C^1$ topology.

I suspect that this result is known in the literature; however, I have been able neither to find it nor to prove it. Can anyone please help me?


[MvS] W. de Melo and S. van Strien, One-Dimensional Dynamics, Springer (https://link.springer.com/book/10.1007/978-3-642-78043-1)

[OV] M. Viana and K. Oliveira, Foundations of Ergodic Theory, Cambridge University Press (https://www.cambridge.org/core/books/foundations-of-ergodic-theory/F5AE11B50B16D9FF32909EA4BBA1E7CE)