Lattices in $\ell^\infty(X)$

Let $X$ be a set and let $A$ be a subspace of $\ell^\infty(X)$.

  1. Prove that for any $f,g \in \ell^\infty(X)$, $f+g = f \vee g + f \wedge g$ and $|f-g| = f \vee g - f \wedge g$.
  2. Show that $A$ is a sublattice of $\ell^\infty(X)$ if and only if $|f| \in A$ for all $f \in A$.

My attempt:

For the first question, I know that $\ell^\infty(X)$ is a Banach lattice, $(f \vee g)(x) = \max\{f(x), g(x)\}$, and $(f \wedge g)(x) = \min\{f(x), g(x)\}$.

So my idea is:

$f \vee g + f \wedge g = \max\{f, g\} + \min\{f, g\} = f + g$: at each $x$, one of $f(x), g(x)$ is the maximum and the other is the minimum, so their sum is $f(x) + g(x)$.

$|f-g| = \max\{f, g\} - \min\{f, g\} = f \vee g - f \wedge g$: at each $x$, the maximum minus the minimum is the nonnegative difference $|f(x) - g(x)|$.
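As a quick sanity check (not a proof), both pointwise identities can be verified numerically; the arrays below stand in for elements of $\ell^\infty(X)$ over a finite set $X$ of 100 points, with random bounded values:

```python
import numpy as np

rng = np.random.default_rng(0)
f = rng.uniform(-5, 5, size=100)  # a bounded function on a 100-point set X
g = rng.uniform(-5, 5, size=100)

f_or_g = np.maximum(f, g)   # (f ∨ g)(x) = max{f(x), g(x)}
f_and_g = np.minimum(f, g)  # (f ∧ g)(x) = min{f(x), g(x)}

assert np.allclose(f + g, f_or_g + f_and_g)          # f + g = f∨g + f∧g
assert np.allclose(np.abs(f - g), f_or_g - f_and_g)  # |f−g| = f∨g − f∧g
```

Of course this only checks finitely many points; the proof itself is the pointwise case analysis above.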

But I am not exactly sure how to approach the second question. Any feedback on my first solution is appreciated, as are any tips for the second.

Accepted answer:

If $A$ is a sublattice, then $|f| = f \vee 0 - f \wedge 0 \in A$, since $0 \in A$ ($A$ is a subspace) and $A$ is closed under $\vee$ and $\wedge$. Conversely, if $|f| \in A$ for all $f \in A$, then $f \vee g = \frac{f+g+|f-g|}{2} \in A$ and $f \wedge g = \frac{f+g-|f-g|}{2} \in A$. These two formulas are obtained from part 1 by, respectively, adding the two identities and subtracting one from the other, then dividing by $2$.
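The two formulas used in the converse direction can likewise be spot-checked numerically (a sketch on a finite set, not part of the argument):

```python
import numpy as np

rng = np.random.default_rng(1)
f = rng.uniform(-3, 3, size=50)
g = rng.uniform(-3, 3, size=50)

# f ∨ g and f ∧ g expressed using only vector-space operations and |·|,
# exactly as in the converse: these stay inside a subspace closed under |·|.
sup = (f + g + np.abs(f - g)) / 2
inf = (f + g - np.abs(f - g)) / 2

assert np.allclose(sup, np.maximum(f, g))  # (f+g+|f−g|)/2 = f ∨ g
assert np.allclose(inf, np.minimum(f, g))  # (f+g−|f−g|)/2 = f ∧ g
```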