Let $f \in L^2(\mathbb{R})$. Then $$f=g+h$$ where $g\in L^1(\mathbb{R})$ and $h\in L^{\infty}{(\mathbb{R})}$.
My approach:
Since $L^1(\mathbb{R}) \cap L^2(\mathbb{R})$ is dense in $L^2(\mathbb{R})$, there exists a function $g\in L^1(\mathbb{R})$ such that $$\|f-g\|_2 < M.$$ This would imply that $|f-g| < T$ a.e. for some $T$. Now $h$ can be chosen accordingly, with $\|h\|_{\infty} < T$.
I think I messed up in the boundedness step, but please correct me and complete my argument.
Hint: $f= f\cdot \chi_{\{|f|\le 1\}}+ f\cdot \chi_{\{|f|>1\}}.$
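To spell out why the hint works, one can verify each piece directly (a standard check, filling in the details of the splitting above):

```latex
Let $h = f\cdot \chi_{\{|f|\le 1\}}$ and $g = f\cdot \chi_{\{|f|>1\}}$,
so that $f = g + h$.

% h is essentially bounded: on the set $\{|f|\le 1\}$ we have $|h|\le 1$
% pointwise, and $h = 0$ off that set, so
\[
  \|h\|_{\infty} \le 1 < \infty .
\]

% g is integrable: on the set $\{|f|>1\}$ we have $|f| \le |f|^2$, so
\[
  \int_{\mathbb{R}} |g| \, dm
  = \int_{\{|f|>1\}} |f| \, dm
  \le \int_{\{|f|>1\}} |f|^2 \, dm
  \le \|f\|_2^2 < \infty .
\]
```

Note that this avoids the density argument entirely: the flaw in the attempt above is that $\|f-g\|_2 < M$ controls only the $L^2$ norm of $f-g$, not its pointwise size, so no a.e. bound $|f-g| < T$ follows.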