I want to show that the map
$$ \rho \colon L^2(\Omega,\mathcal{O})\longrightarrow L^2(\widetilde{\Omega},\mathcal{O}), \quad f \longmapsto f|_{\widetilde{\Omega}}$$
is an isomorphism. Here $L^2(\Omega,\mathcal{O})$ denotes the Hilbert space of square-integrable holomorphic functions on $\Omega$, where $\Omega \subset \mathbb{C}$ is a bounded domain, $E\subset \Omega$ is a finite set, and $\widetilde{\Omega}:=\Omega \setminus E$.
Since $E$ has Lebesgue measure $0$, the restriction $\rho \colon L^2(\Omega,\mathcal{O}) \to L^2(\widetilde{\Omega},\mathcal{O})$ is an isometry, in particular injective. It remains to show that $\rho$ is surjective, i.e. that every $f\in L^2(\widetilde{\Omega},\mathcal{O})$ extends holomorphically across $E$; equivalently, that every point $a\in E$ is a removable singularity of $f$.
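The isometry claim is immediate: since $\lambda(E) = 0$, removing $E$ from the domain of integration does not change the integral,
$$\lVert \rho(f)\rVert_{L^2(\widetilde{\Omega})}^2 = \int_{\widetilde{\Omega}} \lvert f\rvert^2\,d\lambda = \int_{\Omega} \lvert f\rvert^2\,d\lambda = \lVert f\rVert_{L^2(\Omega)}^2.$$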
Fix $a \in E$. Since $E$ is finite, there is an $R > 0$ such that $D_{2R}(a) \subset \Omega$ and $\lvert a-b\rvert > 2R$ for all $b \in E\setminus\{a\}$. To simplify the notation, we may assume that $a = 0$.
For every $0 < r < R$, consider the annulus $A_r = \{ z : r < \lvert z\rvert < R\}$ and the restriction map $\rho_r \colon L^2(\widetilde{\Omega},\mathcal{O}) \to L^2(A_r,\mathcal{O})$. As a restriction, $\rho_r$ is norm-decreasing, i.e.
$$\lVert \rho_r(f)\rVert_{L^2(A_r)} \leqslant \lVert f\rVert_{L^2(\widetilde{\Omega})}$$
for all $f\in L^2(\widetilde{\Omega},\mathcal{O})$. On the annulus $A_r \subset \widetilde{\Omega}$, every $f\in L^2(\widetilde{\Omega},\mathcal{O}) \subset \mathcal{O}(\widetilde{\Omega})$ has a Laurent expansion
$$f(z) = \sum_{n=-\infty}^\infty c_n z^n.$$
By the choice of $R$, the Laurent series converges in a punctured disk $\{ 0 < \lvert z\rvert < R+\varepsilon\}$ for some $\varepsilon > 0$, hence it converges uniformly on the compact closure $\overline{A_r}$. (Locally uniform convergence would suffice, but uniform convergence simplifies a few steps.)
Next, one checks that the monomials $(z^k)_{k\in\mathbb{Z}}$ are mutually orthogonal in $L^2(A_r,\mathcal{O})$.
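A quick way to verify this is to integrate in polar coordinates $z = se^{i\theta}$, $d\lambda = s\,ds\,d\theta$:
$$\langle z^k, z^l\rangle_{L^2(A_r)} = \int_{A_r} z^k\,\overline{z}^{\,l}\,d\lambda = \int_r^R \int_0^{2\pi} s^{k+l}\,e^{i(k-l)\theta}\, s\,d\theta\,ds = 2\pi\,\delta_{kl}\int_r^R s^{2k+1}\,ds,$$
since the angular integral $\int_0^{2\pi} e^{i(k-l)\theta}\,d\theta$ vanishes whenever $k \neq l$.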
Then, using orthogonality and uniform convergence to interchange sum and integral, we see that
$$\lVert \rho_r(f)\rVert_{L^2(A_r)}^2 = \int_{A_r} \lvert f(z)\rvert^2 \,d\lambda = \sum_{n=-\infty}^\infty \lvert c_n\rvert^2 \int_{A_r} \lvert z\rvert^{2n}\,d\lambda \leqslant \lVert f\rVert_{L^2(\widetilde{\Omega})}^2.$$
Letting $r \to 0$, we have $\int_{A_r} \lvert z\rvert^{2n}\,d\lambda \to \infty$ for every $n < 0$, so the uniform bound forces $c_n = 0$ for all $n < 0$. The Laurent series of $f$ thus has no principal part, which means that the singularity of $f$ at $0$ is removable.
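Indeed, the radial integral can be computed explicitly:
$$\int_{A_r} \lvert z\rvert^{2n}\,d\lambda = 2\pi \int_r^R s^{2n+1}\,ds = \begin{cases} \dfrac{\pi\left(R^{2n+2}-r^{2n+2}\right)}{n+1}, & n \neq -1,\\[6pt] 2\pi \log(R/r), & n = -1,\end{cases}$$
and this diverges as $r \to 0$ precisely when $n \leqslant -1$ (for $n < -1$ the term $r^{2n+2}$ blows up, and for $n = -1$ the logarithm does).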
That holds for every $a \in E$, so $f$ extends to a holomorphic function $F$ on $\Omega$ with $F|_{\widetilde{\Omega}} = f$. Since $\lambda(E) = 0$, we have $\lVert F\rVert_{L^2(\Omega)} = \lVert f\rVert_{L^2(\widetilde{\Omega})} < \infty$, so $F \in L^2(\Omega,\mathcal{O})$ and $\rho(F) = f$. Hence $\rho$ is surjective.