We consider the problem of representing a time function, or signal, $x(t)$ on a $T$-second interval $(t_0, t_0+T)$ as an expansion. Thus we consider a set of time functions $\phi_1(t), \phi_2(t), \ldots, \phi_N(t)$, which are specified independently of $x(t)$, and seek a series expansion of the form $$x_a(t)=\sum_{n=1}^N X_n \phi_n(t), \quad t_0 \le t \le t_0+T\tag{1}$$ in which the $N$ coefficients $X_n$ are independent of time and the subscript $a$ indicates that (1) is regarded as an approximation. We assume that the $\phi_n(t)$ in (1) are linearly independent and orthonormal. The error in approximating $x(t)$ by the series in (1) is measured in the integral-squared sense: $$\mbox{Error}=\epsilon_N=\int_T |x(t)-x_a(t)|^2\, dt \tag{2}$$ where $\int_T(\cdot)\,dt$ denotes integration over $t$ from $t_0$ to $t_0+T$.
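As a concrete numerical illustration of (1) and (2) — using an example signal and basis of my own choosing, not anything specified above — take $x(t)=t$ on $(0,1)$ and the orthonormal set $\phi_n(t)=\sqrt{2}\sin(n\pi t)$, with the standard ISE-minimizing coefficients $X_n=\int_T x(t)\phi_n(t)\,dt$. A short Python sketch shows $\epsilon_N$ shrinking as $N$ grows:

```python
import math

def inner(f, g, a=0.0, b=1.0, m=20000):
    """Midpoint-rule approximation of integral_a^b f(t) g(t) dt."""
    h = (b - a) / m
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h) for k in range(m)) * h

# An illustrative orthonormal set on (0, 1): phi_n(t) = sqrt(2) sin(n pi t)
def phi(n):
    return lambda t: math.sqrt(2.0) * math.sin(n * math.pi * t)

x = lambda t: t  # example signal on (0, 1); here t0 = 0 and T = 1

def ise(N):
    """epsilon_N from (2), with the ISE-minimizing choice X_n = <x, phi_n>."""
    X = [inner(x, phi(n)) for n in range(1, N + 1)]
    xa = lambda t: sum(Xn * phi(n)(t) for n, Xn in zip(range(1, N + 1), X))
    err = lambda t: x(t) - xa(t)
    return inner(err, err)

# epsilon_N decreases as more orthonormal terms are kept
print([round(ise(N), 4) for N in (1, 3, 10)])
```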
My question is: How to show that the integral-squared error (ISE) in (2) is an applicable measure of error only when $x(t)$ is an energy signal or a power signal?
Note: For an arbitrary signal $x(t)$, which may, in general, be complex, we define total (normalized) energy as $$E\triangleq \int_{-\infty}^\infty |x(t)|^2 dt \tag{3}$$ and (normalized) power as $$P\triangleq \lim_{T\rightarrow\infty}\frac{1}{2T}\int_{-T}^T |x(t)|^2 dt. \tag{4}$$
We say $x(t)$ is an energy signal if and only if $0<E<\infty$, so that $P=0$.
We classify $x(t)$ as a power signal if and only if $0<P<\infty$, so that $E=\infty$.
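To make definitions (3) and (4) concrete, here is a small numerical sketch with two example signals of my own choosing: $x(t)=e^{-|t|}$ has $E=1$ and $P=0$ (an energy signal), while $x(t)=1$ has $E=\infty$ and $P=1$ (a power signal):

```python
import math

def window_energy(x, T, m=20000):
    """Midpoint-rule approximation of integral_{-T}^{T} |x(t)|^2 dt."""
    h = 2.0 * T / m
    return sum(abs(x(-T + (k + 0.5) * h)) ** 2 for k in range(m)) * h

def window_power(x, T, m=20000):
    """The quantity inside the limit in (4), for a finite T."""
    return window_energy(x, T, m) / (2.0 * T)

decaying = lambda t: math.exp(-abs(t))  # energy signal: E = 1, hence P = 0
constant = lambda t: 1.0                # power signal:  P = 1, hence E infinite

# As T grows: energy of `decaying` converges, its power tends to 0;
# power of `constant` stays 1 while its windowed energy grows without bound.
for T in (10.0, 100.0):
    print(T, window_energy(decaying, T), window_power(decaying, T),
          window_energy(constant, T), window_power(constant, T))
```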
This error definition is, of course, applicable in many situations, not just in problems dealing with “energy” or “power.” However, your clarification at the end indicates that by “energy signal” or “power signal” you simply mean that the square integral over the finite interval in question is finite. In that sense, the following function is not covered: $x(t) = 1/t$ for $t \in (0,1)$. It fails because $\int_0^1 (1/t)^2\,dt = \infty$.
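The failure is easy to see numerically: the truncated integral $\int_\epsilon^1 (1/t)^2\,dt = 1/\epsilon - 1$ grows without bound as $\epsilon \to 0$. A quick midpoint-rule sketch:

```python
def truncated(eps, m=100000):
    """Midpoint-rule approximation of integral_eps^1 (1/t)^2 dt = 1/eps - 1."""
    h = (1.0 - eps) / m
    return sum((eps + (k + 0.5) * h) ** -2 for k in range(m)) * h

# The truncated integrals blow up as the lower limit approaches 0
for eps in (1e-1, 1e-2, 1e-3):
    print(eps, truncated(eps))  # roughly 1/eps - 1
```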
One way to answer your question is as follows: if $x(t)$ does not have a finite square integral over the $T$-second interval in question (that is, if $\int_T x(t)^2\,dt = \infty$), then under this definition of error, $\mbox{Error} = \infty$ regardless of the coefficients $X_n$ (assuming these coefficients are finite real numbers).
To prove this, you assume:
(i) $X_n$ coefficients are real numbers (in particular, they are finite).
(ii) $\int_T \phi_n(t)\phi_m(t)dt = 0$ for all $n \neq m$.
(iii) $\int_T \phi_n(t)^2 dt = 1$ for all $n \in \{1, \ldots, N\}$.
(iv) $\int_T x(t)^2 dt = \infty$
Using the above four assumptions, you can prove that $\mbox{Error} = \infty$. As a hint, first prove the following pointwise fact:
$$ (x(t)-x_a(t))^2 \geq \frac{x(t)^2}{4} - x_a(t)^2 $$
Then integrate this fact over the interval. By (ii) and (iii), orthonormality gives $\int_T x_a(t)^2\,dt = \sum_{n=1}^N X_n^2$, which is finite by (i), while $\int_T x(t)^2\,dt = \infty$ by (iv); hence $\mbox{Error} \geq \frac{1}{4}\int_T x(t)^2\,dt - \sum_{n=1}^N X_n^2 = \infty$.
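As a sanity check of the hint (my own sketch, not part of the original argument): the pointwise inequality can be spot-checked numerically, and orthonormality indeed makes $\int_T x_a(t)^2\,dt = \sum_n X_n^2$ finite — the ingredient that forces $\mbox{Error}=\infty$ whenever $\int_T x(t)^2\,dt=\infty$. The basis $\phi_n(t)=\sqrt{2}\sin(n\pi t)$ on $(0,1)$ and the coefficients below are illustrative choices:

```python
import math
import random

random.seed(0)
# Spot-check the pointwise fact (a - b)^2 >= a^2/4 - b^2.  It holds because
# the difference 3a^2/4 - 2ab + 2b^2 has discriminant -2b^2 <= 0 in a.
for _ in range(10000):
    a = random.uniform(-100.0, 100.0)
    b = random.uniform(-100.0, 100.0)
    assert (a - b) ** 2 >= a * a / 4.0 - b * b - 1e-9

# With orthonormal phi_n, integral_T x_a(t)^2 dt = sum_n X_n^2 (finite), so
# integrating the fact gives Error >= (1/4) integral_T x(t)^2 dt - sum_n X_n^2.
phi = lambda n: (lambda t: math.sqrt(2.0) * math.sin(n * math.pi * t))  # orthonormal on (0, 1)
X = [0.7, -1.3, 2.0]  # arbitrary finite coefficients
xa = lambda t: sum(Xn * phi(n)(t) for n, Xn in zip((1, 2, 3), X))
m = 20000
h = 1.0 / m
energy_xa = sum(xa((k + 0.5) * h) ** 2 for k in range(m)) * h  # midpoint rule
print(energy_xa, sum(Xn ** 2 for Xn in X))  # these two agree
```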