OFDM IDFT implementation


I have some doubts about the implementation of the IDFT in OFDM systems. The question concerns the expression of the IDFT of the OFDM signal. During the symbol period $T_s$ we have the following base-band OFDM signal: $$x(t)=\sum_{m=0}^{N-1}X_m e^{j2 \pi \frac{m}{T_s} t},\ \ \ 0\leq t \leq T_s$$ If we sample the signal at time instants $t = k\,T_s/N$, we have: $$x_k=x \left(k \frac{T_s}{N}\right) =\sum_{m=0}^{N-1}X_m e^{j\frac{2 \pi }{N}m\,k}\ \ \ \ \text{with }k=0,1,\dotso,N-1$$ Now, except for a multiplying constant ($1/N$), the above formula is the equation of an N-point inverse discrete Fourier transform (IDFT).
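To make the relation concrete, here is a minimal NumPy sketch (the symbols `X` are just random placeholder values, not from any particular constellation): it evaluates the sampled sum above directly and compares it against `numpy.fft.ifft`, whose convention includes the $1/N$ factor, so the direct sum equals $N$ times the library's IDFT.

```python
import numpy as np

# Hypothetical frequency-domain symbols X_m (random, just for illustration)
N = 8
rng = np.random.default_rng(0)
X = rng.standard_normal(N) + 1j * rng.standard_normal(N)

# Direct evaluation of x_k = sum_m X_m * exp(j*2*pi*m*k/N)
k = np.arange(N)
m = np.arange(N)
x_direct = (X[None, :] * np.exp(2j * np.pi * np.outer(k, m) / N)).sum(axis=1)

# NumPy's ifft applies the 1/N factor by convention,
# so the un-normalized sum above equals N * ifft(X)
x_ifft = N * np.fft.ifft(X)

print(np.allclose(x_direct, x_ifft))  # True
```

This only demonstrates that the two expressions differ by the constant $N$; whether the factor is absorbed at the transmitter, the receiver, or split symmetrically ($1/\sqrt{N}$ on each side) is a convention choice.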

In the OFDM transmitter and receiver we implement, respectively, the IDFT and the DFT to convert the frequency-domain symbols $\{X_0,X_1,\dotso,X_{N-1}\}$ into the time-domain sample sequence $\{x_0,x_1,\dotso,x_{N-1}\}$ and vice versa.

What I was wondering is whether the $1/N$ factor, which is absent from the IDFT expression I have reported above, is omitted only as a convention for normalizing the power of the transmitted signal. Furthermore, is the absence of this factor an error?