Improper integral of a cosine


I'm trying to follow some equations in an electrical engineering paper that I'm reading. I'll spare you the details, but at one point I come across:

$$\lim_{ T \rightarrow \infty }\int_{-T/2}^{T/2} \cos (\omega_r(t+\tau)) dt$$

For the reasoning in the paper to work, this integral should equal $T$. I can't prove this mathematically, nor find any intuitive reasoning for it. Intuitively I would have said the answer is $0$...

I guess it could also be rewritten as two integrals:

$$\lim_{ T \rightarrow \infty }\int_{-T/2}^{-\tau} \cos (\omega_r(t+\tau)) dt + \lim_{ T \rightarrow \infty }\int_{-\tau}^{T/2} \cos (\omega_r(t+\tau)) dt$$

but it didn't get me anywhere.

I know that $\displaystyle\lim_{ T \rightarrow \infty }\int_{0}^{T} \cos(x)\, dx$ is undefined, since $\sin(T)$ oscillates without ever converging, but I would expect the symmetry of $\displaystyle\lim_{ T \rightarrow \infty }\int_{-T}^{T} \cos(x)\, dx$ to make the integral equal to $0$.

I'd appreciate if anyone could point me in the right direction.


Edit: providing some context

It's one of the terms in a signal correlation. The full problem can be stated as follows:

Let

$$s(t) = a\cos (\omega_rt-\phi)+b$$ $$g(t) = \cos (\omega_rt)$$

$\omega_r$ is a constant (the pulsation, i.e. angular frequency), and $s(t)$ is a phase-delayed version of $g(t)$ with a change of amplitude and a DC offset $b$.

They define the correlation of the signals as:

$$h(\tau)=(s\otimes g)(\tau)=\frac{1}{T}\lim_{T\rightarrow \infty }\int_{-T/2}^{T/2}s(t)\cdot g\left (t+\tau\right ) dt$$

And they state that the result of this integral is

$$h(\tau)=\frac{a}{2}\cos(\omega_r\tau+\phi)+b$$

without providing further details.

I naively did:

$$h(\tau)= \frac{1}{T}\lim_{T\rightarrow \infty }\int_{-T/2}^{T/2}\left [a\cos (\omega_rt-\phi)+b \right ]\cdot \cos (\omega_r(t+\tau)) dt$$

$$= \underbrace{\frac{a}{T}\lim_{T\rightarrow \infty }\int_{-T/2}^{T/2}\left [\cos (\omega_rt-\phi)\cos (\omega_r(t+\tau)) \right ]dt}_{\text{A}} + \underbrace{\frac{b}{T}\lim_{T\rightarrow \infty }\int_{-T/2}^{T/2} \cos (\omega_r(t+\tau)) dt}_{\text{B}}$$

And trying to figure out integral B, which must equal $T$ if that term is to equal $\frac{bT}{T}=b$.
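A quick numerical sanity check (a Python sketch; the values chosen for $\omega_r$, $\tau$ and $b$ are arbitrary) suggests that term B does not stay near $b$ at all, but shrinks as $T$ grows:

```python
import numpy as np
from scipy.integrate import quad

omega_r, tau, b = 3.0, 0.7, 1.5   # arbitrary sample values

# Term B of the decomposition: (b/T) * integral of cos(omega_r*(t+tau))
# over [-T/2, T/2], evaluated for increasing T.
for T in (10.0, 50.0, 200.0):
    integral, _ = quad(lambda t: np.cos(omega_r * (t + tau)), -T / 2, T / 2, limit=500)
    print(f"T={T:6.1f}  B = (b/T)*integral = {b / T * integral:+.6f}")
```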

Maybe I'm doing something obviously wrong :)

There are 2 answers below.

BEST ANSWER

Since $\cos$ is even, $\int_{-T}^{T} \cos(x)\; dx = 2 \int_{0}^{T} \cos(x)\; dx$, not $0$. But surely you know $\int_0^T \cos(x)\; dx = \sin(T)$?

More generally, $$\int_{-T/2}^{T/2} \cos(\omega (t + \tau))\; dt = 2\,{\frac {\sin \left( \omega\,T/2 \right) \cos \left( \omega\,\tau \right) }{\omega}} $$

This has no limit as $T \to \infty$ unless $\cos(\omega \tau)$ happens to be $0$. Either the paper is wrong, or you're misreading it.
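A numerical check of this closed form (a Python sketch; the values of $\omega$, $\tau$ and the sample $T$ values are arbitrary) also shows the integral oscillating rather than converging as $T$ grows:

```python
import numpy as np
from scipy.integrate import quad

omega, tau = 3.0, 0.7   # arbitrary sample values

# Compare the numerical integral of cos(omega*(t+tau)) over [-T/2, T/2]
# with the closed form 2*sin(omega*T/2)*cos(omega*tau)/omega.
for T in (10.0, 50.0, 200.0):
    numeric, _ = quad(lambda t: np.cos(omega * (t + tau)), -T / 2, T / 2, limit=500)
    closed = 2 * np.sin(omega * T / 2) * np.cos(omega * tau) / omega
    print(f"T={T:6.1f}  numeric={numeric:+.6f}  closed-form={closed:+.6f}")
```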

ANSWER

It turns out to be a notation issue in the paper: a domain-specific assumption is that the time over which the signals are correlated, $T$, is always much larger than the period of the signals.

Therefore it's implied that the $1/T$ factor is inside the limit:

$$\lim_{ T \rightarrow \infty } \frac{1}{T}\int_{-T/2}^{T/2}...$$

The B term in the original question is thus $0$.
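With the $1/T$ inside the limit, the whole correlation can be checked numerically (a Python sketch; the values of $\omega_r$, $\tau$, $a$, $b$, $\phi$ are arbitrary). As $T$ grows, $\frac{1}{T}\int_{-T/2}^{T/2}s(t)\,g(t+\tau)\,dt$ approaches $\frac{a}{2}\cos(\omega_r\tau+\phi)$, with the B term vanishing:

```python
import numpy as np
from scipy.integrate import quad

omega_r, tau, a, b, phi = 3.0, 0.7, 2.0, 1.5, 0.4   # arbitrary sample values

def h_T(T):
    # (1/T) * integral of s(t)*g(t+tau) over [-T/2, T/2]
    integrand = lambda t: (a * np.cos(omega_r * t - phi) + b) * np.cos(omega_r * (t + tau))
    val, _ = quad(integrand, -T / 2, T / 2, limit=1000)
    return val / T

prediction = a / 2 * np.cos(omega_r * tau + phi)
for T in (10.0, 100.0, 400.0):
    print(f"T={T:6.1f}  h_T(tau)={h_T(T):+.6f}  (a/2)cos(w*tau+phi)={prediction:+.6f}")
```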

As for the $b$, it's assumed to be a different DC offset from the one in the original signal $s(t)$, but the paper uses the same variable for both.

I'm accepting Robert's answer as the original question is simply ill-posed. Thanks