I was studying physics for an upcoming exam and found this problem in a practice sheet:
A source S oscillates between A and C, its position varying with time as $y\ (\text{in metres})=900\cos\left(\frac{\pi}{6}t\right)$. An observer is situated at a distance of $12$ m along the perpendicular bisector of AC. The frequency emitted by the source is $300$ Hz. The approximate frequency observed when the source is at O is ____________. (Speed of sound in air $= 300$ m/s)
$\text{A. } 320\ \text{Hz} \qquad \text{B. } 345\ \text{Hz} \qquad \text{C. } 300\ \text{Hz} \qquad \text{D. } 350\ \text{Hz}$
By the time S reaches O, the wave that S released at some earlier point, say B, between C and O, reaches the observer. I want to find the distance $OB = x$. So I equate the time taken by the sound to reach the observer with the time taken by S to travel from B to O, which is $\frac{\arcsin\left(x/900\right)}{2\pi}\,T$ with period $T=\frac{2\pi}{\pi/6}=12$ s:$$\frac{\sqrt{x^2+12^2}}{300}=\frac{\arcsin\left(\dfrac{x}{900}\right)}{2\pi}\,T=\frac{\arcsin\left(\dfrac{x}{900}\right)}{2\pi}\cdot\frac{2\pi}{\pi/6}=\frac{6}{\pi}\arcsin\left(\frac{x}{900}\right)$$
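Before trying to solve this analytically, I wrote a quick numerical check of my equation (just a Python sketch of the setup above, nothing more): it scans the difference between the two sides over $0 \le x \le 900$ to see whether they ever cross.

```python
import math

def time_gap(x):
    """Sound travel time from B to the observer, minus S's travel time from B to O."""
    sound_time = math.sqrt(x**2 + 12**2) / 300.0          # hypotenuse / speed of sound
    source_time = (6.0 / math.pi) * math.asin(x / 900.0)  # (T / 2*pi) * arcsin(x/900), T = 12 s
    return sound_time - source_time

# Scan x from 0 to 900 m (inclusive) and look at the smallest gap.
gaps = [time_gap(900.0 * i / 1000) for i in range(1001)]
print(min(gaps))
```

When I run this, the gap stays positive over the whole range, dipping to nearly zero only at $x = 900$ (i.e. when B coincides with the extreme C), so the two sides never actually cross for $0 < x < 900$. That is part of why I wonder whether my setup is right.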
If I can find $x$, I'll have no trouble solving the rest of the question.
How do I solve this equation for $x$? Or have I set it up incorrectly?
