Converting a wave equation from three to two dimensions.


[The problem statement was attached as an image.]

I was recently set the following problem on a take-home paper, and I have one small problem with my result.

Using the Leibniz integral rule, I found:

$$u_t = v(x,t;t) + \int_0^t v_t(x,t;s)\,ds, \qquad u_x = \int_0^t v_x(x,t;s)\,ds,$$

and then, differentiating again,

$$u_{tt} = 2v_t(x,t;t) + \int_0^t v_{tt}(x,t;s)\,ds, \qquad u_{xx} = \int_0^t v_{xx}(x,t;s)\,ds.$$

This is of course incorrect: $u$ satisfies the correct boundary conditions, but it gives $u_{tt} - u_{xx} = 2f(x,t)$ rather than $f(x,t)$. Where is my reasoning wrong? Each step looks correct to me, yet I don't see how I could get rid of one of the $v_t$ terms in my expression for $u_{tt}$.
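Since the problem statement itself is only available as an image, here is a symbolic sanity check under the assumption that this is the usual Duhamel construction: $v(x,t;s)$ solves $v_{tt} = v_{xx}$ with $v(x,s;s) = 0$, $v_t(x,s;s) = f(x,s)$, and $u(x,t) = \int_0^t v(x,t;s)\,ds$. Taking the concrete source $f(x,t) = \sin x$ (my choice, not from the paper), d'Alembert's formula gives $v(x,t;s) = \sin x \,\sin(t-s)$, and the check confirms $u_{tt} - u_{xx} = f$ with coefficient $1$, not $2$:

```python
import sympy as sp

x, t, s = sp.symbols('x t s')

# Assumed concrete source term f(x, t) = sin(x) (independent of t here).
f = sp.sin(x)

# d'Alembert solution of v_tt = v_xx with v(x, s; s) = 0, v_t(x, s; s) = f(x, s):
# v(x, t; s) = sin(x) * sin(t - s). It vanishes at t = s, and dv/dt at t = s is sin(x) = f.
v = sp.sin(x) * sp.sin(t - s)

# Duhamel: u(x, t) = integral of v(x, t; s) over s from 0 to t.
u = sp.integrate(v, (s, 0, t))

# Residual of the inhomogeneous wave equation: u_tt - u_xx - f.
residual = sp.simplify(sp.diff(u, t, 2) - sp.diff(u, x, 2) - f)
print(residual)  # prints 0, i.e. u_tt - u_xx = f exactly (no factor of 2)
```

So, at least for this example, the construction really does produce $u_{tt} - u_{xx} = f(x,t)$, which suggests the factor of $2$ comes from the differentiation step rather than from the set-up.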