I am trying to calculate the Jacobian determinant of the discrete Fourier transform, which I stumbled upon when studying the path integral in quantum field theory. I know the answer should be $1$ but I don't know how to show it. The transform is \begin{equation} x_n = \sum_k \frac{1}{\sqrt{N}} e^{-i 2\pi k n/N}\, \tilde{x}_k. \end{equation} I know that the Jacobian matrix is given by \begin{equation} J_{n,k} = \frac{d x_n}{d\tilde{x}_k} = \frac{1}{\sqrt{N}} e^{-i 2\pi k n/N}, \end{equation} and the determinant (with summation over the repeated indices $k_1,\dots,k_N$) is then \begin{equation} \det J = \frac{1}{N^{N/2}}\, \epsilon_{k_1 k_2 \cdots k_N}\, e^{-i 2\pi k_1 \cdot 1/N}\, e^{-i 2\pi k_2 \cdot 2/N} \cdots e^{-i 2\pi k_N \cdot N/N}, \end{equation} but I'm not sure how to show this is equal to one. I found a link online which says that to calculate the determinant you should perform the Fourier transform twice, but I wasn't able to figure out the steps.
Jacobian of Fourier Transformation
3.8k Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
Let $Q$ be an orthonormal matrix. By definition, orthonormal matrices are the matrices which satisfy the following conditions:
($i$) $Q^{-1}=Q^T$. From here it follows that $QQ^T=QQ^{-1}=I$, where $I$ is the identity matrix, a square matrix with $1$s on the diagonal and $0$s elsewhere.
($ii$) All rows and columns of an orthonormal matrix satisfy the inner-product rules $\langle q_i,q_j\rangle=0$ for $i\neq j$ and $\langle q_i,q_i\rangle=1$.
A matrix is called orthogonal (but not necessarily orthonormal) when $\langle q_i,q_j\rangle=0$ holds for $i\neq j$. This means the angle between any two distinct column vectors $q_i$ and $q_j$ of $Q$ is $90$ degrees. This follows simply from $$\cos(\theta)=\frac{\langle u_i,u_j\rangle}{\|u_i\|\,\|u_j\|},$$ where $\|u_i\|=\sqrt{\langle u_i,u_i\rangle}$ is the Euclidean ($L^2$) norm.
The absolute value of the determinant of every orthonormal matrix is always $1$. This can be proven in at least two different ways:
$1$st way: $$1=\det(I)=\det(QQ^T)=\det(Q)\det(Q^T)=(\det(Q))^2,$$ so $|\det(Q)|=1$. The third equality uses the multiplicativity of the determinant, $\det(AB)=\det(A)\det(B)$, and the fourth uses $\det(Q^T)=\det(Q)$.
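As a quick numerical sanity check (not part of the original argument), this identity can be verified with NumPy; here the orthonormal matrix is obtained from the QR decomposition of a random matrix:

```python
import numpy as np

# Build a random orthonormal matrix via QR decomposition:
# the Q factor returned by np.linalg.qr has orthonormal columns.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))

# Q Q^T = I, hence det(Q)^2 = det(Q Q^T) = det(I) = 1.
assert np.allclose(Q @ Q.T, np.eye(5))
print(np.linalg.det(Q) ** 2)  # ~1.0 up to floating-point error
```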
$2$ nd way:
($i$) For every orthonormal matrix, $Q$, all singular values $\sigma_i$ of this matrix are equal to $1$
($ii$) The absolute value of the determinant of any real matrix is given by $|\det(A)|=\prod_i \sigma_i$
From ($i$) and ($ii$) we conclude that $|\det(Q)|=\prod_i 1 \Longrightarrow |\det(Q)|=1$.
The proof of ($i$):
For every real matrix $Q$ we have the singular value decomposition $Q=U\Sigma V^T$, where $U$ and $V$ are orthonormal matrices. See here. For an orthonormal $Q$ one can select $U=Q$ and $V=I$, which gives $\Sigma=I$. Since $\Sigma$ is uniquely determined by the singular value decomposition, all singular values equal $1$ and the proof is complete. Note that $U$ and $V$ themselves are not unique.
The proof of ($ii$):
For this proof, one can see either this (Proposition C.3.7) or this question and its answers.
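Both facts, ($i$) and ($ii$), can also be checked numerically (a sketch with NumPy, not part of the original answer; `np.linalg.svd` with `compute_uv=False` returns just the singular values):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))   # a generic real matrix
Q, _ = np.linalg.qr(A)            # an orthonormal matrix from its QR factorization

# (i) every singular value of an orthonormal matrix equals 1
assert np.allclose(np.linalg.svd(Q, compute_uv=False), 1.0)

# (ii) for any real matrix, |det(A)| equals the product of its singular values
sigma = np.linalg.svd(A, compute_uv=False)
assert np.isclose(abs(np.linalg.det(A)), np.prod(sigma))
```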
Since $J_{n,k}=\frac{1}{\sqrt{N}}e^{-i2\pi kn/N}$ is complex, the relevant condition is the unitary one, $J^{-1}=J^\dagger$, with the conjugate transpose taking the place of $Q^T$ above. Its columns are indeed orthonormal: $$\langle J_{\cdot,k},J_{\cdot,k'}\rangle=\frac{1}{N}\sum_n e^{i2\pi(k-k')n/N}=\delta_{k,k'},$$ by the geometric sum of roots of unity. The same two arguments then go through with $Q^T$ replaced by $Q^\dagger$, and we conclude that $|\det J|=1$.
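The conclusion for the specific matrix from the question can be verified directly (a minimal NumPy sketch; the matrix `J` below is built from the question's definition with $N=8$ chosen for illustration):

```python
import numpy as np

N = 8
n = np.arange(N)
# Jacobian of the discrete Fourier transform from the question:
# J[n, k] = exp(-2*pi*i*k*n/N) / sqrt(N)
J = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)

# J is unitary: J J^dagger = I
assert np.allclose(J @ J.conj().T, np.eye(N))

# hence its determinant lies on the unit circle
print(abs(np.linalg.det(J)))  # ~1.0 up to floating-point error
```

Note that $\det J$ itself is in general a complex phase (a power of $i$ for the DFT matrix), so it is the modulus $|\det J|$ that equals $1$, which is exactly what matters for the path-integral measure.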