As far as I can tell, the following improper integral identity holds: $$ I(m,n) = \int_0^{\infty} \frac{1}{(1+x^n)^m}\, dx = \frac{\Gamma(1+1/n)\,\Gamma(m-1/n)}{\Gamma(m)}. $$ Here $m \geq 1$ and $n \geq 2$ are integers, and $\Gamma(x)$ is the well-known Gamma function.
The case $m = 1$ is shown in this link; some other cases can be found here.
My questions are: (1) How can we prove the above integral identity nicely? (2) Is there a way to evaluate the integral using the residue theorem?
First of all, using the Gaussian hypergeometric function, the antiderivative is $$I(m,n) = \int \frac{dx}{(1+x^n)^m} = x\, \, _2F_1\left(m,\frac{1}{n};1+\frac{1}{n};-x^n\right)$$ $$J(m,n) = \int_0^t \frac{dx}{(1+x^n)^m} =t \,\, _2F_1\left(m,\frac{1}{n};1+\frac{1}{n};-t^n\right)$$ $$K(m,n) = \int_0^\infty \frac{dx}{(1+x^n)^m} =\frac{\Gamma \left(1+\frac{1}{n}\right) \Gamma \left(m-\frac{1}{n}\right)}{\Gamma (m)}$$
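For question (1), one standard real-variable route (a sketch, not necessarily the intended one) reduces the integral to a Beta function via the substitution $t = x^n$, so $dx = \tfrac{1}{n}\, t^{1/n - 1}\, dt$:

$$K(m,n) = \int_0^\infty \frac{dx}{(1+x^n)^m} = \frac{1}{n}\int_0^\infty \frac{t^{1/n-1}}{(1+t)^m}\, dt = \frac{1}{n}\, B\!\left(\frac{1}{n},\, m-\frac{1}{n}\right) = \frac{\Gamma(1/n)\,\Gamma(m-1/n)}{n\,\Gamma(m)} = \frac{\Gamma\!\left(1+\frac{1}{n}\right)\Gamma\!\left(m-\frac{1}{n}\right)}{\Gamma(m)},$$

using the representation $\int_0^\infty t^{a-1}(1+t)^{-a-b}\, dt = B(a,b)$ with $a = 1/n$, $b = m - 1/n$ (valid here since $m - 1/n > 0$), together with $\Gamma(1/n)/n = \Gamma(1+1/n)$.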
If you want to use residues, think about Barnes integrals.
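As a quick numerical sanity check on the closed form (a stdlib-only sketch; the substitution $x = u/(1-u)$ and the step count are my choices, not part of the original answer):

```python
from math import gamma

def closed_form(m, n):
    # Candidate closed form: Gamma(1 + 1/n) * Gamma(m - 1/n) / Gamma(m)
    return gamma(1 + 1 / n) * gamma(m - 1 / n) / gamma(m)

def numeric(m, n, steps=200_000):
    # Midpoint rule after mapping x = u/(1-u), dx = du/(1-u)^2,
    # which turns [0, inf) into the finite interval [0, 1).
    h = 1.0 / steps
    total = 0.0
    for k in range(steps):
        u = (k + 0.5) * h
        x = u / (1.0 - u)
        total += 1.0 / (1.0 + x**n) ** m / (1.0 - u) ** 2
    return total * h

# For (m, n) = (1, 2) the integral is arctan evaluated at infinity, i.e. pi/2,
# and the two values should agree to high accuracy for other (m, n) as well.
print(closed_form(1, 2), numeric(1, 2))
print(closed_form(2, 3), numeric(2, 3))
```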