So apparently:
xΓ(x) = Γ(x+1)
But when I plug them both into Mac's Grapher, Γ(x+1) is defined at 0 (it equals 1), whereas xΓ(x) is not. Can I define 0 * Γ(0) to be 1 via analytic continuation?
The specific function that I am looking at is this one:
f(x) = Γ(x+2) / (x Γ(x))
which I want to evaluate to 1 for x = 0. As I understand it, applying the recurrence Γ(x+2) = (x+1) Γ(x+1) = (x+1) · x · Γ(x), the expression should cancel like this:
(x+1) · x · Γ(x) / (x · Γ(x)) = x + 1
Is it fair to say that xΓ(x) = 1 when x = 0 in this case?
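As a quick numerical sanity check (not a proof, just me poking at it with the standard library's math.gamma), x·Γ(x) and Γ(x+1) agree wherever both are defined, and both head toward 1 as x → 0:

```python
import math

# Compare x * Gamma(x) with Gamma(x+1) as x shrinks toward 0.
# The identity says they are equal; both should approach 1.
for x in [0.1, 0.01, 0.001, 1e-6]:
    print(x, x * math.gamma(x), math.gamma(x + 1))
```

At x = 0 exactly, math.gamma(0) raises an error (the pole), while math.gamma(1) returns 1 — which is exactly the behavior I'm seeing in Grapher.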
Thanks, you guys!