I cannot see what happens to $P_{ij}(s)=p_{ij}^0+\sum^\infty_{n=1}\sum^{n-1}_{k=0}s^{n-k}f_{ij}^{n-k}s^kp_{jj}^k$ to get the result


This is self-study, and it is stats.

$P_{ij}(s)=\sum^\infty_{n=0}p_{ij}^ns^n$ (which you'll probably recognise is a generating function)

and

$F_{ij}(s)=\sum^\infty_{n=1}f_{ij}^ns^n$ (note that the sum starts at $n=1$ here)

I wish to show (I know this result, but I'd like to do it as if I did not however)

$P_{ij}(s)=p_{ij}^0+F_{ij}(s)P_{jj}(s)$

$P_{ij}(s)=p_{ij}^0+\sum^\infty_{n=1}p_{ij}^ns^n$

Now using a previous result: $p_{ij}^n=\sum^n_{k=1}f_{ij}^kp_{jj}^{n-k}$
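This previous result can be sanity-checked numerically. Below is a minimal sketch, on a made-up 3-state transition matrix (not from the original problem), computing the first-passage probabilities $f_{ij}^n$ by a "taboo" recursion that kills any path the moment it enters state $j$:

```python
import numpy as np

# Hypothetical 3-state transition matrix, used only to check the
# identity p_ij^n = sum_{k=1}^n f_ij^k p_jj^{n-k}.
P = np.array([[0.2, 0.5, 0.3],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
i, j, N = 0, 1, 12

# p_ij^n and p_jj^n: read off the n-th matrix power.
p = [np.linalg.matrix_power(P, n)[i, j] for n in range(N + 1)]
pjj = [np.linalg.matrix_power(P, n)[j, j] for n in range(N + 1)]

# f_ij^n: first-passage probabilities via the taboo recursion.
Q = P.copy()
Q[:, j] = 0.0                        # forbid re-entering j early
f = [0.0]                            # f_ij^0 = 0 by convention
v = np.zeros(len(P)); v[i] = 1.0
for n in range(1, N + 1):
    f.append(float(v @ P[:, j]))     # first hit of j at step n
    v = v @ Q                        # survive another step avoiding j

# Check the decomposition for every n up to N.
for n in range(1, N + 1):
    assert abs(p[n] - sum(f[k] * pjj[n - k] for k in range(1, n + 1))) < 1e-12
```

(Note that $p_{jj}^0=1$ by convention, so the $k=n$ term contributes $f_{ij}^n$.)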

We get:

$P_{ij}(s)=p_{ij}^0+\sum^\infty_{n=1}s^n\sum^n_{k=1}f_{ij}^kp_{jj}^{n-k}$

One can easily flip that around (replacing $k$ by $n-k$ in the inner sum):

$P_{ij}(s)=p_{ij}^0+\sum^\infty_{n=1}s^n\sum^{n-1}_{k=0}f_{ij}^{n-k}p_{jj}^k$

and get close:

$P_{ij}(s)=p_{ij}^0+\sum^\infty_{n=1}\sum^{n-1}_{k=0}s^{n-k}f_{ij}^{n-k}s^kp_{jj}^k$

Then suddenly! Magic!

$=p_{ij}^0+\sum^\infty_{k=0}\sum^\infty_{n=k+1}s^{n-k}f_{ij}^{n-k}s^kp_{jj}^k$
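One way to read the step, as a sketch: both orders of summation enumerate the same triangular index set, and the reordering is legitimate for $|s|<1$, where the double series converges absolutely.

```latex
% Both double sums run over the same set of pairs
% T = \{(n,k) : n \ge 1,\ 0 \le k \le n-1\}
%   = \{(n,k) : k \ge 0,\ n \ge k+1\},
% so
\sum_{n=1}^{\infty}\sum_{k=0}^{n-1} s^{n-k} f_{ij}^{n-k}\, s^{k} p_{jj}^{k}
  \;=\; \sum_{(n,k)\in T} s^{n-k} f_{ij}^{n-k}\, s^{k} p_{jj}^{k}
  \;=\; \sum_{k=0}^{\infty}\sum_{n=k+1}^{\infty} s^{n-k} f_{ij}^{n-k}\, s^{k} p_{jj}^{k}
```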

The rest seems like something I can do.
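The end result can also be confirmed numerically; here is a sketch (again on a made-up 3-state chain) that truncates each power series at $N$ terms at a point $s$ with $|s|<1$, where the tail is $O(s^N)$, and compares both sides of $P_{ij}(s)=p_{ij}^0+F_{ij}(s)P_{jj}(s)$:

```python
import numpy as np

# Hypothetical 3-state chain; any stochastic matrix would do.
P = np.array([[0.2, 0.5, 0.3],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
i, j, s, N = 0, 1, 0.5, 60           # truncate at N terms

p_ij = [np.linalg.matrix_power(P, n)[i, j] for n in range(N)]
p_jj = [np.linalg.matrix_power(P, n)[j, j] for n in range(N)]

# First-passage probabilities via the taboo recursion (paths are
# killed the moment they enter j).
Q = P.copy(); Q[:, j] = 0.0
f_ij = [0.0]
v = np.zeros(len(P)); v[i] = 1.0
for n in range(1, N):
    f_ij.append(float(v @ P[:, j]))
    v = v @ Q

# Truncated generating functions evaluated at s.
P_ij = sum(p_ij[n] * s**n for n in range(N))
P_jj = sum(p_jj[n] * s**n for n in range(N))
F_ij = sum(f_ij[n] * s**n for n in range(1, N))

# P_ij(s) = p_ij^0 + F_ij(s) P_jj(s), up to truncation error.
assert abs(P_ij - (p_ij[0] + F_ij * P_jj)) < 1e-9
```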

So I looked at a simpler sum, $\sum^L_{i=1}\sum^i_{j=1}f(j)$ and noticed this is:

$\ \ \ f(1)$

$+f(1)+f(2)$

$+f(1)+f(2)+f(3)$

$...$

Which seems important, but I daren't make the jump to infinity.
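At the finite level the swap is easy to confirm mechanically; a quick sketch (the test function $f$ below is arbitrary, chosen only for the check):

```python
# Both orders enumerate the same triangular set
# {(i, j) : 1 <= j <= i <= L}.
f = lambda j: j * j + 1              # arbitrary test function
L = 50

lhs = sum(f(j) for i in range(1, L + 1) for j in range(1, i + 1))
rhs = sum(f(j) for j in range(1, L + 1) for i in range(j, L + 1))
assert lhs == rhs
```

The jump to infinity is safe when the double series converges absolutely, which is exactly what the hypothesis $|s|<1$ provides here.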

The question adds at the end "for $|s|<1$". Now, isn't this a bit pointless? Generating functions are actually useful when their parameter is taken to be $1$!

Definitions (if they are needed / help)

$f_{ij}^n$ denotes $\mathbb{P}[X_n=j,\ X_k\ne j\ \forall\, 1\le k\le n-1\mid X_0=i]$

or in English: the probability that, starting from state $i$, we reach state $j$ for the first time in exactly $n$ steps (i.e. via a path that does not visit $j$ at any earlier step).

$p_{ij}^n$ denotes $\mathbb{P}[X_n=j|X_0=i]$

It is the "probability that, starting from $i$, you end up at $j$ (via any path) in exactly $n$ steps".

Note that although I have used power-like notation, $p_{ij}^k\ne (p_{ij})^k$ in general.

(There are special cases where the two happen to coincide, e.g. when the matrix of state changes is the identity, but the superscript does not denote a power.)

There is 1 answer below.

I see no question in your post, except this consideration:

The question adds at the end "for |s|<1" now isn't this a bit pointless, generating functions considered for their parameter being 1 is when they are actually useful!

...which happens to be squarely wrong. No, the generating function $g:s\mapsto E[s^X]$ is not used only at $1$. Two basic examples:

  • The expectation $E[X]=g'(1)$ involves the behaviour of $g$ in an interval $(1-\varepsilon,1]$.
  • Each weight $P[X=n]=(n!)^{-1}g^{(n)}(0)$ involves the behaviour of $g$ at $0$ (if $n=0$) or in an interval $[0,\varepsilon)$ (if $n\geqslant1$).
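Both bullet points can be seen numerically; a small sketch (the four-point distribution below is made up):

```python
# Hypothetical distribution of X on {0, 1, 2, 3}.
p = [0.1, 0.2, 0.3, 0.4]             # P[X = n] for n = 0, 1, 2, 3

def g(s):                            # the pgf g(s) = E[s^X]
    return sum(pn * s**n for n, pn in enumerate(p))

# E[X] = g'(1), approached from inside (1 - eps, 1].
eps = 1e-6
mean_numeric = (g(1) - g(1 - eps)) / eps
mean_exact = sum(n * pn for n, pn in enumerate(p))   # = 2.0 here
assert abs(mean_numeric - mean_exact) < 1e-4

# P[X = 0] = g(0): the behaviour of g at 0.
assert abs(g(0) - p[0]) < 1e-12

# ...whereas g(1) = 1 always, regardless of the distribution.
assert abs(g(1) - 1.0) < 1e-12
```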

Actually, it is rather $g(1)$ that is not useful, since $g(1)=1$ always.