Let $ \mathbf{P}$ denote the "infinite matrix"
$$ \left[ \begin{array}{ccccc} 1 & 0 & 0 & 0 & \dots \\ 1 & 1 & 0 & 0 & \dots \\ 1 & 2 & 1 & 0 & \dots \\ 1 & 3 & 3 & 1 & \dots \\ \vdots & \vdots & \vdots & \vdots & \ddots \\ \end{array} \right]$$
with entries $ \mathbf{P}_{ij} = \dbinom{i-1}{j-1}$ and let $ \mathbf{I}$ denote the "infinite identity matrix." Compute the inverse of $ \mathbf{P} + \mathbf{I}$.
This was not my initial attempt; I couldn't think of anything at first. After some nudges, I looked at the $n\times n$ truncation $\mathbf{P}_n$. My first observation was that $\det \mathbf{P}_n=1$: expanding $\det \mathbf{P}_{n+1}$ along its last column, which is $(0,\dots,0,1)^T$, gives $\det \mathbf{P}_{n+1}=\det \mathbf{P}_n$, and $\det \mathbf{P}_1=1$. So each $\mathbf{P}_n$ is invertible, and it is meaningful to ask for the inverse. But when I inverted $\mathbf{P}_n$ for small values of $n$, I couldn't find any pattern, and I can't think of a method to cook up the solution. Can someone help me? I realise I should have presented my line of thinking more fully; I apologise and will add it later on.
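As a quick sanity check on the determinant claim, here is a minimal sketch (the helper name `pascal` is my own) that builds the truncations $\mathbf{P}_n$ and confirms $\det \mathbf{P}_n = 1$:

```python
from math import comb

import numpy as np

def pascal(n):
    # n x n truncation of P: the (i, j) entry is C(i-1, j-1) in the
    # problem's 1-based indexing, i.e. C(i, j) in 0-based indexing
    return np.array([[comb(i, j) for j in range(n)] for i in range(n)], dtype=float)

# P_n is lower triangular with unit diagonal, so det P_n = 1 exactly
for n in range(1, 9):
    assert round(np.linalg.det(pascal(n))) == 1
```

Since $\mathbf{P}_n$ is lower triangular with ones on the diagonal, the determinant is $1$ without any expansion at all; the expansion argument just makes the recursion explicit.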
How do I compute the inverse of an infinite matrix? And even if I can, what would I do with it? Thanks for any help.
For now, let's focus on inverting $\mathbf{P}$. (The method below should work for $\mathbf{I}+\mathbf{P}$ as well but I didn't want to start there). First, note that the elements of $\mathbf{P}^{-1}$ satisfy $$(\mathbf{P}\cdot\mathbf{P}^{-1})_{ik}=\sum_{j} \mathbf{P}_{ij}(\mathbf{P}^{-1})_{jk} =\sum_j \binom{i-1}{j-1}(\mathbf{P}^{-1})_{jk}=\delta_{ik}.$$ Note that this is still a linear algebra problem, albeit with an infinite number of variables and constraints.
I'll attack it with a generating function approach: multiplying the LHS by $x^i$ and summing over all $i\geq 1$ produces
\begin{align} \sum_{ij} \binom{i-1}{j-1}(\mathbf{P}^{-1})_{jk}x^i=\sum_j (\mathbf{P}^{-1})_{jk}\sum_i\binom{i-1}{j-1}x^i = \sum_{j=1}^\infty (\mathbf{P}^{-1})_{jk}\left(\frac{x}{1-x}\right)^j. \end{align} To justify the last equality, shift the index of summation by $i\mapsto i+j$: $$\sum_i\binom{i+j-1}{j-1}x^{i+j}=x^j\cdot \sum_i\binom{i+j-1}{i}x^{i}=\dfrac{x^j}{(1-x)^j}$$ since the last summation is a (negative) binomial series. If we repeat this on the RHS we simply get $x^k$, since the Kronecker delta kills the rest of the terms. We then let $y=\dfrac{x}{1-x}$, so that $x=\dfrac{y}{1+y}$, and equate the RHS and LHS to obtain
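The negative binomial series used above is easy to check numerically for a fixed $j$ and $|x|<1$ (a quick sketch; the cutoff of 200 terms is arbitrary but more than enough for convergence at $x=0.3$):

```python
from math import comb, isclose

x, j = 0.3, 4
# partial sum of sum_{i>=0} C(i+j-1, i) x^i, which should converge
# to (1 - x)^(-j) for |x| < 1
partial = sum(comb(i + j - 1, i) * x**i for i in range(200))
assert isclose(partial, (1 - x) ** (-j))
```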
$$ \sum_{j=1}^\infty (\mathbf{P}^{-1})_{jk}y^j =\left(\frac{y}{1+y}\right)^k=(-1)^k\left[\frac{(-y)}{1-(-y)}\right]^k=(-1)^k\cdot \sum_j\binom{j-1}{k-1}(-y)^j$$ with the last equality following from the prior equation (with $x$ replaced by $-y$). Identifying coefficients on both sides then finally gives $\boxed{(\mathbf{P}^{-1})_{jk}=(-1)^{j+k} \binom{j-1}{k-1}}$. Comparing with our original equation, this implies the summation $\sum_j (-1)^{j+k} \binom{i-1}{j-1}\binom{j-1}{k-1}=\delta_{ik}$; this almost certainly admits a counting proof via inclusion-exclusion.
So we may take $\mathbf{P}^{-1}$ as known, and can now focus on $(\mathbf{I}+\mathbf{P})^{-1}$. Something like the binomial inverse theorem should come in handy; I'll see if I can find a simple route.
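In the meantime one can generate exact data to hunt for a pattern: each truncation of $\mathbf{I}+\mathbf{P}$ is lower triangular with diagonal $2$, so its inverse can be computed exactly by forward substitution. A sketch using `fractions.Fraction` (the helper `inv_lower` is my own):

```python
from fractions import Fraction
from math import comb

def inv_lower(M):
    # exact inverse of a lower-triangular matrix via forward
    # substitution, solving M X = I one column at a time
    n = len(M)
    X = [[Fraction(0)] * n for _ in range(n)]
    for k in range(n):
        for i in range(k, n):
            s = Fraction(int(i == k)) - sum(M[i][j] * X[j][k] for j in range(k, i))
            X[i][k] = s / M[i][i]
    return X

n = 6
# 0-based truncation of I + P: entry (i, j) is C(i, j) + [i == j]
A = [[Fraction(comb(i, j) + (i == j)) for j in range(n)] for i in range(n)]
Ainv = inv_lower(A)
for row in Ainv:
    print([str(x) for x in row])
```

Every diagonal entry of the inverse comes out to $1/2$, as it must (the truncations have diagonal $2$), and each subdiagonal appears to be a fixed rational multiple of $\binom{i-1}{j-1}$, which looks like a promising lead.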