Given two binary gamma processes, $X = \Gamma(t; \gamma_1, \lambda_1)$ and $Y = \Gamma(t; \gamma_2, \lambda_2)$, what is their maximum covariance? Going by this answer, it would seem to be the product of their standard deviations, but I'm not sure how to derive the variance of the gamma *process*, as opposed to that of the gamma *distribution* from which it is generated.
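(My tentative reasoning for the variance, in case it helps: treating each process as a renewal process whose i.i.d. inter-event times are gamma with shape $\gamma$ and rate $\lambda$, so mean $\mu = \gamma/\lambda$ and variance $\sigma^2 = \gamma/\lambda^2$, the renewal central limit theorem gives, asymptotically,

$$\operatorname{Var} N(t) \;\sim\; \frac{\sigma^2}{\mu^3}\, t \;=\; \frac{\gamma/\lambda^2}{(\gamma/\lambda)^3}\, t \;=\; \frac{\lambda}{\gamma^2}\, t, \qquad t \to \infty,$$

and then Cauchy–Schwarz would bound the covariance of the two counts by $\sqrt{(\lambda_1/\gamma_1^2)(\lambda_2/\gamma_2^2)}\, t$. I'm not certain this asymptotic bound is actually attainable for the discretized binary sequences, though, which is why I'm asking.)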
Since in my research I am generating (binary) gamma processes from gamma distributions, I would like an answer I could apply to any two gamma distributions (specified by either their shape/scale or shape/rate parameters) to obtain their maximum covariance.
Edit: Example: Using Matlab's gamrnd function, let the shape/scale parameters of one gamma distribution be 4, 1 and of the other be 2, 4. Taking the cumulative sum of a randomly generated sequence of intervals and discretizing, I can get two corresponding binary processes like this:
0001000101001000100000010001100001010000100100110010001001010000010001001...
0000000000000000010001000100000010001000000010000000100000000000011000100...
But the two processes could also be like this:
0001000101001000100000010001100001010000100100110010001001010000010001001...
0001000000000000000001000000100010000000100000000000010001000000011001000...
The first process is unchanged; the second has had its intervals permuted so that it correlates better with the first. Crucially, such a permutation does not affect the parameters of the underlying gamma distribution. However, there is a limit on how similar the second process can be to the first, given these parameters. That limit is what I'm looking for.
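For concreteness, here is a rough sketch of how I generate these binary sequences (in Python/NumPy rather than Matlab, but the procedure is the same): cumulatively sum gamma-distributed inter-event draws and mark the unit-width bins containing an event. The function name `binary_gamma_process` is just mine.

```python
import numpy as np

rng = np.random.default_rng(0)

def binary_gamma_process(shape, scale, n_bins, rng):
    """Discretize a gamma renewal process into a 0/1 sequence:
    draw gamma inter-event times, accumulate them, and set to 1
    each unit-width bin in which an event falls."""
    # Draw roughly twice the expected number of inter-event times
    # needed to cover n_bins time units, topping up if we fall short.
    n_draws = int(2 * n_bins / (shape * scale)) + 10
    times = np.cumsum(rng.gamma(shape, scale, size=n_draws))
    while times[-1] < n_bins:
        extra = np.cumsum(rng.gamma(shape, scale, size=n_draws)) + times[-1]
        times = np.concatenate([times, extra])
    seq = np.zeros(n_bins, dtype=int)
    bins = times[times < n_bins].astype(int)
    seq[bins] = 1  # multiple events in one bin collapse to a single 1
    return seq

# The two distributions from the example: shape/scale 4, 1 and 2, 4.
x = binary_gamma_process(4, 1, 100_000, rng)
y = binary_gamma_process(2, 4, 100_000, rng)
print("".join(map(str, x[:73])))
print("".join(map(str, y[:73])))
print("empirical covariance:", np.cov(x, y)[0, 1])
```

For independently generated sequences the empirical covariance hovers near zero; my question is how large it could be made by permuting the intervals of one sequence, as in the example above.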