I have an algorithm with running time depending on a parameter $ x \in (0, 1) $.
How is the asymptotic complexity of such algorithms typically analyzed?
In my particular case the running time approaches $\infty$ as $x \to 1$, but I don't know what I can say beyond that.
EDIT
More details about my particular algorithm:
The algorithm terminates when the current score reaches a target $S$ ($S$ is a parameter). On each iteration, the score is either reset to 0 (with probability $x$) or increased by 1 (with probability $1 - x$). I am looking to better understand the expected number of iterations as a function of $x$.
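To make the process concrete, here is a minimal simulation sketch of the iteration count. It assumes the natural reading of the description above: each iteration the score resets to 0 with probability $x$ and otherwise increments by 1 (the function names are my own):

```python
import random

def iterations_to_reach(S, x, rng):
    """One run: count iterations until the score first reaches S.

    Assumed dynamics: each iteration the score resets to 0 with
    probability x, otherwise it increases by 1.
    """
    score = 0
    steps = 0
    while score < S:
        steps += 1
        if rng.random() < x:
            score = 0
        else:
            score += 1
    return steps

def mean_iterations(S, x, trials=10_000, seed=0):
    """Monte Carlo estimate of the expected number of iterations."""
    rng = random.Random(seed)
    return sum(iterations_to_reach(S, x, rng) for _ in range(trials)) / trials
```

For fixed $S$, estimates from `mean_iterations` grow rapidly as $x \to 1$, matching the observation that the running time diverges in that limit.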