I have a signal that looks like this:

To obtain an average, I apply this formula:
y[i] = α * x[i] + (1-α) * y[i-1]
With a relatively small value of α the output is "stable" enough for my application; the problem is that the output responds very slowly to significant changes in the input.
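For reference, here is a minimal Python sketch of that formula (function and variable names are mine, not from any library); it seeds the filter with the first sample to avoid startup bias:

```python
def ema(x, alpha):
    """First-order exponential moving average: y[i] = alpha*x[i] + (1-alpha)*y[i-1]."""
    y = [x[0]]  # seed with the first sample so the output starts on the signal
    for sample in x[1:]:
        y.append(alpha * sample + (1 - alpha) * y[-1])
    return y
```

With a small `alpha` (say 0.05) the output is smooth but lags a step change by many samples; with a large `alpha` it tracks quickly but passes noise through.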

So, I sort of solved the problem with a variable α that depends on the normalised input, like this:

normalised_input[i] = abs(x[i] - y[i-1]) / y[i-1]
α = α_fast * normalised_input + α_slow * (1 - normalised_input)

α is a linear interpolation between an α_fast that responds quickly to significant changes in the input and an α_slow that gives the average I want.
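A Python sketch of that adaptive scheme follows. Two details are my own assumptions, not in the description above: the normalised deviation is clamped to [0, 1] so α stays inside [α_slow, α_fast], and a small `eps` guards against division by zero when the running average is near zero:

```python
def adaptive_ema(x, alpha_slow=0.05, alpha_fast=0.5, eps=1e-9):
    """EMA whose alpha interpolates between alpha_slow and alpha_fast,
    driven by the normalised deviation of the input from the last output.
    Clamping and the eps guard are assumptions, not part of the original formula."""
    y = [x[0]]
    for sample in x[1:]:
        deviation = abs(sample - y[-1]) / (abs(y[-1]) + eps)  # normalised input
        deviation = min(deviation, 1.0)  # clamp so alpha stays in [alpha_slow, alpha_fast]
        alpha = alpha_fast * deviation + alpha_slow * (1 - deviation)
        y.append(alpha * sample + (1 - alpha) * y[-1])
    return y
```

On a steady input the filter behaves like the slow EMA; on a large step the deviation saturates and the filter momentarily uses α_fast, so it jumps most of the way to the new level in one sample.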
I suspect there is a better way to do this, so I wonder: what is the most common approach to this kind of problem?