I was wondering if this logic makes sense.
I have a time series spanning 21 years, and the slope of the fitted linear trend is -0.19, so the rate is -0.19 per year. If I want the rate per decade instead, does it make sense to simply multiply the yearly rate by 10?
In this example the rate would become -1.9 per decade. Is that logic sound, or do I need additional calculations to perform this conversion?
Yes. You can show this explicitly with dimensional analysis, using the conversion factor between years and decades: $$\require{cancel} -0.19\ \frac{\text{units}}{\text{year}} = -0.19\ \frac{\text{units}}{\cancel{\text{year}}} \times 10\ \frac{\cancel{\text{years}}}{\text{decade}} = -1.9\ \frac{\text{units}}{\text{decade}}$$
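As a quick numerical sanity check, here is a minimal Python sketch. The data are made up (a synthetic series with a true trend near -0.19 units/year plus noise), so the variable names and noise level are purely illustrative; the point is that the fitted slope comes out in units per year because the x-axis is in years, and the per-decade rate is just that slope times 10.

```python
import numpy as np

# Synthetic example: 21 yearly observations with a trend of
# roughly -0.19 units/year plus random noise.
rng = np.random.default_rng(0)
years = np.arange(2000, 2021)  # 21 years
values = -0.19 * (years - years[0]) + rng.normal(0, 0.5, years.size)

# Least-squares fit: the slope is in units per year,
# since the independent variable is measured in years.
slope_per_year, _ = np.polyfit(years, values, deg=1)

# 1 decade = 10 years, so the conversion is a single multiplication.
slope_per_decade = slope_per_year * 10

print(f"{slope_per_year:.3f} units/year = {slope_per_decade:.3f} units/decade")
```

Note that this only rescales the units of the estimate; it does not change the fit itself. Equivalently, you could express the time axis in decades before fitting and recover the same per-decade slope directly.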