I have a dataset of the cumulative ice-mass change of the Greenland Ice Sheet, in Gt (gigatonnes), together with its uncertainty (error estimate):
yr=[2003.038356 2003.117808 2003.208219 2003.287671 2003.358904 2003.539726 2003.619178 2003.709589 2003.789041 2003.879452 2003.958904 2004.019126 2004.128415 2004.20765 2004.289617 2004.379781 2004.459016 2004.538251 2004.628415 2004.70765 2004.789617 2004.879781 2004.959016];
mc=[94.91467285 -106.5794067 -39.28155518 -0.194824219 -4.950927734 -193.9120483 -397.7717896 -389.0446777 -302.5618896 -271.7980347 -341.026062 -325.5195313 -319.352417 -287.8536377 -210.249939 -255.0377808 -234.324707 -405.8927002 -533.7174072 -574.6280518 -641.2060547 -519.1424561 -509.010376];
mc_error=[281.3519897 178.404007 159.776001 135.2619934 186.845993 115.685997 127.3939972 122.6439972 101.6019974 111.5830002 110.0019989 243.348999 123.1790009 93.52189636 107.526001 162.6710052 194.0769958 233.6569977 876.4349976 418.4100037 698.5339966 162.121994 129.7559967];
I found a paper that used the slope of a linear regression to derive the mean annual mass change (in Gt yr-1) from the same type of data (though from a different source, so the numbers may differ).
When I apply a linear regression to my data, I get a slope (i.e. a mean mass change) of -276.11 Gt yr-1. But how do I determine (1) the acceleration (in Gt yr-2) and (2) the uncertainty/error on that slope, as done in the paper (-276.11 +/- what)? Can someone help me?
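For concreteness, here is a sketch of what I imagine doing, in Python/NumPy (the paper does not say what software it used, so this is just my guess at the approach): a quadratic fit would give the acceleration as twice the leading coefficient, and the covariance matrix returned by `np.polyfit` would give the 1-sigma errors on the fitted parameters. Is this the right idea?

```python
import numpy as np

# My data: decimal year, cumulative mass change (Gt), 1-sigma error (Gt)
yr = np.array([2003.038356, 2003.117808, 2003.208219, 2003.287671,
               2003.358904, 2003.539726, 2003.619178, 2003.709589,
               2003.789041, 2003.879452, 2003.958904, 2004.019126,
               2004.128415, 2004.20765, 2004.289617, 2004.379781,
               2004.459016, 2004.538251, 2004.628415, 2004.70765,
               2004.789617, 2004.879781, 2004.959016])
mc = np.array([94.91467285, -106.5794067, -39.28155518, -0.194824219,
               -4.950927734, -193.9120483, -397.7717896, -389.0446777,
               -302.5618896, -271.7980347, -341.026062, -325.5195313,
               -319.352417, -287.8536377, -210.249939, -255.0377808,
               -234.324707, -405.8927002, -533.7174072, -574.6280518,
               -641.2060547, -519.1424561, -509.010376])
mc_error = np.array([281.3519897, 178.404007, 159.776001, 135.2619934,
                     186.845993, 115.685997, 127.3939972, 122.6439972,
                     101.6019974, 111.5830002, 110.0019989, 243.348999,
                     123.1790009, 93.52189636, 107.526001, 162.6710052,
                     194.0769958, 233.6569977, 876.4349976, 418.4100037,
                     698.5339966, 162.121994, 129.7559967])

# Center time so the polynomial fits are well-conditioned; shifting the
# origin changes the intercept but not the slope (Gt/yr) or curvature.
t = yr - yr.mean()

# (1) Linear fit: the slope is the mean mass change, and the square root
# of the covariance diagonal is its 1-sigma standard error.
p1, cov1 = np.polyfit(t, mc, 1, cov=True)
slope, slope_err = p1[0], np.sqrt(cov1[0, 0])

# Weighted variant that folds in the measurement errors (w = 1/sigma).
p1w, cov1w = np.polyfit(t, mc, 1, w=1.0 / mc_error, cov=True)

# (2) Quadratic fit m(t) = a*t^2 + b*t + c: acceleration = 2*a, in Gt/yr^2.
p2, cov2 = np.polyfit(t, mc, 2, cov=True)
accel, accel_err = 2.0 * p2[0], 2.0 * np.sqrt(cov2[0, 0])

print(f"slope        = {slope:8.2f} +/- {slope_err:6.2f} Gt/yr")
print(f"acceleration = {accel:8.2f} +/- {accel_err:6.2f} Gt/yr^2")
```

(If this is roughly right, I would still like to know whether the paper's quoted uncertainty is this kind of regression standard error or something propagated from the measurement errors.)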
Thanks!
