I have the following array of numbers:
Data = [1.0093723027146584, 1.0046505670796182, 1.0161141384219396, 1.0134308125226419, 0.9968327836577084,
0.9991179639395691, 1.004309884935752, 1.0045739950900365, 1.0034488516079065, 1.0123691848397993]
After calculating the weighted average of these numbers based on the weights:
Weights = [0.10402577, 0.07763709, 0.09376677, 0.06940965, 0.05807695,
0.11757057, 0.13287357, 0.13127618, 0.12285175, 0.0925117 ]
I find the weighted average to be $1.00623$.
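For reference, here is a minimal sketch of how I computed that value (assuming NumPy; `data` and `weights` are the arrays above):

```python
import numpy as np

data = np.array([1.0093723027146584, 1.0046505670796182, 1.0161141384219396,
                 1.0134308125226419, 0.9968327836577084, 0.9991179639395691,
                 1.004309884935752, 1.0045739950900365, 1.0034488516079065,
                 1.0123691848397993])
weights = np.array([0.10402577, 0.07763709, 0.09376677, 0.06940965, 0.05807695,
                    0.11757057, 0.13287357, 0.13127618, 0.12285175, 0.0925117])

# Weighted average: sum(w_i * x_i) / sum(w_i); the weights above sum to 1
avg = np.average(data, weights=weights)
print(avg)  # ~1.00623
```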
How do I find the error of this weighted average? Should it be the standard deviation of Data? How do the weights contribute to the error bar?
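To make the question concrete, these are the two candidates I am unsure between (a sketch reusing `data`, `weights`, and `avg` from the snippet above; I don't know which, if either, is the right error bar):

```python
# Candidate 1: plain (unweighted) sample standard deviation of the data,
# ignoring the weights entirely
std_plain = np.std(data, ddof=1)

# Candidate 2: weighted standard deviation, using the same weights as
# the average: sqrt(sum(w_i * (x_i - avg)^2) / sum(w_i))
var_weighted = np.average((data - avg) ** 2, weights=weights)
std_weighted = np.sqrt(var_weighted)

print(std_plain, std_weighted)
```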
Thanks!