How to use linear regression to find the most sensitive independent variable?


I have a curve. I took the roughly linear part of it between the maximum and the minimum and fitted a linear regression to it. Now I'm told I should use the regression to find the most sensitive $x$ value. How do I do that? The slope is the same everywhere on a regression line! Here are the exact words:

Based on the graph $U=f(a)$ determine the range of the fastest changes of the voltage $U$ as a function of angle $a$ (the range at which the graph is almost linear). In this range, calculate the maximum angle sensitivity of the Hall probe using linear regression.

Best answer:

You are misunderstanding the task and appear to be doing it backwards. Let me try to rephrase it:

  • The angle sensitivity is the slope of your graph ($f'$).
  • There is a region of your graph where it has a maximum slope. Your first task is to find this region.
  • Once you have identified this region, calculate the slope within it via linear regression; that slope is the maximum angle sensitivity (by definition).
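The steps above can be sketched in a few lines of NumPy. Everything here is hypothetical: a cosine stands in for the Hall-probe response, and the window width `w` is an arbitrary choice; substitute your own measured angles and voltages.

```python
import numpy as np

# Hypothetical measurements U = f(a): angle in degrees, voltage in volts.
# A cosine mimics a Hall-probe response; replace with your own data.
a = np.linspace(0, 180, 37)          # angle samples, 5-degree steps
U = 2.5 * np.cos(np.radians(a))      # voltage readings

# 1) Find the region of steepest change: slide a window over the data,
#    fit a line in each window, and keep the window with the largest |slope|.
w = 7  # window width in samples (assumption: enough points for a stable fit)
best = max(range(len(a) - w + 1),
           key=lambda i: abs(np.polyfit(a[i:i+w], U[i:i+w], 1)[0]))

# 2) Linear regression within that region; the slope dU/da is the
#    maximum angle sensitivity.
slope, intercept = np.polyfit(a[best:best+w], U[best:best+w], 1)
print(f"steepest region: {a[best]:.0f} to {a[best+w-1]:.0f} degrees, "
      f"sensitivity = {abs(slope):.4f} V/deg")
```

For this cosine the steepest window straddles 90 degrees, as expected, and the fitted slope approximates the analytic derivative there.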

What is arguably confusing is the following phrasing:

In this range, calculate the maximum angle sensitivity of the Hall probe using linear regression.

I would have written:

From this range, calculate the maximum angle sensitivity of the Hall probe using linear regression.