Formula for Contractor's Hourly Rate


For the purposes of this problem, assume 4 weeks per month, so the number of work hours per year is 12 (months) x 4 (weeks per month) x 40 (hours per week) = 1920 hours per year.

Furthermore, assume the goal is to incentivize companies to hire a contractor for as many hours per week and as many months as possible.

If a contractor is hired for 12 months at 40 hours a week, a multiplier of 1 is applied to a set hourly rate (e.g. $100 per hour x 1 = $100).

The minimum engagement is 1 month at 5 hours a week. The multiplier for that is 2 (e.g. $100 per hour x 2 = $200).

What formula can be used to compute this multiplier? What should the multiplier be if the contractor is hired for 6 months at 20 hours a week? Or for 3 months at 15 hours a week?

I am trying to find a formula that maps 1 (month) and 5 (hours a week) to a multiplier of 2, and that same formula should map 12 (months) and 40 (hours a week) to a multiplier of 1.

UPDATE: I think 22.5 hours a week (halfway between 5 and 40) at 6.5 months (halfway between 1 and 12) would probably yield a 1.5 multiplier.

BEST ANSWER

Assuming you want a linear relationship between total hours worked and the multiplier, you have two data points: $(20, 2)$ for the minimum engagement ($1 \times 4 \times 5 = 20$ hours) and $(1920, 1)$ for the full engagement ($12 \times 4 \times 40 = 1920$ hours). The line through these two points is $m = 2 - \frac{h - 20}{1900}$, where $m$ is the multiplier and $h$ is the total number of contracted hours. For your examples: 6 months at 20 hours a week gives $h = 480$ and $m = \frac{167}{95} \approx 1.76$; 3 months at 15 hours a week gives $h = 180$ and $m = \frac{182}{95} \approx 1.92$.
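A quick sketch of this linear fit in Python, using the question's two anchor points (20 total hours → multiplier 2, 1920 total hours → multiplier 1) and its 4-weeks-per-month convention; the function name `multiplier` is just an illustrative choice:

```python
def multiplier(months, hours_per_week, weeks_per_month=4):
    """Linear multiplier through (20 h, 2.0) and (1920 h, 1.0).

    Total hours assume 4 weeks per month, per the question's setup.
    """
    h = months * weeks_per_month * hours_per_week  # total contracted hours
    # Line through (20, 2) and (1920, 1): slope = -1/1900.
    return 2 - (h - 20) / 1900

print(multiplier(12, 40))  # full engagement -> 1.0
print(multiplier(1, 5))    # minimum engagement -> 2.0
print(multiplier(6, 20))   # 480 hours, roughly 1.76
print(multiplier(3, 15))   # 180 hours, roughly 1.92
```

Writing the line in point-slope form, $m = 2 - (h - 20)/1900$, keeps the two anchor values exact rather than splitting them across a slope and an intercept fraction.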