Suppose I took an exam in a class and I know the raw percentage I scored. The class average was low, so the professor decides to curve the scores. The professor then posts a list of letter grades (US grading system) along with the lowest raw percentage needed to earn each letter. Example:
- A+: 95%
- A: 89%
- A-: 84%
- B+: 79%
- etc...
Each letter also corresponds to a standard curved percentage, so an A+ means a curved score of at least 97%, an A at least 93%, an A- at least 90%, etc. So with the information I have, I know which letter-grade "bucket" I fall into, i.e. which range of percentages my curved score lies in.
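To make the setup concrete, here is a small sketch that maps a known raw score to its letter bucket and the corresponding standard curved range. The cutoffs are the hypothetical ones from the example above, and the curved ranges are my assumption of the standard US scale:

```python
# Raw-percentage cutoffs the professor posted (hypothetical example values):
# letter -> lowest raw percentage that earns that letter, in descending order.
raw_cutoffs = [
    ("A+", 95.0),
    ("A", 89.0),
    ("A-", 84.0),
    ("B+", 79.0),
]

# Standard US curved ranges for each letter (assumed): letter -> (low, high).
curved_ranges = {
    "A+": (97.0, 100.0),
    "A": (93.0, 97.0),
    "A-": (90.0, 93.0),
    "B+": (87.0, 90.0),
}

def letter_for(raw_score):
    """Return the letter grade for a raw score, given descending cutoffs."""
    for letter, cutoff in raw_cutoffs:
        if raw_score >= cutoff:
            return letter
    return None  # below the lowest listed cutoff

raw = 91.0
letter = letter_for(raw)           # "A" with the cutoffs above
low, high = curved_ranges[letter]  # curved score lies somewhere in [93, 97)
print(letter, low, high)
```

So knowing only the cutoff table and my raw score, I can recover the bucket `(low, high)` but nothing more precise.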
The problem I'm interested in solving is this: given only the above information, what is the best way to approximate my curved percentage score? I know which range of percentages my score falls in, but how can I make a good point estimate of the actual value? I'm not interested in this question for practical purposes so much as out of curiosity.
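One naive baseline I can think of (my own assumption, not something given in the question) is to interpolate linearly: place the raw score proportionally within its raw cutoff interval, then map that proportion onto the curved bucket. A sketch:

```python
def interpolate_curved(raw, raw_low, raw_high, curved_low, curved_high):
    """Linearly map a raw score from [raw_low, raw_high) to [curved_low, curved_high)."""
    t = (raw - raw_low) / (raw_high - raw_low)  # position within the raw bucket, in [0, 1)
    return curved_low + t * (curved_high - curved_low)

# Example with the hypothetical cutoffs above: a raw score of 91 falls in the
# A bucket [89, 95), which maps onto the standard A range [93, 97).
est = interpolate_curved(91.0, 89.0, 95.0, 93.0, 97.0)
print(round(est, 2))  # → 94.33
```

This assumes scores are spread uniformly within the bucket, which is almost certainly not what the professor's curve actually does, so I'd still like a more principled estimate.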
Additionally, I may or may not know the mean, median, and standard deviation of the class's raw scores. However, since I don't know what kind of curve the professor used, I'm not sure whether those statistics are relevant to solving this problem.