Likelihood of knowing the answer given y correct answers on multiple choice test


I've been trying to solve the following problem but am thrown off by the "guessing in case she doesn't know the answer". The problem is as follows:

A student takes a test with n questions of equal difficulty. For each question, the student has a probability $\theta$ of knowing the answer and in that case she solves the question perfectly. If she does not know the answer, she guesses and then she has a probability of 0.5 of solving it correctly. The responses to all n questions can be considered independent.

Based on this information, the following is asked:

Derive analytically the likelihood function for $\theta$ given y correctly solved items.

The "likelihood given y correctly solved items" confuses me. I've seen many examples of solving $P(\theta|Y=1)$ using Bayes' theorem (e.g. see here). But how can this be generalized to a likelihood for $\theta$ given y correctly solved items? Do we even need Bayes' theorem, or can this be answered with a binomial distribution corrected for guessing when she doesn't know the answer?

Furthermore, a follow-up question is asked:

Assuming a beta prior with constants a and b, derive an expression for the marginal probability $p(y)$. Simplify the expression, but leave the integral if there is no closed-form solution.

This makes me believe Bayes' theorem would indeed be used for the first question. Any help on any part of this problem is greatly appreciated.

Accepted answer:

The likelihood of $\theta$ is $\mathbb P(Y=y | \theta)$ (the probability of the data given the parameter). But $Y$ is just a binomial random variable given $\theta$, where the success probability is given by the probability of getting a question correct. So all we need to do is calculate the probability of getting any one question correct.

$$p_\theta:=\mathbb P (\text{correct}) = \mathbb P(\text{correct} | \text{known})\mathbb P(\text{known}) + \mathbb P(\text{correct} | \text{unknown})\mathbb P(\text{unknown}) = \theta + (1-\theta)\cdot0.5$$
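This per-question success probability can be checked by simulating the know-or-guess process directly (a minimal sketch; $\theta = 0.6$ and the trial count are arbitrary illustrative values, not from the problem):

```python
import random

# Monte Carlo check of p_theta = theta + (1 - theta) * 0.5.
# theta = 0.6 is an arbitrary illustrative value.
random.seed(0)
theta = 0.6
trials = 200_000

correct = 0
for _ in range(trials):
    if random.random() < theta:      # she knows the answer: always correct
        correct += 1
    elif random.random() < 0.5:      # otherwise she guesses, correct with prob. 0.5
        correct += 1

# Empirical frequency should be close to theta + (1 - theta) * 0.5 = 0.8
print(correct / trials)
```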

Then $Y \sim \text{Binomial}(n, p_\theta)$, so the likelihood of $\theta$ given the observed count $y$ is: $$L(\theta \mid y) = \binom{n}{y}{p_\theta}^{y}(1-p_\theta)^{n-y}, \qquad p_\theta = \theta + (1-\theta)\cdot 0.5 = \frac{1+\theta}{2}$$
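For the follow-up, the marginal is $p(y) = \int_0^1 \binom{n}{y} p_\theta^{y} (1-p_\theta)^{n-y}\, \text{Beta}(\theta; a, b)\, d\theta$, which has no elementary closed form because of the guessing correction inside $p_\theta$. A stdlib-only numerical sketch (the values $n = 10$, $a = b = 2$ are arbitrary illustrative choices; a sanity check is that the marginals over $y = 0, \dots, n$ sum to 1):

```python
from math import comb, gamma

def likelihood(theta, y, n):
    """Binomial likelihood of y correct out of n, with the
    guessing-corrected success probability p = theta + (1 - theta) * 0.5."""
    p = theta + (1 - theta) * 0.5
    return comb(n, y) * p**y * (1 - p)**(n - y)

def beta_pdf(theta, a, b):
    """Beta(a, b) density, via the gamma function (stdlib only)."""
    return gamma(a + b) / (gamma(a) * gamma(b)) * theta**(a - 1) * (1 - theta)**(b - 1)

def marginal(y, n, a, b, grid=10_000):
    """p(y) = integral of p(y | theta) * Beta(theta; a, b) over [0, 1],
    approximated by the midpoint rule on a uniform grid."""
    h = 1.0 / grid
    return sum(likelihood((i + 0.5) * h, y, n) * beta_pdf((i + 0.5) * h, a, b)
               for i in range(grid)) * h

n, a, b = 10, 2, 2                                 # illustrative values
total = sum(marginal(y, n, a, b) for y in range(n + 1))
print(total)                                       # should be very close to 1.0
```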