Okay, so I am trying to find unbiased and consistent estimators of the parameter $a$ from a sequence of RVs representing rolls of an unfair die: it rolls 1 with probability $1+a$, 6 with probability $1-a$, and each of the other faces with probability $\frac{1}{6}$.
Here's where I'm confused: I know that one way is to find a Maximum Likelihood estimator, which requires me to find the probability mass function of each RV, multiply them, take the log, apply calculus, and so on. Except, how do I put this RV into one nice equation that's easy to multiply? All the examples I find online conveniently use the Bernoulli distribution, which has a nice and tidy PMF. How do I even tackle something like this?
I assume you meant $(1 + \alpha)/6$ and $(1 - \alpha)/6$, where $0 < \alpha < 1$, for the respective probabilities of faces 1 and 6, and that we roll the die $n$ times.
Intuitively, and immediately obvious from the likelihood, one should ignore counts for outcomes other than 1 and 6 as irrelevant. Let $X_1$ and $X_6$ be the respective counts of 1 and 6 in $n$ rolls.
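To address the "one nice equation" part of the question: a standard device (sketched here) is to write the PMF of a single roll $X$ with indicator exponents,

$$p(x \mid \alpha) = \left(\frac{1+\alpha}{6}\right)^{\mathbf{1}[x=1]} \left(\frac{1}{6}\right)^{\mathbf{1}[2 \le x \le 5]} \left(\frac{1-\alpha}{6}\right)^{\mathbf{1}[x=6]},$$

so that multiplying over $n$ independent rolls collapses to

$$L(\alpha) = \left(\frac{1+\alpha}{6}\right)^{X_1} \left(\frac{1}{6}\right)^{\,n - X_1 - X_6} \left(\frac{1-\alpha}{6}\right)^{X_6}.$$

The middle factor does not involve $\alpha$, which is why only $X_1$ and $X_6$ matter.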
Upon finding the derivative of the log-likelihood function, setting it to zero, etc., the estimator turns out to be $\hat\alpha = (X_1 - X_6)/(X_1 + X_6).$
I'll leave the details of that, and the discussion of unbiasedness and consistency, to you.
I tried simulations with 10 million rolls, for $\alpha = 0.1$ and $0.3$, and got three-place accuracy.
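If you want to reproduce a simulation along these lines, here is a minimal sketch in Python with NumPy (the function name `simulate_mle` is my own; smaller $n$ than 10 million is used just to keep it quick):

```python
import numpy as np

def simulate_mle(alpha, n, rng):
    """Roll the biased die n times and return the estimate (X1 - X6)/(X1 + X6)."""
    # Face probabilities: 1 and 6 are perturbed, faces 2-5 stay at 1/6.
    p = np.array([(1 + alpha) / 6, 1/6, 1/6, 1/6, 1/6, (1 - alpha) / 6])
    rolls = rng.choice(np.arange(1, 7), size=n, p=p)
    x1 = np.count_nonzero(rolls == 1)
    x6 = np.count_nonzero(rolls == 6)
    return (x1 - x6) / (x1 + x6)

rng = np.random.default_rng(0)
print(simulate_mle(0.3, 10**6, rng))  # should be close to 0.3
```

With $n = 10^6$ the estimate typically lands within a couple of thousandths of the true $\alpha$, consistent with the accuracy reported above.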