Let $Y_1, Y_2, \ldots, Y_n$ be iid random variables with density $f(y)=\theta\cdot y^{\theta-1}$, $0<y<1$, $\theta >0$.
I need to show that the maximum likelihood estimator of $\theta$ is $-n\bigm/\sum\ln Y_i$.
I know the first step is to get the likelihood of $\theta$:
$$L(\theta) = \theta\cdot 1^{\theta-1} \cdot \theta\cdot 2^{\theta-1} \cdots \theta\cdot n^{\theta-1} $$ But I am at a loss as to how to simplify this; I keep producing scratches of dead-end work... (Something is wrong with my algebra or partial differentiation.)
Your likelihood is not correct; it should be $L(\theta) = (\theta y^{\theta-1}_1) \cdot (\theta y^{\theta-1}_2) \cdot \ldots \cdot (\theta y^{\theta-1}_n)$, which is $\theta^n (\prod^n_{i=1} y_i)^{\theta-1}$. Then $\ln(L(\theta)) = n \ln \theta + (\theta-1) \sum^n_{i=1} \ln y_i$. Differentiating with respect to $\theta$ gives $n/\theta + \sum^n_{i=1} \ln y_i$. Equating this to zero gives $\hat{\theta} = -n / \sum \ln y_i$. (Note that each $\ln y_i < 0$ since $0 < y_i < 1$, so the estimator is positive.)
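As a quick sanity check on the derivation, here is a small simulation sketch. It assumes we sample from $f$ by inverse CDF: since $F(y) = y^\theta$, we have $Y = U^{1/\theta}$ for $U \sim \mathrm{Uniform}(0,1)$. The estimator should land near the true $\theta$:

```python
import math
import random

random.seed(0)
theta_true = 3.0   # arbitrary true parameter for the check
n = 100_000

# Inverse-CDF sampling: F(y) = y^theta on (0,1), so Y = U^(1/theta).
ys = [random.random() ** (1.0 / theta_true) for _ in range(n)]

# MLE from the derivation: theta_hat = -n / sum(ln y_i).
theta_hat = -n / sum(math.log(y) for y in ys)
print(theta_hat)  # should be close to theta_true = 3.0
```

Since $-\ln Y_i$ is exponential with rate $\theta$, $\hat{\theta}$ concentrates around $\theta$ as $n$ grows, so the printed value should sit within a few hundredths of 3.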