Let's say $X_i ∼ Bernoulli(p)$ for $p \in (0,1)$. We can estimate $p$ using some number $t$ of i.i.d. samples $X_1, \dots , X_t ∼ Bernoulli(p)$, forming the sample mean $\bar X=\tfrac{1}{t} \sum_{i=1}^t X_i$.
Now, given an accuracy threshold $\alpha \in (0,1)$, I want to find $t$ in terms of $p$ and $\alpha$ such that $$P[(1-\alpha)p \leq \bar X \leq (1 + \alpha)p] > \frac{9}{10}.$$
I have no idea how to start with this. My understanding of what this means is: how many times do we need to sample (which is the parameter $t$) so that, with probability at least 90%, the estimate $\bar X$ is within a relative error of $\alpha$ of the true value $p$?
Is this the correct way to think about this problem? If so, my approach would be: say $t=1$; what is my accuracy then? Having solved that, how does the accuracy change as $t$ increases?
So when $t=1$, we get $\bar X = X_1$. This is where I get stuck: how can I incorporate $\alpha$ and compute the probability above?
You can use Hoeffding's inequality, which states that for any $\epsilon > 0$, $$ P\left( \left |\frac{1}{n} \sum_{i=1}^n X_i - p \right| > \epsilon \right) \le 2 \exp (-2n\epsilon^2). $$ This implies that $$ P\left( \left |\frac{1}{n} \sum_{i=1}^n X_i - p \right| \le \epsilon \right) \ge 1- 2 \exp (-2n\epsilon^2) =: 1-\delta. $$ Put in a slightly different way, we have with probability at least $1-\delta$ that $$ p-\epsilon \le \frac{1}{n} \sum_{i=1}^n X_i \le p+\epsilon. $$

Now note that $$ \delta = 2 \exp (-2n\epsilon^2) \implies n = \frac{1}{2 \epsilon^2} \log \left(\frac{2}{\delta}\right). $$ This number might not be an integer, so we need to take $n=\left\lceil \frac{1}{2 \epsilon^2} \log \left(\frac{2}{\delta}\right) \right\rceil$.

In your case, the failure probability is $\delta = 0.1$, and your relative tolerance $\alpha$ corresponds to the absolute tolerance $\epsilon = \alpha p$. For instance, if we take $\delta = 0.1$ and specify a tolerance of $\epsilon = 0.05$, then $$ n=\left\lceil \frac{1}{2\times 0.05^2} \log \left(\frac{2}{0.1}\right) \right\rceil = \lceil 599.15\rceil = 600. $$ The sample size obviously increases as your tolerance $\epsilon$ decreases (more samples are needed to be sure that the parameter lies in a smaller interval).
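As a sanity check, here is a minimal sketch in Python that computes the Hoeffding sample size for given $\epsilon$ and $\delta$, then simulates Bernoulli trials to check empirically that the coverage exceeds $1-\delta$ (the function name `hoeffding_n` and the choice $p = 0.3$ are my own for illustration):

```python
import math
import random

def hoeffding_n(eps, delta):
    """Sample size from Hoeffding: n = ceil(log(2/delta) / (2 * eps^2))."""
    return math.ceil(math.log(2 / delta) / (2 * eps**2))

n = hoeffding_n(eps=0.05, delta=0.1)
print(n)  # 600, matching the calculation above

# Empirical coverage: fraction of repetitions where |sample mean - p| <= eps.
# Hoeffding is conservative, so this should be well above 1 - delta = 0.9.
random.seed(0)
p, eps, trials = 0.3, 0.05, 2000
hits = sum(
    abs(sum(random.random() < p for _ in range(n)) / n - p) <= eps
    for _ in range(trials)
)
print(hits / trials)
```

Because the bound ignores the actual variance $p(1-p)$, the observed coverage is typically much higher than the guaranteed $90\%$.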
Note that there are sharper bounds than Hoeffding's, but this should be a good start for your analysis.