In my physics course, we're covering physical pendulums, and our assignment is to analyze the angles in the interval $\left[0, \frac{\pi}{6}\right]$ to show that $\sin\theta \approx \theta$ (the small-angle approximation). (I completed my analysis using Desmos.)
After completing that analysis, we are to estimate the angle, in radians, at which the error of the approximation is approximately one percent. The error function I'm using is $$ E_1(x) = \frac{x - \sin x}{\sin x} \cdot 100, $$ where $x$ is in radians. I then rewrote the RHS as $$ E_1(x) = (x\csc x - 1) \cdot 100. $$ Since the error threshold is one percent, I set $E_1(x) = 1$, which leads to the equation $$ x \csc x - 1.01 = 0. $$ The best I could do was to graph the function in Desmos and read off the roots.
However, I was hoping to get some pointers as to how to solve this equation algebraically. While searching Math Stack Exchange, I happened upon a related question, but the best I could gather is that the approach depends on the type of equation you have.
Is there a purely algebraic approach to solve this equation? Any advice and/or pointers to further reading would be appreciated.
Most likely, your instructor wants you to find an approximate value of the angle at which the error is 1%. Your equation is transcendental, so it cannot be solved in closed form by algebraic manipulation.
You could use a numerical method such as Newton's Method to approximate the roots to desired accuracy. That presupposes your familiarity with calculus (or at least with differentiation).
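As a sketch of what that looks like in practice, here is Newton's method applied to your equation. Note that I've rewritten $x\csc x - 1.01 = 0$ as $f(x) = x - 1.01\sin x = 0$, which is equivalent for $\sin x \neq 0$ and avoids the singularity of $\csc x$ near $0$ (where $x = 0$ is also a trivial root, so we start the iteration away from it). The starting guess $x_0 = 0.5$ and the tolerances are my own choices, not anything prescribed by the problem:

```python
import math

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: iterate x -> x - f(x)/f'(x) until the step is tiny."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton's method did not converge")

# x*csc(x) - 1.01 = 0 rewritten as f(x) = x - 1.01*sin(x) = 0.
# x = 0 is a trivial root, so start the iteration away from it.
f = lambda x: x - 1.01 * math.sin(x)
fp = lambda x: 1 - 1.01 * math.cos(x)   # derivative of f

root = newton(f, fp, x0=0.5)
print(root)                               # ~0.2441 rad (about 14 degrees)
print(100 * (root / math.sin(root) - 1))  # relative error: ~1%
```

Each iteration roughly doubles the number of correct digits once you are close, which is why a handful of steps suffices.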
Another alternative, which is what you did, is to solve it graphically: plot $y_1=1$ and $y_2=\frac{x-\sin x}{\sin x}$ and assess visually where they intersect. From that visual estimate, you can plug in nearby values and, by trial and error, home in on the point where the error reaches 1%.
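That trial-and-error refinement can be made systematic with bisection: once the graph tells you the crossing lies between two values, repeatedly halve the bracket. The bracket $[0.1, 0.5]$ below is just an illustrative choice read off a graph, and this relies on the error being increasing on that interval:

```python
import math

def error_pct(x):
    """Percent error of the small-angle approximation sin(x) ~ x, x in radians."""
    return 100 * (x - math.sin(x)) / math.sin(x)

# Bracket the 1% crossing (error_pct(0.1) < 1 < error_pct(0.5)), then bisect.
lo, hi = 0.1, 0.5
while hi - lo > 1e-10:
    mid = (lo + hi) / 2
    if error_pct(mid) < 1:
        lo = mid        # crossing is to the right of mid
    else:
        hi = mid        # crossing is to the left of mid

print(lo)   # ~0.244 rad, i.e. roughly 14 degrees
```

Bisection is slower than Newton's method but needs no derivative and cannot overshoot, which makes it a natural next step after eyeballing a graph.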