A particular cryptographic hash is represented as a $57$ byte string, encoded as base $64$.
RWSvUZXnw9gUb70PdeSNnpSmodCyIPJEGN1wWr+6Time1eP7KiWJ5eAM
I want to convert that string to a human-readable string, composed of dictionary words, of a similar size.
For example this string could be rendered as something like
football arrow never water
Assuming a dictionary of $10,000$ English words, to figure out how many words it would take to represent this string, I need to solve for $x$:
$$\large (57\text{ bytes}) ^ {\text{base }64} = (x\text{ bytes}) ^ {10,000 \text{ dictionary words}}$$
My first thought is to raise both sides to the power $1/64$ to get rid of the exponent of $64$:
$$\large (57^{64}) ^ {1/64} = (x^{10,000}) ^ {1/64}$$
However, I'm not sure how to do this. Wolfram Alpha returns $x^{625/4}$, which doesn't make much sense.
How can I solve this?
My answer takes your post literally and assumes you mean 57 bytes, expressed in base 64.
However, Mark Fischler's answer is correct if you actually meant 57 base 64 digits, which is not the same thing.
The base of the representation doesn't affect how much information the string has. In other words, it takes the same amount of information to write 57 bytes in binary as it does to write those very same 57 bytes in base 64. So the right computation to do is $$\begin{align*} 57\text{ bytes}=456\text{ bits} \quad &\leadsto\quad 2^{456}\text{ possibilities}\\ x\text{ English words}\quad &\leadsto\quad 10000^x\text{ possibilities} \end{align*}$$ and you want $$\begin{align*} 2^{456}&\leq 10000^x\\\\ 2^{456}&\leq 10^{4x}\\\\ 456&\leq 4x\log_2(10)\\\\ \frac{114}{\log_2(10)}&\leq x \end{align*}$$ which happens for $x\geq 35$. That is, you need a string of $35$ English words to encode $57$ bytes, assuming your English words are being chosen from a fixed $10000$ word dictionary.
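The derivation above can be checked, and the encoding itself carried out, with a short Python sketch. This is a minimal illustration, assuming a fixed $10000$-entry word list; the word indices stand in for the actual dictionary lookups.

```python
import math

DICT_SIZE = 10_000            # assumed size of the fixed word list
data = bytes(range(57))       # any 57-byte hash; a stand-in value here

# 57 bytes carry 456 bits of information (2**456 possibilities),
# while x words carry x * log2(10000) bits, so we need
# 10000**x >= 2**456, i.e. x >= 456 / log2(10000).
words_needed = math.ceil(len(data) * 8 / math.log2(DICT_SIZE))
print(words_needed)  # 35

# Encode: treat the 57 bytes as one big integer and take its
# base-10000 digits; each digit is an index into the dictionary.
n = int.from_bytes(data, "big")
indices = []
for _ in range(words_needed):
    n, idx = divmod(n, DICT_SIZE)
    indices.append(idx)

# Decode: recombine the digits and recover the original bytes.
m = 0
for idx in reversed(indices):
    m = m * DICT_SIZE + idx
assert m.to_bytes(len(data), "big") == data
```

The round-trip works because $10000^{35} > 2^{456}$ ($35 \cdot \log_2 10000 \approx 465$ bits), so every 57-byte value fits in $35$ base-$10000$ digits.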