Simply:
What is the intuition you get about the data if you are given standard deviation?
More detailed:
It is easy to imagine some information about the data if you are told, for example, the mean or median. Likewise, if you are told that some quantity is in the range $5 \pm 0.001$, you again have an idea of the value and of its spread. But can you build a similar intuition about the data if you are given the standard deviation (or perhaps another, more practical quantity)?
I understand that this perhaps depends on the probability distribution, but that is unfortunately rarely discussed in practice: data are usually measured, put into a table, and summarized with a few basic statistics. Say, for example, you are a software developer who needs to measure the latency of a system over time and present it to management; they are usually not interested in probability distributions (although I feel this is wrong, and one should know or assume some probability distribution whenever working with data...).
Note:
Please note that this is not a question about how to calculate the standard deviation or why the specific formula for the standard deviation was chosen, as other questions here already address that. This question is strictly about having practical intuition when working with data (whether providing or interpreting it).
Feel free to provide any examples you like to demonstrate the answer, I did not want to limit this question by focusing on too specific situation.
For general distributions, Chebyshev's inequality applies: https://en.wikipedia.org/wiki/Chebyshev%27s_inequality
It says that at least $1-\frac{1}{k^2}$ of the data falls within $k$ standard deviations of the mean. (E.g. at least $\frac34$ of the data falls within $2$ standard deviations of the mean.)
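To connect this to the latency scenario from the question, here is a small sketch that checks Chebyshev's bound empirically on simulated latency measurements. The lognormal distribution and its parameters are just an illustrative assumption, not part of the inequality; the bound holds for any dataset when computed with the population standard deviation.

```python
import random
import statistics

# Simulated latency samples (lognormal is an assumed, illustrative choice).
random.seed(42)
data = [random.lognormvariate(3, 0.5) for _ in range(10_000)]

mu = statistics.fmean(data)       # sample mean
sigma = statistics.pstdev(data)   # population standard deviation

for k in (2, 3):
    # Fraction of samples within k standard deviations of the mean.
    within = sum(abs(x - mu) <= k * sigma for x in data) / len(data)
    bound = 1 - 1 / k**2          # Chebyshev's guaranteed lower bound
    print(f"k={k}: observed {within:.3f} >= bound {bound:.3f}")
```

For well-behaved distributions the observed fraction is usually far above the bound (e.g. roughly 95% within 2 standard deviations for a normal distribution, versus Chebyshev's guaranteed 75%), which is why the inequality is useful mainly as a worst-case intuition.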