In (American) primary school, they first teach you how to reckon with simple fractions in numerator-over-denominator "vulgar" form, often using the image of a pie as an illustration. They teach you how to reduce fractions. They teach you about fractions like $3/2$ where the numerator is greater than the denominator, but call these "improper" and require that they be converted to a mixed number such as $1½$. They then forget about all of the above and teach you decimal fractions, which you use more or less exclusively for the remainder of your primary and secondary education.
How often are vulgar rather than decimal fractions used in modern mathematical writing? Obviously they're often used to represent mathematical expressions involving variables—$\frac{x}{2}$ seems preferred to $0.5x$—but how often will you see simple rational constants like one-half represented in the form of a numerator over a denominator? Are mixed numbers ever used?
As far as I can tell, there are three ways to render a vulgar fraction: $\frac{1}{2}$, $½$, and $1/2$. I don't know what these are properly called, so I'm going to call them "vertical form", "slanted form", and "solidus form". Is there a dominant preference among these forms, and if so, how strong is it?
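For concreteness, here is how the three forms might be produced in LaTeX (a sketch; the slanted form assumes the `nicefrac` or `xfrac` package, since plain LaTeX has no built-in command for it):

```latex
\documentclass{article}
\usepackage{nicefrac} % provides \nicefrac
\usepackage{xfrac}    % provides \sfrac (an alternative)
\begin{document}

% Vertical form: numerator stacked over denominator
$\frac{1}{2}$

% Slanted form: small numerator and denominator on a slash
$\nicefrac{1}{2}$ or $\sfrac{1}{2}$

% Solidus form: full-size characters separated by a slash
$1/2$

\end{document}
```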
I personally like vulgar fractions because I feel they more strongly suggest precise ratios where a decimal fraction hints that a quantity may have been rounded or truncated. I don't mind the solidus form, but to me it suggests integer division, possibly because that's what it often means in various programming languages. Are these impressions at all in line with actual mathematical usage?
If I submitted an article to a mathematical journal that used vulgar fractions (or, for that matter, mixed numbers), would it be rejected on those grounds? Would the form in which they were typeset matter? Would it be accepted, but edited to better match convention?
In brief: