Why is 0.5 rounded up? It is exactly halfway between 0 and 1, so it is no closer to 1 than it is to 0.
The same principle applies to other values like 11.5 or 23.5: each is exactly halfway between two integers. I don't see any objectively correct direction to round in these cases.
You can say the same thing about something like 6.45 when rounding to the nearest tenth: 0.45 is no closer to 0.50 than it is to 0.40. It's exactly halfway between them.
Why are they rounded up? Is it just because we need an arbitrary rule for all cases?
Edit: I read everything on the duplicate question, but I still feel left hanging. If the answer is "it's arbitrary," that just seems so unsatisfying. Is there a way to prove whether something in math is arbitrary or not?
As far as I know, it's pretty much just an arbitrary convention, chosen partly for simplicity: you only ever have to look at one digit (0–4 rounds down, 5–9 rounds up). One caveat, though: always rounding halves up does introduce a slight upward bias on average, because every tie moves the value in the same direction. That's why some contexts (accounting, IEEE 754 floating-point arithmetic) use a different tie-breaking rule, "round half to even," under which 2.5 rounds to 2 but 3.5 rounds to 4, so ties cancel out over many values.
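If it helps to see that the tie-breaking rule really is a chosen convention rather than something forced on us, here is a short Python sketch. Python 3's built-in `round` happens to use round-half-to-even, while the standard-library `decimal` module lets you pick the rule explicitly; the 6.45 example from the question is used below.

```python
from decimal import Decimal, ROUND_HALF_UP, ROUND_HALF_EVEN

# Python 3's built-in round() breaks ties toward the even neighbor,
# so consecutive halves do not all move in the same direction.
print(round(0.5), round(1.5), round(2.5), round(3.5))  # 0 2 2 4

# The decimal module makes the convention explicit.
# The familiar schoolbook "round half up" rule:
print(Decimal("0.5").quantize(Decimal("1"), rounding=ROUND_HALF_UP))     # 1
print(Decimal("6.45").quantize(Decimal("0.1"), rounding=ROUND_HALF_UP))  # 6.5

# The very same tie under "round half to even" goes the other way,
# because 4 is the even digit:
print(Decimal("6.45").quantize(Decimal("0.1"), rounding=ROUND_HALF_EVEN))  # 6.4
```

Both answers (6.4 and 6.5) are equally defensible mathematically; which one you get depends only on which convention the system was told to follow.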