I have programmed a simple Python version of Gauss's addition trick. It came up while I was watching a YouTube video about a story from his childhood. The formula for $1 + 2 + 3 + 4 + \ldots + n$ that I came across seemed overly complex. I like to think I can recognize patterns, so I spent a minute thinking about it and came up with this:
n = int(input('Number: '))
result = (n * n + n) / 2  # (n^2 + n) / 2
print(result)
Output:
Number: 100
5050.0
Number: 14
105.0
Number: 123456789
7620789436823655.0
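As a side note (a small variant, not part of the original code): since one of $n$ and $n+1$ is always even, the sum is a whole number, so integer division `//` keeps the result an exact `int` rather than a `float`, which avoids rounding once inputs get far larger than the ones above:

```python
# Same formula, but with integer division (//) so the result
# stays an exact int (5050 rather than 5050.0). For very large n,
# floats would eventually round; ints in Python never do.
def gauss_sum(n):
    return (n * n + n) // 2

print(gauss_sum(100))        # 5050
print(gauss_sum(123456789))  # 7620789436823655
```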
After testing this I find the results are the same, and it seems easier to compute than $$\frac{n(n + 1)}{2}.$$ So my question is: why is the formula written with the factored form $n(n + 1)$ when it's easier to calculate $$\frac{(n \times n) + n}{2} ?$$
It's the same amount of computation: one add, one multiply, one divide by two.
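One way to check this concretely (a sketch using CPython's `dis` module; the exact instruction names vary a little across Python versions, but the arithmetic opcodes all begin with `BINARY`):

```python
import dis

# Compile each formula as a standalone expression and count the
# arithmetic (BINARY_*) instructions in its bytecode. Both forms
# need exactly three: an add, a multiply, and a divide.
def arithmetic_ops(expr):
    code = compile(expr, "<expr>", "eval")
    return sum(
        instr.opname.startswith("BINARY")
        for instr in dis.get_instructions(code)
    )

print(arithmetic_ops("n * (n + 1) / 2"))   # 3
print(arithmetic_ops("(n * n + n) / 2"))   # 3
```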
When you are working with expressions like this, writing them in factored form (i.e., $n(n+1)$) often lets them combine more readily with other expressions that occur in the computation.
However, you have to stay alert for opportunities to apply the distributive law in various ways to combine or split expressions.
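For the record, the equivalence between the two forms is a single application of the distributive law:
$$\frac{n(n + 1)}{2} = \frac{n \cdot n + n \cdot 1}{2} = \frac{(n \times n) + n}{2},$$
so the two formulas always produce the same value; they differ only in how they are written.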