Algorithm pseudocode:
1 def find_max(data):
2     biggest = data[0]       # The initial value to beat
3     for val in data:        # For each value:
4         if val > biggest:   # if it is greater than the best so far,
5             biggest = val   # we have found a new best (so far)
6     return biggest          # When the loop ends, biggest is the max
Now, we know that steps 1, 2, 4, and 6 are each O(1), and that step 3 contributes O(n) iterations as the length n of data grows. But since this is a worst-case analysis, on an array that is already sorted in increasing order we expect the condition in step 4 to succeed, and hence step 5 to execute, (n-1) times. Since that step is nested within step 3, shouldn't the worst-case asymptotic complexity be O(n^2) instead of O(n)?
There is just one loop over the data. You examine each data point, and if it's greater than the best so far, you update the best so far, which is one step (we are presumably using a model of computation where each data value fits in one "word" and can be compared with another or copied in one step). Nesting only multiplies complexities when the inner part itself takes more than constant time, such as an inner loop over the data; a constant-time statement inside a loop contributes O(1) per iteration, so the total is $n \cdot O(1) = O(n)$. Even in your worst case, the loop body runs n times and the assignment runs at most once per iteration, giving at most $2n$ constant-time operations. So your algorithm takes $O(n)$ steps, not $O(n^2)$.
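To make this concrete, here is a runnable version of the pseudocode instrumented with operation counters (the counters are an illustrative addition, not part of the original algorithm). Running it on already-sorted ascending input, the questioner's worst case, shows both counts growing linearly with n:

```python
def find_max(data):
    comparisons = 0
    assignments = 0
    biggest = data[0]          # the initial value to beat
    for val in data:           # one pass over the data
        comparisons += 1
        if val > biggest:      # is it greater than the best so far?
            assignments += 1
            biggest = val      # new best so far
    return biggest, comparisons, assignments

# Worst case for the assignment count: sorted ascending input.
for n in (10, 100, 1000):
    _, comps, assigns = find_max(list(range(n)))
    print(n, comps, assigns)   # comparisons = n, assignments = n - 1
```

The assignment executing (n-1) times does not multiply with the loop's n iterations; it is part of the at-most-constant work done inside each of the n iterations, so the total stays linear.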