Computing the sum of the result of the cross-minima of a given vector and a sequence of values


I don't know if the title is clear, but the brute-force way of doing what I want is this: I have a vector of size $n$, and another vector of size $m$ (in my case, a linear space of all values from 0 up to the maximum value of the first vector). I build the $n \times m$ matrix of pairwise minima between the two, then sum each column to get one result per threshold value.

You can think of it as a simulation: I have a vector that is a realization of a random variable, and I want to know, for each of many truncation points, what the sum would have been if the random variable had been truncated at that point.

Basically it is this:

$$ g(t) = \sum_x \min(f(x), t) $$

(writing $t$ for the truncation threshold, since $n$ already denotes the vector's size).

Example: for the vector $[1, 3, 4]$, I want the output $[3, 5, 7, 8]$, i.e. the values $g(1)$ through $g(4)$.
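The brute-force version can be sketched in NumPy (a minimal sketch; the variable names and the choice of thresholds $1$ through $\max$ are assumptions matching the example):

```python
import numpy as np

# Naive O(n*m) approach: build the full n x m matrix of pairwise minima
# via broadcasting, then sum along the original vector's axis.
v = np.array([1, 3, 4])
thresholds = np.arange(1, v.max() + 1)  # assumed: thresholds 1..max(v)

# v[:, None] has shape (n, 1), thresholds[None, :] has shape (1, m);
# np.minimum broadcasts them to an (n, m) matrix of elementwise minima.
g = np.minimum(v[:, None], thresholds[None, :]).sum(axis=0)
print(g)  # [3 5 7 8]
```

This reproduces the example output, at the cost of materializing the whole $n \times m$ matrix.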

Is there a cleverer way to achieve this result without computing the minima of all combinations? I thought about doing a cumulative sum or something.

---

Answer:

I have no proof of this, but I came up with something. If I build a count vector of the values in my original vector (how many times each value from 1 up to the maximum occurs), then for the $[1, 3, 4]$ vector I get:

[1, 0, 1, 1]

Then I make a backwards cumulative sum:

[3, 2, 2, 1]

And then a forward cumulative sum:

[3, 5, 7, 8]

Funny huh