Imagine a function that has the following inputs:
Input 1: an array of 10 real numbers, each between 50 and 150.
Input 2: the number of seconds that have passed since 1st January, 1970 (i.e. Unix time).
It runs some mystery algorithm (one we don't actually know) on these inputs and outputs a single real number between 0 and 100.
Here is an illustration to explain this function.
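To make the setup concrete, the interface of the function can be sketched as a stub like the one below. The name `mystery_function` is just a placeholder I'm using for illustration; the actual algorithm is unknown.

```python
import time

def mystery_function(input1: list[float], input2: float) -> float:
    """Placeholder for the unknown algorithm.

    input1: 10 real numbers, each between 50 and 150.
    input2: seconds since 1st January, 1970 (Unix time), e.g. time.time().
    Returns a single real number between 0 and 100.
    """
    raise NotImplementedError("The real algorithm is a black box.")
```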
I need to figure out a way to get an output of 90 or above. Since we don't know the underlying algorithm, we can use a random number generator to produce 10 random real numbers within the limits for Input 1, and test. We keep doing this with different numbers each time until we get the output we want.
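That trial-and-error loop is just random search. A minimal sketch, assuming the black box is callable as `black_box(input1, input2)` (a hypothetical signature for illustration):

```python
import random
import time

def random_search(black_box, target=90.0, max_tries=10_000, seed=None):
    """Sample random Input 1 vectors until the output reaches the target
    (or we give up after max_tries attempts). Returns the best found."""
    rng = random.Random(seed)
    best_inputs, best_output = None, float("-inf")
    for _ in range(max_tries):
        # Draw 10 random reals in [50, 150] for Input 1.
        candidate = [rng.uniform(50.0, 150.0) for _ in range(10)]
        # Input 2 is always "now": seconds since 1st January, 1970.
        output = black_box(candidate, time.time())
        if output > best_output:
            best_inputs, best_output = candidate, output
        if output >= target:
            break
    return best_inputs, best_output
```

Note that because Input 2 keeps increasing, re-running the loop with the same candidates can give different outputs; this sketch simply keeps the best result seen so far.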
Seems simple. But here's the catch: Input 2 always has to be the number of seconds that have passed since 1st January, 1970. This input can never decrease, cannot stay the same for more than a second, and keeps increasing. Because of this, for any specific Input 1, we don't get the same output each time.
However, it has been observed that the output usually doesn't vary too much. For example, if for a specific Input 1 the output is 55.667, then 10 seconds later it might be 55.234 or 55.88905, and two hours later it might be 53.8 or 56.339453. You get the point. But this is not a guarantee: sometimes the output varies much more than this. I need to train a machine that always (or as often as possible) produces 10 numbers for Input 1 such that the output is 90 or above, whenever I run the machine.
What machine learning or prediction algorithm do I need to implement to achieve this?