Sequences of integers might be ordered, totally ordered,... For all such attributes we find definitions in clear mathematical terms.
(1) But what is, in clear mathematical terms, the established and commonly accepted definition of when an arbitrary sequence of integers is called random (or non-random)? [Please note that I am really not looking for intuitions, thoughts, or ideas (we can find lots of those on the internet), but rather a clear mathematical definition that I can apply deductively to test whether an arbitrary sequence of integers is random or not.]
(2) In particular: when we observe an arbitrary sequence of integers for the first time, we may not immediately identify a possibly complex background of relations or patterns behind the numbers; such relations could, however, be discovered at a later point in time, once sufficiently investigated. It would be paradoxical if such a sequence could be regarded as random for some time and non-random later. Hence my second question: Is our knowledge about a sequence a criterion to be considered in the definition of the attribute "random"?
Thank you in advance!
I totally agree with the others who have commented that no arbitrary sequence of numbers is inherently random. For example, the sequence of primes (as you've pointed out) has been called random, yet we know that it can be represented by an (admittedly complex) system of Diophantine equations. So the primes form a rather well-ordered collection, just an incredibly complicated one.
Probability and randomness are ways of working with uncertainty, whether epistemological or ontological. If you see a sequence of numbers that you can't make sense of, then viewing them as if they were generated by a random process may help you make progress (as with the primes, for example). However, no matter how successful your probability model is, unless you know how the numbers were produced, you cannot conclude that they were, in principle, not predictable with 100% certainty. A truly random process will limit the in-principle predictability to something less than 100%. Whether such processes truly exist in some philosophical sense is debated, but practically, there are many processes that are accurately modeled as inherently random.
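To make the "model a deterministic sequence as random" idea concrete, here is a minimal Python sketch using the classical heuristic that an integer near n behaves as if it were prime with probability 1/ln(n). The cutoff of 10,000 and the helper names are arbitrary choices of mine, not anything from the discussion above:

```python
import math

# The primes are fully deterministic, yet pretending that "an integer
# near n is prime with probability 1/ln(n)" predicts the prime count
# below a bound surprisingly well -- a random model of a non-random
# sequence that makes progress possible.

def is_prime(n):
    """Trial-division primality test (fine for small n)."""
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

N = 10_000
actual = sum(is_prime(n) for n in range(2, N))          # true prime count below N
predicted = sum(1 / math.log(n) for n in range(2, N))   # probabilistic prediction

print(actual, round(predicted))  # the two counts agree to within a few percent
```

The model is "wrong" in the strict sense (primality is not a coin flip), but as the answer argues, its success is independent of whether the underlying process is truly random.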
As a strong counterexample to your attempt at a rigorous definition of randomness: even pseudo-random number generators, which drive the vast majority of simulation studies and empirical probabilistic studies, are demonstrably NOT random. Yet pretending that they are and applying probability theory leads to generally excellent results! So true randomness is not even needed for probability theory... so what is the utility of such a definition?
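The PRNG point can be demonstrated in a few lines of Python. This is just an illustrative sketch; the seed value and sample size are arbitrary choices of mine:

```python
import random

# A pseudo-random number generator is a fully deterministic algorithm:
# the same seed reproduces exactly the same "random" sequence. Yet its
# output behaves statistically like uniform random draws, which is why
# applying probability theory to it works so well in practice.

def sample(seed, n=10_000):
    """Draw n floats from a generator initialized with a fixed seed."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(n)]

a = sample(42)
b = sample(42)
assert a == b  # deterministic: demonstrably not random in the strict sense

# Yet the usual probabilistic reasoning applies: the sample mean of
# uniform draws on [0, 1) lands close to the theoretical value 1/2.
print(sum(a) / len(a))
```

The same tension the answer describes is visible here: the sequence is perfectly predictable to anyone who knows the seed, yet treating it as random gives correct statistical conclusions.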