I've heard that there are several different ways to define randomness. If so, what is the strictest definition that is widely accepted as defining a type or sort of randomness? By "strictest", I mean the one with the most demanding requirements. I don't want to pin down what I mean by "strict", because I don't know how many definitions there are, and a precise definition might make the question harder to answer.
I am also interested in what result or distribution the definition would yield for a large number A.
There are many definitions of randomness, most based on properties such as equal probability of each value and the absence of correlations among values at every scale.
Perhaps the "strictest" definition is due to Kolmogorov: a string (e.g., of characters or bits) is random if and only if no computer program shorter than the string itself can produce it. In other words, a random string is incompressible: its shortest description is essentially the string itself.
There are methodological challenges to making Kolmogorov's definition practical: Kolmogorov complexity is uncomputable in general, and the definition presumes some very deep properties about the universality of computation. Conceptually, though, it has much to recommend it.
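Although the exact Kolmogorov complexity of a string cannot be computed, any off-the-shelf compressor gives an upper bound on it, which lets you demonstrate the incompressibility idea in practice. A minimal sketch using Python's `zlib` as a stand-in compressor (an illustrative proxy only, not Kolmogorov complexity itself):

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Upper bound on the description length of `data`, via zlib.

    A string that compresses well is certainly NOT Kolmogorov-random;
    a string that does not compress at all may be random."""
    return len(zlib.compress(data, 9))

# A highly patterned 1000-byte string compresses to a few dozen bytes.
structured = b"ab" * 500
print(compressed_size(structured))

# 1000 bytes from the OS entropy source barely compress at all --
# the output is about as long as the input (plus format overhead).
random_ish = os.urandom(1000)
print(compressed_size(random_ish))
```

Note the asymmetry: compression can prove a string is *not* random (by producing a shorter description), but no tool can prove a string *is* random, since a shorter program might always exist undetected.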