Quanta Magazine's April 13, 2023 article "A New Approach to Computation Reimagines Artificial Intelligence" starts with:
By imbuing enormous vectors with semantic meaning, we can get machines to reason more abstractly — and efficiently — than before.
Later on, the explanation includes these paragraphs:
The vectors must be distinct. This distinctness can be quantified by a property called orthogonality, which means to be at right angles. In 3D space, there are three vectors that are orthogonal to each other: One in the x direction, another in the y and a third in the z. In 10,000-dimensional space, there are 10,000 such mutually orthogonal vectors.
But if we allow vectors to be nearly orthogonal, the number of such distinct vectors in a high-dimensional space explodes. In a 10,000-dimensional space there are millions of nearly orthogonal vectors.
I remember reading previous questions here in which high dimensions and dot products are discussed, and seeing comments about how easy it is to get very small or even zero dot products in high dimensions, but I've never worked outside of one-, two- and three-dimensional problems.
Question: What definition of "nearly orthogonal" would result in "In a 10,000-dimensional space there are millions of nearly orthogonal vectors"? Would it be, for example, that the dot product¹ is smaller than some number like 0.1?

¹ of the presumably normalized vectors
Based on discussion and references in the comments (principally from user L.F.) it appears that, as I guessed, two vectors are considered “nearly orthogonal” if their dot product is “small”: that is, $a$ and $b$ are nearly orthogonal if, in the context of a particular value $\epsilon$, we have $$-\epsilon \le \frac{a\cdot b}{\lvert a\rvert \lvert b \rvert} \le \epsilon.\tag{$\star$}$$ The smaller the value of $\epsilon$, the stricter the requirement imposed by “nearly orthogonal”. As $\epsilon$ goes to zero, the meaning of “nearly orthogonal” approaches actual orthogonality.
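The condition $(\star)$ is straightforward to check numerically. Here is a minimal sketch (the function name `nearly_orthogonal` is my own, not from the discussion) that tests whether the cosine of the angle between two vectors lies in $[-\epsilon, \epsilon]$:

```python
import numpy as np

def nearly_orthogonal(a, b, eps):
    """True if the normalized dot product of a and b lies in [-eps, eps]."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return abs(cos) <= eps

# Exactly orthogonal vectors pass for any eps >= 0.
print(nearly_orthogonal([1, 0, 0], [0, 1, 0], 0.1))   # True
# A slightly tilted pair: cosine is about 0.0995, still within eps = 0.1.
print(nearly_orthogonal([1, 0], [0.1, 1], 0.1))       # True
```

Note that the normalization by the vector norms makes the test scale-invariant, matching the form of $(\star)$.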
This Math Overflow post asks how, for given $n$ and $\epsilon$, one can find a large family of vectors from $\Bbb R^n$ that are nearly orthogonal in this sense. The top answer there, by Bill Johnson, cites the so-called Johnson-Lindenstrauss lemma and claims that you can find a family of $k$ nearly-orthogonal vectors if
$$n\ge C \epsilon^{-2} \log k$$
where $C$ is a fixed constant no larger than $8$. Bill Johnson is (or at least purports to be) one of the namesakes of the Johnson-Lindenstrauss lemma, so the answer is likely to be reliable.
Turning this around, we have that, given $n$ and $\epsilon$, one can find at least $$e^{n\epsilon^2/8}$$ nearly-orthogonal vectors. Note that the appearance of $e$ here is rather arbitrary, as its value can be absorbed into the $C$. For the specific case of $k \approx 10^6, n=10000$ that you asked about, we find that $\epsilon = 0.12$ is sufficient to find many millions of nearly-orthogonal vectors, but $\epsilon = 0.1$ may not be.
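The arithmetic behind that last claim can be checked directly. Rearranging $n \ge C\epsilon^{-2}\log k$ with $C = 8$ gives $k \le e^{n\epsilon^2/8}$; the helper below (my own naming) evaluates this bound for the two values of $\epsilon$ mentioned above:

```python
import math

def max_nearly_orthogonal(n, eps, C=8):
    """Largest k guaranteed by the bound n >= C * eps**-2 * log k."""
    return math.exp(n * eps ** 2 / C)

n = 10_000
# eps = 0.12: exp(10000 * 0.0144 / 8) = exp(18), tens of millions.
print(f"{max_nearly_orthogonal(n, 0.12):.3g}")
# eps = 0.10: exp(10000 * 0.01 / 8) = exp(12.5), only a few hundred thousand.
print(f"{max_nearly_orthogonal(n, 0.10):.3g}")
```

So with this constant, $\epsilon = 0.12$ clears the millions threshold while $\epsilon = 0.1$ falls short, as stated.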
(Beware; some of the answers seem to consider the less strict constraint that the dot product lie in $[-1, \epsilon]$ rather than in $[-\epsilon, \epsilon]$, and I have not thought carefully about how this will affect the results. For large $n$, not too much, I think.)
A reply by Timothy Gowers explains why this result is plausible: The vectors in $\Bbb R^n$ lie on the unit $(n-1)$-sphere, and each vector can be thought of as excluding a portion of this sphere that is proportional to $(1-\epsilon)^{n-1}$.
Separate answers by Ryan O'Donnell and by ‘kodlu’ provide a method for locating $2^{O(\epsilon^2n)}$ nearly-orthogonal unit vectors: simply select random vectors whose components are $\pm \frac1{\sqrt n}$; by a probabilistic argument these will usually be nearly-orthogonal.
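That random construction is easy to try empirically. The sketch below (dimensions and sample count are my own choices, not from the answers) draws $k = 200$ random sign vectors in $n = 10{,}000$ dimensions and reports the largest pairwise dot product in absolute value; since each component is $\pm\frac1{\sqrt n}$, every vector is automatically a unit vector:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 10_000, 200  # dimension and number of random vectors

# Each component is +1/sqrt(n) or -1/sqrt(n), so every row is a unit vector.
V = rng.choice([-1.0, 1.0], size=(k, n)) / np.sqrt(n)

# Pairwise dot products are the off-diagonal entries of the Gram matrix.
G = V @ V.T
off_diag = G[~np.eye(k, dtype=bool)]
print("largest |dot product|:", np.abs(off_diag).max())
```

Each dot product is a sum of $n$ independent $\pm\frac1n$ terms, so it has standard deviation $\frac1{\sqrt n} = 0.01$ here; in practice the largest of the roughly $20{,}000$ pairwise values comes out well below $0.1$, consistent with the probabilistic argument.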
Disclaimer: I tried to summarize the MO discussion, but I did not think about any of it carefully, so I may have gotten the details wrong. Jelani Nelson suggests consulting Problems and Results in Extremal Combinatorics Part I, by Noga Alon, for details. This is currently available online from Professor Alon's web site at Tel Aviv University. The pertinent part seems to be section 9.