According to Theorem 5 of Dr Edgard's paper, the VC dimension can be estimated by the function $O(ρ^2)$:
Theorem 5. The class of functions computed by multilayer neural networks with binary as well as linear activations and ρ weights has VC dimension $O(ρ^2)$.
My question is: how can we quantify this function? What is the function $O$, and how can we evaluate it?
For example, if we have a neural network with 10 weights, can we just say that its VC dimension is approximated by $10^2 = 100$?
It sounds like I was missing some simple details: the function $O$ refers to the big $O$ notation.
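To make the big-$O$ reading concrete, here is a minimal sketch. The theorem says the VC dimension is at most $C \cdot ρ^2$ for *some* constant $C$ (and large enough $ρ$); the constant `C` in the code below is hypothetical, since the theorem does not tell us its value. So with 10 weights the bound has the *form* $C \cdot 100$, but we cannot conclude the VC dimension is exactly 100.

```python
# Sketch: O(rho^2) is an asymptotic upper bound, not an exact formula.
# The constant C is hypothetical -- the theorem only guarantees that some
# such constant exists; it does not say what it is.

def vc_dimension_upper_bound(num_weights: int, C: float = 1.0) -> float:
    """Return C * rho^2, the shape of the bound from Theorem 5.

    With the (made-up) choice C = 1 and rho = 10 weights this gives 100,
    but the true VC dimension is only guaranteed to lie at or below
    C * rho^2 for the real, unknown C.
    """
    return C * num_weights ** 2

# What big-O does tell us is the scaling: doubling the number of
# weights quadruples the bound, regardless of the constant.
print(vc_dimension_upper_bound(10))  # 100.0 with the hypothetical C = 1
print(vc_dimension_upper_bound(20))  # 400.0
```

The useful takeaway is the growth rate: the bound grows quadratically in the number of weights, which is meaningful for comparing architectures even though the constant is unspecified.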