Today, this crazy question came to mind.
Let's consider the following optimization via Genetic Algorithm:
$$f(x_1,x_2)=(x_1-0.5)^2+(x_2-0.5)^2$$
where
$$0\le x_1\le1 ~,~~~~~0\le x_2\le1$$
Suppose I use a GA with the following crossover and mutation operators:
crossover: $X_1=\{x_1^1,x_2^1\}, X_2=\{x_1^2,x_2^2\} \rightarrow X_{child}=rX_1+(1-r)X_2$
where $r$ is a random number. (Here all random numbers are uniformly distributed between 0 and 1.)
mutation: $X_1=\{x_1^1,x_2^1\} \rightarrow X_{new}=\{x_1^1,r\}$ or $X_{new}=\{r,x_2^1\}$, each with probability $1/2$
Crossover fraction: 0.7
Mutation fraction: 0.4
Elite count: 4
Population size: 100
Selection: $1/\sqrt{n}$
Stopping criterion: $f_{\min}^n-f_{\min}^{n-1}<10^{-6}$ for 5 consecutive generations
Then the GA stops after $N$ generations.
What is the expected value of $N$?
Is it possible to solve this problem analytically, without resorting to statistics/simulation?
A second question: if I run the GA for exactly 50 generations, what will the expected error (best cost) be at the end?
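For reference, here is a minimal simulation sketch of the setup above, in case anyone wants to estimate $E[N]$ empirically. Note two assumptions on my part: I interpret the $1/\sqrt{n}$ selection rule as rank-based selection with weight $1/\sqrt{\text{rank}}$ after sorting by cost (the question doesn't pin this down), and mutation resets one coordinate to a fresh uniform $r$ as defined above.

```python
import random

# GA parameters as stated in the question
POP, ELITE, CX_FRAC, MUT_FRAC = 100, 4, 0.7, 0.4

def f(x):
    """Cost function f(x1, x2) = (x1-0.5)^2 + (x2-0.5)^2."""
    return (x[0] - 0.5) ** 2 + (x[1] - 0.5) ** 2

def run_ga(max_gen=1000, tol=1e-6, patience=5):
    """Run the GA; return (generations used, best cost found)."""
    pop = [[random.random(), random.random()] for _ in range(POP)]
    # Assumed selection rule: rank-based, weight 1/sqrt(rank), rank 1 = best
    weights = [1 / (r + 1) ** 0.5 for r in range(POP)]
    prev_best, stall, best = None, 0, f(pop[0])
    for gen in range(1, max_gen + 1):
        pop.sort(key=f)                          # best individuals first
        nxt = [ind[:] for ind in pop[:ELITE]]    # elitism: keep 4 best
        while len(nxt) < POP:
            p1, p2 = random.choices(pop, weights=weights, k=2)
            if random.random() < CX_FRAC:        # arithmetic crossover
                r = random.random()
                child = [r * a + (1 - r) * b for a, b in zip(p1, p2)]
            else:
                child = p1[:]
            if random.random() < MUT_FRAC:       # reset one coordinate to uniform r
                child[random.randrange(2)] = random.random()
            nxt.append(child)
        pop = nxt
        best = min(map(f, pop))
        # stop if improvement < tol for `patience` consecutive generations
        if prev_best is not None and prev_best - best < tol:
            stall += 1
            if stall >= patience:
                return gen, best
        else:
            stall = 0
        prev_best = best
    return max_gen, best
```

Averaging `run_ga()[0]` over many seeds estimates $E[N]$; for the second question, `run_ga(max_gen=50, patience=10**9)` disables early stopping, so averaging its returned cost estimates the expected error after exactly 50 generations.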