Machine Learning: Can neural network software model continuous phenomena?


A computer (a calculator with extra devices) performs:

  1. Control flow (compare and jump, go to the next instruction)

  2. Reading and writing bytes to and from storage devices

  3. Evaluation of large Boolean functions

q1. How can such hardware simulate the human brain, which is continuous in nature? Or, can a neural network running on a PC approximate any continuous, nonlinear phenomenon?
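To make the second half of q1 concrete: the classical universal approximation theorem says, roughly, that even a one-hidden-layer network can approximate any continuous function on a compact interval to arbitrary accuracy. Below is a minimal numerical sketch of that idea, assuming NumPy is available; the layer width, learning rate and step count are arbitrary illustrative choices, not anyone's recommended settings.

```python
# A tiny one-hidden-layer tanh network fitted to sin(x) on [-pi, pi]
# by plain full-batch gradient descent (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(0)

# Training data: finitely many samples of a continuous target function.
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

# Parameters of a 1 -> 32 -> 1 network.
W1 = rng.normal(0.0, 1.0, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.5, (32, 1)); b2 = np.zeros(1)

lr = 0.05
for step in range(5000):
    # Forward pass.
    h = np.tanh(x @ W1 + b1)        # hidden activations, shape (200, 32)
    y_hat = h @ W2 + b2             # network output,     shape (200, 1)
    err = y_hat - y

    # Backward pass: gradients of the mean squared error.
    n = len(x)
    dW2 = h.T @ err / n
    db2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h**2)  # backprop through tanh
    dW1 = x.T @ dh / n
    db1 = dh.mean(axis=0)

    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("max abs error on the sample grid:", np.max(np.abs(y_hat - y)))
```

Of course, this is still a finite, discrete computation; the question of whether that counts as "modeling" a continuous phenomenon is exactly what is asked below.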

I just want to mention that mathematical experience shows the following:


  1. Continuous and discrete are fundamentally different things. In some cases we can reconstruct a continuous phenomenon from discrete data, for example recovering the sound we hear from its samples via the Sampling Theorem.

But the proof of this Nyquist–Shannon (Kotelnikov) formula assumes a couple of things:

p1. The signal is bandlimited (there are frequencies we cannot hear anyway).

p2. The signal is also timelimited (we assume this implicitly by ignoring samples outside the recorded time interval).

I don't want to go deeper, but even here it can be proved that p1 contradicts p2: a nonzero signal cannot be both bandlimited and timelimited. So even here we only get an approximation, by applying a correct theorem "a bit incorrectly" (a small numerical sketch of this follows below).
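Here is what that "slightly incorrect" application looks like in practice, as a minimal sketch assuming NumPy: a bandlimited tone is rebuilt from its samples via the Whittaker–Shannon interpolation formula, but the infinite sum is truncated to a finite window of samples, which is exactly the timelimiting assumption p2. The band limit, tone frequency and window size are arbitrary illustrative choices.

```python
# Whittaker-Shannon reconstruction from finitely many samples.
import numpy as np

B = 4.0                       # band limit in Hz
T = 1.0 / (2 * B)             # Nyquist sampling interval
f = lambda t: np.sin(2 * np.pi * 3.0 * t)   # a 3 Hz tone, within the band limit

n = np.arange(-40, 41)        # finitely many sample indices (truncated sum)
samples = f(n * T)

t = np.linspace(-1.0, 1.0, 1000)
# x(t) ~= sum_n x(nT) * sinc((t - nT) / T), using the normalized sinc.
recon = samples @ np.sinc((t[None, :] - n[:, None] * T) / T)

print("max reconstruction error:", np.max(np.abs(recon - f(t))))
```

The error is small but nonzero: the reconstruction is an approximation precisely because only a finite time window of samples was kept.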


  2. We as people are usually not able to integrate things analytically; we do it via finite approximation, and usually linearly. Examples (a small numerical sketch follows them):

ex1. It is impossible to solve y' = x^3 + y^3 analytically.

ex2. It is impossible to express an antiderivative of sinc in terms of elementary functions.
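As promised, a numerical sketch of both examples in plain Python: Euler's method (a linear, first-order step) for ex1, and the trapezoid rule for ex2. The step sizes and the initial condition y(0) = 0.1 are arbitrary illustrative choices.

```python
import math

# ex1: Euler steps for y' = x^3 + y^3 with y(0) = 0.1 on [0, 1].
x, y, h = 0.0, 0.1, 1e-3
while x < 1.0:
    y += h * (x**3 + y**3)   # linear approximation of each small step
    x += h
print("Euler estimate of y(1):", y)

# ex2: trapezoid rule for Si(2) = integral of sin(t)/t from 0 to 2.
def sinc(t):
    return 1.0 if t == 0.0 else math.sin(t) / t

n, a, b = 10_000, 0.0, 2.0
dt = (b - a) / n
total = 0.5 * (sinc(a) + sinc(b)) + sum(sinc(a + i * dt) for i in range(1, n))
print("Trapezoid estimate of Si(2):", total * dt)   # close to 1.6054
```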

I think we are really far away from creating any real AI.

q2. So I think a neural network is just another point of view on finite Boolean functions from discrete mathematics, nothing more. Am I wrong?

1 Answer

First of all, it is not clear that the behavior of the brain, as far as thinking is concerned, is continuous in nature. I am no expert on the brain, but my understanding of some of the research is that things such as memories are actually constructed from the brain "interpolating" from finite sets of data.

Second, all the continuous mathematics that has been developed was done by finite constructions on a finite set of symbols. Even without invoking pure formalism, you should understand that while we deal with concepts of smooth and continuous things, what we can prove about them is fundamentally finite. Artificial Intelligence is no more limited than Real Intelligence in this sense.