As far as I remember, a neural network cannot forget anything. Does this mean that no matter how evolved the network is, it will always give me back the right output when I feed it an input?
And by "right output" I mean a precise output, without getting a single bit wrong.
Neural networks only have a finite ability to "remember" (this is literally called capacity). Assuming you don't keep trying to update the network with new information, it would theoretically last forever: once the weights are fixed, the same input always produces the same output. But if you keep adjusting the weights to learn new and different things, those updates will eventually overwrite old patterns, a phenomenon known as catastrophic forgetting.
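A minimal sketch of this effect, using a tiny hand-rolled NumPy network (the architecture, learning rate, and the two toy tasks are all my own illustrative choices, not anything standard): fit the net to one function on one interval (task A), then keep training it only on a second, conflicting task B, and measure how the task-A error degrades.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny MLP: 1 input -> 16 tanh hidden units -> 1 output
W1 = rng.normal(0, 1.0, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2, h

def train(x, y, steps=2000, lr=0.05):
    """Plain gradient descent on mean squared error."""
    global W1, b1, W2, b2
    n = len(x)
    for _ in range(steps):
        pred, h = forward(x)
        err = pred - y                      # (n, 1)
        gW2 = h.T @ err / n; gb2 = err.mean(0)
        dh = (err @ W2.T) * (1 - h**2)      # backprop through tanh
        gW1 = x.T @ dh / n;  gb1 = dh.mean(0)
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1

def mse(x, y):
    pred, _ = forward(x)
    return float(np.mean((pred - y) ** 2))

# Task A: fit sin(x) on [-3, 0]
xa = np.linspace(-3, 0, 64)[:, None]; ya = np.sin(xa)
# Task B: a conflicting target on [0, 3], trained on its own later
xb = np.linspace(0, 3, 64)[:, None];  yb = -np.sin(xb)

train(xa, ya)
err_before = mse(xa, ya)   # task-A error right after learning task A

train(xb, yb)              # keep adjusting the same weights on task B only
err_after = mse(xa, ya)    # task-A error has grown: old pattern forgotten

print(f"task A error before: {err_before:.4f}, after: {err_after:.4f}")
```

The second call to `train` never shows the network task A again, so nothing stops the shared weights from drifting away from the task-A solution, and the task-A error climbs. Mitigations like replaying old data alongside the new (rehearsal) exist precisely because of this.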