I really like this law. Anyone know how true it is?
A complex system that works is invariably found to have evolved from a simple system that worked. The inverse proposition also appears to be true: A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system.
It seems to say something like: 'You can't solve an optimization problem directly; you have to iterate.'
Although I can think of immediate counterexamples when it's stated like that, it brings to mind the Abel–Ruffini theorem (the result often associated with Galois, who later explained which equations are solvable): you can't solve the general quintic equation by arithmetic plus extraction of roots. And so I wonder if that itself might be an example of a much more general truth along the lines of 'general solutions are always more complicated than the problems they solve, and so the direct solution to any sufficiently complicated problem is usually uncomputable except by approximation methods'.
So, can anyone think of any complex systems designed from scratch that worked?
Or complex problems that have easily computed solutions?
But actually Gall's Law says more than that. It says that if you've got a starting point of great complexity, your iteration process will work better if you give up and start again from something simple that works.
That again seems plausible in the context of numerical solution of polynomials. Pick a random number as the start of your search and, unless you're unreasonably lucky, the search will take longer than if you had seeded it with a sensible starting point.
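A minimal sketch of that seeding intuition, using Newton's method on a quintic (my choice of method and polynomial, not anything specified above): a seed near the root converges in a handful of iterations, while a seed far away spends many steps just walking back toward it.

```python
def newton(f, df, x0, tol=1e-12, max_iter=1000):
    """Newton's method: iterate x -> x - f(x)/f'(x) until |f(x)| < tol.

    Returns the final estimate and the number of iterations used.
    """
    x = x0
    for i in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x, i
        x -= fx / df(x)
    return x, max_iter

# p(x) = x^5 - x - 1 has exactly one real root, near x = 1.1673.
p  = lambda x: x**5 - x - 1
dp = lambda x: 5 * x**4 - 1

root_a, iters_a = newton(p, dp, 1.0)    # seed close to the root
root_b, iters_b = newton(p, dp, 50.0)   # seed far from the root
```

Both runs find the same root, but the distant seed needs several times as many iterations: far from the root, each Newton step on this quintic only shrinks x by a factor of roughly 4/5, so a bad starting point costs a long preamble before the fast quadratic convergence kicks in.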
So, can anyone think of any complex systems designed from scratch which, although they didn't work to start with, quickly converged to working systems without going back through previous known working designs?