one idea: the balance between order and chaos in giving rise to life.
another idea: ensemble methods performing really well - like the netflix prize winner, which was a blend of many models. (is this related to monte carlo methods too? lots of weak solutions averaged together?)
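a toy sketch of the "lots of weak solutions thrown together" intuition (all numbers and names here are illustrative, not from any real competition): each weak learner is just the truth plus independent noise, and averaging them shrinks the error roughly by 1/sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(0)
truth = 1.0
n_learners, n_trials = 100, 1000

single_errs, ensemble_errs = [], []
for _ in range(n_trials):
    # each "weak learner" = the truth plus independent unit-variance noise
    preds = truth + rng.normal(0.0, 1.0, size=n_learners)
    single_errs.append(abs(preds[0] - truth))       # one learner alone
    ensemble_errs.append(abs(preds.mean() - truth))  # the whole ensemble

print(np.mean(single_errs))    # roughly 0.8 (mean abs of a unit gaussian)
print(np.mean(ensemble_errs))  # roughly 0.08 - about 10x smaller for N=100
```

the catch, of course, is that this only works when the learners' errors are somewhat independent - which is exactly what the genome-reassortment and dropout ideas below are about.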
in this talk hinton talks about why sexual reproduction might be good to have in evolution. sex keeps reassorting chunks of the genome, so you never have a "single learner" in the evolutionary sense - rather an ever-shifting ensemble of individually unremarkable learners, always thrown back together into the same system.
he shows how you can do the same thing within a deep neural network (dropout): you constantly turn bits of the network on and off as it's learning, so subsets of it are each trying to learn some of the structure in the data, but the whole thing never gets too committed - it finds lots of *relatively* shallow partial solutions. and all these bits learn in the context of each other, so they regularize each other in that larger generalized space.
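a minimal numpy sketch of that mechanism (the function name and drop rate are my own, just for illustration): during training, a random subset of units is silenced on every pass, so each pass trains a different subnetwork; at test time the full network runs, which acts like averaging over all those subnetworks.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p_drop=0.5, train=True):
    """randomly silence units during training; at test time keep the
    full network, rescaling so expected activations stay the same."""
    if not train:
        return activations
    mask = rng.random(activations.shape) >= p_drop  # which units are "online"
    # inverted dropout: rescale survivors so the expectation is unchanged
    return activations * mask / (1.0 - p_drop)

h = np.ones(10)
print(dropout(h))               # a different random subnetwork each call
print(dropout(h, train=False))  # the whole "ensemble" at test time
```

the rescaling by 1/(1 - p_drop) is what lets the test-time network stand in for the average of the exponentially many trained subnetworks without any extra work.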
for one thing, it would be cool if the brain works this way - essentially living on the edge of chaos, so that which subcircuit is online during a given experience keeps getting thrown around by the dynamics (even at a microscale of time, like individual theta cycles). the brain would be keeping online a chaotically scattered but somewhat balanced subset of stuff, always flitting between different hypothesis spaces and driving updates in those, within the global weight-space of all the learning it's done so far.
this might partly explain why individual trials of brain activity look so random, and doug garrett's data showing that dynamical *variability* in brain signals relates to behavioral performance.
and back to the fundamental balance between chaos and order: the emergence of beauty and life is like standing tilted with your weight ahead of your feet, the unresolving energy of the sampling dynamics kind of bootstrapping itself.