While walking with Bilyana this morning, we took to discussing complex dynamic systems, and the capability of present-day science to address them. Such systems are distinguished by dense webs of interaction and interdependency among their parts. You can’t look at the behaviour of a few neurons and understand the functioning of a brain; likewise, you can’t look at a few ocean currents or a few cubic miles of atmosphere and understand the climatic system. These systems resist being broken down and studied piece by piece, and that is why they pose such a challenge to a scientific method generally based on doing exactly that.
Murray Gell-Mann, the physicist who proposed the existence of quarks while at Caltech, discusses complex dynamic systems at length in his excellent book The Quark and the Jaguar. Among the most interesting parts of the book is the discussion of how difficult it is to categorize things as simple or complex – that is, to establish what the conditions of complexity actually are. Some kinds of problems, for instance, are extremely difficult for human beings – taking the sixth root of some large number, say – but trivial for computers. Conversely, computers have a terrible time with some tasks that people perform without difficulty. The comparison of human and machine capability is apt because of the difficulties involved in trying to understand something like the climatic system and determine the effects that anthropogenic climate change will have upon it. Increasingly, our approach to studying such things is based on computer modelling.
Whether one is studying an economy, the cognitive processes of a cricket, or the dynamics of a thunderstorm, modelling is an essential tool for understanding complex systems. At the same time, it introduces a level of abstraction that complicates the status of that understanding. First, the results are likely to be highly probabilistic: we can work out roughly how many bolts of lightning a storm with certain characteristics might produce, but we cannot predict the behaviour of any particular storm with exactitude. Secondly, we may not understand why the behaviour we predict occurs. Some modern aircraft use neural networks and evolutionary algorithms, driving arrays of actuators, to dampen turbulence along their wings. Because the behaviour is learned rather than programmed, it doesn’t reflect an understanding of the fluid dynamics involved in the classical sense of the word ‘understanding.’
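The probabilistic character of such predictions can be sketched with a toy Monte Carlo simulation. Everything here is invented for illustration – the ‘storm’ is just a Poisson process with a made-up mean strike rate – but it shows how a model can pin down an ensemble average while saying little about any single storm:

```python
import math
import random

def simulate_storms(mean_strikes: float, trials: int = 10_000,
                    seed: int = 42) -> list[int]:
    """Draw lightning-strike counts for many hypothetical storms that
    share the same characteristics, modelled as a Poisson process."""
    rng = random.Random(seed)
    threshold = math.exp(-mean_strikes)
    counts = []
    for _ in range(trials):
        # Knuth's method for sampling a Poisson-distributed count.
        k, p = 0, 1.0
        while p > threshold:
            k += 1
            p *= rng.random()
        counts.append(k - 1)
    return counts

counts = simulate_storms(mean_strikes=12.0)
average = sum(counts) / len(counts)
# The ensemble average sits near the assumed mean rate of 12, even
# though individual simulated storms vary widely around it.
```

The individual draws scatter widely even though the ensemble statistics are stable – which is roughly the position storm and climate modellers find themselves in.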
I predict that the most significant scientific advances of the next hundred years or so will relate to complex dynamic systems. They exist in such important places – the web of chemical reactions surrounding DNA replication and protein synthesis, for example – and they are so imperfectly understood at present. It will be interesting to watch.
Some areas of science already rely heavily on probability and modelling. Look at drug trials. Researchers don’t necessarily know how or why a molecule works, but they use double-blind trials to accept or reject, to a certain level of confidence, the hypothesis that it does nothing medically useful.
They don’t do anything nearly so grand as to try and understand the biochemistry of the body as a whole and how the drug fiddles with it. They just develop a body of data sufficient to serve their purpose.
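The accept-or-reject logic of such a trial can be sketched with a two-proportion z-test – not necessarily the analysis any particular trial uses, and with all the numbers invented for illustration:

```python
import math

def two_proportion_z(successes_a: int, n_a: int,
                     successes_b: int, n_b: int) -> float:
    """Z-statistic for the difference between two recovery rates,
    using the pooled estimate under the null hypothesis that the
    drug and placebo arms are identical."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical trial: 60 of 100 recover on the drug, 45 of 100 on placebo.
z = two_proportion_z(60, 100, 45, 100)
# |z| > 1.96 corresponds to rejecting "the drug does nothing"
# at roughly the 95% confidence level.
significant = abs(z) > 1.96
```

Nothing in the calculation says anything about the underlying biochemistry; it only asks whether the observed difference is too large to plausibly be chance.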
Not that you could do something comparable for climate change, really. We don’t have a few thousand other Earths to test out the effects of industrial and transport technologies on, over the course of decades or centuries…
Researchers from San Diego are using supercomputers to accurately predict the shape of the Sun’s corona, based on magnetic field data from the photosphere. (link)
Via /.
HYDROGEN: A light, odorless gas, which, given enough time, turns into people.
— John P. Wiley Jr.
The Bunimovich stadium, a chaotic billiard, demonstrates how tiny differences in initial conditions can produce hugely different outcomes.
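That sensitivity is easy to reproduce in a few lines with the logistic map – a far simpler chaotic system than the stadium billiard, but one exhibiting the same effect:

```python
def logistic(x: float, r: float = 4.0) -> float:
    """One iteration of the logistic map; r = 4 is the fully chaotic regime."""
    return r * x * (1.0 - x)

def iterate(x0: float, steps: int) -> float:
    """Apply the map repeatedly to a starting point."""
    x = x0
    for _ in range(steps):
        x = logistic(x)
    return x

# Two starting points differing by one part in a billion...
a = iterate(0.2, 50)
b = iterate(0.2 + 1e-9, 50)
# ...typically end up at unrelated points after fifty iterations,
# because the tiny initial error roughly doubles at every step.
divergence = abs(a - b)
```

Past a horizon set by the growth rate of small errors, the two trajectories carry no usable information about one another – the same basic limit that constrains long-range weather prediction.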