A physics talk for non-physicists by Michael Baranger
“The twenty-first century is starting with a huge bang. For the person in the street, the bang is about a technical revolution that may eventually dwarf the industrial revolution of the 18th and 19th centuries, having already produced a drastic change in the rules of economics. For the scientifically minded, one aspect of this bang is the complexity revolution, which is changing the focus of research in all scientific disciplines, for instance human biology and medicine. What role does physics, the oldest and simplest science, have to play in this? Being a theoretical physicist to the core, I want to focus on theoretical physics. Is it going to change also?
Twentieth-century theoretical physics came out of the relativistic revolution and the quantum mechanical revolution. It was all about simplicity and continuity (in spite of quantum jumps). Its principal tool was calculus. Its final expression was field theory.
Twenty-first-century theoretical physics is coming out of the chaos revolution. It will be about complexity and its principal tool will be the computer. Its final expression remains to be found. Thermodynamics, as a vital part of theoretical physics, will partake in the transformation.”
The author describes calculus as being limited to smooth functions that can be approximated by a series of straight lines. “For at least 200 years, theoretical science fed on this calculus idea. The mathematicians invented concepts like continuity and analyticity to describe smoothness more precisely. And the discovery of Calculus led to an explosion of further discoveries. The branch of mathematics so constituted, known as Analysis, is not only the richest of all the branches, but also by far the most useful for applications to quantitative science, from physics to engineering, from astronomy to hydrodynamics, from materials science to oceanography. Theoretical scientists became applied mathematicians, and applied mathematicians are people for whom analysis is second nature. Integrals, differential equations, series expansions, integral representations of special functions, etc., these are the tools that calculus has provided and that are capable of solving an amazing variety of problems in all areas of quantitative knowledge.”
Scientists and engineers took this smoothness assumption for granted for so long that they forgot about it completely. “Yes, the enormous success of calculus is in large part responsible for the decidedly reductionist attitude of most twentieth century science, the belief in absolute control arising from detailed knowledge. Yes, the mathematicians were telling us all along that smooth curves were the exception, not the rule: we did not listen!”
Chaos is the exception that finally broke through. “Chaos is the rediscovery that calculus does not have infinite power. In its widest possible meaning, chaos is the collection of those mathematical truths that have nothing to do with calculus. And this is why it is distasteful to twentieth century physicists.”
Chaos can exist in both time and space. Chaos in space is called a fractal. “There are many possible definitions of the word fractal. A very loose and general definition is this: a fractal is a geometric figure that does not become simpler when you analyze it into smaller and smaller parts. Which implies, of course, that it is not smooth.”
Fractals exist in mathematics and geometry, and they appear almost everywhere in nature.
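A toy calculation (my own illustration, not from the talk) makes “does not become simpler” concrete. The Koch curve is built by repeatedly replacing every straight segment with four segments one third as long, so each refinement adds structure instead of removing it, and its self-similarity dimension sits strictly between that of a line and a plane:

```python
import math

def similarity_dimension(copies: int, scale: int) -> float:
    """Self-similarity dimension D = log(copies) / log(scale):
    the figure is made of `copies` copies of itself, each
    shrunk by a factor of `scale`."""
    return math.log(copies) / math.log(scale)

def koch_length(n: int) -> float:
    """Length of the Koch curve after n refinements: each step
    replaces every segment with 4 segments 1/3 as long."""
    return (4.0 / 3.0) ** n

# 4 copies at 1/3 scale: D = log 4 / log 3, about 1.26, so the
# curve is "more than a line but less than a plane".
d = similarity_dimension(4, 3)

# Zooming in never simplifies the curve; its total length
# grows without bound as the refinement deepens.
lengths = [koch_length(n) for n in range(6)]
```

The same formula returns the familiar answers for smooth figures: a segment cut into 3 copies at 1/3 scale has dimension exactly 1, which is one way to see what makes fractals different.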
Chaos in time is the product of dynamical systems: systems whose configuration can change over time. “The signature of time-chaos is something called “sensitivity to initial conditions”.”
“Sensitivity to initial conditions is the death of reductionism. It says that any small uncertainty that may exist in the initial conditions will grow exponentially with time, and eventually (very soon, in most cases) it will become so large that we will lose all useful knowledge of the state of the system. Even if we know the state of the system very precisely now, we cannot predict the future trajectory forever. We can do it for a little while, but the error grows exponentially and we have to give up at some point.”
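A minimal numerical sketch of this exponential error growth (my own illustration; the logistic map at r = 4 is a standard textbook example of chaos, not something quoted from the talk): run two copies of the same system from starting points that differ by one part in ten billion and track the gap between them.

```python
def logistic(x: float, r: float = 4.0) -> float:
    """One step of the logistic map x -> r * x * (1 - x),
    fully chaotic at r = 4."""
    return r * x * (1.0 - x)

def separation(x0: float, eps: float, steps: int) -> list:
    """Distance |x_n - y_n| between two trajectories whose
    initial conditions differ by eps."""
    x, y, gaps = x0, x0 + eps, []
    for _ in range(steps):
        x, y = logistic(x), logistic(y)
        gaps.append(abs(x - y))
    return gaps

# An initial uncertainty of 1e-10 grows roughly exponentially
# (doubling per step on average) until it saturates at order 1,
# at which point all predictive power is lost.
gaps = separation(0.2, 1e-10, 60)
```

Within a few dozen iterations the error reaches order one: precise knowledge of the state now buys only a short window of prediction, exactly as the quote says.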
Time and space chaos are closely related. “Every chaotic dynamical system is a fractal-manufacturing machine. Conversely, every fractal can be seen as the possible result of the prolonged action of time-chaos.”
Chaos can arise even in very simple systems, and it is always nonlinear.
“At the present time, the notion of complex system is not precisely delineated yet. This is normal. As people work on complex systems more and more, they will gain better understanding of their defining properties. Now, however, the idea is somewhat fuzzy and it differs from author to author. But there is fairly complete agreement that the “ideal” complex systems, those which we would like most to understand, are the biological ones, and especially the systems having to do with people: our bodies, our groupings, our society, our culture. Lacking a precise definition, we can try to convey the meaning of complexity by enumerating what seem to be the most typical properties. Some of these properties are shared by many non-biological systems as well.”
- Complex systems contain many constituents interacting nonlinearly.
- The constituents of a complex system are interdependent.
- A complex system possesses a structure spanning several scales. At every scale we find a structure.
- A complex system is capable of emerging behavior. “Emergence happens when you switch the focus of attention from one scale to the coarser scale above it. A certain behavior, observed at a certain scale, is said to be emergent if it cannot be understood when you study, separately and one by one, every constituent of this scale, each of which may also be a complex system made up of finer scales. Thus the emerging behavior is a new phenomenon special to the scale considered, and it results from global interactions between the scale’s constituents.”
- A complex system is capable of self-organization. “The combination of structure and emergence leads to self-organization, which is what happens when an emerging behavior has the effect of changing the structure or creating a new structure.”
- A complex adaptive system can exhibit self-reproduction. “There is a special category of complex systems which was created especially to accommodate living beings. They are the complex adaptive systems. As their name indicates, they are capable of changing themselves to adapt to a changing environment. They can also change the environment to suit themselves. Among these, an even narrower category are self-reproducing: they know birth, growth, and death.”
- Complexity involves an interplay between chaos and non-chaos.
- Complexity involves an interplay between cooperation and competition. “Once again this is an interplay between scales. The usual situation is that competition on scale n is nourished by cooperation on the finer scale below it (scale n+1).”
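A toy model helped me see emergence and self-organization concretely. In Conway’s Game of Life (my example, not Baranger’s), each cell obeys purely local rules about its eight neighbours, yet at the coarser scale a coherent object, the glider, emerges and rebuilds itself one diagonal step away every four generations:

```python
from collections import Counter

def step(cells: set) -> set:
    """One Game-of-Life generation on a set of live (x, y) cells.
    The rules mention only a cell's 8 neighbours; nothing in them
    refers to gliders or any other large-scale structure."""
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell lives next generation if it has exactly 3 live
    # neighbours, or is alive now with exactly 2.
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in cells)}

# The glider: five cells whose pattern reappears, shifted by
# (1, 1), after four generations. No single rule describes this
# self-propagating behavior; it emerges at the coarser scale.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
g = glider
for _ in range(4):
    g = step(g)
assert g == {(x + 1, y + 1) for (x, y) in glider}
```

The glider is exactly the kind of scale-crossing behavior the list describes: studying one cell (the finer scale) in isolation tells you nothing about the moving shape.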
I find the author’s discussion of entropy unsatisfactory and unconvincing. After several pages of derivation and discussion, he writes, “The conclusion is that our dimensionless entropy, which measures our lack of knowledge, is a purely subjective quantity. It has nothing to do with the fundamental laws of particles and their interactions. It has to do with the fact that chaos messes up things; that situations that were initially simple and easy to know in detail, will become eventually so complicated, thanks to chaos, that we are forced to give up trying to know them.”
Having learned about entropy from thermodynamics, where it can be derived from simple considerations of the Carnot cycle, entropy feels like a real, physical property to me. One implication of thermodynamic entropy is the impossibility of perpetual motion: energy is dissipated as heat. Perhaps I’m not knowledgeable enough to see the difference between thermodynamic entropy and information entropy. Perhaps I still have more to learn, or unlearn…
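To sort this out for myself: the “dimensionless entropy” Baranger works with is the information (Shannon/Gibbs) entropy, and for a uniform distribution over W microstates it reduces to ln W, which is Boltzmann’s thermodynamic S = k_B ln W apart from the constant k_B. A quick sketch (my own, not from the paper):

```python
import math

def shannon_entropy(probs: list) -> float:
    """Information entropy H = -sum p ln p (in nats), a measure
    of how much we do not know about which microstate holds."""
    return -sum(p * math.log(p) for p in probs if p > 0)

W = 16  # number of equally likely microstates (hypothetical)

# Total ignorance: the uniform distribution gives H = ln W,
# matching Boltzmann's S = k_B ln W up to the factor k_B.
h_uniform = shannon_entropy([1.0 / W] * W)

# Complete knowledge of the microstate gives H = 0: in this
# view, entropy measures the observer's missing information.
h_certain = shannon_entropy([1.0] + [0.0] * (W - 1))
```

So the two notions coincide in form; what Baranger argues, and what I remain unsure about, is whether the “missing information” reading makes entropy subjective rather than a property of the system itself.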
Chaos, Complexity and Entropy: A physics talk for non-physicists, Michael Baranger, MIT and NECSI, MIT-CTP-3112