Friday, April 6, 2012

The Elegant Simplicity of Chaos

     I've recently become cognizant of a buzzing sort of hum. Today, I determined that it's the upright freezer which sits between the kitchen counter and the stainless steel worktable, atop which our microwave oven is perched. Because this freezer was kept outside or downstairs in our previous dwellings, I'd never noticed that it runs almost continuously. It's a quiet, mechanical vibration, almost like a prolonged exhalation, a breathing out of thermal energy. Within its compressor, refrigerant gas is squeezed to high pressure; as it flows through the condenser coils, it sheds heat into the kitchen and condenses into a liquid. When the liquid refrigerant passes through an expansion valve, the pressure drops abruptly and the refrigerant evaporates in the low-pressure coils inside, absorbing heat from the freezer's interior and keeping its contents cold. A thermostat senses the internal temperature of the freezer, shutting off the motor and the flow of refrigerant when it reaches a certain temperature. It's a somewhat inefficient process.
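
     For the curious, here's a rough back-of-the-envelope sketch of what "inefficient" is up against. The numbers are my own assumptions (a freezer interior around -18 °C, a kitchen around 22 °C, an arbitrary 1000 joules of heat to remove), and the calculation is only the theoretical best case, the Carnot limit, which no real freezer comes close to achieving:

```python
# Toy estimate of the best-case (Carnot) performance of a freezer.
# The temperatures and heat load below are assumptions, not measurements.

T_cold = 255.0   # freezer interior, about -18 degrees C, in kelvin
T_hot = 295.0    # kitchen air, about 22 degrees C, in kelvin

# Carnot coefficient of performance for refrigeration:
# heat removed from the cold side per unit of work supplied.
cop_ideal = T_cold / (T_hot - T_cold)

# Minimum work needed to pump 1000 joules of heat out of the freezer.
heat_removed = 1000.0  # joules
work_minimum = heat_removed / cop_ideal

print(f"Ideal COP: {cop_ideal:.1f}")
print(f"Minimum work to remove {heat_removed:.0f} J: {work_minimum:.0f} J")
# A real freezer does considerably worse than this ideal,
# which is part of why it hums almost continuously.
```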

     According to the second law of thermodynamics, heat energy won't spontaneously flow from a colder body to a warmer body; this process requires a certain amount of work. The ratio of useful work extracted to the heat energy put in, expressed as a percentage, is the efficiency with which a heat engine converts heat into work. Heat can be defined as disordered energy. Entropy is a measure of the amount of energy unavailable to do work within a system, its tendency to progress from a state of order to disorder, a reflection of the system's multiplicity. The more disorder there is in the system, the higher its state of entropy. In other words, entropy is a measure of what we don't know about a system's behavior, all the conceivable possibilities regarding its state at a given point in time. For some of us, the idea of entropy conjures up images of chaos, the end of the world as we know it, a system spinning hopelessly out of control, careening into oblivion. Thinking of it from this perspective, a consideration of chaos is almost TMI (too much information). The truth of the matter is that, as sacrilegious as it sounds (especially if you recall anything you learned in high school physics class), without chaos and its paradoxical influence on entropy, the second law of thermodynamics couldn't possibly exist.
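
     Here's a minimal sketch of the bookkeeping behind that one-way rule (my own illustration, with made-up temperatures, not something lifted from Baranger's talk): when a parcel of heat Q leaves a warm body at temperature T_hot and arrives at a cool body at T_cold, the warm body's entropy drops by Q/T_hot and the cool body's rises by Q/T_cold. Because T_cold is smaller, the total always goes up; run the flow backward and the total would go down, which is exactly what the second law forbids unless work is added.

```python
# Entropy bookkeeping for a parcel of heat moving between two bodies.
# The temperatures are illustrative assumptions.

Q = 100.0        # joules of heat transferred
T_hot = 350.0    # warmer body, kelvin
T_cold = 280.0   # cooler body, kelvin

# Heat flowing spontaneously from hot to cold:
dS_hot = -Q / T_hot        # the hot body loses a little entropy
dS_cold = +Q / T_cold      # the cold body gains more than that
dS_total = dS_hot + dS_cold
print(f"Hot -> cold: total entropy change = {dS_total:+.3f} J/K (positive, allowed)")

# The reverse flow, cold to hot, with no work supplied:
dS_reverse = (Q / T_hot) - (Q / T_cold)
print(f"Cold -> hot: total entropy change = {dS_reverse:+.3f} J/K (negative, forbidden)")
```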

     The first law of thermodynamics, which seems pretty intuitive, states that the total energy of an isolated system, both its ordered and disordered energy, is conserved over time. It's an observation that neither matter nor energy can be created or destroyed by anything in the physical universe: they merely change place and form. The quantity of matter and energy remains constant. Consider what happens in a fireplace. A lighted match applied to a log results in combustion, producing fire which releases the wood's potential energy in the form of light and heat, transforming the log into ashes and smoke. Obviously, this law raises the question, "If nothing natural can create or destroy matter or energy, what or who created them? Where exactly does all this matter and energy come from, and how did it get here?" That's a topic for another day. Needless to say, even young children have relatively little difficulty grasping this concept.

     The second law of thermodynamics, the law of increased entropy, states that while the quantity of matter and energy of an isolated system remains constant over time, its quality will gradually deteriorate, resulting in energy that is unusable, increasing randomness and disorder within the system. It is a comparison of order with disorder. This law states that, for any cyclic process, the entropy of an isolated system (like the universe) will increase. Time's "arrow" predicts that an isolated system will progress irreversibly from one that is orderly to one that is more disorderly, with equilibrium being equivalent to maximum entropy. Accordingly, entropy cannot decrease with time. If I clean and organize my house today, I know that by the end of the weekend, various objects, such as shoes, clothes, and the television remote, will have found new and often interesting places to reside, very different from that earlier point in time. It's a simple matter of statistical mechanics. One of the problems with this second thermodynamic law, especially given the fact that it is rooted in mechanics, is that it directly violates the mechanical principle of reversibility, which holds that the underlying equations of motion work just as well run backward as forward!
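
     Here's a toy version of that "simple matter of statistical mechanics" (my own sketch, not Baranger's): start with every "shoe" tidily in the left half of the house, let them hop around at random, and watch the entropy, taken here as the log of the number of arrangements that look the same, drift up to its maximum and stay there.

```python
import math
import random

random.seed(0)

N = 100                      # number of objects ("shoes") in the house
boxes = [0] * N              # 0 = left half, 1 = right half; start perfectly ordered

def entropy(n_left):
    """Boltzmann-style entropy: log of the number of arrangements
    with this many objects on the left (in units where k = 1)."""
    return math.log(math.comb(N, n_left))

for step in range(2001):
    if step % 400 == 0:
        n_left = N - sum(boxes)
        print(f"step {step:4d}: {n_left:3d} on the left, entropy = {entropy(n_left):.2f}")
    # Pick a random object and let it wander to the other half of the house.
    i = random.randrange(N)
    boxes[i] = 1 - boxes[i]

# The entropy climbs from 0 toward its maximum (about 66.8 for N = 100,
# i.e. a roughly 50/50 split) and then just fluctuates around it.
```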

     Maybe the problem lies in our definition of disorder. In assessing a situation, we tend to compare what's considered ideal with what's actually occurring, what we think should happen with what is happening. We regard things and events that don't seem to be in their proper place as disorderly because they weren't necessarily what we'd predicted or hoped for. I don't know how or why the television remote ended up on top of the refrigerator. Now that I've found it, it doesn't seem important enough to investigate--I'll just chalk it up to someone else's forgetful behavior and spend the afternoon feeling mildly annoyed. It's a mismatching of rigid expectations with the inevitability of everything that's possible.

     Within a system, phase space is the set of all its possible states, all the possible values of its variables. If we know the state of the system at a given time, this can be represented by a point in phase space. Since a system is typically composed of many components and variables, it cannot be fully known. We assign probabilities and likelihoods, based on what we do know. This incomplete knowledge can be represented by a probability distribution of points in phase space. With time, each point in phase space will move. For example, at a given point in time, I might bake a blueberry cake, using a known quantity of blueberries and batter, poured into a Bundt pan. This cake represents a certain known portion of phase space. Assuming that the blueberries are dispersed uniformly throughout the batter within the pan, I can make educated guesses about where the blueberries are positioned within the batter. This is probability. I know that the blueberries are somewhere inside that cake, but not exactly where; their precise locations represent an infinite number of possibilities. Some blueberries could have sunk to the bottom or floated to the top, some could be clumped together, and some could be evenly distributed, but only on one side of the pan. The lack of information regarding what's possible within this system is its entropy, which is related logarithmically to the volume of phase space the batter occupies. The other big problem with the second law of thermodynamics lies in its assertion that entropy increases with time. This doesn't seem to hold true for the baking cake. If I dove inside it right now, I'd see that the blueberries were changing position, rising from the bottom atop air bubbles, contracting and expanding, moving away from their neighbors, making new blueberry neighbors. Despite all this activity, the system's overall phase space, the set of all its possible states, is conserved, and its volume and entropy remain constant over time, posing yet another paradox.
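
     A little sketch of what "related logarithmically to the volume" means in practice (again my own illustration: the grid of cells, the thirty blueberries, and the independence assumption are all made up for the sake of the example). Divide the Bundt pan into cells, suppose each blueberry could be sitting in any cell of some accessible region, and take the entropy to be the log of the number of ways things could be arranged. Knowing more shrinks the accessible volume and the entropy drops; knowing less inflates both.

```python
import math

cells_in_pan = 10_000        # imaginary grid of small cells inside the Bundt pan
n_blueberries = 30

def entropy(accessible_cells):
    """Missing information, in units where k = 1: the log of the number of
    ways the blueberries could be placed among the cells they might occupy
    (treating the berries as independent of one another)."""
    return n_blueberries * math.log(accessible_cells)

# If all I know is "they're somewhere in the batter":
print(f"whole pan accessible : S = {entropy(cells_in_pan):.1f}")

# If I somehow learn they've all sunk to the bottom quarter of the pan:
print(f"bottom quarter only  : S = {entropy(cells_in_pan // 4):.1f}")

# The second number is smaller by n_blueberries * ln(4):
# the entropy tracks the logarithm of the accessible volume.
```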

     Believe it or not, this quietly baking cake is in a state of mild chaos. Although the volume of the batter hasn't changed, its shape certainly has. It's gone from being a gooey mass of wet dough to a firm, crumbly cake. As it rises in the pan, the batter stretches and folds upon itself as a result of heat, moisture, air bubbles, and gluten strands, pushing and pulling, each point inside the volume diverging from neighboring points in exponential fashion, its structure becoming more complex with each passing moment. The cake is behaving like a fractal, the result of time evolution occurring within the phase space of the Bundt pan. Fractals are objects that are chaotic in space, rough-edged geometric figures which may or may not display self-similarity, the common denominator being that they cannot be simplified by analyzing them into successively smaller parts. Our bodies are prime examples of fractals: we can't be "reduced" under a microscope, no matter how powerful the lens. It is this "controlled" chaos which complicates (and defies) a cogent explanation of the universe in terms of elementary particles, for every particle is made up of smaller particles, which conceivably are composed of even smaller particles.
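
     To see what "cannot be simplified by analyzing them into successively smaller parts" looks like in numbers, here's a quick sketch of my own, using the classic Koch curve rather than cake batter: every time you measure the curve with a ruler one third the size, you find four new segments where there had been one, so the measured length just keeps growing. There is no scale at which the detail stops.

```python
# Measured length of the Koch curve as the ruler gets smaller.
# Each refinement replaces every straight segment with 4 segments,
# each 1/3 as long, so the total length is multiplied by 4/3.

length = 1.0      # start with a single straight segment of length 1
ruler = 1.0       # current segment (ruler) size

for level in range(9):
    print(f"level {level}: ruler = {ruler:.5f}, measured length = {length:.3f}")
    ruler /= 3.0
    length *= 4.0 / 3.0

# The length never settles down: zooming in never yields a simpler,
# smoother object, which is the hallmark of a fractal.
```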

     Another cool illustration of chaos is the "butterfly effect." By flapping its wings somewhere across the world, a butterfly can affect the weather on another continent at some point in the future. This is the same time-related chaos seen in the cake. Blueberries which were kissing in the batter each follow their own paths; with time, they drift apart, distancing themselves from one another while forming new alliances. Each blueberry's motion affects that of another. The uncertainty contained within these blueberries, no matter how small initially, grows exponentially with time, and eventually, "it will become so large that we will lose all useful knowledge of the state of the system. Even if we know the state of the system very precisely now, we cannot predict the future trajectory forever...we will have to give up at some point." (Baranger)
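
     Here's a tiny numerical illustration of that exponential growth of uncertainty (mine, not Baranger's, using the well-worn logistic map as a stand-in for real weather or batter): run the same simple rule twice, starting two "blueberries" a mere 10^-10 apart, and watch how quickly the two trajectories stop having anything to do with one another.

```python
# Two trajectories of the logistic map x -> r * x * (1 - x), started
# almost (but not quite) at the same point, in the chaotic regime.

r = 4.0
x_a = 0.2
x_b = 0.2 + 1e-10    # a difference far too small to measure in practice

for step in range(61):
    if step % 10 == 0:
        print(f"step {step:2d}: separation = {abs(x_a - x_b):.3e}")
    x_a = r * x_a * (1 - x_a)
    x_b = r * x_b * (1 - x_b)

# The separation grows roughly exponentially until it is as large as the
# whole interval; after that, knowing where we started tells us nothing.
```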

     It is our natural tendency as humans to despair over our lack of knowledge, especially when there is a very strong possibility that we may never fully understand some things. We tend to be reductionistic in our thinking. We become overwhelmed by the complexity of situations, where even simple questions seem to have ridiculously complicated answers. The state of not knowing leaves us feeling powerless. But our logic is full of holes. Instead of marveling at the elegant simplicity of chaos, its infinitesimal patterns within patterns, its infinite possibilities, we become frustrated and defensive, and resort to attempting to smooth out the rough edges. We confuse chaos with complexity: we make chaos complex. Although the two are related by a requirement for nonlinearity, what's chaotic isn't necessarily complex. To frame it another way, the fact that we're complex adaptive systems capable of changing to cope with our environment, as well as altering our environment to suit ourselves, implies that we're chaotic part of the time, but not vice-versa.

     Aside from being nonlinear, what makes systems complex? Just as we are greater than the sum of our parts, the constituents of complex systems are interdependent. Allowing the helium to leak out of a balloon doesn't change the properties of the remaining gas, and though you end up with a balloon that no longer floats, the system isn't profoundly impacted. Letting the air leak out of our lungs through a bullet hole in the chest, however, would lead to life-threatening, potentially irreversible complications. In this case, altering just one variable can result in the demise of the entire system. We are interconnected, inseparable from what we're made of, from what surrounds us. The structure within complex systems spans several different scales. Our heads, trunks, arms, and legs are composed of organ systems which arise from the formation of organs out of various tissues composed of cells which contain organelles that manufacture chromosomes out of DNA crafted from strings of nucleotides joined by molecular bonds created by atoms bumping into each other, their subatomic particles whizzing and whirring about, concealing entire universes within their infinite substance. Because of this particular feature, complex systems are capable of emergent behavior, an interplay between constituents which can't be explained by the properties of the constituents themselves. We don't walk just because we have legs inside our jeans; we do it because they're made of our genes, which also helped make all our other "parts."

     The ability to "self-organize" arises from these qualities of structure and emergence. Self-organization is the opposite of chaos. Although our DNA contains the same basic nucleic acid building blocks as that of other animals, our cells are programmed to differentiate into those of human beings. Herein lies the major difference between chaos and complexity. While complex systems always contain several scales, the more diverse ones, such as subatomic particles, may tend toward chaotic behavior, whereas those a notch above, such as cells, might demonstrate self-organization. This property of non-linear systems, the dividing line between chaos and non-chaos, is the edge of chaos. It's the point at which external controls change to permit modification and adaptation, making self-organization more likely to take place on a broader scale, influenced by the degree of cooperation and competition between scales. We're only as strong as our weakest link.
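
     The same humble logistic map from before gives a feel for that dividing line (again, my own toy, not a claim about cells or particles, and the chosen r values are just illustrative): turn its control knob r up gradually and the long-run behavior goes from settling into a single steady value, to neat repeating cycles, to full-blown chaos somewhere past r of about 3.57. The edge sits in between.

```python
# Long-run behavior of the logistic map for a few settings of its control
# parameter r, from orderly to chaotic.

def long_run_values(r, n_transient=1000, n_keep=8):
    x = 0.5
    for _ in range(n_transient):          # let the transients die away
        x = r * x * (1 - x)
    values = []
    for _ in range(n_keep):               # sample the settled behavior
        x = r * x * (1 - x)
        values.append(round(x, 4))
    return values

for r in (2.8, 3.2, 3.5, 3.9):
    print(f"r = {r}: {long_run_values(r)}")

# r = 2.8 settles to one value, r = 3.2 to a 2-cycle, r = 3.5 to a 4-cycle;
# by r = 3.9 the values never repeat -- order on one side, chaos on the other.
```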

     In our haste to smooth out the rough edges, we're actually increasing a system's entropy. It's like drawing a line around a sea sponge; in doing so, we effectively increase its volume by ignoring all its holes--we make the sponge bigger! The fractal nature of the sponge, which was created by chaos, freaks us out because we can't keep track of its details. Instead of grooving on this natural chaos, our go-to remedy is to smooth the sponge's volume back to the last point at which we hadn't yet lost sight of its branchpoints of differentiation, the trade-off being "a loss of knowledge, [an increase in] the effective volume of distribution, [and] hence the entropy." (Baranger) In similar fashion, we make our problems in life bigger. Chaos makes things messy. Because we become frustrated and overwhelmed, we give up trying to "know" the situation at hand. The good news is that because of chaos, our dimensionless lack of knowledge (entropy) is completely subjective: it has nothing to do with our understanding of fundamental laws of the behavior of particles. Although some would argue that entropy can be quantified and expressed as absolute certainty in the presence of large enough numbers, this certainty is still based in probability. It's really not objective. In life, we can't be absolutely certain of anything. All we can do is take a moment to enjoy the sweet smell of chaos.
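
     One last sketch of that smoothing trade-off, in my own toy terms rather than Baranger's: scatter the occupied points of a spongy, hole-riddled set across a grid, then coarse-grain by declaring a whole block occupied if anything inside it is. The smoothed-out version covers far more cells, and since the entropy here is just the log of the occupied volume, it goes up.

```python
import math
import random

random.seed(1)

SIZE = 64      # fine grid is SIZE x SIZE cells
BLOCK = 8      # coarse-graining lumps cells into BLOCK x BLOCK blocks

# A sponge-like set: sparsely occupied cells scattered over the grid.
occupied = {(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(300)}

fine_volume = len(occupied)

# Coarse-grain: a block counts as fully occupied if it holds any occupied cell.
blocks = {(x // BLOCK, y // BLOCK) for (x, y) in occupied}
coarse_volume = len(blocks) * BLOCK * BLOCK

print(f"fine-grained volume  : {fine_volume:5d} cells, entropy = {math.log(fine_volume):.2f}")
print(f"coarse-grained volume: {coarse_volume:5d} cells, entropy = {math.log(coarse_volume):.2f}")

# Smoothing over the holes inflates the effective volume,
# and the log of that volume -- the entropy -- rises with it.
```
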
Baranger, M. "Chaos, Complexity, and Entropy: A Physics Talk for Non-Physicists."

8 comments:

  1. "A physics talk for non-physicists"??
    I think that most scientists would understand this article, but would most non-scientists? I found that I had to pay very close attention to be able to keep up, and I not only studied thermodynamics at university; I've also made my own attempts to explain chaos. Nevertheless, I found this a fascinating discussion.

    1. You're right, but I've still got to give Michel Baranger kudos for trying! Chaos, complexity, and entropy...they are all such fascinating topics in and of themselves, and when you consider their inter-relatedness, WOW!

    2. I do believe you are right in saying it's our natural tendency to despair over our lack of knowledge, especially when there is a very strong possibility that we may never fully understand some things.

      Sometimes Chaos is so wonderful!

  2. I've often considered how woefully inadequate the Big Bang theory is at explaining the birth of the universe. Like you say, the first law of thermodynamics states that the amount of energy in a system must remain constant and cannot be created or destroyed. So the logical follow on from that is that all the energy that is in the universe now, always was there.

    The big bang says that before that period of rapid inflation, the universe was incredibly dense and hot - so....it's already there? That doesn't explain the start!

    As for entropy, it's really interesting to ponder how complex systems like the human body or planetary systems come to be. I suppose it ultimately comes down to a struggle between gravity and entropy.

    Another way to look at it is that humans are complex systems that are actually agents of increasing disorder - we know how to make a mess!

    I like how you point out that order is something that's generally observed at larger scales. The more microscopic you go the weirder things get. On the sub-atomic level quantum mechanics starts having an effect and for me, such unpredictable dynamics echo with chaos theory very well.

    Interesting how nobody can disprove chaos theory :-P. Great post, I'm glad I have a little bit of background in this area, seems like you really know your stuff, as said above, possibly a bit technical for laymen.

    1. Kev, Thanks for your comments! I especially love your observations about humans and the messes we make...isn't that First Law of TD a trip? The further I go in life, the more receptive I become to "I don't know." Kris

  3. This comment has been removed by the author.

  4. Thanks for your comment and tips Krysiu:) I think I should eat pounds of almonds to get it right, for some reason even magnesium pills work really slow for me. :)

  5. Don't know how I missed this before, Kris. Absolutely brilliant. I have no background in science whatever, but I feel intuitively the truth of your observations. What we can't control, we call chaos. The sense of chaos is in direct proportion to the need to impose order.
