About Rationally Speaking


Rationally Speaking is a blog maintained by Prof. Massimo Pigliucci, a philosopher at the City University of New York. The blog reflects the Enlightenment figure Marquis de Condorcet's idea of what a public intellectual (yes, we know, that's such a bad word) ought to be: someone who devotes himself to "the tracking down of prejudices in the hiding places where priests, the schools, the government, and all long-established institutions had gathered and protected them." You're welcome. Please notice that the contents of this blog can be reprinted under the standard Creative Commons license.

Thursday, October 25, 2012

Essays on emergence, part IV


by Massimo Pigliucci

The previous three installments of this series have covered Robert Batterman’s idea that the concept of emergence can be made more precise by the fact that emergent phenomena such as phase transitions can be described by models that include mathematical singularities; Elena Castellani’s analysis of the relationship between effective field theories in physics and emergence; and Paul Humphreys’ contention that a robust anti-reductionism needs a well articulated concept of emergence, not just the weaker one of supervenience.

For this last essay we are going to take a look at Margaret Morrison’s “Emergence, Reduction, and Theoretical Principles: Rethinking Fundamentalism,” published in 2006 in Philosophy of Science. The “fundamentalism” in Morrison’s title has nothing to do with the nasty religious variety; it refers instead to the reductionist program of searching for the most “fundamental” theory in science. The author, however, wishes to recast fundamentalism in this sense: foundational phenomena like localization and symmetry breaking will turn out to be crucial for understanding emergent phenomena and, more interestingly, for justifying the rejection of radical reductionism on the grounds that emergent behavior is immune to changes at the microphysical level (i.e., the “fundamental” details are irrelevant to the description and understanding of the behaviors instantiated by complex systems).

Morrison begins with an analysis of the type of “Grand Reductionism” proposed by physicists like Steven Weinberg, where a few (ideally, one) fundamental laws will provide, in principle, all the information one needs to understand the universe [1]. She brings up the by now familiar objection raised in the ’70s by physicist Philip Anderson, who argued that the “constructionist” project (i.e., the idea that one can begin with the basic laws and derive all complex phenomena) is hopelessly misguided. Morrison brings this particular discussion into focus with a detailed analysis of a specific example, which I will quote extensively:

“The nonrelativistic Schrödinger equation presents a nice picture of the kind of reduction Weinberg might classify as ‘fundamental.’ It describes in fairly accurate terms the everyday world and can be completely specified by a small number of known quantities: the charge and mass of the electron, the charges and masses of the atomic nuclei, and Planck’s constant. Although there are things not described by this equation, such as nuclear fission and planetary motion, what is missing is not significantly relevant to the large scale phenomena that we encounter daily. Moreover, the equation can be solved accurately for small numbers of particles (isolated atoms and small molecules) and agrees in minute detail with experiment. However, it can’t be solved accurately when the number of particles exceeds around ten. But this is not due to a lack of calculational power, rather it is a catastrophe of dimension ... the schemes for approximating are not first principles deductions but instead require experimental input and local details. Hence, we have a breakdown not only of the reductionist picture but also of what Anderson calls the ‘constructionist’ scenario.”
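For the mathematically inclined, the equation Morrison is describing has a standard textbook form (my reconstruction, not drawn from her paper): for electrons at positions r_i and nuclei at positions R_k, with nuclear charges Z_k e and masses M_k, the time-independent problem reads, in Gaussian units,

$$
\left[ -\sum_{i}\frac{\hbar^{2}}{2m_{e}}\nabla_{i}^{2} \;-\; \sum_{k}\frac{\hbar^{2}}{2M_{k}}\nabla_{k}^{2} \;+\; \sum_{i<j}\frac{e^{2}}{|\mathbf{r}_{i}-\mathbf{r}_{j}|} \;-\; \sum_{i,k}\frac{Z_{k}e^{2}}{|\mathbf{r}_{i}-\mathbf{R}_{k}|} \;+\; \sum_{k<l}\frac{Z_{k}Z_{l}e^{2}}{|\mathbf{R}_{k}-\mathbf{R}_{l}|} \right]\Psi = E\,\Psi
$$

The only inputs are indeed the handful of constants she lists. The “catastrophe of dimension” is then easy to state: the wavefunction Ψ depends on all 3N particle coordinates at once, so discretizing each coordinate on even a coarse grid of M points requires storing on the order of M^(3N) numbers. With M = 10 and only ten particles that is already 10^30 amplitudes, which is why the problem outruns brute-force computation long before it outruns the underlying theory.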

Morrison then turns to something that has now become familiar in our discussions on emergence: localization and symmetry breaking as originators of emergent phenomena, where emergence specifically means “independence from lower level processes and entities.” The two examples she dwells on in some detail are crystallization: “the electrons and nuclei that make up a crystal lattice do not have rigidity, regularity, elasticity — all characteristic properties of the solid. These are only manifest when we get ‘enough’ particles together and cool them to a low ‘enough’ temperature”; and superconductivity: “The notion of emergence relates to superconductivity in the following way: In the N to infinity limit of large systems (the macroscopic scale) matter will undergo mathematically sharp, singular phase transitions to states where the microscopic symmetries and equations of motion are in a sense violated. ... [as Anderson put it] The whole becomes ‘not only more than but very different from the sum of its parts.’”
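Morrison’s crystal example can actually be played with on a laptop. The sketch below is my own minimal illustration, not something from her paper, and the 2D Ising model is of course a caricature of a real solid; still, it shows the essential point: no individual spin has a magnetization phase, yet a collectively magnetized state appears once we get “enough” spins together at a low “enough” temperature.

```python
import numpy as np

rng = np.random.default_rng(0)

def ising_magnetization(L=32, T=1.5, steps=300_000):
    """Metropolis sampling of a 2D Ising lattice with periodic boundaries.

    Returns the absolute magnetization per spin, an order parameter that
    no single spin possesses: it is a property of the ensemble.
    """
    spins = rng.choice([-1, 1], size=(L, L))
    for _ in range(steps):
        i, j = rng.integers(0, L, size=2)
        # Energy cost of flipping spin (i, j), given its four neighbors
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nb
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1
    return abs(spins.mean())

# Below the critical temperature (about 2.27 in these units) a net
# magnetization emerges; above it, the up-down symmetry is restored.
for T in (1.5, 2.0, 3.0):
    print(f"T = {T}: |m| ~ {ising_magnetization(T=T):.2f}")
```

On any finite lattice the transition is blurred; it becomes mathematically sharp only in the N-to-infinity limit, which is precisely the point Anderson and Morrison press: the broken-symmetry phase is a property of the large system, not of any of its constituents.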

Morrison concludes the central part of her paper by clearly stating that we ought to take seriously the limits of reductionism “and refrain from excusing its failures with promissory notes about future knowledge and ideal theories.” Amen to that, sister.

The rest of the paper deals with some more specifically philosophical issues raised by the reductionism-emergence debate, one of which is the “wholes-parts” problem, referring to how — metaphysically — we should think about parts and the wholes they make up. But Morrison points out that emergence does not entail a change in the ontological status of parts (the parts don’t cease to exist when they form wholes). Rather, the problem is that emergent properties disappear if a system crosses a lower threshold of complexity. An example is superfluidity, which manifests itself as a collective effect of large ensembles of particles at low energy. Superfluidity cannot be rigorously deduced from the laws of motion that describe the behavior of the individual particles, and the phenomenon itself simply disappears when the system is taken apart. As Morrison sums up: “These states or quantum ‘protectorates’ and their accompanying emergent behavior demonstrate that the underlying microscopic theory can easily have no measurable consequences at low energies.”

Another concept tackled by Morrison, and one we have already encountered, is the use of renormalization theory as a way to describe emergent phenomena. She makes it clear that she doesn’t think of renormalization as just a mathematical trick, and certainly not as a friend of reductionism: “renormalizability, which is usually thought of as a constraint on ‘fundamental’ quantum field theories can be reconceived as an emergent property of matter both at quantum critical points and in stable quantum phases. ... [Indeed] what started off as a mathematical technique has become reinterpreted, to some extent, as evidence for the multiplicity of levels required for understanding physical phenomena.”
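A toy version of what renormalization does can be given in the one case where the calculation is exact, the zero-field 1D Ising chain (again, this is a standard textbook illustration of mine, not an example from Morrison’s paper). Summing out every other spin leaves a chain of the same form with a weaker coupling, tanh K′ = tanh²K, and iterating this “decimation” shows which microscopic details survive coarse-graining:

```python
import numpy as np

def decimate(K):
    """One exact RG step for the zero-field 1D Ising chain: summing out
    every other spin leaves the same model with a coupling K' satisfying
    tanh(K') = tanh(K)**2."""
    return np.arctanh(np.tanh(K) ** 2)

K = 2.0  # a strong microscopic coupling
for step in range(8):
    print(f"step {step}: K = {K:.6f}")
    K = decimate(K)

# The flow runs to the K = 0 fixed point: at long wavelengths the chain
# looks like independent spins no matter which K we started with (the 1D
# model has no finite-temperature phase transition). It is the fixed
# points of such flows, not the microscopic couplings, that govern
# large-scale behavior.
```

The general lesson, in the spirit of Morrison’s quote, is that the renormalization flow sorts microscopic details into the few that matter at large scales and the many that do not, which is one precise sense in which there is a “multiplicity of levels” in physical description.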

We have arrived at the end of my little excursion into the physics and philosophy of emergence. What have we gained from this admittedly very partial tour of the available literature? I think a few points should be clear by now:

* The concept of emergence has nothing inherently mystical or mysterious about it; it is simply a way to think about certain undeniable properties of the world that we can observe empirically.

* There are conceptual (Humphreys) and mathematical (Batterman, Castellani, and Morrison) ways of operationalizing the idea of emergence.

* “Fundamental” physics itself provides several convincing examples of emergent phenomena, without having to go all the way up to biological systems, ecosystems, or mind-body problems (though all of those do, of course, exist and are both scientifically and philosophically interesting!).

* The reductionist program seems to rest on much talk of “potential,” “in principle,” and so on, which amounts to little more than promissory notes based on individual scientists’ aesthetic preferences for simple explanations.

* While the reductionist-antireductionist debate is far from being settled (and it may never be), it is naive to straightforwardly invoke physics as if that field had in fact resolved all the issues, particularly the philosophical ones.

* There doesn’t seem to be any “in principle” reason why certain laws of nature (especially if one thinks of “laws” as empirically adequate generalizations) may not have specific temporal and/or spatial domains of application, coming into effect (existence?) at particular, non-arbitrary scales of size, complexity, or energy.

So, there’s much to think about, as usual. And now I’m off to the informal workshop on naturalism organized by Sean Carroll, featuring the likes of Jerry Coyne, Richard Dawkins, Dan Dennett, Rebecca Goldstein, Alex Rosenberg, Don Ross, Steven Weinberg, and several others, including yours truly. Should be fun, stay tuned for updates...
____

[1] This reminds me of the following hilarious exchange between Penny and Sheldon on the TV show The Big Bang Theory. The context is that Sheldon, the quintessential scientistic reductionist, volunteered to help Penny start a new business, a rather practical thing for a theoretical physicist.

Penny: “And you know about that [business] stuff?”
Sheldon: “Penny, I’m a physicist. I have a working knowledge of the entire universe and everything it contains.”
Penny: “Who’s Radiohead?”
Sheldon: “I have a working knowledge of the important things in the universe.”

5 comments:

  1. Sheldon reminds me of this
    When researchers meet to start a company

    http://researchinprogress.tumblr.com/post/32929390264/meeting-of-researchers-about-starting-a-company

  2. Superfluidity is intact to some extent, but it is not a whole. Like anything, it is caused by lower & other components. Even if one of the causes is human invention, its history shows it is not a whole because it has a context. Build from below after retrospective analysis, even if it requires the building of human psychology from below. An awfully big research program unassisted by definitional conundrums.

  3. "There doesn’t seem to be any “in principle” reason why certain laws of nature (especially if one thinks of “laws” as empirically adequate generalizations) may not have specific temporal and/or spatial domains of application, coming into effect (existence?) at particular, non-arbitrary scales of size, complexity, or energy."

    Are there any reductionists who dispute that there are empirically adequate generalizations? I don't see why such laws would contradict reductionism. I presume the debate is about whether the laws are "merely" adequate generalizations or in fact part of the "instruction set" of the universe.

    Replies
    1. Fred,

      yes, the view that laws of nature are simply empirically adequate generalizations is indeed disputed. In itself it isn't a contradiction of reductionism, but if some of those laws "emerge" only when complex systems of certain types are at play, then that would be incompatible with reductionism.

    2. Correct, Fred. The constituents making up every supposed system or 'whole' are the same. They aggregate from protons, electrons & neutrons (primarily) with spatial, temporal, etc. properties at the Big Bang. The fact that they become complex shows the potential of those constituents driven by their complex aggregations. It has nothing to do with emergent teleology.

      The logical problem for these emergence theorists is in understanding how the complexities in aggregation change constituents radically at the human level in particular. They seem to think Darwinism, for example, is teleological. Actually it is just the enveloping of organisms built from constituents within an environment built from constituents - all driven by constituents.

