
Every Thing Must Go: Metaphysics Naturalized

Every Thing Must Go argues that the only kind of metaphysics that can contribute to objective knowledge is one based specifically on contemporary science as it really is, and not on philosophers' a priori intuitions, common sense, or simplifications of science. In addition to showing how recent metaphysics has drifted away from connection with all other serious scholarly inquiry as a result of not heeding this restriction, the authors demonstrate how to build a metaphysics compatible with current fundamental physics ("ontic structural realism"), which, when combined with their metaphysics of the special sciences ("rainforest realism"), can be used to unify physics with the other sciences without reducing these sciences to physics itself. Taking science metaphysically seriously, Ladyman and Ross argue, means that metaphysicians must abandon the picture of the world as composed of self-subsistent individual objects, and the paradigm of causation as the collision of such objects.
Every Thing Must Go also assesses the role of information theory and complex systems theory in attempts to explain the relationship between the special sciences and physics, treading a middle road between the grand synthesis of thermodynamics and information, and eliminativism about information. The consequences of the authors' metaphysical theory for central issues in the philosophy of science are explored, including the implications for the realism vs. empiricism debate, the role of causation in scientific explanations, the nature of causation and laws, the status of abstract and virtual objects, and the objective reality of natural kinds.

360 pages, Hardcover

First published July 5, 2007

46 people are currently reading
1394 people want to read

About the author

James Ladyman

8 books · 35 followers

Ratings & Reviews



Community Reviews

5 stars: 81 (46%)
4 stars: 50 (28%)
3 stars: 26 (14%)
2 stars: 10 (5%)
1 star: 7 (4%)
WarpDrive
274 reviews · 505 followers
May 22, 2018
This book presents a variant of a philosophical stance commonly named “Ontic Structural Realism” (referred to as “OSR” from this point onward).
The authors' version of OSR promotes an approach based on the following criteria:

- It is “realist”: in agreement with classical “scientific realism”, it commits to the view that our best scientific theories are not merely empirically adequate, but represent “approximately true” descriptions of the world, and that their theoretical terms successfully “refer”. It is also “scientistic”: it confers epistemological primacy on science over philosophical speculation, and it is defined as a “radically naturalistic metaphysics”, by which the authors mean a metaphysics strongly anchored in the sciences, and motivated mainly by attempts to unify hypotheses and theories that are taken seriously by contemporary science. I strongly share the authors' views about the uselessness of “esoteric debates based on prioritizing armchair intuitions about the nature of the universe over scientific discoveries.” A priori inquiry has too often ruled out as impossible states of affairs that science has since come to entertain. Just as an example, philosophers' armchair intuition would hardly have anticipated the results of Bell's theorem, which establishes that any empirically adequate successor to quantum mechanics will have to violate local realism. As the authors state: “the institutions of modern science are more reliable epistemic filters than are any criteria that could be identified by philosophical analysis and written down - institutional processes of science have inductively established peculiar epistemic reliability.”
In its “realism”, OSR is also in opposition to constructive empiricism (which states that science aims only to give us theories which are empirically adequate).
I feel that the “realist” and “scientistic” position, properly qualified, corresponds better to reality, if we take into account the history of continuous scientific success over the last few centuries, and the impressive corresponding technological progress. Science works. It is manifest, I think, that a progressive convergence towards a knowledge of the Universe's underlying structure and regularities is being achieved. Of course, knowledge acquisition is a process mediated by the mental and sensorial (and technological) apparatus of the "knower", so such knowledge acquisition will always be filtered and limited by that context: this does not, however, prevent genuine progress from being achieved, in theoretical as well as applied terms. Regardless of what some armchair philosophers might proclaim, nature DOES have regularities, and science has been able to capture such regularities, at least to an extent sufficient for remarkable technological progress. How asymptotically close our current mature physical theories are to the real underlying patterns of Nature is, of course, open to intense debate – but I think it is essentially a question of degree, not of overall tendency. When empirical adequacy reaches the extraordinary levels it has historically achieved, it seems unsustainable to seriously doubt that the underlying theories have captured some important critical aspects of the underlying nomological structure.

- OSR is also a “structural” form of realism: in this, it diverges from the classical approach of scientific realism, and in doing so it purports to overcome its problems. The history of ontological discontinuity across theory change makes standard scientific realism problematic: the “objects” posited by successive scientific theories, even if deemed empirically successful when originally formulated, have changed significantly; therefore (claim the critics), simply by induction we should expect that the entities posited by current theories might have to be revised too.
This “theory change” problem, claim the authors, is significantly reduced by OSR: structural realism does not commit to the existence of the individual “objects”, but it states that the truth-value of a theory is essentially in its nomological/structural contents. Science is ultimately all about discovering patterns, structures and regularities in nature. Science describes objective modal relations among the phenomena, not just what actually happens and what entities/objects objective reality is “made of”. The import of successful scientific theories, according to OSR, consists in their giving correct descriptions of the structure of the world, not of its furniture. In opposition to standard realism and to the large majority of metaphysical stances, OSR posits “structure” (or patterns) as fundamental: according to this perspective, objects are not individually autonomous, but are fully dependent on relational properties (their nomological location within a structure) for their existence. According to the structural realist, the reason why our best scientific theories are successful is that they (more or less) accurately reflect the structure of the world. The structure itself can be conceptualized in terms of the relevant laws and symmetry principles associated with the given theory. Any ontological commitments that extend beyond structure must be essentially regarded as surplus content, or at least of derivative import.
With regard to objects, the type of structural realism envisaged by the authors of this book is not "eliminativist": objects still play an important pragmatic role in the creation of physical theories (used as heuristic devices or stepping stones to obtain the structure). But they are ontologically secondary; they are re-conceptualized in structural terms (whereby their identity is determined contextually, via the relations they enter into).
By referring to patterns/structures rather than objects, claim the authors, the problem of ontological discontinuity is minimised: when it comes to the structural nature of scientific theories it is possible to see a progression, a significant retaining of important structural aspects from one theory to its successor.
A few examples of such supposed continuity are provided in this book: the structural commonalities between Ptolemaic and Copernican astronomy; the correspondence between Special Relativity and classical mechanics (when v becomes very small compared to c, or when c tends to infinity, so that v/c tends to zero, the mathematical structure of the Lorentz transformation increasingly approximates that of the Galilean one: the two theories differ in their ontological import—for example, SR discards absolute simultaneity—yet there is partial continuity of mathematical structure); the structural continuity in descriptions of the electron; the relationship between Maxwellian electrodynamics and Quantum Electrodynamics; and in general the structural continuities between classical and quantum physics (Bohr consciously applied the methodological principle that quantum mechanical models ought to reduce to classical models in the limit of large numbers of particles, or the limit of Planck’s constant becoming arbitrarily small—the ‘correspondence principle’. There are also numerous cases in quantum mechanics where the Hamiltonian functions that represent the total energy of systems imitate those of classical mechanics, but with variables like position and momentum replaced by Hermitian operators).
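For readers who want the Lorentz/Galileo limiting case spelled out, here is the standard relation (textbook material, my own summary, not the book's derivation):

```latex
% Lorentz boost along x, with \gamma = 1/\sqrt{1 - v^2/c^2}:
x' = \gamma\,(x - vt), \qquad t' = \gamma\!\left(t - \frac{v x}{c^2}\right)
% As v/c \to 0 (equivalently c \to \infty), \gamma \to 1 and vx/c^2 \to 0, so
x' \to x - vt, \qquad t' \to t,
% which is the Galilean transformation: the older structure survives as a limiting case of the newer one.
```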
The authors also claim that there are in fact many such examples of continuity in the “mathematical” structure of successive scientific theories, in particular where equations are preserved, but reinterpreted or recovered as limiting cases. Theories, like Newtonian mechanics, can be “literally false”, but they still capture important modal structure and relations.
As expected, the “indistinguishability” of particles of the same type is also used as an argument in favor of structuralism, in opposition to an ontology based on the individuality of objects: for example, there is no fact of the matter as to which of the two quantum objects prepared in the singlet state at the source of an EPR-Bohm experiment is later measured in the “left” or “right” wing of the experiment. In the case of entangled systems, the state of the system can only be defined in relation to both particles as one single “entity” – each particle has no state of its own with respect to that observable, and the joint state cannot be factorized into a product of single-particle states – in other words, we have non-supervenient relations. In such a context, old metaphysical concepts such as “haecceity” appear obsolete. In more general terms, and beyond the example of entangled states, a permutation of indistinguishable particles in some state is not observable, and states which differ only with respect to the permutation of particles of the same kind are treated as the same state, simply labelled differently.
When it comes to fundamental physics, objects are very often identified via group-theoretic structure. Objects can be identified in terms of which symmetry transformations leave them unchanged or invariant. Weyl famously asserted that: ‘All quantum numbers, with the exception of the so-called principal quantum number, are indices characterizing representations of groups’. The Standard Model itself, after all, is nothing but a gauge quantum field theory built on the internal symmetries of the symmetry group SU(3) × SU(2) × U(1). The objects are, in this context, nothing more than members of equivalence classes under symmetry transformations, and no further individuation of objects is possible. Using concepts from group theory, the distinction between fermions and bosons, for example, can be viewed as resulting from the action of the permutation group, the properties of mass and spin can be understood in terms of the relevant irreducible representation of the Poincaré group, and so on. In general, the symmetries represent the invariants in terms of which the ‘nodes’ in the structure can be described, and it is the latter that acquires ontological primacy. The authors also deal with General Relativity, in which case they appear to support the view that the metrical properties are relational rather than intrinsic, so that general relativity is committed to a structure of metrical relations.
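To make the permutation point concrete, here is a minimal formal sketch (standard textbook quantum mechanics, my own summary rather than the book's notation):

```latex
% Two identical particles occupying single-particle states a and b:
|\psi_{\pm}\rangle = \tfrac{1}{\sqrt{2}}\big(|a\rangle_1 |b\rangle_2 \pm |b\rangle_1 |a\rangle_2\big)
% Bosons take the symmetric (+) combination, fermions the antisymmetric (-) one.
% Exchanging the labels 1 <-> 2 returns the same physical state up to a phase:
P_{12}\,|\psi_{\pm}\rangle = \pm\,|\psi_{\pm}\rangle ,
% so no observable can register "which particle is which".
% The EPR-Bohm singlet is the antisymmetric spin case,
|\Psi_{\text{singlet}}\rangle = \tfrac{1}{\sqrt{2}}\big(|{\uparrow}\rangle_1 |{\downarrow}\rangle_2 - |{\downarrow}\rangle_1 |{\uparrow}\rangle_2\big),
% in which neither particle possesses a spin state of its own.
```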

- OSR is an “ontological” form of structural realism, which refers to the belief that the patterns/structures adopted in scientific theories are “real”: they are not merely abstract constructs used as book-keeping devices, but correspond to features of reality (epistemological versions, on the other hand, state that scientific theories describe the structure of the world, but do not ontologically commit to the actual content of the world). This is possibly the feature of OSR most open to criticism: some might level the charge of Neo-Platonism, or of some form of disguised Idealism. When talking about structures and patterns as ontologically foundational, OSR opens itself to the criticism that it potentially confuses the mathematical with the physical, and it invites the question of how the structural/mathematical is instantiated in the physical.
The authors develop a theory of “real patterns” that tries to answer this criticism: firstly, it appears that they do not explicitly conflate mathematical structures with physical structures: “The ‘world-structure’ just is and exists independently of us and we represent it mathematico-physically via our theories.... the fact that we only know the entities of physics in mathematical terms need not mean that they are actually mathematical entities”.
Moreover, they distinguish the concept of a generic “pattern” from the concept of a real pattern, whereby the latter is a scientifically determined physical regularity in Nature, subject to the standard vetting of the experimental procedures of mainstream science, and characterized by predictive and explanatory power. In this, the authors' position is clearly at variance with the forms of mathematical monism or Neo-Platonism proposed by the likes of Tegmark. It is also true, however, that the authors appear to contradict themselves in their unfortunate quick digression into philosophy of mathematics when they say: “possibility is that...the traditional gulf between Platonistic realism about mathematics and naturalistic realism about physics will shrink or even vanish” - this is a statement with which I disagree, in the sense that, in my opinion, mathematical structures and the patterns in the physical world are two closely related but still different conceptual levels, and conflating the two generates unnecessary confusion, requiring a deep and separate discussion. To be fair, in order to put the whole question in its right context, the millennia-old metaphysical “problem of universals” (and the related debate between nominalism and Platonic realism (universalia ante res)) has dramatically shifted (and been rendered quite moot) by developments in physics, where it is becoming not at all obvious whether a theoretical term refers to a concrete entity or a “mathematical” entity. Think about Quantum Field Theory (QFT), where quantum fields are very different from classical fields (quantum fields are operator-valued: they entail an assignment of operators, not of specific values of some physical quantity, to spacetime; naive forms of physicalism would be completely confused by QFT). Moreover, “particles” in QFT can be considered as “excitations” of such fields, thus the question "why are all electrons identical?" arises from mistakenly regarding individual electrons as fundamental objects, when in fact it is only the electron field that is fundamental.
Another very interesting aspect of the definition of “real patterns” is its embedding in an information-theoretic framework. I was very pleased to find that the authors take on board the increasing importance that information-theoretic considerations are assuming in contemporary physical science. A pattern is real, according to this version of OSR, if it is a description of the data/phenomena that is more efficient than a simple “bit map encoding” (a straight bit-by-bit copy of the underlying data); in other words, if it can be viewed as an “algorithm” that reproduces the data/describes the phenomena using a smaller number of bits than the data itself (when there is such an algorithm, we say that the data is algorithmically compressible). The existence of regularities in Nature may be expressed by saying that the world is algorithmically compressible. This information compression is, after all, what physical laws achieve: they identify the regularities of the world and its modal structure, and capture the seemingly complex, multiform variety of the underlying phenomena in a concise set of informationally efficient “laws”.
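As a toy illustration of the compression criterion (my own sketch, with made-up data, not code from the book): a regularity counts as a real pattern when a generating rule describes the data in fewer bits than the raw bit map.

```python
import os
import zlib

# A "bit map" of 10,000 samples of a perfectly regular signal...
regular = bytes(i % 2 for i in range(10_000))   # 0, 1, 0, 1, ... a trivially compressible pattern
# ...versus 10,000 bytes with no exploitable regularity.
noise = os.urandom(10_000)

# A general-purpose compressor stands in for "an algorithm that reproduces the data".
print(len(zlib.compress(regular)))   # far fewer than 10,000 bytes: the pattern is captured
print(len(zlib.compress(noise)))     # roughly 10,000 bytes or more: nothing to exploit
```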

- Another interesting aspect of the authors' version of OSR is the primacy given to physics (everything else, including the other sciences, called “special sciences” by the authors, is subject to the constraints set by the results of physics), which does NOT imply reductionism.
The authors propose a non-reductionist, scale-relative ontology, named “Rainforest Realism”: still using the paradigm of information theory, the authors note that all other sciences are possible because the world is algorithmically compressible at many levels. At certain levels of description it is possible to use much less information to predict the behaviour of systems, described in an approximate (and even probabilistic) way, than would be needed to describe their microstates according to the underlying physical laws. In particular, the special sciences often rely upon a reduction in the degrees of freedom of the system. Take the example of the ideal gas law in thermodynamics: it uses only three degrees of freedom to give a reasonably accurate description of the behaviour of systems that have of the order of 10^23 degrees of freedom. Real patterns can therefore be identified at many levels of granularity and in relation to different sciences, so they are not necessarily limited to fundamental physics; in the case of the ideal gas law, this reduction in the number of parameters needed to describe systems is what the theory of real patterns, in conjunction with “Rainforest Realism”, aims to capture. In more general terms, there are real patterns in the world that are only visible at the right scales of resolution and degrees of approximation, but that are not any less “real” than patterns defined at the basic “microphysical” level. Real patterns are all those that indispensably figure in “projectible generalisations” allowing us to predict and explain the behaviour of the system being analyzed.
This is a most commonsensical view, which takes proper notice of the fact that reality is ordered and presents regularities at many levels, and that it cannot be assumed (as some reductionists do) that all regularities and the whole modal structure of the world can be “reduced” to a few physical laws defined at the subatomic level. Science is in the business of describing such patterns/structures. At one level, the patterns of nature can best be captured by referring to, say, the electron field; at another level they can be captured by objects like planets, tables, organisms, cells. The latter are not illusions; rather, they are simply the most appropriate way (informationally efficient, and liable to generate predictions) to describe a certain stable pattern at a particular level of granularity.
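A small sketch of the degrees-of-freedom reduction described above (my own illustration; the constants are standard physics, not figures from the book): three macroscopic variables yield a usable prediction about a system with roughly 10^23 microscopic degrees of freedom.

```python
# Ideal gas law: P V = n R T — three macroscopic variables for a fixed amount of gas n,
# standing in for ~6e23 molecules' worth of positions and momenta.
R = 8.314        # J / (mol K), the gas constant
n = 1.0          # amount of gas in moles (~6.022e23 molecules)
V = 0.0224       # m^3, roughly the molar volume at standard conditions
T = 273.15       # K

P = n * R * T / V
print(f"Predicted pressure: {P:.3e} Pa")   # ~1.0e5 Pa, with no microstate information at all
```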

This book also debates other interesting themes, such as causality, time, the role of information in the natural world. Unfortunately, due to space constraints, I cannot address such themes in any more detail.
However, while I find Structural Realism quite compelling, and its “Ontological” variety interesting, there are a few areas that the authors have not, in my opinion, developed to a satisfactory level of detail:
- The actual extent to which “structures” are preserved, and what we actually mean by preservation of structure. In general, I think, structural continuity cannot just refer to the shared logico-syntactic form of equations, for we must also consider the subject matter (the underlying "semantics", if you wish). Structurally similar equations are used in different cases for modelling different phenomena. The authors appear not to be fully aware of this aspect, I feel.
- Quantum Field Theory: I remember coming across the issue that QFT is not unequivocal when it comes to the structure of the world, as there is the problem of “unitarily inequivalent representations” (see https://arxiv.org/pdf/1312.3239.pdf). There are alternative structures, seemingly incompatible, related to the same physical aspects of the world.
- The problem of initial conditions and of instantiation: can “structures” alone be responsible for the actual evolution and current state of this universe, and what is the relationship between mathematical and physical structures? How do aspects of symmetry breaking, probabilistic behaviour, the supposed initial low-entropy state of the universe, and aspects of contingency in general, relate to the overall view promoted by OSR? I am not saying that these necessarily present insurmountable issues for OSR, but I would have liked a more direct and detailed treatment of such points.

Apart from the points above, this is an interesting book, highly informative even if not always accessible, with insightful arguments and recommended to anybody who, having some minimum background in philosophy and the physical and mathematical sciences, is seriously interested in gaining a better understanding of Ontological Structural Realism. 4 stars.
Shaun
7 reviews · 8 followers
June 4, 2016
For a great summary, by Massimo Pigliucci, of the realism/anti-realism debate, go here. Then find Part II of that post here for Pigliucci's full treatment of "Every Thing Must Go: Metaphysics Naturalized."

The only reason this isn't 4 stars is because it's so overtly polemical. Here are some examples of their positions. 1) Metaphysics should be constrained by science. 2) Most modern metaphysics is repackaged scholasticism. To be fair, the authors come right out and apologize beforehand, but it doesn't make it go down any easier.

"Every Thing Must Go" is worth reading for two reasons. First, for it's presentation of "Ontological Structural Realism" as a solution to the realism/anti-realism debate (I refer again to Pigliucci's blog for a fuller explanation.) There are two concerns in that debate, both related to the unobservable entities of scientific theories (think quarks and smaller.) First, there's something philosophers call the "pessimistic meta-induction". This asks "why should we take the unobservable objects to which current scientific theories refer to be true when those have changed so drastically from older 'successful' theories to new 'successful' theories?" From the other direction, the "no miracles" argument asks, "how could our theories be so successful if they aren't actually about the world?" Structural realism answers in the following way. The mathematical structures of both rejected and current theories are isomorphic with one another. This weakens the claim of the meta-induction because older theories are, in one sense, consistent with newer theories. SR claims that they are consistent just because they represent structure that is found in the world. Therefore, in spite of drastic theory change, we are justified in believing our theories are structurally true. Getting that right is what makes our theories instrumentally successful.

Second, Ladyman makes an interesting case for the explanatory usefulness of OSR to broader metaphysics. Ladyman's OSR is unique in its strong claim that objects depend on structural relations for their existence. By analyzing structure as the most fundamental bit of nature, causal relationships are eliminated at the deepest level of physics. However, in the special sciences (psychology, geology, etc.) the concept is retained because of the "temporal asymmetry" that exists in the phenomena they are explaining. As a result, those disciplines cannot be reduced to the physical sciences. In the blog I link above, Pigliucci expresses his excitement about this. Because, as Ladyman claims, OSR is supported by the best and latest theories in physics, then the block to reductionism comes straight from physics. As a proponent of non-reductivist naturalism, he thinks this is a great thing. For that alone, the book is worth reading.

Onto the negatives. Ladyman argues that "no hypothesis that the approximately consensual current scientific picture declares to be beyond our capacity to investigate should be taken seriously." In other words, claims to know things beyond the scope of what science can tell us should be rejected. Epistemology should guide metaphysics, therefore metaphysics should be only concerned with things science can talk about, i.e. physical things.

I'm strongly inclined to reject this view. Space here is limited but, to put it simply, they beg the question against non-naturalistic metaphysics by limiting what kind of questions are "answerable" prior to investigation. In other words, the issue up for grabs is whether or not there are non-physical facts about reality that metaphysics is suited to talk about. Ladyman argues that there are not, and supports his claim with a verificationist argument. This will be an unfair statement because I can't fully defend it in this space, but I don't believe that they adequately justify their verificationism, ignoring obvious and legitimate objections from philosophers like George Bealer (2000).

Unfortunately, this is the starting point of the rest of the text, and their strongest argument on behalf of OSR over other forms of realism. OSR overturns standard relational ontological categories and definitions. In standard ontology, structure is a relational property that is dependent on relata (objects/entities). Ladyman asks us to make the counterintuitive shift toward analyzing structure as basic. Entities/objects are either nonexistent or are dependent on relational properties for their existence. The warrant for this shift is that OSR is the best realist explanation of modern physical theories. For example, they argue that quantum field theory, where we analyze the bits that make up the field by first analyzing the field, is more consistent with their ontology than standard categories. They take this explanatory consistency to justify their verificationist constraint on metaphysics.

Even were we to grant that QFT makes better sense on OSR, and that claim is disputed [see Morganti (2004), Morganti (2011), and Cao (2003)], their position on the relationship between physics and metaphysics is flawed. Science cannot justify the trustworthiness of its own claims. The scientific method can't answer whether or not we should use the scientific method when trying to understand the natural world. Philosophy (or "R"eason) provides the warrant to accept conclusions delivered by the scientific method. The realism/anti-realism debate, in which we ask whether theories are actually about the world or only useful fictions, is just the kind of question science doesn't have the tools to answer. It's a philosophical question. Even if we grant that science *could* answer it, there is another difficulty raised for their argument. Whenever philosophy and science make claims about the same things, philosophical claims will be more authoritative. We should either ignore whatever perceived evidential support science provides on behalf of OSR, or at best we should consider it only after we have considered it philosophically.

There are good philosophical reasons to hold onto traditional ontological categories, and scientific theories will not weigh against those. Therefore, we should be skeptical about OSR until other reasons are given to reject traditional ontology.
Yumeko (blushes)
263 reviews · 43 followers
Read
July 27, 2022
God, I hate not having the pre reqs to understand a book I thought I had the pre reqs to understand.
Matt
231 reviews · 34 followers
December 6, 2012

I want to say two things before I get into the review. Firstly, this was one of the most technical philosophy books I've ever attempted to tackle head-on, and consequently a whole lot of it was either beyond me or right at my limits. The philosophy I can mostly follow, albeit with effort, but the nature of this particular project calls for a lot of heavy science. Some of that I can follow, but a lot of the technicalities are simply out of my scope.


As a consequence, I wound up skimming a whole lot of the book and focusing mostly on the sections that held particular interest for me. My review is going to reflect that, rather than any in-depth examination or critique of the arguments herein. If you'd like that style of review by someone far more qualified to get in-depth (and remain readable), I'd suggest Massimo Pigliucci's two-part review which you can find here and here.


All that said, there was a lot here which I found interesting. As per the subtitle, "Metaphysics Naturalized", Ladyman and Ross are putting forth an argument for a naturalistic metaphysics. They believe that modern analytic metaphysics has headed off into the clouds, totally divorced from scientific realities, and they want to change that by grounding metaphysical speculation in the best science we have at the moment.


The entire first chapter is devoted to laying a foundation for a rehabilitated form of 'scientism'. Normally used as a not-so-nice word, Ladyman and Ross intend their scientism to encompass a particular synthesis of empiricism and materialism. Now anyone familiar with philosophy will sense a trap, given the history of all three of these terms, but I thought they did a convincing job of de-fanging them of their most insidious consequences (here I'm thinking of the Vienna Circle's logical positivism and verificationism).


By re-framing verificationism as a less-noxious epistemic criterion (rather than its former role as arbiter of meaning) and adopting what they call a "dialectical empiricist materialist stance", which is more like a set of good ideas than a hard-line commitment to either position, they hope to place science -- physics, to be precise -- at the center of a naturalistic metaphysics. This is a necessarily weak metaphysics, one in which our statements about metaphysics only make sense if they can be taken seriously by physics. This rehabilitated scientism sets the tone for the rest of the book.



The most general claims of this book can be summarized as follows. Taking naturalism seriously in metaphysics is equivalent to adopting a verificationist attitude towards both science and metaphysics. On the basis of this we arrived at the scientistic stance, where this is our dialectical combination of realism and empiricism. This in turn, when applied to the current near-consensus in science as a body of input beliefs, yields the details of our information-theoretical structural realism as a body of output (metaphysical) beliefs.



I don't have anything like the background needed to level any severe criticisms towards their arguments, but even with my meager training I do find these ideas at least agreeable. Centering our metaphysical playground on our best-possible empirical knowledge of the natural world (which we'd define as that world accessible by scientific inquiry) strikes me as a good idea, and even though the shortfalls are innumerable and the critics will find many a loophole, this strikes me as the least-bad of our choices if we want to presume science isn't a total fiction. Ladyman and Ross are well aware of this from the outset, conceding that those not convinced by science will find little to change their minds here (although a general disagreement with science should bear itself out in a discussion of reasons, for which their later discussion of realism and anti-realism is relevant).


Most of the remainder of the book is a detailed discussion of the book's two central prongs: ontic structural realism (OSR) and rainforest realism (RR). Frankly I'm not even sure how much of this I understood myself, but I'll do my best to briefly summarize.


The title, "Every Thing Must Go", is no arbitrary choice. OSR is an ambitious idea from philosophy of science which suggests that, "at the bottom", there is nothing -- literally, no thing. The entire metaphysical categories of "objects" and "causation" simply do not exist. Instead, we're left with a network of mathematical relations; these are what our physical theories actually refer to when we talk about say general relativity or quantum mechanics.


Now this might strike you as typical metaphysical nonsense, telling you that your table and your dinner, which you can see right in front of you, touch, taste, and so on, aren't actually there. Fortunately OSR is not going to make any such claim. Since OSR is centered on our best understanding of physics, which includes quantum strangeness, we can't quite say that substance and causation are fundamentally real, but OSR is also predicated on a division between physics and the special sciences (i.e., everything except physics).


Since the relations (or patterns) posited by OSR are the basis of reality, and those patterns are also scale-relative, we find that our patterns can still make claims to "realness" despite having no fundamental substance. The objects and events talked about by special sciences have their own existence provided we restrict their being to the appropriate scale. It wouldn't make much sense to talk about Mt. Everest on the scale of atoms, or a living cell by reference to galactic masses.


In the special sciences, the properties and attributes under study, which include "thing-ness" and cause-and-effect, are patterns operating at a specific scale. There is "nothing" at the bottom, but as a persistent pattern at the human scale, your dinner table is as real as anything. By extension, time asymmetry (i.e., movement from past to future), and the realms of biology and perhaps even mind, may all have their own properties which are bounded by, but nevertheless distinct from, the deepest real patterns of the universe. This even provides the possibility of an interesting reply to issues of personal identity; by framing our "I" as a pattern persistent over time, we find that many concerns about transporters and ships of Theseus may have more definitive answers than a "metaphysics of stuff" would allow.


This diversity of scale-relative patterns is the basis of Rainforest Realism, a name which has an interesting origin. Ladyman and Ross find Quine's 'desert' of materialism to be an impoverished world in which nothing else exists besides particles and fields of force. RR instead postulates a lush ecosystem of metaphysical entities, in the form of these real, persistent, scale-relative patterns of being.



Our relationship with common-sense realism is not straightforward. On the one hand, our ontology makes room for everyday objects and treats them on a par with the objects of the special sciences. On the other hand, we attribute no epistemic status to intuitions about ontology derived from common sense, and in particular we deny that scientific ontology is answerable to common sense, while insisting that common-sense ontology is answerable to science. We take it to be an empirical question for any particular common-sense object whether it is a genuine real pattern, and so eliminativism about, for example, tables or mental states, cannot be ruled out a priori. If cognitive science concludes that mental concepts do not track any real patterns then the theory of mind will have to go.



Translated into lay-speak, this would say, roughly, that everyday objects (e.g. tables, dinner) are real and what we investigate when we do science. But our everyday intuitions about what is (or is not) real have no bearing on science's discoveries, and those discoveries are not obligated to agree with what we might believe about the world. Whether or not an object we believe to be "real" actually is real, as a persistent pattern, is a matter for science to discover, and thus we have no guarantee that the things we currently believe to be real will remain so -- which includes our own mental states.


As to what these patterns might be, well, the book devotes a decent bit of space to Shannon's information theory as well as thermodynamic entropy. OSR and RR, together, become Information-Theoretic Structural Realism (ITSR), and I got the impression that Ladyman and Ross were connecting information to real patterns on a deep level, per their project to assign primacy to physics and its hypotheses, but I have to confess that this is one part of the book that became technical and which I skimmed in healthy portions. I'm content with a layman's understanding that "it's information", in a strict, mathematical sense of the word.
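For anyone curious what "information in a strict, mathematical sense" amounts to, here is a minimal Shannon-entropy sketch (standard information theory, my own example rather than anything from the book):

```python
from math import log2

def shannon_entropy(probs):
    """Average information, in bits, of a source with the given outcome probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # 1.0 bit per outcome: a fair coin
print(shannon_entropy([0.99, 0.01]))  # ~0.08 bits: a highly predictable, highly compressible source
```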


I particularly enjoyed how Ladyman and Ross concluded with a brief discussion of how their project relates to Kant's theory of knowledge. I've maintained that, despite the obvious and not-so-obvious shortcomings, Kant had more insight into the nature of reality and knowledge than might be immediately apparent. It was interesting then to see how closely The Critique of Pure Reason parallels a cutting-edge science-driven metaphysical position, right down to the active role of humankind in constructing knowledge and the division between phenomenal and noumenal realms.


There are important differences, of course. It is the collective set of institutional activities that we call "science" which is charged with determining what propositions should be taken seriously, rather than appearances synthesized by the transcendental subject. Perhaps more importantly, and contra Kant, Ladyman & Ross argue that science can discover the fundamental structures of reality, independent of our mental constructions, and our theories about these structures may well change as science improves (as contrasted with Kant's a priori universals and unknowable noumenal world). Nevertheless, we are left with an interesting set of similarities and all their implications.


My overall impression is of a surprisingly rich, elegant, and dare I even say exciting conception of metaphysics, one which is starkly at odds with popular materialism of the desert-dwellers, and yet grounded in our best current science. That is a formidable case, and one that leaves ample room for a diverse and human -- rather than sterile, mechanical, algorithmic -- world.


I will reiterate that this is not casual reading, and it is not for those without some background in both philosophy and science. While Ladyman and Ross do a good job of summarizing the contemporary debates and put forth (what I read as) a well-reasoned argument, those without some foundations in metaphysics and philosophy of science will find this rough going.


In addition, I only touched on a bare fraction of what's actually here, and I feel like I could spend months in close-reading to tease out even more tidbits. I might suggest you read Massimo Pigliucci's discussion of the book to find the important take-aways, if you're interested in the subject but aren't quite ready to sink your teeth in here.

Sharad Pandian
435 reviews · 167 followers
April 16, 2018
Phew, what a book. It's dense, and to properly follow it you'll need to understand advanced physics, information theory, and some background philosophy of science about the realism debate. I'm really not as proficient in any of these as I would like to be, so this "review" is more about the general points made rather than close engagement (but in my defense, this seems to be the case for most other reviews of the book, so I'm not going to sweat it).

-

The main thrust of the book is that a lot of metaphysics that happens in analytic philosophy is completely misplaced, because the philosophers are using "neo-scholastic" and "A-level chemistry" notions instead of paying close attention to what cutting-edge science says. While they think philosophers are completely alright looking at human life and the manifest image, it is simply untenable to pretend to care about objective reality while proceeding this way:

"People who wish to explore the ways in which the habitual or intuitive anthropological conceptual space is structured are invited to explore social phenomenology. We can say ‘go in peace’ to Heideggerians, noting that it was entirely appropriate that Heidegger did not attempt to base any elements of his philosophy on science, and focused on hammers—things that are constituted as objects by situated, practical activity—rather than atoms—things that are supposed by realists to have their status as objects independently of our purposes—when he reflected on objects. We, however, are interested in objective truth rather than philosophical anthropology. Our quarrel will be with philosophers who claim to share this interest, but then fail properly to pay attention to our basic source of information about objective reality."

They reject various usual ways of thinking about different objects and sciences as fitting neatly within each other through reduction relations. Instead, they offer their own view of "Ontic Structural Realism" that they claim sits comfortably between naive realism and Bas Van Fraassen's constructive empiricism. It aims at consilience of the various sciences, a way of unifying the various scientific domains of inquiry, in opposition to disunity theorists like Nancy Cartwright and John Dupre. At its heart are two principles:

1. Principle of Naturalistic Closure (PNC):
"Any new metaphysical claim that is to be taken seriously at time t should be motivated by, and only by, the service it would perform, if true, in showing how two or more specific scientific hypotheses, at least one of which is drawn from fundamental physics, jointly explain more than the sum of what is explained by the two hypotheses taken separately."

2. Primacy of Physics Constraint (PPC):
"Special science hypotheses that conflict with fundamental physics, or such consensus as there is in fundamental physics, should be rejected for that reason alone. Fundamental physical hypotheses are not symmetrically hostage to the conclusions of the special sciences."

These might sound serpentine, but the first functionally seems to be "pay attention to fundamental physics only for our first-level ontology" and the second seems to be "nothing that violates fundamental physics can be allowed into our ontology".

While these might seem to be a blatant case of Physics Chauvinism, Ladyman and Ross are not hostile or dismissive of the special sciences at all. They subscribe to the scale relativity of ontology, and so think the current ways of engaging in the special sciences are completely legitimate; they just shouldn't be considered ontologically fundamental, in the particular sense that the generalizations of physics are held to hold more generally.

-

By looking at various cutting-edge theories and arguments, they argue that objects simply don't form part of the modern ontology for fundamental physics and so should be treated at best as bookkeeping. Instead, what is actually fundamental are real patterns, which are projectible patterns in the world. They cash this out using Shannon's information theory, such that the patterns are identified using locators, where "A locator is to be understood as an act of ‘tagging’ against an established address system". Since there are multiple possible address systems, the scale relativity of ontology follows naturally.

This "information-theoretic" approach does a lot of the heavy-lifting, and is a metaphysical set-up, not merely an epistemic one. This is what justifies the claim that this theory is just Van Fraassen's constructive empiricism with modality, namely the real patterns. So there are all sorts of actual locators which we don't and might never have access to:

"When the classical and logical empiricists thought about an observational relation between some x and some P, they thought of x as a cognitive agent and of P as an object of predication. When we think about a relation of informational connectedness between some x and some P, we are thinking about both x and P as points (nodes) or regions (interconnected sets of nodes) in networks. Some relevant x’s are indeed cognitive agents, but many are not. Many perspectives are unoccupied, or occupied only by very stupid agents. (Increasingly many perspectives are occupied by information processors that are more powerful than humans or networks of humans; large computers open new informational channels) Which perspectives an agent occupies—which x she instantiates, if you like—is partly a function of the inferences available to her, which is in turn a function of both her computational capacities and her position in the network of information flow."

-

One source of support for the existence of real patterns across theory change is how many equations of past theories can be shown to be limiting cases of newer ones (paradigmatically, Einstein's GR and Newton's gravitation). This way there really is a continuation of pattern detection:

"The structural realist is only claiming that theories represent the relations among, or structure of, the phenomena and in most scientific revolutions the empirical content of the old theory is recovered as a limiting case of the new theory."

This view deflates what laws and their necessity consist of:

"Thus, for example, if we say that Heisenberg’s Uncertainty Principle is a law of nature, we do not mean that it is true in every logically or semantically possible world; despite its modal inflections, science does not aim to describe such worlds. We mean only that the Uncertainty Principle happens to constrain possible measurements everywhere in this world, the one world in which we can take measurements."

This goes back to the PPC, since the regularities associated with lawhood do support counterfactuals (being projectible), but the only necessity is that they are more general restrictions. So other scientific fields take care not to violate physical laws, and that's the full extent of what we can say about the relationship between the sciences.

-

I find this book fascinating because I can't think of anything wrong with it, although I do have an amateur's unease about:

a. how its information-theoretic approach seems to paper over some of the tricky questions of theory change and theory-dependence of observation.

b. whether the notion of fundamental physics having more universal applicability can actually be sustained, especially given the scale relativity of ontology. Can't/Doesn't the biologist mean "all humans need calcium" in a way that is meant to be general, i.e., without even implicitly meaning "on earth, all humans..."? On the other hand, I guess it would be strange in terms of actual scientific practice to think of biology cosmologically. But biologists still don't take care not to contradict physics in any serious, sustained way (and how could they, when they usually don't know what cutting-edge physics says). So a disunity account might not be quite ruled out, but I need to read and think more about this point.

Additionally, I find the actual job of finding out "real patterns" across theory changes (e.g. George Smith's "Closing the Loop") far more interesting than simply gesturing at a general enough metaphysics. But I guess this is important as theoretical background even there, particularly since there are all sorts of other views floating around.
119 reviews · 2 followers
September 11, 2023
This book was too technical for me to really follow the arguments, but I think I got the outlines and I really like the driving ideas.

The starting point is the question of why we should think about metaphysics. Metaphysics needs to be justified because it ostensibly consists of trying to learn the way the world is; but we already have a discipline that does this, it's called science, and it has a much better track record (by just about any measure of success). So the point of metaphysics is taken to be the unification of science.
One of the important things we want from science is a relatively unified picture of the world. We do not assert this as a primitive norm. Rather, we claim, with Friedman (1974) and Kitcher (1981), that it is exemplified in the actual history of science... We refer to the articulation of a unified world-view derived from the details of scientific research. We call this (weak) metaphysics because it is not an activity that has a specialized science of its own. In case someone wants to declare our usage here eccentric or presumptuous, we remind them that we share it with Aristotle.
Scientific practice seeks a unified view, but it does not dedicate specialists to perform the unification (you might think physicists are such specialists- more on that later). Ladyman et al. propose that philosophers do the job. In order to be of service to scientists in this way, philosophers need to listen to the best science, instead of speculating from intuition.
A lot of metaphysics has been performed from the armchair even though we know this is a bad way to try to understand the world: The criteria of adequacy for metaphysical systems have clearly come apart from anything to do with the truth. Rather they are internal and peculiar to philosophy, they are semi-aesthetic, and they have more in common with the virtues of story-writing than with science.
This is really polemical! But it's also very, very true. As the authors will show, a lot of modern "neo-scholastic" metaphysics starts from premises that are totally deprecated by modern physics. (Aside: it might still be worthwhile to pursue neo-scholastic metaphysics! It would just be for the purpose of thinking out "hard magic systems" rather than telling scientists what to do)

In particular, people get stuck on thinking there have to exist individuated objects (the "things" in Every Thing Must Go) as ultimate building blocks to which everything else must be reducible, an asymmetrical flow of time, the Principle of Sufficient Reason, and so forth. The arguments against these things actually make use of pretty standard twentieth-century physics. Quantum mechanics, and in particular Bell's theorem, casts doubt on whether it makes sense to talk about quantum states/particles/properties having any kind of definite existence independent of the structures and relations that define them. Relativity, and in particular the relativity of simultaneity, casts doubt on whether it makes sense to even talk about a single "present" that divides the past from the future. Yet metaphysicians have continued to argue for things like atomic reductionism and presentism.
Precisely what physics has taught us is that matter in the sense of extended stuff is an emergent phenomenon that has no counterpart in fundamental ontology. Both the atoms in the void and the plenum conceptions of the world are attempts to engage in metaphysical theorizing on the basis of extending the manifest image.
More withering is this point:
If it really doesn’t matter that classical physics is false then we might as well do our metaphysical theorizing on the basis of Aristotelian or Cartesian physics.
At this point, I was totally on board that metaphysicians should listen more to scientists, but it remained to be seen that the authors could demonstrate that their alternative proposal (for what metaphysicians could do) would be in fact possible or genuinely useful. So I'll turn to their positive program.

The first thing you need to do is distinguish science from non-science. I think that their proposal here is practical, given the difficulty inherent in the demarcation problem.
Science is, according to us, demarcated from non-science solely by institutional norms: requirements for rigorous peer review before claims may be deposited in ‘serious’ registers of scientific belief, requirements governing representational rigour with respect to both theoretical claims and accounts of observations and experiments, and so on... Since science just is our set of institutional error filters for the job of discovering the objective character of the world—that and no more but also that and no less—science respects no domain restrictions and will admit no epistemological rivals (such as natural theology or purely speculative metaphysics). With respect to anything that is a putative fact about the world, scientific institutional processes are absolutely and exclusively authoritative... To reiterate: we assume that the institutions of modern science are more reliable epistemic filters than are any criteria that could be identified by philosophical analysis and written down. Note that we do not derive this belief from any wider belief about the reliability of evolved human institutions in general. Most of those—governments, political parties, churches, firms, NGOs, ethnic associations, families, etc.—are hardly epistemically reliable at all. Our grounding assumption is that the specific institutional processes of science have inductively established peculiar epistemic reliability.
As good naturalists, they justify taking this institutional point of view by appealing to the empirical success of the institutions themselves. No other point of view will do, given the premise of naturalism. This motivates their more precise statement about what metaphysics is all about, the Principle of Naturalistic Closure (PNC):
Any metaphysical hypothesis that is to be taken seriously should have some identifiable bearing on the relationship between at least two relatively specific hypotheses that are either regarded as confirmed by institutionally bona fide current science or are regarded as motivated and in principle confirmable by such science.
There are a few other constraints on what would constitute a proper metaphysics.

One is the Primacy of Physics Constraint (PPC). Metaphysics has to explain why physics has an asymmetrical relationship with the special sciences, but it also has to vindicate the special sciences as doing actual science even though they are not "fundamental". Note that defining why physics is "fundamental" in this way can't be a form of reductionism anymore, because that would violate the PNC as discussed before.

Another constraint is that it has to explain why scientific progress is possible given the Pessimistic Meta-Induction. On a naive realist view, science tends to disagree with its past self about what sorts of things exist. So we would need to find a warrant for believing in whatever we say current science says exists.

Opposite to this constraint is the "No Miracles" argument. We have to explain why science is so successful if the naive realist view is false. If science isn't about things that actually exist, then why are we capable of making predictions that come true?

Ladyman et al. put forth "ontic structural realism" (OSR) as a basis for a sound metaphysics given these constraints. What is conserved across scientific revolutions is the abstract (read: mathematical) structures of theories, not any objects that theories are said to be "about". As theories are revised, there is a sort of homomorphism between the old and the new. Think of how special relativity maps onto Newtonian mechanics in the limit v/c -> 0; nothing is said about the existence of specific objects obeying either of these structures. For example, the issue of whether photons exist as particles in the Newtonian theory (and therefore whether such Newtonian photons would be relegated to non-existence in the updated theory) is dodged. So this is how the Pessimistic Meta-Induction is addressed: structures accrete, objects are subject to agnosticism. As far as the "No Miracles" argument is concerned, the "ontic" in "ontic structural realism" denotes a commitment to objective modality. That is, science needs to make accurate predictions about things that haven't happened or might not happen; it needs to support counterfactuals about our world. So there really must be such things as "the way the world would be had we done X". And we really are referring to this world, not "possible worlds": "possible measurements everywhere in this world, the one world in which we can take measurements". So the structures we are assigning "reality" to must include such modality.
Objective modalities in the material mode are represented by logical and mathematical modalities in the formal mode. All legitimate metaphysical hypotheses are, according to us, claims of this kind. A metaphysical hypothesis is to be motivated in every case by empirical hypotheses that one or more particular empirical substructures are embedded in (homomorphic to) particular theoretical structures in the formal mode that represent particular intensional/modal relations among measurements of real patterns.
At this point, I was wondering how these structures would hook into the empirical world. So far, there hasn't been a discussion of what separates an actually existing structure from a nonexistent one. I was anticipating that the argument might crumble as it was forced to tie itself back to putative "objects". As far as I can tell, the empiricist or verificationist commitment is addressed by the authors by using the idea of a "locator". "Locators" are pragmatically defined (so, their meaning is their use) and purport to direct measurements to some part of the world. So, they are objectively existing words, coordinates, etc. If the abstract data that real scientists obtain from a locator has a certain structure, then that structure is a candidate for a "real pattern" (i.e. an actually existing structure).

What distinguishes "real patterns" from patterns in data is their counterfactual robustness, their non-redundancy, and their informational content. I found a helpful example here in Conway's Game of Life. Now, Life does have a fundamental ontology of "cells" to which everything is reducible, but that's not the aspect that interests us here. What's interesting is the patterns that researchers have named, like "blinkers" and "gliders". Gliders are objectively "patterns" because even though they admit a low-level description in terms of the rules of Life, they can be described in a more compressed way if their macroscopic pattern is described directly. It is part of the definition of a "real pattern" that its structure can be described using fewer bits of information than it would take to describe its substrate (and the substrate would be another real pattern- it's "real patterns all the way down", say Ladyman et al., since science can't gather evidence for any other kind of thing). The informational characterization of "real patterns" has the benefit that it's PNC-compliant; scientists have proposed information as part of the basic structure of the world.
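Since the glider example carries the argument here, a compact sketch may help (my own toy implementation of Conway's standard rules, not code from the book): naming the five-cell glider and its law of motion is a far shorter description than enumerating the grid's cell states step by step.

```python
from collections import Counter

def life_step(cells):
    """One update of Conway's Game of Life; `cells` is a set of live (x, y) coordinates."""
    neighbour_counts = Counter((x + dx, y + dy)
                               for (x, y) in cells
                               for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                               if (dx, dy) != (0, 0))
    # A cell is live next step if it has 3 live neighbours, or 2 and is already live.
    return {c for c, n in neighbour_counts.items() if n == 3 or (n == 2 and c in cells)}

# A glider: a five-cell pattern that reappears, displaced one cell diagonally, every four steps.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = life_step(state)
print(state == {(x + 1, y + 1) for (x, y) in glider})  # True: same pattern, shifted
```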

One of the benefits of this metaphysics is that it accounts for "scale relativity of ontology". For example, atoms, humans, and economies would (modulo any updates to current science) qualify as "real patterns". Different ontologies reign at different parameter values (e.g. time and length scales). This evokes what "disunity theorists" like Nancy Cartwright maintain (that there is no "ultimate ontology" at all), but disagrees in the sense that it holds out hope for the existence of a true physics. Fundamental physics would be a set of mathematically specified structures without self-individuating objects, where any measurement taken anywhere in the universe is in part measurement of these structures.
That is, any locator at all should point to real patterns studied by fundamental physics. So instead of "disunity", we simply have different genuine science at different scales, in a way that satisfies the PPC.

There are lots of interesting applications of these ideas. One is to eliminate from metaphysics any speculation about matters inaccessible to science.
if the universe is limned by a singularity, as physics suggests it is, then the explanation of the fact of the universe’s existence cannot be speculated upon in a PNC-compatible way; speculation here is empty.
Another is a speculation on how to save the idea of causality within the special sciences even though it may lack global physical justification. Why is it that in our mesoscale lives, things appear to have causes?

For a non-reductionist, the failure of causal concepts to appear in generalizations of fundamental physics doesn’t imply that these concepts simply denote fictions. As discussed in 4.5, all our evidence tells us that we live in a region of the universe—which might or might not be coincident with the whole universe—in which the degrees of freedom of every system we approximately isolate for measurement and projection is restricted by various asymmetries. This region is highly isotropic, and supports robust (that is, projectible across all counterfactual spaces within the boundaries of the region) distinctions between sources and recipients of information. That is sufficient to establish the scientific utility of directional flow [in our region]. This in turn implies that many of what we refer to as causal processes are real patterns, even if this does not mean that they exemplify any extra-representational real pattern of general ‘causation’.
Overall, it is deeply satisfying to see lucid discussion of the way our world might be that actually takes into account what real, up-to-date science has to say about it.
