
Generalized measures were introduced by Gleason [2] in the context of his characterization of the measures definable on the closed linear subspaces of Hilbert space. The analysis of the three-dimensional case proved to be fundamental. For this case, a generalized measure is a map f from the closed linear subspaces of $H_3$ to the closed unit interval satisfying the conditions

$$ fa + fb \le 1 $$

for a ⊥ b, and

$$ fa + fb + fc = 1 $$

for any three rays a, b, c which are mutually orthogonal. A generalized two-valued measure takes values in {0, 1}.Footnote 1 The interpretation of such measures as probability measures arises when the rays are taken to represent propositions; and a generalized two-valued probability measure is a generalized truth-value assignment when 1 and 0 are interpreted as Truth and Falsity. It is evident that generalized two-valued measures and generalized truth-value assignments are formally interchangeable with one another, whatever the conceptual differences between probability measures and truth-value assignments.
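
The quantum-mechanical origin of such measures can be made concrete with a small numerical check. The sketch below is an illustration added here, not part of the original argument: it takes the Born-rule measure $f(a) = |\langle a|\psi\rangle|^2$ determined by a unit vector ψ of $H_3$ and verifies that it satisfies both conditions, since the projectors onto any orthonormal triple of rays sum to the identity. The random sampling and helper names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unit_vector(dim=3):
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

def random_orthonormal_triple(dim=3):
    # The QR decomposition of a random complex matrix yields a unitary whose
    # columns serve as three mutually orthogonal rays a, b, c of H_3.
    m = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, _ = np.linalg.qr(m)
    return q[:, 0], q[:, 1], q[:, 2]

psi = random_unit_vector()
f = lambda ray: abs(np.vdot(ray, psi)) ** 2          # Born-rule generalized measure

for _ in range(1000):
    a, b, c = random_orthonormal_triple()
    assert f(a) + f(b) <= 1 + 1e-9                   # condition on orthogonal pairs
    assert abs(f(a) + f(b) + f(c) - 1) < 1e-9        # condition on orthogonal triples
print("Both conditions hold for every sampled triple.")
```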

The focus of this paper is a particular feature of the statistical behavior of elementary particles, simple composite systems of them, and the quantum probability theory to which their behavior gives rise. This feature was given its canonical formulation by Kochen and Specker [4] in the course of an investigation of the problem of hidden variables. It is captured by their principal theorem (Kochen and Specker [4], Theorem 1) and the discussion of it in Section IV of their paper; and it consists in the fact that there exist simple systems of particles and finite combinations of propositions “belonging to them” for which no generalized two-valued measures are possible, where a proposition belongs to a particle if its constituent dynamical property is a possible property of the particle. The assumption of such propositions expresses the idea that the systems which they describe are characterizable by systems of properties which are uncovered when the systems are probed. Hence the notion of a proposition belonging to a particle supports the idea that measurements reveal a particle’s dynamical properties. I will argue that the significance of the existence of systems of the sort that underlie the Kochen-Specker construction is to show that the generalized probability measures that arise in quantum mechanics are not naturally interpretable as the probabilities of propositions belonging to particles. (The notion of a proposition belonging to a particle is developed further below in conjunction with the explanation of the notion of a natural interpretation.) The idea I will develop is that quantum probabilities are probabilities of “effects,” probabilities of the traces of particle-interactions with objects and processes that are epistemically accessible to us in a sense which I will explain. I hope to make it clear that such a view is not committed to anti-realism about the micro-world and that it illuminates at least one otherwise paradoxical feature of quantum mechanics.

I should emphasize that the focus of this paper is not the notion of an effect as it pertains to the study of effect algebras, but the question whether the probabilities that arise in quantum mechanics should be understood to apply to propositions which express the properties of particles or whether they should be understood to apply to propositions which express the effects of particles. The issue I intend to address is not whether the probabilities of quantum mechanics concern propositions in any sense, but whether they concern propositions belonging to particles. I will argue that they do not, and I will explain the kind of proposition with which they are concerned. The terminology of effects offers a simple mnemonic device with which to mark this different kind of proposition.

It is true that classical systems are themselves composed of particles whose quantum probability measures have the peculiarities just noted. And it is also true that quantum mechanics is essential to the correct theoretical description of classical systems. However, I wish to defer the questions ‘How should classically described systems be subsumed under quantum mechanics?’ and ‘How do classically described systems enter into the measurement process?’ A burden of the discussion to follow is to clarify the view that there is a basic conceptual difference between classical states and quantum states, between what is represented by a point in phase space and a vector in Hilbert space. The premature consideration of the measurement problem and the quantum theory of classically described systems has a tendency to mask this conceptual difference. For these and other reasons, the conceptual issues raised by these questions are best taken up after the impossibility of two-valued measures has been considered. Although I will not argue directly for this thesis here, the discussion which follows is intended to support the view that the ψ–function represents a state of belief about a system rather than its physical state.Footnote 2

The discussion of measurement and the issues it raises can be deferred, as can the discussion of the relationship between classical and quantum states, since even if quantum mechanics is a theory of the fundamental constituents of matter, the evidence for the theory can perfectly well come from our experience with things for which we do not possess a quantum-theoretical account. And in fact the phenomena relevant to the present discussion were either known prior to the theory’s discovery and elaboration or are easily elicited with only a very modest contribution from developments that the theory initiated. Our problem is how to understand the real possibility of physical systems for which there are finitely many direction-dependent propositions,

$$ (P_\alpha^*)\quad \text{The square of the spin in the direction } \alpha \ne 0, $$

which are so related that there is no generalized two-valued measure definable on them. These direction-dependent propositions are arranged in families of Boolean algebras of which the largest families are generated by three atomic propositions, each associated with one of three mutually orthogonal directions x, y, z of ordinary physical space. (The propositions $P_\alpha^*$ are “co-atoms,” the Boolean complements of algebraically atomic propositions.) To each such family there corresponds an operational procedure which is interpretable as offering a means of detecting which of the propositions of the Boolean algebra are true and which are false. This is not merely a theoretical possibility, but one that comes close to being actually realizable in the laboratory. Kochen and Specker [4] show this for an atom of orthohelium whose total angular momentum is given by its spin. The atom is a spin-1 system whose spin components in three mutually orthogonal directions of space are not commeasurable, but the squares of whose spin components in any three mutually orthogonal directions are commeasurable.
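
The algebraic facts behind this claim can be checked directly. The following sketch assumes the standard spin-1 matrices with ħ = 1 (an assumption supplied here, not taken from the paper): the spin components fail to commute, while their squares commute and sum to $s(s+1) = 2$ times the identity, which is why the squares along an orthogonal triple are commeasurable.

```python
import numpy as np

r = 1 / np.sqrt(2)
Sx = r * np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=complex)
Sy = r * np.array([[0, -1j, 0], [1j, 0, -1j], [0, 1j, 0]], dtype=complex)
Sz = np.diag([1.0, 0.0, -1.0]).astype(complex)

def commutator(a, b):
    return a @ b - b @ a

print(np.allclose(commutator(Sx, Sy), 0))             # False: components not commeasurable
print(np.allclose(commutator(Sx @ Sx, Sy @ Sy), 0))   # True: their squares are commeasurable
print(np.allclose(Sx @ Sx + Sy @ Sy + Sz @ Sz, 2 * np.eye(3)))   # True: squares sum to 2I
```

Since each square has eigenvalues 0 and 1 and the three sum to 2I, exactly one of the squared components takes the value 0 in any joint eigenstate.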

The canonical operational procedure for the measurement of a component of spin is a Stern-Gerlach magnet which splits a beam of spin-1 systems into three groups, each corresponding to one of the values −1, 0 or +1. The ideal operational procedure for the direct measurement of the square of a spin component is wholly different from that used for a measurement of a component of spin; it must employ an electric rather than a magnetic field, since it is only in the absence of a magnetic field that the Hamiltonian of the system preserves certain of its essential symmetry properties. Such a direct measurement of the square of a spin component distinguishes spin values of 0 from spin values of +1 or −1, but does not distinguish between the latter two possible values, which is what accounts for the way the propositions $P_\alpha^*$ are formulated.

The ideal measurement procedure for the square of the spin produces an electric field with the crystalline form of an octahedron. Such a field occurs naturally in a crystal of nickel Tutton salts, in which the nickel ion is surrounded by an octahedron of water molecules. An ideal measurement procedure for the atom of orthohelium thus emulates the nickel ion’s environment in the salt crystal by subjecting the orthohelium atom to an external electric field of the same rhombic symmetry as the field inside the crystal. As noted earlier, the use of an electric rather than a magnetic field is important for preserving the symmetry properties of the spin Hamiltonian. In the case of nickel Tutton salts it is standardly assumed “that, in the absence of a magnetic field, [the Hamiltonian of the crystal exhibits] rhombic symmetry [; i.e.,] it is possible to choose rectangular co-ordinates Ox, Oy and Oz such that the Hamiltonian is invariant under rotations through π about Ox and Oy” (Stevens [8], p. 238). An ideal test procedure for the square of the spin of the atom of orthohelium would allow one to probe experimentally the behavior of the orthohelium atom as the apparatus is rotated, and the external field turned off and on, by observing the shifts in the electromagnetic spectrum of the atom. One infers the directional properties associated with the dynamical magnitude, square of the spin in the direction α, for mutually orthogonal directions α, from the spectral shifts which result as the atom is subjected to the electric field by the measurement device.

By contrast with actual experiments with a nickel ion in a salt crystal, in this idealized experimental situation involving the atom of orthohelium, the field generated by the measurement device can be applied at orientations chosen at the discretion of the experimenter, with each orientation corresponding to a different orthogonal triple of axes of symmetry. The application of every such test procedure determines that exactly one of the propositions $P_\alpha^*$, for α = x, y, z, is false, and exactly two are true. But on the hypothesis that the families of propositions are related in the way specified, one discovers as one considers all the triples of directions appealed to in the proof of the Kochen-Specker theorem that it is logically impossible that there should be an assignment of truth-values to all these propositions that respects this observation.
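
The impossibility claim has a simple combinatorial form that can be made explicit. The sketch below is illustrative: the helper name and the toy family of triples are assumptions, and the actual Kochen-Specker family (117 directions in their original construction) is far larger. It searches by brute force for an assignment of truth values to the propositions $P_\alpha^*$ in which exactly one member of every orthogonal triple is false; the theorem says that for the Kochen-Specker family no such assignment exists.

```python
from itertools import product

def admits_assignment(triples):
    # Brute-force search for a truth-value assignment to the propositions
    # P_alpha* (indexed by direction) such that exactly one member of each
    # orthogonal triple is false, i.e. exactly one square-of-spin value is 0.
    directions = sorted({d for t in triples for d in t})
    for values in product([True, False], repeat=len(directions)):
        truth = dict(zip(directions, values))
        if all(sum(not truth[d] for d in t) == 1 for t in triples):
            return truth
    return None

toy_triples = [("x", "y", "z"), ("x", "ty", "tz")]    # hypothetical labels; this family is satisfiable
print(admits_assignment(toy_triples))
```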

Now it may be that a complete understanding of the conceptual innovation occasioned by the discovery of systems whose behavior shares this feature of the behavior of an atom of orthohelium will require invoking the quantum theory of the measuring instrument and other classically describable systems. But this does not affect the point that the interpretive problem raised by the statistical behavior of orthohelium is conceptually separable from any such account. The interpretive puzzle depends on the possibility of forcing an interaction with an electric field of specified character and then noting how the interaction gives rise to changes in the atom’s electromagnetic spectrum. All of this is characterizable at a pre-quantum-mechanical level of description. In light of these considerations, it seems reasonable to conclude that the problem posed by the Kochen-Specker theorem—the problem of understanding the significance of systems whose propositions do not admit generalized truth-value assignments—is not conceptually dependent on the provision of a quantum theory of measurement.

My plan in the balance of the paper is to argue in support of a conceptual framework that provides a solution to this interpretive problem, not by providing a hidden variable theory that is a counter-example to the theorem, but by providing a framework which yields a natural interpretation of the impossibility of such truth-value assignments, where, by a natural interpretation, I mean one that does not violate any of the following three desiderata:

  • Determinacy: Every proposition which attributes a possible dynamical property to a particle (i.e., every proposition which, in the special sense noted earlier, belongs to a particle) is determinately true or false. In particular, if P is a disjunction of propositions, each disjunct of which attributes a possible point value of a dynamical variable, and if the disjuncts exhaust all possible point values, then if P is true, exactly one of its disjuncts must be true. The intuition that supports determinacy is that while it makes perfect sense, when thinking of a fictional world, to treat some propositions belonging to its inhabitants as neither true nor false, this is precluded when we are concerned not with fiction, but with reality. This is because the failure of determinacy is one of the marks that separates our concept of a fictional world from the real one.

  • Objectivity: Dynamical properties are indicated by a variety of experimental conditions and operational criteria. The methodological basis for objectivity rests on two considerations: (a) every property requires a clear physical criterion for saying when it holds and when it fails to hold; (b) an objective property must have some degree of conceptual independence from the procedures for determining its presence or absence, since the same property must be accessible in different measurement contexts and by alternative measurement procedures. For an interpretation of the theory to be based on the attribution of dynamical properties to physical systems, it is necessary that there should be a conceptual gulf between properties and the procedures which probe their presence or absence. Accessibility in a variety of experimental contexts is not only largely constitutive of what we mean by the objectivity of properties, it is also presupposed by standard forms of counterfactual reasoning about them. For example it is presupposed when we ask whether a property would have obtained had a different operational procedure been applied, or when we ask whether the property would have obtained had the presence of another property been investigated. The possibility of such reasoning is an essential component of the objectivity we associate with physical properties. In the extreme case, where each property is tied to a single operational procedure, this aspect of our concept of objectivity is given up, since admitting only a single operational procedure is tantamount to abandoning the idea that the same property may be presented differently. The connection between objectivity and the existence of a variety of operational indicators suggests that there are degrees of objectivity, corresponding to the multiplicity of different operational procedures that are indicative of a property’s presence or absence. As we will see, interpretations may differ on the degree of objectivity which they accord the dynamical properties of physical systems.

  • Observer independence: The reality that attaches to particles is an observer-independent reality; this holds as well for their physically important properties. The observer independence of particles is so closely tied to the objectivity of their properties and the determinacy of propositions involving them that it is generally assumed to be undermined when these desiderata are violated.

I will assume without further argument that an interpretation of the impossibility of generalized truth-value assignments is successful to the extent that it is a natural interpretation in the sense just explained. I intend to show that if the generalized probability measures of quantum mechanics are understood to apply to propositions belonging to particles, it is not possible to frame a natural interpretation of the theory. I will argue that there is an alternative account of the domain over which such probabilities are defined that leads to a natural interpretation. This is the interpretation of probabilities as probabilities of effects. The burden of this paper is to show that such an alternative interpretation does not violate determinacy or objectivity, and that it also satisfies observer independence. The role of the notion of a natural interpretation in the following analysis is therefore a dialectical one: it is used to show that the framework of propositions belonging to particles cannot support an account of the absence of two-valued measures without compromising determinacy or objectivity. As a consequence, such a propositional framework undermines observer independence. By contrast, the framework of effects interprets the absence of two-valued measures as a simple failure of determinism without calling into question the determinacy of physical propositions, and without compromising our conception of the objectivity of physical properties. But the true measure of the success of an analysis in terms of effects turns on its account of observer independence. This issue is taken up at the end of the paper. Let me begin by considering more closely the desiderata of objectivity and determinacy.

Objectivity, in the sense considered here, is motivated by the idea of contextuality, which has figured especially prominently in discussions of the Kochen-Specker theorem. The issues surrounding contextuality are particularly clear in the case of directional properties and the direction-dependent propositions of which they are constituents. Consider two orthogonal triples of directions in $E^3$, (x, y, z) and (θx, θy, θz), where θx = x but θy ≠ y and θz ≠ z. The direction x in $E^3$ is evidently independent of the family of orthogonal triples—(x, y, z) or (x, θy, θz)—to which it belongs. But in the case of direction-dependent propositions, it is not clear a priori that the identity of a proposition is independent of the other direction-dependent propositions with which its truth is evaluated. The constituent directional properties of the propositions might be associated with distinct measurement procedures, and this might be sufficient to justify distinguishing direction-dependent propositions that are associated with the same direction in space.

For example, for directions α = x, …, θz, consider the direction-dependent propositions,

$$ (P_\alpha)\quad \text{The square of the spin in the direction } \alpha = 0, $$

where the $P_\alpha$ are algebraic atoms and are the Boolean complements of the $P_\alpha^*$, considered earlier. ($P_\alpha = P_\alpha^{**}$, when * is understood as the operation of complementation.) Suppose that the families of directions (x, y, z) and (x, θy, θz) are associated with distinct ideal measurement procedures, one involving the triple of directions x, y, z, the other involving the triple of directions x, θy, θz. The first operational procedure decides the propositions $P_x$, $P_y$, $P_z$, while the second decides the propositions $P_x$, $P_{\theta y}$, $P_{\theta z}$, but there is no measurement procedure that simultaneously decides all the $P_\alpha$. That is, we have that $P_x$ is commeasurable with $P_y$ and $P_z$, and with $P_{\theta y}$ and $P_{\theta z}$, but $P_y$ and $P_z$ are not commeasurable with $P_{\theta y}$ and $P_{\theta z}$. Then contextuality concerns the bearing of measurement procedures on the identity of propositions: Is $P_x$ the same proposition when the operational procedure by which the presence or absence of its constituent property is decided is one that measures $P_x$ in conjunction with $P_y$ and $P_z$ as when the operational procedure is one that measures $P_x$ in conjunction with $P_{\theta y}$ and $P_{\theta z}$? It is certainly possible that the difference between these two measurement procedures is sufficient to show that $P_x$ splits into two propositions, each with its own constituent property, one decided by an operational procedure associated with (x, y, z), the other by one associated with (x, θy, θz).

Now it is simply a fact about quantum mechanics that its statistical states are such that the probability measures they generate are non-contextual. In the present case, this means that quantum states do not distinguish direction-dependent propositions any more finely than Euclidean geometry distinguishes directions of space. But a contextual hidden variable theory is characterized by the fact that it allows for the possibility that propositions are distinguished more finely by the “hidden” (i.e., two-valued) measures such theories introduce than they are by the probability measures of quantum mechanics. The issues raised by the possibility of such theories center on whether it is justifiable to require of the hidden measures they introduce that they too should be non-contextual. Since the hidden measures are mathematically interchangeable with truth-value assignments, this is equivalent to the question whether the constituent properties of the propositions to which truth-values are assigned vary with the measurement context.
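
The non-contextuality of quantum probability assignments can be illustrated concretely for the spin-1 case. The sketch below assumes (a standard fact, though the particular basis, state and rotation are illustrative choices made here) that in a suitable real basis the proposition $P_\alpha$ is represented by the projector onto the ray α, so that its Born probability depends only on the ray and not on whether it is evaluated together with (y, z) or with the rotated pair (θy, θz).

```python
import numpy as np

rng = np.random.default_rng(1)
psi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi /= np.linalg.norm(psi)                      # illustrative spin-1 state

def projector(direction):
    d = np.asarray(direction, dtype=complex)
    d /= np.linalg.norm(d)
    return np.outer(d, d.conj())                # represents P_alpha ("square of spin along alpha is 0")

theta = 0.7                                     # arbitrary rotation about the x axis
x, y, z = np.eye(3)
ty = np.cos(theta) * y + np.sin(theta) * z      # theta-y
tz = -np.sin(theta) * y + np.cos(theta) * z     # theta-z

def born(direction):
    return float(np.vdot(psi, projector(direction) @ psi).real)

context_1 = [born(d) for d in (x, y, z)]        # P_x decided together with P_y, P_z
context_2 = [born(d) for d in (x, ty, tz)]      # P_x decided together with P_theta-y, P_theta-z
print(context_1[0], context_2[0])               # the probability assigned to P_x is the same
print(np.isclose(sum(context_1), 1.0), np.isclose(sum(context_2), 1.0))
```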

Turning to composite systems, there is also an important sense in which the statistical states of quantum mechanics are local. This is not a tendentious remark since it does not contradict the fact that there are quantum states that are non-local in the sense that they violate the inequalities discovered by Bell [9]. The claim that quantum states are local is a simple consequence of the observation that locality is a special case of non-contextuality, the case that concerns the invariance of the probability measures of the theory when one leaves unchanged the local measurement context for one system while varying the test procedure for the system with which it is paired. Locality makes it difficult to invoke the modification of measurement procedures as a justification for distinguishing propositions, since one would have to distinguish propositions belonging to one system on the basis of what properties one chooses to detect by performing a measurement on the spatially separated system with which it is correlated.
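
Read as a special case of non-contextuality, locality amounts to the claim that the marginal probabilities for one subsystem are unchanged when the measurement chosen for its distant partner is varied. A minimal sketch, assuming a two spin-1/2 singlet state and standard Pauli observables (an example chosen here for brevity, not drawn from the paper), illustrates the point.

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def spin_projectors(n):
    # Projectors onto the +1 and -1 eigenspaces of the spin component along unit vector n.
    s = n[0] * sx + n[1] * sy + n[2] * sz
    return (np.eye(2) + s) / 2, (np.eye(2) - s) / 2

singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)   # (|01> - |10>)/sqrt(2)

def prob_A_plus(a, b):
    # Probability that the first particle yields +1 along a, summed over the
    # outcomes of a measurement along b on the second particle.
    Pa_plus, _ = spin_projectors(a)
    return sum(
        float(np.vdot(singlet, np.kron(Pa_plus, Pb) @ singlet).real)
        for Pb in spin_projectors(b)
    )

a = np.array([0.0, 0.0, 1.0])
print(prob_A_plus(a, np.array([0.0, 0.0, 1.0])))   # 0.5 with the distant device along z
print(prob_A_plus(a, np.array([1.0, 0.0, 0.0])))   # 0.5 with the distant device along x
```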

For direction-dependent properties, conformity with what is allowed by the geometry of the associated rays of $E^3$ is a natural measure of objectivity, and by this criterion, quantum mechanics accords the $P_\alpha$ a maximum degree of objectivity since they are individuated exactly as finely as the directions of $E^3$. Hence quantum mechanics permits a much more inclusive class of measurement procedures for a directional property than any of its contextualist rivals. I claim that it is a desirable property of an interpretation that it should preserve this feature of the theory. Although it is not a decisive objection against an interpretation that it requires a multiplicity of propositions $P_x$ on the basis of contextual considerations involving their constituent properties, it does show that within such a framework, the preservation of determinacy necessitates some sacrifice of objectivity. Relativity to the measurement context secures the determinacy of propositions belonging to particles only by compromising the objectivity of some of their constituent directional properties.

It might seem that one could confine the properties that are contextually individuated to a small subset of the directional properties we have been considering. However, it is possible to show that one cannot fix in advance those properties which must be more finely individuated than the directions of space with which they are associated. To be sure, Kochen and Specker’s argument isolates a particular orthogonal triple of propositions and shows how the assumption of a truth-value assignment is inconsistent with the assumption that exactly one of the propositions of the triple is true and the others false. But the argument can also be run backwards in the sense that we can choose a different triple and proceed to construct a Kochen and Specker orthogonality graph so that the argument concludes by applying to the selected triple the observation that exactly one of the propositions $P_\alpha$ is true, and the others false.Footnote 3 Since the choice of orthogonal triple which conflicts with this observation is completely arbitrary, in order to maintain determinacy any square-of-the-spin property may have to be represented by a multiplicity of properties, one for each operational procedure corresponding to a relevant triple of directions. This consequence is a “paradox” of sorts when one considers the realist motivation for securing determinacy together with the fact that for realism the objectivity of physical properties is as fundamental a requirement as the determinacy of the propositions which contain them. That the same property may be presented as the property indicated by a number of different measurement procedures, and that counterfactual reasoning in association with a multiplicity of measurement procedures is legitimate, are no less indispensable to our concept of the objectivity of physical properties than determinacy is to our concept of reality.

To summarize our discussion thus far, in the context of generalized probability measures like those exhibited by quantum mechanics, the desiderata of determinacy and objectivity cannot be maximally satisfied within a framework of propositions belonging to particles; the satisfaction of determinacy involves some sacrifice in objectivity since a maximum degree of objectivity is incompatible with determinacy. As a result, the suggestion that one must give up the idea that particles have an observer-independent reality has held a powerful appeal for both physicists and philosophers of physics.

By way of articulating an alternative to giving up determinacy, objectivity or observer independence, let me begin by separating “eternal” properties of particles from dynamical properties. Eternal properties are never lost: an electron is always a spin-1/2 particle, photons are always spin-1, etc. The possession of such properties is not brought into question by the interpretive problems that are raised by the Kochen-Specker theorem. Rather, it is the ascription of dynamical properties and the notion that they are the subject of the theory’s probability assignments that poses difficulties for a natural interpretation of the theory. The resolution of these difficulties that I will outline gives up the framework of dynamical properties of particles and the notion of propositions belonging to them and replaces it with the framework of effects. Effects constitute the domain of the algebraic structure over which probabilities regarding the behavior of particles are defined. Effects constitute the evidential basis for all our theoretical assertions about particles and simple combinations of them. They are to be thought of as the traces of particle interactions on systems for which we have “admissible” theoretical descriptions in terms of their dynamical properties. (I will return to the notion of admissibility in a moment.) Such systems are epistemically accessible to an extent that systems which are characterized only in terms of their eternal properties and their effects are not.

To see why treating probabilities as probabilities of effects allows for an interpretation of the Kochen-Specker Theorem that leaves intact the desiderata of determinacy and objectivity, recall that the problem of interpreting the theorem arose because effects were implicitly taken to be indicative of a particle’s dynamical properties. This meant that the algebraic structure of the theory was interpreted as an algebra of propositions belonging to particles, and the probabilities of the theory were understood to be the probabilities of such propositions. The objectivity of properties then demanded that had a different effect been elicited, it would have revealed that the particle had a different dynamical property. Proceeding through Kochen and Specker’s sub-algebra of possible propositions, we were led to a contradiction with the observation that for every orthogonal triple of directions x, y, z, exactly one of the propositions $P_\alpha$ is true for α = x, y, z. By moving to the framework of effects we give up the idea that probabilities are assigned to propositions containing a particle’s dynamical properties, and focus instead on the effects to which particles give rise. Effects are determinate independently of the determinacy of propositions involving a particle’s dynamical properties, and their objectivity does not depend on counterfactual reasoning involving such properties. In a framework in which generalized probability measures are defined on effects, the problems posed by determinacy and objectivity are avoided since there is nothing in the concept of an effect to require that effects should obtain in the absence of the interactions in which they are found. The effects framework has no analogue of a state comprised of the totality of dynamical properties as there is in a classical picture of particles and their effects. In particular, there is no assumption of classical trajectories underlying the attribution of an observer-independent reality to particles. The effects framework is agnostic about all such classical pictures.

Despite its agnosticism on questions of ontology, the effects framework offers a subtle account of the nature of the conceptual shift from classical to quantum mechanics. The transition to an effects framework consists in replacing the characterization of a particle by a list of its dynamical properties with one according to which a particle’s characterization has the logical form of a function: when presented with an experimental idealization of some naturally occurring situation, particles are characterized not by changes in their dynamical properties, but by the effects they produce. The fact that these effects cannot be anticipated with certainty is understood within the effects framework as the unsolvability of the following Problem of Determinism: Given a particle and a class of experimental procedures, to predict particle-effects with perfect knowledge, i.e., to predict with probability 0 or 1, uniformly and without foreknowledge of the experimental procedure to which a particle will be subjected, the answer to every question regarding the occurrence of a possible effect. The no hidden variable theorem of Kochen and Specker (together with the related theorems inspired by the work of Bell) shows that the quantum probabilities of such effects are not compatible with the existence of a two-valued measure which solves the Problem of Determinism. Applied to the example of an atom of orthohelium, this means that within the framework of effects the atom is represented not by a collection of dynamical properties but by a function which, when presented with an orthogonal triple of directions associated with the axes of symmetry of an electric field, produces an effect consisting of a shift in its spectrum. In the propositional framework, this shift is taken to be indicative of the truth of exactly two of the propositions,

$$ (P_\alpha^*)\quad \text{The square of the spin in the direction } \alpha \ne 0, $$

attributing dynamical properties to the atom. But this is precisely the interpretive step that is resisted by the effects framework: So far as probability assignments are concerned, there are only effects—in the present case, shifts in the spectrum of the atom—which the atom’s interaction with the field induces.
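
The functional characterization just described, together with the unsolvability of the Problem of Determinism, can be given a schematic rendering. In the sketch below the atom is modeled, purely illustratively, as a response function from orthogonal triples of directions to effects, sampled with the Born probabilities; the real-basis representation of the squared-spin propositions and the randomly chosen state are assumptions introduced for the example. The outcomes vary from trial to trial, which is the sense in which no prediction with probability 0 or 1 is available.

```python
import numpy as np

rng = np.random.default_rng(2)
psi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi /= np.linalg.norm(psi)                      # illustrative state of the atom

def respond(triple):
    # triple: three mutually orthogonal unit vectors.  Returns the index of the
    # one direction registered as "square of the spin equals 0", sampled with
    # the Born probabilities |<alpha|psi>|^2.
    probs = np.array([abs(np.vdot(d, psi)) ** 2 for d in triple])
    return int(rng.choice(3, p=probs / probs.sum()))

x, y, z = np.eye(3)
print([respond((x, y, z)) for _ in range(10)])  # the elicited effect varies from trial to trial
```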

It is important to see how an interpretation in terms of effects bears on realism in view of the fact, already noted, that such an interpretation does not situate the theory within an “ontology.” But before turning to the question of observer independence, let me review an analogy which may clarify the status of realism in the present approach.

Imagine that we are concerned to construct a model of past events for which there is very little basis to assume that they resemble the events with which we are familiar. Let us also assume that the traces of these events are accessible to us only in fragments that can be examined one at a time, that the information contained in any one fragment is insufficient to determine a complete account of the events which produced the traces which comprise it, and that the traces are themselves continually changing. Assume further that the fragmentary traces do not combine to give a single consistent story regarding the events at this earlier time. Now suppose it is discovered that although the past is in this way “hidden” from us, our epistemic situation with respect to its traces is systematic and even susceptible of a relatively simple representation. Although systematic, the representation of available traces not only fails to facilitate the reconstruction of the past state of the world, but actually precludes the possibility of a consistent reconstruction on its basis. Under such circumstances we might cease looking for a representation of a past state in terms of the properties that hold of it because we will have come to recognize that there can be no convergence from present or future traces to such a representation. We might then dispense with the search for a theory of such states and focus instead on understanding the distribution of present traces, their relevance to one another, and the task of predicting their likely evolution. This would be a theory of past events of a sort, but not what we had originally imagined such a theory would be like. In particular, it would not aim to model the past, but to anticipate its present and future traces. To recover quantum mechanics from the analogy, replace traces with particle-effects, and past states of the world with lists of dynamical properties of particles. Then two things are worth noting: neither such a theory of traces of the past nor the quantum theory of effects contravenes the thesis of determinacy, and where the one theory accepts the reality of the past, the other accepts the reality of the micro-world. In each case, one has merely abandoned a familiar style of theorizing and the modeling associated with it.

Our discussion of the Kochen-Specker theorem, and our resolution of the conceptual difficulties it poses for determinacy and objectivity, are predicated on the idea that the probability assignments of the theory are not interpretable as probabilities of propositions belonging to particles; rather, there is a domain of effects which are epistemically accessible to us in a way in which dynamical properties of particles have been shown not to be, and it is these effects that are the proper subject of the theory’s probability assignments. This leaves a large residual issue that we must now address.

I have said that effects are marks or traces particles leave on certain physical systems, and these systems and the traces of their interactions constitute the epistemic basis for our evaluation of our quantum mechanical descriptions of the behavior of single particles and simple composite systems of them. But what is the status of the systems which record the effects of particles? Since they are merely complex systems of large numbers of particles, on the assumption that quantum mechanics is a truly universal and fundamental theory, shouldn’t they also fall within the purview of the theory?

The issue these questions raise is a familiar one: it is the issue the early founders of the theory addressed with the doctrine of the indispensability of classical concepts and the necessity of locating the “cut” between physical systems and observers. For them, both the question of the location of the cut and the indispensability of classical concepts arose because of epistemological considerations which had their source in the special status they supposed quantum mechanics assigns observers. They argued that the role of the observer in quantum mechanics is utterly unlike the situation in classical mechanics where it is possible to proceed on the assumption that all physical systems fall within the range of the theory, and where it is possible to treat observers as altogether absent from the application of the classical framework, except insofar as they may happen to occur among the physical systems the theory encompasses.Footnote 4 But if the very notion of an effect requires reference to observers and to a preferred, extra-quantum-theoretical, characterization of the physical systems accessible to them, an interpretation of the theory in terms of effects can hardly be advanced as one that restores observer independence to the theory’s interpretation. So although the notion of an effect may provide solutions to the problems of determinacy and objectivity, a more elaborate argument is needed to show that the notion has anything new to offer regarding the problem of observer independence.

Let me address this objection by considering first the use of classical concepts. There is a general observation which it is easy to lose sight of. It is that any description of the phenomena we wish to explain is admissible just in case, reasoning in accordance with generally recognized methodological norms, we are able to reach agreement on the correctness of its application in any particular case. This is simply the non-operationalist core of the methodological framework of Einstein’s analysis of simultaneity. Provided our descriptions meet this admissibility condition, there need be nothing methodologically questionable about the continued, or even exclusive, use of classical concepts for the description of the phenomena we are interested in explaining. Our discussion of the atom of orthohelium showed that there is a family of descriptions of the relevant phenomena, expressed in terms of classical concepts, that satisfy this admissibility condition. But to concede that classical mechanics is a descriptive framework that supplies admissible descriptions of the phenomena we seek to explain does not preclude the possibility that we may uncover a framework that revises these descriptions and is in some sense more “fundamental” than the classical one. All that is required to justify the classical framework in this evidentiary role is that there should be consensus about the application of its descriptions. If this observation about admissible descriptions is accepted, the doctrine of the indispensability of classical concepts raises at least two questions: (i) In what sense, if any, is the quantum mechanical framework more fundamental than the classical one? (ii) What feature distinguishes the descriptive framework of classical mechanics and makes it not just well-suited but indispensable to the provision of admissible descriptions of at least some of the phenomena quantum mechanics is used to explain? The answer to this second question will direct us to an answer to the first, so let us begin with it.

The classical framework involves both dynamical properties and effects of the systems with which it deals. This ability to encompass a system’s properties as well as its effects is a consequence of the deterministic character of the classical framework. Here, as before, by the determinism of the classical framework, I mean that feature of it that admits the presence of dispersion-free pure states in the form of two-valued measures on the totality of propositions belonging to a classical mechanical system. The mathematical fact that two-valued probability measures are interchangeable with truth-value assignments entails that it is always possible to represent the state of a classical system in terms of the totality of its dynamical properties: these are the constituents of the propositions which a truth-value assignment (on the Boolean algebra of propositions belonging to the system) maps to Truth. In the case of quantum mechanical systems of particles and simple combinations of them, the absence of such measures led us to interpret the theory’s probability assignments as probabilities of their effects, rather than of propositions involving their dynamical properties. But to describe the effects of particles we need a framework whose systems are represented by their dynamical properties, since it is the properties of these systems that constitute particle-effects.
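
The interchangeability of two-valued measures and truth-value assignments in the classical case can be illustrated with a toy example. The sketch below uses hypothetical names introduced only for illustration: a phase-space point induces a dispersion-free, two-valued measure on propositions about a classical system, assigning 1 exactly when the point lies in the corresponding region, and the assignment respects the Boolean operations.

```python
def two_valued_measure(state):
    """Dispersion-free measure determined by a single phase-space point."""
    def mu(proposition):                 # proposition: a predicate of phase-space points
        return 1 if proposition(state) else 0
    return mu

state = (1.2, -0.4)                      # toy one-particle state: (position, momentum)
mu = two_valued_measure(state)

q_positive = lambda s: s[0] > 0          # "the position is positive"
p_positive = lambda s: s[1] > 0          # "the momentum is positive"

print(mu(q_positive))                                            # 1: true of this state
print(mu(p_positive))                                            # 0: false of this state
print(mu(lambda s: q_positive(s) or p_positive(s)))              # 1: respects disjunction
```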

From the perspective of the effects framework, the conceptual dependence of quantum mechanics on classical mechanics is a result of the conceptual dependence of descriptions of the systems which record effects on descriptions involving the dynamical properties of these systems. This kind of conceptual dependence does not preclude the application of quantum mechanics to systems that record effects. Although it is largely a matter of convenience which systems are, and which are not, taken to record effects, it is not wholly a matter of convenience. The development of the quantum theory shows that there are systems that are resistant to a satisfactory classical description—this is the content of Kochen-Specker—and as a result, such systems lack admissible descriptions of the kind we require for the description of the phenomena the quantum theory is invoked to explain.

As for the cut between classical and quantum systems, this also is mandated by the fact that quantum mechanics is a theory devised for the explanation of the behavior of systems that are inherently indeterministic in the sense that (i) they are represented in the theory by their non-dynamical properties and their effects, and (ii) their effects are such that they do not admit a solution to the Problem of Determinism. So far as the conceptual issues raised by the interpretation of the theory are concerned, the basis for the cut lies in the methodological demand for admissible descriptions of the appearances we hope to save.

Although the framework of effects assumes that there are two distinct kinds of system, this is compatible with the thesis that reality is unitary and quantum-mechanical. The reason for this compatibility is that the notion of a system is relative to a theoretical representation. There can be both classical and quantum mechanical systems because to assert that there are classical systems is to claim that theoretical representations expressed in the framework of classical mechanics yield what I earlier characterized as admissible descriptions. Nothing in this formulation precludes the possibility that a representation hitherto formulated within classical mechanics might be replaced by a quantum mechanical representation. Whether reality is “captured” by classical mechanics is a separate question, one whose answer may be ‘No’ compatibly with the descriptions of classical mechanics being admissible.

Since the effects framework locates the classical-quantum cut at the level of differences in theoretical representation, there is no incompatibility between the thesis that there are admissible classical mechanical descriptions and the thesis that reality is quantum mechanical. According to the framework of effects, the world presents us with appearances that we attempt to “save” and that we represent as classical systems. The framework leaves open the empirical question of why it is that the world appears to be amenable to descriptions that are expressible in classical mechanics. But whatever the character of its appearance, supposing quantum mechanics is true, reality itself has all the peculiarities quantum mechanics says it has. This does not mean that quantum mechanics presents us with a “picture of reality”—no theory does—only that quantum mechanics has identified salient aspects of reality, aspects that are missed by classical mechanics. Bohr is supposed to have said: “There is no quantum world. There is only an abstract quantum mechanical description.”Footnote 5 From the perspective of the framework of effects, the situation is rather that there is no classical world, only an abstract classical mechanical description.