""http://www.w3.org/TR/html4/loose.dtd" The Philosopher's Annual



Volume XXX
Introduction


The goal of The Philosopher’s Annual is to gather the ten best philosophical articles of the last year. Articles of unusual merit are put forward by the Nominating Editors, and those articles are collected for a second review by the Nominating Editors as a whole. Each of the Editors then reads the top-ranked thirty or so articles selected by that process. Over several days of extended argument, we meet to decide which ten articles stand out as the best of the bunch.

It would be hubris to suppose that our method is foolproof or that we have achieved our goal of ferreting out the ten best philosophical articles of 2010. A project this ambitious requires a bit more humility than that. This year’s selection calls for special comment in that regard, however. Something immediately obvious in the selection is that an unreasonably large number of the articles are in formal epistemology. Value theory, we are sorry to say, is entirely unrepresented.

As editors, we took our job to be the selection of the ten best articles from those most favored by the nominating editors, irrespective of area. This year, it so happened that a very large number of the articles from that batch were from areas outside of value theory, with formal epistemology very strongly represented. We stand by our estimation that the ten articles collected here are the cream of that crop. But we also recognize a topic bias in this year’s selection, and will attempt to guard against such a bias in future volumes.

Some Notes on This Year’s Selections

In “Attention and Mental Paint,” Ned Block argues that recent empirical findings on the effects of attention on the phenomenal character of perception raise problems for direct realism and representationism. These findings show that shifts of visual attention – even when one keeps one’s eyes fixed on a scene, with no relevant changes in sensory input – can change the phenomenology of perception. Block argues that these changes in perceptual phenomenology suggest that there is mental paint: qualities of perception that cannot be captured in terms of the objects and properties one is aware of, or one’s representations of them.

In “Decision-Theoretic Paradoxes as Voting Paradoxes,” Rachael Briggs attempts to build an important conceptual bridge between theories of decision and of judgment aggregation. She offers both a formal outline and an informal gloss on Evidential decision theory EDT (Jeffrey), Causal decision theory CDT (Gibbard and Harper, Lewis, Sobel, and Joyce), and a newer Benchmark theory BT (Wedgwood). Each of these, she argues, faces its own class of intuitive counter-examples. Her diagnosis of the problem is that each corresponds to a voting rule in which one makes a decision based on a poll of one’s future possible selves. Because which future self is actualized depends on the decision made, future selves serve both as voters and as candidates for office. But just as no voting procedure can jointly satisfy certain intuitive desiderata, no decision procedure can jointly satisfy parallel constraints. Briggs outlines a Pareto constraint P: “if all of an agent’s future selves like A1 at least as well as A2, and some prefer A1 to A2, the agent should prefer A1 over A2.” She also outlines a ‘self-sovereignty’ condition S: a decision between A1 and A2 should depend solely on what the agent will believe and desire after performing A1 and what the agent will believe and desire after performing A2. S and P both have intuitive force, and both play a role in motivating the counter-examples. But no completely general decision theory, Briggs argues, can satisfy both P and S. She ends on a wistfully optimistic note: a fully general decision theory may be as unnecessary as it is impossible. EDT, CDT, and BT agree in all but a handful of troublesome cases.
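
For readers who want the formal background, the valuations at issue can be sketched roughly as follows (the notation is ours, not Briggs’s, with Cr the agent’s credences, u a utility function, and the O_i a partition of possible outcomes):

    V_{EDT}(A) = \sum_i Cr(O_i \mid A) \cdot u(A \wedge O_i)
    V_{CDT}(A) = \sum_i Cr(A \mathbin{\square\!\!\rightarrow} O_i) \cdot u(A \wedge O_i)

Here \square\!\!\rightarrow is the counterfactual conditional of the Gibbard and Harper formulation; BT, roughly, evaluates acts as EDT does but scores each outcome by its utility relative to a per-state benchmark rather than by its absolute utility.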

Andrew Chignell’s paper “Real Repugnance and Belief about Things-in-Themselves: A Problem and Kant’s Three Solutions” addresses an apparent puzzle in Kant. On the one hand, Kant holds that our concepts are legitimately applied only to possible objects of experience. (This is because, even if there is no contradiction in the concept itself, the purported object of reference might have incompatible features, displaying what Chignell calls a kind of “real repugnance”. We only know that this is not so when the object shows up in possible experience.) For example, arguments for the existence of God and the soul fail to establish their conclusions because we cannot prove that the premises refer to objects that are really possible. Chignell adds two contemporary arguments to the list of arguments Kant would, for this reason, be skeptical of: arguments from the possibility of “philosophical zombies” to the conclusion that physicalism is false; and arguments from our causal responsibility for certain actions to the conclusion that we have free will (in the incompatibilist sense). But on the other hand, Chignell tells us, this pessimism over the possibility of knowledge in these areas does not lead Kant to refrain from making assertions about them. This is because, while claims about God and the soul cannot be justified to the degree required for knowledge, Kant holds that other, less demanding attitudes towards these claims might be (and, in some cases, are) legitimate. Chignell closes with a catalogue of three possible sources of this justification, each of which was proffered by Kant at some point. The first holds that there are “subjective grounds” arising out of the “needs of reason”; the second appeals to the notion of the possibility of an object from a “practical (as opposed to theoretical) point of view”; and the third holds that we acquire justification for claims about objects beyond possible experience via a process of “symbolization”, where we draw analogies between the supersensible and the sensible.

Two of this year’s selections raise problems for a view known as ‘Imprecise Bayesianism’. According to this view, when evidence for or against a proposition is sparse, epistemic rationality can demand that one fail to take on any precise degree of belief in that proposition. Rather, it is held, one should adopt a doxastic state which is compatible with multiple precise degrees of belief. This kind of ‘imprecise’ doxastic state is represented as a set of precise doxastic states, and a supervaluationist strategy allows us to ascribe to the agent those features which are invariant across members of the set. The attacks on Imprecise Bayesianism come from two sides: Adam Elga’s “Subjective Probabilities Should be Sharp” contends that no agent with an imprecise doxastic state will be capable of providing themselves with sufficient normative guidance to refrain from acting in practically irrational ways, while Roger White’s “Evidential Symmetry and Mushy Credences” argues against its epistemological claim that sparse evidence fails to warrant any precise degree of belief.
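
In the notation common to both discussions (ours, not the authors’): an imprecise doxastic state is a set C of precise probability functions, and an attitude is ascribed to the agent only where the members of C agree, e.g.

    \text{the agent’s credence in } p \text{ is } x \iff c(p) = x \text{ for every } c \in C.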

Elga’s paper presents a case in which an agent who fails to have a maximally specific degree of belief in some proposition p is offered a series of bets on p which are, in conjunction, guaranteed to pay out no matter what. He contends that, in this situation, passing up both of these bets is practically irrational. He then argues that any adequate decision theory for agents with imprecise degrees of belief will end up smiling on an agent who does just that. Elga concludes that imprecision in your doxastic state means that you will be incapable of providing yourself with sufficient normative guidance to avoid choosing a course of action which is guaranteed to do worse than another live alternative. Conclusion: you should never have imprecision in your doxastic state.
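
The structure of the case can be illustrated with hypothetical stakes (the particular payoffs are ours, for illustration only). Suppose the agent’s credal set for p spans a wide interval, say [0.1, 0.8], and she is offered, in sequence,

    \text{Bet A: win } \$15 \text{ if } \neg p, \text{ lose } \$10 \text{ if } p; \qquad \text{Bet B: win } \$15 \text{ if } p, \text{ lose } \$10 \text{ if } \neg p.

For any sharp credence c in p, EV_c(A) = 15 - 25c and EV_c(B) = 25c - 10, so EV_c(A) + EV_c(B) = 5: taking both bets beats rejecting both, whatever c may be. Yet the high end of the credal set (c = 0.8) disfavors Bet A and the low end (c = 0.1) disfavors Bet B, so a decision theory that defers to the members of the set threatens to permit rejecting each bet in turn.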

White’s paper begins with a rich and compelling defense of a classical principle for assigning degrees of belief in the absence of evidence – the so-called Principle of Indifference (‘POI’) – which has been largely dismissed as self-contradictory by contemporary philosophers. White contends that, in the standard trouble cases for the POI, a contradiction can be derived from the assumption of evidential symmetry alone, indicating that the blame for this contradiction has been too hastily laid at the feet of the POI. White then turns his attention to undermining a prominent alternative to the POI: Imprecise Bayesianism. He contends that this view suffers from the following defect: if you fail to have a determinate degree of belief that p, and you learn that p iff a fair coin landed heads, then you will respond to this information by failing to have a determinate degree of belief that the fair coin landed heads. White finds this result implausible. Given that you had no evidence about p, but you had lots of evidence that the coin is fair, why should learning that the coin landed heads iff p give you reason to change the specificity of your beliefs about the coin?
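
The worry can be reconstructed in a line (our notation, with H for heads, Cr any member of the agent’s credal set, the toss taken to be independent of p, and Cr(H) = 1/2):

    Cr(H \mid p \leftrightarrow H) = \frac{Cr(H \wedge p)}{Cr(H \wedge p) + Cr(\neg H \wedge \neg p)} = \frac{\tfrac{1}{2} Cr(p)}{\tfrac{1}{2} Cr(p) + \tfrac{1}{2} Cr(\neg p)} = Cr(p).

Since Cr(p) varies across the credal set, so does the updated credence in heads: the imprecision in p infects one’s opinion about a coin one knows to be fair.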

Kit Fine’s “Towards a Theory of Part” is a discussion of the metaphysics of parthood that is simultaneously wide-ranging, creative, and rigorous. It begins by distinguishing monist theories of parthood (on which there is one basic notion of part) from pluralist approaches (which hold that there is more than one--for example, the parts of sets and the parts of mereological sums are both basic parts, but are parts in different ways). Fine argues for the latter view, and goes on to show how the many notions of part nonetheless admit of a unified and highly systematic characterization on the pluralist conception. We begin with the notion of a “composition operation”; that is, an operation which, when applied to a plurality of objects, generates a whole with exactly the objects in the plurality as parts. The various notions of part derive from differences in the formal features of the relevant composition operation. Mereological sums, for instance, are wholes formed by a composition operation that is invariant over repetition, order, and the embedding of additional composition operations among its arguments; this operation also fails to form a new object when applied to a single argument. The set-theoretic composition operation, by contrast, is only invariant over repetition and order of its arguments; applying it to a single object x yields a new object (the singleton of x), and embedding additional composition operations yields a distinct object ({{Socrates},{Plato}} is distinct from {Socrates, Plato}). Fine outlines or proves many interesting technical features of these composition operations along the way, and concludes by developing the “general” notion of part (on which it can be said that Socrates’s kidney is a part of Socrates’s singleton, for the reason that there is some parthood relation between the kidney and Socrates, and some parthood relation between Socrates and the singleton), and connecting the present discussion to recent interest among metaphysicians in the notion of metaphysical priority.
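
The contrast between the two operations can be put schematically (the symbol \sigma for mereological summation is ours):

    \sigma(x) = x, \quad \{x\} \neq x; \qquad \sigma(\sigma(a,b),c) = \sigma(a,b,c), \quad \{\{a,b\},c\} \neq \{a,b,c\};

while both operations are indifferent to repetition and order of arguments: \sigma(a,a,b) = \sigma(a,b) = \sigma(b,a), and \{a,a,b\} = \{a,b\} = \{b,a\}.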

The most common justification for updating your degrees of belief by Conditionalization – the so-called ‘Diachronic Dutch Book’ Argument – appeals to purely pragmatic considerations: failure to update by Conditionalization means that a clever bookie can sell you a series of bets guaranteed to empty your coffers. Many have thought, however, that the justification of an epistemic rule like Conditionalization should be alethic rather than pragmatic. In their “An Objective Justification of Bayesianism II: The Consequences of Minimizing Inaccuracy”, Hannes Leitgeb & Richard Pettigrew attempt to provide just such a justification. They demonstrate that, if your goal is to change from a degree of belief function b to a new function b’ such that b’(p) = 1, in a way that minimizes the expected inaccuracy of b’ (from your current perspective), then you should update your beliefs by Conditionalization. Additionally, they show (quite unexpectedly) that the same kind of argument fails to vindicate a popular generalization of Conditionalization known as “Jeffrey Conditionalization” (JC). Rather, it vindicates a novel updating procedure which Leitgeb and Pettigrew dub “Alternative Jeffrey Conditionalization” (AJC). Where JC preserves ratios, AJC preserves differences. Moreover, unlike JC, AJC fails to have Conditionalization as an instance and can be used to raise degrees of belief from zero. Leitgeb and Pettigrew defend this consequence of their argument, contending that AJC, and not JC, is the proper way to revise one’s degrees of belief in response to certain kinds of evidence.
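
Schematically (our notation, with a partition of evidence cells E_i on which experience imposes new credences q_i, and worlds w within those cells):

    \text{JC:} \quad b'(A) = \sum_i q_i \, b(A \mid E_i),

which leaves the ratio b(w_1)/b(w_2) unchanged for any two worlds within a single cell. AJC, by contrast, leaves the difference b(w_1) - b(w_2) unchanged within each cell, shifting credence additively rather than rescaling it; that additive character is why AJC, unlike JC, can raise a credence from zero.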

Non-classical logics of gaps or gluts hold strong appeal as ways of defusing the contradictions threatened by the paradoxes. It is well known that such logics must also avoid Boolean negation and the material conditional like the plague; either would allow a derivation of triviality using the mechanisms of the Liar and Curry’s paradox. In a piece remarkable for both its brevity and its force, Greg Restall’s “On t and u, and what they can do” demonstrates that there is a further pair of items that the non-classical logician must avoid: the propositions t and u. t is a total-truth proposition, entailing any p if and only if p is true. u does something similar for untruth: any p entails u just in case p is untrue. In a possible world semantics in which propositions are sets of situations (a picture widely accepted even among non-classical logicians), t is the intersection of all true propositions and u the union of all untrue propositions. Using t and u together, Restall is able to construct a non-classical connective just enough like the material conditional (though with a shade of contingency) to raise Curry-like problems again. He similarly constructs a connective enough like negation for excluded middle and the law of non-contradiction to hold, though contingently, resurrecting the trivializing Liar. The piece is short, sweet, and makes its logical point emphatically. Metaphysical implications are left unsaid.
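
In the situation-based semantics just mentioned, the two propositions can be written out directly (a restatement of the definitions above, with |p| the set of situations at which p holds and entailment read as subsethood):

    t = \bigcap \{\, |p| : p \text{ is true} \,\}, \qquad u = \bigcup \{\, |p| : p \text{ is untrue} \,\},

so that t \subseteq |p| just in case p is true, and |p| \subseteq u just in case p is untrue.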

When Beauty falls asleep on Sunday night, a coin is flipped to determine whether (heads) she will be woken only on Monday morning, or (tails) on both Monday and Tuesday, the latter without a memory of the Monday awakening. When she is awoken with the self-locating information that it is either Monday or Tuesday, what should her credence be that the coin came up heads? Thirders argue that her credence should be one third. In “Sleeping Beauty, Countable Additivity, and Rational Dilemmas,” Jacob Ross outlines a Generalized Thirder Principle GTP: “In any Sleeping Beauty problem, defined by a partition S, upon first awakening, Beauty’s credence in any given hypothesis in S should be proportional to the product of its objective chance and the number of times Beauty awakens if this hypothesis is true.” He also outlines a principle of Countable Additivity CA: rationality requires that one’s credences in propositions in a countable set, any two of which are incompatible, must sum to one’s credence in their disjunction. Ross argues in detail that the considerations that lead to a thirder position entail the GTP as well, and that the GTP and CA are in conflict: there are situations in which both cannot be satisfied. Ross concludes with the suggestion that we therefore accept the reality of rational dilemmas, in which “reason is divided against itself.”
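
In the original case the GTP delivers the familiar thirder verdict by simple arithmetic: heads has chance 1/2 and one awakening, tails has chance 1/2 and two awakenings, so upon awakening

    Cr(\text{Heads}) = \frac{\tfrac{1}{2} \cdot 1}{\tfrac{1}{2} \cdot 1 + \tfrac{1}{2} \cdot 2} = \frac{1}{3}, \qquad Cr(\text{Tails}) = \frac{2}{3}.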

A central issue in philosophy of language and linguistics in the last several decades has concerned the projection problem for presuppositions: the problem of how to predict the presuppositions of complex expressions in a compositional manner from the presuppositions of their parts. In “Presuppositions and Local Contexts,” Philippe Schlenker offers a new account of ‘local contexts’ that yields a novel solution to the projection problem in the spirit of classic pragmatic approaches. Rather than understanding local contexts in terms of belief update, as is usual, Schlenker treats local contexts as shortcuts that facilitate sentence interpretation. The resulting general and formally precise pragmatic analysis of presupposition, Schlenker argues, suggests that projection phenomena do not require postulating a dynamic semantics.

In our judgment, each of these articles constitutes an exemplary contribution to Philosophy, worthy of being read and discussed by a wide range of philosophers. Were one able to read only ten articles from the philosophical literature of 2010, we think these would be a very good ten to read.

Patrick Grim
Billy Dunaway
J. Dmitri Gallow
Alex Silk



