The Philosopher’s Annual’s goal is to gather the ten best philosophical articles from the previous year. Articles of exceptional merit are put forward by the Nominating Editors, and those articles are then collected for a second review by the Nominating Editors as a whole. We, the four Editors, read the top-ranked thirty-or-so articles, have an extended discussion about the relative merit of each paper, and ultimately decide on the ten articles we feel stand out as the best of the bunch.
We would be the first to acknowledge, however, that no selection method is without fault, and because of this the use of slightly different methods could well have changed our list. We also understand that some excellent papers undoubtedly slipped through the nomination process and that there were far more than thirty papers of special note in the wide range of articles received from the Nominating Editors.
Despite the potential faults in the process, we are sure that each one of the ten selected papers is innovative, ambitious, carefully articulated, rigorously defended, and an exemplary work of philosophical scholarship. We pass them on to you for consideration and enjoyment, confident that they are worthy of reflection and debate on your end as well.
Work on paradox often stops with the question of the semantic status of paradoxical sentences. In “Belief and Indeterminacy,” Michael Caie carries the discussion further into questions of rational attitudes. If Φ is a proposition that one ought to believe is indeterminate, what attitude should one have toward Φ? The orthodox answer appears to be rejection: if one ought to believe that Φ is indeterminate, one should reject both Φ and its negation. Caie argues instead that indeterminacy in the objects of belief will percolate upward, with surprising consequences for the rationality of higher-order attitudes. If one ought to believe Φ is indeterminate, one’s mental states ought to be such that it is also indeterminate whether one believes Φ.
‘If E is evidence for H1, and H1 entails H2, then E is evidence for H2.’ This is what Matthew Kotzen calls ‘the Consequence Principle,’ apparent in classic Hempel but dealt with here in Bayesian form. In either form, the Consequence Principle faces a range of immediate counter-examples, some of which trace back to Goodman’s Fact, Fiction, and Forecast. Kotzen is out to defend the Consequence Principle, first rejecting a number of supplementary assumptions but finally conditioning it with a Dragging Condition: that the prior probability of H2 is less than the probability of H1 given E. Logical consequences of H1 whose prior probability is lower than the probability of H1 given E are ‘dragged up’ by the evidence. Although the Dragging Condition is put forward merely as a sufficient condition for E to confirm H2, Kotzen shows its clear intuitive power against the counter-examples outlined. Kotzen’s piece ends with a discussion of transmission failure and bootstrapping, offering a nice bridge from classical philosophy of science to contemporary formal epistemology.
Should we be one-boxers or two-boxers? The classic Newcomb problem has divided philosophers over this very question. In “Causation, Chance and the Rational Significance of Supernatural Evidence”, Huw Price shows how the debate between causal decision theorists (two-boxers) and evidential decision theorists (one-boxers) is more complicated than previously thought. Unperturbed by David Lewis’s suggestion that the Newcomb debate is “deadlocked”, Price offers a means by which progress can be made after all. The result is “evicausalism”, a view on which Newcomb-style problems force us to rethink the causal structure at play such that causal and evidential decision theorists must ultimately agree—agree to one-box, that is. Price’s paper is richer still for skillfully pointing out a tension between Lewis’s endorsement of causal decision theory and his views about “inadmissible” evidence, and using this insight to motivate his own argument.
If the earlier Price article can be said to offer a gentle criticism of Lewis in laying out his position, then Sarah Moss offers something of a defense of Lewis in laying out hers. In “On the Pragmatics of Counterfactuals”, Moss traces the 50-year dialectic guiding theories of counterfactuals: how the failure of counterfactuals to obey antecedent strengthening (as brought out by Sobel sequences) seems to rule out a semantics based upon strict conditionals; how Lewis and Robert Stalnaker both circumvented this problem by appeal to similarity orderings over possible worlds, articulating what came to be considered the “standard” account of counterfactuals; and finally, how recent work by Kai von Fintel and Thony Gillies uses “reverse Sobel sequences” to undermine the standard account in favor of a more sophisticated strict-conditional account. At this point, Moss offers a way to save the standard account by appealing to pragmatics to explain the otherwise thorny reverse Sobel sequences. In particular, Moss cogently shows how our judgments of infelicitous counterfactual assertion brought out by reverse Sobel sequences are due to general constraints on what we can say—in particular, epistemic responsibility constraints. Thus, Moss offers an account that, in her words, “explains the same data [as von Fintel and Gillies] by appealing to general, independently plausible facts about conversation and reasoning.” (573)
There are two pieces that advance the history of the development of logic, addressing demanding aspects of Aristotle’s modal logic and Kant’s transcendental logic. In “A Method of Modal Proof in Aristotle”, Rosen and Malink offer an undeniably systematic and nuanced treatment of Aristotle’s possibility principle, first introduced in the Prior Analytics. The possibility principle states that if B follows from A, then the possibility of B follows from the possibility of A. Rosen and Malink argue that this principle grounds a rule of inference—the possibility rule—that Aristotle often utilizes in combination with reductio proofs. The force of this interpretation comes not only from the careful formulation of Aristotle’s proof procedures, but also from the compelling subsequent reconstructions of Aristotle’s arguments (spanning the Prior Analytics, De caelo, Physics, De generatione et corruptione, Metaphysics, and Posterior Analytics) that at once reveal strengths and problematic features of many of Aristotle’s inferences.
In “The Generality of Kant’s Transcendental Logic”, Clinton Tolley explains and motivates the defining features of Kant’s transcendental logic in contrast with traditional logic. While other interpreters identify the difference between the two logics as a function of subject matter—analytic vs. synthetic judgments or general vs. particular content, for example—Tolley argues that the subject matters of transcendental logic and traditional logic coincide: both are concerned with universal and necessary conditions for thinking. However, the two logics do differ according to which aspect of the understanding they are associated with—the former is concerned with the content of pure concepts and the latter with the form of judgments. Tolley offers an insightful account of Kant’s contrast between formal and non-formal logic to parse otherwise vexing passages of text and reconsider the character of transcendental logic in the Critique of Pure Reason.
Consulting our intuitions to answer questions about whether “S knows that p” is standard operating procedure within epistemology. However, recent work by “experimentalists” has called the legitimacy of the use and role of intuitions into question. What we find in Jennifer Nagel’s “Intuitions and Experiments: A Defense of the Case Method in Epistemology” is a thorough philosophical and psychological defense of epistemologists’ use of intuitions. Nagel looks to the research in psychology on intuitions to show that epistemic intuitions have the same status as our perceptual judgments—a domain whose legitimacy the experimentalists tend not to call into question. Moreover, according to this research, the epistemic variety of intuition arises from a generally reliable natural capacity known as “folk psychology,” or “mindreading.” The experimentalists, for fear of falling into general skepticism, have not challenged this capacity, and “at least one prominent experimentalist has explicitly identified mindreading as the kind of intuitive capacity that can be trusted.” (497) Thus, Nagel’s ambitious paper shows that the method of intuition in epistemology ought to be vindicated by the experimentalists’ own lights.
Imagine a case where a bus causes some harm and a 70% reliable eyewitness identifies the bus as run by Buscorp. Now, imagine a second case where there is no eyewitness but local statistics show 70% of the buses in the area belong to Buscorp. Why do courts base verdicts on evidence from the first case but not the second? Moreover, what explains the intuition that courts are right to do so? Enoch, Spectre, and Fisher (ES&F) argue that the distinction here has an epistemological justification. In particular, ES&F draw an analogy with lottery paradox cases familiar from epistemology to show how, just like beliefs, verdicts based on merely statistical evidence are counterfactually insensitive to the truth. Despite this, ES&F argue that the law shouldn’t care about sensitivity per se if it conflicts with accuracy. Instead, they argue that if any considerations justify the distinction in how the kinds of evidence are treated, they are practical, having to do with avoiding perverse incentives to break the law. The upshot at the article’s end consists in the way ES&F show these practical considerations to be grounded in the very same counterfactuals as the epistemological ones.
The Evolutionary Challenge for moral realists is the challenge to explain why we should expect to have true moral beliefs when those beliefs are the product of evolutionary forces indifferent to the moral truth. In “Morality and Mathematics: The Evolutionary Challenge,” Clarke-Doane provides a substantially clarified version of the Evolutionary Challenge that he argues is applicable to forms of realism beyond the moral variety. Clarke-Doane then applies this updated version of the challenge to mathematical realism—a significant worry for the philosophers who claim that this form of realism is immune from the challenge. The outcome is that we should all think carefully about a commitment to mathematical realism if we want to continue to use the Evolutionary Challenge.
In “Guide to Ground,” Kit Fine presents what may be the definitive form of his work on metaphysical determination: the constitutive ‘in virtue of’ relation that holds between, for example, (a) the fact that the ball is red and round and (b) the fact that it is red and the fact that it is round. That central relation, Fine claims, “stands to philosophy as cause stands to science.” (40) Fine’s work presents an elegant logical treatment of metaphysical grounding and addresses counter-examples and some competing accounts, but it is equally valuable for the fact that his formal work is set nicely in a general discussion of the philosophical importance of concepts of grounding.
In our judgment, all of the articles are, due to their exemplary nature, worthy of being read and discussed by a wide range of philosophers. If you were only able to read ten articles from the philosophical literature of 2012, we think you couldn’t do much better than the above ten.