Classical authors were generally artistic realists. The predominant aesthetic theory was mimesis, which saw the truth of art as its successful representation of reality. High modernists rejected this aesthetic theory as lifeless, seeing the truth of art as its subjective expression. This impasse has serious consequences for both the Church and the public square. Moving forward requires both, first, an appreciation of the strengths and weaknesses of the high modernist critique of classical mimetic theory, and, second, a theory of truth which makes adequate reference to both subject and object. This paper argues that Lonergan offers just such an account of truth, and so cashes out the high modernist rejection of classical mimesis in Lonergan’s terms, thereby creating the opportunity for a synthesis of the two views.
The physical cosmology of material spheres proposed by Aristotle  has been considered obsolete since “Tycho Brahe's observations of the super-nova of 1572.”  Scholarly opinion shifted quickly from Galileo’s view that Aristotle was an arch-enemy to be criticized at every turn to Descartes’ treatment of the Stagirite as a vanquished opponent rarely worthy of comment.  While neo-Aristotelianism is booming,  the vast majority of contemporary commentators do not focus on his cosmology,  and those who do defend its relevance in modern science often have a slightly desperate flair.  The difficulty of this project is unsurprising, since whereas Aristotelian cosmology is founded on a bifurcation of supra-lunary and sub-lunary realms,  the “remarkable discovery” grounding modern astronomy “is that the stars are made of atoms of the same kind as those on the earth.”  More productive studies have thus focused, not on finding contemporary references for Aristotle’s catalog of cosmological realities, but rather on hitherto-ignored parities of reasoning between Aristotelian and modern conceptions of science.  This study examines the relationship between Aristotle’s symmetry arguments in the De Caelo and contemporary applications of Noether’s First Theorem, that every differentiable symmetry of a physical system’s evolution implies a conservation law.  First I will examine the arguments from directed motion to its causes, second those running from symmetric motion to conservation, and third those concluding from conservation to claims about cosmology.
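For reference, Noether's First Theorem in its simplest textbook form (point mechanics, a symmetry not involving time; the formulation is standard and not specific to this paper): if the action

$$S[q] = \int L(q, \dot{q}, t)\, dt$$

is invariant under the infinitesimal transformation $q \mapsto q + \varepsilon\,\delta q$, then along any solution of the Euler-Lagrange equations the quantity

$$Q = \frac{\partial L}{\partial \dot{q}}\,\delta q$$

is conserved. Spatial translation symmetry thus yields conservation of linear momentum, rotational symmetry conservation of angular momentum, and, in the theorem's more general form, time-translation symmetry conservation of energy.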
The debate about the ontological innocence of mereology has generally been framed as a debate about the plausibility of Universal Fusion. Ontologically loaded fusions must be more than the sum of their parts, and this seems to violate parsimony if fusion is universal. Less attention has been paid to the question of what sort of emergence mereological fusions must exhibit if they are irreducible to their parts. The philosophy of science literature provides several models of such strong emergence. Examining those models suggests that the difficulty with emergent fusions has at least as much to do with extensionality as it does with Universal Fusion. Some accounts of emergence fail to ensure irreducibility when combined with extensional mereologies. The most promising model for the strong emergence of ontologically loaded fusions fails to validate Anti-Symmetry, which naturally leads to failures of extensionality. These results suggest that the focus on Universal Fusion may have been misplaced.
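For reference, the principles at issue in their standard first-order formulations, with $P$ for parthood, $PP$ for proper parthood, and $O$ for overlap (these are the usual textbook statements, not notation taken from the paper):

$$\text{Anti-Symmetry:}\quad (Pxy \land Pyx) \rightarrow x = y$$

$$\text{Extensionality of proper parthood:}\quad \exists z\, PPzx \;\rightarrow\; \big(\forall z\,(PPzx \leftrightarrow PPzy) \rightarrow x = y\big)$$

$$\text{Universal Fusion:}\quad \exists x\,\varphi(x) \rightarrow \exists z\,\forall y\,\big(Oyz \leftrightarrow \exists x\,(\varphi(x) \land Oyx)\big)$$

Classical extensional mereology validates all three; the claim above is that emergent fusions put pressure on the first two, not only on the third.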
The difficulties often attributed to prime matter hold for all hylomorphic accounts of substantial change.  If the substratum of substantial change actually persists through the change, then such change is merely another kind of accidental change.  If the substratum does not persist, then substantial change is merely creation ex nihilo.  Either way matter is an empty concept, explaining nothing.  This conclusion follows from Aristotle’s homoeomerity principle, and attempts to evade this conclusion by relaxing the constraints Aristotle imposes on elementhood, generation, and substrata all fail, and even the minimal constraints imposed by the Problem of Material Constitution are enough to generate the dilemma.

Aristotle resolves this dilemma in Physics I.9 by postulating pure potentiality-for-substance as the substratum of substantial change.  Because the substratum persists, substantial change is not creation ex nihilo, but because it does not persist actually it is not a kind of accidental change.  Aristotle uses this approach to solve the Problem of the Mixt and the Problem of Material Constitution without weakening his constraints on elementhood, generation, or substrata.  This pure potentiality approach must be carefully distinguished from other ‘traditional’ or ‘prime matter’ views that posit some actuality for the substratum of substantial change, and it is best understood in light of the analogy found at Metaphysics Θ.6.  Pure potentiality-for-substance can do the work needed in a substratum for substantial change because Aristotle is able to ground the identity, existence, and characterization of the substratum in the corrupting and generating substances rather than the substratum itself.
Descartes’ break with Baroque scholasticism is often framed as a move towards certitude in simple and mathematical methods instrumental to the founding of modern science. While Descartes’ development of analytic geometry and its application to significant breakthroughs in optics bears this out, most of his scientific explanations are now viewed as dated rather than certain, and many of his claims contradict the more accurate understanding of Galileo and others. I suggest that Descartes’ breakthrough is not in mathematical or error-free method but rather in a broadening of the notion of assumption to include multiple independent and contradictory models. Perhaps counter-intuitively, Descartes finds this broad notion of assumption in Ptolemaic astronomy, and first applies it in algebraic kinematics. Descartes’ use of models reaches its zenith, however, in his optics, where he disclaims any attachment to the truth of his contradictory physical models for the transmission, reflection, and refraction of light while insisting that each is necessary for explaining the observed phenomena. These multiple contradictory models used by Descartes go well beyond the idealization of Galileo, since the latter insisted that he only neglected minor phenomena rather than engaging in contradiction. This broad Cartesian view of modeling is now considered critical to predictive science in many disciplines and gives Descartes a genuine place in the foundation of modern science independent of his physical views.
The degree to which the sovereignty of authoritarian regimes precludes cosmopolitan intervention to protect human rights is an important question of global justice. Mistakes can be made in both directions: much of the international community regrets not intervening during the genocide in Rwanda (Heinze, 2007), but the invasion of Iraq and Western support for the “Arab Spring” have been credited with encouraging nuclear proliferation by regimes desperate to maintain sovereignty (Friedman et al., 2012). Behind this normative difficulty lies a metaphysical one: what explains the existence of political unity without recourse to fractious and perhaps circular normative claims? When is an authoritarian government nonetheless a sovereign entity, and when is it a mere warlord with a flag? This question is also of interest to political philosophers who seek to provide a naturalized account of political unity, rather than the a-historical and Western-normativity-laden political contract theories (Searle, 2010).

John Searle’s own account, which focuses on the acceptance of speech acts, has nonetheless been criticized for failing to establish the desired normativity. Margaret Gilbert (2013) charges that individualized acceptance allows individual rescission, preventing institutions from having real normative force. Charles Mills (2017), meanwhile, worries that focusing on individual acceptance ignores the role of sheer coercive power against subordinated groups in state formation. Neither Gilbert’s nor Mills’ positive theories, however, attempt to meet Searle’s criteria for the naturalization of their normative claims.

I propose meeting this challenge by conjoining Rob Koons (2014)’s Parts-as-Sustaining-Instruments ontology  with Mancur Olson (1993)’s economic theory of state formation. Koons’ ontology suggests that unity is secured by mutual dependence of parts and wholes, but that circularity can be avoided by the dependence being synchronic in the one direction and diachronic in the other. Olson’s theory of state formation, meanwhile, relies on the greater alignment of economic incentives between “stationary bandits” (i.e., extractive leaders) and their pillaged populace than is the case with “roving bandits” who pillage with abandon.

In my account, as in Searle’s, acceptance is a synchronic property inhering in each member of the community which makes the whole political institution dependent on its citizen parts. This account makes the community emergent rather than supervenient, however, because the distribution of common goods necessary to support life and the arbitration of disputes is a dependence of the parts on the whole which serves as a diachronic emergence base. This diachronic dependence of the citizens on the state provides the normative basis for consent (as in Gilbert’s view), but crucially the distribution of common goods and arbitration of disputes need not themselves be normatively founded. These services can therefore be provided by a purely self-interested power like Olson’s “stationary bandit,” who violates human rights in the service of coercive subordination. Nonetheless such governments have normative force justifying rational acceptance of sovereignty and precluding international intervention, since they are objectively superior to domination by “roving bandits.” Only governments failing to provide these basic services are rightly subject to international police action.
Science and scientists have a checkered record on race. Some scientists (e.g. Linnaeus, 1758, pp. 21–22; Herrnstein & Murray, 1996; Rushton & Jensen, 2008) have argued for “race realism” in a way that aggravates rather than ameliorates racial domination. The response has been to argue that race is a social construct (Kaplan & Winther, 2014; Gannon, 2016), but many scientists have treated “social construct” as synonymous with “nonexistent” (Morning, 2007; e.g. Lewontin, 1972; Bamshad & Olson, 2003; Leroi, 2005; Prontzos, 2019) which also undermines ameliorative projects (Thompson, 2006). I suggest that this unhelpful dichotomy is the result of scientists combining two otherwise innocuous habits without attending carefully to how they interact.

The first habit is taking existence or reality as the instantiation of a property which appears in the best scientific explanation of an empirical phenomenon (see Quine, 1960, 1969; Harman, 1967; Cargile, 2003; van Inwagen, 2009). Because genetics is the primary tool for hereditary population studies and race doesn’t appear as a strong genetic cluster within human populations (Romualdi et al., 2002), many biologists simply regard it as nonexistent. The second habit is building multiple related but incompatible causal models in the service of different representational goals (Weisberg, 2007), a practice now widespread in biology (Godfrey-Smith, 2006). These models can be explanatory, because in complex sciences like biology de-idealization is often impossible and they successfully express counterfactual dependencies (Bokulich, 2011). It turns out that race can play such a role in pharmacology, even though and precisely because it is a social construct (Doyle, 2006), which grants it a certain scientific reality.

This analysis neglects Bokulich (2011)’s third condition for explanatory modeling, however: the phenomenon to be explained must fall within the model’s domain of applicability. Sociological, pharmaceutical, and other models which make use of race in their explanations are not of unrestricted applicability like the basic equations of physics, or even of applicability to all terrestrial life like genetics. The conclusion, if we are careful in relating the two habits, is that race exists only in a qualified sense. It is an answer to a question that can only be put to certain models. We can therefore see how scientists have so often gone wrong about race. By treating race as an entity which exists (or fails to exist) in an unqualified sense, many scientists have been led to unhelpful articulations of race-realism and race-denialism. Remembering that race is a reality only of certain models with ineluctably social components makes it available for ameliorative projects without pretending that it is a natural kind suitable for generic scientific research.
Responsible political decision-making is extremely difficult during a time characterized by a “cumulative departure from coherence.”  Left to ourselves, our individual biases go uncorrected, but left loose in a culture undergoing a cycle of decline, the “general bias of common sense combines with group bias” in a “distorted dialectic of community.”  On the one side, there is “a solid right that is determined to live in a world that no longer exists” and on the other “a scattered left” hostage to the passing trends of the day.  Each sees the other as at least potentially totalitarian, and is tempted to respond with its own totalitarian methods.  In this dismaying situation, clearly recognizable as our own, Lonergan counsels that “what will count is a perhaps not numerous center, big enough to be at home in both the old and the new, painstaking enough to work out one by one the transitions to be made, strong enough to refuse half measures and insist on complete solutions even though it has to wait.”  Major authenticity, like minor authenticity,  is apparently found in the mean between extremes.
Lonergan’s topography seems to locate this center between a revanchist desire for White America  and a faddish identity politics.  Yet waiting for a more perfect Black Lives Matter movement  leaves us in the place of the “white moderate” who agrees with activist goals but “paternalistically believes he can set the timetable for another man's freedom.”  If we are not to be a stumbling block worthy of condemnation,  then the virtuous mean for major authenticity, like that for minor authenticity, must be more than a mere arithmetic mean.  Making a responsible decision requires understanding the mean relative to us, which in turn requires understanding the potential biases in play.
The right yearns for a world that no longer exists when the hierarchy of power among groups is threatened,  and the left is scattered by fads when supposedly scientific solutions fail to yield immediate results.  This paper thus argues that group bias is characteristic of the right, and general bias is characteristic of the left. A false centrism is created when the left resorts to group bias to overpower the right,  or when the right resorts to general bias to overpower the left.  Such an equilibrium is no more stable than the false moderation of the person who tries to set excessive fear against self-indulgence rather than undertaking the hard work of rooting out both excessive passions in favor of virtue. A true centrism must take the opposite approach, rooting out both kinds of bias. This true centrism neither delays justice out of discomfort nor presumes that discomfiting the powerful will cure all that ails us.
Catholic ethicists today are buffeted by allegations of classicism on one side and lack of fidelity to the Magisterium of the Church on the other. The Magisterium insists on the meta-ethical criterion that ethics must be able to label some acts as intrinsically evil. Those who recognize that ethics arise from the self-reflection of subjects are dubious that such categorical claims can be grounded absent special revelation. In Thomist terms, the question is how natural law arises from practical reason.
Martin Rhonheimer outlines such an intellectually converted ethics, but has trouble defending his delineations of human action and virtue against the circularity charges of critics like Steve Jensen. Introducing Lonergan’s more rigorous account of judgment solves these problems by integrating both fact and value judgment into responsible action and resolving virtues as heuristic approaches to value. In turn, Rhonheimer provides an analysis of value apprehension in concrete situations which demonstrates that positional ethics can recognize and reject intrinsically evil acts. Lonergan’s description of method in ethics is then carried out in a positional reading of Veritatis Splendor.
This provides a point of departure for dialogue with New Natural Law theorists like John Finnis, whom R.J. Snell describes as “positional with regard to the discovery of the principles of natural law, but not with regard to their application” (Lonergan Workshop 2011). A fully positional account of natural law makes Lonergan respectable to ethicists who need to demonstrate fidelity to the Magisterium, engage in refutation of counter-positional comprehensive ethical doctrines, and apply ethics to controverted cases in medicine and elsewhere.
Hettema and Kuipers claim the periodic table began as a theory but then was reduced to quantum mechanics, whereas Scerri holds that though it has not suffered an explanatory reduction, it is and was only a classification. Weisberg replies that the periodic system’s predictive and explanatory power makes it a theory, regardless of its foundations.

We argue that Scerri’s own work lends support to a theoretical interpretation of the periodic system: he follows Paneth’s view that it classifies basic substances (transcendentals), and Hendry shows that atomic weight as well as atomic number was derived implicitly, so basic substances are theoretical terms. We then extend Hendry’s solution to the ‘qua’ problem by introducing Lonergan’s distinction between description and explanation to understand the periodic system as a fully explanatory schema where the terms and relations are given together. Since the periodic system explains not only the property regularities of the elements but the elements themselves in purely theoretical terms, only a complete explanatory reduction of chemistry to physics could obviate it.
We follow Paneth’s view that the periodic system classifies basic substances (transcendentals) and Hendry’s explication of implicit derivations for atomic weight and number and his solution to the qua problem, but resist Hendry’s attempt to make basic and simple substances coextensive as this leads to an irreducible ontological pluralism. Instead we introduce Lonergan’s distinction between description and explanation to conceptualize the IUPAC definitions for the elements in the periodic system as a fully explanatory schema where the terms and relations are given together and defined implicitly. This answer to the chemist’s question rejects any God’s-eye-view or view-from-nowhere and can be categorized as a transcendental, critical scientific, and internalist realism—but one more comprehensive than Hartmann’s, more realist than Putnam’s, and more philosophically integrated than Vihalemm’s.

We show how Lonergan’s method resolves Scerri’s concerns about how basic substances can be meaningfully grouped and discussed without descending into abstract metaphysics. Lonergan’s self-correcting heuristic precludes a priori arguments for both reducibility and irreducibility, couples the epistemic and ontological problems of reduction, and provides a criterion for comparison of proposed chemical ontologies—including reductive ones. Reduction thus becomes an empirical question for the future of quantum chemistry, though downward causation is ruled out with a philosophical affirmation of the dependence of chemistry and ubiquity of physics. Unless and until inter-theoretic reduction occurs, however, Lonergan’s approach secures non-mysterious ontological independence for chemistry.
In Insight Lonergan offers an extremely compact account of the aesthetic pattern of experience, whereas Sontag develops a rich but less philosophical account in Against Interpretation. In this paper I attempt a bi-directional transposition of these accounts in order to both ascertain their compatibility and develop a rich and rigorous understanding of the process of creating and participating in works of art. I then use that understanding to discuss the appropriate role of audience participation in modern theater, with a particular focus on whether “camp” and esoteric productions further or hinder the state of theatrical arts.
I begin by isolating what Sontag means by “interpretation” and argue that she is not defending aesthetic experience as an already-out-there-now-real but rather insisting that works of art are truly things—experienced unity, identity, wholes—not mere constrained indirections for the transmission of judgments. I further suggest that this is what Lonergan means when he says that “the validation of the artistic idea is the artistic deed” (CWL3 208), consonant with Sontag’s claim that “art is the objectifying of the will in a thing or performance” (“On Style” 151). I then explore the way that Sontag’s criticism of Bresson, demanding that “art is the discovery of what is necessary—of that and nothing more” (“Bresson” 135) mirrors Lonergan’s explication of explanatory insight that simultaneously fixes terms and relations. Next I relate Sontag’s claims for the inextricability of form and content in art to the “Elements of Metaphysics.” Last I look at the way modern theater deals with Lonergan’s stricture that “it seeks to mean, to convey, to impart…through a participation, and in some fashion a reenactment of the artist’s inspiration and intention” (CWL 208). I contend that moving beyond direct mimesis is helpful, but that the ironic detachment of “camp” is detrimental to such participation.
I propose a formal model of cognitional operations sufficient to explain genetic method.  In his article “Insight: Genesis and Ongoing Context,” Fred Crowe calls out Lonergan’s line “the diagram is more important than…is ordinarily believed”  as the “philosophical understatement of the century.”  Sixteen pages later he identifies elaborating an invariant cognitional theory to underlie generalized emergent probability and thus “the immanent order of the universe of proportionate being,”  as “our challenge,” “but given the difficulty” he doesn’t “see any prospect for an immediate answer.”  Could this have something to do with the lack of a comprehensive diagram of cognitional theory?  Appendix A of CW18 offers diagrams of the dynamics of knowing and doing perhaps copied from Lonergan’s own blackboard work, but they do not distinguish explanatory and descriptive insights, let alone statistical insights, and do not illustrate the pull upwards or the fusing of routinized insights.
In this paper I develop a series of diagrams that gradually overlay differentiations of cognitional method by the level of the question and the relation of the data, preserving isomorphisms at each step.  I suggest that a fractal approach combined with differentiations of data can preserve the explanatory richness of Lonergan’s account without unduly multiplying the total number of stages, differentiations, and processes.  In this account descriptive insights differ from explanatory insights not only in the question asked of the data, but in the use of phenomenologically qualitative data rather than only the data of events.  Difficulties with the place of self-appropriation lead me to posit a new level between the traditional third and fourth levels.  This both incorporates the relation of the self to the facts prior to the introduction of affectivity and allows an unbroken factorial isomorphism of stages across all levels.  The overall approach gives rise to a new solution to the place of the controversial fifth level that accommodates upward pull, grounding in the other levels, and a basis in uniquely religious experience.
While the method I propose is directed to understanding and not yet verified, the explicit drawing of Lonergan’s rich explication of modern science into an account using fractal isomorphisms to minimize the number of elements provides a basis for a rigorous philosophy of science.  Furthermore while I do not attempt here to prove that my provisional account is contained within that of St. Thomas, I do borrow semantics from his account of the spiritual understanding of scripture in order to maintain continuity with the Thomistic tradition.  Crucially, I do not yet have a rigorous theory of the interrelations between the basic cognitional process (underlying scientific and common sense reasoning) and the patterns of experience or functional specialties.  The most important work yet to be done, and the reason for my desire to present this paper at WCMI, is the verification of my proposed schematization of the levels of consciousness in the mind of the reader:  “Revise it if you can.”
David Christensen and others argue that Dutch Strategies are more like peer disagreements than Dutch Books, and should not count against agents’ conformity to ideal rationality. I review these arguments, then show that Dutch Books, Dutch Strategies, and peer disagreements are only possible in the case of what computer scientists call Byzantine Failures—uncorrected Byzantine Faults which update arbitrary values. Yet such Byzantine Failures make agents equally vulnerable to all three kinds of epistemic inconsistencies, so there is no principled basis for claiming that only avoidance of true Dutch Books characterizes ideally rational agents. Agents without Byzantine Failures can be ideally rational in a very strong sense, but are not normative for humans. Bounded rationality in the presence of Byzantine Faults remains an unsolved problem.
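As a minimal, self-contained illustration (not drawn from the paper) of the synchronic Dutch Book being contrasted with Dutch Strategies: an agent whose credences in A and in not-A sum to more than 1 will accept a pair of bets, each fair by her own lights, that together guarantee a loss. The function names and stakes below are purely illustrative.

```python
def bet_payoff(credence, stake, event_occurs):
    """Buy a bet on an event at price credence * stake; it pays `stake` if the
    event occurs and nothing otherwise. Return the agent's net gain."""
    price = credence * stake
    return (stake if event_occurs else 0.0) - price

def book_outcomes(cred_A, cred_not_A, stake=1.0):
    """The agent buys one bet on A and one on not-A; evaluate both possible worlds."""
    results = {}
    for A_true in (True, False):
        net = bet_payoff(cred_A, stake, A_true) + bet_payoff(cred_not_A, stake, not A_true)
        results["A" if A_true else "not-A"] = round(net, 10)
    return results

# Incoherent credences (they sum to 1.2): a sure loss of 0.2 in either world.
print(book_outcomes(0.6, 0.6))   # {'A': -0.2, 'not-A': -0.2}
# Coherent credences leave no guaranteed loss.
print(book_outcomes(0.6, 0.4))   # {'A': 0.0, 'not-A': 0.0}
```

A Dutch Strategy differs in that the bets are offered across time, exploiting a planned updating rule rather than a single incoherent credence assignment.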
Part is not a univocal term. Uses of parthood and composition that do not obey any supplementation principle have a long philosophical tradition and strong support from contemporary physics. We call such uses potential parts. This paper first shows why potential parts are important and incompatible with supplementation, then provides a formal mereology for such parts inspired by the path-integral approach to quantum electrodynamics.
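For reference, the weakest standard supplementation principle, which uses of parthood of the ‘potential parts’ kind are here said not to obey ($P$ is parthood, $PP$ proper parthood, $O$ overlap; the notation is the usual one, not the paper's):

$$\text{Weak Supplementation:}\quad PPxy \rightarrow \exists z\,(Pzy \land \neg Ozx), \qquad \text{where}\quad Oxy \equiv \exists w\,(Pwx \land Pwy).$$

Informally: anything that has a proper part has another part disjoint from that proper part.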
Aquinas’s notion of virtual presence is a viable and consistent account of mixture which meets the criteria set out by Aristotle and elaborated by Wood and Weisberg.  While the persistence in mixts of the (prime) matter of the elements and their (virtual) forms may be mysterious to common sense, both are highly parallel to the best existing understandings of quantum physics.  Physical models for virtual presence do not answer all of the relevant exegetical questions, but they do avoid making merely verbal distinctions or collapsing Aquinas’s account into either that of Avicenna or Averroes, his explicit contrast cases.  Elemental and hylomorphic accounts of composition are not incompatible as the medievals feared, so there should be no reason for hesitation about using hylomorphism as an alternative to materialism and dualism.
Political theorists invoke “the common good” to solve many of their thorniest difficulties.  Such invocations, however, place strict but little-noticed constraints on what may be meant by “common.”  This paper makes explicit Aquinas’s latent metaphysics of the common, and in doing so proposes an answer to the general composition question for societies.  This answer provides a basis for criticizing various Thomistic accounts of the common good.
Tollefsen lists a number of meta-ethical advantages inherent to first-person accounts of the moral object:  they avoid various worries about moral luck,  and their focus on the knowability of goods by practical reason avoids both Hume’s Naturalistic Fallacy and the amoralist’s detachment from moral motivation,  which are often portrayed as a dilemma.  First-person accounts also seem uniquely equipped to apply double-effect reasoning, since they give an unambiguous referent to intention and thus dissolve Chisholm’s puzzles about the diffusiveness and division of intention.  The doctrine of double effect is in turn central to capturing “certain kinds of [non-consequentialist] moral intuitions”  which are critical for many cases in applied ethics.  First-person accounts also avoid difficulties arising from the act-omission distinction.  While the New Natural Law Theory  (hereafter NNLT) is the leading attempt to formulate a rigorously first-person account within the broadly Thomist moral tradition,  it is subject to many criticisms, which often conjoin objections to its action and value theories.    In this paper I first carefully formulate the interaction of the NNLT action and value theories in arriving at moral judgment, then review each of those theories for additional distinctions which must be incorporated to respond to criticisms.  Finally I attempt to give a brief statement of a fully first-person account which incorporates those criticisms.  In sum, a Thomist natural law theory should be thoroughly first-personal about action, but recognize that practical rationality requires metaphysical consistency, and support intrinsic values against reductive commensuration while recognizing that such values are instrumental by participation in the one final end.  This is just the view Jensen refers to as moderate physicalism, where “the formality of moral good and evil arises first of all from the exterior action conceived.”  In order for that conception to be rational, it must include both the efficient and formal specifying causes for the benefits desired, and it must obey the first principle of practical reason which beseeches that we pursue good and avoid evil (damage to intrinsic good).  This view can be stated in a completely first-person way.
In his now-published paper from the fall 2008 School of Philosophy lecture series at The Catholic University of America “Metaphysical Themes—in Honor of John F. Wippel,” Gregory Doolan proposes a resolution to Aquinas’s seemingly contradictory statements regarding whether substance is a real genus.  Rather than advancing a developmental account, Doolan uses the phrase “metaphysical genus” from Article One of the Disputed Question on Spiritual Creatures (hereafter QDSC) as an interpretive key to the rest of the Thomistic corpus  and attributes Aquinas’s denials that substance is a real (i.e. not merely logical) genus to a more restricted sense of genus.  A metaphysical genus, Doolan concludes, is a real genus encompassing all finite substances in an analogical community.  Doolan’s choice of interpretive key is ingenious, but his proposal suffers from lack of textual support for a sense of genus relying on analogical commonality.  While Doolan is right to worry about the Aristotelian objections to a univocal community among beings, Aquinas’s novel real distinction between essence and (the act of) existence  provides the basis for resolving those worries.  A closer reading of Article One of the Disputed Question on Spiritual Creatures in light of that distinction suggests that the analogy involved in the metaphysical genus of substance mooted there is not an analogous sense of substance but an analogous sense of matter.
Jensen’s spectrum from Abelardian to physicalist positions regarding intention’s role in the specification of moral objects turns out to be a triangle, with the other axis the spectrum between composite and per se accounts of action. This supports Jensen’s claim that moderate physicalism is a stable position, but raises the prospect of dissenting divergence from that mean along each of the axes. Long and Rhonheimer are right to cast killing in lethal self-defense as praeter intentionem, but wrong to think that it thereby requires no explicit color of law. This account in terms of circumstance, far from being gerrymandered, accords better with the text of II-IIq64a7 itself, its framing within the Prima Secundae, and the constraints imposed later in the Secunda Secundae. Jensen’s account of intention thus best survives the proving ground of lethal private self-defense, but only with the benefit of Long and Rhonheimer’s improved survey of that ground.
Geometry and physics have long been intertwined as Zeno grappled with the implications of Euclid's formulations and cosmologists attempted to make sense of Einstein's General Relativity. What has been notably absent from the discussion, however, is a coherent theory explaining just what sort of statement a geometry is and how that statement might be verified in conjunction with physics so as to answer meaningful questions about the world. With his insight into insight that a geometry is a certain kind of insight formulating the relationships of data about extensions and durations, Lonergan has provided this crucial missing link. A better understanding of what he means by a geometry that can be objectively verified is thus an advance in philosophical cosmology, especially when it is contrasted with the difficulties suffered by those who have made flawed assumptions in the matter. Whether the universe turns out to be Euclidean in nature or not is a matter for the scientists, but the appropriate data can in fact make an insight into the matter invulnerable and lead to a truly objective geometry.
“Hegel’s system is not afraid of facts…is not afraid of contradictions…The only thing the System has to fear is that it itself should be no more than some incomplete viewpoint, and in fact that is what it is.” In the first work of the Encyclopaedia of the Philosophical Sciences, the Logic, Hegel attempts to insulate himself from this sort of claim by categorizing the alternative possibilities for philosophical systems and demonstrating a fundamental difficulty with each. Thus while Lonergan’s conceptual claims against Hegelianism in Insight are strong, claims of equivalent force must be made that Lonergan’s generalized empirical method does not fall within one of the categories which Hegel so forcefully critiques. If these claims are successful, then the case that Lonergan’s insights into insight transcend Hegelianism is made stronger, while if they fail then Lonergan’s position is cast into doubt as merely another conceptual variation explainable with recourse to the System. It is my contention that the claim for Lonergan’s transcendence of Hegelianism succeeds, and I will demonstrate this claim by explicating Hegel’s objections against each method he identifies in the Logic and indicating why Lonergan’s thought cannot be fully critiqued by that categorization.
Taking the engineering part of conceptual engineering seriously means learning from the methods of other engineering disciplines. Since the concepts that philosophers want to engineer are generally parts of large, complex, existing systems, they are attempting to engage in systems engineering. The branch of systems engineering that deals most in concepts is software systems engineering. This is a relatively new discipline, having emerged out of less systematic software development practices by borrowing certain key methods from recent developments in architecture. This talk traces the close analogy between architects' design patterns and philosophers' concepts, then follows the lessons learned in software systems engineering about what makes a good design pattern, how to engineer a better one, and how to evangelize better design patterns after their invention. Two key lessons are a sweet spot in the scope of conceptual engineering projects and the importance of structures of collaboration.
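To make the borrowed vocabulary concrete, here is a minimal, generic sketch of what software engineers mean by a design pattern, using the Strategy pattern in Python; it is offered only to illustrate the analogy and is not drawn from the talk itself.

```python
from typing import Callable, List

# A "strategy" is an interchangeable policy hidden behind a stable interface.
SortKey = Callable[[str], object]

def by_length(title: str) -> object:          # one concrete strategy
    return len(title)

def case_insensitive(title: str) -> object:   # another concrete strategy
    return title.lower()

def catalog(titles: List[str], key: SortKey) -> List[str]:
    """The 'context' depends only on the strategy's interface, not on its details."""
    return sorted(titles, key=key)

titles = ["Insight", "De Caelo", "Veritatis Splendor"]
print(catalog(titles, by_length))
print(catalog(titles, case_insensitive))
```

The pattern, like a well-engineered concept, is a named, reusable solution-schema whose value lies in how it fits into a larger existing system rather than in standing alone.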
Philosophers are often taught to read charitably—to construe arguments in the strongest fashion possible.  Doing so is charitable in the ordinary sense of the term, avoids bickering over minor reformulations, and avoids dismissing arguments too easily.  Construing others’ arguments for them can also have negative effects, however.  When the others are high-status members of our own philosophical culture, it allows them to escape criticism and shift burdens onto their more marginalized readers.  When the others are members of other philosophical cultures, it can diminish the true challenge posed to our own ways of thinking.  When and how should we attempt to read charitably?
Many neo-Aristotelians have abandoned talk of elements, considering them to pose insoluble problems which are of only antiquarian interest.  Aristotle’s doctrine of homoeomerism, however, commits him to the same difficulties for the transformation of ordinary substances as for elemental transformation.  Furthermore, the extant solutions to the Problem of Material Constitution generate the exact same options as those considered insoluble in the case of the elements.  Hylomorphists must squarely face the difficulties of elemental transformation if matter is to be metaphysically useful.
Since the second half of the twentieth century there has been an ongoing fascination with the “unreasonable effectiveness” of mathematics in natural science. In the predominant contemporary metaphysical accounts of mathematical beings, that effectiveness is indeed condemned to remain a mystery. While the Thomist account, in which mathematical objects are abstractions from the real category of quantity, does a great deal to explain the effectiveness of mathematics in describing the physical world, it was broadly rejected due to seeming incompatibilities with nineteenth and twentieth century developments in the logical foundations of mathematics. The most historically important claims of incompatibility, however, are apparent rather than real, and reflect limitations in the claimants’ understanding of both Aquinas’s views and the implicit metaphysics of mathematical formulae. I show this by first situating the Thomist position within contemporary debates in philosophy of mathematics, second illustrating the need for quantity in motivating accounts of number rich enough for analysis, and third showing that Thomism is not incompatible with the mathematical analyst’s need for infinite sets.
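A standard illustration (not drawn from the paper) of the analyst's need for infinite sets is the completeness of the real numbers: the least-upper-bound property quantifies over arbitrary bounded sets of rationals, as in

$$\sqrt{2} \;=\; \sup\{\, q \in \mathbb{Q} : q > 0 \ \text{and}\ q^{2} < 2 \,\},$$

a supremum that is well defined only if the infinite set over which it is taken is treated as a completed whole.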
Philosophy of science constrains the definition of miracles via David Hume's objection to their possibility, while philosophy of religion imposes the constraint that they must be immediate evidence of divine power. Can a coherent definition of miracles be constructed that tracks historical concepts and meets these criteria? I suggest that Thomas Aquinas' definition in the Summa Contra Gentiles can be read in such a way as to meet these constraints and show how it applies to various miraculous accounts from the Bible. This then illuminates the boundaries of religion and science in understanding the world.
The literature reveals two main approaches for denying that the ontology supplied by physics is the only one available to science. First are those advocating supervenience, who treat the ontology of chemistry and the other special sciences as if it were a mathematical formalism: perhaps derivable on its own terms, more useful in some circumstances, and indubitably true, but not expressing any truths about the world which cannot be expressed in the ontology of physics. Second are those advocating downward causation, who in essence believe that physics is incomplete on its own terms: without knowledge derived from the special sciences, they say, predictions about energy transfer will be less restrictive or exact. It is noteworthy that working chemists seem very uncomfortable with both of these approaches. The former ignores the need for chemically-derived constants in so-called ab initio work: if true, it is as yet unproven and unsupported by the scientific evidence. The latter strikes many as redolent of alchemy, and is also notably unproven.
I propose instead that chemistry and physics are genuinely autonomous sciences. This approach demands that both sciences have completely independent ontologies: schema of terms and relations without any overlap. Orbitals, which aren’t part of the ontology of chemistry or that of physics, exist only as a mental posit. Leptons and molecules both exist, but it is meaningless to say that the latter contain some number of the former. This view not only makes sense of the history and mathematics of chemistry (as shown in my presentations at Philopolis Montreal 2011 and 2012), but also treats physics as complete in itself. Chemistry properly speaking, then, can’t make any physical claims about e.g. heat transfer. Much of what chemists do is then defined as physics, but that need not raise any more problems than physicists doing mathematics does, provided the epistemological and ontological claims can be properly extricated. Chemical predictions about physics would be merely heuristic (descriptive rather than purely explanatory). Physics, of course, does condition chemical possibility, but does not determine it (which is consistent with the actual facts about ab initio quantum chemistry).
This paper, unlike my previous efforts, takes its scientific claims as given and instead focuses on working out the philosophical differentiations and implications of such a theory. It seems that the theory is immune to common attacks against strict reduction, supervenience, and downward causation.
Reductionist accounts of chemistry view molecules as heuristics for large classes of physical particles and their interactions, whereas anti-reductionist accounts often invoke downward causation of the whole upon the parts. Those with substantial experience in chemistry who view the former as too simple and the latter as too mysterious, like Eric Scerri, have attempted to separate epistemic and ontological issues, arguing that chemistry’s ontology is merely heuristic while its epistemological method is distinct. This claim, while appealing to working scientists, tends to confuse philosophers, who in the tradition of Bertrand Russell see ontology as the content of epistemic process in science.
In this paper I review the historical experiments that led to the development of the periodic table and Avogadro’s number with an eye to discerning the ontological terms and relations revealed by those experiments. What were the actual objects of research and how were they revealed? I contend that without disclaiming the role that energy plays in chemical reactions, elements and their relations in molecules were disclosed together by a set of insights which are primary, rather than heuristic. The objects of chemistry thus have their own proper ontology without a constitutive relation to those of physics.
If the history of chemistry bears out a separate ontology, just what is the relationship of chemistry and physics? I contend that it is a conditioning relation, whereby certain physical states make more likely or preclude certain chemical states, without fully determining those states. I then explore the literature of modern ab initio quantum chemistry to substantiate this assertion. Left open is the question of how chemists can predict bonding in complex molecules if the tool of orbitals is not available, or if orbitals can be recast in an ontologically coherent way rather than as molecular substructures.
The Random House Unabridged Dictionary defines an atom as “the smallest component of an element having the chemical properties of the element, consisting of a nucleus containing combinations of neutrons and protons and one or more electrons bound to the nucleus by electrical attraction; the number of protons determines the identity of the element.” This definition is common, but this paper will show that an instance of an element cannot consist of fundamental particles. In other words, while chemistry clearly abides by the laws of physics, chemical processes cannot be reduced to physical ones, and so the common definition of the atom is a category mistake.
The modern science of chemistry is demarcated by the discovery of the periodic table which systematically categorizes elements in relation to each other; that table was formulated well before the discovery of the electron. In time it was demonstrated that elements have minimal components necessary for their properties and thus the atomic theory was born and the natural kind water was understood to be H2O, meaning that H2O was both necessary and sufficient for water. This is demonstrable both experimentally (rational synthesis works) and theoretically (the periodic table). With the advent of high-energy physics and “atom smashing” it has been broadly assumed that the same is true of atoms: 2He is just 2(2n2p2e). While those constituents are necessary for 2He, however, it is a mistake to assume that they are sufficient. Experimentally, merely composing the elements can easily result in alpha and beta particles or plasma rather than a molecule, and theoretically speaking Schrödinger’s equation is not solvable for molecules. This latter fact is not just an accident of mathematics: it is precisely the empirical residue remaining after the verifiable insights of physics occur that is the ground for the independent science of chemistry.
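The claim about the Schrödinger equation can be made precise. For a molecule, the time-independent equation $\hat{H}\Psi = E\Psi$ involves the full Coulomb Hamiltonian (stated here in its standard form, for reference only):

$$\hat{H} = -\sum_{i}\frac{\hbar^{2}}{2m_{e}}\nabla_{i}^{2} \;-\; \sum_{A}\frac{\hbar^{2}}{2M_{A}}\nabla_{A}^{2} \;+\; \sum_{i<j}\frac{e^{2}}{4\pi\varepsilon_{0}\,r_{ij}} \;-\; \sum_{i,A}\frac{Z_{A}e^{2}}{4\pi\varepsilon_{0}\,r_{iA}} \;+\; \sum_{A<B}\frac{Z_{A}Z_{B}e^{2}}{4\pi\varepsilon_{0}\,R_{AB}},$$

where $i, j$ range over electrons and $A, B$ over nuclei. Closed-form solutions exist only for one-electron (hydrogen-like) systems; molecular results are obtained by approximation schemes such as Born-Oppenheimer and Hartree-Fock, which is the sense in which the equation is not solvable for molecules.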
As long as the terms and relations of chemistry (atoms, molecules, elements, valences) are not terms and relations of physics (particles of the Standard Model, wave functions), and are not demonstrable in the terms of the more fundamental science, it is incorrect to say that chemistry is in principle reducible to physics: it is a special science in its own right, not merely a heuristic. Quantum chemistry is still valuable in illustrating the conditions under which chemical relations are possible to a greater degree of precision than physics of more general interest, but it is incapable of determining chemical outcomes. This provides support from the philosophy of science for the metaphysical doctrine that there are no things within things.
I elaborate criteria for Platonic Technae on the basis of the dialogues, then show that St. Thomas' conception of Sacra Doctrina meets these strictures.
Using tools from Lonergan's Insight, I explode the dichotomies Kierkegaard presents in the first essay of Philosophical Fragments, vindicating Augustine's conception of the unity of truth.
In Good and Evil Actions: A Journey through Saint Thomas Aquinas, Steven J. Jensen attempts to steer a dialectical middle course between what he calls Abelardianism (“the view that by itself the exterior action must receive its moral character from the interior act of will” as represented by Germain Grisez, John Finnis, Joseph Boyle, and Martin Rhonheimer) and Physicalism (“the view that the exterior action has its own moral species, not derived from the act of the will” as represented by Stephen Long, Jean Porter, and Kevin Flannery) while remaining close to the text of Aquinas. On the question of an overall framework for the philosophy of action Jensen largely succeeds in this ambitious goal, greatly clarifying Aquinas’ terminology and the relation of interior and exterior acts. Jensen is less successful, however, at showing how Aquinas’ examples can be squared with his broader moral theory and the contemporary understanding of the Church. This is particularly apparent in the questions of self-defense and when circumstances give moral species to actions, both of which are prominently discussed in the work.
Catholic ethicists today are buffeted by allegations of classicism on one side and lack of fidelity to the Magisterium of the Church on the other. The Magisterium insists on the meta-ethical criterion that ethics must be able to label some acts as intrinsically evil. Those who recognize that ethics arise from the self-reflection of subjects are dubious that such categorical claims can be grounded absent special revelation. In Thomist terms, the question is how natural law arises from reason.
At the 2011 Workshop, R.J. Snell investigated whether the New Natural Law theorists had successfully bridged this gap, and concluded that while moving in the right direction, their efforts had not yet fully embraced the objectivity of authentic subjectivity. Pat Byrne has also presented repeatedly on his ongoing work developing a Lonerganian perspective adequate to engage contemporary bioethics. At Lonergan on the Edge 2011, Gilles Mongeau recommended Martin Rhonheimer’s Perspective of the Acting Person to the assembly as an exemplar of converted ethics. Rhonheimer does not reference Lonergan, but reads Veritatis Splendor precisely as a call for an objective ethics grounded in authentic subjectivity, contra the New Natural Law theorists.
I propose a workshop, potentially concluding with a Friday plenary panel, which would bring Byrne and Rhonheimer’s views into dialogue against the backdrop of Veritatis Splendor and New Natural Law.
We will examine this Jesuit method of proselytizing, as contrasted with that of the Friars, and trace its roots through their different missionary territories, the varying charisms of their orders, and finally to the differing policies of their colonial protectors in East Asia. Our argument is that the so-called “Chinese Rites Controversy” was not a simple theological controversy between orders but rather a complex  anthropological dispute with roots in the life histories of both the missionaries and their native informants.
While consecrated religious may be first responders of a sort, I have arrived at the fast-burning Twitter brushfire of HeresyLetterGate (wherein Ross Douthat suggested that Massimo Faggioli might be a heretic and Faggioli in turn questioned Douthat's credentials) at a more medieval pace. While the thorns have now burned away leaving only scorched grass that withers and fades, I remain consumed by Katie Grimes' powerful coda to the affair, in which she rightly exhorts us to "build a church in which black lives truly matter and to whom white supremacy appears anathema." Indeed our guilt must be set out and we must gain wisdom of heart, but I would like to turn Catholic theologians' attention to the specifically Christological and ecclesiological terrain of Grimes' exhortation. In order for teachers of theology to "equip the saints for the work of ministry, for building up the body of Christ" (Eph 4:12 NRSVCE), we must be able to distinguish the place of teaching among the other gifts of Christ with regard to the end that "all of us come to the unity of the faith and of the knowledge of the Son of God, to maturity, to the measure of the full stature of Christ" (v13). A body wracked and bent by racism is far from full stature, but what are the contributions of theology to its maturation? This is closely related to Rod Dreher's question:

"Why study academic theology? Does one do it to shore up the master's house, and maybe to add new rooms onto it, based on the experience of living in it during a different time? Or does one study academic theology to tear the house down and build something more modern on the footprint?"

Only in light of the aims of theology can we understand its nature well enough to ascertain who should be doing it and what rules are necessary for its mature practice in a way that transcends the narrative of polarization.