
Symposium on consensus and disagreement on the occasion of the Nils Klim Prize


On the occasion of Finnur Dellsén recently being awarded the Nils Klim Prize, Heimspekistofnun (the Institute of Philosophy) and the Holberg Prize organisation are holding the symposium “Consensus and Disagreement: Perspectives from Social Epistemology and Philosophy of Science”. Alongside Finnur, four international speakers will take part: Deborah Tollefsen, Boaz Miller, Kristen Intemann and Dunja Šešelja. They will give talks on topics such as scientific disagreement, consensus among scientists, skepticism about science, and trust in science.

The symposium is open to everyone and runs from 9 a.m. to 5 p.m. on Monday 20 January in the Nordic House (Norræna húsið). The symposium will be conducted in English. Participants are encouraged to register by email to vkmagnusson@gmail.com.

Programme:

  • 9:00-10:00 Group Belief and Group Disagreement. Deborah Tollefsen, University of Memphis.
  • 10:30-11:40 I Know, We Know: Bringing Individual and Collective Knowledge Together. Boaz Miller, Zefat Academic College.
  • 11:40-13:00 Lunch.
  • 13:00-14:10 Fighting Doubt By Promoting Warranted Trust. Kristen Intemann, Montana State University.
  • 14:30-15:40 Scientific disagreement, epistemic toleration and collective epistemic responsibility. Dunja Šešelja, Eindhoven University of Technology.
  • 16:00-17:00 Consensus and Marginal Dissent: A Social Epistemology for the 97%. Finnur Dellsén, University of Iceland & Inland Norway University of Applied Sciences.

About the talks:

Group Belief and Group Disagreement. Deborah Tollefsen.

Debates regarding the appropriate response to peer disagreement focus, almost exclusively, on disagreement between individual agents. The literature brackets the fact that believers are embedded in epistemic communities whose members share beliefs (Christensen 2014 is an exception here) and often ignores the possibility that groups themselves have doxastic states which can come into conflict with the doxastic states of individuals and other groups. There are at least two sets of questions raised by what I shall call, for lack of a better phrase, the groupiness of belief. First, if beliefs are often shared by multiple people (in some cases, millions), what, if anything, does this mean for cases of disagreement? What should my doxastic reaction be (conciliatory, steadfast, or somewhere in between) when I confront the fact that many epistemic peers disagree with me? Second, if, as some have argued, groups themselves can have beliefs and, presumably, disagree, then what should a group’s doxastic reaction be to such disagreement, and what implications does it have for group members? In this presentation, I briefly explore the first set of questions and then turn to consider the second set in more depth. I shall discuss two recent articles (Carter 2014; Skipper & Steglich-Petersen 2019) on group peer disagreement and explore the various issues raised by the accounts offered. In particular, I shall focus on the notion of group peerhood, which appears to be undertheorized in this recent literature.

I Know, We Know: Bringing Individual and Collective Knowledge Together. Boaz Miller

Philosophers have taken a growing interest in individuals’ epistemic dependence on other members of their epistemic community, or on their epistemic community as a whole (e.g., Green 2016; Goldberg 2010; Kelp 2014; Miller 2015; Pritchard 2017). They argue that an individual’s epistemic dependence on others goes beyond relying on them as mere informants. Rather, whether her personal beliefs are justified or constitute knowledge depends on epistemic components (e.g., evidence, segments of a reliable belief-forming process, cognitive virtues) possessed only by other subjects, with whom she forms trust relations. This line of thought shares similarities with a tradition in philosophy of science, sociology of knowledge, and feminist epistemology that regards knowledge as a fundamentally communal or collective good, from which individual knowledge is derivative (Kuhn 1970; Kusch 2002; Longino 2002; Nelson 1993).

The relations between epistemic dependence and collective knowledge, however, remain largely unexplored. In particular, little has been said about what an individual subject’s standing needs to be vis-à-vis her epistemic community for her to personally know. My paper addresses this lacuna.

Taking Longino’s (2002) pioneering analysis of the relations between individual and collective knowledge as a first approximation, I argue that an individual subject personally knows only if three conditions obtain. First, the individual’s true belief is responsibly formed, where a responsible subject does, inasmuch as she can, what is required of her in a situation to bring about true and rational beliefs. Responsibility is delimited in part by role-expectations, while practicability is delimited by facts about her competencies and technological, ethical, and economic circumstances. Second, whether the individual subject is aware of that or not, the overall available evidence within the relevant epistemic community must sufficiently support her belief given a politically legitimate weighing of inductive risks; namely, the risks that stem from making an incorrect epistemic judgment (Douglas 2000). Third, her belief is objectively justified; namely, the evidence the community possesses distinguishes the real state of affairs from relevant alternatives, whether the community is aware of them or not.

My account gives a unified and consistent analysis of situations in which individuals have or lack knowledge or justified belief. A novel and arguably controversial feature of my account is that it diagnoses alleged Gettier cases and sensitivity and safety violations as justification failures, rather than as knowledge failures. According to my account, in alleged Gettier cases and modal failures, the collective justification standards fall short of the justification standards objectively required in that situation (cf. Foley 2012). By contrast, in skeptical scenarios, such as being a brain in a vat or deceived by an evil demon, the justification standards objectively required to rule out relevant alternatives are too high for a community to meet in principle.

Last, my account identifies researchers’ under-substantiated or premature scientific discovery and confirmation claims as a pathology of testimony, which has been overlooked in social epistemology. It also identifies a common, often overlooked way of thinking of knowledge in the real world that emphasizes its being collectively justified, rather than its being true.

Fighting Doubt By Promoting Warranted Trust. Kristen Intemann

There appears to be a significant “gap” between scientists and the public, where people reject widely accepted scientific claims about everything from climate change, to vaccine safety, to even whether the earth is, in fact, a globe (Pew 2015). This gap is relevant not only for epistemic reasons. Failure to believe certain scientific claims can also have important effects on people’s behaviors and their support for public policies consistent with the evidence (Aklin and Urpelainen 2014; McCright et al. 2013). I consider several different explanations for this gap and evaluate solutions that have been pursued by scientists and science studies scholars. I argue that certain strategies, such as emphasizing consensus among experts or discrediting dissenters, fail to address (and may actually exacerbate) an important dimension of the problem: a lack of trust both in scientists and in the values that are sometimes presupposed in certain research programs. Alternative strategies that might more effectively address this aspect of the problem are identified and defended.

Scientific disagreement, epistemic toleration and collective epistemic responsibility. Dunja Šešelja

In this talk I will examine the question: what should scientists do once they recognize they are involved in a peer disagreement? This problem has so far largely been discussed in the peer disagreement debate in epistemology, which focuses on the adequate doxastic attitude one should hold towards p upon recognizing that one’s peer disagrees about p. However, when it comes to scientific disagreements, philosophers of science have been interested not only, or even primarily, in scientists’ beliefs about phenomena in the given scientific domain, but rather in their cognitive attitudes towards their theories, models, and hypotheses, thereby addressing both the epistemic and methodological norms of inquiry (Šešelja, 2019). Moreover, besides the question of whether and how scientists should adjust their attitudes towards their own theories, of equal importance has been the question of whether and how scientists should adjust their attitudes towards their opponents’ theories. Should an opponent’s theory be epistemically tolerated, engaged with, or rather ignored? These issues lie at the heart of philosophical discussions of scientific controversies and the methodological puzzles arising from them.

My focus in this talk will be on the latter set of questions. I will argue that an adequate response of disagreeing scientists towards their opponents’ views is a matter of two types of epistemic responsibility: the individual and the collective. With respect to the former, I will discuss the norms of epistemic toleration (Straßer, Šešelja, and Wieland, 2015). With respect to the latter, I will discuss the ‘Epistemic Duty to Join Forces’ as a preventionist account of collective epistemic responsibility (Fleisher and Šešelja, 2019, inspired by Hindriks’s (2018) account of collective moral responsibility). I will illustrate these points with some examples of scientific disagreements from the history of science.

Consensus and Marginal Dissent: A Social Epistemology for the 97%. Finnur Dellsén

Around 97% of climate scientists endorse anthropogenic global warming (AGW), the theory that human activities are partly responsible for recent increases in global average temperatures. The fact that so many of the relevant experts endorse AGW is a reason for non-experts to believe in AGW as well. But what is the epistemic significance of the fact that 3% of the experts do not endorse AGW? Put differently, what should we make of the fact that 97% rather than 100% of climate scientists endorse AGW? This paper contrasts unanimity, in which virtually no one disagrees with some theory, with consensus, in which some non-negligible proportion (e.g. 3%) either rejects or is uncertain about the theory. By developing a probabilistic model in which scientists are assumed to be epistemically fallible and socially susceptible, I argue that a consensus is often stronger evidence for a theory’s truth than unanimity. I go on to draw several lessons from this conclusion, e.g. concerning what laypeople should infer from expert pronouncements, how journalists should report on scientific theories, what scientists should communicate to the public, and how philosophers should think about the epistemology of testimony.
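To make the abstract’s contrast concrete, here is a small, purely illustrative Bayesian sketch in Python. It is not Dellsén’s model: the two community types, the accuracy value, the share of dissent-suppressing communities, and the reliability of a socially enforced “dominant view” are all assumptions invented for this example. The sketch merely shows how, under such assumptions, observing 97 endorsements out of 100 can support a theory more strongly than observing 100 out of 100, because perfect unanimity is comparatively well explained by suppressed dissent rather than by independent, fallible judgment.

    # Toy illustration (not Dellsén's model): all parameters below are stipulated.
    # Scientists are fallible; some communities suppress dissent, so unanimity can
    # arise for reasons only weakly tied to the truth.

    from math import comb

    def binom_pmf(k, n, p):
        """Probability of exactly k endorsements among n independent scientists."""
        return comb(n, k) * p**k * (1 - p)**(n - k)

    def prob_share(k, n, theory_true, accuracy=0.97,
                   p_suppressive=0.5, p_dominant_correct=0.7):
        """P(exactly k of n scientists endorse the theory | truth of the theory).

        Two hypothetical community types:
          * 'open': each scientist judges independently with the given accuracy;
          * 'suppressive': dissent is suppressed and everyone endorses the
            dominant view, which tracks the truth only weakly.
        """
        # Open community: endorsements are independent fallible judgments.
        p_endorse = accuracy if theory_true else 1 - accuracy
        open_term = binom_pmf(k, n, p_endorse)

        # Suppressive community: the share is all-or-nothing, set by the dominant view.
        p_all_endorse = p_dominant_correct if theory_true else 1 - p_dominant_correct
        suppressive_term = p_all_endorse if k == n else (1 - p_all_endorse if k == 0 else 0.0)

        return (1 - p_suppressive) * open_term + p_suppressive * suppressive_term

    def posterior_true(k, n, prior_true=0.5):
        """P(theory true | exactly k of n scientists endorse it), by Bayes' rule."""
        p_t = prior_true * prob_share(k, n, theory_true=True)
        p_f = (1 - prior_true) * prob_share(k, n, theory_true=False)
        return p_t / (p_t + p_f)

    if __name__ == "__main__":
        n = 100
        for k, label in [(100, "unanimity (100/100)"), (97, "consensus (97/100)")]:
            print(f"{label}: P(theory true | observation) = {posterior_true(k, n):.3f}")

Under these stipulated parameters the script prints a posterior of roughly 0.71 for unanimity and roughly 1.00 for the 97% consensus; different numbers give different posteriors, but the qualitative contrast is the point of the illustration.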
