Cancel Culture in Political Science?

BY JOHN K. WILSON

Pippa Norris of Harvard has an interesting new paper on the views of political scientists globally, “Closed minds? Is a ‘cancel culture’ stifling academic freedom and intellectual debate in political science?”

It’s based on a global survey of political science professors, World of Political Science, 2019, in which scholars were asked: “based on your experience, please indicate whether you think the quality of the following aspects of academic life have changed over the last five years” in three areas:

  1. Respect for open debate from diverse perspectives
  2. Pressures to be ‘politically correct’
  3. Academic freedom to teach and research

I am deeply skeptical of using survey research like this to determine what’s really happening on college campuses. Instead, the responses are much more likely to represent what’s reported in the media about higher education. Scholars will interpret “your experience” as “your understanding” rather than a report of personal threats to their freedom. For example, nobody imagines that almost half of political science scholars have experienced a personal attack on their academic freedom in recent years. They are responding to their perception of broader national trends. And during the period covered by the study, there were famous cases nearly every political science professor in America has heard of, such as Charles Murray at Middlebury or Milo Yiannopoulos at UC Berkeley, while stories of censorship by the right are much less likely to be publicized.

The survey is also flawed because it asks about changes over the last five years rather than asking for an assessment of current conditions. A “trend” question like this tends to distort reality by capturing recent perceptions of change instead of the actual extent of repression.

But the biggest problem with the survey is the bias in the first two questions, which is likely to increase pessimistic responses from right-wing scholars.

First, “Respect for open debate from diverse perspectives.” The concept of viewpoint diversity tends to be much more favored on the right, and a term like “respect” (closely associated with civility or good manners) is also, I think, more valued by conservatives. According to this question, it’s not enough to have open debate: you are entitled to respect for your ideas, not just the freedom to express them. The key problem with the use of “respect” is that many people view harsh criticism as a lack of respect. Respect may also be tied to status. It’s possible that older, more conservative scholars think they are entitled to respect because of their seniority. When their views are criticized by students or young scholars, they may view this as “disrespectful” and, given how the question is worded, report that respect for open debate has declined.

The second question is even more biased. Any mention of “politically correct” is going to be perceived as a discussion of censorship by the left, one that excludes censorship by the right. As the author of The Myth of Political Correctness 25 years ago, I think I can say that it is a biased term. There is essentially no discussion of right-wing political correctness. Even those of us who recognize and document the concept tend to use terms such as “conservative correctness,” because “political correctness” is a hopelessly loaded phrase.

The only truly neutral question of the three is “academic freedom to teach and research,” and not surprisingly, the results on this question are radically different from those on the other two.

Norris sent me the mean responses to the individual questions, which reveal very different patterns.

On the first two questions, right-wing scholars in the US are dramatically more likely to see a worsening picture than left-wing scholars. But on the academic freedom question, right-wing scholars were only marginally more pessimistic than left-wing scholars. (The mean gap between the far left and the far right is more than 1.5 points on the first two questions, and less than 0.5 points on the academic freedom question.) Considering the massive media attention to attacks on the academic freedom of conservatives, this gap in the third question is only surprising because it is so small.

Because the academic freedom question immediately followed a biased question about politically correct censorship, it’s also possible that respondents were primed to frame academic freedom in terms of the PC question, and to think about violations of academic freedom primarily in terms of politically correct cases where conservatives are censored. That could easily explain the small differences. (It should also be noted that some left-wing scholars may believe that academic freedom has gotten worse over the last five years because of other leftists who engage in censorship, not because of censorship by right-wing forces, so we cannot conclude from the academic freedom question that there is no substantial problem of censorship targeting conservatives; we simply have no way of knowing from this survey.)

Unfortunately, Norris’ proposed “Cancel Culture Index,” which combines these three questions and yields a hockey-stick graph, is fatally flawed by the fundamental biases in the first two questions and by the limitations of survey research.
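To make the aggregation problem concrete, here is a minimal sketch, in Python, of how a composite index built from three 5-point trend items might behave. The simple averaging, the numbers, and the function name below are my own assumptions for illustration, not the paper’s actual construction; the point is only that when two of the three items are worded in ways that inflate pessimistic answers, the composite largely inherits that inflation even if the third, more neutral item barely moves.

```python
# Hypothetical illustration only: a composite "cancel culture"-style index
# built by averaging three 5-point trend items (1 = much better, 5 = much worse).
# The values below are invented for illustration; they are NOT the survey's results.

def composite_index(items):
    """Average a list of 1-5 trend scores into a single index score."""
    return sum(items) / len(items)

neutral_item = 3.2          # hypothetical mean on the academic-freedom question
loaded_items = [4.0, 4.0]   # hypothetical means on the two loaded questions

index = composite_index(loaded_items + [neutral_item])
print(f"Composite index: {index:.2f} vs. neutral item alone: {neutral_item:.2f}")
# Two-thirds of the composite comes from the loaded items, so any wording
# bias in them dominates the headline number.
```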

She didn’t create the questions, but I think it was too tempting to point to a result that grabs headlines rather than to the more accurate, but far less dramatic, result from the academic freedom question, which tells us very little of importance. Survey questions generally don’t give us much useful information about complex and contentious social phenomena (such as the extent of censorship and who it affects), and especially not when the questions are as biased and imprecise as they are in this survey.

Perhaps the most notable findings from this survey are the differences between the US and other countries (especially poorer regions of the world). The US and Western Europe tend to have more left-wing faculty, and conservatives there tend to see more censorship. By contrast, in much of the rest of the world, leftist faculty are rarer than conservatives and perceive far greater censorship (even with the bias of the questions).

7 thoughts on “Cancel Culture in Political Science?”

  1. Like John Wilson, I am always skeptical about surveys. However, I am equally skeptical of those who are skeptical about surveys, myself included. 🙂 This is especially true when the survey results conflict with one’s pre-existing ideology.

    For instance, if asked these questions, I would answer based on my personal experience, since that is how the questions were framed. Why should we ASSUME that “the responses are MUCH more likely to represent what’s reported in the media about higher education. Scholars will interpret ‘your experience’ as ‘your understanding’ rather than a report of personal threats to their freedom.” These respondents probably have advanced degrees and considerable academic experience. Why would they misread a survey question?

    • It’s just not possible that 40-50% of the scholars in political science have personally experienced a direct threat to their academic freedom in teaching or research during the past five years. They must be talking about the broader academic atmosphere. And that’s where we get into the difference between reporting personal experiences and reporting subjective opinions about academia.

  2. Thanks for the interest. All surveys ask about perceptions as well as direct experience. If people believe that academic conditions are getting better or worse, just as with perceptions of the economy or COVID, then we should understand this as the social construction of reality. It matters, and we should seek to understand these perceptions, irrespective of actual conditions. Like the public’s views on rates of crime or jobs, this shapes their reality, whether factually true or false.

    Explaining WHY people believe that conditions are getting better or worse is the task of the survey analyst. It may be images in the media, or interpersonal discussions with friends and colleagues, or partisan motivated reasoning and elite cues, or whatever.

    So, with respect, the 5-point ‘getting better/getting worse’ scaled questions are far from biased; they are designed to tap into several common beliefs, as part of a broader 22-item battery about working conditions. The design is standard practice in social surveys. If conservatives believe that colleagues are teaching ‘politically correct’ views on American college campuses, and if liberals believe that academic freedoms are worsening, we should seek to explore these views, because this is their perceived reality, not dismiss such concerns out of hand. This is the whole point of survey evidence: it is about monitoring subjective opinions, not measuring ‘objective’ conditions!

    Finally, the survey is global, in over 100 countries, not focused on any one society. The fact that similar perceptions were shared in a wide range of advanced industrialized societies, like the US, Australia, the UK, Germany and France, but not in developing countries, is the most important finding to be explored further.

  3. The political science department is a very thin ‘cancel’ sampling slice, and really can’t be isolated culturally.

    There is a slight liberty taken in this essay, which seems to assert less ideological homogeneity among the Left and more among the Right. To get at the underbelly of this issue, and the so-called ‘cancel culture’ phenomenon, you really have to unpack a lot of factors that are all mixed together here.

    The belief that Left ideology is resulting in a suppression of political and intellectual diversity has at least two dimensions: one experienced by professors quietly carrying out their writing and research, and the other in the broader pedagogic and cultural sociology of the campus as a “colony.” Moreover, implicit “cancelling” is also deeply embedded in faculty hiring and promotion; in PhD admissions; in publication and editing; in visiting scholar opportunities; and in grants and awards. ‘Cancelling’ is culturally inherent in department routines.

    Otherwise, take two political science poles: BLM and BDS. Nearly every university and college, and political science department, has acceded to their demands, and indeed promotes them (is ‘cancel’ also an additive act?). Indeed, what is BLM but a cancel initiative gone operational and violent?

    Indeed, the modern university is, by definition, ipso facto, now an official “Cancel College.” It walks on eggshells over the slightest of political racial identitarian demands, even if, like BLM, they are violent and unlawful. It will “cancel” over any even remote Title VI risk, and of course it will at the same time take anyone’s money (like Jeffrey Epstein’s at Harvard). The whole university is, in effect, the political science department; ergo the sample’s true validity and usefulness.

    Our nation’s law schools, moreover, which are more political science operational centers than law schools, are generally of such Left ideological extremism that even law itself is sacrificed (the Floyd case was utterly biased by law school Deans and professors coming out publicly only a day or two after limited news reports, with statements that deviated from all western legal norms of due process, presumption, capacity, due diligence and evidence standards). Law schools are more political science departments than Political Science. Chicago Law fawned over the disgraced James Comey; it hostilely rejected its other alum, Pat Cipollone, Trump’s successful impeachment defense lead.

    You are also using a few clichés here, such as equating ‘conservatism’ with senior age and status (tell that to Charlie Kirk and Turning Point, which has raised millions from young adults, or to The Federalist members, for example). Conservatism (which is a wide spectrum) is deeply represented in the 18-30 sector, and growing.

    Regards, ’96, The University of Chicago

  4. Readers may be interested to see a study, published by the NAS, on D:R ratios (Dem:Rep) across disciplines/departments in a quite significant college sample. It is not perfectly responsive to the study referenced above, but it is very informative and contextual nonetheless. It is perhaps quite troubling, as these are significant feeder institutions to graduate professional schools. As for the subjective opinion component, it would be fascinating to overlay relevant student perspectives, among others in the network mix.

    https://www.nas.org/academic-questions/31/2/homogenous_the_political_affiliations_of_elite_liberal_arts_college_faculty

    The political science department ratio (8:1) is still quite striking, despite its fit in this set, which you can see in bar and other charts in the report.

    Regards.

    • Personally, I am less concerned with the numerical distribution of liberal vs. conservative professors than with the ways and means by which their political bias is injected into the classroom: through lectures, discussion topics, assigned readings, guest speakers, and so on. This can be overt or covert, but it’s there. And, yes, people have lost their reputations, been demeaned and insulted, or even lost their livelihoods over such academic politics.

      Remember Quincy Jones’s advice to all those superstar singers who sang “We are the World; we are the Children” back in the day? He said, “Check your ego at the door!” Couldn’t Poli Sci (and other) professors check their IDEOLOGIES at the door?

      • Yes, quite so: that may be the core data point, the classroom itself and its total reach, as you note, into lecture material, readings and more. Bias is incredibly subtle, and insidious. That’s among the reasons why I suggested surveying the actual students as an effective cross-reference. As a parent of several college students, I can say that parents, and employers, are excellent survey sources as well, to complete the bias picture. The change in student cognition, framing, apprehension and judgement, even between semesters or even after a single class, is fascinating to observe. Often it is quite positive; other times, quite troubling.

        The sensibility you have about “checking ideologies at the door” (to the classroom) is, unfortunately, probably impossible to enforce effectively as a filter; which is why, indeed, the D:R or other ratio in the hiring process itself, at the other “door” (to the department offices, or now, the digital network), is so key to political or viewpoint heterogeneity. When the ratios are that skewed (see the chart in the article), there is also, as you all know, the problem of “groupthink” or group solidarity, and within departments, or even professional schools, a very visible “bias escalation” and ‘escalation of irrational commitment’ that further excites, fuels and enlarges the bias effect from the individual level to the tribal. That has some relevance to whatever filtering and suppression may have been present in the Harvard survey. The modern law school is a prime example otherwise. Readers may enjoy my monograph on this and related bias issues in the UChicago Knowledge Archive: https://knowledge.uchicago.edu/record/2144?ln=en. The Center for Decision Research at UChicago’s Booth School of Business has explored this issue as well.

        Stanley Milgram’s famous experiment on the obedience-to-authority dynamic has some relevance, but more interesting are Milgram’s less-known “Cyrano” experiments, or the “cyranoid illusion,” which display extreme identity incongruity. It is manifest in the mirroring of messaging distributed through social sources and electronic media, and, more disquieting, in the effect on students, where such illusion is both repeated and amplified by their behavioral and cognitive mirroring of academic authority figures. You can see this every day in otherwise highly educated professionals speaking as if they were merely a medium. See Corti, K., & Gillespie, A. (2014), Revisiting Milgram’s Cyranoid Method: Experimenting With Hybrid Human Agents, The Journal of Social Psychology, PMID: 25185802, and Marcia Holmes (Ph.D., 2014), Voices off: Stanley Milgram’s cyranoids in historical context, https://journals.sagepub.com/doi/full/10.1177/0952695119867021. The Holmes piece is very well done. She wrote her original thesis with the Committee on Social Thought at Chicago.

        Regards, ’96, UChicago; ’84, UTexas Austin
