Debunking a “Junk Science” Survey of Student Views on Free Speech


When it was first published by the Brookings Institution last week, a survey of over 1,500 college students seemed to provide chilling support for those free speech alarmists who claim that intolerant college students pose a major danger to freedom of speech.  According to the survey, conducted by John Villasenor, a Brookings Institution senior fellow and University of California at Los Angeles professor, a fifth of undergrads now say it’s acceptable to use physical force to silence a speaker who makes “offensive and hurtful statements.”  The story received major coverage in the Washington Post and was touted by the editorial board of the Wall Street Journal and other conservative outlets.

I must say that from the start I was skeptical.  The numbers simply didn’t jibe with my experience and that of almost every colleague I know.  Moreover, nothing in the Post article described the survey’s methodology, and Brookings’s own report said only that it was a “web survey.”  Now leading experts in polling methodology are confirming my suspicions.

The way the survey results have been presented is “malpractice” and “junk science,” and “it should never have appeared in the press,” according to Cliff Zukin, a former president of the American Association for Public Opinion Research.  Villasenor, it turns out, is not a pollster or even a social scientist; he is an electrical engineer.  And his survey was not administered to a randomly selected group of college students nationwide, what statisticians call a “probability sample.”  Instead, it was given to an opt-in online panel of people who identified as current college students.

“If it’s not a probability sample, it’s not a sample of anyone, it’s just 1,500 college students who happen to respond,” Zukin told The Guardian. 

Funded by a grant from the Charles Koch Foundation, Villasenor’s survey went through no peer review process.  Villasenor said that he “designed the survey questions and then requested that UCLA contract with a vendor for the data collection.”  Who was that vendor?  Unclear.  Responding to a query from sociologist Neil Caren on the Scatterplot blog, Villasenor offered this:

At my request, UCLA contracted with (or issued a purchase order to; I don’t remember the specifics) the RAND Survey Research Group (SRG), which oversaw the actual data collection (and my understanding is that they in turn used a vendor to help get the panel). RAND provided me with the raw data and I did all of the analysis on that data; RAND SRG played no role in the analysis.

The IRB processing for this survey was done by RAND. There is an MOU between UCLA and RAND stating that when RAND does the IRB approval, there is no need for a separate IRB process within UCLA.

In other words, we still haven’t the faintest idea how this survey was conducted and by whom.

Because, Villasenor claimed, the respondents to his call seemed to represent a rough demographic cross-section of college students, he used the results anyway and even calculated a “margin of error” for the study, which lent it a veneer of scientific credibility.  But Timothy Johnson, the current president of the American Association for Public Opinion Research, called that move “really not appropriate.”  Others labeled it “malpractice.”
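For context, the “margin of error” quoted for such surveys comes from the standard formula for a simple random sample.  The sketch below (my own illustration, not anything from Villasenor’s report) shows how a sample of 1,500 produces the familiar figure of roughly ±2.5 percentage points; the point the pollsters are making is that this formula is only valid for a probability sample, which an opt-in online panel is not.

```python
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p from a simple random
    sample of size n.  Valid ONLY for probability samples; applying
    it to an opt-in panel is exactly the move the pollsters criticize."""
    return z * sqrt(p * (1 - p) / n)

moe = margin_of_error(1500)
print(f"nominal 95% margin of error: +/-{moe:.1%}")  # about +/-2.5 points
```

The formula assumes every member of the population had a known chance of selection; when respondents self-select, the number it produces describes nothing.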

Villasenor asked students if it was more important for colleges to create an “open learning environment where students are exposed to all types of speech and viewpoints, even if it means allowing speech that is offensive or biased against certain groups of people” or “a positive learning environment for all students by prohibiting certain speech or expression of viewpoints that are offensive or biased against certain groups of people.”  He found that 53% of his respondents said they supported the “positive learning environment” that required “prohibiting certain speech.”

Last year, a different, more nationally representative survey of American college student opinions on free speech on campus conducted by Gallup found strikingly different results, as John Wilson reported on this blog in April.  That survey of more than 3,000 college students, who had been selected in a carefully randomized process from a nationally representative group of colleges, had asked students the same question. It found that 78% of students said colleges should create an “open learning environment.”

In short, Villasenor’s highly publicized “survey” is worse than useless; it’s clearly biased.  But that’s not the biggest problem.  As University of Wisconsin professor Donald Moynihan pointed out on Twitter, none of those reporting on this, not Brookings, not the Washington Post, not the other outlets that reprinted or commented favorably on the Post’s coverage, paused to say: “it’s a Koch-funded survey by an electrical engineer who does not do surveys but has strong views.  Let’s give it a once-over.”  This is a failure not merely of one ill-qualified and biased researcher, but of the media that so naively reported his meaningless “results.”

As Moynihan sadly concluded, “More people now believe that students oppose free speech, based on a flawed study and resulting headlines.  No correction will fix that.”


In case anyone is interested, here is how Gallup described its methodology:

Results for the college student sample are based on telephone interviews with a random sample of 3,072 U.S. college students, aged 18 to 24, who are currently enrolled as full-time students at four-year colleges. Gallup selected a random sample of 240 U.S. four-year colleges, drawn from the Integrated Postsecondary Education Data System (IPEDS), that were stratified by college enrollment size, public or private affiliation, and region of the country. Gallup then contacted each sampled college in an attempt to obtain a sample of their students. Thirty-two colleges agreed to participate. The participating colleges were [long list of schools.] Gallup used random samples of 40% of each college’s student body, with one school providing a 32% sample, for its sample frame. The sample frame consisted of 54,806 college students from the 32 colleges. Gallup then emailed each sampled student to complete an Internet survey to confirm his or her eligibility for the study and to request a phone number where the student could be reached for a telephone interview. A total of 6,928 college students completed the Web survey, for a response rate of 13%. Of these, 6,814 students were eligible and provided a working phone number. Telephone interviews were conducted Feb. 29-March 15, 2016. The response rate for the phone survey was 49% using the American Association for Public Opinion Research’s RR-III calculation. The combined response rate for the Web recruit and telephone surveys was 6%.
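Gallup’s stated figures are internally consistent, as a quick check of the arithmetic shows (my own sketch, using only the numbers quoted in the methodology statement above):

```python
# Reproducing Gallup's stated response-rate arithmetic.
frame_size   = 54806   # students in the sample frame who were emailed
web_complete = 6928    # students who completed the Web recruit survey
phone_rate   = 0.49    # AAPOR RR-III response rate for the phone survey

web_rate = web_complete / frame_size   # about 0.13, i.e. the stated 13%
combined = web_rate * phone_rate       # about 0.06, i.e. the stated 6%

print(f"web recruit response rate: {web_rate:.0%}")  # 13%
print(f"combined response rate:    {combined:.0%}")  # 6%
```

This kind of transparent, checkable reporting is precisely what Villasenor’s one-paragraph description lacks.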

And here’s how Villasenor described his:

Here is some more detailed information regarding the survey: This web survey of 1,500 undergraduate students at U.S. four-year colleges and universities was conducted between August 17 and August 31, 2017. [sentence about financing]. I designed the survey questions and then requested that UCLA contract with a vendor for the data collection.
