My original post on this singular and provocative study of faculty work is available at: https://academeblog.org/2014/04/29/what-we-do-with-our-time/
I would like to thank the authors again for their work on the study [reported in The Blue Review at: https://thebluereview.org/faculty-time-allocation/] and for then agreeing to answer my follow-up questions.
Katie Demps, Matt Genuchi, David Nolin, and John Ziker collectively answered questions 1-6, while Nate Hoffman answered question 7.
1. What results of the study have most and least surprised you?
I think we were not really surprised by the amount of time that our participants work, or by the amount of time that is eaten up by email and meetings. Based on our own experiences we were not surprised about the total hours or by the amount of weekend hours worked. We ourselves use weekends to make up teaching and research work that doesn’t get accomplished during the week, when we are occupied with administrative tasks. I think what surprised us the most was the reactions of our undergraduate assistants after they had completed the interviews. They were shocked by how much work we do, and how much is not related to teaching, reflecting a possible disconnect between what the general public thinks professors do and what is required by tenure and promotion committees.
I think we should be clear that we understand that our sample of volunteers is non-representative and likely biased toward those who were interested in the topic. If you read the comments on other blogs where the Blue Review article has gotten attention, this is one of the most frequent criticisms of the results. That criticism is not really ours to answer for, since the original article is clear about this limitation, but those reposting it often talk as if the 61-hour figure is gospel.
That said, our study is not based on a survey but on a behavioral observation technique, which is a unique way to address this issue. Our results are generally consistent with other studies of faculty time allocation based on surveys, which have found that professors work around 54 hours/week. A couple of those studies are cited here: http://www.vox.com/2014/4/24/5647300/being-a-college-professor-isnt-really-a-cushy-job
The Higher Education Research Institute’s (HERI) Faculty Survey, which has been administered triennially since 1989, also confirms our results. In the report of the 2011 study, you can see the frequency distributions for how respondents answered questions related to how they spent their time beginning on page 26 – see: http://www.heri.ucla.edu/monographs/HERI-FAC2011-Monograph.pdf.
2. Has your study had any political impact in Idaho, and/or beyond your state?
We are not aware of any political impact in Idaho or beyond the state.
3. Has your study been highlighted in the local, regional, and/or national press, and if it has, how would you characterize that response? I am especially interested in how the study may have affected the perceptions of the general public.
Our research has been featured at the local level, in the Blue Review article that you read, as well as at the national level. For example, the study was recently featured in a story on the homepage of Inside Higher Ed: http://www.insidehighered.com/news/2014/04/09/research-shows-professors-work-long-hours-and-spend-much-day-meetings#sthash.Llnqxy3m.dpbs.
I would characterize the response from the academic community, both locally and nationally, as one of genuine interest, especially as the Blue Review article highlights some notable issues in how faculty organize their time to meet the demands of university work (e.g. balancing teaching and research). Faculty appear interested in thinking critically about their workload, and this study appears to be a catalyst that fosters discussion about how faculty use their time.
At this point, our findings do not appear to have reached a broader audience that would be characterized as the general public, so we cannot speak to any changes in perceptions in the general public.
4. How have the faculty who participated in the study responded to the results and/or characterized the experience of participating in the study?
One faculty participant wrote upon receiving the phase 1 report: “Thanks. I’m most happy to hear that you have some form of documentation that faculty work more than the X hours we are in the classroom (~60 hour/week is noteworthy), and that so much of our time is spent doing tasks that don’t ‘count.'”
Another faculty participant wrote: “Thanks for sending this. It is fascinating. I would be interested to know more about the faculty working off-campus part and why/how they choose to do so, if there’s any way you can collect that data in phase 2. I know why I, personally, don’t work in my perfectly good office, but other people might have different reasons.”
In phase 2 testing using the smartphone application we are developing, we’ve had a range of responses from: “I love TAWKs” to “too many texts.”
5. What did your graduate assistants learn from their participation in the study?
In phase 1 we employed 14 undergraduate research assistants to conduct the 24-hour recall interviews. In a follow-up survey of these research assistants, 7 of the 14 who replied stated that they agreed or strongly agreed with the statement: “After being involved with this study, I feel more prepared to be involved in social science research in the future.”
One repeated realization mentioned by our undergraduate research assistants was that they were surprised at just how much work professors do. As the study shows, much of what we do is done alone and off campus, and is therefore invisible to students. Our most visible activity to students is teaching, giving the impression that our jobs consist mainly of lecturing and holding office hours. It’s no wonder there’s a general perception that being a professor is an easy and comfortable job. Before participating in the study I think our undergraduate research assistants were vaguely aware that we also spend time on research, but I think many were unaware of “service” as a third dimension of our workload and of just how much of a professor’s time is devoted to the general operation of the university. Overall, I think it was a real wake-up call for those considering academic careers.
6. Are you doing any sort of follow up study? How will it differ?
Yes! We are in the midst of collecting data for Phase 2. The data we report is most likely not representative of all academics in the US, or even at Boise State, since it is a small, nonrandom sample of BSU professors. The interviews, from which we report data, were completed with the goal of creating a time-sampling smartphone application to collect a larger amount of data and provide feedback to participants. This app will ping professors throughout the day, ask them the same questions from the interviews, and prompt them to select categories for the type of work activity they are engaged in. This will allow us to collect real-time (not reported) behavioral data from a larger, hopefully more representative sample. We are currently only collecting data with Boise State professors. After another round of app improvement, we hope to move on to Phase 3, make the app more broadly available within the US, and collect data on an even larger scale.
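For readers curious what that kind of experience sampling looks like in miniature, here is a purely illustrative sketch (not the researchers’ actual app code; the function names and activity categories are assumptions) of randomly scheduling pings across a workday and tallying the self-reported categories into time-use proportions:

```python
import random
from datetime import datetime, timedelta

# Hypothetical activity categories, loosely modeled on the kinds of
# tasks discussed in the study.
ACTIVITIES = ["teaching", "research", "service", "email", "meetings", "other"]

def make_ping_schedule(day_start, day_end, n_pings, rng=random):
    """Return n_pings random datetimes between day_start and day_end, sorted."""
    window = (day_end - day_start).total_seconds()
    offsets = sorted(rng.uniform(0, window) for _ in range(n_pings))
    return [day_start + timedelta(seconds=s) for s in offsets]

def tally(responses):
    """Aggregate a list of self-reported categories into proportions of pings."""
    counts = {}
    for activity in responses:
        counts[activity] = counts.get(activity, 0) + 1
    total = len(responses)
    return {a: c / total for a, c in counts.items()}

# Schedule eight pings across a 8am-6pm workday.
start = datetime(2014, 4, 1, 8, 0)
end = datetime(2014, 4, 1, 18, 0)
schedule = make_ping_schedule(start, end, n_pings=8)
```

The statistical appeal of sampling at random moments rather than asking for end-of-day recall is that, with enough pings, the proportion of pings caught in each category estimates the proportion of time actually spent in it.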
7. Can you describe the genesis and the function of The Blue Review? It seems a somewhat singular approach to publishing faculty scholarship “in house,” and I think that it might provide a model for other institutions.
I started The Blue Review [https://thebluereview.org/] at Boise State about a year and a half ago to provide a forum for scholars to write for a broader audience. While the bulk of our writers have been affiliated with the university, we have published scholars from many different institutions. We are actively looking to expand to other universities in the Mountain West, and we also publish writers who are not necessarily traditional academics: public intellectuals, bloggers, people with something to say. We call it “popular scholarship in the public interest.” We are independent of the university in that it treats us more like a journal than a university publication such as the alumni magazine, for example. We have an editorial board and a modified form of peer review that is more akin to a newspaper’s editorial process than a traditional journal’s, but most of the pieces are reviewed by someone in a similar field. We are quite multidisciplinary, as you can see.
My background is in journalism, and I’m also an MA student in “Data Journalism.” I think the formula of popular scholarly writing, with the financial support of a public university or universities behind it, is promising on the web. We do need a slightly broader source of funding in the future, but I think we’ve proved out the model so far.