I was a labor historian earlier in my career. Sadly, I hardly ever get to teach that subject anymore because the class never fills. However, the lessons that the best labor historians I read during graduate school described in their work have never left me. Indeed, they seem more relevant to my own life than ever before.
The late Yale historian David Montgomery published Workers’ Control in America all the way back in 1979. It’s a fairly short analysis of exactly how American industrial workers during the late-nineteenth century lost control of the shop floor. After technological change swept their various industries, they no longer had any influence over the production process. “[N]ew methods of industrial management undermined the very foundation of craftsmen’s functional autonomy,” Montgomery wrote on page 26.
“Coupled with systematic supervision and new forms of incentive payment it permitted what Frederick Winslow Taylor called ‘enforced standardization of methods, enforced adoption of the best implements and working conditions, and enforced cooperation of all employees under management’s detailed direction.’ Scientific management, in fact, fundamentally disrupted the craftsmen’s style of work, their union rules and standard rates, and their mutualist ethic, as it transformed American industrial practice between 1900 and 1930 [Footnote omitted.]”
The use of the word “disruption” here is only the most obvious reason that I’m beginning to think that nineteenth-century factory workers and modern college professors have more in common than most of us ever imagined. The use of Internet-based tools even in traditional face-to-face classes gives managers the ability to monitor our teaching in ways that were unthinkable even twenty years ago. More importantly, it gives universities the ability to set up parallel course structures independent of the influence of traditional faculty. As the Chronicle’s Wired Campus blog described this situation earlier this week:
One key change has been the creation of new or redefined administrative jobs at colleges intended “to lead their academic-change initiatives.” And the survey found that several colleges have reconstructed their centers for teaching and learning to focus more on student success than just on faculty development, working more often across various departments such as student services and academic affairs.
In other words, supporting teaching with technology is becoming less about offering training sessions for professors about how to use clickers and course-management systems, and more about coordinating bigger-impact projects like redesigning large introductory courses or leading the creation of a new online-degree program.
If you’re a faculty member and you don’t find that last part particularly terrifying, then you aren’t paying attention. Even faculty who object to teaching online for philosophical reasons may wake up for the first day of classes one of these semesters and find that they have no students left to teach. Our shop floors will have been moved right out from under our collective noses.
In fact, there is plenty of evidence that this movement has already begun. Here’s the Internet theorist Clay Shirky writing a post on Medium called “The digital revolution in higher education has already happened. No one noticed.”:
The digital revolution in higher education has happened. In the fall of 2012, the most recent semester with complete data in the U.S., four million undergraduates took at least one course online, out of sixteen million total, with growth up since then. Those numbers mean that more students now take a class online than attend a college with varsity football. More than twice as many now take a class online as live on campus. More undergraduates are enrolled in an online class than there are graduate students enrolled in all Masters and Ph.D. programs combined. At the current rate of growth, half the country’s undergraduates will have at least one online class on their transcripts by the end of the decade. This is the new normal.
The first for-credit classes appeared online in the 1980s, but for decades after, such classes were concentrated in a few institutions. That period has ended. More than 95% of colleges and universities with over five thousand students offer online classes for credit. In the same way online dating went from “Eww, weird” to being as ordinary as two tickets to a movie, online education has stopped being “The Future” and has become a perfectly routine way to learn.
This isn’t necessarily a bad thing. Faculty-centered online education can be both convenient for students and pedagogically innovative. Quite simply, professors can do extraordinary things using digital tools in online, hybrid, and regular face-to-face classes. Unfortunately, that assumes that their administrators let them. If the online course universe is controlled by Associate Deans trying to make a name for themselves and populated entirely by adjunct faculty who cannot control their own courses, these minor technological miracles are unlikely to happen. That’s why vigilant monitoring of online developments by professors of all kinds is so vitally important to the future of higher education in America and around the world.