BY AARON BARLOW
In the 30+ years since I first looked into using digital technology in the classroom, my enthusiasm for it has waned. Today, I guess I have to classify myself as a Luddite. When I re-read Thomas Pynchon’s “Is It O.K. To Be A Luddite?” my answer to his question is quite different from what it was in 1984, when I first read, with distress, these words by that idol of mine:
THE word “Luddite” continues to be applied with contempt to anyone with doubts about technology, especially the nuclear kind. Luddites today are no longer faced with human factory owners and vulnerable machines. As well-known President and unintentional Luddite D. D. Eisenhower prophesied when he left office, there is now a permanent power establishment of admirals, generals and corporate CEO’s, up against whom us average poor bastards are completely outclassed, although Ike didn’t put it quite that way. We are all supposed to keep tranquil and allow it to go on, even though, because of the data revolution, it becomes every day less possible to fool any of the people any of the time. If our world survives, the next great challenge to watch out for will come – you heard it here first – when the curves of research and development in artificial intelligence, molecular biology and robotics all converge. Oboy. It will be amazing and unpredictable, and even the biggest of brass, let us devoutly hope, are going to be caught flat-footed. It is certainly something for all good Luddites to look forward to if, God willing, we should live so long.
Back then, Pynchon’s irony perplexed me. I loved the computers I’d been playing with over the past few years and, though I respected the problems that Pynchon sees here (and Philip K. Dick saw everywhere), I thought that we “average poor bastards” could handle it.
Today, I’m not so worried about the confluence Pynchon predicted and Dick explored as I am about the manipulation of technology to give the few greater control over the many–not using it to replace people, particularly, but to push them aside and take control of what they are doing.
For many today, especially among the rich, technology is better than the human–on the face of it. Especially when its control is centralized.
Be that as it may (I’ll get back to this point below).
Last week, I played “The Thinking Man” performed by Bob Gibson and Bob (Hamilton) Camp for a colleague:
It includes these lines:
Well now, the man who invented the computer
Was from a place called M.I.T.
He punched out cards and tapes by the yard
Humming “Nearer My God to Thee.”
The irony of that is two-fold, and it dovetails nicely with Pynchon’s point: “Nearer My God to Thee,” aside from its obvious satiric use, resonates with disaster: It is, in the popular imagination (at least), the song the band played as they went down on the Titanic:
It was midnight on the sea,
Band playing “Nearer My God to Thee”:
Fare thee, Titanic, fare thee well.
Perhaps it’s getting close to midnight and the iceberg, in terms of technology, nears.
Back to my point:
More than ever before, we’ve an elite which sees technology as its salvation. They are going to impose it on the rest of us, whether we want it or not–and they are already doing so–with the gall to claim it as a cost-saver (back to that Gibson and Camp song–one sung, by the way, as the 1960s opened), one that will benefit us all.
Writ large, technology has never saved humanity a dime, though it certainly enhances the pocketbooks of the one-percent. For the rest? Not so much.
I was thinking about that this morning because of a “Memorandum” that arrived by email from Allan H. Dobrin, the Executive Vice Chancellor and Chief Operating Officer of the City University of New York, where I teach. It is a Call for Proposals for the Annual CUNY IT Conference this fall. As I have attended a number of times and have presented there at least twice, I read it.
And saw red.
The theme is “Instructional//Information Technology in CUNY: Good Moves in Hard Times.” The first paragraph claims that “we have an opportunity to grow in ways that place student needs at the center of our digital work.” The assumption behind it all is that digital technology is an unqualified good, and that the only question is how to make it succeed. Three panel questions are suggested:
How do we judge success of instructional technology, hybrid and fully online programs? What are the best assessment metrics and methods?
How can we enhance student experiences and outcomes by sharing IT best practices across departments, programs, campuses, and between faculty and administration?
What are our opportunities to expand success in times of fiscal austerity? What are cost-effective ways to improve teaching and student outcomes? What are the right ways for CUNY to grow, given the current budget environment?
Now, I’m all for IT conferences, particularly this one. But it should not operate on the assumption that technology is always the answer. Unfortunately, at such conferences, no one even asks: How do we keep the digital from overwhelming the human? Or: Can we use technology effectively without letting it take center stage in the classroom? Or: Could it be that technological possibilities are being used for profit, not education? Or even: Are there ways of measuring student achievement that do not rely on numbers?
I could go on.
Personally, what I want to do is find ways of using the technology my students carry with them–not the technology I can provide for them–as an aid to their learning. This attitude, though, doesn’t square with the assumption of the primacy of institutional technology that the three suggested questions for the conference hold–or that IT mavens hold. Quite the contrary. I use the tools available in my “Smart” classroom extensively, but always in concert with the smartphones on my students’ desks. I want to diffuse control of technology, not to centralize it.
I would love to propose a panel on ways to use digital technology in the classroom that bypass centralized IT. Somehow, I don’t think it would be well received.
However, if we are not, in the future, going to find ourselves embroiled in a war against the machines (and their controllers), we need to start examining our assumptions about technology, even exploring ways of moving it away from top-down control. Otherwise, the end is not going to be pretty.
I wish this conference had a space for discussing that.
Enough. I don’t want to advocate against technology, only against its centralization and centralized control. However, do let me end where Pynchon does, as a warning of what could come if we don’t approach our technologies more carefully:
Meantime, as Americans, we can take comfort, however minimal and cold, from Lord Byron’s mischievously improvised song, in which he, like other observers of the time, saw clear identification between the first Luddites and our own revolutionary origins. It begins:
As the Liberty lads o’er the sea
Bought their freedom, and cheaply, with blood,
So we, boys, we
Will die fighting, or live free,
And down with all kings but King Ludd!