Audrey Watters’ Ed-Tech Disasters of the Decade

BY HANK REICHMAN

Over the past decade Audrey Watters has proven to be one of our most knowledgeable, insightful, critical, and, well, just plain entertaining commentators on educational technology.  Her Hack Education blog is a must-read for anyone who cares about teaching and learning at any level.  One terrific feature of that blog has been her annual reviews of the ed-tech scene, begun in 2010 and repeated each year.  This year, however, Watters may have outdone herself, as she offers, in place of her year-end review, a lengthy (some 55 pages single-spaced when printed) survey of “The 100 Worst Ed-Tech Debacles of the Decade,” posted on December 31, 2019.  They are presented, countdown fashion, from #100 down to #1.  One might differ with some of her rankings, but her incisive commentary is just about always spot-on.

In the interest of drawing more attention to her list — and to the pitfalls of ed-tech more generally — I want to highlight some of the most outrageous items on it, with some emphasis on higher ed, although I’ll confess that it can be hard to choose.  I will quote them in the order they appear in her post, offering an occasional comment of my own.

99. The Promise of “Free”

The phrase “if you’re not paying for the product, you are the product” gets bandied about a lot — despite, according to Slate’s Will Oremus, it being rather an inaccurate, if not dangerous, slogan. That being said, if you’re using a piece of technology that’s free, it’s likely that your personal data is being sold to advertisers or at the very least hoarded as a potential asset (and used, for example, to develop some sort of feature or algorithm).

Certainly “free” works well for cash-strapped schools. It works well too for teachers wanting to bypass the procurement bureaucracy. It works well, that is, if you disregard student data privacy and security.

And “free” doesn’t last. Without revenue the company will go away. Or the company will have to start charging for the software. Or it will raise a bunch of venture capital to support its “free” offering for a while, and then the company will get acquired and the product will go away.

(It’s not that paying for a piece of technology will treat you any better, mind you.)

 

92. “The Flipped Classroom”

It was probably Sal Khan’s 2011 TED Talk “Let’s Use Video to Reinvent Education” and the flurry of media he received over the course of the following year or so that introduced the idea of the “flipped classroom” to most people. He didn’t invent the idea of video-taping instruction to watch at home and doing “homework” in the classroom instead; but history don’t matter in Silicon Valley. In a column in The Telegraph in 2010, Daniel Pink used the phrase “flip thinking” to describe the work of math teacher Karl Fisch, who’d upload his lectures to YouTube for his students to watch the night before class so that everyone could work together to solve problems in class. (Fisch, to his credit, made it clear he hadn’t invented the “flipped” practice either, pointing to the earlier work of Jonathan Bergmann and Aaron Sams.)

As a literature person, I can’t help but remark that the teaching practices in my field have typically involved this sort of “flip,” in which you assign the readings as homework and then ask students to come to class prepared to discuss them, not to listen to a lecture.

The problem with the “flipped classroom” is less the pedagogy per se than the touting, once again, of this as “the real revolution,” as Techcrunch did. (The revolution is predicted almost weekly at Techcrunch apparently.) But the practice does raise a lot of questions: what expectations and assumptions are we making about students’ technology access at home when we assign them online videos to watch? Why are video-taped lectures so “revolutionary” if lectures themselves are supposedly not? (As Karim Ani, founder of Mathalicious pointed out in a Washington Post op-ed, “Experienced educators are concerned that when bad teaching happens in the classroom, it’s a crisis; but that when it happens on YouTube, it’s a ‘revolution.’”) And how much of the whole “flipped classroom” model is based on the practice of homework — a practice that is dubious at best and onerous at worst? As education author Alfie Kohn has long argued, homework represents a “second shift” for students, and there’s mixed evidence they get much out of it.

One of the “flipped classroom’s” most passionate advocates was Stanford University’s Daphne Koller, a founder of the online provider Coursera.  In The Future of Academic Freedom I wrote that Koller “argued passionately that online teaching ‘is really a new educational paradigm.’ Teaching online, she claimed, compels instructors to ‘flip’ the classroom—’not the kind of thing that we were trained to do,’ which was instead to ‘stand and orate.’  Putting aside the fact that such ‘flipping’ is possible only with hybrid and not fully online classes,” I wrote, “the sheer parochialism of this comment is astonishing. In most teaching institutions, faculty have for several decades moved well beyond the lecture format to modes of instruction that involve greater active student participation, be it face-to-face or online. ‘Be a guide on the side, not a sage on the stage,’ we’ve been counseled repeatedly by administrators, faculty development officers, and many colleagues. Although interactive education may not be the norm at Stanford, it is quite common at institutions like CSU, in community colleges, and elsewhere, perhaps even more so in traditional than in online classrooms.”

82. “The End of Library” Stories (and the Software that Seems to Support That)

If there was one hate-read that stuck with me through the entire decade, it was this one by Techcrunch’s M. G. Siegler: “The End of the Library.” It was clickbait for sure, and perhaps it came too early in the decade — this was 2013 — for me to be wise enough to avoid this sort of trollish nonsense. You can learn anything online, Siegler argued. (You can’t.) The Internet has “replaced the importance of libraries as a repository for knowledge. And digital distribution has replaced the role of a library as a central hub for obtaining the containers of such knowledge: books. And digital bits have replaced the need to cut down trees to make paper and waste ink to create those books.” (They haven’t.)

Libraries haven’t gone away — they’re still frequently visited, despite dramatic drops in public funding. More and more public libraries have started eliminating fines too because libraries, unlike Techcrunch writers, do care to alleviate inequality.

But new technology hasn’t made it easy. Publishers have sought to restrict libraries’ access to e-book lending, for example, blaming libraries for declining sales. And libraries have also struggled to maintain their long commitment to patron privacy in the face of new software — e-books and otherwise — that has no such respect for users’ rights.

Sadly it appears that too many college and university administrators have bought this bogus line.  How many provosts have blithely informed their faculty that the library won’t be necessary, that it’ll all be online, even on smartphones?  On my own campus the library has long needed replacement.  A new building is now at long last under construction, but, guess what, it’s not going to be called a library but a “core building,” whatever that is.  Some of our librarians say that only a small portion of the current collection will be transferred, that it will essentially be a library without books.  The administration denies that and I hope that’s true.  Still, even in the old building the main attraction has now become a Starbucks.  Yet, believe it or not, students still read, despite it all.

80. Viral School Videos

Viral videos weren’t new this decade, but the popularity of cellphones, along with the introduction of social media and its associated incentives to “go viral,” meant that viral videos became more common. Students recorded fellow students. Students recorded their teachers. They recorded school resource officers. Schools recorded elaborate lip-sync videos. Schools installed surveillance cameras and recorded everyone. While some of this might seem fun and harmless, the ubiquity of these videos signifies how much surveillance (and sousveillance) has been accepted in schools. No doubt, some of these videos helped highlight the violence that students — particularly students of color — face at the hands of disciplinarians. But other times, the scenes that were captured were taken out of context or these videos were purposefully weaponized (by conservative groups, for example, who encourage students to film their professors’ lectures and submit their names to “watch lists”).

Ed-tech’s threat to academic freedom!

78. The Fake Online University Sting

The list of high profile “fake online universities” is fairly long. Indeed, the POTUS once ran one. (He paid a $25 million fine in 2018 to settle fraud claims relating to the now defunct Trump University, but please, go on with how much he’s the anti-corruption President.) And the federal government itself ran fake online universities too.

In April 2016, the Department of Homeland Security arrested 21 people, charging them with conspiracy to commit visa fraud. These individuals were alleged to have helped some one thousand foreign nationals maintain their student visas through a “pay to stay” college in New Jersey.

The college — the University of Northern New Jersey — was a scam, one created by Homeland Security itself. The school employed no professors and held no classes. Its sole purpose was to lure recruiters, who in turn would convince foreign nationals to enroll — all this in exchange for a Form I–20, which allows full-time students to apply for an F–1 student visa.

It was an elaborate scam, dating back to 2012, but one that gave out many online signals that the school was “real.” The University of Northern New Jersey had a website — one with a .edu domain, to boot — as well as several active social media profiles. There was a regularly updated Facebook page, a Twitter account, as well as a LinkedIn profile for its supposed president.

Students who’d obtained their visas through the University of Northern New Jersey claimed they were victims of the government’s sting; the government said they were complicit. According to one student, he had asked why he wasn’t required to take any classes, and he’d been told by the recruiter that he could still earn credits through working. “Thinking back, it’s suspicious in hindsight, but I’m not really familiar with immigration law,” the student told Buzzfeed. “And I’d never gotten my Ph.D. before. So I thought maybe this is the way it works.”

“I thought maybe this is the way it works.”

With all the charges of fraud and deceptive marketing levied against post-secondary institutions this decade — from ITT to coding bootcamps, from Trump University to the Draper University of Heroes — we might ask if, indeed, this is the way it works now.

 

74. “Deliverology”

In 2010, Sir Michael Barber published Deliverology 101: A Field Guide for Educational Leaders. The book packaged the ideas he’d developed during his time in the Blair Administration and at McKinsey & Company on how to successfully manage policy reform efforts. “Three critical components of the approach,” he wrote, “are the formation of a delivery unit, data collection for setting targets and trajectories, and the establishment of routines.” In 2011, Barber went to work for Pearson as its Chief Education Advisor, continuing his advocacy for competition, data collection, measurements, and standards-based reforms. (See David Kernohan’s excellent keynote at OpenEd13 for more.) In 2013, on the heels of “the Year of the MOOC,” Barber released a report titled “An Avalanche is Coming,” calling for the “unbundling” of higher education.

The work of Michael Barber underscores the importance of highly paid consultants in shaping what politicians and administrators believe the future should look like and in giving them a set of metrics-obsessed management practices to get there.

California State University faculty members recall this boondoggle well!  Here’s what the California Faculty Association, the AAUP-affiliated CSU faculty union, said about Barber’s b.s. at the time: “In early 2010, the California State University Board of Trustees announced its goal to increase graduation rates by 8% — an admirable goal on its face. Unfortunately, the CSU Chancellor’s strategy to achieve that goal, referred to among other code words as ‘Deliverology,’ will endanger the quality of CSU degrees and access to them – all in order to ‘deliver’ that magic number. ‘Deliverology’ is a term coined by Michael Barber, who has worked in various levels of education in the United Kingdom. ‘Deliverology’ involves, in part, a narrow focus on one single numerical target as the measure of ‘improvement.’ Top management commands and controls delivering services, be they in the public or private sphere, to reach the goal.”  CFA would devote a special section of its magazine, California Faculty, to refuting “deliverology.” For a full review of Barber by former CFA president Susan Meisenhelder go here.

71. “Uber for Education”

“We want to be the Uber for Education,” Udacity founder Sebastian Thrun told the Financial Times in 2015. MOOCs are, no surprise, their own entry on this long list of awfulness. But the phrase “Uber for Education” deserves its own spot here, as the analogy was frequently invoked by education reformers, including Secretary of Education Betsy DeVos. She told the Brookings Institution in 2017 that “Just as the traditional taxi system revolted against ridesharing, so too does the education establishment feel threatened by the rise of school choice. In both cases, the entrenched status quo has resisted models that empower individuals.” School choice, for DeVos, is the Uber for education. An end to regulations would make for the Uber for education.

But for Thrun, the reference to Uber was not about “choice” but about labor — specifically about building a platform to be used by a precarious workforce, lured into piecework with the promise of a big payout. “At Udacity, we built an Uber-like platform,” he told the MIT Technology Review. “With Uber any normal person with a car can become a driver, and with Udacity now every person with a computer can become a global code reviewer.” The promise — whether working for Udacity or for Uber — included better flexibility, more pay. For the customer, the promise is for service on demand. But as a decade of the gig economy has demonstrated, all this is “being fueled by exploitation, not innovation.”

And, of course, many teachers find themselves working as Uber drivers as they struggle to make ends meet on their salaries. Indeed, Uber markets to teachers directly, encouraging them to make the ride-sharing company their second gig and offering, in some cities, to donate a small percentage of each trip fee to their classrooms.

 

59. Clayton Christensen’s Predictions

In 2008, Clayton Christensen and Michael Horn published Disrupting Class: How Disruptive Innovation Will Change the Way the World Learns and predicted that the growth in computer-based instruction would accelerate rapidly until, by 2019, half of all high school classes would be taught over the Internet. Nope. Wrong.

In 2013, Christensen told investor Mark Suster that, in 15 years’ time, half of US universities would be bankrupt. As with K-12 education, he believed (believes) that online education would provide the “disruptive innovation” to force traditional schooling out of business. In 2017, he doubled down on his prediction — half of colleges and universities would close in a decade. I have set a Calendar reminder for 2028. We can check back in then for the final calculation. Meanwhile, Phil Hill has run the numbers on what higher ed closures actually look like, with visualizations that help underscore that the vast majority of these closures were of for-profit institutions — and there just aren’t enough of those left to make up the “half of all colleges” claim.

As Jill Lepore reminded us in her scathing critique of Christensen’s “gospel of innovation,” “Disruptive innovation is a theory about why businesses fail. It’s not more than that. It doesn’t explain change. It’s not a law of nature. It’s an artifact of history, an idea, forged in time; it’s the manufacture of a moment of upsetting and edgy uncertainty. Transfixed by change, it’s blind to continuity. It makes a very poor prophet.” But it’s the sort of propheteering that hopes if you repeat a story enough times, that everyone — taxpayers, administrators, politicians, pundits — will start to believe it’s the truth.

Chris Newfield also offered a devastatingly thorough critique of Christensen and response to Lepore on his blog back in 2014.  He concluded: “Prof. Christensen is right that universities need to recover their educational focus. It’s just not his model of disruptive innovation that will achieve this. The process cannot be lead by managers and must be lead by faculty and students. The historical tragedy of the Schumpeter-Christensen model is that it elevated a managerial class that opposed the democratization of invention we now can’t do without. The good news is that there’s no reason to make the same mistake twice.”

57. TurnItIn (and the Cheating Detection Racket)

iParadigms, the company whose product is most closely associated with cheating detection — TurnItIn — was bought and sold twice this decade: in 2014 to Insight Venture Partners for $752 million and earlier this year to Advance Publications for $1.75 billion. Thankfully, because education technology is such an ethical and mission-driven industry, every student who has ever been forced to submit an essay or assignment to TurnItIn received a nice cut of the deal, as befitting their contribution of data and intellectual property to the value of the product.

I jest. Students got nothing, of course. Students write, and then their work gets extracted by TurnItIn, which in turn sells access to a database of student work back to schools. Or as Jesse Stommel and Sean Michael Morris put it, “A funny thing happened on the way to academic integrity. Plagiarism detection software (PDS), like Turnitin, has seized control of student intellectual property. While students who use Turnitin are discouraged from copying other work, the company itself can strip mine and sell student work for profit.” (Another funny thing: essay mills are now touting that they use TurnItIn too, and they can assure their customers that their essays will pass the plagiarism detector.)

Rather than trusting students, rather than re-evaluating what assignments and assessments look like, schools have invested heavily in any number of technology “solutions” to cheating — keystroke locking, facial recognition, video monitoring, and the like, all designed to identify students with “low integrity.”

 

53. The TED Talk

One can trace far too many bad ideas to the event whose focus purports to be on “ideas worth spreading”: TED. It’s not just that the talks are terrible, trite, and full of historically inaccurate information. Say, for example, Sal Khan’s 2011 TED Talk “Let’s Use Video to Reinvent Education.” But people hear those TED Talks and then think they’ve stumbled upon brilliance. Say, for example, Sebastian Thrun, listening to Sal Khan’s 2011 TED Talk and deciding he would redesign higher education — an idea that prompted an experiment at Stanford in which he offered his graduate level AI class online, for free. (You’ll never believe what happened next.)

Unfortunately, education-related TED Talks are some of the most popular ones out there. Some of these are laughably silly, such as Nicholas Negroponte’s prediction of a pill you will be able to swallow to “know Shakespeare.” And some of the ones with the greatest appeal, such as Sugata Mitra’s “School in the Cloud,” may just re-inscribe the very exploitation and inequality that the TED Talks promise, with their 18-minute-long sleight-of-hand, to disrupt.

TED Talks are designed to be unassailable — ideas to spread but never challenge. As I noted back in 2013, “You don’t get to ask questions of a TED Talk. Even the $10,000 ticket to watch it live only gives you the privilege of a seat in the theater.”

Confession: I have never watched a TED talk.

49. Yik Yak

Once valued at $400 million, having raised some $73.5 million in venture capital, the anonymous messaging app Yik Yak closed its doors in 2017. Good riddance.

Founded in 2014 by Tyler Droll and Brooks Buffington (seriously), Yik Yak was for a time quite popular with students, who too often took advantage of its anonymity to harass others and to post racist and sexist remarks. As The New York Times chronicled, threats of violence posted to the app prompted several schools to go on lockdown, and several students were subsequently arrested — in Virginia, Michigan, and Missouri, for example — for posts they’d made. Some schools blocked the app on their WiFi network; students at Emory denounced it as a platform for hate speech.

As one student noted, the hyper-localization of the app was a constant reminder that these threats were not coming from some random person across the country; they were coming from someone in your class.

This really was a nightmare, no matter how short-lived.

24. The Secretaries of Education

How funny that two of the longest serving members of both the Obama and the Trump Administrations were their Secretaries of Education: Arne Duncan and Betsy DeVos, respectively. Actually, it’s not funny at all. They were both pretty terrible. Beloved by VCs. (Indeed, since leaving office Duncan became one.) And beloved by ed reform types. But terrible. Among the ed-tech disasters they facilitated: a broken FAFSA, the floundering EQUIP experiment, and terrible FERPA enforcement, for starters.

More generally, the Department of Education has shown how tightly connected it is to industry. Former department officials have cycled in and out of the Gates Foundation, the Chan Zuckerberg Initiative, ISTE, the student loan industry, startups, and the like.

One could argue that DeVos’s ideas are far worse than Duncan’s, bad as those were.  But she apparently has the virtue of being incompetent.  He, alas, was not.

22. Automated Essay Grading

Robot essay graders — they grade just the same as human ones. Or at least that was the conclusion of a 2012 study conducted by University of Akron’s Dean of the College of Education Mark Shermis and Kaggle data scientist Ben Hamner. The researchers examined some 22,000 essays administered to junior high and high school students as part of their states’ standardized testing process, comparing the grades given by human graders and those given by automated grading software. They found that “overall, automated essay scoring was capable of producing scores similar to human scores for extended-response writing items with equal performance for both source-based and traditional writing genre.”

“The demonstration showed conclusively that automated essay scoring systems are fast, accurate, and cost effective,” said Tom Vander Ark, partner at the investment firm Learn Capital, in a press release touting the study’s results. (It did not.)

When edX announced it had developed an automated essay grading program, its president Anant Agarwal boasted that the software was an improvement over traditional grading methods. “There is a huge value in learning with instant feedback,” he told The New York Times. (There is not.)

Automated essay grading software can be fooled with gibberish, as MIT’s Les Perelman has shown again and again. Moreover, the algorithms underpinning them are biased, particularly against Black students. But that hasn’t stopped states from adopting automated grading systems for standardized testing and software companies from incorporating automated grading systems into their products.

 

15. Jeffrey Epstein and the MIT Media Lab

“A decade before #MeToo, a multimillionaire sex offender from Florida got the ultimate break,” Julie K. Brown wrote in The Miami Herald in 2018, detailing the sweetheart plea agreement that Jeffrey Epstein had received that helped him stay out of prison despite a 53-page federal indictment for sexual abuse and sex trafficking. But in July of this year, Epstein was arrested in New York and charged with sex trafficking. Epstein committed suicide while waiting for his trial, an act that has spawned numerous conspiracy theories.

Unsealed documents linked Epstein to a sex ring in which girls were forced to have sex with a number of powerful men, including Prince Andrew, former New Mexico Governor Bill Richardson, and AI pioneer Marvin Minsky. Indeed, many scientists had continued to take Epstein’s money, even after he was jailed in 2007 for soliciting sex with a girl. Among them was Joi Ito, the head of the MIT Media Lab, who posted an apology on his blog for taking money for the Lab as well as for his own startup endeavors. There was, no surprise, a huge outcry, and several professors resigned their positions there. (Lab co-founder Nicholas Negroponte, on the other hand, said that taking Epstein’s money was justified and that relying on billionaires’ largesse was what made the work there possible.) An article in The New Yorker by Ronan Farrow detailed how Ito and others at the Media Lab had covered up Epstein’s donations. Less than a day after it was published, Ito resigned.

Accepting the 2019 Barlow/Pioneer Award from the EFF, danah boyd called for a “great reckoning” in the tech industry, but we will have to wait until the next decade for that apparently. What will come first: that great reckoning? Or Ito’s return to tech?

Just two days ago The New York Times reported that “Top administrators knew about the gifts, felt conflicted about them, and accepted them anyway. The university’s president even signed a thank-you note.”  Yet the outside law firm that MIT hired to investigate absolved the administration of any wrongdoing.  According to the Times, “The investigation found that Mr. Epstein made 10 donations — totaling $850,000, slightly more than M.I.T. had previously disclosed — from 2002 to 2017. Mr. Epstein also visited campus at least nine times from 2013 to 2017, a period that followed his conviction on sex charges involving a minor in Florida.”  Yet only one person was disciplined.  You guessed it, not an administrator or high manager, much less the university president, but a single engineering professor who had previously acknowledged accepting Epstein’s money but failed to inform the school.

“One current and two former M.I.T. vice presidents who learned of Mr. Epstein’s donations in 2013 and began quietly approving them were rebuked in the report, but not disciplined because it concluded that they had not broken any university policies,” the Times said.  “The three vice presidents — R. Gregory Morgan, Jeffrey Newton and Israel Ruiz — made ‘significant errors in judgment that resulted in serious damage to the M.I.T. community,’ according to the report.”

5. Gamergate

In the fall of 2014, Deadspin writer Kyle Wagner declared that “The Future Of The Culture Wars Is Here, And It’s Gamergate.” “What we have in Gamergate,” he wrote, “is a glimpse of how these skirmishes will unfold in the future—all the rhetorical weaponry and siegecraft of an internet comment section brought to bear on our culture, not just at the fringes but at the center. What we’re seeing now is a rehearsal, where the mechanisms of a toxic and inhumane politics are being tested and improved.” He was right.

Just a few months earlier, “an angry 20-something ex-boyfriend published a 9,425-word screed and set in motion a series of vile events that changed the way we fight online,” The New York Times recently recalled. “The post, which exhaustively documented the last weeks of his breakup with the video game designer Zoë Quinn, was annotated and punctuated with screenshots of their private digital correspondence — emails, Facebook messages and texts detailing fights and rehashing sexual histories. It was a manic, all-caps rant made to go viral. And it did. The ex-boyfriend’s claims were picked up by users on Reddit and 4chan and the abuse began.” “Gamergate,” as the online harassment campaign came to be called, merged old troll tactics with new troll tactics; and it is clear that it “prototyped the rise of harassment influencers” and the alt-right: Steve Bannon and Breitbart, Milo Yiannopoulos, Mike Cernovich.

If your response here is “what does this have to do with education,” you haven’t been paying attention. It isn’t just that games-based education has had to reckon with Gamergate. (Or not reckon with it, as the case may be. Markus Persson, the creator of Minecraft, acquired by Microsoft in 2014, likes to use the word “feminist” as an insult and has claimed that gender doesn’t exist in Minecraft — a game where the main character is a blocky dude called Steve, for crying out loud.) The Verge recently wrote that “As misinformation and hate continues to radicalize young people online, teachers are also grappling with helping their students unlearn incorrect, dangerous information. ‘It has made a lot of us teachers more cautious,’ they say. ‘We want to challenge our students to explore new ways of thinking, to see the cultural meaning and power of video games, but we’re understandably anxious and even scared about the possible results.’”

As a chilling symposium published last summer on the fifth anniversary of the Gamergate scandal by the New York Times Magazine demonstrated, these events still haunt the online world, not least in how they established a model for harassment that has spread to higher education and poses a major threat to academic freedom.  As writer Charlie Warzel put it in this symposium, “Today, five years later, the elements of Gamergate are frighteningly familiar: hundreds of thousands of hashtag-swarming tweets; armies of fake Twitter accounts; hoaxes and disinformation percolating in murky chat rooms and message boards before spreading to a confused mainstream media; advertiser boycotts; crowdfunding campaigns; racist, sexist and misogynist memes; YouTube shock jocks; D-list celebrities hand-wringing about political correctness on Twitter; Milo Yiannopoulos, Steve Bannon and Breitbart; Candace Owens. . . . Gamergate is occasionally framed as a battle for the soul of the internet between a diverse, progressive set and an angry collection of white males who feel displaced. And it is that, too. But its most powerful legacy is as proof of concept of how to wage a post-truth information war.”

4. “The Year of the MOOC”

The New York Times declared 2012 “the Year of the MOOC.” And I chose “The Year of the MOOC” rather than “The MOOC” as the big disaster because I wanted to underscore how much of the problem here was the PR, the over-promising and the universities and startups believing their own hype.

The hype about massive open online courses really began in late 2011, with huge enrollment in three computer science courses that Stanford offered for free online during the Fall semester, along with the announcement of MITx in December. But starting in January 2012, the headlines — and the wild promises — about MOOCs were incessant: Sebastian Thrun announced he was leaving Stanford to found Udacity, predicting that in 50 years, “there will be only 10 institutions in the world delivering higher education and Udacity has a shot at being one of them”; Stanford professors Andrew Ng and Daphne Koller unveiled their new startup, Coursera; and MIT and Harvard announced the formation of edX. Hundreds of millions of dollars of investment were funneled into these (and other) MOOC providers. Universities scrambled to publicize their partnerships with them. (In June 2012, the University of Virginia Board of Visitors fired President Teresa Sullivan, contending she was too slow to jump on the MOOC bandwagon even though the school was already in talks to join Coursera.)

MOOCs, like so many ed-tech products, were declared “revolutionary,” with hundreds of thousands of students signing up to listen to video-recorded lectures, take online quizzes, and chat in forums. (The technology, despite the deep computer science knowledge of the venture-capital-backed MOOCs’ founders, was mostly crap.) Hundreds of thousands of students signed up, but very, very few finished the courses (and most “successful” MOOC students already had college degrees), a potential counter to any claim that these free online classes were going to extend educational opportunities to everyone and replace a university education. The dropping-out was often dismissed as a feature, not a bug. Students were just curious; they never planned on completing the course, advocates insisted.

But as MOOC providers started to work more closely with universities to offer their courses for credit, the low completion rates (arguably) mattered more. In a high profile announcement in early 2013, California Governor Jerry Brown, San Jose State University President Mo Qayoumi, and Udacity CEO Sebastian Thrun unveiled a pilot program that marked a first for the state: San Jose State would award college credits for special versions of select Udacity classes. The program would “end college as we know it,” TechCrunch cooed. But just a few months later, citing concerns about the quality of the courses, San Jose State put the project on pause. “We have a lousy product,” Sebastian Thrun told Fast Company that fall, saying with a shrug he didn’t even like the word “MOOC.”

Investors continued to fund MOOCs nonetheless. Coursera has raised over $310 million. (Richard Levin, the former President of Yale who led that school’s failed online initiative AllLearn back in the early 2000s, was briefly CEO.) Udacity has raised some $160 million. But even with all that venture capital to keep the lights on, MOOCs have had to look for some sort of revenue model. So gone — mostly gone, at least — are the free online courses. Gone are the free certificates. The MOOC revolution simply wasn’t.


1. Anti-School Shooter Software

The most awful education technology development of the decade wasn’t bankrolled by billionaire philanthropists. Rather it emerged from a much sicker impulse: capitalism’s comfort with making money off of tragedy. It emerged from this country’s inability to address gun violence in any meaningful way. And that is despite a decade that saw a steady rise in the number of school shootings. “10 years. 180 school shootings. 356 victims,” CNN reported this summer. That is despite, at the beginning of the decade, a shooting that left 20 second graders dead. Instead of gun reform, we got anti-school shooter software, sadly a culmination of so many of the trends on this list: surveillance, personalization, data mining, profiling, Internet radicalization, predictive analytics.

For a while, many ed-tech evangelists would bristle when I tried to insist that school security systems and anti-school shooting software were ed-tech. But in the last year or so, it’s getting harder to deny that’s the case. Perhaps because there’s clearly a lot of money to be made in selling schools these products and services: shooting simulation software, facial recognition technology, metal detectors, cameras, social media surveillance software, panic buttons, clear backpacks, bulletproof backpacks, bulletproof doors, emergency lockdown notification apps, insurance policies, bleeding control training programs, armed guards, and of course armed teachers.

“Does It Make More Sense to Invest in School Security or SEL?” EdSurge asked in 2018. Those are the choices education technology now has for us, apparently: surveillance or surveillance.

What an utter failure.