with Zachary Dixon
Recently, through the Brookings Institution, Darrell West and Joshua Bleiberg wrote about the promise that “big data” holds to improve the feedback loop and the quality of student writing assessment. Equally important was West and Bleiberg’s recognition that “[r]eal world demonstrations of how big data would fit into the classroom environment have been few and far between,” and that this dearth of practical demonstrations has ultimately limited what writing studies can take from big data.
Acknowledging that same promise, and the same need for pragmatic investigation and application, faculty and graduate students at the University of South Florida have been actively researching the applications of our own big data set, generated through My Reviewers, our web-based suite of tools for the collection and evaluation of student writing. Because that data set is generated in real time, within the real-world context of student writing done in and for our first-year composition (FYC) program, we believe the hypotheses and conclusions drawn from it provide meaningful, actionable proof points that help make our writing program more agentic and objective.
Below, we would like to share some of the recent research generated through that big data set:
Dixon, Z., & Moxley, J. (2013). Everything is illuminated: What big data can tell us about teacher commentary. Assessing Writing, 18(4), 241-256.
This article applies concordance software to the My Reviewers corpus of instructor comments in order to analyze the local practices of instructor feedback, and to describe how digital tools and corpora open new spaces for Writing Program Administrators to develop a portrait of their program’s writing ecology. (For readers unfamiliar with concordancing, a brief illustrative sketch follows the list below.)
- Demonstrates how a single programmatic rubric facilitates high levels of inter-reader reliability
- Highlights the relationship between curricular changes and student success
- Illustrates a pragmatic method and tool for measuring transfer
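The sketch below is only a minimal illustration of the kind of keyword-in-context (KWIC) lookup that concordance software performs over a comment corpus; the search term and sample comments are invented for illustration and are not drawn from the My Reviewers data or the tools used in the study.

```python
# Minimal keyword-in-context (KWIC) concordance sketch.
# The comments and the search term "thesis" are hypothetical examples,
# not actual data from the My Reviewers corpus.

def kwic(comments, term, window=4):
    """Return (left context, matched word, right context) tuples for each hit."""
    hits = []
    for comment in comments:
        words = comment.split()
        for i, word in enumerate(words):
            # Compare case-insensitively, ignoring trailing punctuation.
            if word.lower().strip(".,;:!?") == term.lower():
                left = " ".join(words[max(0, i - window):i])
                right = " ".join(words[i + 1:i + 1 + window])
                hits.append((left, word, right))
    return hits

# Hypothetical instructor comments standing in for the real corpus.
sample_comments = [
    "Your thesis needs to forecast the structure of the essay.",
    "Strong evidence here, but connect it back to your thesis.",
    "Revise the introduction so the thesis appears earlier.",
]

for left, term, right in kwic(sample_comments, "thesis"):
    print(f"{left:>35} | {term} | {right}")
```

Lined up this way, every occurrence of a term can be read in its immediate context, which is how a concordancer lets researchers see patterns in how instructors talk about, say, thesis statements or evidence across thousands of comments.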
Moxley, J. M. (2013). Big data, learning analytics, and social assessment methods. Journal of Writing Assessment, 6. (in press).
This article explores the value of using My Reviewers, social media, and a community rubric to assess writing ability across genres, course sections, and classes, in order to demonstrate ways that learning analytics help facilitate effective, evidence-based curriculum decisions.
- Demonstrates how a single programmatic rubric facilitates high levels of inter-reader reliability
- Highlights the relationship between curricular changes and student success
- Illustrates a pragmatic method and tool for measuring transfer
Langbehn, K., McIntyre, M., & Moxley, J. (2013). Re-mediating writing program assessment. In H. A. McKee & D. N. DeVoss (Eds.), Digital Writing Assessment & Evaluation. Logan, UT: Computers and Composition Digital Press/Utah State University Press.
This chapter describes how My Reviewers facilitated and expedited the First-Year Composition program’s curricular development and demonstrates the ways that networked assessment tools enhance the communal agency and objectivity of writing programs.
- Highlights the relationship between curricular changes and student success
- Describes how WPAs can make evidence-based curricular changes
Vieregge, Q., Stedman, K., Mitchell, T., & Moxley, J. (in press). Agency in the Age of Peer Production. Studies in Writing and Rhetoric Series. Urbana, IL: National Council of Teachers of English.
This book-length qualitative study investigates the highs and lows of how My Reviewers and crowdsourcing contributed to the development of the First-Year Composition program’s community rubric, ultimately evidencing the tectonic shifts in agency that result from applying peer-production methods and tools to communal problems.
- Presents the real-world peer production of a programmatic rubric
- Clarifies the place of individual agency within standardizing practices
- Illustrates My Reviewers’ ability to facilitate inter-reader reliability