Human-Centered Computing for Humanists: Case Studies from the Computational Thinking and Learning Initiative at Vanderbilt University

1. Abstract

The Computational Thinking and Learning Initiative (CTLI) at Vanderbilt University formed out of an awareness that, across society and academe, computation is changing the nature of knowledge. As the practices and methods for producing, sharing, and contesting knowledge change, the enterprise of the university—its disciplinary scholarship, liberal arts mission, and charge to prepare professionals for the world of work—is being reshaped by algorithmic norms. The rise of computational thinking as a transdisciplinary category holds both promise and peril for the humanities. How can humanists, especially digital humanists, take advantage of the push for computational thinking across the curriculum while avoiding the dangers of appropriation? Alternatively, how can those developing and refining computational methods tap into the critical perspectives of the humanities? This paper discusses how an interdisciplinary group of colleagues is drawing on the theory of human-centered computing to develop environments and curricula for students of the humanities to explore the basics of text mining while also providing them with space to critique and resist the imposition of algorithmic rationality.

The CTLI foregrounds a particular image of computational thinking across the curriculum, working to identify, stabilize, and study new forms of human-computer partnership that are responsive to disciplinary ways of knowing. Our aim is to study new ways in which individuals or groups of humans, along with computers or groups of computational entities, can come together productively and critically as collective computational-thinking units and build on complementary strengths to investigate problems while rejecting facile technical solutions. The perspective of human-centered computing, that is, the “design of computing systems with a human focus from beginning to end,”1 functions as the Archimedean point of our collaboration.

A trans-institutional group of researchers and scholars has assembled at the CTLI, including faculty from the schools of arts and science, engineering, and education as well as the library and the data sciences institute. Together, we are exploring how human-centered computing collaborations transform epistemologies, practices, and pedagogies across disciplines.

Following the programmatic overview that Anderson and Ramey presented at DH2019,2 these collaborators selected textual analysis as one of two focus areas (the other being climate change) for the first year of the initiative. Using NetsBlox,3 a block-based programming environment developed at Vanderbilt University, we constructed components and curricula to teach students in the humanities the fundamentals of computational thinking (variables, looping, functions, recursion, etc.) by manipulating textual corpora rather than matrices of numbers. We wanted this environment to have a “low threshold,” so that middle and high school students could use it as an entry point to disciplinary inquiry, and a sufficiently “high ceiling” to allow scholars of literature and history to use it meaningfully as well.
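To give a concrete sense of these activities, the short sketch that follows is a hypothetical, text-first exercise written in Python rather than in NetsBlox blocks: the poem, the word-frequency loop, and the recursive search for the longest word are illustrative stand-ins for the kinds of tasks the environment supports, not the curriculum’s actual materials.

# A hypothetical, text-first exercise in the spirit of the NetsBlox
# activities described above: core computational-thinking concepts
# (variables, loops, functions, recursion) practiced on a poem rather
# than on a matrix of numbers.

POEM = """Because I could not stop for Death
He kindly stopped for me
The Carriage held but just Ourselves
And Immortality"""


def word_frequencies(text):
    """Loop over the words of a text and tally how often each appears."""
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts


def longest_word(words):
    """Recursively find the longest word in a list of words."""
    if len(words) == 1:
        return words[0]
    rest = longest_word(words[1:])
    return words[0] if len(words[0]) > len(rest) else rest


print(word_frequencies(POEM))
print(longest_word(POEM.split()))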

Given our commitment to discipline-specific visions of computational thinking, our pilot project attempted to understand how the professional vision4 of the humanist might resonate with the technological capacity of new computational tools and methods. We viewed human-computer collaboration as an integration of productively different ways of interacting with the objects of analysis (e.g., texts). Computers and humans “read” differently, and the challenge was to put these ways of reading into conversation. We used design-based research5 to investigate possibilities, supporting and studying scholars' creative efforts to engage with technological tools and achieve what they regarded as progress on humanistic projects. We then developed a learning environment that enabled younger students to take on similar relations to computational tools, but in simpler, playful settings. Working with the same computational approaches at different levels of complexity and sophistication, we aimed to gain new perspectives on their power and limitations. We explored a design space with scholars and students centered on the analysis of style and affect in poetry, iteratively identifying and testing functionality with the ultimate objective of creating activities for both secondary-school and undergraduate courses.

In the course of our explorations, we narrowed the team’s focus to three technologies for textual analysis: (i) fundamental natural language processing concepts such as named entity recognition; (ii) word embeddings, such as Word2Vec;6 and (iii) a “query runner” for TEI-encoded documents stored in BaseX,7 a native XML database. During our collaboration, the team applied each of these technologies, first playfully, to explore what insights they could yield with familiar and constructed texts, and then more deliberately, in settings in which the computer’s “readings” were likely to be put into productive conversation with humans’ readings. Finally, the team proposed, on the one hand, questions that revealed new patterns in larger corpora and, on the other, activities that engaged younger learners in reflecting on style as a feature of writing under the writer’s control.
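As a rough indication of what the first two of these technologies look like outside the block-based environment, the sketch below uses spaCy and gensim as stand-ins for the named entity recognition and Word2Vec functionality; the sample line, the toy training sentences, the model parameters, and the assumption that the en_core_web_sm model is installed are ours, chosen for illustration rather than drawn from the project’s corpora.

# Illustrative stand-ins for two of the three technologies: spaCy for
# named entity recognition and gensim for Word2Vec word embeddings.
# Assumes `pip install spacy gensim` and
# `python -m spacy download en_core_web_sm` have been run.
import spacy
from gensim.models import Word2Vec

# (i) Named entity recognition: the computer "reads" for people and places.
nlp = spacy.load("en_core_web_sm")
doc = nlp("In Xanadu did Kubla Khan a stately pleasure-dome decree.")
for ent in doc.ents:
    print(ent.text, ent.label_)

# (ii) Word embeddings: train a tiny Word2Vec model on tokenized lines
# and ask which words it treats as appearing in similar contexts.
# (A real analysis would use a much larger corpus.)
sentences = [
    ["the", "sea", "is", "calm", "tonight"],
    ["the", "tide", "is", "full", "the", "moon", "lies", "fair"],
    ["upon", "the", "straits", "on", "the", "french", "coast", "the", "light"],
]
model = Word2Vec(sentences=sentences, vector_size=50, window=3, min_count=1, epochs=100)
print(model.wv.most_similar("sea", topn=3))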

In this paper, we report on this first year’s effort and our progress in building a block-based computing environment and curriculum that supports textual analysis by both high school students and professors of literature. We discuss how these tools and perspectives scaffold activities as diverse as teaching secondary students about the linguistic differences between poetry and prose and detecting stylistic patterns among Victorian writers. We also discuss the limitations of taking a human-centered computing approach when conducting digital humanities research; in particular, we examine the drawbacks of block-based languages for text mining in comparison with tools like Lexos8 and Voyant.9
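The toy comparison below suggests the kind of surface measures a secondary classroom might assemble when contrasting poetry with prose; the two passages and the two measures (type-token ratio and mean word length) are illustrative choices on our part, not the activities the initiative actually deployed.

# A toy stylometric comparison of the sort students might build up from
# blocks: simple surface measures applied to a line of poetry and a line
# of prose. The passages and measures are illustrative only.

def type_token_ratio(text):
    """Ratio of distinct words to total words (a rough lexical-variety measure)."""
    words = text.lower().split()
    return len(set(words)) / len(words)


def mean_word_length(text):
    """Average number of characters per word."""
    words = text.split()
    return sum(len(w) for w in words) / len(words)


poetry = "Season of mists and mellow fruitfulness close bosom-friend of the maturing sun"
prose = "It was the best of times it was the worst of times it was the age of wisdom"

for label, text in [("poetry", poetry), ("prose", prose)]:
    print(label, round(type_token_ratio(text), 2), round(mean_word_length(text), 2))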

Notes

1 Jaimes, Alejandro, Nicu Sebe, and Daniel Gatica-Perez. "Human-centered computing: a multimedia perspective." In Proceedings of the 14th ACM international conference on Multimedia, pp. 855-864. ACM, 2006.

2 Anderson, Clifford B., and Lynn T. Ramey. “Thinking Computationally in the Digital Humanities: toward block-based programming for humanists.” DH 2019. Utrecht, 2019. https://dev.clariah.nl/files/dh2019/boa/0484.html.

3 See https://netsblox.org/

4 Goodwin, Charles. "Professional vision." American Anthropologist 96.3 (1994): 606-633.

5 Cobb, Paul, et al. "Design experiments in educational research." Educational Researcher 32.1 (2003): 9-13.

6 Mikolov, Tomas, Ilya Sutskever, Kai Chen, Greg S. Corrado, and Jeff Dean. "Distributed representations of words and phrases and their compositionality." In Advances in neural information processing systems, pp. 3111-3119. 2013.

7 See http://basex.org/

8 See https://wheatoncollege.edu/academics/special-projects-initiatives/lexomics

9 See https://voyant-tools.org/

Clifford B. Anderson (clifford.anderson@vanderbilt.edu), Vanderbilt University, United States of America; Corey E. Brady (corey.brady@vanderbilt.edu), Vanderbilt University, United States of America; Brian Broll (brian.broll@vanderbilt.edu), Vanderbilt University, United States of America; and Lynn T. Ramey (lynn.ramey@vanderbilt.edu), Vanderbilt University, United States of America
