Writing Across Berkeley

Algorithmic Literacies for Research and Writing

September 17, 2018

This article provides a brief introduction to algorithms, and considers how new forms of algorithmic literacy might play a role in education.

In the past few years, emerging forms of algorithm-based machine learning tools have begun to compose original works of literature.

I would like to propose here that directing students to actively interpret these new works of literature could provide a novel means to introduce them to traditional rhetorical skills as well as to emerging literacies related to collaborative intelligence, or human/machine collaboration. Skills related to this kind of collaboration, including machine interpretation, are rapidly emerging as critical literacies for 21st century life.

The Harvard Business Review notes that “Humans need to perform three crucial roles. They must train machines to perform certain tasks; explain the outcomes of those tasks, especially when the results are counterintuitive or controversial; and sustain the responsible use of machines (by, for example, preventing robots from harming humans).”

It may not be the case that students of literature need to attend to the training or sustaining of algorithmic tools, but they might, in fact, be uniquely suited to the task of explaining the output of those tools, and to unraveling their implications for reading and writing in the 22nd century.

 

Wait . . . what’s an algorithm?

 

In the middle of the 20th century a group of loosely affiliated French authors formed Oulipo, a collective dedicated to rule-based methods for generating literary works.

Perhaps the group’s most famous work is Georges Perec’s novel La Disparition, an ambitious lipogram that excludes the letter e, the most common vowel in the French language.

Another Oulipian method, Jean Lescure’s N+7, is a fantastic early example of algorithmic literature. To create a work using N+7 you simply replace each noun in an existing work of literature with the seventh noun following the original term in the dictionary. The method is sufficiently programmatic that an N+7 machine (though imperfect in its recognition of nouns) can transform William Carlos Williams’ 1921 poem "The Great Figure" into "The Great Fillet" with the click of a button:

 

The Great Figure

 

Among the rain

and lights

I saw the figure 5

in gold

on a red

firetruck

moving

tense

unheeded

to gong clangs

siren howls

and wheels rumbling

through the dark city.

The Great Fillet

 

Among the raisin

and light-years

I saw the fillet 5

in gondolier

on a red

firetruck

moving

terminal

unheeded

to goon clarets

sit-down howls

and whelks runaway

through the dartboard clairvoyant.

 


An algorithm, then, is simply a set of rules used to solve a problem. In the case of the Oulipo, the problem was how to create a work of literature. The N+7 algorithm applies a single rule (replace each noun with the 7th succeeding noun in a dictionary) to a single input (any work of literature), and outputs a unique piece of writing. You could visualize the N+7 algorithm as a flowchart:

[Flowchart: a work of literature (input) → replace each noun with the seventh noun that follows it in the dictionary → a new work of literature (output)]
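The same steps can be sketched in a few lines of code. This is only an illustration: the noun list below is a toy stand-in for a real alphabetized dictionary (so its output differs from the "Fillet" version above), and, as with any N+7 machine, recognizing which words are actually nouns is the imperfect part.

```python
# A minimal sketch of the N+7 algorithm. The tiny noun list stands in for
# a full alphabetized dictionary; real N+7 machines also struggle with the
# hard part, deciding which words count as nouns.

NOUNS = ["figure", "fillet", "film", "finch", "finger", "fir", "fire",
         "firefly", "fireplace", "firetruck", "firework", "firmament"]

def n_plus_7(text, nouns=NOUNS, n=7):
    """Replace each word found in the noun list with the noun n entries later."""
    out = []
    for word in text.split():
        key = word.lower().strip(".,;:!?")
        if key in nouns:
            # Wrap around if stepping forward n entries runs off the list.
            out.append(nouns[(nouns.index(key) + n) % len(nouns)])
        else:
            out.append(word)
    return " ".join(out)

print(n_plus_7("I saw the figure 5"))  # prints "I saw the firefly 5"
```

With a complete dictionary, "figure" would land on "fillet", as in the Williams transformation above; the principle, one mechanical substitution rule applied to an existing text, is the same.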

Machine writing

 

Most algorithms are far more sophisticated than the N+7 algorithm above. Strides in machine learning techniques in recent years, for example, have enabled algorithmic tools to “learn” to recognize human faces and translate natural languages with tremendous accuracy.

And it turns out that some of the same machine learning methods that led to those advances can also be used to write original pieces of literature. Reading these machine-generated texts, which are often created by algorithms complex enough to be considered forms of artificial intelligence, can provide a strangely illuminating window on the craft of writing.

Consider the short science-fiction film, Sunspring, which was composed by a computer and produced by human beings in 2016. The screenplay for Sunspring “was authored by a recurrent neural network” after it was “fed with a corpus of dozens of sci-fi screenplays.... mostly movies from the 1980s and 90s.”

This is a common procedure in machine learning: a computer takes as its input a large corpus of like items (e.g., newspaper articles, 19th century novels, science fiction screenplays), and then attempts to produce an original work using patterns that it has detected in the corpus.
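The recurrent neural network behind Sunspring is far more sophisticated, but a word-level Markov chain (a drastically simplified analogue, using a made-up miniature corpus) follows the same two-step procedure: detect patterns in a corpus, then generate new text from those patterns.

```python
import random

# Not the neural network that wrote Sunspring, but a toy model that
# follows the same procedure: ingest a corpus, record its patterns,
# and generate an "original" text from them.

corpus = ("the ship drifted through the dark "
          "the crew watched the dark screen "
          "the screen showed the ship drifted")

# Step 1, detect patterns: record which words follow each word.
words = corpus.split()
follows = {}
for prev, nxt in zip(words, words[1:]):
    follows.setdefault(prev, []).append(nxt)

# Step 2, generate: start with a word and repeatedly sample a successor,
# producing text that is locally plausible but globally aimless.
random.seed(0)
current = "the"
generated = [current]
for _ in range(8):
    current = random.choice(follows[current])
    generated.append(current)

print(" ".join(generated))
```

Every adjacent word pair in the output occurs somewhere in the corpus, yet the whole carries no narrative, which is a fair miniature of why Sunspring's dialogue feels eerily familiar and nonsensical at once.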

While machine learning techniques have improved rapidly, it’s not yet the case that the works of literature they create are particularly coherent, or compelling as standalone creations. Sunspring, for example, lacks a clear narrative and the dialogue is frequently nonsensical.

But if one reads the screenplay closely, with special attention to how the machine-author has attempted to reproduce elements detected in its small corpus of late-20th century science fiction screenplays, it’s possible to extend observations about the screenplay itself to broader questions of genre and the craft of science fiction screenwriting. Consider this short assignment idea:


Prompt: Watch the short film, Sunspring (9 min., 2016). The narrative is abstract, but pay close attention to the dialogue. What patterns do you notice? Do particular themes arise? How do the characters relate to each other? Now consider the genre of science fiction films on which the film was based: Do your observations reveal anything more broadly about science fiction films in general? The craft of screenwriting?


A common assignment for undergraduate writing courses asks students to read a work of primary literature in the context of critical secondary sources.

In the short assignment above students read a work created by a machine as both an example of, and a critical reflection on, a particular collection of works.

This encourages close-reading of a primary text as a way to explore a particular frame of understanding.

Rather than a feminist reading of a primary source, or a Marxist one, though, here we ask students to explore what constitutes a machine reading of the primary sources at hand. As machines become more common as authors and co-authors of the texts we interact with in our everyday lives, our ability to understand and interpret their unique points of view will be critical.

 

About the Author: Cody Hennesy is an E-Learning and Information Studies Librarian