Hybrid Literary Studies: Critical Making, Steampunk, Digital Humanities

Posted by Roger Whitson on March 2nd, 2014


[Slide 1]
I study nineteenth-century British literature, particularly how digital media are changing our understanding of literary studies. This is a map, from the Intergovernmental Panel on Climate Change, of how global warming is increasing the number of cyclone tracks and flooded areas in the Asian Pacific. Global warming is, as Tim Morton has suggested, a hyperobject: it displaces space and time and leads to “unthinkable timescales.” Because of that, I want to suggest that phenomena like these require new forms of collaboration between scientists and humanists.

It’s important, in other words, to think about:

  1. the kinds of hybridities that can be accomplished in literary studies by collaborating with other disciplines
  2. and what literature can do to respond to some of the complexities of our historical moment.

[Slide 2]
This is a tweet from Amanda Watson, an #altac librarian and digital humanist, highlighting the response Brian Croxall got on his application to a Yale University search.
Brian’s already talked about this in one of his presentations. I’m particularly interested in the ideological language about “fit” and “coverage.” Consider these metaphors when placed alongside phenomena like global warming. How does one “cover” a hyperobject like global warming?

People on the job market hear these words all the time, and yet they point to an organization of knowledge in literary studies that is becoming more and more obsolete.

[Slide 3]
This is the timeline-coverage model of literary scholarship from online-literature.com, a fairly conservative version of British literature in the nineteenth and early twentieth centuries. And when I see timelines like this one, my first instinct is to ask: what’s missing? (The horror: Where’s William Blake? Anyone but white men before the Victorian period?)

Grant Wiggins and Jay McTighe (along with Mark Sample) argue that the idea of coverage can act to “protect and conceal” or to “hide from view.” The timeline assumes what we should criticize: the idea of coverage and periodization as objects of knowledge in literary criticism. See, for instance, Ted Underwood’s critique of periodization in Why Literary Periods Mattered.

Knowledge in the digital age doesn’t look like a timeline anymore.

[Slide 4]
It looks more like this. This is a map of connections among sites using Linked Open Data as of September 2011, part of the so-called semantic web. If Web 1.0 is defined by webpages (i.e., form and format), and Web 2.0 is defined by XML (i.e., content), then Web 3.0 is defined by data and action: data that can be processed directly by machines without human intervention. See, for instance, STI’s video. Those of you who know Bruno Latour’s Actor-Network Theory or Jane Bennett’s Vibrant Matter will recognize the idea.
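The core of that machine-readable data is the RDF-style subject–predicate–object triple. Here is a minimal sketch in Python of how a machine can traverse such connections without human interpretation; the `ex:` URIs and literary facts are invented for illustration, not drawn from the actual Linked Open Data cloud:

```python
# Linked Open Data expresses facts as subject-predicate-object triples.
# A machine can follow these links directly, with no human in the loop.
# All "ex:" identifiers below are hypothetical examples.
triples = [
    ("ex:WilliamBlake", "ex:wrote", "ex:SongsOfInnocence"),
    ("ex:SongsOfInnocence", "ex:publishedIn", "1789"),
    ("ex:WilliamBlake", "ex:livedIn", "ex:London"),
]

def objects_of(subject, predicate):
    """Return every object linked to a subject by a given predicate."""
    return [o for s, p, o in triples if s == subject and p == predicate]

print(objects_of("ex:WilliamBlake", "ex:wrote"))  # ['ex:SongsOfInnocence']
```

Real Linked Open Data works the same way, only with globally dereferenceable URIs instead of these toy strings, which is what lets independent datasets interconnect into the map on the slide.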

Ted Underwood has a great quote that summarizes these complications. “Humanists are used to approaching debates about historical representation as if they were zero-sum questions. I suppose we are on some level still imagining this as a debate about canonicity — which is, as John Guillory pointed out, really a debate about space on the syllabus. Space on the syllabus is a zero-sum game. But the process of building big data is not zero-sum; it is cumulative. Every single thing you digitize is more good news for me.”

This brings me to a Deleuzian question: not what do you know or what is the truth, but what can you do?

[Slide 5]
This emergent, complex world requires thinking that emphasizes collaboration, systems-thinking, and a rigorously material approach to technology. As Andrew Blum writes in Tubes, “We treat the internet as if it were a fantasy,” and not as something comprised of physical places, tactile objects, and tubes of fiber-optic cables. I argue that this is due to a lack of knowledge about computation in the humanities.

This is from Friedrich Kittler’s article “Thinking Colours and/or Machines”:
“[I]n principle, the interminably open horizon of human existence does not allow for any computerization, but that computers can be helpful tools (in the sense of Being and Time) since they, like humans, exist on the basis of language. Conceptualizing the most complex technological medium as a tool, however, is so common and comforting that the humanities are free to continue their business as usual. Given that tools are always defined from the point of view of their user, there is no need to question the old approach that defines machines from the point of view of humans; and subsequently there is no need to consider the possibility that, conversely, humans are defined by machines.”

[Slide 6]
My work is turning to two discourses that (for me) help to address these concerns. The first is Critical Making, from Matt Ratto and Garnet Hertz, who are attempting to mix the worlds of hands-on production and critical theory.

The second is steampunk, which I will argue can be considered a mediator between more text-based forms of literary inquiry and the object-based work of the maker movement. Steampunk began as “Victorian science fiction,” but has become a design scheme that incorporates fashion, programming, crafting, metalwork, and more.

[Slide 7]
One big practice in critical making is circuit bending. Consumer electronics are opened up, and metal probes are used to create circuits that were never intended by the manufacturer. Here, we see a Speak & Spell that’s been modded to create electronic music.

Garnet Hertz and Jussi Parikka call similar practices a form of “zombie media,” in which old or obsolete technologies can be opened, modded, and made to work again.

[Slide 8]
This is Matt Ratto’s Fixsel workshop, in which participants create physical models of digital pixels. The purpose is to explore the differences and intersections between screen pixels and their material underpinnings. You can find his critical making experiments documented on his GitHub site. You can also see there some confusion about the intersection of this making practice and the theory informing it. More on that later.

[Slide 10]
The pixel has an interesting connection to materiality. It is basically a physical point in a raster image, which arranges pixels in rectangular grids that approximate the image.

George Dyson’s Turing’s Cathedral discusses how pixels (picture elements) are built from bits (binary digits), and points to fundamental differences in voltage: information is quantified by voltage.

Dyson: “’Any difference that makes a difference’ is how cybernetician Gregory Bateson translated Shannon’s definition into informal terms. To a digital computer, the only difference that makes a difference is the difference between a zero and a one.” (3)

Eventually more than one bit could be crammed into a pixel. Today, one familiar number is 1080, as in a 1080p HD TV: a grid of 1920 × 1080, or roughly two million pixels (the number of pixels per square inch depends on the size and resolution of your screen), and this is a current industry standard.
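The arithmetic of pixels and bits can be sketched in a few lines of Python. This is a minimal illustration; the bit depths are common display conventions, not figures taken from Dyson:

```python
# A raster image is a rectangular grid of pixels; each pixel is stored
# as some number of bits. The earliest displays used 1 bit per pixel
# (on/off); a modern 24-bit RGB pixel uses 8 bits per color channel.
width, height = 1920, 1080   # "1080p" HD resolution
bits_per_pixel = 24          # 8 bits each for red, green, blue

total_pixels = width * height
total_bytes = total_pixels * bits_per_pixel // 8

print(total_pixels)  # 2073600 -- about two megapixels per frame
print(total_bytes)   # 6220800 -- roughly 6 MB per uncompressed frame
```

Even this simple count makes the materiality visible: every frame on an HD screen is millions of discrete voltage-backed elements, long before any abstraction of “image” appears.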

There is a HUGE amount of abstraction in skeuomorphic interfaces, occurring just at the level of what we experience in GUIs, graphics, etc.

[Slide 11]
From Ratto’s description of the workshop: “The fixsel workshop serves as a simple, participatory way to guide reflection regarding the specific materialities of digital images.”

Pre-made materials included pre-programmed ATtiny85 microcontrollers, printed paper templates, an RGB LED, and copper tape.

Then, participants collectively reproduce an image projected on a screen by matching their physical pixels to the projected image’s pixels.

Two questions:
How does the physical pixel version differ from the projected image version?
What is conserved between the two images?

Their critical reflection included Matt Kirschenbaum’s distinction between formal and forensic materiality, N. Katherine Hayles’s notion of medium-specific materiality, and Johanna Drucker’s idea of mathesis: the prevalence of numerical logic in social and political spheres.

[Slide 12]
Was this experience a success? Here’s Adeline Koh’s response to the project: “To critically make something, we should address questions such as the following: Which agents do we give agency to in a project and why? Who are the voices that are allowed to speak, and who are heard? Which components act, and which components act upon?”

These are important questions, but they are incomplete because they underestimate the role of matter in transmitting ideology. Kittler’s comment above is apropos because it points to a widespread assumption among humanists that ideological questions are reducible to human-social institutions or texts rather than to the objects whose materiality interfaces with our bodies in highly ideological ways. This interface is what Richard Sennett calls “material consciousness,” which means, in technologist David Harvey’s words, that “the sum total of our making is a rich source of evidence about our identities as individuals and about the characters of our societies.” Ideology doesn’t just mean a social context; it resides in the very atoms of the things populating our world. By filtering philosophical and political questions about materiality through a hands-on experience, Ratto’s hybrid practice attempted to inspire material consciousness. Yet the hour-long experience might not have been the best way to inspire such a practice.

[Slide 13]
Critical making, on the other hand, gives us a material understanding of technological ideology. For people involved in critical making (and its sister field, media archaeology), ideology is bound up with the materiality of media objects and the ways those materials have histories that are physical and geological.

Shannon Mattern has shown how aspects of technology complicate the social narratives of media studies.

The first is recursive history. Kittler argues, for instance, that we need to stop talking about technology within linear humanist cultural histories.
Instead, there are elements and themes that recur (with a difference) in different technologies and across different moments in history. His example is the siren, which begins as a seductive sea nymph, emerges again as a monster, then emerges again as a technological alarm that fuses aspects of both: it is a form of audio, but there’s nothing particularly “seductive” about hearing an alarm in the middle of the night.

Recursivity is also important in programming. The basic idea is that algorithms repeat the same operations, in loops or in recursive calls, with variables that change on each pass. This impacts the way we understand how time operates, not only in a geological or ideological sense, but within the operation of computational mechanisms.
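Kittler’s “recurrence with a difference” can be made concrete with a trivial sketch in Python (my example, not one from the talk’s sources), showing the same computation written once as recursion and once as a loop:

```python
def countdown_recursive(n):
    """Recursion: the function re-invokes itself with a changed variable."""
    if n == 0:
        return ["liftoff"]
    return [n] + countdown_recursive(n - 1)

def countdown_loop(n):
    """Iteration: a loop repeats the same step with a changed variable."""
    result = []
    while n > 0:
        result.append(n)
        n -= 1
    result.append("liftoff")
    return result

# Both produce the same sequence: the same pattern recurs, each time
# with a difference (a new value of n), until a base condition is met.
print(countdown_recursive(3))  # [3, 2, 1, 'liftoff']
```

In both forms, computational time is not a line but a loop traversed repeatedly under changing conditions, which is the structural point being made about recursive history.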

[Slide 14]
What Siegfried Zielinski calls the deep time of the media manifests itself as material strata in the Earth’s crust. The fact that we can see electronics as what Jussi Parikka calls “mini-mines of minerals and metals” means that technology has a history that exceeds our social understanding of it. The notion of technological obsolescence doesn’t take into consideration the fact that machines often incorporate similar forms of superimposed temporality. For Mattern, this means that “[w]e can dig up the cables, pull out the wires, trace the epigraphy on building facades, analyze the disks – and observe their layering and interconnection.”

[Slide 15]
I believe that steampunk can act to bridge the divide between social forms of thinking and technological ones – because the aesthetic embraces both literary texts and quirky technological objects.

These quirky histories are the product of an increasing awareness of how computation is impacting our sense of time, with an emphasis on recursivity, alternative technologies, and learning how systems work. Steampunk emerged out of alt-histories in science fiction, or “what-if” tales, with H.G. Wells and Jules Verne as important reference points. Two of the canonical foundational texts, Moorcock’s Warlord of the Air and Gibson and Sterling’s The Difference Engine, both concern themselves with so-called historical experiments. They are not self-consciously steampunk, but use the nineteenth century to think about how our present would be different under different historical circumstances. Moorcock imagines how Britain’s colonial empire would have expanded had World War I never occurred. Gibson and Sterling, on the other hand, look at how the information age would have been different had the Analytical Engine actually been built in the nineteenth century. Later texts, such as China Miéville’s Railsea and Fullmetal Alchemist, leave behind the pseudo-verisimilitude of the earlier texts and treat the nineteenth century as a design scheme. I’ll return to that idea later.

[Slide 16]
These steampunk narratives, particularly the idea that nineteenth-century design can inspire work with objects and technology, are behind the object experiments found throughout steampunk. Anachronism for steampunk, in my view, is a way to excavate and creatively manipulate the superimposition of different material ideologies.

Consider, for example, Tim Shealy’s Teacup Stirling Engine. The Stirling engine is interesting to engineers because it was a technology that competed with more successful technologies like the internal combustion engine and the steam engine. It failed for social and technological reasons: particularly the lack of sophisticated lubricants that would have made the engine, potentially, just as efficient as the relatively more successful combustion engine. But this isn’t the only way technologies can fail. As Michael Lindgren recounts, Babbage’s Difference Engine failed due to the withdrawal of funding by the British government, which no longer saw the benefit of allowing the work to continue. The social and the technological work together in these situations.

[Slide 17]
There’s a parallel interest in resurrecting these technologies as part of a so-called “green steam” movement. Engineers are starting to rethink the Stirling engine as a possible solution for alternative fuel needs.

Steampunk designer Jake Von Slatt restored a nineteenth-century traction engine to show the horsepower that’s possible with steam. Each of these steam experiments shows how the genre can create different kinds of technological infrastructures in order to imagine alternatives to the material and technological infrastructure that currently enables the disposability of consumer capitalist products.

[Slide 18]
But steampunk also creates potential for different kinds of technological design. The consulting group Adaptive Path, for instance, found that steampunk design helped market cellphones to rural and illiterate users in India who had no interest in the touchscreens of iPhones and Android phones. The tactile and audible elements of steampunk (clicking dials, knobs, scroll wheels) made for a more satisfying experience. The alternatives embodied in this design contest an ideology that, far too often, imagines our relationship with technology in terms of the sleek GUIs and design aesthetics common among the consumer electronics of America and Europe. This is not to say that the steampunk phone transcends Western techno-ideology. It simply illustrates that Apple’s dominance in design is not inevitable or universal.

[Slide 19]
I’d like to end this discussion by invoking what Julian Bleecker, Kari Kraus, Joshua Tanenbaum, Karen Tanenbaum, and Ron Wakkary call design fiction. The idea is that fiction inspires design prototypes that can impact the manufacturing of technology. A common example is how Star Trek inspired not only science papers imagining the possibility of tractor beams and warp speed, but flip phones designed to look like communicators from the original series. Fiction enables technology, and literary scholars can contribute to this phenomenon in ways that respect our critical and humanistic traditions.

For Wakkary, Tanenbaum, and Tanenbaum, the idea of steampunk as a design fiction helps to reinscribe values that run contrary to the industrial marketplace: do-it-yourself (DIY) practice, rebellion against black-box materiality, and forms of technological adaptation.

But if we take seriously this world of design fiction and embrace it as a major outcome of literary study, where does that leave us as a discipline?

There’s no easy answer to that question. Steampunk is a fascinating way to explore several experiments in the way the nineteenth century remains with us. For instance, what is nineteenth-century about the work of China Miéville or Jay Lake?

There’s a sense of renewal interwoven with the past, which brings me to an important piece about digital preservation from DH scholar, #altac pioneer, and director of UVa’s Scholars’ Lab, Bethany Nowviskie:

“Digital humanities can be forward-looking only by looking back. The extent to which we can have an effective prospect on the future depends on our continued ability to do retrospective work. And this means not only preserving our collections and thinking carefully about the ways that we re-mediate them, but it also means understanding what it is to make and build and transmit and share. […] We make things because that’s how we understand. We make things because that’s how we pass them on, and because everything we have was passed on to us as a made object. We make things in digital humanities because that’s how we interpret and conserve our inheritance. Because that’s how we can make it all anew.”

Thank you.
