1.1 From Digital Humanities to Speculative Computing - UCLA
1.0 Speculative Computing

1.1 From Digital Humanities to Speculative Computing

Johanna Drucker
Our activities in speculative computing were built on the foundation of digital humanities. The community at Virginia in which these activities flourished was largely, though not exclusively, concerned with what could be done with texts in electronic form. Important projects in visualization were part of first generation digital humanities, but the textual inclination of digital humanities was nurtured in part by links to computational linguistics, whose analyses were well served by statistical methods. i (Sheer practicality played its part as well. Keyboarded entry of texts may raise all kinds of not so obvious issues, but no equivalent for “entering” images exists—a useful point, as it turns out, for purposes of the arguments about materiality that will come up here.) But other pioneering projects conceived by literary or historical scholars involved in critical editing and bibliographical studies found the flexibility of digital instruments advantageous. ii Early on, it became clear that aggregation of information, access to surrogates of primary materials, and the manipulation of texts and images in virtual space all provided breakthrough research tools. But these environments also gave rise to theoretical and critical questions that prompted innovative reflections on traditional scholarship.
The character of digital humanities was formed initially by concessions to the exigencies of computational disciplines. iii Humanists played by the rules of computer science and its formal logic, at least at the outset. Part of the excitement was learning new languages through which to rethink our habits of work. The impulse to challenge the
1.1 History / 3/2008 /
cultural authority of computational methods in their received form only came after a period of infatuation with the power of digital technology and the mythic ideal of mathesis (a formal approach to knowledge representation) it seemed to embody. That period of infatuation (a replay of a long tradition) fostered the idea that formal logic might be able to represent human thought as a set of primitives and principles, and that digital representation might be the key to unlocking its mysteries. Quaintly naïve as this may appear in some circles, the ideal of a fully managed world governed by logical procedures is an ongoing utopian dream for many who believe rationality provides an absolute, not a relative, basis for knowledge, judgment, or action. iv The linguistic turn in philosophy in the early decades of the 20th century was fostered in part by the development of formalist approaches that aspired to the reconciliation of natural and mathematical languages. v The intellectual premises of British analytic philosophy and those of the Vienna Circle, for instance, were not anomalies, but mainstream contributions to a tradition of mathesis that continued to find champions in structural linguistics and its legacy throughout the 20th century. vi The popular culture image of the brain as a type of computer turns these analogies between thought and processing into familiar clichés. Casual references to nerve synapses as logic gates or behaviors as programs promote an unexamined but readily consumed idea: that a total analysis of human thought processes is possible, as if they could be ordered according to formal logic. vii Science fiction writers have exploited these ideas endlessly, as have futurologists and pundits given to hyperbole, but the receptivity to their ideas shows how deeply rooted the mythology of mathesis is in the culture at large. viii
Digital humanists, however, were not interested in analogies between organic bodies and logical systems, but in the intellectual power of information structures and processes. The task of designing content models or conceptual frameworks within which to order and organize information, as well as the requirements of data types and formats at the level of code or file management, forged a pragmatic connection between humanities research and information processing. The power of meta-languages expressed as classification systems and nomenclature was attractive, especially combined with the intellectual discipline imposed by the parameters and stringencies of working in a digital environment. A magical allure attached to the idea that imaginative artifacts might yield their mysteries to the traction of formal analyses, or that the character of artistic expressions might be revealed by their place within logical systems. The distinction between managing or ordering texts and images with metadata or classification schemes and the interpretation of their essence as creative works was always clear. But still, certain assumptions linked the formal logic of computational processes to the representation of human expressions (in visual as well as textual form), and the playful idea that one might have a “reveal codes” function that would expose the compositional protocols of an aesthetic work had a way of taking hold. At first glance, the ability of formal processing to manage complex expressions either by modeling or manipulation appeared to be mere expediency. But computational methods are not simply a means to an end. They are a powerful change agent setting the terms of a cultural shift.

By contrast, the premises of speculative computing are not just a game played to create projects with risky outcomes, but a set of principles through which to push back on the cultural authority by which computational methods instrumentalize their effects
across many disciplines. The villain, if such a simplistic character has to be brought on stage, is not formal logic or computational protocols, but the way the terms of such operations are used to justify decisions about administration and management of cultural and imaginative life on the basis of presumed objectivity and its totalizing claims. ix The terms on which digital humanities had been established, while essential for the realization of projects and goals, needed to come under scrutiny within a larger critical frame because of the way they aligned with such managerial methods. As in any ideological formation, the unexamined character of assumptions allows them to pass as natural. We defined speculative computing to push subjective and probabilistic concepts of knowledge as experience (partial, situated, and subjective) against objective and mechanistic claims for knowledge as information (total, managed, and externalized).
* * *
If digital humanities activity were to be reduced to a single precept, it would be the requirement to disambiguate knowledge representation so that it operates within the codes of computational processing. This requirement has the benefit of causing humanist scholars to become acutely self-conscious about the assumptions on which we work, but it also forces us to concede many aspects of ambiguity for the sake of workable solutions in a computational environment. Basic decisions about the informative or substantive value of any document rendered as a digital surrogate are highly charged with theoretical implications. Every decision about whether a text is rendered through keyboarding into ASCII, stripping away the format features of its original, or about what type or category of work a file is determined to be, is fraught with implications. Is The Confessions of Jean Jacques Rousseau a novel? The document of an era? Or a
biographical portrait? A memoir and first-person narrative? Or a historical fiction?
Should the small, glyphic figures in William Blake’s handwriting that appear within his lines of poetry be considered part of the text? Or simply disregarded because they cannot be rendered in an ASCII symbol set? x At every stage of development, digital instruments require such decisions. As these are made, the cultural legacy that becomes available in digital form is shaped through interpretive acts.

Because of this intense engagement with interpretation and questions about the presumed foundations of epistemology, the field of digital humanities extends the theoretical questions that came into focus in deconstruction, post-modern theory, critical and cultural studies, and other theoretical inquiries of recent decades. Basic concerns about the ways processes of interpretation constitute their objects within cultural and historical fields of inquiry are raised again, and with another level of historical baggage and cultural charge attached to them. Who, after all, will hold the power over the decisions for modeling the classification systems for knowledge in digital representations? What does it mean to create ordering systems, models of knowledge and use, or environments for aggregation or consensus? The next phase of cultural power struggles will be embodied in the digital instruments that will model what we think we know and what we can imagine.
Digital humanities is an applied field as well as a theoretical one. xi The task of taking these meta-considerations into application puts assumptions to a different set of tests and also raises the stakes with regard to outcomes. Theoretical insight is constituted in this field in large part through encounters with application. The statistical analysis of texts, creation of structured data, and design of information architecture are the basic
elements of digital humanities. Representation and display are integral aspects of these elements, but they are often premised on an engineering-influenced approach, grounded in a conviction that transparency or accuracy in the presentation of data is the best solution. Blindness to the rhetorical effects of design as a form of mediation (not transmission or delivery) is one of the features of that cultural authority of mathesis that plagues the digital humanities community. Expediency is the name under which this authority exercises its control, and in its shadow, a stealth conviction grows in which resolution and disambiguation are virtues, and where “well-formed” data behaves in ways that eliminate the contradictions tolerated by (traditionally self-indulgent) humanists.
This hierarchy of virtues arises from the fact that approaches to digital projects are usually highly pragmatic: creating a searchable corpus, making primary materials for historical work available, or linking these to an interactive map and timeline capable of displaying data selectively. The theoretical issues that arise are therefore intimately bound to the actual work of building online collections and projects. These dimensions become readily apparent in the design of any information structure, particularly if hierarchies or other structures are brought into play. Humanists are skilled at complexity and ambiguity. Computers, as is well known, are not. In that clash of value systems fundamental epistemological and ideological differences arise. Within the digital humanities community, I frequently saw the triumph of computer culture over humanistic values. xii The attitude that objectivity was a virtue, by contrast to the supposedly fuzzy quality of subjectivity, became evident. And the way this objectivity – defined in many cases within the computation community as anything that can be accommodated to formal logical processes – presented itself as truth and passed as utterly natural in the culture at large. All
the lessons of deconstruction and post-structuralism – the extensive critiques of reason and grand narratives, the recognition that presumptions of objectivity are merely cultural assertions of a particular, historical formation – threaten to disappear under the normalizing pressures of designing projects to conform to digital protocols. This realization drove SpecLab’s thought experiments and design projects, pushing us to envision and realize alternative possibilities.
Digital Humanities and Electronic Texts

As stated above, digital humanities is not defined entirely by textual projects, though insofar as the community in which I was involved focused largely on text-based issues, its practices mirrored the logocentric habits endemic to the academic establishment. xiii I came to many of my own convictions through reflections on visual knowledge production, but these were realizations formulated in dialogue with the textual studies community in digital humanities. Understanding the premises on which work in that arena was conceived is important as background for our design work at SpecLab – and to undoing any easy opposition that might seem to arise from pitting visual works against texts, or analogue modes against digital ones, binarisms that arise so readily as obstacles to more complex thought.
Textual studies met computational methods on several different fields of engagement. Some of these were methods of manipulation, such as word processing, hypertext, or codework (the term usually reserved for creative productions made by setting algorithmic procedures in play). xiv Others were tools for bibliographical studies, critical editing and collation, stylometrics, or linguistic analysis. Another, also mentioned
briefly above, was the confluence of philosophical and mathematical approaches to the
study of language that shared an enthusiasm for formal methods. The history of programming languages and their relation to modes of thought, as well as their contrast with natural languages, is yet another. All of these have a bearing on digital humanities, either directly (as tools taken up by the field) or indirectly (as elements of the larger cultural condition within which digital instruments operate effectively and gain their authority).
Twenty years ago a giddy excitement about what Michael Heim termed “electric language” turned the heads of humanists and writers. xv Literary scholars influenced by deconstruction saw in digital texts a condition of mutability that seemed to put the idea of differential “play” into practice. xvi And the linking, browsing, combinatoric possibilities of hypertext and HyperCard provided a rush to the authorial imagination. Suddenly it seemed that conventions of “linearity” were being exploded by the manipulative possibilities of new media into rhizomatic networks that undercut the appearance of stasis attributed to the printed page. Text seemed fluid, mobile, dynamically charged. Since then, habits of use have encoded the once dizzying concept of links into daily activity in web browsers, and the magic of being able to rewrite and rework texts on the screen is just the business of everyday life. But as the special effects “wow” factor of those early encounters has evaporated, the deeper potential for interrogating what a text is and how it works has come into view within more specialized practices of electronic scholarship and criticism. In particular, a new order of metatexts has come into being that encodes (and thus exposes) attitudes towards textuality.
Early digital humanities is generally traced to the work of Father Busa, whose Index Thomisticus was begun in 1949. xvii Busa’s scholarship involved statistical processing (the creation of concordances, word lists, and studies of frequency). His repetitive tasks were dramatically speeded by the automation enabled by computers. Other developments followed in stylometrics (quantitative analysis of style for attribution and other purposes), string searches (on texts as sequences of letters), and increasingly sophisticated processing of the semantic content of texts (context-sensitive analysis, the semantic web, etc.). xviii More recently, scholars involved in the creation of electronic archives and collections have established conventional approaches to metadata (such as the Dublin Core standard’s information fields for electronic documents), mark-up (the TEI, or Text Encoding Initiative), and other elements of digital text processing and presentation. xix This process continues to evolve as the scale of materials online expands, from the creation of digital repositories (archives and collections) and issues of access and display to peer-reviewed publishing, interpretative tools, and other humanities-specific activities that take advantage of the participatory, iterative, and dynamic properties of electronic environments. The encounter of texts and digital media has reinforced theoretical realizations that printed materials are not static, self-identical artifacts and that the act of reading and interpretation is a highly performative intervention in a textual field that is charged with potentiality rather than existing in a fixed condition. xx One of the challenges we set ourselves was to envision ways to show this dramatically, in a visual demonstration, rather than simply to assert it as a critical insight.
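The interpretive force of a metadata standard such as Dublin Core is easiest to see in miniature. Below is a minimal sketch in Python: the element names and namespace belong to the published Dublin Core element set, but the record and all its field values are invented for illustration. Choosing "Text" rather than "Image" for the type element is precisely the kind of interpretive decision discussed in this chapter.

```python
import xml.etree.ElementTree as ET

# Dublin Core element namespace (a real, published standard);
# the record and its field values below are invented for illustration.
DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC)

record = ET.Element("record")
fields = [
    ("title", "Letter to an unnamed correspondent"),
    ("creator", "Unknown"),
    ("date", "1823"),
    ("type", "Text"),      # or "Image"? the scheme forces a decision
    ("format", "image/tiff"),
]
for name, value in fields:
    ET.SubElement(record, f"{{{DC}}}{name}").text = value

print(ET.tostring(record, encoding="unicode"))
```

The scheme defines what can be said: a field the standard lacks simply cannot be recorded without extending the model.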
The technical processes involved in these activities are not simply mechanical manipulations of texts. So-called “technical” operations always involve interpretation, often structured into the shape of the metadata, mark-up, search design, or presentation and expressed in graphic display. The grid-like structures and frames in web browsers express this interpretive organization of elements and their relations, though not in anything like an isomorphic mirroring of data structures to visual presentation. Web page features such as sidebars, hot links, menus, and tabs have become so rapidly conventionalized that their character as representations has become invisible. Under scrutiny, the structural hierarchy of information coded into buttons, bars, windows, and other elements of interface reveals the rhetoric of display. Viewing the source code – the electronic equivalent of looking under the hood – shows an additional level of information structure. But this still doesn’t provide access to or reading knowledge of the metadata, database structures, programming protocols, mark-up tags, or style sheets that are behind the display. Because these various metatexts actively structure a domain of knowledge production in digital projects, they are crucial instruments in the creation of the next generation of our cultural legacy. Arguably, few other textual forms will have greater impact on the way we read, receive, search, access, use, and engage with the primary materials of humanities studies than the metadata structures that organize and present that knowledge in digital form. xxi
Digital humanities can be described in terms of its basic elements – statistical processing, structured data, metadata, and information structures. Migrating traditional texts into electronic form allows certain things to be done with them that are difficult, if not impossible, with print texts. These functions are largely enabled by the ability of computers to perform certain kinds of rote tasks very quickly. Automating the act of string searching (matching specific sequences of alphanumeric characters) allows the
creation of concordances, word lists, and other statistical information about a text. This supports stylometrics, the statistical analysis of characteristics of style for purposes of attribution. It also allows large quantities of text to be searched for discourse analysis, particularly that which is based on reading a word, term, or name in all its many contexts across a corpus of texts.
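A keyword-in-context search of the sort described above, the mechanical core of a concordance, can be sketched in a few lines of Python; the sample sentence and the context-window width are arbitrary illustrations, not drawn from any particular project.

```python
def concordance(text, term, width=30):
    """Collect every occurrence of term with its surrounding context
    (a minimal keyword-in-context, or KWIC, index)."""
    hits = []
    lower, needle = text.lower(), term.lower()
    i = lower.find(needle)
    while i != -1:
        left = text[max(0, i - width):i]
        right = text[i + len(term):i + len(term) + width]
        hits.append((left, text[i:i + len(term)], right))
        i = lower.find(needle, i + 1)
    return hits

# Illustrative sample, not a real corpus.
sample = "The text seemed fluid. Reading the text again changed the text itself."
for left, hit, right in concordance(sample, "text"):
    print(f"{left:>30}[{hit}]{right}")
```

Run over a whole corpus instead of one sentence, the same loop yields exactly the kind of every-instance-in-context listing that once took years of hand labor.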
Many of the questions that can be asked using these methods are well served by automation. Finding every instance of a word or name in a large body of work is so tedious and repetitive that the basic grunt work – like that performed by the aforementioned Father Busa – once took so long that analysis was deferred through years of gathering data by hand. Automating highly defined tasks creates enormous amounts of statistical data. The data then suggest other approaches to the study at hand. The use of
pronouns vs. proper names, the use of first person plural vs. singular, the reduction or expansion of vocabulary, use of Latinate vs. Germanic forms – these are very basic elements of linguistic analysis in textual studies that give rise to interesting speculation and scholarly projects. xxii Seeing patterns across data is a powerful effect of aggregation. Such basic automated searching and analysis can be performed on any text that has been put into electronic form.
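Counts of the kind just listed, first person singular versus plural, for instance, reduce to simple tallies once a text is in electronic form. A minimal sketch in Python follows; the sample sentence is invented and the pronoun lists are deliberately naive, not a standard linguistic inventory.

```python
import re
from collections import Counter

# Naive stylometric tally: first-person singular vs. plural pronouns.
# These word lists are illustrative, not a standard inventory.
SINGULAR = {"i", "me", "my", "mine", "myself"}
PLURAL = {"we", "us", "our", "ours", "ourselves"}

def person_counts(text):
    """Return (singular, plural) first-person pronoun counts for text."""
    words = re.findall(r"[a-z']+", text.lower())
    tally = Counter(words)
    return (sum(tally[w] for w in SINGULAR),
            sum(tally[w] for w in PLURAL))

# Invented sample sentence.
sample = "We defined our project together, but I kept my doubts to myself."
print(person_counts(sample))
```

What a scholar does with such a ratio (tracking a narrator's voice across chapters, say) is where speculation begins; the counting itself is pure rote work of the kind computers excel at.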
In the last decade the processes for statistical analysis have grown dramatically in sophistication. String searches on ASCII (keyboarded) text have been superseded by folksonomies and tag clouds generated automatically through tracking patterns of use. xxiii Search engines and analytic tools no longer rely exclusively on the tedious work of human agents as part of the computational procedure. xxiv Data mining allows context-dependent and context-independent variables to be put into play in ways that would have required elaborate coding in an earlier era of digital work. xxv The value of any
computational analysis hangs, however, on deciding what is able to be parameterized (subject to a quantitative or otherwise standard assessment). The terms of the metric or measure according to which any search or analytic process is carried out are framed with particular assumptions about the nature of data. Questioning what is considered data, what is available for analysis, is as substantive a consideration as what is revealed by its analysis. I am not just making a distinction between what can be measured easily (because it is discrete and lends itself to digital processing, such as counting the number of e’s in a document) and what cannot (calculating the white space that surrounds them). Rather, I am intent on calling attention to the difference between what we think can be measured and what is outside that conception entirely (e.g. the history of the design of any particular “e” as expressed or repressed in its form). The critique that post-structuralism posed to structuralist formalisms exposed assumptions based in cultural value systems expressed as epistemological ones. The very notion of a standard metric is itself an ideology. (The history of any “e” is a complicated story indeed.) The distinction between what can be parameterized and what cannot is not the same as the difference between analogue and digital systems, but between complex, culturally situated approaches to knowledge and totalized, systematic ones. xxvi
Metalanguages and metatexts
The automated processing of textual information is fundamental to digital humanities, but, as stated earlier, so is the creation and use of metatexts, which serve not only as description and enhancement of information but also as performative instruments. As readers and
writers we are habituated to language and, to some extent, to the “idea of the text” or “textuality” as well. Insights into the new functions of digital metatexts build on arguments that have been in play for twenty years or more in bibliographic, textual, and critical studies. xxvii Meta-languages have a fascinating power, carrying a suggestion of higher order capabilities. xxviii As texts that describe a language, naming and articulating its structures, forms, and functions, they seem to trump languages that are merely used for composition or expression. A metatext is a subset of meta-language, one that is applied to a specific task, domain, or situation. Digital metatexts are not merely descriptive; they are not simply higher order commentaries on a set of texts. They are performative and, in many cases, contain protocols that enable dynamic procedures of analysis, search, and selection, as well as display. Even more importantly, metatexts express models of the field of knowledge in which they operate. Through the structure and grouping of elements within the outline of metadata (what elements should be part of the metadata for a title, publication information, or physical description of an artifact?) or the terms a metadata scheme contains (media or formats of graphical forms, the media of their reproduction, or just a description of the imagery or pictorial elements), these instruments have a powerful effect. They shape the fields in which they operate quite explicitly by defining what can and cannot be said about the objects in any particular collection or online environment. Metadata schemes must be read as models of knowledge, as discursive instruments that bring the object of their inquiry into being.

This insight, like many others taken from the encounter between digital technology and textual works and practices, opens our horizon of understanding about traditional texts as well as the basic humanistic projects of scholarship and interpretation.
Analysis of metadata and content models is an essential part of the critical apparatus of digital humanities.

One tenet of faith in the field of digital humanities is that engaging with the constraints of electronic texts provides reflection on traditional text formats. xxix The requirement to make explicit much of what is left implicit is a necessity in a digital environment. Thus the oft-repeated principle that computers can’t tolerate the kind of ambiguity characteristic of humanities texts or interpretative methods. Because digital metatexts are designed to do something to texts (divide elements by content, type, or behavior) or to do something as metatexts (in databases, mark-up languages, metadata), they are performative.
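That performative character can be made concrete: software acts on tags rather than merely reading past them. The following sketch uses Python's standard-library ElementTree; the tagged fragment, its element names, and the selection it performs are all invented for illustration.

```python
import xml.etree.ElementTree as ET

# An invented content-tagged fragment; the tag names are illustrative only.
doc = """<letter>
  <salutation>Dear reader,</salutation>
  <body>Markup does something: it enables selection and display.</body>
  <date>1823</date>
</letter>"""

root = ET.fromstring(doc)

# The markup performs: it lets a program select, suppress, or reorder
# elements by type, a dynamic procedure no untagged text stream supports.
for element in root:
    print(element.tag, "->", element.text)

dates = [e.text for e in root.iter("date")]
print("dates found:", dates)
```

The same characters without tags would support only string matching; with them, the metatext itself drives analysis, search, and display.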
The term structured data applies to any information on which a formalized language of analysis has been imposed. In textual work in digital humanities, the most common type of structured data takes the form of mark-up language. The term "mark-up" simply refers to the act of putting tags into a stream of alphanumeric characters. The most familiar form of mark-up is HTML (HyperText Markup Language), which consists of a set of tags for instructing browsers how to display information. HTML tags identify format features: a header is labeled and displayed differently than a break between paragraphs or text sections. While this may seem simplistic, and obvious, the implications for interpretation are complex. HTML tags are content neutral. They describe formal features, not types of information:
<i>This, says the HTML tag, should be rendered in italics for emphasis.</i>
But the italics used for a title and those used for emphasis are not the same. Likewise a "header" is not the same as a "title" – they belong to different classification schemes, one graphical and the other bibliographical. While graphic features such as bold, italic, or relative scale and size have semantic value, their semiotic code is vague and insubstantial.
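The point can be made concrete with a small sketch. In the hypothetical snippets below (invented for illustration), a presentational tag records only "render in italics"; extracting the tagged span recovers the formatting but not whether the italics mark a title or an emphasis:

```python
import re

# Two semantically different uses of italics, identical presentational markup.
snippets = [
    "She was reading <i>Middlemarch</i> on the train.",  # italics mark a title
    "You <i>must</i> see this.",                         # italics mark emphasis
]

# Pulling out the italicized spans recovers the graphic feature only;
# nothing in the markup says which classification scheme applies.
italicized = [re.search(r"<i>(.*?)</i>", s).group(1) for s in snippets]
```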
XML (eXtensible Markup Language) contains tags that describe and model content. XML labels the content of texts. So instead of identifying "headers" or "paragraphs" and other physical or graphical elements, XML tags identify content types: titles, subtitles, authors' names or pseudonyms, places of publication, dates of editions, and so on:
<conversation>
"Really, is that what XML does?" she asked.
"Yes," he replied, graciously, trying to catch her gaze.
</conversation>
All this seems straightforward enough until we pause to consider that perhaps this exchange should take a <flirtation> tag, given the phrases that follow:
<flirtation> [start here?]
<conversation>
"Really, is that what XML does?" she asked.
"Yes," he replied, graciously, [or should <flirtation> start here?] trying to catch her gaze.
</conversation>
<flirtation> [or start here?]
His glance showed how much he appreciated the intellectual interest—and the way it was expressed by her large blue eyes, which she suddenly dropped, blushing. "Can you show me?" </flirtation>
But where does <flirtation> begin and end? Before or after the first exchange? Or in the middle of it? The importance of defining tag sets and of placing individual tags
becomes obvious very quickly. XML tags may describe formal features of works such as
stanzas, footnotes, cross-outs, or other changes in a text. XML tag sets are based on conventions for domain- and discipline-specific tasks. The tags used in marking up legal or medical documents are very different from those appropriate to the study of literature, history, or biography. Even when tags are standardized in a field or within a research group, making decisions about which tags to use in any situation involves a judgment call and relies on considerable extra-textual knowledge. In the example above, the concept of <flirtation> is far more elastic than that of <conversation>.
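Because such tags label content rather than display, a program can address the tagged elements directly. A minimal sketch (the element names are illustrative, assuming each remark is wrapped in its own tag):

```python
import xml.etree.ElementTree as ET

doc = """<conversation>
  <exchange>"Really, is that what XML does?" she asked.</exchange>
  <exchange>"Yes," he replied, graciously.</exchange>
</conversation>"""

root = ET.fromstring(doc)
# Content-level tags make every exchange retrievable as a unit,
# independent of how a browser might display it.
exchanges = [e.text for e in root.findall("exchange")]
```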
XML documents are always structured as nested hierarchies, tree structures, with parent and child nodes and all that this rigid organization implies. The implications of this rigidity brought the tensions between mathesis and aesthesis to the fore in what serves as an exemplary case. The hierarchical structure of XML was reflected in a discussion of what was called the OHCO thesis. OHCO stands for "ordered hierarchy of content objects." The requirements of XML were such that only a single hierarchy could be imposed on (actually inserted into) a document. This meant that a scholar was frequently faced with the problem of choosing between categories or types of information to be tagged. A conflict between marking the graphic features of an original document and the bibliographic features of the same document was one such problem faced by scholars migrating materials into electronic forms. Did one chunk a text into chapters, or into pages? One could not do both: a chapter could end and another begin on the same page, so the two systems cancelled each other out. Such decisions might seem trivial hair-splitting, but not if attention to the material features of a text is considered important.
Returning to the original example above, imagine trying to sort out where a <flirtation> begins and ends, and how it overlaps with other systems of content (<conversation> or <paragraph>). The formal constraints of XML simply do not match the linguistic complexity of aesthetic artifacts. xxx Even the distinction between saying that texts could be considered ordered hierarchies for the sake of markup, rather than saying that they were structured in modular chunks, registers a distinction that qualifies the claims of the formal system. But even as these philosophical quarrels and challenges arose, the process of tagging went on, and with it a normalization of such practices within a community where such issues are simply accepted as necessary for pragmatic work. xxxi
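The single-hierarchy requirement is not merely a convention; it is enforced as a well-formedness rule by every XML parser. A minimal sketch, assuming Python's standard library parser, shows a chapter boundary that crosses a page boundary being rejected outright:

```python
import xml.etree.ElementTree as ET

# One clean hierarchy: pages nested inside chapters. Parses without complaint.
nested = "<doc><chapter><page>text</page></chapter></doc>"
ET.fromstring(nested)

# A chapter ending and another beginning on the same page: overlapping tags.
overlapping = "<doc><page>end of chapter one<chapter>start of chapter two</page></chapter></doc>"
try:
    ET.fromstring(overlapping)
    well_formed = True
except ET.ParseError:
    well_formed = False  # the parser refuses the second, crossing hierarchy
```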
Because XML schemes can be extremely elaborate, a need for standardization within professional communities quickly became apparent. Even the relatively simple task of standardizing nomenclature so that a short title is always <shortTitle>, not <short-title> or <ShTitle>, requires that tags be agreed upon in a community of users. The goal of creating significant digital collections required consensus and regulation, and an organization called the Text Encoding Initiative was established. TEI, as Matt Kirschenbaum once wittily remarked, is a shadow world government. He was right of course, since an organization setting standards for knowledge representation, especially standards that are essentially invisible to the average reader, is indeed a powerful entity. These are the subtle, often insidious because highly naturalized, means by which computational culture infiltrates humanist communities and assumes an authority over their operations by establishing protocols and practices that require conformity. One can shrug off a monopoly hold on nomenclature with the same disregard as a resignation to
standard-gauge rails, or suggest that perhaps transportation and interpretation are necessarily subject to different regimes. Because of the rhetorical force of digital instruments, I suggest the latter approach.
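The cost of unregulated nomenclature is easy to demonstrate. In the sketch below (tag names invented for illustration), three projects tag the same field three different ways; a crosswalk to one canonical name, the kind of agreement TEI formalizes, is what makes a single query possible:

```python
import re

# The same bibliographic field, tagged three ways by three projects.
records = [
    "<shortTitle>Moby-Dick</shortTitle>",
    "<short-title>Moby-Dick</short-title>",
    "<ShTitle>Moby-Dick</ShTitle>",
]

def normalize(record: str) -> str:
    # Rewrite every variant tag name onto the agreed canonical form,
    # preserving the opening/closing slash and the element content.
    return re.sub(r"(</?)(shortTitle|short-title|ShTitle)(>)",
                  r"\1shortTitle\3", record)

normalized = [normalize(r) for r in records]
```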
Discussion of tags is a bit of a red herring, as they may disappear into historical obsolescence, replaced by sophisticated search engines and other analytic tools. But the problem raised by XML tags, or any other system of classifying and categorizing information, will remain: they exercise rhetorical and ideological force. If <freedom> is not a tag or allowable category, then it cannot be searched. Think of the implications for concepts like <dissent> or <justice>. A set of tags for structuring data is a powerful interpretative grid imposed on a human expression that has many ambiguities and complexities. Extend the above example to texts analyzed for policy analysis in a political crisis and the costs of conformity rise. Orwell's dark imaginings are readily realized in such a system of explicit exclusions and controls.
Sinister paranoia aside, the advantages of structured data are enormous. Their character and content make digital archives and repositories different in scale and character from websites, and will enable next-generation design features to aggregate and absorb patterns of use into flexible systems. The distinction between an archive or repository and a website is the distinction between a dynamic archive and a static or fixed display. Websites built in HTML make use of an armature for organizing information. They hold and display it in one form, just as surely as a display case or shop window holds work. You can pick a path through it, read in whatever order you like, but you can't repurpose the data, aggregate it, or process it. A website with a collection of book covers, for instance, might contain hundreds of images that you can access through an index. If it
has a database under it, you might be able to search for information across various fields. But an archive, like Holly Shulman's Dolley Madison letters, contains fully searchable, tagged text. xxxii That archive contains thousands of letters, and the ASCII text transcription of all of the letters is tagged, marked up, structured. Information about Madison's social and political life can be gleaned from that archive in a way that would be impossible in a simple website.
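The contrast can be sketched in a few lines. Against a flat page, only substring matching is possible; against tagged text, a query can address a field. (The element names and the letter below are invented for illustration, not taken from the Madison archive.)

```python
import xml.etree.ElementTree as ET

# Flat display: the recipient and date are just characters in a string.
page = "<p>Letter to A. Correspondent, 14 May 1804. My dear friend ...</p>"
hit = "1804" in page  # substring search; no sense of what "1804" is

# Structured archive: each field of the letter is individually addressable.
letter = ET.fromstring(
    "<letter>"
    "<recipient>A. Correspondent</recipient>"
    "<date>1804-05-14</date>"
    "<body>My dear friend ...</body>"
    "</letter>"
)
recipient = letter.findtext("recipient")  # a fielded query, not a string match
date = letter.findtext("date")
```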
Through the combined force of its descriptive and performative powers, a digital metatext embodies and reinforces assumptions about the nature of knowledge in a particular field. The metatext is only as good as the model of knowledge it encodes. It is built on a critical analysis of a field and expresses that understanding in its organization and the functions it can perform. The intellectual challenge comes from trying to think through the ways the critical understanding of a field should be shaped, or what should comprise the basic elements of a graphical system to represent temporality in humanities documents. The technical task of translating this analysis into a digital metatext is trivial by contrast with the compelling exercise of creating the intellectual model of a field.
Models and Design

Structured data and metatexts are expressions of a higher-order model in any digital project. That model is the intellectual concept according to which all the elements of a digital project are shaped, whether consciously or not. One may have a model of what a book is or how the solar system is shaped without having to think reflectively about it, but in creating models for information structures, the opportunity for thinking self-consciously about the importance of design is brought to the fore.
A model creates a generalized schematic structure, while a representation is a stand-in or surrogate for something else. A textual expression may be an utterance, an instance of exposition or composition, and though it encodes all kinds of assumptions, it may not explicitly schematize a model. A representation may be as specific and as replete with information as the original. A portrait, a nameplate, a handprint, and a driver's license are all representations of a person. None are models. All are based on models of what we assume a portrait, a name, an indexical trace, or an official document to be. The generalized category of "official document" can itself be modeled so that it contains various parameters and defining elements. A model is independent of its instances. A representation may be independent of its referent, to use the semiotic vocabulary, but it is specific and not generalizable. A model is often conceived as a static form, but it is also dynamic, functioning as a program to call forth a set of actions or activities. The design of the e-book, to which we will return in a final chapter, provides a case study in the ways a model of what a common object is can be guided by unexamined principles that produce non-functional results.
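The model/representation distinction has a familiar analogue in programming, where a class generalizes and an instance stands in for a particular. A toy sketch (names invented):

```python
from dataclasses import dataclass

@dataclass
class OfficialDocument:
    # The model: a generalized schema of parameters and defining elements,
    # independent of any instance of it.
    holder: str
    issuer: str
    expires: str

# A representation: specific and replete, a stand-in for one person's
# license, but not itself a model.
license_record = OfficialDocument(holder="A. Person", issuer="DMV", expires="2030-01-01")
```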
Text modeling creates a general scheme for describing the elements of a text (by form, format, content, or other categories; each of these, as will be clear below, asks us to think about a text differently), but it is also a way of actively engaging, producing an interpretation. Modeling and interpretation can be perilously iterative, and the creation of metadata can involve innumerable acts of rework. Even when the metadata remains unchanged, its application is neither consistent nor stable. Just as every reading produces a new textual artifact, so any application of metadata or text models enacts a new encounter, performs the text differently.
While every model embodies interpretative attitudes expressed as formal structures, many information structures have graphical analogies and can be understood as diagrams that organize the relations of elements within the whole. A tree structure with its vocabulary of parent-child relationships, for instance, is set up with very distinct assumptions about hierarchy. Matrices, lattices, one-to-many and one-to-one relationships, the ability to "crosswalk" information from one structure to another, to disseminate it with various functionalities for use by broadly varied communities, or to restrict its forms so that it forces a community to think differently about the information: these are all potent features of information architecture.
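A "crosswalk" is, concretely, a mapping between the fields of two schemes, and what the mapping drops is where the interpretive stakes lie. A minimal sketch with invented field names:

```python
# Scheme A (bibliographic) and scheme B (graphical) classify a record differently.
record_a = {"title": "The Waste Land", "author": "T. S. Eliot", "edition": "1922"}

# The crosswalk maps A's fields onto B's; a field with no counterpart drops out.
crosswalk = {"title": "header", "author": "byline"}  # "edition" has no target

record_b = {crosswalk[field]: value
            for field, value in record_a.items() if field in crosswalk}
```

Information survives the crossing only where the two schemes happen to align; the "edition" field simply disappears, which is exactly the kind of silent loss the passage above calls potent.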
All of this work, whether it is the design of a string search or the graphical presentation of a statistical pattern, the creation of a set of metadata fields, tags, or the hierarchical or flat structure of an architecture for data, is modeling. It is all an expression of form that embodies a generalized idea of the knowledge it is presenting. The model is abstract, schematic, ideological, and historical through and through, as well as discipline-bound and highly specific in its form and constraints. It is also, often, invisible. An alphabetical ordering is a model. So is a tree structure or a grid. All have their origins in specific fields and cultural locations. All carry those origins with them as an encoded set of relations that structure the knowledge in the model. Model and knowledge representation are not the same, but the morphology of the model is semantic, not just syntactic. On the surface, a model seems static. In reality, like any "form," it is a provocation for a reading, an intervention, an interpretive act to which it gives rise. These statements are the core tenets of SpecLab's work.
The ideological implications of diagrammatic forms have been neutralized by their origins in empirical sciences and statistics, where the convenience of grids and tables supersedes any critique of the rhetoric of their organization. But are all arrival and departure values in a railway timetable really the same? Is each simply one among many choices in a column where their order emphasizes their similarity over their difference? Is that dark, cold, clammy pre-dawn moment of a 4:45 am departure really commensurate with the bustling activity of a midday holiday afternoon's 3:30 pm train from the same platform? What, in such an example, constitutes the data? Formal organizations ignore these differences within the neutrality of their rational order.
The cultural authority of computing is in part enabled by that neutrality. Likewise, the graphical forms through which information is displayed online are shot through with ideological implications. The grids and frames, menu bars, and metaphors of desktops and windows, not to mention the speed of clicking and the rules that govern display, are all ripe for analysis through a new rhetoric of the screen. The graphic form of information, especially in digital environments, often is the information: design is functionality. Information architecture and information design are not isomorphic. Therein lies another whole domain of inquiry into the rhetorical force of digital media.
When design structures eliminate any trace, or possibility, of individual inflection or subjective judgment, they conform to a model of mathesis that assumes an objective, totalizing, mechanistic, instrumental capability readily absorbed into administering culture. Why is this a problem? Because of the way generalizations erase difference and specificity and operate on assumptions that instrumentalize norms without regard for the situated conditions of use. Collective errors of judgment constitute the history of human
cultures; when the scale at which these can be institutionalized is enabled by electronic communications and computational processing, what is at stake seems highly significant.
The chilling integration of IBM technology into the bureaucratic administration of the extermination machines of the Third Reich provides an extreme example of the horrific efficiency to which managed regimes of information processing can be put. That single instance should be sufficient caution against systematic totalization. Closer (perhaps) to the vicissitudes of daily life is the example of bureaucratic identity confusion brought about in the dark vision of the film Brazil. The grotesqueries wrought by the substitution of B for T, the confusion of the names Tuttle and Buttle on which the film's grim narrative unfolds, are too frequently replayed in the plight of daily information (mis)management of medical, insurance, and credit records. One doesn't have to subscribe to exaggerated fears of a police state to grasp the problematic nature of totalizing (but error-ridden) systems of bureaucracy, or of subscribing to the models on which they gain their authority. The place of subjective inflection and its premise of partial, fragmentary, non-objective, and non-totalizing approaches to knowledge is of a radically different order. The ethics and teleology of subjectivity cannot be absorbed into alignment with totalizing systems. On that point of difference hangs what is at stake in our undertaking. Finding ways to express this in information structures and also in authoring environments is the challenge that led us to speculative computing.
1.2 Speculative computing: basic principles and essential distinctions

With speculative computing, we moved beyond what seemed to be the instrumental, well-formed, and almost standardized business of digital humanities. We used the computer to create aesthetic provocations: visual, verbal, textual results that were surprising and unpredictable. Most importantly, we inscribed subjectivity, the basic point-of-view system from which any and every interpretative and expressive representation is created, into the design of digital environments. To do this, we designed projects that showed position and inflection, point of view as a place within a system, and subjectivity as the marked specificity of individual voice and expression. We wanted to show interpretation, expose its workings. We wanted to force questions of textuality and graphicality to the fore. To do this, we (a smaller core of SpecLab participants) created a series of experimental projects that ranged in their degree of development from proof-of-concept to working platforms for use. But we also created a theoretical and methodological framework.
Our readings and conversations led us to develop a specialized vocabulary borrowed from disciplines that bring issues of interpretation into a new intellectual framework. Most important among these for my development was the influence of radical constructivism in the work of Ernst von Glasersfeld, and the concepts of the second-generation systems theorist Heinz von Foerster. xxxiii From these two sources, and from the work of the biologists and cognitive scientists Francisco Varela and Humberto Maturana, a reformulation of knowledge as experience allowed us to shed the old binarism in which subjectivity is conceived in opposition to objectivity. xxxiv In such a reformulation,
knowledge is always interpretation, and thus located in a perceiving entity whose own position, attitudes, and awareness are all constituted in a codependent relation with its environment. That system is always in flux, and thus has the complex heterogeneous character of a cultural field shot through with forces that are always ideological and historical. Because we are all always simultaneously subjects of history and in history, our cognitive processes are shaped by the continuous encounter with the phenomenal and virtual world such that we constitute that world across a series of shifting models and experiences. These are familiar concepts within cognitive studies and constructivist theories of knowledge, as well as within critical theory (though implementing these ideas within knowledge production environments poses new challenges). Alan MacEachren's synthesis of an information-processing model of visual perception (which sounds far more mechanistic than it is, since it incorporates the constructivist approach touched on above) provided another methodological touchstone. xxxv Integrating these precepts into the design, or at the very least the conception of the design, of digital environments meant to expose models of interpretation was a challenge that may still have eluded our technical grasp, but it motivated the projects at SpecLab from the beginning.
In addition to the basic tenets of co-dependent emergence and constructivist approaches to knowledge, we incorporated ideas of probability from quantum theory and various tenets of the turn-of-the-century French poet Alfred Jarry's 'pataphysics. xxxvi Our sources for creating a probabilistic rather than a mechanistic concept of text came from the work of Heisenberg and Schroedinger, largely through McGann's influence. xxxvii In this move, a text became defined as an n-dimensional field of potentialities, into which a
reading intervened. Adopting their theories of probability and potentiality, we shifted from mechanistic models of text to quantum ones. We conceptualized a "text" not as a discrete and static entity, but as a coded provocation for reading, a field of potentialities. Constrained by those codes, a text is not open to any and all possibilities, yet it is formed anew by each act of interpretative intervention. Here again the echoes of deconstruction are
perceptible, but with the difference that these concerns are shifted into problems of modeling and representing such activities within an electronic space. The n-dimensionality of texts, to use McGann's term again, also engages their social production and the associational matrix through which meaning is produced in a heteroglossic network, combining the dialogic method of Mikhail Bakhtin with influence from the thick readings of a generation of French textual theorists. xxxviii But speculative computing is not a rehash of poststructuralist theory, nor an advanced version of either dialogic or dialectical approaches. Speculative computing is grounded in a serious critique of the mechanistic, entity-driven approach to knowledge that is based on a distinction between subject and object. By contrast, speculative computing proposes a generative, not merely critical, attitude towards knowledge.
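As a toy illustration only, and not a description of SpecLab's actual software, the contrast between the two models of text can be sketched: in a mechanistic model, every access to a text returns the same fixed entity; in a probabilistic one, the text constrains a field of potentials that only an act of reading realizes.

```python
import random

# Mechanistic model: the text is a discrete, static entity.
text_as_entity = "one fixed string, the same on every access"

# Probabilistic sketch: the "text" is a constrained field of potential
# readings; nothing outside the listed options is possible, but which
# option is realized depends on the intervention of a reading.
potential_field = ["read as ironic", "read as sincere", "read as ambivalent"]

def reading(field, seed=None):
    # Each act of reading collapses the field into one realization.
    return random.Random(seed).choice(field)

one_reading = reading(potential_field, seed=1)
```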
My approach to graphical knowledge production came from the semiotic studies of Jacques Bertin and the history of visual languages of form, work by MacEachren, and an extensive and systematic reading of materials in information visualization, the psychology and physiology of vision, and the cultural history of visual epistemology. xxxix This last is an enormously underdeveloped field, one that calls for serious study now in particular, when visualization is becoming so ubiquitous within digital environments. Finally, our reading
of Charles Peirce provided a method of interpretation based in abduction as well as a tripartite theory of signification (a sign stands for something to someone, and does not
operate merely in the formal signifier/signified structure outlined by Ferdinand de Saussure). xl This theoretical foundation provided a platform on which to elaborate a theory of enunciation and subjectivity.
Within our definition of the concept of subjectivity, we considered both structural and inflected modes. McGann's reference for the structural approach is Dante Gabriel Rossetti's idea of "the inner standing point." xli This idea posits subjectivity in a structural way, as the inscription of point of view within the field of interpretation. xlii This maps readily onto linguistic theories of enunciation, and the contrast of speaking and spoken subjects within a discursive field. xliii The second aspect of subjectivity is that of inflection, the marked presence of affect and specificity or difference that inheres in material expressions. This crucial feature of subjectivity is registered as the trace of difference. I once referred to this as the "aesthetic massage co-efficient of form." Though this was a deliberately wry and overwrought phrase, its compact density contains a real description. The term "massage" refers to inflection, differential traces, as a "co-efficient" or property of "forms," i.e., material expressions. Associating this with aesthetics links the concept to perception, or the idea of knowledge as experience. All of these concepts became working keywords for us, shorthand for our elaborate conversations: subjectivity, the inner standing point, inflection, graphical knowledge production, aesthetics, and experiential rather than totalized approaches. I've referred to this attitude as "post-Cartesian" to indicate the leap from subject/object distinctions and the mind/body split to a re-conceptualization beyond such binarisms. In a post-Cartesian frame, subjectivity is not in opposition to objectivity. Subjectivity describes the co-dependent condition of situated
knowledge production informed by post-structuralist and deconstructive criticism, second-generation systems theory, probabilistic approaches, and radical constructivism.
So while speculative computing builds on certain competencies developed in digital humanities, its theoretical polemic overturns the latter's premises in many respects. Our focus at SpecLab has been on issues of interpretation and aesthetic provocation. We're trying to push against, rather than conform to, the logical constraints imposed by digital media. The humanistic impulse has been strong in its dialogue with "informatics" and "computing." But until now it has largely conformed to the agenda-setting requirements set by computational environments. The many advantages of this have been clear. Scratch a digital humanist and they'll tell you everything they've learned by being subjected to the intellectual discipline of a field grounded in formal logic (even if, as in the case of the mark-up debates described above, they disavow the implications). xliv But speculative computing, as we are using it, defines an approach based on interpretation and subjectivity that deliberately challenges the authority of such formality. For anyone familiar with digital humanities, this reads as a radical move. But to understand how radical, the contrast has to be sketched explicitly.
By now, the basic elements of digital humanities should be clear, in particular, to repeat what was stated above, that the crux of such work was a willingness to engage with the task of disambiguation required to process any information in digital form. xlv This pushes vague, fuzzy, often unarticulated assumptions up against stringent limits. Digital environments demand that humanists make explicit the usually implicit assumptions on which they do their work. The job of explaining what we do to a "machine" (the quaint colloquial term by which computational and digital technologies are identified in
common parlance), in step-by-step procedures, leaves no room for judgment calls or ambiguity of any kind. An intense self-reflexivity results. Even the most apparently simple task – naming or classifying – is immediately revealed as a complex interpretive act. Basic categories of textual activity – the title of a work, name of an author, place or time of publication – suddenly reveal their un-categorizable nuances and complications. Some of these are technical (terms in translation vs. transliteration, problems of orthography or nomenclature). But some are conceptual: what constitutes the "work" in a multi-versioned piece with many aspects to it, some in the form of notes, images, scrawls, publications, manuscripts, corrected proofs, or mentions in correspondence?
To summarize, the basic accomplishments of digital humanities have been to establish technical protocols on which to process texts in a meaningful way and to use various tools of quantitative analysis, pattern recognition, and stylometrics. The benefits of aggregation and large-scale data processing are immediately apparent to those involved. Other benefits include the ability to serve up original materials in facsimile form online, to create linked and hyper-linked text bases and reference materials, and to aggregate much that was peripheral or geographically distributed in a single working environment. Translation, searching, collation, and other methods of text manipulation have been automated with different degrees of success and/or useful failure.
The extent to which the method of digital humanities embodies assumptions about and thus constrains the objects of its inquiry is apparent when we look at the terms that delimit its principles: calculation, computation, processing, classification, and electronic communication. Each deserves a momentary gloss to establish the distinctions between digital humanities and speculative computing.
Calculation is based on numerical information and the ability to perform certain functions that can be readily automated through mechanical and digital means. Calculation is limited by its inability to represent any content other than quantitative values. Charles Babbage's 19th-century devices, calculators, and office machines are not computers, only automated calculators. xlvi
Computation links automated processing to the symbolic realm. Computation makes use of signs whose values can represent any information, even though their ability to be manipulated through a fixed set of protocols is determined by a succinct formal logic. The addition of an extra level of articulation in coding symbolic values onto binary entities that could be processed electronically made the leap from automated calculation to computation possible. This leap disconnects semantic values (what symbols mean) from their ability to be processed (as elements in a formal system). Information, as Claude Shannon famously demonstrated, is content neutral. xlvii
Digital processing enacts that logic through step-by-step algorithmic procedures, many specified by programming languages and their different functionalities and syntax. Alan Turing's design for a universal computer transformed this basic capability into an inexhaustible computational engine. xlviii
Classification systems build a higher-order linguistic signifying structure on that formal base. The use of digital surrogates (themselves digital artifacts inscribed in code) reinforces the disposition to imagine that all code-based objects are self-evident, explicit, and unambiguous. Schemes for nomenclature and organization come out of library sciences and information management, as well as long traditions of typologies developed in the encyclopedic and lexicographic traditions across the sciences and humanities. xlix
Electronic communication approaches assume information functions in a transmission mode – encoded, stored, and then output. The fungibility of information and the function of noise are certainly taken into account in theoretical discussions, even in transmission-based and highly technical approaches to electronic communication, such as those of Shannon and his collaborator, Warren Weaver. l Bit by byte, the digital approach reinforces a mechanistic understanding of communication and representation. Knowledge in this context becomes synonymous with information, and information takes on the character of that which can be parameterized through an unambiguous rule set.
This basic terminology is premised on the cultural authority of code and an engineering sensibility grounded in problem-solving. Because it has to be used for functionality, the "unquestionable" nature of the code base of computational activity is full of ideological agendas. Most fundamental? The formal logic required becomes naturalized – not only as a part of the technical infrastructure, but as a crucial feature of the intellectual superstructures built to function on it. I've reiterated this several times because this is the crux of our motivation to differentiate SpecLab intellectually from digital humanities.
Now, of course, not all digital humanists take this literalness without question. Considerable self-reflexive thought about objects of study has pulled theoretical philosophers of all stripes back into discussions of digital projects. li Calling assumptions into question is the name of the digital epistemological game as much as it is the standard of conference papers and publications elsewhere in the humanities. Likewise, the study of artifacts in a digital environment is just as apt to prompt discussions of race/class/gender
in hegemonic practices, critiques of imperialism and cultural authority, power relations and disciplinary measures, as is scholarly research in a conventional key. In that regard, SpecLab is part of a larger critical phenomenon, even if its approaches are deliberately pitted against certain aspects of the base on which it builds. lii But our focus is not on the artifacts themselves, but on the design of the environments and the way ideological assumptions built into their structure and infrastructure perpetuate unexamined concepts.
Speculative computing distinguishes itself from digital humanities on the basis of its sources of inspiration and intellectual traditions. If digital humanities is grounded in an epistemological self-consciousness through its encounter with disambiguation, speculative computing is driven by a commitment to interpretation-as-deformance in a tradition that has its roots in parody, play, and critical methods such as those of the Situationists, OuLiPians, and the longer tradition of 'pataphysics, with its emphasis on "the particular" instead of "the general." liii Speculative computing torques the logical assumptions governing digital technology. It pushes back in the dialogue between the humanities' modes of interpretation and code-based formalism. Obviously, any activity functioning in a digital environment continues to conform to the formal logic of computational instruments. But the questions asked are fundamentally different within the theoretical construct of speculative computing. This schematic overview charts the points of comparison:
DIGITAL HUMANITIES                            SPECULATIVE COMPUTING
1. Information technology / formal logic      1. 'Pataphysics / the science of exceptions
2. Quantitative methods                       2. Quantum interventions
   (problem-solving approaches,                  (imagining what you do not know,
   practical solutions)                          imaginary/imaginative solutions)
3. Self-identical objects/entities            3. Autopoiesis / constitutive or configured identity
   (subject/object dichotomy)                    (co-dependent emergence)
4. Induction/deduction                        4. Abduction
5. Discrete representations                   5. Heteroglossic processes
   (static artifacts)                            (intersubjective exchange / discourse fields)
6. Analysis/observation                       6. Subjective deformance/intervention
   (mechanistic)                                 (probabilistic)
The digital humanities community has been concerned with the creation of digital tools in humanities contexts. The emphasis in speculative computing is the production of humanities tools in digital contexts. We are far less concerned with making devices to do things – sort, organize, list, order, number, compare – than with creating ways to expose any form of expression (book, work, text, image, scholarly debate, bibliographical research, description, or paraphrase) as an act of interpretation (and any interpretive act as a subjective deformance).
Now, here is a brief explanation of the binary pairs in the chart above:
1. Information/'Pataphysics: The use of information technology in digital humanities supported development of tools for corpus linguistics: counting, measuring, and thinking differently about the instances and contexts of word use. Other tools allow for searching, storing, retrieving, classifying, and then visualizing data. All of these procedures are based on assumptions of the self-identity of the object, rather than its co-dependence on those processes. Computational processes constitute their objects of inquiry just as surely as historical, scientific, or other critical methods do. Informatics is based on standard, repeatable mathematical and logical procedures. Quantitative methods normalize their data in advance. These assume a norm and a system that conforms to standardizable rules. Information conforms to statistical methods.
By contrast, 'pataphysics derives from the study of exceptions and anomalies in all their specificity and non-standard form. 'Pataphysical methods take up the outliers and deviations that escape statistical procedures. Only a punning method suffices, thus the invention
of our term "'patacritical." If norms, means, and averages govern statistics, then sleights, swerves, and deviation have their way in the 'pataphysical game. Adopting a 'patacritical method is not an excuse for random chaos or abandonment of intellectual discipline. It calls for attention to individual cases without assumptions about the generalization to be drawn. In short, it takes exceptions as rules that constitute a system de facto, even if repeatability and reliability cannot be expected. Deviation from all norms and constant change dictate that the exception will always require more rules. liv Such an approach privileges bugs and glitches over functionality. Not necessarily useful in all circumstances, exceptions are valuable to speculation in a substantive, not trivial, sense.
2. Quantitative method/quantum intervention: The idea of interpretation as a quantum intervention is based on the insistence that any act of reading, looking, or viewing is by definition a production of a text/image/work. For this to be the case, the work under investigation can't be conceived as static, self-identical, or reliably available to quantitative methods. Instead, speculative methodology is grounded in a quantum concept of a work as a field of potentiality (poetentiality might be a better term). The act of reading/viewing is an intervention in the field, a determining act that precipitates a work. lv The spirit of indeterminacy goes against the engineering sensibility. Problem-solving methods do not apply in a quantum field. Practical solutions have no bearing on the exposure of interpretation-as-intervention. Humanistic research takes the approach that a thesis is an instrument for exposing what one doesn't know. The 'patacritical concept of imaginary solutions isn't an act of make-believe but an epistemological move, much closer to the making-strange of the early 20th-century avant-garde. It forces a
reconceptualization of premises and parameters, not a reassessment of means and outcomes.
3. Self-identicality/co-dependent emergence: Another tenet of the speculative approach, common sense once it is familiar but oddly disorienting at first glance, is that no object (text, image, data, quantity, or entity) is considered self-identical. McGann cites philosopher George Spencer-Brown's Laws of Form, A=A if and only if A≠A, as the source for this formulation within the field of logic. lvi The post-structuralist critical tradition is premised on the analysis of contingency. Every text is made as an act of reading and interpretation. lvii When this is combined with recent theories in cognitive studies and radical constructivist psychology, it returns the interpretive act to an embodied, situated condition, and the object of inquiry becomes a constituted object, not an a priori thing.
Maturana and Varela's theory of autopoiesis, or co-dependent emergence between entity and system, also changes mechanistic concepts of subject and object relations into a dynamic, systems-based reconceptualization. lviii Subject and object are not discrete but interrelated and co-dependent in an autopoietic description. In this account, an entity's identity (whether it is an organism or an object of intellectual inquiry) emerges in a codependent relation with its conditions, not independent of them. The conventional distinctions of subject and object are not blurred; rather, the ground on which they can be sustained disappears because there is no figure/ground, subject/object dichotomy, only a constituting system of co-dependent relations. The subject/object dichotomy that structures text/reader relations was as mechanistic as Newtonian physics. To reiterate the
quantum method cited above, and integrate it here: the intervention determines the text. The act of reading calls a text (specific, situated, unique) into being.
4. Induction/Abduction: Under such circumstances, scientific techniques of induction and deduction don't work. They are structured and structural, assuming self-identicality in themselves and for their objects, and inscribe relations of causality and hierarchy. By contrast, methods of comparative abduction, taken from Charles Peirce, do not presume that a logical system of causal relations exists outside the phenomena and supplies an explanation for their relations. Configured relations simply (though this is far from simple) produce semantically and syntactically structured effects independent of grammatical systems. The contingently configured condition of form – a hefty phrase indeed – points to the need to think about forms as relations rather than entities. This requires rewiring for the Anglo-analytic brain, accustomed as it is to getting hold of the essence and substance of intellectual matter. Even the mechanisms of the dialectic tend to elude the empiricist sensibility, though it has the benefit of a device-driven process and the hierarchy of thesis, antithesis, and resolution into a higher-order synthesis. Abduction adheres to no single set of analytic procedures. Every circumstance produces its own logic as description rather than explanation.
5. Discrete/heteroglossic: Traditional humanistic work assumes its object. A book, poem, text, image, or artifact, no matter how embedded in social production or psychoanalytic tangles, is usually assumed to have a discrete, bounded identity. Our emphasis was on the co-dependent nature of that identity. The fiction of the "discrete" object is exposed as a function of its relation to a larger discourse field. In this conception, a book is a sort of snapshot, an instantiation, a slice through the production
and reception histories of the text-as-work. The larger field of its composition and distribution comprises what we call the discourse field. This includes its many iterations and versions, as well as its multi-dimensional history of readings and responses, emendations and corrections, changes and incidental damages. A discourse field is indeterminate, not random, chaotic, or fixed, but probabilistic. It is also social, historical, rooted in real and traceable material artifacts, all of which comprise that field.
The concept of the discourse field draws directly on Mikhail Bakhtin's heteroglossia and the dialogic notion of a text. lix Heteroglossia tracks language into a semantic field and an infinite matrix of associations. It makes shifts, links, and connections across historical and cultural domains. Behind every word is another word, from every text springs another, and each text/word is reinvigorated and altered in every instance of production (including every reading, citation, and use). These associations are constantly remade, not quantifiable and static in their appearance, meaning, or value.
The dialogic approach underpins a theory of media as nodes and sites of inter-subjective exchange (rather than as things-in-themselves). All texts and artifacts are always the site of inter-subjective exchange. A film, poem, photograph, or other human expression is a work made by someone, spoken to and for others, from a situated and specific position of subjectivity, to provoke response from a viewer or reader in their own conditions and circumstances. The work is not an inert or fixed text or image. Its very existence as a work depends upon this inter-subjective activity. Any artifact is a provocation for a reading, no matter how stable it appears to be in print or screen display. These artifacts are media of exchange in social space, instruments for the creation of value through interpretative activity, not information-delivering systems. lx
6. Analysis/deformance: Finally, speculative computing draws on critical and theoretical traditions to describe every act of interpretation as deformative. That is, it is not merely performative, bringing an inert text to life, but deformative, a generative act of production, not reproduction or replay. It makes the work anew, each time, whether in a reading of a text on a page, in the ear, or on the screen. Interpretation is deformance, an act like Situationist détournement, charged with social and political mobility, but also aesthetic mutation and transformation.
The theoretical agenda of speculative computing may seem thick with unsupported claims, but it supplies a brief for project development. Our thinking developed with the projects, not in advance, and the relation between theoretical work and practical design was and remains generative and fluid, necessarily so. Speculative computing takes seriously the destabilization of all categories of entity, identity, object, subject, interactivity, process, or instrument. In short, it rejects mechanistic, instrumental, and formally logical approaches. It replaces them with concepts of autopoiesis (contingent interdependency), quantum poetics and emergent systems, heteroglossia, indeterminacy and potentiality (or, poetentiality), intersubjectivity, and deformance. Digital humanities is focused on texts, images, meanings, and means. Speculative computing is engaged with interpretation and aesthetic provocation. Like all computational activity, it is generative (involved with calls, instructions, encoding), iterative (emergent, complex, non-linear, and non-causal), intra- and intersubjective (reference frames, issues of granularity, chunking), and recursive (repeating but never identical deformances).
Speculative computing struggles for a critical approach that does not presume its object in advance. It lets go of the positivist underpinnings of the Anglo-analytic mode of
epistemological inquiry. It posits subjectivity and the inner standing point as the site of interpretation. It replaces the mechanistic modes of Saussurean semiotics, with their systems-based structures for value production, with a Peircean semiotics that includes a third term (a sign represents something to someone for some purpose). It attends to the moment of intervention as determinative of a phenomenon within a field of potentiality. It attempts to open the field of discourse to its infinite and peculiar richness as deformative interpretation. Is it different from digital humanities? As night from day, text from work, and the force of controlling reason from the pleasures of delightenment.
i See http://www.aclweb.org/ and its publications for work in computational linguistics.
ii For a few pioneers, see Blake, www.blakearchive.org/blake/main.html; Perseus, www.perseus.tufts.edu/About/grc.html; Rossetti, www.rossettiarchive.net; Peter Robinson, www.canterburytalesproject.org/CTPresources.html; Hoyt Duggan, www.iath.virginia.edu/piers/; and the Women Writers Project and other work of the Scholarly Technology Group at Brown, http://www.stg.brown.edu/staff/julia.html.
iii Kathryn Sutherland, ed., Electronic Text (Oxford: Clarendon Press, 1997).
iv For instance: William J. Mitchell, City of Bits: Space, Place, and the Infobahn (Cambridge: MIT Press, 1995).
v Robert J. Stainton, Philosophical Perspectives on Language (Peterborough, Ont.: Broadview Press, 1996) and W.G. Lycan, Philosophy of Language: A Contemporary Introduction (New York: Routledge, 2000).
vi Bertrand Russell, Alfred North Whitehead, the young Ludwig Wittgenstein, Rudolf Carnap, Gottlob Frege, and W.V. Quine are the outstanding figures in this tradition.
vii For some of the early systematic thinking, see: Marvin Minsky, The Society of Mind (NY: Simon and Schuster, 1986) and, with Seymour Papert, Artificial Intelligence (Eugene: Oregon State System of Higher Education; distributor: Distribution Center, University of Oregon, 1973); writings by Herbert Simon, such as Representation and Meaning: Experiments with Information Processing Systems (Englewood Cliffs, NJ: Prentice Hall, 1972); Terry Winograd, Artificial Intelligence and Language Comprehension (Washington: U.S. Dept. of Health, Education, and Welfare, National Institute of Education, 1976); Hubert Dreyfus, What Computers Can't Do (NY: Harper and Row, 1979); even Richard Powers, Galatea 2.2 (NY: Farrar, Straus, and Giroux, 1995). Daniel Crevier, AI: The Tumultuous History of the Search for Artificial Intelligence (NY: Basic Books, 1993) is still a useful introduction to the emergence of crucial fault lines in this area.
viii For a range of imaginings on this topic: Philip K. Dick, Do Androids Dream of Electric Sheep?; Rudy Rucker, Software (Eos, 1987); Bill Joy, "Why the Future Doesn't Need Us," www.wired.com/wired/archive/8.04/joy.html; Donna Haraway, Simians, Cyborgs, and Women (NY: Routledge, 1991); and Chris Hables Gray and Steven Mentor, The Cyborg Handbook (NY: Routledge, 1995).
ix For an extreme and then more tempered view, see first Arthur Kroker, Digital Delirium (NY: St. Martin's Press, 1997) and then Wendy Chun, op. cit. The writings of
Critical Art Ensemble, www.critical-art.net/, synthesize critical philosophy and digital culture studies.
x Howard Besser, published works and online references: http://besser.tsoa.nyu.edu/howard/#standards
xi For an overview of the field from varying perspectives, see: A Companion to Digital Humanities, edited by Susan Schreibman, Ray Siemens, and John Unsworth (Malden and Oxford: Blackwell, 2004).
xii The EText Center at the University of Virginia is a useful example: http://etext.virginia.edu/.
xiii Barbara Stafford, Good Looking (Cambridge: MIT Press, 1996) is a striking demonstration of the extent to which logo-centrism prevails in academic work. Her arguments, from the point of view of artists, art historians, or practitioners of visual knowledge production, seem so obvious as to be unnecessary, and yet, for textual scholars, they posed a revelation that seemed to challenge basic assumptions, at least in some quarters. See also Franco Moretti, "Graphs, Maps, and Trees," New Left Review, no. 24, November-December 2003, the first of three articles. http://www.newleftreview.net/Issue24.asp?Article=05
xiv For discussions of Codework, start with the exchanges between Rita Raley and John Cayley, www.electronicbookreview.com/thread/electropoetics/codology, and discussions on www.poemsthatgo.com/ideas.htm.
xv Michael Heim, Electric Language (New Haven: Yale University Press, 1987).
xvi George Landow, Hypertext (Johns Hopkins University Press, 1992) and J. David Bolter, Writing Space (L. Erlbaum Associates, 1991).
xvii Susan Hockey, Electronic Texts in the Humanities (Oxford: Oxford University Press, 2000), p. 5.
xviii Hockey, op. cit., is an extremely useful, objective introduction to the field and its history. In addition see: the Companion edited by Schreibman, Siemens, and Unsworth, cited above; Willard McCarty, Humanities Computing (Hampshire and NY: Palgrave, 2005); and Elizabeth Bergmann Loizeaux and Neil Fraistat, Reimagining Textuality (Madison: University of Wisconsin Press, 2002).
xix See www.tei-c.org.
xx Jerome McGann, "Texts in N-dimensions and Interpretation in a New Key," expands this compact list through a broader discussion that summarizes our work at SpecLab from his perspective. (www.humanities.mcmaster.ca/~texttech/pdf/vol12_2_02.pdf)
xxi Michael Day's paper on metadata and digital preservation, http://www.ukoln.ac.uk/metadata/presentations/ecdl2001-day/paper.html, and this discussion of metadata, http://digitalarchive.oclc.org/da/ViewObjectMain.jsp, will provide some useful starting points.
xxii Henry Kucera and Nelson Francis's Computational Analysis of Present-Day American English (1967) is considered a turning point for this field; see also the book series Studies in Corpus Linguistics, Elena Tognini-Bonelli, general editor, and the professional journal Corpora.
xxiii Adam Mathes, "Folksonomies - Cooperative Classification and Communication Through Shared Metadata," is a useful introduction. http://www.adammathes.com/academic/computer-mediated-communication/folksonomies.html
xxiv For a recent summary and overview: http://jamesthornton.com/search-engine-research/
xxv For updates on data mining: http://dataminingresearch.blogspot.com/
xxvi That distinction, more than any other, differentiates natural and formal languages and, not surprisingly, demarcates work in the field of artificial intelligence from that of cognitive studies, for instance. The most striking example of recognition of this distinction is the difference between the attitudes with which Ludwig Wittgenstein approached the study of language in his Tractatus and in Philosophical Investigations. That shift registers the recognition that the project of totalized, objectified, formalistic approaches to language was a failure, and the subsequent realization that only use, situated and specific, could provide insight into the signifying capabilities of language.
xxvii Jerome McGann, A Critique of Modern Textual Criticism (Chicago: U. of Chicago Press, 1983), and also McGann, Black Riders: The Visible Language of Modernism (Princeton: Princeton University Press, 1993); Randall McLeod, lectures on "Material Narratives" (http://www.sas.upenn.edu/~traister/pennsem.html); Steve McCaffery and bp nichol, Rational Geomancy (Vancouver: Talonbooks, 1992); Johanna Drucker, The Visible Word: Experimental Typography and Modern Art 1909-1923 (Chicago: U. of Chicago Press, 1994); and essays by Nick Frankel, McLeod, Peter Robinson, Manuel Portela et alia in Marking the Text, edited by Joe Bray, Miriam Handley, and Anne C. Henry (Aldershot, Birmingham, Singapore and Sydney: Ashgate, 2000).
xxviii Journal of Functional Programming 13 (2003): 257-260, Cambridge University Press, special issue on "Logical Frameworks and Metalanguages."
xxix Jerome McGann, Radiant Textuality (NY and Hampshire: Palgrave, 2001), especially "Chapter 5: Rethinking Textuality," pp. 137-166.
xxx Allen Renear, "Out of Praxis: Three (Meta)Theories of Textuality," in Sutherland, ed., op. cit., and Allen Renear, Steve De Rose, David Durand, and Eli Mylonas, "What is Text, Really?" Journal of Computing in Higher Education 1, no. 2 (Winter 1990).
xxxi Probably the most significant critique of markup was Dino Buzzetti's paper, "Text
Representation and Textual Models" (http://www.iath.virginia.edu/ach-
allc.99/proceedings/buzzetti.html), which brought much of that debate to a close by
pointing out another important issue: the insertion of tags directly into the text creates a
conflict between the text as a representation at the level of discourse (to use the classic
terms of structuralist linguistics) and the level of reference. Putting content markers into
the plane of discourse as if they were marking the plane of reference was a flawed
practice (a tag that identifies the semantic value of the text relies on reference, even
though it is put directly into the character string). Markup, in its very basic activity, thus
embodies a contradiction: it collapses two distinct orders of linguistic operation in a most
confused and messy way. Buzzetti's argument demonstrated that XML itself was built on
this confused model of textuality. Thus the foundation of metatextual activity was itself
the expression of a model, even as the metatexts embodied in markup schemes were
modeling the semantic content of text documents.
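Buzzetti's point can be illustrated with a small, hypothetical TEI-style fragment (the element and attribute names below are invented for illustration, not drawn from any actual project): the tag sits inside the character string, on the plane of discourse, while its attribute makes a referential claim about what the text means.

```xml
<!-- Hypothetical fragment: the element's content belongs to the character
     stream of the text itself (the plane of discourse), while the "type"
     attribute asserts an interpretation of that content (the plane of
     reference). The tag thus occupies the very stream it purports to
     describe -- the collapse Buzzetti identifies. -->
<line n="1">Ah <emph type="lament">Sun-flower</emph>, weary of time</line>
```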
xxxii Holly Shulman, Dolley Madison Digital Edition, University of Virginia Press,
www.upress.virginia.edu/books/shulman.html.
xxxiii Heinz von Foerster, Observing Systems (Salinas, CA: Intersystems Publications,
1981); Ernst von Glasersfeld, "An Introduction to Radical Constructivism," in Watzlawick,
P., ed., The Invented Reality: How Do We Know? (New York, NY: W.W. Norton,
1984), pp. 17-40; and Ernst von Glasersfeld, Radical Constructivism: A Way of
Knowing and Learning (London: Falmer Press, 1995).
xxxiv Humberto R. Maturana and Francisco J. Varela, Autopoiesis and Cognition: The
Realization of the Living (Boston: D. Reidel Publishing, 1980).
xxxv Alan MacEachren, How Maps Work (NY and London: Guilford Press, 1995).
xxxvi Alfred Jarry, Exploits and Opinions of Dr. Faustroll, Pataphysician (NY: Exact Change,
199 ); Werner Heisenberg, Philosophical Problems of Quantum Physics (Woodbridge,
CT: Ox Bow Press, 1979); William Charles Price, The Uncertainty Principle and
Foundations of Quantum Mechanics: A Fifty Years' Survey (NY: Wiley, 1977).
xxxvii Jerome J. McGann, "Texts in N-Dimensions and Interpretation in a New Key,"
http://texttechnology.mcmaster.ca/pdf/vol12_2_02.pdf; Heisenberg, op. cit.; Erwin
Schrödinger, Science and Humanism: Physics in Our Time (Cambridge: Cambridge
University Press, 1951) and My View of the World (Cambridge: Cambridge University
Press, 1964).
xxxviii Mikhail Bakhtin, The Dialogic Imagination (Austin: University of Texas Press, 1981).
xxxix Jacques Bertin, The Semiology of Graphics (Madison: University of Wisconsin
Press, 1973); Fernande Saint-Martin, Semiotics of Visual Language (Bloomington:
Indiana University Press, 1990); Paul Mijksenaar, Visual Function (Princeton:
Princeton Architectural Press, 1987); Stephen Kosslyn, Image and Mind (Cambridge:
Harvard University Press, 1980); Richard L. Gregory, Eye and Brain (Princeton:
Princeton University Press, 1990); James Jerome Gibson, The Ecological Approach to
Visual Perception (Hillsdale, N.J.: Lawrence Erlbaum Associates, 1986; originally
published in 1979); Peter Galison, Image and Logic: A Material Culture of Microphysics
(Chicago and London: University of Chicago Press, 1997); Martin Kemp, Visualizations:
The Nature Book of Art and Science (Berkeley: UC Press, 2000); Stephen Wilson,
Information Arts (Cambridge: MIT Press, 2002); David Freedberg, The Eye of the Lynx
(Chicago: University of Chicago Press, 2002); James Elkins, The Domain of Images
(London and Ithaca: Cornell University Press, 1999); David Marr, Vision: A
Computational Investigation into the Human Representation and Processing of Visual
Information (San Francisco: W.H. Freeman, 1982).
xl Charles Peirce, The Philosophical Writings of Charles Peirce, edited by Justus Buchler
(NY: Dover, 1955; republication of the 1940 edition, The Philosophy of Peirce:
Selected Writings, London: Routledge and Kegan Paul); Ferdinand de Saussure,
Course in General Linguistics, edited by Charles Bally and Albert Sechehaye (London:
Duckworth, 1983).
xli Jerome McGann, Dante Gabriel Rossetti and the Game That Must Be Lost (New
Haven: Yale University Press, 2000).
xlii Gérard Genette, Nouveau Discours du Récit (Paris: Editions du Seuil, 1983); Paul
Smith, Discerning the Subject (Minneapolis: University of Minnesota Press, 1988).
xliii Genette, op. cit.; Émile Benveniste, Problèmes de linguistique générale (Paris:
Gallimard, 1966-84).
xliv Schreibman, Siemens, Unsworth, op. cit.
xlv Research in Humanities Computing: Selected Papers from the ALLC/ACH Conference,
Association for Literary and Linguistic Computing (Clarendon Press, Oxford, from 1991
ongoing), and Schreibman, Siemens, Unsworth, op. cit., as excellent starting points for the
critical discourse of digital humanities over the last fifteen years.
xlvi Charles Babbage, Charles Babbage and His Calculating Engines (NY: Dover, 1961).
xlvii Claude Shannon, "A Mathematical Theory of Communication," Bell System
Technical Journal (1948), http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html.
xlviii Alan Turing, "On Computable Numbers, with an Application to the
Entscheidungsproblem," Proceedings of the London Mathematical Society, Series 2,
Volume 42 (1936); Martin Davis, The Universal Computer (W.W. Norton, 2000).
xlix John Comaromi, Dewey Decimal Classification: History and Current Status (New
Delhi: Sterling, 1989); Wayne Wiegand, "The 'Amherst Method': The Origins of the
Dewey Decimal Classification Scheme," Libraries & Culture 33, no. 2 (Spring 1998),
http://www.gslis.utexas.edu/~landc/fulltext/LandC_33_2_Wiegand.pdf; Fritz Machlup,
Information: Interdisciplinary Messages (NY: Wiley and Sons, 1983).
l Warren Weaver and Claude Shannon, The Mathematical Theory of Communication
(Urbana, Illinois: University of Illinois Press, 1949; republished in paperback 1963).
li Alan Liu, The Laws of Cool: Knowledge Work and the Culture of Information (Chicago:
University of Chicago Press, 2004); Willard McCarty, Humanities Computing (London:
Palgrave Macmillan, 2004); Lev Manovich, The Language of New Media (Cambridge,
MA: MIT Press, 2001); Stephen Wilson, Information Arts: Intersections of Art, Science,
and Technology (Cambridge: MIT Press, 2002); William J. Mitchell, The Reconfigured
Eye: Visual Truth in the Post-Photographic Era (Cambridge: MIT Press, 1994). This
initial list crosses disciplines. All are works that discuss the dialogue between
formal reasoning and cultural issues in representation and knowledge.
lii As we have seen, digital humanities draws heavily on the traditions of René Descartes,
Gottfried Leibniz (mathesis universalis), and the 19th-century algebra of George Boole's
tellingly named Laws of Thought. It has direct origins in the logical investigations of
language by Gottlob Frege, Bertrand Russell, the early Ludwig Wittgenstein (during his
youthful, mathematical optimism), and the extended legacy of formalist approaches to
natural language in the work of Noam Chomsky. As also mentioned above, models of
mind-as-computer dominated early cybernetic thought. Norbert Wiener's feedback loops
had the desired aim of making human behavior conform to the perceived perfection of a
machine. True, systems theory has refined Wiener's original model considerably.
AI debates have moved into the realms of cognitive studies, away from rule-based
notions of programmable intelligence or even bottom-up, neural-net, experiential
modeling. But complexity theory and chaos models are often only higher-order
formalisms, with claims to totalizing explanations and descriptions, not refutations of
formal logic's premises. Stephen Wolfram's totalizing explanation of a combinatoric and
permutational system is only one extreme manifestation of a pervasive sensibility.
Mathesis still undergirds a persistent belief shared by some humanities digerati with
closet aspirations to total control of all knowledge-as-information.
liii Jarry, op. cit.; Warren Motte, Oulipo: A Primer of Potential Literature (Normal, IL:
Dalkey Archive Press, 1998); and Harry Mathews, Oulipo Compendium (London: Atlas
Press, 1998).
liv One undergraduate who worked with us created a game in Gnomic, which is entirely
about rule-making. http://www.gnomic.com (9/26/2006).
lv McGann, "Texts in N-Dimensions and Interpretation in a New Key," op. cit.
lvi McGann, ibid., cites George Spencer-Brown, The Laws of Form (NY: Julian Press,
1972).
lvii This is garden-variety post-structuralism and deconstruction, from the lessons of
Roland Barthes, S/Z (Paris: Editions du Seuil, 1970), Image/Music/Text (NY: Hill and
Wang, 1977), and Mythologies (Paris: Editions du Seuil, 1957); Jacques Derrida, Of
Grammatology (Baltimore: Johns Hopkins University Press, 1976) and Writing and
Difference (Chicago: University of Chicago Press, 1978); and Paul de Man, Blindness
and Insight (New York: Oxford University Press, 1971) and Allegories of Reading (New
Haven: Yale University Press, 1979).
lviii Glasersfeld, op. cit.; von Foerster, op. cit.; Maturana and Varela, op. cit.; Norbert
Wiener, Cybernetics, or Control and Communication in the Animal and the Machine
(New York, NY: Wiley, 1948).
lix Bakhtin, op. cit.
lx Cultural theorists of media, from Harold Innis and James Carey through Kaja
Silverman, Janice Radway, and Dick Hebdige, stress the constitutive rather than
instrumental function of mediating systems. James Carey, "A Cultural Approach to
Communication," Chapter 1 of Communication as Culture (Boston: Unwin Hyman, 1989).