#hypertext

History of Hypertext and Hypertext Literary Theory

Dylan Kinnett

There must be a more informed hypertext fiction, one that corresponds appropriately to the nature of its medium and to the medium’s users. These questions need to be addressed: What is hypertext fiction, and what is it not? Where has it been over the course of its history, and where should it go? And, most important, is it art?

The idea of hypertext literature has been proposed and discredited. It has been discussed and debated much more frequently, it seems, than it has actually been put into practice. Hypertext literature is created on occasion but meets with critical disagreement, and it seems the critics are the only people reading.

That previous discourse, the theses and the antitheses, can be surveyed with a mind toward synthesis. In order to understand the nature of the proposed hypertextual medium for literature, it will be fruitful to examine the development of hypertext itself, which occurred largely within a realm far removed from literature. Hypertext has moved from the realm of pure theory, from being a tool for the educated echelons, to its current state: mostly an advertising wasteland masquerading as mass communication.

Internet technology was originally used for government and academic purposes. Sets of specialized information were brought together in the form of cross-referenced networks. Academics soon brought their critical attention to the possible structures of information. How can information be arranged into more useful structures? What are the implications of these new structures?

Hypertext, the electronic text medium, is considerably unlike conventional text. It allows a structure in which pieces of information can be linked to other pieces, which are in turn linked to still others, and so on. This system offers certain structural possibilities for the composition of new texts, as well as for the arrangement of old ones. These possibilities were postulated by academics who saw potential in hypertext for their theories about the structure of ideas. The development of hypertext has allowed authors to arrange their compositions in ways that were previously impossible.
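To make the structural difference concrete, the sketch below models a hypertext as a set of text chunks joined by links, in the spirit of the “text chunks connected by links” described below. It is a minimal, hypothetical illustration only; the node names and passages are invented for the example, not drawn from any particular system.

```python
# A minimal, hypothetical sketch: each "node" holds a chunk of text,
# and its links name other nodes. The reading order is not fixed in
# advance; it emerges from the reader's choices at each node.

nodes = {
    "start":  {"text": "An opening passage.", "links": ["aside", "middle"]},
    "aside":  {"text": "A digression.",       "links": ["middle"]},
    "middle": {"text": "A second passage.",   "links": ["start", "end"]},
    "end":    {"text": "One possible close.", "links": []},
}

def read(path):
    """Print the text encountered along one reader-chosen path."""
    for name in path:
        print(nodes[name]["text"])

# Two readers, two different texts assembled from the same chunks.
read(["start", "middle", "end"])
read(["start", "aside", "middle", "end"])
```

Two readers who follow different links assemble two different texts from the same material, which is the structural possibility the academics saw in the medium.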

The collusion of technological innovation with developments in communication philosophy has resulted in hypertext theory. The proponents and detractors of hypertext fiction can be more easily understood if we first consider the perspective from which hypertext theory originally developed.

Theodor H. Nelson put the word “hypertext” together in the 1960s, well before the general public had grasped the concept to which it refers. Nelson defines hypertext as “non-sequential writing – text that branches and allows choices to the reader, best read at an interactive screen. As popularly conceived, this is a series of text chunks connected by links which offer the reader different pathways” (Nelson 2).

The concept of hypertext is much older than its name, though. Two of the most notable events that led to hypertext’s development are a speech and an essay. Ralph Waldo Emerson gave the speech in 1837, and an American scientist named Vannevar Bush wrote the essay, “As We May Think,” which was published in the July 1945 issue of The Atlantic Monthly. Both call “for a new relationship between thinking man and the sum of our knowledge” (Bush 101).

Writing as World War II drew toward its close, when American science had been devoted extensively to developments for the war effort, Vannevar Bush proposes that future progress depends upon, and should “implement the ways in which man produces, stores, and consults the record of the race” (Bush 104). He argues this based on the fact that the complete store of information is growing more complex and specialized. By the middle of the 20th century, specialized information was not necessarily accessible even to the few specialists capable of understanding it. Further complicating the accessibility of information was the fact that all of it was in print, occupying so much physical space and so many different spaces. Bush urges developments toward changing that. In addition, Bush argues for a new structure for the information which is to be stored in new ways:

Our ineptitude in getting at the record is largely caused by the artificiality of systems of indexing. When data of any sort are placed in storage, they are filed alphabetically or numerically, and information is found (when it is) by tracing it down from subclass to subclass. It can be done in only one place…. Having found one item, moreover, one has to emerge from the system and re-enter on a new path.

The human mind does not work that way. It operates by association. With one item in its grasp, it snaps instantly to the next that is suggested to it by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain… Selection [of information] by association, rather than indexing, may yet be mechanized (106).

“As We May Think” develops the subject of the mechanized indexing of information, arranged by association.
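A rough sketch may help to contrast the two retrieval styles Bush describes: tracing an item down through subclasses versus following associative trails from item to item. The data and names below are hypothetical, invented purely for illustration, and do not represent the memex design itself.

```python
# Hypothetical illustration of Bush's contrast, not his memex design.

# Indexing by classification: an item lives in exactly one place,
# reached by descending from class to subclass.
index = {
    "science": {
        "physics": {"optics": "a paper on lenses"},
        "biology": {"genetics": "a paper on heredity"},
    },
}
item = index["science"]["physics"]["optics"]  # one path, one place

# Selection by association: any item may point directly to any other,
# forming the "trails" of linked items Bush imagines being mechanized.
trails = {
    "a paper on lenses": ["notes on the eye", "notes on telescopes"],
    "notes on the eye": ["a paper on heredity"],
}
related = trails["a paper on lenses"]  # many possible next steps

print(item)
print(related)
```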

The article contains a description of a postulated “device,” a “mechanized private file and library” which “may be consulted with exceeding speed and flexibility,” a kind of personal computer and research assistant (Bush 107). Vannevar Bush’s ideas were foundational to hypertext theory, so much so that it seems his article is describing the Internet. His description of a typical interaction with such a device, for which he coins the term “memex,” is a description of an associative path through the written record, from general to specific and down divergent paths along the way, each leading toward other specific points. It is a description of what has become a common experience. The process Bush defines, carried out with technology very similar to what he describes, is now a commonplace part of the research process. The path of these associative trains of thought is a familiar one now, and Bush says that is because we think this way ourselves.

It is important to note that Vannevar Bush is operating under the assumption that we require a thing we call a “computer,” or a “memex,” in order to organize and distribute the whole library of human knowledge. Bush urges an associative arrangement of that information so that it may be more compatible, so to speak, with the way it is natural for us to think. Inherently, our minds store things associatively. Inherently, we are capable of doing what it is that Bush is proposing our computers do. The only reason we need the computer, or the “memex,” is to preserve and arrange a body of knowledge larger than it would be relevant or possible to store within our minds.

Bush’s ideas are extensions of the reality of his present. He stresses this, asserting that his predictions are not as far-fetched as they might seem to the audience of his time. According to Bush, by the mid-twentieth century, “The world has arrived at an age of cheap complex devices of great reliability; and something is bound to come of it” (103).

If the “age of cheap complex devices” can be called the modern age, then the “something bound to come of it” can be called post-modernity. It is precisely that something, and thoughts about it, that has drawn postmodern thinkers toward the idea of hypertext and its implications. The postmodern approach to hypertext can be more easily understood if we first consider the postmodern philosophical approach overall, so a diversion is necessary here before going any further. The vast majority of hypertext literary criticism and hypertext theory published to date discusses its subjects in terms of, or with reference to, the rubric of postmodernism.

In postmodern theory, there should be an aftermath to the structure of the world, or the structure of our society, or the structure of our minds. There should be a new way of things. The modern world is a hierarchy: the leader rules the government, which rules the people, who rule their families, as God rules the universe. The theory holds that it is illogical for people to put anything at the “top” of such a structure in the first place. It is illogical precisely because our logic is not entirely hierarchical by nature. Our logic, and the language we express it with, are associative things, not linear ones. Once this is realized, hierarchy is no longer an acceptable structure for our ideas. With it go similar structures, such as patriarchy. There has been a perceivable move away from these notions.

The move was notably recorded in a paper presented by Ron Inglehart at “Global Trends 2005: A Future Trends Conference,” organized by The Center for Strategic and International Studies. “One of the striking trends in advanced industrial societies of the last few decades has been a move away from hierarchy, centralization, bureaucratization” (Inglehart). Between 1970 and 1994, Inglehart conducted studies of 43 societies across the globe. He concluded that these societies exhibit a generational shift above and below the approximate age of twenty-five: the older group exhibits a worldview dominated by an emphasis on hierarchy, while the younger group is characterized by an emphasis on non-hierarchical structures.

Postmodernism is the name of this shift in thought. The name is ambiguous, to say the least, as are the theoretical tenets that tend to be associated with it. The editors of “Postmodern American Fiction: A Norton Anthology” define the ideology as “… a tentative grouping of ideas, stylistic traits and thematic preoccupations that set the last four decades apart from earlier eras” (Geyh et al. x). They go on to say that those ideas are in large part a reaction to current events and pre-standing modern assumptions, such as authorial authority. Characteristically postmodern stylistic traits stem from avant-garde aesthetic developments, which include pastiche and multiple or shifting perspective, and the editors say that the thematic preoccupations include racial tension, conspiracy and apocalypse. They describe postmodern philosophy and criticism as a set of questions concerning subjectivity and the construction of identity, “the constructedness of meaning, truth and history” (x). They say that the thinkers are “… marked by a thoroughgoing skepticism toward the foundations and structures of knowledge” (Geyh et al. x).

These editors had to struggle with the complexities of postmodern ideology in order to successfully compile their book. Their introduction provides a productive examination of postmodernism as compared to previous ideas. According to the editors, modernism is “a complex but affirmative artistic movement that rose above (while combating) the diminishment of human consciousness that emanated from popular culture” (Geyh et al. xvii).

The editors describe postmodernism, in contrast to modernism, by its openness to the effects of popular culture on “high” culture. With this understanding comes an appreciation of the effects of technology on culture, in contrast to modernism’s attempts to halt the progress of realities we are now forced to accept. In many ways, the phrase “popular culture” has become a redundancy. Popular culture’s existence in the world around us has had an effect on our development as individuals. It is part of us. While the urge to rise above popular culture may exist, that urge must be reconciled with the reality of the situation. Popular culture has established firm roots (Geyh et al. xii).

One question from this perspective is: what does communication now look like? George Landow is a prominent thinker in the field of hypertext theory “who first introduced us to the important correlation between the deconstructive philosophy of Jacques Derrida” and hypertext (Hubrich). Landow says, “we must abandon conceptual systems founded upon ideas of center, margin, hierarchy, and linearity and replace them with ones of multilinearity, nodes, links, and networks” (2).

A message, in a book for example, is structured in an order dictated by its author, concurrently with the message’s meaning. There is a hierarchical conception typically associated with our understanding of communication’s transmission of meaning. Barthes describes the author-text hierarchy in his essay “The Death of the Author”: “The Author is… in the same relation of antecedence to his work as a father to his child.” The traditional conception of a text’s meaning is that it is handed down by the text from the author to the audience, but Barthes says that “a text is not a line of words releasing a single ’theological’ meaning (the ‘message’ of the Author-God)”; instead, he understands text in terms of “a multi-dimensional space in which a variety of writings, none of them original, blend and clash” (128). That “variety of writings” is presumably the same thing Vannevar Bush called the “record of the race” (104).

Meaning aside, from the first page to the last, with each page sequentially in between, a book’s contents are structured in a this-then-that order. In such a structure, a story is best told chronologically. Critics have long postulated that a chronological hierarchy of ideas is not a necessity, but rather a limitation of the printed medium. Further, they assert that it is possible to structure a written text, like a conversational text or a thought text, in an associative rather than a causal order. Then, we can make that text a story.

Enter hypertext. Landow continues explicating the postmodern approach to text by adding, “Almost all parties to this paradigm shift…see electronic writing as a direct response to the strengths and weaknesses of the printed book” (Landow 25).

This theory regarding the nature of text was developed by the academics who were among the first people to use electronic text. For the most part, hypertext was theorized before it was realized. The realization of hypertext has an interesting history of its own; it is the development of the medium in question, and as such a part of it.

Building upon Bush’s information structure ideas, which had been named hypertext by 1965, several universities and institutions began developing computer systems designed to catalog and retrieve information within an associative network. The first of these, “developed in 1967 by a team of researchers led by Dr. Andries van Dam at Brown University was named simply, Hypertext Editing System. IBM sold this system to NASA for use with the documentation associated with the Apollo Space Program” (Feizabadi).

In 1975, researchers at MIT developed a hypermedia network of photographs of the city of Aspen, Colorado. This network of photographs offers an example of a non-linear structure (a rough sketch of the idea follows the passage below), in that:

The images were linked in such a way that would allow the user to start at a given point and move forward, back, left, or right. Once a route through the city was chosen, the system could display the images in rapid succession creating a movie-like motion… Another interesting feature of the system was a navigation map which was displayed in addition to the movie window. The user could jump directly to a point on the city map instead of finding the way through the city streets to that destination. The Aspen Movie Map was a landmark in hypermedia development in that, through a sophisticated application, it demonstrated what could be achieved with the technology available at the time. (Feizabadi)
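The navigation the passage describes can be pictured as a small graph of linked images plus a map index that bypasses the links entirely. The sketch below is purely illustrative; the point names and layout are invented for the example and are not taken from the MIT system itself.

```python
# Illustrative only: invented intersections standing in for Aspen's streets.
# Each photographed point links to its neighbors in four directions,
# and a separate map index allows a direct jump to any point.

points = {
    "main_and_1st": {"forward": "main_and_2nd", "back": None,
                     "left": "side_st_a", "right": None},
    "main_and_2nd": {"forward": None, "back": "main_and_1st",
                     "left": None, "right": None},
    "side_st_a":    {"forward": None, "back": None,
                     "left": None, "right": "main_and_1st"},
}

def step(current, direction):
    """Move one image in the chosen direction, if a link exists there."""
    return points[current].get(direction) or current

def jump(destination, current):
    """The navigation map: go straight to a point instead of walking there."""
    return destination if destination in points else current

position = "main_and_1st"
position = step(position, "forward")      # walk forward one frame
position = jump("side_st_a", position)    # or jump directly via the map
print(position)
```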

The nineteen-eighties saw an explosion of hypertext possibilities. New, more intricate information networks developed, such as Xanadu and Intermedia. In 1985, Janet Walker invented the first hypertext system designed for general purposes, and it was widely used. One of its more notable innovations was the development of bookmarks. Still, hypertext systems were designed for use on only one type of computer. Nothing came along for multi-platform use until 1986. At that time a series of “note card” based hypertext applications was built upon to develop Apple Computer’s HyperCard software, which was widely distributed. Two years later, the World Wide Web was developed, linking data stored all over the planet. Initially, this data was limited to scientific, academic, government and military information (Feizabadi).

Already, in its short history, hypertext has grown in its forms and functions beyond any of its original intentions. It is very doubtful that Vannevar Bush had any idea what the commercial sector would do with anything like his memex machine. He would probably recoil at the gargantuan realm of the dot-com and its addition to the “record of the race.” Given the history of growth and development, it stands to reason that hypertext will only continue to shift in its forms and uses, so it seems premature to rule out anything like the possibility of a widely used hypertext literature.

It is important to note that hypertext, and thus the Internet, was developed and originally used exclusively for strictly factual data, for institutional purposes. The first hypertext writers published their critical theory online, where indexing, cross-referencing, annotation, citation and the like were much easier than they had been with conventional text methods. The first “literary” compositions to be posted electronically were non-literature: they were critical evaluations of preexisting texts. David S. Miall warns of the effect these beginnings have had on the present body of hypertext theory writings, “where literary and non-literary writing appear to be treated as the same thing” (Miall).

Critics, the press, and the public generally assume that hypertext and the World Wide Web are synonymous, because the latter has been their only access to the former. This tendency has a profound effect upon contemporary hypertext theory. Collision between ideology and technology occurred with the development of a widespread hypertext protocol, the World Wide Web. It is not the only such network, nor the only way of distributing hypertext; other ways have existed for some time, and still others are in development. Later discussion will concern these.

The idea that the new hypertext could be used for creative purposes was introduced to the world-at-large when, in 1992, The New York Times Book Review published an article by Robert Coover entitled “The End of Books.” This article is almost unanimously referred to as the original introduction to hypertext fiction. Consequently, ever since, the issue of hypertext literature has been attached to the idea of the death of books. The article describes electronic literature and goes on to postulate that its existence is a small step toward our culture’s elimination of the book (Coover).

Jay David Bolter succinctly counters that controversial grandstanding. “The question of whether the computer will ultimately replace the printed book… [is] still the wrong one for several reasons.” That said, Bolter proceeds to devote hundreds of words to “the wrong question,” but he does point out the problems with asking it.

Many of the arguments against the computer as a reading technology depend upon assumptions about the legibility of computer screens. We cannot know precisely how the technology will improve in the coming decades. Furthermore, we cannot know what choices and compromises readers may be willing to make in ten, twenty, or fifty years… More often perhaps, a new technology takes over one function and leaves other functions to an existing technology. (Bolter)

Bolter is not concerned about the replacement of the book with an electronic medium; he concentrates instead on the changing role of books in an electronic age. He points out that “printing replaced handwriting for the distribution of most kinds of texts, but it did not make handwriting obsolete.” Electronic media are likely to have an effect on print similar to the effect print had on the role of handwriting in our culture. “In any case the mere survival of the printed book is not what matters” (Bolter).

Hypertext theory and criticism is drowning in such talk. If it isn’t a discussion of the “death of books,” thanks to grandstanding like Coover’s, it’s a discussion of that discussion, like Bolter’s. None of this really addresses the question of what hypertext is, except insofar as it is not printed text. None of this wonders what hypertext should be, except when it is relegated to a role sub-standard to that of books, or defended from that relegation.

There are critics, proponents of hypertext literature, who address the question of hypertext’s validity as literature in its own right, independently from books. Their view was well summarized by Laura Miller in the New York Times: “The theory of hyperfiction insists that readers ought to be, and long to be, liberated from two mainstays of the traditional novel: linear narrative and the author.” These claims stem from ideas like those of Derrida and Barthes, respectively, but hypertext theory has taken those ideas in new directions.

One of the most ardent proponents of hypertext and hypertext theory is George P. Landow. He makes an important point about the connection between the development of hypertext and the theories that preceded it.

The many parallels between computer hypertext and critical theory have many points of interest, the most important of which, perhaps, lies in the fact that critical theory promises to theorize hypertext and hypertext promises to embody and thereby test aspects of theory, particularly those concerning textuality, narrative, and the roles or functions of reader and writer. Using hypertext, critical theorists will have, or now already have, a laboratory with which to test their ideas. Most important, perhaps, an experience of reading hypertext or reading with hypertext greatly clarifies many of the most significant ideas of critical theory. (Landow)

There are several theories being put to the test by hypertexts. Notably, the aforementioned new ideas regarding centrality, hierarchy, and the relationship between author and reader are constantly applied to the study of hypertext. Also, and to an overwhelming degree, hypertext authors are forced to reckon with the poststructuralist concept of the limits of print. Landow and others propose a hypertext literature on the grounds that it can give rise to a plethora of new ideas from a range of discourses, from programming to literary criticism to philosophy.

Opposition to the idea of creative hypertext does exist. A notable example of this opposition is the book entitled “The Gutenberg Elegies” by Sven Birkerts, which discusses “The Fate of Reading in an Electronic Age” (Birkerts). After setting aside all of what he calls the “proselytizing” in Robert Coover’s article, Birkerts begins his own explanation of hypertext as it relates to literature.

Ground zero: The transformation of the media of communication maps a larger transformation of consciousness— maps it, but also speeds it along; it changes our terms of experience and our ways of offering response. Transmission determines reception determines reaction. Looking broadly at the way we live—on many simultaneous levels, under massive stimulus loads—it is clear that mechanical-linear technologies are insufficient. We require swift and obedient tools with vast capacities for moving messages…. As the tools proliferate, however, more and more of what we do involves network interaction… they are fast becoming our new cognitive paradigm…. What is the relevance of all this to reading and writing? This must now be established from scratch. (Birkerts 87)

Birkerts goes to great lengths to describe the difference between printed and electronic text, and ends by saying that the average reader “will wonder what is the difference” (Birkerts 87). His distinction is best described by his analogy that printed text is to the bullion stored in Fort Knox as electronic text is to the money that represents it (Birkerts 95). Nevertheless, he asserts, agreeably, that for the writer of electronic text, “A change of procedure must be at least subtly reflected in the result.”

The author mounts his attack on the proposition of an electronic literature on the grounds that it “promotes process over product and favors the whole over the execution of the part.” One of the theoretical fundamentals of his opponents’ view is Barthes’ “The Death of the Author,” with its denial of authorial authority, and Birkerts does not agree with it. He says it puts “venerable novelistic values like unity, integrity, vision, voice… in danger” (Birkerts 120). He is, of course, operating under the assumption that these are good values, fundamentals. He questions what we would do without “the power of the fiat: let there be no world but this,” which is the assertion he believes to be at the root of creation itself (Birkerts 120).

In the same publication that housed Robert Coover’s distended “The End of Books,” Laura Miller replies with a very different approach. Her tone is fresh, decidedly non-academic, and down-to-earth. She sides with the average reader. She quickly escapes the black-hole discussion of the death of books with the help of her present-day (1998) observations of a world where we have things like Amazon.com, a web site that sells books by the millions. People are still reading books, she says, and she has “yet to encounter anyone who reads hypertext fiction. No one, that is, who isn’t also a hypertext author or a journalist reporting on the trend” (Miller).

Miller does not approve of Landow’s hypertext-as-laboratory justification for the validity of hypertext literature. She says, “What the laboratory of hyperfiction demonstrates, though, is how alienated academic literary criticism is from actual readers and their desires.”

She makes the point that the average reader does not feel oppression coming down from the authority of the author, the oppression that Barthes worked so hard to demonstrate. She then points out that the structure of a linear story, a chain of events, is something the average reader seeks to enjoy rather than to be liberated from.

Miller’s critique of hypertext is at its strongest when she considers structure. She describes the experience of reading a work of hypertext fiction as “a listless task, a matter of incessantly having to choose among alternatives, each of which, I’m assured, is no more important than any other” (Miller).

There is a heated debate over the validity of hypertext literature. It might help to summarize the views of both sides. The proponents of hypertext literature see it as a new textual utopia where their theories of what text should be and do can be realized.

The detractors of hypertext literature are diverse. Some of them prefer to read from a printed page rather than a screen, and make this preference the center of their debate. Others take offense at the overly theoretical nature of the very existence of such a thing as hypertext literature, especially since the average reader is far removed from the theories behind it.

Both sides of the debate make valid claims. The detractors are right to criticize the heavy theoretical basis for hypertext literature, where theory tends to take precedence over the content of the literature itself. This precedence is very dangerous to the ideas espoused by the theory, ideas concerning liberation from central ideological authority, ideas which the proponents of hypertext literature might very well be right to espouse.

If the literature that embodies a theory is not palatable as literature, then it renders the theory inedible as well. Readers won’t see hypertext literature as a good idea if they don’t see it work, and they won’t see it work without meaningful content, delivered in an intuitive way. All this concentration on theory distracts from the questions of content and meaning. Neither side has sufficiently addressed the questions of meaning and content in hypertext literature.

Next: Aesthetics in a Hypertext Age →

Also: Bibliography →