Alan Liu: ‘Transcendental Data: Toward a Cultural History and Aesthetics of the New Encoded Discourse’

I find Alan Liu’s ‘Transcendental Data’ from Critical Inquiry 31 a very interesting article because he tries to outline how the discourse network 2000 works — in reference to Kittler’s concepts of the discourse networks 1800 and 1900. Liu rightly identifies XML and the ideology of the division of content and presentation as the foundations of discourse network 2000.

That is very close to what I sometimes call the PHP/MySQL or database turn in online publishing. (And one can add the so-called Web 2.0 stuff.) We’re all writing these tiny text objects (or uploading images or sounds) that are furnished with metadata by the software we use, then possibly ‘endlessly’ redistributed over the networks and aggregated according to various ‘preferences’, in various contexts (again, in possibly endless combinations).

A discourse network is a discursive circuit. Or — Liu quoting Kittler — “The term discourse network … can also designate the network of technologies and institutions that allow a given culture to select, store, and process relevant data. Technologies like that of book printing and the institutions coupled to it, such as literature and the university, thus constituted a historically very powerful formation…. Archeologies of the present must also take into account data storage, transmission, and calculation in technological media.” (Friedrich A. Kittler, Discourse Networks, 1800/1900, trans. Michael Metteer and Chris Cullens, Stanford, Ca., 1990, p. 369)

Liu asks these questions: “What is the social logic that underlies the technologic of discourse network 2000? (…) How is an author now a postindustrial producer? (…) What are the aesthetics of encoded or structured discourse or, as I will term it, of postindustrial dematerialization? (…) How is it possible for writers or artists to create in such a medium?”

Well, with regard to the last question, I’m tempted to say: “Easy, we type and hit the publish button.” Writing is still ‘putting words in the right order’. But what Liu wants to get at is, of course: “How does discourse network 2000 enable a certain form of writing, of sharing knowledge, of discussing (in writing)?”

Liu then proceeds to give a basic overview of XML, with a short reference to TEI. (I wondered: did Critical Inquiry ever before or since print such a basic introduction to any subject?)

“These cardinal needs of transformability, autonomous mobility, and automation resolve at a more general level into what may be identified as the governing ideology of discourse network 2000: the separation of content from material instantiation or formal presentation.” (p. 58)
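This ‘governing ideology’ can be made concrete with a toy sketch. A minimal illustration, in Python rather than a full XSLT pipeline, and with tag names and render functions of my own invention, not Liu’s: the same XML-encoded content passes through two entirely different presentations.

```python
import xml.etree.ElementTree as ET

# Content: pure structure, no hint of how it should look.
doc = ET.fromstring(
    "<entry><title>Transcendental Data</title>"
    "<body>Notes on Liu.</body></entry>"
)

def render_html(entry):
    # One possible presentation: HTML.
    return (f"<h1>{entry.findtext('title')}</h1>"
            f"<p>{entry.findtext('body')}</p>")

def render_plain(entry):
    # Another presentation: plain text. Same content, different form.
    return f"{entry.findtext('title').upper()}\n{entry.findtext('body')}"

print(render_html(doc))
print(render_plain(doc))
```

The ‘religion’ Liu describes is exactly this: the `<entry>` is held to be the true content, and every rendering of it merely interfacial.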

“Data islands, or more generally what I will call data pours, are places on a page — whether a web page or a word processing page connected live to an institutional database or XML repository — where an author in effect surrenders the act of writing to that of parameterization.” (p. 59)

“Now web pages increasingly surrender their soul to data pours that throw transcendental information onto the page from database or XML sources reposed far in the background.” (p. 61)
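A ‘data pour’ can be sketched in a few lines. In this toy version (the dict standing in for the institutional database, and all names, are mine), the ‘author’ writes only a template with slots; the content is poured in from the backend at render time.

```python
# A stand-in for the "database or XML source reposed far in the background".
backend = {"headline": "New post", "body": "Some transcendental data."}

# The author no longer writes the page, only a parameterized template.
template = "<article><h1>{headline}</h1><p>{body}</p></article>"

def pour(template, source):
    # Writing is surrendered to parameterization: fill the slots.
    return template.format(**source)

page = pour(template, backend)
print(page)
```

Whoever controls `backend` controls what appears on the page; the template author has indeed surrendered the act of writing to that of parameterization.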

“What is at stake is indeed what I called an ideology of strict division between content and presentation — the very religion, as it were, of text encoding and databases.” (p. 62)

“Discourse network 2000 is a belief. According to its dogma, true content abides in a transcendental logic, reason, or noumen so completely structured and described that it is in and of itself inutterable in any mere material or instantiated form. Content may be revealed only through an intermediary presentation that is purely interfacial rather than, as it were, sacramental — that is, not consubstantial with the noumenal.” (p. 62)

He then concludes that: “Authors and readers become operators of black box machinery who select criteria for prescripted actions.” (p. 63). I’d say that stretches the argument a bit. It is certainly true for people who do not know how to change the defaults; it is true for those who work in a fixed (institutional) context, keep to the rules, and do not want to change any of them. One also has to remember that black-boxing enables people to work with technology, and doesn’t imply that nothing can be changed.

In the following section Liu eloquently outlines the importance of standardization, and the continuity between industrialism and post-industrialism. XML demands standardization, yet is really not a standard but a meta-standard. A few quotes:

“My thesis is that the postindustrial technologic of encoded or structured discourse dates back — with a signal difference I will indicate later — to nineteenth- and early twentieth-century industrialism.” (p. 64)

“New in Taylorism was the additional principle that decisions had to be extracted from the embodied work of the laborer and described on instruction cards as procedures that could be optimized, reprogrammed, distributed, and otherwise mediated.” (p. 67)

“Databases and XML are now our ultimate functional managers. They are the automatic mediators of the work of contemporary knowledge.” (p. 69)

“The upshot of such a social history of databases and XML is that the common presumption of business writers, technologists, and others that there was a sharp break between industrialism and postindustrialism is historically too shallow.” (p. 71)

“Only by understanding the deep connection between industrialism and postindustrialism are we now prepared to discern the great difference of the latter. Both epochs, as we have seen, share the projects of standardization and management. But only postindustrialism saw these projects through to their radical conclusion, which might be called metastandardization and metamanagement.” (p. 72)

“XML, for example, is technically not a standard but a metastandard, a family form of standards that governs the extensible creation of specific standards of XML tags or schemas.” (p. 72)
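Liu’s point that XML is a metastandard rather than a standard is easy to illustrate: one generic parser handles two mutually ignorant tag vocabularies, because XML fixes only the form that standards take, not any particular one. (Both ‘schemas’ below are invented for illustration; the first merely gestures at TEI.)

```python
import xml.etree.ElementTree as ET

# Two unrelated vocabularies: a TEI-ish one and a recipe-ish one.
tei_like = ("<text><lineGroup><line>April is the cruellest month"
            "</line></lineGroup></text>")
recipe = "<recipe><ingredient>flour</ingredient><step>mix</step></recipe>"

# One generic parser serves both: XML standardizes how specific
# standards are made, not which tags exist.
for source in (tei_like, recipe):
    root = ET.fromstring(source)
    print(root.tag, [child.tag for child in root])
```

Neither vocabulary had to be registered anywhere; each is its own ‘specific standard’ generated under the metastandard.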

The last section of the article deals with the data sublime. And that jump — to Turner, Gibson and Novak, and so to the computational sublime — comes too easily.

(I also do not agree with Liu’s argument that new media arts and new media are too new a field “to commit to any one analysis”: which field commits to that anyway?)

This section is about the idea (or ideology) that a massive amount of data (dataclouds) will, whether through self-organization or through other ‘formations’, come to show meaningful patterns. That’s what Gibson was onto in Idoru and his other later novels. But instead of jumping to this aesthetic, one should — I think — rather look at how data mining, marketing and the search engines deal with this in a real-world way; that is what affects our lived reality, and it can be seen as one of the (f)actors that construct our reality.

Nevertheless, there are good bits here too, good treatments of Novak, Jevbratt etc. The last two pages of my photocopy of the article are again full of pencil markings.

“But the avant-garde conviction that there was a necessary relation between form and content was nevertheless a reflection of industrial standardization and management.” (p. 79) [Yet, as Liu states, there was a third term in the modernist equation of form & content: materiality].

“When the material substrate was removed to allow for internet transmission, that is, variable methods of standardization — for example, XML documents governed by a common standard but adaptable to undetermined kinds of hardware, software, and usages — could suddenly be imagined.” (p. 80)

Liu asks: “Is the writer or artist any longer an author in such circumstances, let alone a creative one?” (p. 80). My margin says in pencil: “sure”.

“In the romantic era circa 1800, Kittler observes, the hermeneutic discourse network began when a source of meaning located in Nature or the Mother called to poets to transmit its transcendental essence through language conceived as a mere channel of translatability.” (p. 80)

“In the modernist era circa 1900, by contrast, mother nature was a faint echo. The true source of the signal, Kittler argues, (…) was an apparently random, senseless, automatic, untranslatable, and thus nonhermeneutic noise inherent in the channel of transmission itself — like tuning your radio to a Pynchonesque channel of revelation indistinguishable from utter static.” (p. 81)

“The distinctive signal of 2000, by contrast, synthesizes 1800 and 1900. In 2000, the channel is just as seemingly senseless, random, and automatic as in 1900. But the source point of the transmission is phase-shifted so that phenomenally senseless automatism follows from a precursor act of sense making in the databases and XML repositories outside the direct control of the author.” (p. 81)

Wait: “databases and XML repositories outside the direct control of the author”. Not for those authors who set up their own databases, who know XML, who manipulate Technorati, or who stay out of all that… So this statement, I think, is too general. The technology is partly imposed on us; partly we are able to construct it ourselves.

Liu takes away too much of the acting power (there must be a better word…) from the writer. (Maybe he hates working with the TEI people ;-).) It does show how important it is to have open standards that can be developed further and changed.

“[N]ow the author is in a mediating position as just one among all those other managers looking upstream to previous originating transmitters — database or XML schema designers, software designers, and even clerical information workers (who input data into the database or XML source document).”

Yes, but that doesn’t mean an author is not creative. It is true that our current writing and publishing technologies make it far easier to cut-&-paste-&-change; it is true we do more ‘circulating’. It is true that authors can be, and often are, their own publishers. But isn’t that much more an ‘enabling’ feature than a ‘loss of creativity’? Even if we work inside preformatted contexts? Not that Liu is nostalgic, he is not. He just states that we no longer regard an author so much as “the originator transmitter of a discourse” (p. 81).

“[C]ontent held in databases and XML now sets the very standard for an ultra-structured and ultra-described rationality purer than any limiting instantiation of the Ding an Sich. And so what Kittler calls the mother tongue — now the discourse of the motherboard, of the matrix itself — seems to return.” (p. 81)

Again, this stretches the argument (into the abstract), and I do not find that very productive. Motherboard, matrix… It is also not true: although texts circulate in the network, are distributed over many harddisks, are aggregated in different combinations at different end points, and are ultra-structured for that purpose, all the bits of text are still written (typed in, copied), and they are still read and acted upon. (True, some more by search engines than by human beings.) One can let oneself be blinded by the sublimity of it all, but I don’t see why one should.

Liu concludes: “The core problem is what I have in my Laws of Cool called the ethos of the unknown — of the unencoded, unstructured, unmanaged — in human experience. In our current age of knowledge work and total information, what experience of the structurally unknowable can still be conveyed in structured media of knowledge (databases, XML, and so on)? Perhaps the arts — if they can just crack the code of ordinary cool and make it flower — know.” (p. 81).

I have two remarks to make to this (not having read his Laws of Cool):
1. Is there such a thing as a perfect divide between the ‘unencoded, unstructured, unmanaged’ on the one hand and the encoded, structured, managed on the other? Environments that are structured on one level can allow for total unstructuredness on another level.
2. XML can be very messy too.

I have the feeling this is somewhat of a pseudo-issue. If our texts are put into XML schemas, that doesn’t mean our written sentences are structured any better. (XML doesn’t mind whether your sentence is grammatical.) And it surely doesn’t mean experience becomes more structured.
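The point is easy to demonstrate: an XML parser checks well-formedness, never sense. A toy illustration of my own (not an argument Liu makes in code): word salad in correct markup parses fine, while a fine sentence in mismatched tags is rejected.

```python
import xml.etree.ElementTree as ET

# Perfectly well-formed XML wrapping a perfectly ungrammatical sentence:
messy = "<p>colorless green ideas sleep furious the because</p>"
ET.fromstring(messy)  # parses without complaint

# Whereas well-written prose in ill-formed markup is rejected:
broken = "<p>A clear, grammatical sentence.</q>"
try:
    ET.fromstring(broken)
except ET.ParseError:
    print("mismatched tag: structure, not sense, is what XML minds")
```

Structure at the markup level simply says nothing about structure, let alone meaning, at the level of the sentence or of experience.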

But maybe I don’t get what Liu is getting at.

Liu, btw, is a Pynchonite and a Wakian (his most recent article in Critical Inquiry deals with Finnegans Wake). And of course he quotes this great passage from The Crying of Lot 49:

“She [Oedipa Maas] could, at this stage of things, recognize signals like that, as the epileptic is said to — an odor, color, pure piercing grace note announcing his seizure. Afterward it is only this signal, really dross, this secular announcement, and never what is revealed during the attack, that he remembers. Oedipa wondered whether, at the end of this (if it were supposed to end), she too might not be left with only compiled memories of clues, announcements, intimations, but never the central truth itself, which must somehow each time be too bright for her memory to hold; which must always blaze out, destroying its own message irreversibly, leaving an overexposed blank when the ordinary world came back.” (Thomas Pynchon, The Crying of Lot 49, New York, 1999, p. 76)

All quotes from: Alan Liu, ‘Transcendental Data: Toward a Cultural History and Aesthetics of the New Encoded Discourse’, in Critical Inquiry 31 (Autumn 2004).

Also available here: http://www.uchicago.edu/research/jnl-crit-inq/features/artsstatements/arts.liu.htm.

en,quotations,software,ubiscribe,writing | August 3, 2006 | 16:16

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 2.5 License. | Arie Altena