Lev Manovich
Cinema by Numbers:
ASCII Films by Vuk Cosic

If the history of analog cinema officially begins in 1895 with the Lumières, the history of digital cinema, which is yet to be written, can start in the late 1930s with Konrad Zuse. Starting in 1936, and continuing into the Second World War, the German engineer Zuse built a computer in the living room of his parents' apartment in Berlin. Zuse's machine was the first working digital computer. One of his innovations was program control by punched tape. For the tape Zuse used discarded 35mm movie film.

One of the surviving pieces of this film shows binary code punched over the original frames of an interior shot. A typical movie scene--two people in a room involved in some action--becomes a support for a set of computer commands. Whatever meaning and emotion were contained in this movie scene are wiped out by its new function as data carrier. The pretense of modern media to create a simulation of sensible reality is similarly cancelled: media is reduced to its original condition as information carrier, nothing else, nothing more. In a technological remake of the Oedipal complex, a son murders his father. The iconic code of cinema is discarded in favor of the more efficient binary one. Cinema becomes a slave to the computer.

But this is not yet the end of the story. Our story has a new twist--a happy one. Zuse's film, with its strange superimposition of the binary over the iconic, anticipates the convergence that gets underway half a century later. Media and computer--Daguerre's daguerreotype and Babbage's Analytical Engine, the Lumière Cinématographe and Hollerith's tabulator--merge into one. All existing media are translated into numerical data accessible to the computer. The result: graphics, moving images, sounds, shapes, spaces, and text become computable, that is, simply another set of computer data. In short, media becomes new media.

This meeting changes the identity both of media and of the computer itself. No longer just a calculator, a control mechanism, or a communication device, a computer becomes a media processor and synthesizer. If before, a computer would read in a row of numbers and output a statistical result or a projectile's trajectory, now it can read in pixel values, blur the image, adjust its contrast, or check whether it contains the outline of a gun. Building upon these lower-level operations, it can also perform more ambitious ones: searching image databases for images similar in composition or content to an input image; detecting shot changes in a movie; or synthesizing the movie shot itself, complete with setting and actors.
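The "lower-level operations" named above can be made concrete. What follows is a minimal sketch of my own, not part of the essay: a tiny grayscale frame treated as a 2D list of pixel values, with two of the operations mentioned--contrast adjustment and a blur--written out directly.

```python
# Illustrative only: once an image is just pixel values (0-255), it becomes
# "simply another set of computer data" that any arithmetic can transform.

def adjust_contrast(image, factor, pivot=128):
    """Scale each pixel's distance from a pivot value, clamped to 0-255."""
    return [[max(0, min(255, int(pivot + (p - pivot) * factor)))
             for p in row] for row in image]

def box_blur(image):
    """Replace each pixel with the average of its 3x3 neighborhood."""
    h, w = len(image), len(image[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            neighbors = [image[ny][nx]
                         for ny in range(max(0, y - 1), min(h, y + 2))
                         for nx in range(max(0, x - 1), min(w, x + 2))]
            row.append(sum(neighbors) // len(neighbors))
        out.append(row)
    return out

frame = [[0, 64, 128], [64, 128, 192], [128, 192, 255]]
print(adjust_contrast(frame, 2.0)[0])  # -> [0, 0, 128]: dark pixels pushed darker
print(box_blur(frame)[1][1])           # -> 127: center averaged with neighbors
```

Searching a database for similar images or detecting shot changes builds on exactly such pixel arithmetic, only aggregated over many frames.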

The identity of media has been changed even more dramatically. For instance, old media involved a human creator who manually assembled textual, visual, or audio elements (or their combination) into a particular sequence. This sequence was stored in or on some material, its order determined once and for all. Numerous copies could be run off from the master, and, in perfect correspondence with the logic of an industrial society, they were all identical. New media, in contrast, is characterized by automation and variability. Many operations involved in media creation and manipulation are automated, thus removing human intentionality from the creative process, at least in part. For instance, many web sites automatically generate pages from databases when the user reaches them; in Hollywood films, flocks of birds, ant colonies, and even crowds of people are automatically created by AL (artificial life) programs; word processing, page layout, and presentation software comes with "wizards" and "agents" that offer to automatically create the layout of a document; 3D software automatically renders photorealistic images given the scene description.

New media is also essentially variable (other terms that can be used to describe this quality might be "mutable" or "liquid").[1] Stored digitally, rather than in some permanent material, media elements maintain their separate identity and can be assembled into numerous sequences under program control. At the same time, because the elements themselves are broken into discrete samples (for instance, an image is represented as an array of pixels), they can also be created and customized on the fly.

The logic of new media thus corresponds to the postindustrial logic of "production on demand" and "just-in-time" delivery, which themselves were made possible by the use of digital computers and computer networks in all stages of manufacturing and distribution. In this regard, the "culture industry" is actually ahead of the rest of industry. The idea that a customer determines the exact features of her car at the showroom, that the data is transmitted to the factory, and that the new car is delivered hours later remains a dream; but in the case of computer media, it is already a reality. Since the same machine serves as both showroom and factory, and since the media exist not as a material object but as data that can be sent through the wires at the speed of light, the response is immediate.

This is the new logic of new media, or at least some of its axioms; but how does this logic manifest itself on the level of language? In other words, given the new structure of media on the material level (discrete character on different levels; distributed--that is, network-based--representation), and the new kinds of operations we can perform on it (copy and paste, sampling, digital compositing, image processing, and other algorithmic actions), do we create different-looking images? For instance, since filmmakers can now compose feature films entirely on a computer, do they make radically new kinds of films?

The answers to these questions have so far been decidedly mixed. In the case of the moving image, the introduction of, first, electronic and, later, computer tools in video postproduction throughout the 1980s and the 1990s has led to the emergence of a new visual language of television: multilayered space, 2D combined with 3D, transparent planes, dynamic typography. So if you compare the look of television in the 1990s with that of the 1970s, the difference is dramatic. In the case of feature films, however, filmmakers are using basically the same technology as their TV counterparts--but the result is a much more traditional film language. 3D animation, digital compositing, mapping, paint retouching: in commercial cinema, these radically new techniques are mostly used to solve technical problems, while the old cinematic language is preserved unchanged. Frames are hand-painted to remove the wires that supported an actor during a shoot; a flock of birds is added to a landscape; a city street is filled with crowds of simulated extras. Although most Hollywood releases now involve digitally manipulated scenes, the use of computers is always carefully hidden. Commercial narrative cinema still holds on to the classical realist style in which images function as unretouched photographic records of events that took place in front of the camera.

How to make sense of this mixed evidence? If, historically, each cultural period (Renaissance, Baroque, and so on) brought with it a new expressive language, why is the computer age often satisfied with using the language of the previous period, in other words, that of the industrial age? The answer to this question is important because usually a new cultural language and a new social-economic regime go together. Normally this thesis, especially beloved by Marxist critics, serves to move from the economic to the cultural, that is, a critic tries to see how a new economic order finds its reflection in culture. But we can also move in the opposite direction, from culture to economy. In other words, we can interpret radical shifts in culture as indicators of changes in economic-social structure. From this perspective, if the new information age did not bring with it a revolution in aesthetic forms, perhaps this is because it has not yet arrived? Despite the pronouncements about the new Net economy by Wired magazine, we may still be living in the same economic period that gave rise to The Human Comedy and Gone With the Wind. Net.capitalism is still capitalism. Postmodern critics of the 1980s may have gotten it right after all: cultural forms that were good enough for the age of the engine turned out to be also good for the age of the "geometry engine" and the "emotion engine." ("Geometry engine" is the name of a computer chip introduced in Silicon Graphics workstations a number of years ago to perform real-time 3D graphics calculations; "emotion engine" is the name of the processor to be used in the forthcoming PlayStation 2; it will allow real-time rendering of facial expressions.) In short, as far as its cultural languages are concerned, new media is still old media.

When radically new cultural forms appropriate for the age of wireless telecommunication, multitasking operating systems, and information appliances arrive, what will they look like? How would we even know they are here? Would future films look like the "data shower" from the movie The Matrix? Does the famous fountain at Xerox PARC, in which the strength of the water stream reflects the behavior of the stock market, with stock data arriving in real time over the Internet, represent the future of public sculpture?

We don't yet know the answers to these questions. However, what we as artists and critics can do now is point out the radically new nature of media by staging--as opposed to hiding--its new properties. And this is exactly what Vuk Cosic's ASCII films accomplish so well.

It is worthwhile to relate Cosic's films both to Zuse's "found footage movies" from the 1930s and to the first all-digital commercial movie made sixty years later--Lucas's Star Wars: Episode I, The Phantom Menace. Zuse superimposes digital code over the film images. Lucas follows the opposite logic: in his film, digital code lies under the images. That is, given that most images in the film were put together on computer workstations, during postproduction they were pure digital data. The frames were made up of numbers rather than bodies, faces, and landscapes. The Phantom Menace is, therefore, the first feature-length commercial abstract film: two hours' worth of frames made up of numbers. But this is hidden from the audience.

What Lucas hides, Cosic reveals. His ASCII films "perform" the new status of media as digital data. The ASCII code that results when an image is digitized is displayed on the screen. The result is as satisfying poetically as it is conceptually--for what we get is a double image, a recognizable film image and an abstract code together. Both are visible at once. Thus, rather than erasing the image in favor of the code as in Zuse's film, or hiding the code from us as in Lucas's film, here the code and the image coexist.

As in my own "little movies" series of Net films (jupiter.ucsd.edu/~manovich/little-movies, 1994-present), Cosic uses well-known films as his material for "ASCII History of Moving Images." Both projects also rely on the same strategy of defamiliarizing familiar lens-based images ("ostranenie") through algorithmic operations. In my "Classic Cinema 1," I reduce a scene from Hitchcock's Psycho to a Mondrian-like abstraction by applying the standard "mosaic" filter in Adobe's Premiere video-editing software; in Cosic's "ASCII History," scenes from classical films are run through a custom player application that converts moving images into ASCII code. The result looks as though it were woven. These are the kinds of movies that J.-M. Jacquard could have produced on his programmable loom, which he invented around 1800 and which inspired Charles Babbage in his work on the Analytical Engine.
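The kind of conversion such a player performs can be sketched in a few lines. This is a toy illustration of my own, not Cosic's actual program: each pixel's brightness selects a character from a ramp of glyphs, dense for dark pixels, sparse for light ones, so the frame's image survives as a texture of text.

```python
# Hypothetical image-to-ASCII sketch: a grayscale frame (2D list of 0-255
# values) becomes lines of characters, darker pixels getting denser glyphs.

RAMP = "@%#*+=-:. "  # ten glyphs, from dense ("@") to empty (space)

def frame_to_ascii(frame):
    """Map a 2D list of 0-255 grayscale values to lines of ASCII text."""
    lines = []
    for row in frame:
        lines.append("".join(RAMP[p * (len(RAMP) - 1) // 255] for p in row))
    return "\n".join(lines)

frame = [[0, 128, 255], [255, 128, 0]]
print(frame_to_ascii(frame))  # two rows: "@+ " and " +@"
```

Run over every frame of a digitized film, the same mapping yields a moving image and its code visible at once--the double image described above.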

Like VinylVideo by Gebhard Sengmüller (http://www.vinylvideo.com/ ), Cosic's ASCII initiative (www.vuk.org/ascii/aae.html) is a systematic program of translating media content from one obsolete format into another. These projects remind us that since at least the 1960s the operation of media translation has been at the core of our culture. Films transferred to video; video transferred from one video format to another; video transferred to digital data; digital data transferred from one format to another: from floppy disks to Jaz drives, from CD-ROMs to DVDs; and so on, indefinitely. Artists were the first to notice this new functioning of culture: in the 1960s, Roy Lichtenstein and Andy Warhol already made media translation the basis of their art. Sengmüller and Cosic understand that the only way to fight media obsolescence is by resurrecting dead media. Sengmüller translates old TV programs into vinyl disks; Cosic translates old films into ASCII images. (See also Bruce Sterling's Dead Media Project, http://eff.bilkent.edu.tr/pub/Net_culture/Folklore/Dead_Media_Project/.)

Why do I call ASCII images an obsolete media format? Before printers capable of outputting raster digital images became widely available toward the end of the 1980s, it was commonplace to make printouts of images on dot-matrix printers by converting the images into ASCII code. I was surprised that in 1999 I was still able to find the appropriate program on my UNIX system. Called simply "toascii," the command, according to the UNIX manual page for the program, "prints textual characters that represent the black and white image used as input." The reference to the early days of computing is not unique to Cosic but is shared by other net.artists. Jodi.org, the famous website (www.jodi.org), often evokes DOS commands and the characteristic green color of computer terminals from the 1980s; Alexei Shulgin, who collaborated with Cosic on the "ASCII Music Videos" project, has performed music using an old 386 PC (http://www.easylife.org/386dx). But in the case of ASCII code, its use evokes not only a peculiar episode in the history of computer culture but a number of earlier forms of media and communication technologies as well.

ASCII is an abbreviation of American Standard Code for Information Interchange. The code was originally developed for teleprinters and was only later adopted for computers in the 1960s. A teleprinter was a twentieth-century telegraph system that translated the input from a typewriter keyboard into a series of coded electric impulses, which were then transmitted over communications lines to a receiving system, which decoded the pulses and printed the message onto a paper tape or other medium. Teleprinters were introduced in the 1920s and were widely used until the 1980s (Telex being the most popular system), when they were gradually replaced by fax and computer networks ("teleprinter," Encyclopaedia Britannica Online, http://www.eb.com:180/bol/topic?thes_id=378047, accessed May 27, 1999).

ASCII code was itself an extension of an earlier code invented by Jean-Maurice-Émile Baudot in 1874. In Baudot code, each letter of the alphabet is represented by a five-unit combination of current-on or current-off signals of equal duration. ASCII extends Baudot code by using seven-unit combinations (that is, seven "bits") to represent 128 different symbols, each character today stored in an eight-bit "byte." Baudot code itself was an improvement over the Morse code invented for early electric telegraph systems in the 1830s. And so on.
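The arithmetic behind these codes is easy to verify; one caveat worth stating plainly is that standard ASCII is, strictly speaking, a seven-bit code defining 128 symbols, with the eighth bit of the modern byte left over for parity or later extensions. A small sketch of my own:

```python
# How many symbols an n-unit on/off code can distinguish, plus one ASCII
# character rendered as the on/off signals a teleprinter would transmit.

def code_space(units):
    """Number of distinct symbols an n-unit binary code can represent."""
    return 2 ** units

print(code_space(5))            # -> 32: Baudot's five units
print(code_space(7))            # -> 128: standard ASCII's seven bits
print(format(ord("A"), "07b"))  # -> 1000001: "A" as seven on/off signals
```

Thirty-two combinations were enough for an alphabet (with shift codes for figures); 128 accommodate both cases, digits, punctuation, and control characters.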

The history of ASCII code compresses a number of technological and conceptual developments that led to (but, I am sure, will not stop at) the modern digital computer: cryptography, real-time communication, communication networks... By juxtaposing this code with the history of cinema, Cosic accomplishes what can be called an artistic compression: he brings together many key issues of computer culture and new media art in one rich and elegant project.


[1] Jon Ippolito discusses the notion of "variable media" in "The Museum of the Future: A Contradiction in Terms?" ArtByte 1, no. 2 (June-July 1998): 18-19. My usage of the term "variable" is similar to his, although I see variability as a fundamental condition of all computer media, rather than as something that applies only to computer art.