Lev Manovich 1 Jan 2000

Principles of New Media (2)

4. Variability

A new media object is not something fixed once and for all but something that can exist in different, potentially infinite, versions. This is another consequence of the numerical coding of media (principle 1) and of the modular structure of a media object (principle 2). Other terms which are often used in relation to new media and which would be appropriate instead of variable are mutable and liquid.

Old media involved a human creator who manually assembled textual, visual and/or audio elements into a particular composition or sequence. This sequence was stored in some material, its order determined once and for all. Numerous copies could be run off from the master, and, in perfect correspondence with the logic of an industrial society, they were all identical. New media, in contrast, is characterized by variability. Instead of identical copies, a new media object typically gives rise to many different versions. And rather than being created completely by a human author, these versions are often in part automatically assembled by a computer. (The already quoted example of Web pages automatically generated from databases using templates created by Web designers can be invoked here as well.) Thus the principle of variability is closely connected to automation.

Variability would also not be possible without modularity. Stored digitally, rather than in some fixed medium, media elements maintain their separate identity and can be assembled into numerous sequences under program control. In addition, because the elements themselves are broken into discrete samples (for instance, an image is represented as an array of pixels), they can also be created and customized on the fly.

The logic of new media thus corresponds to the post-industrial logic of 'production on demand' and 'just in time' delivery, which themselves were made possible by the use of computers and computer networks in all stages of manufacturing and distribution. Here the culture industry (the term was originally coined by Theodor Adorno in the 1930s) is actually ahead of the rest of industry. The idea that a customer determines the exact features of her car at the showroom, the data is then transmitted to the factory, and hours later the new car is delivered, remains a dream, but in the case of computer media it is a reality. Since the same machine is used as a showroom and a factory, i.e., the same computer generates and displays media -- and since the media exists not as a material object but as data which can be sent through the wires at the speed of light -- the customized version created in response to the user's input is delivered almost immediately. Thus, to continue with the same example, when you access a Web site, the server immediately assembles a customized Web page.

Here are some particular cases of the variability principle (most of them will be discussed in more detail in later chapters):

4.1. Media elements are stored in a media database; a variety of end-user objects, which vary in resolution, form and content, can be generated, either beforehand or on demand, from this database. At first, we may think that this is simply a particular technological implementation of the variability principle, but, as I will show in the Database section, in the computer age the database comes to function as a cultural form of its own. It offers a particular model of the world and of human experience. It also affects how the user conceives of the data which it contains.
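
To make this mechanism concrete, here is a minimal sketch in Python of an end-user object being generated from a media database; the records, field names and template are invented for illustration rather than taken from any actual system.

```python
# Sketch of 4.1: different versions of the 'same' page generated on demand
# from a media database. All records and file names are hypothetical.

media_database = [
    {"title": "Sunset", "image": "sunset_full.jpg", "thumbnail": "sunset_small.jpg"},
    {"title": "Harbor", "image": "harbor_full.jpg", "thumbnail": "harbor_small.jpg"},
]

def generate_page(records, resolution="full"):
    """Assemble a version of the page at the requested resolution;
    the content lives in the database, not in the page itself."""
    key = "image" if resolution == "full" else "thumbnail"
    rows = [f'<img src="{r[key]}" alt="{r["title"]}">' for r in records]
    return "<html><body>\n" + "\n".join(rows) + "\n</body></html>"

print(generate_page(media_database, resolution="full"))
print(generate_page(media_database, resolution="thumbnail"))
```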

4.2. It becomes possible to separate the levels of content (data) and interface. A number of different interfaces can be created to the same data. A new media object can be defined as one or more interfaces to a multimedia database (see the introduction to the Interface chapter and the Database section for more discussion of this principle).

4.3. Information about the user can be used by a computer program to automatically customize the media composition as well as to create the elements themselves. Examples: Web sites use information about the type of hardware and browser or the user's network address to automatically customize the site which the user will see; interactive computer installations use information about the user's body movements to generate sounds, shapes, and images, or to control the behavior of artificial creatures.
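
The first of these examples can be sketched in a few lines of Python; the header values and the selection rules below are invented for illustration and do not describe any particular site.

```python
# Sketch of 4.3: the same request yields a different composition depending
# on what the program knows about the user (hypothetical rules).

def customize(user_agent: str, accept_language: str) -> dict:
    layout = "mobile" if "Mobile" in user_agent else "desktop"
    language = "fr" if accept_language.startswith("fr") else "en"
    return {"layout": layout, "language": language}

print(customize("Mozilla/5.0 (iPhone; Mobile)", "fr-FR"))   # {'layout': 'mobile', 'language': 'fr'}
print(customize("Mozilla/5.0 (Windows NT 10.0)", "en-US"))  # {'layout': 'desktop', 'language': 'en'}
```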

4.4. A particular case of 4.3 is branching-type interactivity (sometimes also called menu-based interactivity). This term refers to programs in which all the possible objects which the user can visit form a branching tree structure. When the user reaches a particular object, the program presents her with choices and lets her pick one. Depending on the value chosen, the user advances along a particular branch of the tree. For instance, in Myst each screen typically contains a left and a right button; clicking on a button retrieves a new screen, and so on. In this case the information used by the program is the output of the user's cognitive process, rather than a network address or body position. (See Menus, Filters, Plug-ins for more discussion of this principle.)
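
A minimal sketch of such a branching tree, assuming invented screen names and choices, might look like this in Python:

```python
# Sketch of 4.4: all reachable screens form a tree; the user's choice
# selects the branch. Screen names and links are hypothetical.

tree = {
    "island": {"left": "library", "right": "dock"},
    "library": {"left": "map_room", "right": "island"},
    "dock": {"left": "island", "right": "ship"},
    "map_room": {},
    "ship": {},
}

def advance(screen: str, choice: str) -> str:
    """Return the next screen, or stay put if no such branch exists."""
    return tree.get(screen, {}).get(choice, screen)

position = "island"
for choice in ["left", "left"]:      # the user clicks 'left' twice
    position = advance(position, choice)
print(position)                      # map_room
```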

4.5. Hypermedia is another popular new media structure, which conceptually is close to branching-type interactivity (because quite often the elements are connected using a branching tree structure). In hypermedia, the multimedia elements making up a document are connected through hyperlinks. Thus the elements and the structure are independent of each other -- rather than hard-wired together, as in traditional media. The World Wide Web is a particular implementation of hypermedia in which the elements are distributed throughout the network. Hypertext is a particular case of hypermedia which uses only one media type - text. How does the principle of variability work in this case? We can conceive of all possible paths through a hypermedia document as being different versions of it. By following the links the user retrieves a particular version of the document.

4.6. Another way in which different versions of the same media object are commonly generated in computer culture is through periodic updates. Networks allow the content of a new media object to be periodically updated while keeping its structure intact. For instance, modern software applications can periodically check for updates on the Internet and then download and install these updates, sometimes without any action from the user. Most Web sites are also periodically updated, either manually or automatically, when the data in the databases which drive the sites changes. A particularly interesting case of this updateability is sites which update some information, such as stock prices or the weather, continuously.
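
As a hedged sketch of this updateability, the following Python fragment polls a (hypothetical) network source for a fresh value while the surrounding page structure stays fixed; the URL and the "price" field are invented for illustration.

```python
# Sketch of 4.6: the object's structure is fixed, its content is refreshed
# over the network. The URL and JSON field are hypothetical.

import json
import urllib.request

def fetch_price(url: str) -> str:
    """Download the latest value; only the data changes, not the layout."""
    with urllib.request.urlopen(url) as response:
        return json.loads(response.read())["price"]

# A page could poll for fresh data every minute, e.g.:
#   import time
#   while True:
#       print("Current price:", fetch_price("https://example.com/api/quote"))
#       time.sleep(60)
```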

4.7. One of the most basic cases of the variability principle is scalability, in which different versions of the same media object can be generated at various sizes or levels of detail. The metaphor of a map is useful in thinking about the scalability principle. If we equate a new media object with a physical territory, different versions of this object are like maps of this territory generated at different scales. Depending on the scale chosen, a map provides more or less detail about the territory. Indeed, different versions of a new media object may vary strictly quantitatively, i.e. in the amount of detail present: for instance, a full-size image and its icon, automatically generated by Photoshop; a full text and its shorter version, generated by the Autosummarize command in Microsoft Word 97; or the different versions which can be created using the Outline command in Word. Beginning with version 3 (1997), Apple's QuickTime format also made it possible to embed a number of different versions which differ in size within a single QuickTime movie; when a Web user accesses the movie, a version is automatically selected depending on connection speed. A conceptually similar technique, called distancing or level of detail, is used in interactive virtual worlds such as VRML scenes. A designer creates a number of models of the same object, each with progressively less detail. When the virtual camera is close to the object, a highly detailed model is used; if the object is far away, a less detailed version is automatically substituted by the program to save unnecessary computation of detail which can't be seen anyway.
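
The level-of-detail technique can be sketched as a simple lookup keyed to camera distance; the thresholds and model names below are invented for illustration.

```python
# Sketch of 4.7: several versions of the same object exist at different
# levels of detail, and the program picks one by distance to the camera.

lod_models = [
    (10.0, "statue_high.obj"),         # closer than 10 units: full detail
    (50.0, "statue_medium.obj"),       # between 10 and 50 units: medium detail
    (float("inf"), "statue_low.obj"),  # farther away: coarse version
]

def select_model(distance: float) -> str:
    for threshold, model in lod_models:
        if distance < threshold:
            return model
    return lod_models[-1][1]

print(select_model(3.0))    # statue_high.obj
print(select_model(120.0))  # statue_low.obj
```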

New media also allows us to create versions of the same object which differ from each other in more substantial ways. Here the comparison with maps of different scales no longer works. Examples of commands in commonly used software packages which allow the user to create such qualitatively different versions are Variations and Adjustment Layers in Photoshop 5 and the writing style option in Word's Spelling and Grammar command. More examples can be found on the Internet where, beginning in the mid-1990s, it became common to create a few different versions of a Web site. The user with a fast connection can choose a rich multimedia version, while the user with a slow connection can settle for a more bare-bones version which loads faster.

Among new media artworks, David Blair's WaxWeb, a Web site which is an 'adaptation' of an hour-long video narrative, offers a more radical implementation of the scalability principle. While interacting with the narrative, the user at any point can change the scale of representation, going from an image-based outline of the movie to a complete script or a particular shot, or a VRML scene based on this shot, and so on. Another example of how the use of the scalability principle can create a dramatically new experience of an old media object is Stephen Mamber's database-driven representation of Hitchcock's The Birds. Mamber's software generates a still for every shot of the film; it then automatically combines all the stills into a rectangular matrix. Every cell in the matrix corresponds to a particular shot from the film. As a result, time is spatialized, similar to how it was done in Edison's early Kinetoscope cylinders (see The Myths of New Media). Spatializing the film allows us to study its different temporal structures, which would be hard to observe otherwise. As in WaxWeb, the user can at any point change the scale of representation, going from the complete film to a particular shot.

As can be seen, the principle of variability is useful in allowing us to connect many important characteristics of new media which at first sight may appear unrelated. In particular, such popular new media structures as branching (or menu) interactivity and hypermedia can be seen as particular instances of the variability principle (4.4 and 4.5, respectively). In the case of branching interactivity, the user plays an active role in determining the order in which already generated elements are accessed. This is the simplest kind of interactivity; more complex kinds are also possible, in which both the elements and the structure of the whole object are either modified or generated on the fly in response to the user's interaction with the program. We can refer to such implementations as open interactivity, to distinguish them from the closed interactivity which uses fixed elements arranged in a fixed branching structure. Open interactivity can be implemented using a variety of approaches, including procedural and object-oriented computer programming, AI, artificial life, and neural networks.

As long as there exists some kernel, some structure, some prototype which remains unchanged throughout the interaction, open interactivity can be thought of as a subset of the variability principle. Here a useful analogy can be made with Wittgenstein's theory of family resemblance, later developed into the influential theory of prototypes by the cognitive psychologist Eleanor Rosch. In a family, a number of relatives will share some features, although no single family member may possess all of them. Similarly, according to the theory of prototypes, the meanings of many words in a natural language derive not from a logical definition but from proximity to a certain prototype.

Hypermedia, the other popular structure of new media, can also be seen as a particular case of the more general principle of variability. According to the definition by Halasz and Schwartz, hypermedia systems 'provide their users with the ability to create, manipulate and/or examine a network of information-containing nodes interconnected by relational links.' Since in new media the individual media elements (images, pages of text, etc.) always retain their individual identity (the principle of modularity), they can be wired together into more than one object. Hyperlinking is a particular way to achieve this wiring. A hyperlink creates a connection between two elements, for example between two words on two different pages, or a sentence on one page and an image on another, or two different places within the same page. The elements connected through hyperlinks can exist on the same computer or on different computers connected in a network, as in the case of the World Wide Web.

If in traditional media the elements are hardwired into a unique structure and no longer maintain their separate identity, in hypermedia the elements and the structure are separate from each other. The structure of hyperlinks -- typically a branching tree -- can be specified independently from the contents of a document. To make an analogy with the grammar of a natural language as described in Noam Chomsky's early linguistic theory, we can compare a hypermedia structure, which specifies the connections between the nodes, with the deep structure of a sentence; a particular hypermedia text can then be compared with a particular sentence in a natural language. Another useful analogy is with computer programming. In programming, there is a clear separation between algorithms and data. An algorithm specifies the sequence of steps to be performed on any data, just as a hypermedia structure specifies a set of navigation paths (i.e., connections between the nodes) which potentially can be applied to any set of media objects.
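
This separation of structure from content can be illustrated with a short Python sketch: the same (hypothetical) link structure is traversed over two different sets of media elements, just as one algorithm can be run on different data.

```python
# Sketch of the structure/content separation: one link structure, two
# content sets. All node names and contents are hypothetical.

links = {"start": ["chapter1", "chapter2"], "chapter1": ["notes"],
         "chapter2": [], "notes": []}

text_nodes = {"start": "Title page", "chapter1": "First chapter",
              "chapter2": "Second chapter", "notes": "Endnotes"}
image_nodes = {"start": "cover.jpg", "chapter1": "plate1.jpg",
               "chapter2": "plate2.jpg", "notes": "colophon.jpg"}

def traverse(structure, contents, node="start", depth=0):
    """Walk the same link structure over whichever content set is supplied."""
    print("  " * depth + contents[node])
    for target in structure[node]:
        traverse(structure, contents, target, depth + 1)

traverse(links, text_nodes)   # one 'document'
traverse(links, image_nodes)  # another, with an identical structure
```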

The principle of variability also exemplifies how, historically, changes in media technologies are correlated with social change. If the logic of old media corresponded to the logic of industrial mass society, the logic of new media fits the logic of the post-industrial society, which values individuality over conformity. In industrial mass society everybody was supposed to enjoy the same goods -- and to have the same beliefs. This was also the logic of media technology. A media object was assembled in a media factory (such as a Hollywood studio). Millions of identical copies were produced from a master and distributed to all the citizens. Broadcasting, cinema, and print media all followed this logic.

In a post-industrial society, every citizen can construct her own custom lifestyle and select her ideology from a large (but not infinite) number of choices. Rather than pushing the same objects and information to a mass audience, marketing now tries to target each individual separately. The logic of new media technology reflects this new social logic. Every visitor to a Web site automatically gets her own custom version of the site created on the fly from a database. The language of the text, the contents, the ads displayed - all these can be customized by interpreting the information about where on the network the user is coming from; or, if the user previously registered with the site, her personal profile can be used for this customization. According to a report in USA Today (November 9, 1999), "Unlike ads in magazines or other real-world publications, 'banner' ads on Web pages change with every page view. And most of the companies that place the ads on the Web site track your movements across the Net, 'remembering' which ads you've seen, exactly when you saw them, whether you clicked on them, where you were at the time and the site you have visited just before."

More generally, every hypertext reader gets her own version of the complete text by selecting a particular path through it. Similarly, every user of an interactive installation gets her own version of the work. And so on. In this way new media technology acts as the most perfect realization of the utopia of an ideal society composed of unique individuals. New media objects assure users that their choices - and therefore, their underlying thoughts and desires - are unique, rather than pre-programmed and shared with others. As though trying to compensate for their earlier role in making us all the same, today the descendants of Jacquard's loom, the Hollerith tabulator and Zuse's cinema-computer are working to convince us that we are all unique.

The principle of variability as it is presented here is not dissimilar to how the artist and curator Jon Ippolito uses the same concept. I believe that we differ in how we use the concept in two key respects. First, Ippolito uses variability to describe a characteristic shared by recent conceptual and some digital art, while I see variability as a basic condition of all new media. Second, Ippolito follows the tradition of conceptual art where an artist can vary any dimension of the artwork, even its content; my use of the term aims to reflect the logic of mainstream culture where versions of an object share some well-defined data. This data, which can be a well-known narrative (Psycho), an icon (the Coca-Cola sign), a character (Mickey Mouse) or a famous star (Madonna), is referred to in the media industry as a property. Thus all cultural projects produced by Madonna will be automatically united by her name. Using the theory of prototypes, we can say that the property acts as a prototype, and different versions are derived from this prototype. Moreover, when a number of versions are commercially released based on some property, usually one of these versions is treated as the source of the data, with the others positioned as being derived from this source. Typically the version which is in the same media as the original property is treated as the source. For instance, when a movie studio releases a new film, along with a computer game based on it, product tie-ins, music written for the movie, and so on, usually the film is presented as the base object from which the other objects are derived. So when George Lucas releases a new Star Wars movie, it refers back to the original property - the original Star Wars trilogy. This new movie becomes the base object, and all other media objects released along with it refer to this object. Conversely, when computer games such as Tomb Raider are re-made into movies, the original computer game is presented as the base object.

While I deduced the principle of variability from more basic principles of new media - numerical representation (1) and modularity of information (2) - it can also be seen as a consequence of the computer's way of representing data and modeling the world itself: as variables rather than constants. As the new media theorist and architect Marcos Novak notes, a computer - and computer culture in its wake - substitutes every constant with a variable. In designing all functions and data structures, a computer programmer tries always to use variables rather than constants. On the level of the human-computer interface, this principle means that the user is given many options to modify the performance of a program or a media object, be it a computer game, a Web site, a Web browser, or the operating system itself. The user can change the profile of a game character, modify how folders appear on the desktop, how files are displayed, which icons are used, and so on. If we apply this principle to culture at large, it would mean that every choice responsible for giving a cultural object a unique identity can potentially remain always open. Size, degree of detail, format, color, shape, interactive trajectory, trajectory through space, duration, rhythm, point of view, the presence or absence of particular characters, the development of the plot - to name just a few dimensions of cultural objects in different media - all these can be defined as variables, to be freely modified by the user.
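
The point about constants becoming variables can be illustrated with a small sketch; the option names and default values below are invented and stand in for any user-modifiable settings of a program or media object.

```python
# Sketch: properties that older media fixed once and for all are held
# as variables the user can reset at any time (hypothetical options).

preferences = {
    "icon_size": 32,          # pixels
    "window_color": "grey",
    "sort_files_by": "name",
    "character_name": "Player 1",
}

def set_option(options: dict, key: str, value) -> dict:
    """Return a new version of the object with one variable changed."""
    updated = dict(options)
    updated[key] = value
    return updated

custom = set_option(preferences, "window_color", "blue")
print(preferences["window_color"], "->", custom["window_color"])  # grey -> blue
```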

Do we want, or need, such freedom? As the pioneer of interactive filmmaking Grahame Weinbren argued in relation to interactive media, making a choice involves a moral responsibility. By passing these choices to the user, the author also passes on the responsibility to represent the world and the human condition in it. (This is paralleled by the use of phone- or Web-based automated menu systems by all big companies to handle their customers; while the companies do this in the name of choice and freedom, one of the effects of this automation is that the labor is passed from the company's employees to the customer. If before a customer would get information or buy a product by interacting with a company employee, now she has to spend her own time and energy navigating through numerous menus to accomplish the same result.) The moral anxiety which accompanies the shift from constants to variables, from tradition to choices in all areas of life in contemporary society, and the corresponding anxiety of a writer who has to portray it, is well rendered in this closing passage of a short story by the contemporary American writer Rick Moody (the story is about the death of his sister):

I should fictionalize it more, I should conceal myself. I should consider the responsibilities of characterization, I should conflate her two children into one, or reverse their genders, or otherwise alter them, I should make her boyfriend a husband, I should explicate all the tributaries of my extended family (its remarriages, its internecine politics), I should novelize the whole thing, I should make it multigenerational, I should work in my forefathers (stonemasons and newspapermen), I should let artifice create an elegant surface, I should make the events orderly, I should wait and write about it later, I should wait until I'm not angry, I shouldn't clutter a narrative with fragments, with mere recollections of good times, or with regrets, I should make Meredith's death shapely and persuasive, not blunt and disjunctive, I shouldn't have to think the unthinkable, I shouldn't have to suffer, I should address her here directly (these are the ways I miss you), I should write only of affection, I should make our travels in this earthy landscape safe and secure, I should have a better ending, I shouldn't say her life was short and often sad, I shouldn't say she had demons, as I do too.

 

5. Transcoding

Beginning with the basic, material principles of new media - numerical coding and modular organization - we moved to deeper and more far-reaching ones - automation and variability. The last, fifth principle of cultural transcoding aims to describe what in my view is the most substantial consequence of media's computerization. As I have suggested, computerization turns media into computer data. While from one point of view computerized media still displays structural organization which makes sense to its human users - images feature recognizable objects; text files consist of grammatical sentences; virtual spaces are defined along the familiar Cartesian coordinate system; and so on - from another point of view, its structure now follows the established conventions of the computer's organization of data. Examples of these conventions are different data structures such as lists, records and arrays; the already mentioned substitution of all constants by variables; the separation between algorithms and data structures; and modularity.

The structure of a computer image is a case in point. On the level of representation, it belongs to the side of human culture, automatically entering into dialog with other images, other cultural semes and mythemes. But on another level, it is a computer file which consists of a machine-readable header followed by numbers representing the RGB values of its pixels. On this level it enters into a dialog with other computer files. The dimensions of this dialog are not the image's content, meanings or formal qualities, but file size, file type, the type of compression used, the file format and so on. In short, these dimensions are those of the computer's own cosmogony rather than of human culture.
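
The two-layer description given above can be made literal with a short Python sketch that writes a tiny image in the plain-text PPM format, which consists of exactly what the paragraph describes: a machine-readable header followed by numbers giving the RGB values of the pixels. The file name and pixel values are arbitrary.

```python
# Sketch: on the computer layer an image is a header plus RGB numbers;
# on the cultural layer a viewer displays it as a 2x2 colored picture.

width, height = 2, 2
pixels = [
    (255, 0, 0), (0, 255, 0),        # top row: red, green
    (0, 0, 255), (255, 255, 255),    # bottom row: blue, white
]

header = f"P3\n{width} {height}\n255\n"                   # machine-readable header
body = "\n".join(f"{r} {g} {b}" for r, g, b in pixels)    # the pixels as numbers

with open("tiny.ppm", "w") as f:
    f.write(header + body + "\n")
```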

Similarly, new media in general can be thought of as consisting of two distinct layers: the cultural layer and the computer layer. Examples of categories on the cultural layer are the encyclopedia and the short story; story and plot; composition and point of view; mimesis and catharsis; comedy and tragedy. Examples of categories on the computer layer are process and packet (as in data packets transmitted through the network); sorting and matching; function and variable; a computer language and a data structure.

Since new media is created on computers, distributed via computers, and stored and archived on computers, the logic of the computer can be expected to exert a significant influence on the traditional cultural logic of media. That is, we may expect that the computer layer will affect the cultural layer. The ways in which the computer models the world, represents data and allows us to operate on it; the key operations behind all computer programs (such as search, match, sort, filter); the conventions of HCI - in short, what can be called the computer's ontology, epistemology and pragmatics - influence the cultural layer of new media: its organization, its emerging genres, its contents.

Of course, what I call the computer layer is not itself fixed but changes over time. As hardware and software keep evolving and as the computer is used for new tasks and in new ways, this layer undergoes continuous transformation. The new use of the computer as a media machine is a case in point. This use is having an effect on the computer's hardware and software, especially on the level of the human-computer interface, which looks more and more like the interfaces of older media machines and cultural technologies: the VCR, the tape player, the photo camera.

In summary, the computer layer and the media/culture layer influence each other. To use another concept from new media, we can say that they are being composited together. The result of this composite is the new computer culture: a blend of human and computer meanings, of the traditional ways in which human culture modeled the world and the computer's own means of representing it.

Throughout the book, we will encounter many examples of the principle of transcoding at work. For instance, The Language of Cultural Interfaces section will look at how the conventions of the printed page, cinema and traditional HCI interact in the interfaces of Web sites, CD-ROMs, virtual spaces and computer games.

The Database section will discuss how a database, originally a computer technology for organizing and accessing data, is becoming a new cultural form of its own. But we can also reinterpret some of the principles of new media already discussed above as consequences of the transcoding principle. For instance, hypermedia can be understood as one cultural effect of the separation between an algorithm and a data structure, essential to computer programming. Just as in programming algorithms and data structures exist independently of each other, in hypermedia the data is separated from the navigation structure. (For another example of the cultural effect of the algorithm-data structure dichotomy, see the Database section.) Similarly, the modular structure of new media can be seen as an effect of the modularity of structured computer programming. Just as a structured computer program consists of smaller modules which in their turn consist of even smaller modules, a new media object has a modular structure, as I explained in my discussion of modularity above.

In new media lingo, to transcode something is to translate it into another format. The computerization of culture gradually accomplishes a similar transcoding in relation to all cultural categories and concepts. That is, cultural categories and concepts are substituted, on the level of meaning and/or language, by new ones which derive from the computer's ontology, epistemology and pragmatics. New media thus acts as a forerunner of this more general process of cultural re-conceptualization.

Given the process of conceptual transfer from the computer world to culture at large, and given the new status of media as computer data, what theoretical framework can we use to understand it? Since on one level new media is old media which has been digitized, it seems appropriate to look at new media from the perspective of media studies. We may compare new media and old media, such as print, photography, or television. We may also ask about the conditions of distribution and reception and the patterns of use. We may also ask about similarities and differences in the material properties of each medium and how these affect their aesthetic possibilities.

This perspective is important, and I use it frequently in this book, but it is not sufficient. It cannot address the most fundamental new quality of new media, which has no historical precedent - programmability. Comparing new media to print, photography, or television will never tell us the whole story. For while from one point of view new media is indeed another type of media, from another it is simply a particular type of computer data, something which is stored in files and databases, retrieved and sorted, run through algorithms and written to an output device. That the data represents pixels and that this device happens to be a screen is beside the point. The computer may perform perfectly the role of the Jacquard loom, but underneath it is fundamentally Babbage's Analytical Engine - after all, this was its identity for one hundred and fifty years. New media may look like media, but this is only the surface.

New media calls for a new stage in media theory whose beginnings can be traced back to the revolutionary works of Harold Innis and Marshall McLuhan in the 1950s. To understand the logic of new media we need to turn to computer science. It is there that we may expect to find the new terms, categories and operations which characterize media that became programmable. From media studies, we move to something which can be called software studies; from media theory to software theory. The principle of transcoding is one way to start thinking about software theory. Another way, which this book experiments with, is to use concepts from computer science as categories of new media theory. The examples here are interface and database. And, last but not least, I follow the analysis of the material and logical principles of computer hardware and software in this chapter with two chapters on the human-computer interface and the interfaces of the software applications used to author and access new media objects.

 

This text is an excerpt from The Language of New Media by Lev Manovich.