Thursday, March 27, 2014

Review of Software Takes Command by Lev Manovich (Bloomsbury, 2013)

In Software Takes Command, Lev Manovich provides a compelling account of how all forms of cultural media have come to be produced through software.  In so doing, he contends:

‘[s]oftware has become our interface to the world, to others, to our memory and our imagination - a universal language through which the world speaks, and a universal engine on which the world runs’ (p. 2).

Such arguments have been made in the nascent software studies literature for a number of years, with proponents suggesting that given the extent to which software now conditions everyday life it deserves to be examined in its own right as a significant actant and theoretical category (e.g., Fuller 2008, Chun 2011, Kitchin and Dodge 2011).  As Manovich puts it in a book proposal co-written with Benjamin Bratton in 2003,

‘[if] we don’t address software itself, we are in danger of always dealing only with its effects rather than the causes: the output that appears on a computer screen rather than the programs and social cultures that produce these outputs’ (p. 9).

As he notes, such studies are concerned with questions such as: what is the nature of software?  ‘[w]hat is “media” after software?’ (p. 4), ‘what does it mean to live in a “software society”?’ and ‘what does it mean to be part of “software culture”?’ (p. 6).  He seeks to answer such questions through an in-depth genealogical study of the ‘softwarization’ of cultural media - art, photos, film, television, music, etc. - that has been occurring since the 1970s, tracing out the simulation and extension of analogue techniques in software such as Photoshop, as well as the creation of entirely new techniques.

Through a series of theoretically informed and empirically rich chapters, Manovich reflects on how different media became thoroughly infused with software, how this altered different practices, and how to make sense of software’s effects.  He persuasively argues that softwarization has led to the formation of a new ‘metamedium’ in which previously separate media - both already existing and not yet invented - become fused.  This metamedium is composed of algorithms and data structures, together with techniques that are general purpose (such as cut-and-paste) and those that are media-specific, which combine to produce various forms of ‘hybridity’ and ‘deep remixability’.

Moreover, ‘[u]nited within the common software environment the languages of cinematography, animation, drawing, computer animation, special effects, graphic design, typography, drawing, and painting, have come to form a new metalanguage’ (p. 268).  Further, given the partial and provisional nature of software - always being updated and patched, always processing data - he contends that software produces a world of permanent change and flux.  He concludes that ‘[t]urning everything into data, and using algorithms to analyze it changes what it means to know something.  It creates new strategies that together make up software epistemology.’  We are only just starting to make sense of such an epistemology.

Given the logic and power of the argument put forward, it is relatively straightforward to begin to translate Manovich’s argument and approach to other domains.  Software, after all, has gradually been infusing the practices of work, science, home life, communication, consumption, travel, and so on.  Indeed, as I read the text I started to sketch out a potential project tracing how maps have become software, producing a genealogy of geospatial media.  It will be interesting to see such translations being made and for the theory to be fleshed out as it encounters new scenarios and phenomena.  My view, however, is that such translations need to be broader and more ambitious in their scope.

Whilst Manovich is undoubtedly right that software is a key metamedium utilising new metalanguages that are reshaping cultural practices, the analytical framing adopted over-fetishizes code at the expense of its wider assemblage of production and use.  This is because his proposed approach is quite narrowly framed.  He argues: ‘To understand media today we need to understand media software - its genealogy (where it comes from), its anatomy (interfaces and operations), and its practical and theoretical effects’ (p. 125).

However, drawing on my own work on data (Kitchin, in press), we need to be careful not to lose sight of the fact that software is bound up in a whole suite of discursive and material practices and structures, including:

•    systems of thought (philosophies, theories, models, ideologies, etc.)
•    forms of knowledge (manuals, papers, magazines, websites, experience, word of mouth, etc.)
•    finance (business models, investment, venture capital, grants, philanthropy, etc.)
•    political economies (policy, tax regimes, public and political opinion, ethical considerations, etc.)
•    governmentalities and legalities (data standards, system requirements, protocols, regulations, laws, licensing, intellectual property regimes)
•    materialities and infrastructures (computers, databases, networks, servers, etc.)
•    practices (techniques, learned behaviours, scientific conventions, etc.)
•    organisations and institutions (corporations, consultants, manufacturers, retailers, government agencies, universities, conferences, clubs and societies, etc.)
•    subjectivities and communities (of data producers, managers, analysts, scientists, politicians, users, etc.)
•    places (labs, offices, field sites, data centres, business parks, etc.)
•    marketplaces (for software, data, coders, etc.).

Understanding software, then, I would contend, requires placing it within the wider context that shapes how it is conceived, produced, and used in often quite messy, contingent, and relational ways.  Manovich rightly contends that software is a new ‘medium in which we can think and imagine differently’ (p. 13), but we should not fall into the trap of over-fetishizing and decontextualizing it; software is enmeshed in complex assemblages that have to be recognized and understood if we are to make full sense of how it makes a difference.  Nevertheless, Software Takes Command is a very good starting point for such a journey.

Chun, W.H.K. (2011) Programmed Visions: Software and Memory. MIT Press, Cambridge, MA.

Fuller, M. (ed.) (2008) Software Studies: A Lexicon.  MIT Press, Cambridge, MA.

Kitchin, R. (2014, in press) The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences.  Sage, London.

Kitchin, R. and Dodge, M. (2011) Code/Space: Software and Everyday Life.  MIT Press, Cambridge, MA.
