CCTP 797: Technology / Theory / Culture
Professor Martin Irvine
[The syllabus is being revised for January 2015]
This course will provide an interdisciplinary overview of the major philosophies and theories of technology and media with a focus on the interdependencies between the social and technological domains. Students will be introduced to the major methodological and theoretical issues in this field, and will learn how to practice self-reflexive theory for the interdisciplinary work of CCT. We will recover the background of problems in the philosophy of science and the construction of knowledge, the history of the meaning of “technology” in our culture, cultural semiotics, recent developments in cognitive science, and systems and network theory developed in Mediology and Actor-Network theory. We will study both the “cybercultural imaginary”--the imagery and ideology of technology in popular culture--and major technical systems designs--computers and operating systems, the Internet and Web, and the technology of space in architecture and cities. A major motivation in the course will be “de-blackboxing” technologies, theories, and social environments: we will attempt to crack open what is normally relegated to a closed, inaccessible "black box," like the esthetically sealed i-devices (“I don’t know how it works, it just does”) and the invisible generative/reproductive processes of institutions, ideologies, and disciplines.
The Seminar Blog and Course Grading
The course will be conducted as a real-time seminar with a Web-only syllabus, weekly case studies, and weekly student discussion on the seminar WordPress site (see the pages on weekly essay instructions and the seminar method and grading). For a final seminar research project, students will write a rich-media essay posted in the course site, which will become a publicly accessible web publication.
- Katie Hafner and Matthew Lyon, Where Wizards Stay Up Late: The Origins of the Internet. New York, NY: Simon & Schuster, 1998. [ISBN 0684832674]
- William Gibson, Neuromancer. Ace / Penguin, 1984. [ISBN 0441007465] Begin reading during first week.
- ----------. Pattern Recognition. New York, NY: Putnam / Penguin, 2003. [ISBN 0425192938]
- Luciano Floridi, Information: A Very Short Introduction. Oxford, UK: Oxford University Press, 2010. [ISBN 0745645720]
- Lev Manovich, Software Takes Command: Extending the Language of New Media. London; New York: Bloomsbury Academic, 2013. [ISBN 1623567459]
- Neal Stephenson, Snow Crash. New York, NY: Bantam Spectra, 1992. [ISBN 0553380958]
- Regis Debray, Media Manifestos. Trans. Eric Rauth. London and NY: Verso, 1996.
- George Dyson, Turing’s Cathedral: The Origins of the Digital Universe. New York, NY: Pantheon, 2012.
- Bruno Latour, Reassembling the Social: An Introduction to Actor-Network-Theory. Oxford University Press, USA, 2005.
- Lev Manovich, The Language of New Media. Cambridge, MA: MIT Press, 2002.
- Neal Stephenson, The Diamond Age, or, A Young Lady’s Illustrated Primer. New York, NY: Bantam Spectra, 1995.
- Neal Stephenson, Cryptonomicon. New York, NY: Avon Books, 1999.
- Noah Wardrip-Fruin and Nick Montfort, eds. The New Media Reader. Cambridge, MA: MIT Press, 2003.
Introductory Lecture and Discussion:
What are we talking about when we talk about "technology" and "culture"?
Booting the Seminar
- Overview: Methods, grading, the seminar Wordpress site, weekly participation and presentations.
Introductions and Course Background by Prof. Irvine:
- Technology Theory: Introduction and Orientation (introductory essay from book in progress)
- Overview of major themes and approaches in the seminar (presentation):
- Technologies as ongoing combinations of implementable abstract models (from tools and writing to complex machines, computation, and software-based media)
- All dimensions of mediation and interface: social, technical, cultural, political, economic
- The mediating functions of computers, software, pan-digital platforms: media, mediations, and computer devices as a metamedium
- The continuum of technical implementations and cognitive externalizations: from language and symbolic capabilities to writing, the cumulative histories of media forms, and computer code.
- The analog-digital-analog continuum: contemporary experience of embedded and externalized media and computational technologies
Background Readings (we will return to these texts later):
- Leo Marx, "Technology: The Emergence of a Hazardous Concept." Technology and Culture, July, 2010.
- Regis Debray, " What is Mediology?" Le Monde Diplomatique, Aug., 1999. (Trans. Martin Irvine)
- Smith and Riley, Cultural Theory, Introduction, 1-5; survey the contents of the book.
- Lev Manovich, Software Takes Command, Introduction, 1-51.
Introductory Case Studies: examples from everyday technologies and consumer products
- Digital intelligence and computer design in almost everything: the Coke bottle.
- Supply chain management and logistics: Video overview of SCM | Materials handling | Amazon's fulfillment centers | Amazon fulfillment technology |
- i-Everything: the complex of technologies behind the iPhone and iPad.
- iPhone as a case study of bundled functionality achieved through hybrid, cumulative combinations of technologies.
Useful Tools for Doing Research: For this Course and CCT
An orientation to key theories and philosophies of technology and media over the past 70 years. Looking for the ideologies and major assumptions driving theories of technology, especially recent "new" technology, and the role of media technologies. Introductory familiarity with issues of technological determinism, the problem of agency attributions, and technologies and social networks.
Learning to Read for the Argument
An introduction to learning how to "read for the argument": how to work with the major assumptions and outcomes of a theory or philosophy without getting bogged down or overwhelmed by the length or complexity of the reading. We work in a universe or network of arguments, ideas with consequences, and we always begin by jumping in mid-stream and finding our way in the history of ideas and arguments. I will teach you how to discover the key "takeaways" (positions, schools of thought, major assumptions) in an essay, article, or book so that you can begin building your own knowledge repertoire for working with anything else you read.
One of the difficulties in surveying the writing and arguments of many disciplines comes from the "boundary work" each field does to demarcate a legitimate field of study and theorizing and to posit the objects with which it is concerned. Each field has multiple, often rival, vocabularies and arguments justifying its claims and its legitimate knowledge boundaries. It is important to learn the key "universes of discourse" and begin to assess how the approaches can be used and what analytical or theoretical work they can do.
Backgrounds on the Concepts and History of "Technology"
- Martin Irvine, "Technology Theory: An Introduction" (Intro essay, book chapter in progress)
- Wikipedia, "History of Technology," for a paradigmatic presentation of the traditional background.
Note the trajectory of a "story" with a teleology from early tool making to contemporary technologies. The "emplotment" of a progressive narrative of technology is one of the main ideologies of the modern (and postmodern) world.
- Leo Marx, "Technology: The Emergence of a Hazardous Concept." Technology and Culture, July, 2010.
Note the shifting semantic space for the word technology:
Tools, machines, industrialization and automation of machines, electronics and computers: concepts of science, engineering, technology.
Compare: Technik (German) and technique (French): human control of nature in a built environment, tools and machines for production.
Tools vs. applied science (instrumental vs. applied knowledge view of "technology")
"Technology" became socially constructed as an autonomous domain.
- Langdon Winner, "Do Artifacts Have Politics?" Excerpt from The Whale and the Reactor: A Search for Limits in an Age of High Technology. Chicago: University of Chicago Press, 1986, 19-39.
- Bruno Latour, How to Write The Prince for Machines (1988) (excerpt).
This is a difficult essay to jump into, since Latour is playing off the themes of a famous book about political power and showing that technology and science distribute social power through distributed networks of agency. We will return to these ideas later, but for this week, we need to consider the Actor-Network model for describing technology and agency.
- Useful short background: Darryl Cressman, "A Brief Overview of Actor-Network Theory" (2009).
McLuhan and Media Theory to Remediation (1960s-2000s)
- Marshall McLuhan, "The
Medium is the Message" (Excerpts from Understanding
Media, The Extensions of Man. 2nd
Edition, Introduction and Chap. 1 through p.19)
- McLuhan's arguments became a launchpad for huge debates and research approaches on media and technology since the late 1960s. What are the consequences of adopting his terms and definitions of "medium" and "extensions of the body"? What about the determinist assumptions?
- Technological or Media Determinism (Daniel Chandler): Overview of issues, survey for background.
- Regis Debray, "What is Mediology?" Le Monde Diplomatique, Aug., 1999. (Trans. Martin Irvine)
- Jay David Bolter and Richard Grusin. Remediation: Understanding New Media. Cambridge, MA: The MIT Press, 2000. Excerpts: Intro | Chap. 1.
Computation and Software
- Lev Manovich, Software Takes Command, Introduction, 1-51.
- What kinds of discourses have we inherited about technology (machines, artefacts) and media technologies specifically? Extrapolating from this week's readings, what issues are still current in thinking about computers, information, human interaction?
Big questions to begin gathering intellectual resources for: Technology and agency? What presupposed (unacknowledged) theory of technology do you find in the Apple i-device world? Utopian? Re-mediated? Instrumental? Do technologies get bundled with political or moral determinations?
What kind of technical artefact is a tablet computing device or smart phone? What cultural/social forces compel us to keep devices "black-boxed"? (We will investigate this question more fully in the following weeks.)
Begin weekly blog discussions
Learning Objectives and Discussion Questions
Learning the major conceptual frameworks for communication and information theory and ways to critique inherited models to advance more satisfying theory for our communication and media environment today.
Communication and information theory from the 1950s-80s is widely taken for granted in discussions of media, technology, and computation. Originally developed as models for transmitting electronic signals, these theories have also informed models of meaning transmission in communication and in culture more broadly. We need to investigate the main assumptions, then ask what the models leave out for understanding meaning and the social-cultural uses of messages: for example, the networks of semantic and symbolic meaning that are understood and encoded in signal mediums but are not properties of their material form; transmission through time and across different contexts; the limits of linear, one-way models; and larger questions about production and reception contexts, which are more like networks than point-to-point connections. What kind of models can explain all the communication "modalities": one-to-one, one-to-many, many-to-one, many-to-many, and dialog (one-other, mutually assumed); synchronous (at the same time) and asynchronous (delays in reception-response, over a short time span or a very long one)?
Readings: Models of Communication and Information and Their Consequences
- Luciano Floridi, Information, Chapters 1-4. [For background on the main traditions.]
- James Gleick, Excerpts from The Information: A History, a Theory, a Flood. (New York, NY: Pantheon, 2011). Excerpts from Introduction and Chap. 7. [Readable background on the history of information theory.]
- Ronald E. Day, "The ‘Conduit Metaphor’ and the Nature and Politics of Information Studies." Journal of the American Society for Information Science 51, no. 9 (2000): 805-811.
Models and metaphors for "communication" have long been constrained by "transport", "conduit," and "container/content" metaphors that provide only one map of a larger process. How can we describe "communication" and "meaning" in better ways that account for all the conditions, contexts, and environments of "meaning making"? How do network metaphors disrupt the linear point-to-point metaphors?
- Models of the Communication Process and An Ecological Model of Communication
(Davis Foulger, Brooklyn College/CUNY)
Good overviews of the history of approaches to communication and a more inclusive ecological systems model. Note the context of the original transmission models and the further, more complex models. Note that some models come from rhetorical thought, others from engineering and the information sciences, and others from sociology and semiotics.
- Irvine, Conceptual Models in Communication Theory: Communication Models and Methods (presentation; will discuss in class)
Communication and Culture: Beginnings of a New Paradigm
- James Carey, "Communication
and Culture" (from Communication as Culture: Essays on Media and Society, 1992) [pdf]. Summary
of Carey's Views (excerpts)
[An influential essay by a leader in the modern field of "Communication" that repositions the study of communication and mediating technologies in the cultural, ideological, and economic context.]
- Stuart Hall, "Encoding/Decoding" (first published, 1973). An important revision of communication theory with the semiotics of cultural codes and meaning systems. Influential in many fields.
Supplementary Background, Bibliography, and Sources
- Primary sources:
- Claude Shannon, "A Mathematical Theory of Communication." The Bell System Technical Journal, Vol. 27, pp. 379-423, 623-656, July and October, 1948.
- Warren Weaver, "Recent Contributions to the Mathematical Theory of Communication." An introduction to Shannon's theory and article. From: Shannon, Claude E., and Warren Weaver. The Mathematical Theory of Communication. Champaign, IL: University of Illinois Press, 1949.
- Craig, Robert T. "Communication Theory as a Field." Communication Theory 9, no. 2 (1999): 119-161.
- Jeremy Norman's History of Information Site (wealth of primary source backgrounds)
See especially: Timeline of artefacts in the history of Information and Communication.
The signal-code-transmission model of information theory has been the foundation of signal and message transmission systems in all communication technology from the telegraph to the Internet. But is the signal-code-transmission model an adequate model for describing meaning, the semantic, social, and cultural significance of encoded signals?
What is "communicated" in a communication act or event? How are message "contents" abstractable in transmission?
Where and when are "meanings" in communication and information? What are the conceptual consequences of the content-container-transport/conduit models of communication? Can we model the contexts and environments of meaning not explicitly stated in the "information" of a "transmitted message"?
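The distinction these questions point to can be made concrete in a few lines of code: Shannon's measure of information depends only on the statistics of the signal, never on what the message means. A minimal sketch (the function and example are our own illustration, not from the readings):

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits (Shannon's H)."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Two "messages" with identical symbol statistics but very different meaning:
print(shannon_entropy("to be or not to be"))
print(shannon_entropy("eb ot ro eb to ton"))  # same letters, scrambled
```

Because the scrambled string has the same symbol frequencies, the two entropies are identical: the model quantifies signal uncertainty, and semantic, social, and cultural significance lies entirely outside it.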
Examples and cases:
How do we know what an email message means? What is "communication" in an art object: are symbolic artefacts "communications"? What is missing in the "information theory" model in accounting for messages and meanings? Where are the meanings in our understanding of cultural artefacts?
Student weekly blog discussions
From Language and Symbolic Systems to Code and Cognitive Technologies
The human capacity for language and symbolic thought is the starting point of many disciplines and research programs in all aspects of communication, symbolic culture, and computation.
Much work in software models and code depends on the major discoveries in modern linguistics and the cognitive sciences for symbolic processing. Research in the cognitive sciences and linguistics is now a major component of anthropology, evolutionary biology and neurology, computation and artificial intelligence, semiotics, and communication theory. Familiarity with the concepts, terms, and assumptions of contemporary linguistics is thus essential for discussing and describing all other symbolic combinatorial systems that use language or function as language-like systems.
We can only do a top-level overview here, but with some familiarity in the problems and core concepts, you can advance to topics of interest in your own research.
Read at least through the Introductions readings section, and others as time or interest allows.
Language as a Symbolic Cognitive System: The Linguistics Method
- Irvine, "Linguistics: Key Concepts" (start here)
- From Andrew Radford, et al. Linguistics: An Introduction. 2nd ed. Cambridge, UK: Cambridge University Press, 2009.
Includes the Table of Contents for the whole book so that you can see the contents of a typical recent textbook (following a post-Chomsky approach).
Review the "Introduction" to Linguistics as a field, and sections on Words (lexicon) and Sentences (grammatical functions and syntax). Read and scan enough in each section to gain familiarity with the main concepts.
- Steven Pinker, "How Language Works." Excerpt from: Pinker, The Language Instinct: How the Mind Creates Language. New York, NY: William Morrow & Company, 1994: 83-123.
Accessible introduction to central issues by a leading cognitive linguist.
- Terrence W. Deacon, The Symbolic Species: The Co-evolution of Language and the Brain. New York, NY: W. W. Norton & Company, 1998. Excerpts from chapters 1 and 3.
Influential work and argument for language and symbolic cognition co-evolving with the human brain. Read for his main argument about language, symbols, and brain evolution.
- Advanced: The Details of Core Linguistic Theory and Method (as reference or as time allows)
- Noam Chomsky, "Form and meaning in natural languages." Excerpt from Language and Mind, 3rd. Edition. Cambridge University Press, 2006.
[This exposition is from the middle period of Chomsky's thought, and provides the "standard theory" of generative linguistics that now has many branches, modifications, and elaborations. If you work through this, you'll have a sense of what motivates much linguistic research on the central question of "what is language" right up to today. Note: Chomsky focuses on syntax, and does not include important questions in semantics and pragmatics.]
- Noam Chomsky, Aspects of the Theory of Syntax. Cambridge, MA: MIT Press, 1965. From Chapter 1: Excerpt in pdf.
[This was the book that launched a new way to do linguistics with a syntax-centered model.]
- Ray Jackendoff, “Précis of Foundations of Language: Brain, Meaning, Grammar, Evolution,” Behavioral and Brain Sciences 26, no. 6 (2003): 651-665.
[Scan for the basic issues and key arguments. Very detailed and complex argument, based on many years of research and debate in the core issues of linguistics. This essay is an outline of the main argument and topics in Jackendoff's major work, Foundations of Language (2003).]
Syntax Tool: Visualization of Sentence Structure and Computational Parsing
- XLE-Web: parsing tool and tree generator for syntax based on the "Lexical Functional Grammar" model of generative grammar. Choose "English" and map any sentence for its constituent (c- ) and functional (f- ) structure! (Uses linguistic notation from two formal systems.) Syntax parsers are used in computational linguistics and all complex text analysis in software and network applications. Web search algorithms and Siri voice recognition and interpretation have to use parsers for generating the probable grammatical structure of natural language phrases.
- Try the test sentence "I like dark beer but dark beer doesn't like me." You will see how the parser generates several grammatical structure options because of the ambiguities in switching subject and object with the verb [like]. Then try any sentence; try some very complex ones (compound subjects, many conjunctions and clauses, etc.).
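The ambiguity the parser exposes can also be demonstrated computationally. Below is a toy CYK chart parser over a tiny hand-written grammar (our own illustrative grammar, not the Lexical Functional Grammar formalism XLE uses) that counts the distinct parse trees for the classic prepositional-phrase attachment ambiguity:

```python
from collections import defaultdict

# Toy grammar in Chomsky normal form: binary rules plus a lexicon.
BINARY = [("S", "NP", "VP"), ("VP", "V", "NP"), ("VP", "VP", "PP"),
          ("NP", "NP", "PP"), ("NP", "Det", "N"), ("PP", "P", "NP")]
LEXICON = {"I": ["NP"], "saw": ["V"], "the": ["Det"],
           "man": ["N"], "telescope": ["N"], "with": ["P"]}

def count_parses(words, goal="S"):
    """CYK chart parser that counts distinct parse trees per constituent."""
    n = len(words)
    chart = defaultdict(int)  # (start, end, symbol) -> number of trees
    for i, w in enumerate(words):
        for sym in LEXICON[w]:
            chart[(i, i + 1, sym)] = 1
    for span in range(2, n + 1):
        for start in range(n - span + 1):
            end = start + span
            for mid in range(start + 1, end):
                for parent, left, right in BINARY:
                    chart[(start, end, parent)] += (
                        chart[(start, mid, left)] * chart[(mid, end, right)])
    return chart[(0, n, goal)]

sentence = "I saw the man with the telescope".split()
print(count_parses(sentence))  # -> 2: the PP attaches to the verb or the noun
```

The two parses correspond to two meanings (the seeing was done with the telescope, or the man has the telescope), which is exactly why real parsers must generate multiple structures and rank them by probability.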
Student weekly blog discussions
What is language? What is "a language"? What are the implications of using language as a system as the model for all other symbolic systems (visual, audio, combinations) and most forms of communication and media? Do other symbolic systems that we use for expression (visual, music, multimedia combinations) work like a language (combinations by rules that precede the expression)?
Inventing Computation: Key Ideas
This unit focuses on the key concepts in computation as the core model for software, computer and information design, and all digital media. (We will study the computer industry later in the seminar.) We will approach the questions from a non-specialist perspective, but it's important for everyone to get a conceptual grasp of the core ideas in computation because they are now pervasive throughout many sciences (including the cognitive sciences), and are behind everything we do daily with computational devices, information processing, and digital media (for example, what is happening in the Google algorithm in searches).
Readings and Background
- Irvine, An Introduction to Computational Concepts
- Daniel Hillis, The Pattern On The Stone: The Simple Ideas That Make Computers Work. New York: Basic Books, 1999. (excerpts from chaps. 1-2).
- Andrew Hodges, Turing (The Great Philosophers series). London: Phoenix, 1997; New York: Routledge, 1999.
[Web version of Hodges' short, accessible book about Alan Turing's main ideas. In a fundamental sense, all computers and digital devices are "Turing Machines".]
- Background on John von Neumann's computation models:
Project: Lessons on Udacity and Codecademy
- Create an account on both sites and let the tutorials prompt you through the lessons. Go as far as you can.
- Udacity: http://www.udacity.com
Sign up for this course on intro to Computer Science: http://www.udacity.com/course/cs101
- Codecademy: http://www.codecademy.com
Sign up for the self-paced tutorial lessons on the Python programming language "track": (http://www.codecademy.com/tracks/python)
- and if you have time and interest, try the Web Fundamentals "track":
(other tracks: http://www.codecademy.com/tracks).
- These lessons will guide you through some basic computing concepts and also let you write some basic code and see the results when it runs.
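For a sense of what the first lessons cover, here is the kind of small program you will be able to write after a tutorial or two (a hypothetical example of ours, not taken from either site): variables, lists, loops, and a function definition.

```python
def word_lengths(words):
    """Return the length of each word in a list."""
    lengths = []                      # start with an empty list
    for word in words:                # loop over each item
        lengths.append(len(word))     # compute and store its length
    return lengths

course_terms = ["medium", "interface", "code", "metamedium"]
print(word_lengths(course_terms))  # -> [6, 9, 4, 10]
```

Even a fragment this small already illustrates the core computational ideas the unit emphasizes: symbols, procedures, and step-by-step transformation of represented data.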
Supplementary and Advanced Readings
- Charles Schmidt (Rutgers University), Computation and Cognition (online readings for course)
- Seymour Papert on Warren McCulloch: Introduction to Embodiments of Mind (MIT Press, 1965)
- Harvey Cragon, "The Von Neumann Machine," excerpt from Computer Architecture and Implementation. Cambridge University Press, Cambridge. pp. 1-13.
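Hodges' point that all computers are, at bottom, "Turing Machines" can be made tangible in a few lines of code. Below is a minimal sketch of a one-tape machine; the program format and the bit-flipping example are our own illustration, not from the readings:

```python
def run_turing_machine(program, tape, state="start", steps=1000):
    """Minimal one-tape Turing machine: program maps (state, symbol)
    to (new_state, symbol_to_write, head_move)."""
    tape = dict(enumerate(tape))  # sparse tape; "_" is the blank symbol
    head = 0
    for _ in range(steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")
        state, tape[head], move = program[(state, symbol)]
        head += {"R": 1, "L": -1}[move]
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# A toy program: flip every bit, then halt at the first blank cell.
flip_bits = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_turing_machine(flip_bits, "1011"))  # -> "0100"
```

The machine is nothing but a finite table of state transitions plus an unbounded tape, yet Turing showed that this scheme can compute anything any digital computer can, which is the conceptual bridge to the von Neumann stored-program designs discussed above.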
Student weekly blog discussions
Symbolic Cognition, Sign Systems, Mediation > Cognitive Technologies
Within a broad cluster of fields--ranging from neuroscience to cognitive linguistics, cognitive anthropology, and computational models of cognition and artificial intelligence research--there has been a major convergence on questions and interdisciplinary methods for studying cognition, human meaning-making and the "symbolic faculty" generally, including all our cumulative mediations and externalizations in "cognitive technologies." Cognitive science has been closely related to computer science in seeking computational models for brain and cognitive processes, and proposing hypotheses that explain cognition as a form of computation.
Many disciplines now converge around the major question of how our meaning systems function with analogous and parallel "architectures" that must include (1) rules for combinatoriality of components (an underlying syntax for forming complex and recursive expressions of meaning units), (2) intersubjective preconditions "built-in" to the meaning system for collective and shared cognition, and (3) material symbolic-cognitive externalizations (e.g., writing, images, artefacts) transmitted by means of "cognitive technologies" (everything from writing to digital media and computer code) which enable human cultures and cultural memory. This recent interdisciplinary research is a "game changer" for the way we think about human communication and media technologies.
In one view, the human symbolic faculty has generated a continuum of functions from language and abstract symbolic thought to machines, media technologies, and computation:
Language > Symbol Combinatoriality > Abstraction > Mathematics > Machines > Computation
The mainstream disciplines in communication and media studies are very conservative (remaining within a demarcated field in the humanities and social sciences) and have not yet incorporated recent advances in cognitive science fields that are directly relevant to core assumptions and research questions on language, symbolic culture, and media technologies. We therefore have an open opportunity to learn from cognitive science fields and reconfigure the inter- and transdisciplinary field in promising ways for both theoretical and applied work.
Related Principles in all Symbolic Systems and Technologies: Combinatoriality, Compositionality, Componentiality, Recursion, Externalized Memory in symbols and artefacts, Intersubjective and Collective foundation of meaning
Major research programs and approaches to know about in the cognitive science disciplines:
- Terrence Deacon: human evolution to the "symbolic species", and hypotheses for symbolic cognitive abilities in multiple signs systems (from language to writing, images, and complex artefacts).
- Edwin Hutchins and Andy Clark on the model of "distributed cognition" and "extended mind" in artefacts, cognitive technologies, material sign systems, and culture more broadly.
- Neuro-evolutionary biology of cognition: Merlin Donald on being human as being a social symbol user.
- Semiotics and sign systems: concepts and descriptions for the structures of meaning systems generally.
Introductions and Orientations
- Irvine, "The Grammar of Meaning Making: Introduction to Sign Systems and Symbolic Cognition." (draft of book chapter in progress. Read sections 1-4.)
- Kate Wong, “The Morning of the Modern Mind: Symbolic Culture.” Scientific American 292, no. 6 (June 2005): 86-95.
[Read this as an archaeological-evolutionary follow-up to Terrence W. Deacon, The Symbolic Species: The Co-evolution of Language and the Brain. New York, NY: W. W. Norton & Company, 1998 (from prior week).]
- Andy Clark, Mindware: An Introduction to the Philosophy of Cognitive Science. New York: Oxford University Press, 2001. (Excerpts from Introduction, Chapters 1-2)
- James Hollan, Edwin Hutchins, and David Kirsh, “Distributed Cognition: Toward a New Foundation for Human-computer Interaction Research.” ACM Transactions, Computer-Human Interaction 7, no. 2 (June 2000): 174-196.
[Edwin Hutchins and colleagues have pioneered research into "distributed cognition," and this work has important implications for our concepts of cognitive technologies, symbols, media, and interfaces.]
- Zhang, Jiajie, and Vimla L. Patel. “Distributed Cognition, Representation, and Affordance.” Pragmatics & Cognition 14, no. 2 (July 2006): 333-341.
[This is a useful, short summary of central issues in this field of research, accessible to non-specialists.]
- Irvine, "Cognition, Meaning, Symbol" (presentation, background overview of concepts; for class discussion also)
- Background Readings (for reference and as time allows for this week):
- Merlin Donald, "Evolutionary Origins of the Social Brain," from Social Brain Matters: Stances on the Neurobiology of Social Cognition, ed. Oscar Vilarroya et al. Amsterdam: Rodopi, 2007.
- Colin Renfrew, “Mind and Matter: Cognitive Archaeology and External Symbolic Storage.” In Cognition and Material Culture: The Archaeology of Symbolic Storage, edited by Colin Renfrew, 1-6. Cambridge, UK: McDonald Institute for Archaeological Research, 1999.
[Important argument to supplement Merlin Donald's view about the evolutionary origins of the symbolic brain: material culture is part of the externalizing cognitive process.]
Student weekly blog discussions
- Building on the past two weeks' topics, discuss a way to describe the forms of our symbolic activity as organized in an everyday media form: a software app, a film/video/TV shot or sequence (not a whole movie/video), a series of photographs, a piece of music. Can you describe the "distributed cognition" / "collective symbolic cognition" functions? Does the view of computer and media technologies as "cognitive technologies" or "symbolic-cognitive artefacts" provide a better understanding of these technologies?
Gaining a foundation in the influential theories and interdisciplinary approaches for the study of media, symbolic mediation, and technical mediation that have shaped research and study from the 1960s to the present. What is "new" about new media, software controlled media, and network mediation, and what aspects can we understand as a continuum in re-combinatorial functions and processes? How can we best understand, describe, and analyze the concepts and implementations for interfaces and multimodal forms? How do interfaces go "meta" in combinatoriality and presentation frameworks (computer devices and digital display screens as a metamedium, a medium for other media)?
Key Traditions and Concepts in Media Theory
- Irvine, "Media Theory: An Introduction" (Intro essay from working draft of book chapter)
- (Revisit) Marshall McLuhan, "The Medium is the Message" (Excerpts from Understanding Media: The Extensions of Man. 2nd Edition). Recall McLuhan's main concepts and arguments for media.
- Supplementary Background:
Daniel Czitrom, "Metahistory, Mythology, and the Media: The American Thought of Marshall McLuhan," excerpt from Daniel J. Czitrom, Media and the American Mind: From Morse to McLuhan. Chapel Hill, NC: University of North Carolina Press, 1982; read pp.172-182.
From Media to Systems of Mediation: The Mediological Approach (Review as Needed)
- Regis Debray, "What is Mediology?" Le Monde Diplomatique, Aug., 1999. Trans. Martin Irvine.
- Regis Debray, Transmitting Culture, trans. Eric Rauth. New York: Columbia University Press, 2000.
Excerpts in pdf: From Chaps. 1-2; from Chap. 7, "Ways of Doing."
- Review of Transmitting Culture: Constantina Papoulias, "Of Tools and Angels: Regis Debray's Mediology," Theory, Culture & Society, 21/3 (2004): 165-70.
- Irvine, Intro to Mediology and Actor Network Theory: How To Hack Black Boxes (CCTP-505 presentation)
- Irvine, Working with Mediology and Network Theory (introductory essay, review in updated version)
- Background to mediology and mediation theory:
Frédéric Vandenberghe, "Régis Debray and Mediation Studies, or How Does an Idea Become a Material Force?" Thesis Eleven, 89, May 2007: 23-42. [Detailed philosophical critique of mediology and traditions of thought forming Debray's context.]
Post-Digital Media, Mediation, Computer Interfaces, and Metamedia
- Alan Kay and the Dynabook Concept as a Metamedium:
Alan Kay's original paper on the Dynabook concept: "A Personal Computer for Children of All Ages" (Palo Alto: Xerox PARC, 1972). [Wonderful historical document. This is 1972--years before any PC, Mac, or tablet device.]
Alan Kay and Adele Goldberg, “Personal Dynamic Media” (1977), excerpt from The New Media Reader, ed. Noah Wardrip-Fruin and Nick Montfort. Originally published in Computer 10(3):31–41, March 1977. (Cambridge, MA: The MIT Press, 2003), 393–404. [Revised description of the concept for publication.]
Interview with Kay in Time Magazine (April, 2013). Interesting background on the conceptual history of the GUI, computer interfaces for "interaction," and today's computing devices.
- Jay David Bolter and Richard Grusin. Remediation: Understanding New Media. Cambridge, MA: The MIT Press, 2000. Excerpts: Intro | Chap. 1. Review for core arguments: hypermediacy, interface, re-mediation in digital media.
- Lev Manovich, The Language of New Media (excerpts). Cambridge: MIT Press, 2001.
Read selections from chap. 1 ("What is New Media") and chap. 2 ("The Interface"). Note the categories Manovich set up in chap. 1 for defining "New Media."
For Manovich, the main differentiating feature of "new media" is software: media produced, controlled, and displayed with software. Digital media are, in short, software-mediated media.
Author's website with supplements to the book.
- Lev Manovich, "The Database as a Genre of New Media." (1997) [A well-known argument from Manovich's early work.]
- Lev Manovich, "New Media: Eight Propositions." Excerpt from “New Media from Borges to HTML,” from The New Media Reader, edited by Noah Wardrip-Fruin and Nick Montfort, The MIT Press, 2002.
- Lev Manovich, Software Takes Command, pp. 55-239; and Conclusion.
Follow Manovich's central arguments about "metamedium", "hybrid media", and "interfaces".
Excerpts from chapters 1-2, ebook version (pdf).
Mediology Case Studies & Examples for Discussion
- In class: interfaces and media from books to Google Glass
Student weekly blog discussions
- Choose a discussion topic: Consider a digital interface (e.g., a Web browser and its "windows"; the "tiled" iPhone/iPad display; or a specific software interface) in as many dimensions of mediation as you can describe. What are the consequences of software-driven screens implementing the "metamedium" function? Thinking with mediology, describe as many as you can of the concealed ("invisibility cloaked") forces and agencies being mediated or "interfaced" in an everyday digital device (not the "content").
Main Themes and Topics:
The City and/as Technology:
Local Material Spaces, Pervasive Computing, Global Networks
“I’ll begin with the following hypothesis: Society has been completely urbanized. This hypothesis implies a definition: An urban society is a society that results from a process of complete urbanization. This urbanization is virtual today, but will become real in the future.” (Henri Lefebvre, The Urban Revolution, 1970:1)
Henri Lefebvre's opening statement in The Urban Revolution (1970) was written at a time when over one third of the world’s population lived in cities or metropolitan regions; today more than one half of the world’s population is concentrated in cities and large metropolitan regions. Cities have always been the material, spatial, economic, technological, and social centers of the world, aggregating and concentrating resources and interactions according to the laws of network effects in dense nodes. "Complete urbanization" since the 1990s means a specific kind of urbanization in "global cities": networked, pervasive computing as the norm and city dwellers as information navigators with many layers of computational "distributed cognition" added to the built environment.
Information technology and the Internet have aided concentration and agglomeration, and now cities are the major zones of "pervasive," "embedded," and "ubiquitous" computing, with environmental sensors, embedded chips and RFID tags, surveillance cameras, geo-location/GPS tracking, and telecom/data wireless networks everywhere. This week is devoted to considering some major aspects of the real spaces/places of urban environments and the extension of computational "cognitive spaces" through pervasive computing, global information networks, and re-concentrations of capital (in all forms) in global informational cities.
Learning Objectives and Topics:
Understanding the city as multiple forms of technology in a built environment: architecture, urban street and road systems, infrastructure, nodes of communication and information. Cities as concentrators and aggregators of technology. Infrastructure: the "city of atoms" (materials and space) and the "city of bits" (the space of data and information flows, software and digital design, data centers, Internet and telecom switching centers). The movement of people in, out, and through cities is information: human traffic flow in highways, transportation systems, streets, shopping centers, offices.
Cities in Macro Globalization Views: Global City & World City
- Globalization and World Cities Research Network (Major Resource of World City Research) [Reference site]
- Visualization of Global World Cities by Connections (pdf). [Review major data points.]
- World City Globalization (How Cities are Connected and Hierarchies of Connectedness)
[Background to the methodology for how world cities are described. For reference.]
- Saskia Sassen, "Locating Cities on Global Circuits." Environment & Urbanization, Vol 14 No 1 April 2002.
[Review for main arguments and macro level features of global cities by a leading urban sociologist.]
Micro/Local Views of the City, Space, and Place from Built Environment to Post-Digital Reconfigurations
- Michel de Certeau, from The Practice of Everyday Life (excerpts): "Walking in the City" and "Spatial Stories"
[Classic essays on the sociology of lived experience in the city.]
- Nashid Nabian and Carlo Ratti, “The City to Come,” in Innovation: Perspectives for the 21st Century (OpenMind), https://www.bbvaopenmind.com/en/article/the-city-to-come/.
- Dietmar Offenhuber and Carlo Ratti, “Reading the City: Reconsidering Kevin Lynch’s Notion of Legibility in the Digital Age,” in The Digital Turn: Design in the Era of Interactive Technologies, ed. Zane Berzina, Barbara Junge, and Walter Scheiffele (Zurich: Weissensee Academy of Art, Park Books, 2012), 216-224.
From Built Environment (Place and Space) to Pervasive and Ubiquitous Computing
- Carlos Ramos, Juan Carlos Augusto, Daniel Shapiro, "Ambient Intelligence--The Next Step for Artificial Intelligence." [Accessible introduction to issues.]
- Jesper Kjeldskov et al., “Digital Urban Ambience: Mediating Context on Mobile Devices in a City,” Pervasive and Mobile Computing 9, no. 5 (October 2013): 738-749.
[Good article for latest approaches to ambient computing in urban environments, but consider the overall concepts without getting caught up in the specific products and current implementations.]
- Thad Starner, “Project Glass: An Extension of the Self,” IEEE Pervasive Computing, 2013.
[We can only begin an analysis of the kind of extended computing and augmented reality modeled in Google Glass. But it is a good case since this technical implementation presupposes the whole infrastructure of the built environment and Internet-based ubiquitous computing (wireless and wired networks, digital media, "big data," Cloud computing, etc.)]
Reference and Resource Sites: City Infrastructures and Pervasive Computing
- MIT's SENSEable City Lab:
- IEEE Divisions on Pervasive Computing and Intelligent Systems (Computing Now)
Student Project for blog discussions
- Study a major neighborhood in Washington, DC for a "thick description" of (1) the messaging system of the city for people navigating the space of the built environment (streets, roads, architecture), (2) indications of the ambient, pervasive computing-information environment, and (3) the visible and hidden/assumed infrastructure for utilities and communications in the daily use of technologies. Examples of dense locations: (1) the intersection of 14th and U Streets, and 2-3 blocks in each direction from the intersection; (2) the intersection of K Street and Connecticut Ave., and 2-3 blocks in each direction from the intersection; (3) the Washington Mall, from the Washington Monument to the Capitol Building, and views along the way. Where are people moving from/to? Road traffic, metro, walking? How many are using cell phones/smart devices? How many observable wireless Internet places/zones are there (coffee shops, bars, restaurants, more)?
Industrial machines, scientific interventions in the human body, and computer technology have a long trajectory of representation in the popular imagination, beginning with reactions to the Industrial Revolution in post-Romantic writing and art in Europe and America and in political philosophy in Marx. Much of the representation, from Mary Shelley's Frankenstein and Thoreau's Walden up to Fritz Lang's Metropolis, leads to common imagery and ideology up to the present day. What is going on in this trajectory of ideas, ideologies, and emotional responses? America also has a unique "official" narrative of itself involving determinist industrialization, progress and supremacy through technology (from steam engines to railroads, electricity, cars, air travel, telecommunications, and computers), and ongoing paranoia about "big industry" and the anti-Eden of techno-dystopias. Taking in as much of a "big picture" view as we can this week, through around 150 years of discourse and representation, what has been and is driving this narrative "story arc" and the ideologies, repressions, and misrecognitions about "technology" that continue to today? How do we get from "the Machine in the Garden" to the techno-Frankenstein cyborg dystopias imagined in the past two decades? Fantasies of technological agency? What ideologies about bodies, machines, and human identity are represented in the cyborg, the android, and intelligent machines?
- Frankenstein to Cyborgs and the Matrix
- Leo Marx, The Machine in the Garden: Technology and the Pastoral Ideal in America (excerpts)
Introduction | Google Books chapters
America has always had a special fascination with--and fear of--post-industrial technologies framed in a discourse of "interventions in nature" and myths of a technological Fall from the Garden of Eden (in secular form). We can trace a trajectory of ideologies and anxieties from the 19th century to the present.
- Leo Marx, "The Idea of 'Technology' and Postmodern Pessimism," excerpted from Does Technology Drive History? The Dilemma of Technological Determinism (Cambridge: MIT Press, 1994).
- A Technoculture Dossier: Anthology of key texts on the cultural reactions to industrialization, machines, and technology, c.1820- c.1920. Extracts from writings by Henry Adams, Henry David Thoreau, Walt Whitman, Mary Shelley, Karl Marx, Sigmund Freud.
Some identifiable currents of discourse and ideology were set in motion in the early 19th century that haunt the discourse of our own time. Most people are unaware of the controlling metaphors, imagery, and ideologies which form the first modern "technoculture." Track the responses to urbanization, machines, factories, the industrialization of city and country, the rollout of electric power and telecommunications, views of the alienation of mankind from work and production, and the seemingly overwhelming power of big industrial machines.
- Sigmund Freud, Civilization and Its Discontents (Das Unbehagen in der Kultur; literally "The Uneasiness in Culture") (1930) (Wikipedia overview) (etext excerpts).
Freud's major arguments: post-industrial society and economies impose "inhuman" demands on natural instincts. Individual and collective/social repression of libido and aggression is the price we pay for "civilization" (Kultur). We need to recognize the effects of civilization's alibis and repressions for not dealing with libido, which result in the return (and revenge) of the repressed in other forms. Machines and technologies are seen as prostheses of human mental and bodily functions that we have not adapted to.
This view is closely parallel to Freud's study of "The Uncanny" (Unheimlich), the sense of strangeness, uncertainty, fear, and anxiety felt when encountering objects or beings (fictional or real) not clearly animate or inanimate, human or nonhuman (like robots, cyborgs, and monsters) (etext excerpts).
- From the Technology Sublime to Technology and its Discontents: Cyborgian Projections
Donna Haraway, "Manifesto for Cyborgs," excerpt from Simians, Cyborgs and Women: The Reinvention of Nature (New York: Routledge, 1991), pp. 149-181. Web/html version | Pdf version. Seminal essay for reframing the cultural symbol of the "cyborg."
Katherine Hayles, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. University Of Chicago Press, 1999. [Excerpts from the Introduction and Chap. 10 (Conclusion).]
----------. “Unfinished Work: From Cyborg to Cognisphere.” Theory, Culture & Society, 23, no. 7–8 (December 1, 2006): 159 –166.
- Joel Dinerstein, “Technology and Its Discontents: On the Verge of the Posthuman,” American Quarterly 58, no. 3 (2006): 569-595.
The Frankenstein Effect (Segments to be Screened in Class and Continue to Next Week)
- Thomas Edison's studio, early silent film Frankenstein (1910) (YouTube) (info)
- Fritz Lang, Metropolis (1927) (info)
- James Whale, Frankenstein (1931) (info)
- Ridley Scott, Blade Runner (1982) (from the novel Do Androids Dream of Electric Sheep? by Philip K. Dick)
Replicants (androids): dystopian future and film noir femme fatale meets sexual fetishization of the cyborg/android.
Screening in class: the bio-tech birth of the Replicants; Deckard's visit to the uncanny "friends" of Sebastian and the fight scene with Pris; Frankenstein moments in Roy's coming to self-consciousness and his desire to meet his maker.
- The Terminator series: technology meets gender for hypermasculinity and dystopian fears, with a touch of the leather kink.
- The Borg in the Star Trek series: one imaginary for the borg-sexed hive-mind body: Seven of Nine and the Borg Queen.
- Eve of Destruction (1991): the value of B-movies for revealing the popular psyche: gender on the rampage.
Eve 8 is a weaponized cyborg, and becomes the sexualized and aggressive version of her designer. A female cyborg designed to pass as human for military intelligence with a fully armable nuclear bomb in her torso. What could possibly go wrong? Compare to Robocop (1987).
- Mamoru Oshii, Ghost in the Shell (1995)
A very influential anime (inspiration for The Matrix) that imagines sentient androids (organic and machine/computational intelligence) as workers and data carriers, who also achieve self-consciousness to question the nature of their existence and their "soul" (the "ghost in the machine"), a Frankenstein theme.
- Andy and Larry Wachowski, The Matrix (1999), The Matrix Reloaded (2003), The Matrix Revolutions (2003); Warner Bros. Matrix site.
The "Matrix self" is sexier, aestheticized, stronger, idealized, fire-arms and martial arts proficient, preferably dressed in black vinyl and leather as if from a high-end designer fetish boutique. The "Matrix world" is shot in a "green key" and the "real world" is shot in a "blue key".
- Appleseed Ex Machina (2007): a CG anime film, the sequel to the 2004 Appleseed film, also directed by Shinji Aramaki and produced by Hong Kong director and producer John Woo.
This movie takes the anime cyborg to new levels of detail with 3D animation and CGI effects for rendering a cinematic experience.
- I, Robot; Avatar; Minority Report, etc.: ongoing aestheticization of body-machine, utopian and repressed sexualities. Movies from around 2003-present mainly implement an established Hollywood formula, which often recycles the earlier ideas in action/adventure genre frameworks.
Art, Culture, Machine Projects
- Ghost in the Machine, exhibition at the New Museum, New York (2012)
Student weekly blog discussions
Choice of two approaches: (1) Using the conceptual and historical background in this week's readings, discuss a popular culture genre (1-2 cases) where the post-Frankenstein and techno-dystopian ideas drive the narrative or representations of characters and machines; or, (2) discuss the meaning and consequences of the American way of dealing with the "technological sublime" (runs from 19th century Romantic views to techno-utopian fantasies used in marketing today).
Learning Objectives and Topics:
Understanding a speculative fiction genre (the "cyberpunk" tradition and related genres) as a combination of conceptual and imaginary methods for projecting and extrapolating scenarios from technology as embedded in culture and living systems.
"Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation...A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data..." --William Gibson, Neuromancer
What do we make of the trajectory in the popular imagination of technoculture dystopian narrative with its fetishizing of technology and interpenetration of material/machine technologies and the body? Human-machine interfaces become brain-neural network combinations? The "cyberpunk" reinvention of the SF novel brought a whole new philosophical approach to the converging worlds of technology, popular culture, science, communications, surveillance, and human-computer/human-machine combinations.
"Cyberpunk" (not a term used by the writers) emerged in the 1980s, at the same time as MTV and the mainstreaming of the punk rock music scenes. The writers, who preferred the term "speculative fiction," saw their work as a literary-philosophical hack on both the literary-cultural archive and the new cybernetic, computer, and surveillance cultures, with the image of the transnational high-tech corporate world coming into view. Before commercial and consumer access to the Internet, a new group of writers and filmmakers were extrapolating from the present into near-future and contemporary worlds that are clearly of this world (not futuristic space travel or aliens) with imaginative dark humor, irony, and satire, forming new hybrid genres in the process. As William Gibson famously stated, "the future is already here--it's just not evenly distributed."
This meta-genre for themes and plots in fiction and film continues in many forms today, generating new intertextual/intermedial, combinatorial approaches. The hybrid genres were possible only in the reconfiguring media sphere of inherited popular literary and film genres, inherited media technologies and the newer digital/virtual forms. Once the non-SF SF had been inserted (uploaded) into the popular culture system, it spawned new nodes and reconfigurations leading up to the now-cliched stylizations of many movies and TV shows (including all the goth-vampire subgenres of the past decade). "Cyberpunk" is just a handy name for a well-received hack in the codes of the literary and popular culture systems, a move in a larger postmodern game, now a code-set subsumed in OS and cultural networks of just about every fantasy and futuristic genre.
The genre overlaps with the Marvel and DC comics characters and stories based on mutations and scientific catastrophes that create "superpowers" and organic-techno-scientific-machine hybrids (X-Men, Avengers, Fantastic Four, Watchmen).
Readings: Primary Texts, Novels
- William Gibson, Neuromancer (1984). Gibson coins the word "cyberspace" as the mental representation of direct neurological "jacking in" to "the matrix" (computer network). Wikipedia background. This novel also popularized the term "the matrix" for computer networks and the software that runs them. William Gibson's website.
- Neal Stephenson, Snow Crash (1992). Before graphical browsers or interfaces for the Net and Web, Stephenson described a fully-realized graphical interface to the Metaverse (the virtual networked world) and computer user identities as "avatars" (popular use from this novel). Wikipedia background. Neal Stephenson's website.
Background and Theory
- Ideas from 1950s-1980s artificial intelligence, cybernetics, extended mind research and theory:
Major shared foundations for the "new speculative fiction" in the 1980s-2000s come from the writings of Philip K. Dick and Vernor Vinge (who coined the term "singularity" for the advanced state of computer intelligence). A brief overview of Vinge's ideas is good for reading Gibson's and Stephenson's early novels:
- "Cyberpunk" in Wikipedia; see also "List of Cyberpunk works."
- Richard Kadrey and Larry McCaffrey, “Cyberpunk 101: A Schematic Guide to Storming the Reality Studio,” excerpt from Storming the Reality Studio: A Casebook of Cyberpunk and Postmodern Science Fiction, Duke University Press, Durham & London (1991), 17-32.
[This guide to literary, film, TV, and music works (though now dated) is useful for opening up the network of affiliations and intermedial relationships that enabled the development of the hybrid genres employed by Gibson, Stephenson, and other writers.]
- Henry Jenkins (USC), Course syllabus for "SF as Media Theory." [Review this for useful approaches to reading speculative SF novels and movies for media and technology theory.]
- Wong Kin Yuen, "On the Edge of Spaces: Blade Runner, Ghost in the Shell, and Hong Kong's Cityscape," Science Fiction Studies, 27/1, 2000.
Filmography: Blade Runner to The Matrix and Japanese Anime (continued screenings)
- Ridley Scott, dir. Blade Runner (1982). Wikipedia background. Based on the 1968 novel, Do Androids Dream of Electric Sheep? by Philip K. Dick (info).
- David Cronenberg, dir. Videodrome (1983). Wikipedia background.
- James Cameron, dir. The Terminator (1984) and Terminator 2: Judgment Day (1991). Wikipedia background on series.
- Wachowski Brothers, dir. The Matrix series (1999, 2003). Wikipedia info.
- Mamoru Oshii, dir. Ghost in the Shell (Anime: 1995; based on the manga begun by Masamune Shirow in 1989; English version, 1995). Wikipedia background. Clip: birth of The Major (cyborg) scene.
- Robert Longo, dir. Johnny Mnemonic (1995), based on a 1981 short story by William Gibson, an early conception of the wetware data world in Neuromancer. Production not approved by Gibson. Wikipedia background.
- Shinji Aramaki, dir. Appleseed (2004). Based on the Appleseed manga created by Masamune Shirow.
- Shinji Aramaki, dir. Appleseed, Ex Machina (2007). Produced by John Woo. Wikipedia background.
- Other notable works affiliated with the genre and style:
Tron (1982) and Tron: Legacy (2010); Robocop (1987) (remade in 2014); Strange Days (1995); Dark City (1998); eXistenZ (1999); Total Recall (original and remake).
Student weekly blog discussions
Drawing from our prior readings (from distributed cognition to cyberculture theory in last week's readings), discuss a major human-machine interface idea developed by Gibson and/or Stephenson, and how these ideas/images/representations have continued in other forms of literature and popular culture. Is it useful to consider speculative SF as "philosophy by other means"? How do we account for the dominant (overly-dominant?) themes of techno-catastrophes, cyborg fantasies, and falls from Edenic "nature" in the popular media culture of the past 20 years?
The computer is a great case for the mediological approach: what were, and are, the networks of conditions and relations that make the computer possible? De-blackboxing the best-known black box leads to the histories and political economy of industries and institutions; national and international policy, regulation, and standards; the convergence of technologies and sciences at various moments; social demographics and technology adoption patterns; cultural assumptions and the everyday embeddedness of digital devices and chips (just as a beginning). The field of forces seen in the mainstreaming of the computer after WWII and the conditions leading to the PC revolution allow us to map out the many invisible forces and relationship nodes that mediology and Actor-Network Theory are all about. (This is, of course, a huge topic, and we'll leave out many other topics like the invisible computer design of everyday things and the embedding of computer chips in almost every industrially made device from cars to microwave ovens.)
Readings: A brief social history of the computer to the rise of the PC
- The Machine That Changed the World: Video Documentary Series. (Another site hosting the series.)
Probably the best accessible documentary of computer history. Sit back, enjoy the whole story.
- Computer Museum, Timeline of Computing History
- Univ. of Pennsylvania: ENIAC Museum
Alfred D. Chandler and James W. Cortada, eds., A Nation Transformed by Information: How Information Has Shaped the United States from Colonial Times to the Present. Oxford Univ. Press, 2000.
Alfred Chandler, Chap. 1, Introduction.
Lee S. Sproull, Chap. 8, Computers in U.S. Households Since 1977.
- Thomas Haigh, "Computing the American Way: Contextualizing the Early US Computer Industry." IEEE Annals of the History of Computing, April-June, 2010.
- Of course, Wikipedia covers all things computing and computer history: computer (definition); history of hardware; software; operating system.
Important Arguments by Major Founders of Contemporary Thinking about Computers, Technology, Information, and Intelligence
- Alan Turing, "Computing Machinery and Intelligence," Mind, 1950. (Open source version.)
A seminal essay in Artificial Intelligence and models of computation, and the origin of the "Turing Test" concept for judging computer "intelligence." Turing is famous as the cryptographer who cracked the code for the German Enigma Machine in WWII.
- See Wikipedia on "Turing Test," "Turing Machine," and "Alan Turing" for background, and the articles on "Turing" and "Turing Machine" in the Stanford Encyclopedia of Philosophy.
- On the background history of code-breaking and the Enigma Machine.
- On Turing as a philosopher, see Andrew Hodges, Turing (1997), online version of the "Turing Test" chapter here (see other sources on this website).
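The "Turing Machine" referenced above is simple enough to simulate in a few lines, which makes the abstraction concrete. Below is a minimal sketch (the `run_tm` helper and the binary-increment rule table are illustrative inventions, not drawn from any assigned reading): a machine is just a finite rule table, a tape of symbols, and a read/write head.

```python
# Minimal single-tape Turing machine simulator (illustrative sketch).
# The rule table below makes the machine increment a binary number by 1.
def run_tm(rules, tape, state="start", blank="_", max_steps=1000):
    """rules: {(state, symbol): (new_state, write_symbol, move)}, move in {-1, 0, +1}."""
    tape = dict(enumerate(tape))  # sparse tape: cell index -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        state, tape[head], move = rules[(state, symbol)]
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Rules for binary increment: scan right to the end, then carry leftward.
rules = {
    ("start", "0"): ("start", "0", +1),  # move right over the digits
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),  # past the right end; turn around
    ("carry", "1"): ("carry", "0", -1),  # 1 + carry = 0, keep carrying
    ("carry", "0"): ("halt", "1", 0),    # 0 + carry = 1, done
    ("carry", "_"): ("halt", "1", 0),    # carried off the left end; write 1
}

print(run_tm(rules, "1011"))  # 1011 (11) + 1 = 1100 (12)
```

The point of the exercise is Turing's: any effective procedure can, in principle, be reduced to a rule table of this form.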
- Vannevar Bush, "As We May Think," Atlantic, July, 1945. (Additional authorized e-text version.)
A seminal essay on managing information and human thought, leading to the concepts of GUIs (graphical user interface for computers), hypertext, and linked documents. See Wikipedia background.
See background on Bush's concept of the "Memex," precursor to hypertext information systems.
Video of a working model of the "Memex".
- Douglas Engelbart (Wikipedia background) developed Bush's concepts for computers: he invented the mouse and was a key developer of the computer GUI.
Douglas Engelbart's famous paper, "Augmenting Human Intellect: A Conceptual Framework" (1962)
Engelbart's "Mother of All Demos" (1968): debut of the mouse and the structure of linked hypertext files. Video 1. Video 2 (documentary)
Doug Engelbart Institute: archive of all work in Engelbart's research.
The Stanford "Mouse Site," history and background of the mouse and computer GUIs.
Douglas Engelbart, First Demo of the Mouse and GUI (1968).
Doug Engelbart, The Engelbart Hypothesis: Dialogs with Douglas Engelbart (book).
- Ted Nelson (Wikipedia background) coined the term "hypertext" in the 1960s and developed it in his seminal book, Literary Machines (1988; 1991) (excerpts). Expanding on Engelbart's work, he proposed user-based text connections across all documents. See Ted Nelson's site.
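The idea shared by Bush's Memex and Nelson's hypertext--documents as nodes joined by traversable links, so that reading becomes following "trails" of association--can be sketched in a few lines of code. A toy model (the document names and the `reachable` helper are hypothetical, chosen only for illustration):

```python
# A toy hypertext: documents are nodes, links are named edges between them.
docs = {
    "memex": {"text": "Bush's associative-trail machine.", "links": ["hypertext"]},
    "hypertext": {"text": "Nelson's term for linked documents.", "links": ["memex", "web"]},
    "web": {"text": "Berners-Lee's global hypertext system.", "links": ["hypertext"]},
}

def reachable(start):
    """Follow links transitively from one document (a 'trail' of associations)."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.add(node)
            stack.extend(docs[node]["links"])
    return seen

print(sorted(reachable("memex")))  # every document lies on some trail from 'memex'
```

What distinguishes hypertext from an ordinary archive is exactly this graph structure: the links are part of the document system itself, not an external index.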
Operating Systems and Interfaces
- Neal Stephenson, In the Beginning was the Command Line (1999 and rev.)
Student weekly blog discussions
- For your essay this week, find an aspect of the computer, computation technologies, or digital information that allows you to "de-blackbox" it by connecting some of the main relationships, dependencies, and conditions that enabled the computer and computing devices to have their power in our current "information age" (largely led by the US): the related industries and institutions; policy and government funding; private investment and entrepreneurial culture; the various states of the material technologies (hardware and/or software); standards-making across institutions and manufacturers, allowing the development of markets and the creation of industry ecosystems; and the social conditions leading to the adoption of technologies, market conditions, and consumer uses. Try thinking "mediologically" about the necessary contexts, conditions, and structures/systems of mediation and transmission. Examples: government investment in the computer sector during and after WWII (especially IBM) and unregulated private development (leading to the PC); the transition from business and government markets to "personal" computing markets; the accelerated miniaturization, processing speed, and storage/memory that had to converge to enable more complex PC software, right down to current "smart" devices.
What can a mediological/ANT study of the Net and Web open up for a newly visible analysis of forces and networked interdependencies usually blackboxed? With the convergence of telecommunications, computing, information science, hardware/software, and digital media, the Internet and Web have subsumed our prior media into a new mediasphere and metamedium, a system of ongoing reconfigurations of material technologies, software and algorithms, content, and the institutions and industries that create the "Internet" as such. The design is "permanently extensible" as new developments and hybrid technologies emerge for the Internet/Web system.
“What can be studied is always a relationship or an infinite regress of relationships. Never a ‘thing.’ ” -- Gregory Bateson
For example, the Internet as a global, international "network of networks" using the core Internet protocols and the client-server architectures that use them (all Web software, smartphone apps, cloud services, etc.) would be impossible without the invisible institutional and political-economic agreements and policies (e.g., international telecom and data traffic agreements and regimes), hardware and software standards (implemented by major multinational companies like Cisco and Google), and the multiple dependencies in "agency networks" for everything that happens between--and among--end users and the globally distributed system.
So, the "Internet" isn't a thing or external object that can have "effects" on us: the Internet is a social-technical-political-economic system enacted through distributed agency in one of our most complex "orchestrated combinations of prior combinations." Forming a useful answer to a simple question like "what is the Internet?" takes us into a system of relations that resists reification in an "it"--no thing or object stands as a singular referent of the term. This decentralized, distributed "system" (technical - social - political - economic), though weighted at the nodal concentration centers of global power, has many consequences and challenges for simplistic (instrumental, operational) models of agency and power. A big macro, international question today: how is the Internet--and everything done and enacted through it--to be controlled, governed, regulated, and accessed? Who "owns" the Internet?
The Internet as a system is our "mother of all case studies" for deblackboxing social-technical systems and questions of distributed agency and distributed cognition. We can pose many useful questions by using the Internet and Web as a paradigm of mediation, agency, and transmission: the significance of the convergence of all digital (digitized) media forms across a global network using common protocols and standards; the industry, policy, and technology ecosystems that enable the Internet to function and grow with new innovations. Given that Internet protocols are "open" and standards are based on industry collaboration and consensus with confirmation by NGO policy groups (except in cases of monopoly dominance), and that networks are distributed systems with no one center, where are the real sources of power and authority? How do we "re-orchestrate" regulatory policy in the political-economy regimes for media, data, entertainment (TV, radio, cable), and communications after convergence on the common platform of the Internet?
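One way to begin de-blackboxing the client-server architecture mentioned above is to notice how little of the "network of networks" any single program has to see. The sketch below (illustrative only) uses a local loopback socket: a server listens, a client connects and exchanges bytes, and everything beneath the socket API--TCP, IP, routing, exchange points, telecom agreements--is the invisible, mediated infrastructure the seminar asks us to open up.

```python
# Client-server over a socket, reduced to its core. Everything below this
# API (TCP, IP, routing, physical links) is the "black box" in question.
import socket
import threading

def serve_once(sock):
    conn, _ = sock.accept()            # wait for one client connection
    request = conn.recv(1024)          # read the client's request bytes
    conn.sendall(b"HELLO " + request)  # reply (stand-in for an HTTP response)
    conn.close()

server = socket.socket()
server.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

client = socket.socket()
client.connect(("127.0.0.1", port))    # the client only needs a host and port
client.sendall(b"GET /")
reply = client.recv(1024)
client.close()
print(reply.decode())  # HELLO GET /
```

The interface is deliberately thin: the same few calls work whether the server is on the same machine or across the globe, which is precisely what makes the underlying system so easy to blackbox.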
Readings and Backgrounds
History and Background of the Internet
- Katie Hafner, Where Wizards Stay Up Late: The Origins Of The Internet. New York, NY: Simon & Schuster, 1998
- "A Brief History of the Internet," Internet Society.
The Internet Society also hosts a directory of other well-documented histories of the Internet and the Web.
[Read around in these histories to get a basic sense of development.]
- Overview in Wikipedia [What you would expect from Wikipedia; use the sources, not the Wikipedia article.]
- Hobbes' Internet Timeline [a chronological timeline, good reference]
- Video: History of the Internet [not bad conceptual guide]
- History and Future of the Internet (World Science Festival video, with Vint Cerf and many others)
- Evolution of the Web (visualization)
Philosophy of Internet Design and Architecture
Internet protocols and architecture are an "open system": what are the consequences of this?
- Vint Cerf and Robert Kahn, "A Protocol for Packet Network Intercommunication." 1974, IEEE.
- David Clark, "The Design Philosophy of the DARPA Internet Protocols," Originally published in Proc. SIGCOMM ‘88, Computer Communication Review Vol. 18, No. 4, August 1988, pp. 106–114.
[This is a very useful article on the design philosophy of Internet Protocols, mostly accessible to non-engineering specialists.]
- The Internet Ecosystem (ISOC)
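To make the layered design described in the Cerf-Kahn and Clark readings concrete, here is a toy sketch in Python of protocol encapsulation. All field layouts and names here are invented for illustration; real TCP segments and IP datagrams are far more elaborate, but the structural idea is the same: each layer treats the layer above it as an opaque payload, which is exactly the "black box" boundary that lets unlike networks interoperate.

```python
# Toy sketch of layered encapsulation (illustrative, not real TCP/IP formats).

def make_segment(payload: bytes, src_port: int, dst_port: int) -> bytes:
    """Prepend a minimal 'transport' header (two 2-byte ports) to the payload."""
    header = src_port.to_bytes(2, "big") + dst_port.to_bytes(2, "big")
    return header + payload

def make_datagram(segment: bytes, src_host: str, dst_host: str) -> bytes:
    """Prepend a minimal 'network' header (addresses) to the whole segment."""
    header = f"{src_host}>{dst_host}|".encode()
    return header + segment

# Sending: wrap application data layer by layer.
data = b"HELLO"
segment = make_segment(data, 5000, 80)
datagram = make_datagram(segment, "10.0.0.1", "10.0.0.2")

# Receiving: peel the layers off in reverse order.
net_header, _, rest = datagram.partition(b"|")
payload = rest[4:]   # strip the 4-byte toy transport header
assert payload == data
```

Note that neither function inspects the payload it wraps; that mutual ignorance between layers is the design principle the readings call an "open system."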
Architecture, Infrastructure, and Systems
- AT&T Internet Map [a map and data points for one of the largest private build-outs of the Internet infrastructure]
- Internet Backbone | Global Internet Exchange Points
- "The Cloud Factories," New York Times series on power use by Internet data centers.
The World Wide Web, HTML, and Interfaces to the Web
- Background history of hypertext and hypermedia (linking among files and documents)
- Tim Berners-Lee's original proposal for the Web (1989) (pdf)
- Ted Nelson, The Xanadu Project: First Hypertext System Model (this paper, 1999).
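What makes the Web "hypertext" in the Nelson/Berners-Lee sense is that documents embed machine-readable links to other documents. A minimal sketch, using Python's standard html.parser module (the sample HTML below is invented for illustration), shows how a program can extract a page's outgoing links:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# An invented sample page with two hyperlinks.
page = """
<html><body>
  <p>See <a href="http://info.cern.ch/">Berners-Lee's first site</a>
  and the <a href="proposal.html">original 1989 proposal</a>.</p>
</body></html>
"""

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # the document's outgoing hyperlinks
```

This is, in miniature, what a browser or a search-engine crawler does before any "interface" is presented: the link structure is data in the document itself.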
Student weekly blog discussions:
Thinking through a mediological and ANT view of the Internet
- The Internet and all its subsystems provide one of the most complex systems for thinking about mediations, interdependencies, and agencies. Most people only see functions presented at the interface level--and an interface works by making itself invisible (by design and by our cultural expectations). The complexity of the computer and network architecture is invisible. For your writing, take a case--like an app or digital media type--and build out a complexity model of its dependencies: histories of technological development, economic ecosystems, institutions of mediation (standards, policy and regulation, industry groups, patents), markets and demographics.
- Questions: How could we provide better-informed policy arguments; for example, in debates about whether regulation of TV/media/entertainment industry "content" delivered over the Internet is something for the FCC on the model of cable TV or broadcast? On the metamedium level, what is the difference between a "web browser" and an iPad app--both technically and philosophically as an interface for users? How can we resist talking about "the Internet" as a reified, uniform "technology" and look to the variety of subsystems and subcomponents that must be orchestrated to work together? What does it mean to be "on the Internet"?
Main Themes and Topics:
"AI can be defined as the attempt to get real machines to behave like the ones in the movies."
--Aaron Sloman, School of Computer Science, The University of Birmingham
An introduction to major concepts in "Artificial Intelligence" research, making connections to other ideas covered in the seminar (distributed cognition and memory, computation, human-computer interaction).
AI and Intersecting Fields
Many concepts and research projects in AI since the 1960s extend back to the "human-machine symbiosis" paradigms of cybernetics and the quest to program "learning machines": machines or "intelligent" systems (software and hardware) that can learn, self-adapt, and "re-program" their software routines through self-correcting responses to inputs from the world (interpreting information from sensors and physical data) and inputs from interactions with humans (simulating speech and language, and developing semantics, concepts, and higher-order abstract reasoning). AI research and its assumptions thus overlap with many specialized research programs in robotics, cognitive science, "smart" and automated computer/device systems, and human-computer interaction more generally.
Why We as Non-Specialists Should Care About AI Questions
One of the most highly--and contentiously--debated questions in the overlapping fields of AI, cognitive science, computer science, and philosophy of mind is the extent to which computational models and simulations of human cognition and "intelligence" (however we define these) are also reversible, in the sense that higher-level human cognitive capacities (like reasoning, inferring, learning, error correction, problem solving, and symbolic interpretation) are themselves best understood as forms of neurally based computation. A major philosopher of science calls this modeling problem the "epistemological engine" problem: a dominant technology becomes a paradigm for descriptions of human functions. If all things computational are artefacts of (collective) human cognitive faculties, how can they "transcend" the human mind? Even though we don't have the specialist background to intervene in this debate (nor can we hope to "cover" the issues in one week!), it is very important for non-specialists to have a sense of what is at stake and learn as much as possible in order to ask important questions.
De-Blackboxing and the Popular Culture Fantasy Context
Popular culture has been saturated with all kinds of fantasies--dystopian and utopian--about advanced computer "intelligence" with independent agency. This imaginary and speculative dimension has had two consequences: imagined scenarios can provide visionary prototypes for motivating real research (what if we had intelligent robots that could do precision diagnosis and surgery to save lives?), but popular culture fantasies usually make understanding and de-blackboxing the science and technology more difficult because of the unfounded projection of independent agency on computational systems.
This week we will gain some basic insights into the key assumptions of AI research and briefly investigate one highly publicized case of futurist extrapolation of AI: Ray Kurzweil's description of a supercomputing "singularity." Kurzweil makes many assumptions, with unaccounted-for presuppositions, about the human "mind" and "intelligence" that supposedly make them capable of being "reverse engineered" in computational models. As time permits, we will consider IBM's highly publicized "Watson" system and the Siri voice recognition system as forms of AI currently in use.
Udacity Online Course: Intro to Artificial Intelligence
- Work through this course as an intro tutorial. The course, taught by two Stanford CS professors, was big news a year ago and a proof-of-concept for Udacity's CS teaching platform. This version is self-paced and pitched to those with basic CS and math backgrounds. As part of the weekly essay, take notes on the points that you discover, and note--but don't be put off by--the areas requiring more CS and math background.
Backgrounds on AI from Developments in Cybernetics
- J. C. R. Licklider, "Man-Computer Symbiosis." IRE Transactions on Human Factors in Electronics, volume HFE-1, pages 4-11, March 1960.
[A seminal essay in AI and networking; a vision of an "internet" before the Internet by one of the main players in its later creation. Review for main topics.]
- John McCarthy, "What is AI?" A classic, accessible Q&A essay by one of the main founders of the field.
- Bruce Buchanan, "A Very Brief History of AI." AI Magazine, American Association for Artificial Intelligence, 2005.
The Association for the Advancement of Artificial Intelligence also hosts a very useful interdisciplinary overview of the scope of AI topics. See the Overview of AI Topics.
Ray Kurzweil, on the "Singularity," AI and Utopian Futurism
- Ray Kurzweil, Introductory Essay on the Singularity in The Futurist (2006).
- Sample chapters from author's website.
- Ray Kurzweil, Video presentation on key ideas for "The coming singularity" and "brain simulation" (Big Think).
- Ray Kurzweil, "How to Make a Mind" (The Futurist, 2013). Brief essay on the main ideas in his How to Create a Human Mind (2012).
- Critiques and Discussions of Kurzweil's ideas:
- Gary Marcus, "Ray Kurzweil's Dubious New Theory of Mind," The New Yorker, Nov. 12, 2012.
- Jaron Lanier, critique and discussion of techno-utopian beliefs: "One Half of a Manifesto" (Edge.org); see especially his comments at the end of the page on Kurzweil's arguments.
- MIT Technology Review, on Kurzweil's recent position at Google.
- Wikipedia background (not vetted).
Major Popular Instances of AI: IBM's "Watson" & "Deep Blue" and Siri (to be discussed in class)
- IBM: Cognitive Computing site. | IBM: Watson system
IEEE Journal Issue (3.4, 2012) devoted to IBM's Watson system.
- Apple keeps Siri very productized and blackboxed. Siri combines voice/speech recognition with AI for recognizing searchable terms and phrases.
Weekly blog discussions
- As you survey some of the main topics in AI research and theory in the context of our seminar, what do you find that non-specialists can (or should) learn about in this field? Why do these technologies and the field itself seem especially blackboxed and problematic in terms of questions about agency and mediation of functions? Consider one or two questions or problems that you find important and would like to learn more about if you had the opportunity.
Final essay due date: one week after last day of class.