Image: From Nam June Paik, Electronic Super Highway (1995), Smithsonian Museum of American Art

Georgetown University
Graduate School of Arts and Sciences
Communication, Culture & Technology Program

Key Concepts in Technology and How to Use Them (CCTP-798)
Professor Martin Irvine
Fall 2014

A new for-credit graduate-level course open to all qualified students and working professionals, as well as to Georgetown University students.

This public site outlines the learning objectives and topics for the course. Course content and resources are available only to enrolled students.

About the Course

Course Objectives and Grades

This online for-credit course will provide students and working professionals with important conceptual and analytical tools for understanding the key concepts behind our current media and computational technologies. The course will be taught entirely online using Georgetown’s Blackboard course platform, with video and multimedia presentations and real-time video conferencing.

View Course Introductory Trailer video.

A unique value of this online course is its library of relevant book chapters, articles, graphical illustrations, videos, and research publications, accessible only to students enrolled in this course. This up-to-date digital library of research and learning resources is not available in any other course.

Our course mantra is technology is too important to be left to technologists (alone). The main objective is equipping students with current methods for understanding the key concepts, functions, and design principles of contemporary technologies to enable better-informed, higher-level participation in any field or profession. Since our digital technologies are now embedded in everything we do, knowledge of the principles of technology is essential for leadership in any field: for participating in public debates about future developments, in decision making, and in relevant policy.

Using research and theory from multiple disciplines, we will focus on our cognitive, symbolic, and media technologies in their larger socio-technical contexts, and investigate methods for defining their underlying re-implementable design principles and mediating functions. Media, communication, and computational technologies are part of a historical continuum of cognitive-symbolic cultural technologies that extends from earlier uses of writing, mathematics, and image representations to the multiple combined systems of technical mediation since the development of electronics, modern communications and networks, computation, and digital media. We will create a framework for understanding major media forms in this social-historical continuum of technology functions.

Syllabus units will include: introductions to studying technologies through design principles and functions; cognitive science approaches to symbolic representation and computation; key concepts in software, code, and interface design; the principles of digital media and digitization; the design principles of the Internet and World Wide Web (with recent developments and why this architecture will continue to matter); and key concepts in artificial intelligence and ambient computing (embedded and extended intelligence in the lived environment). We will use methods for “de-blackboxing” technologies so that they reveal their networks of social-technical dependencies and major re-implementable functions and design concepts.

Need for Attention and Working Through Difficult Questions and Concepts

To the extent that it's feasible for learning with no prerequisites, students will encounter the "primary sources" of concepts and approaches in the scientific and scholarly literature from the main authors themselves, rather than encountering the ideas, questions, and arguments in textbook summaries or paraphrases. A major part of graduate-level education is working through the main questions and ideas yourself and learning how to make the concepts your own!


By the end of the course, students will have acquired an interdisciplinary knowledge base of key concepts, design principles, and analytical methods for understanding our key technologies and ways to participate in higher-level discussions and debates in their own fields and professions. By acquiring these conceptual and analytical tools, students will be able to advance beyond being merely consumers or users of technologies and will be prepared to become thought leaders on major issues in their own fields.


Grades will be based on biweekly assessments, participation in online discussions, a short written assignment, and a final capstone written project. We will have weekly live, synchronous, video discussion sessions, and the professor will host weekly virtual office hours via web video conferencing.

Information on Admission to Course

Books: Required and Recommended

This course is not primarily based on books or textbooks; most readings are provided through the course's online library. The following books are required:


  • W. Brian Arthur, The Nature of Technology: What It Is and How It Evolves. New York, NY: Free Press, 2009. [ISBN 1416544062]
  • Luciano Floridi, Information: A Very Short Introduction. Oxford, UK: Oxford University Press, 2010. [ISBN: 0745645720]
  • Lev Manovich, Software Takes Command: Extending the Language of New Media (London; New York: Bloomsbury Academic, 2013). [ISBN: 1623567459]


Recommended:

Technical Background:

  • Ron White and Timothy Downs. How Computers Work. 9th ed. Indianapolis, IN: Que Publishing, 2007. (New edition to be published in Dec. 2014.)

Important Concepts and History of the Technologies:
  • Janet Abbate, Inventing the Internet. Cambridge, MA: The MIT Press, 2000.
  • James Gleick, The Information: A History, a Theory, a Flood. New York, NY: Pantheon, 2011.
  • Tim Berners-Lee, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web. New York, NY: Harper Business, 2000.
  • Katie Hafner, Where Wizards Stay Up Late: The Origins of The Internet. New York, NY: Simon & Schuster, 1998. [ISBN 0684832674]
  • Janet H. Murray, Inventing the Medium: Principles of Interaction Design as a Cultural Practice. Cambridge, MA: MIT Press, 2012.
  • Donald A. Norman, The Design of Everyday Things. New York, NY: Basic Books, 1988 and reprints.
  • Noah Wardrip-Fruin and Nick Montfort, eds. The New Media Reader. Cambridge, MA: MIT Press, 2003.

Research Resources:

Learning Objectives and Topics:

An orientation to the approaches in the course, focusing on understanding computational, media, and communication technologies through their design principles, implementation of functions, and social-technical systems and networks.

Defining kinds of technologies, and differentiating the cognitive and symbolic technologies (media, information, and communication technologies) from general and instrumental technologies. Learning the methods for exposing and refuting technological determinist assumptions and how to develop descriptions and analyses for a more complete view of media technologies as implementations of human cognitive abilities and agency.

Students will begin learning how to develop conceptual tools for understanding technology that can be mobilized for "myth-busting" (countering misconceptions) and "de-blackboxing" (opening a technology through its system of interdependent social-technical components, histories of development, and distributed agency).

Major topics, concepts, and themes through the course:

  • Technologies as architectures for ongoing combinations of implementable design concepts or abstract models (from tools and writing to complex machines, computation, and software-based media).
  • Modular design principles and the logic of combinatoriality and hybridization.
  • Media and computational technologies as combinatorial symbolic artefacts.
  • De-blackboxing technologies that are received as closed totalities in consumer or business products: why "transparency" makes all technologies opaque for ordinary users.
  • All dimensions of mediation and interface: social, technical, cultural, political, economic
  • The mediating functions of computers, software, pan-digital platforms: media, mediations, and computer devices as a metamedium
  • The continuum of technical implementations and cognitive externalizations: from language and symbolic capabilities to writing, the cumulative histories of media forms, and computer code.
  • The analog-digital-analog continuum: contemporary experience of embedded and externalized media and computational technologies.
  • Understanding major design concepts in computation, software, and digital networks so that you can participate in important debates about the future and better uses of technology.

Busted Myths

  • Utopian and Dystopian projections of technology
  • Progressivist and other determinist futures for technology

Introduction: Video Lectures: Our Approach to "Understanding Technology"

Course Introduction: Part 1 | Course Introduction: Part 2

Text of Video Introduction

Background Readings for Context and Framing Issues in the Course:

  • Martin Irvine, Technology Theory: Introduction and Orientation (introductory essay)
  • Leo Marx, "Technology: The Emergence of a Hazardous Concept." Technology and Culture, July, 2010.
    • Reading focus: Note the shifting semantic range for the word technology and pervasive beliefs and values supporting "technological determinism" (the assumption that the features and properties of a technology autonomously cause/determine social/cultural/political effects).
    • Note the shifting range of artefacts included in the concept: human control of nature in a built environment, tools and machines for production; industrialization and automation of machines, electronics and computers; concepts of science, engineering, technology.
    • Compare: Technik (German) and technique (French): technical (engineering) knowledge
    • Question: How did we get from this history to the use of the word "technology" today to mean only things computational and digital?
  • Donald Norman, The Design of Everyday Things. 2nd ed. New York, NY: Basic Books, 2002. Excerpts from Preface and Chap. 1.
    [This is a popular non-technical introduction to design principles encountered in many kinds of everyday manufactured things from a leading thinker in the cognitive science approach to design and HCI (Human Computer Interaction). Norman follows human-centered principles and provides methods for thinking about how concepts and functions can be implemented and "mapped onto" manufactured things.]
  • Donald Norman, Things That Make Us Smart: Defending Human Attributes in the Age of the Machine. Reading, MA: Perseus Books, 1993. Excerpts on cognitive technologies.
    [We will study the principles behind "cognitive technologies" and artefacts in the following weeks. This is a brief outline of one leading approach.]
  • Hal Abelson, Ken Ledeen, and Harry Lewis. Blown to Bits: Your Life, Liberty, and Happiness After the Digital Explosion. Upper Saddle River, NJ: Addison-Wesley, 2008. Excerpt from Introduction.
    [This book is typical for popular published accounts of computer and digital technology. Attend to the way the authors (excellent authorities in their own right) frame arguments in terms of historical inevitabilities and impacts (technologies as autonomous agencies).]
  • Erik Brynjolfsson and Andrew McAfee. The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies. New York: W. W. Norton & Company, 2014. Excerpt from Chapter 1.
    [An example of ordinary business description; mostly "cheerleading" without analysis. Descriptive details, data, and charts tend to be used as justifications for causal assumptions. The technologies and all the processes that surround them remain black boxes, with only outputs subject to description.]

Using Research Tools for this Course (and beyond)

  • Learn how to use Zotero for managing bibliography and data for references and footnotes.
    Directions and a link to the app are on the Georgetown Library site (click open the "Zotero" tab).
    You can save, organize, export, and paste your references into writing assignments for this course.
  • Georgetown Library Online Journal Search (with Off-Campus Login for enrolled students)
    Access to journals in our fields of study for going further in the research literature for a topic.

Take the baseline knowledge check of technology terms, concepts, and knowledge as you begin the course.

Discussion Questions
Use these questions to help reflect on your reading and thinking this week, and choose at least one to write about:

  • What kinds of discourses, concepts, and ideologies have we inherited about technology (machines, artefacts) and media technologies specifically?
  • Can you think of popular examples of determinist, utopian, or dystopian arguments or popular representations?
  • Why do you think it's so common for our culture to frame technology as either progressive/utopian or harmful/dehumanizing/dystopian?

Learning Objectives:

“What can be studied is always a relationship or an infinite regress of relationships. Never a ‘thing.’ ” -- Gregory Bateson

This unit will provide a framework for the main concepts and approaches used in the course. Through an interdisciplinary overview, students will learn important concepts, terms, and analytical tools for describing and understanding media, communication, and computational technologies as developed in recent research and theory. This framework provides powerful tools for "myth-busting" erroneous views about technology and for "de-blackboxing" everyday technologies by analyzing their interconnected components as interfaces to the larger system of dependencies -- their social and technical networks -- that allow them to work.

This week we'll begin learning how to apply important principles that, working together, produce the "effects" we often attribute to communication and computational technologies. We'll discover how and why "what's invisible is (often) the most powerful" in our technologies, and develop methods for opening up what's ordinarily invisible or hidden from view:

  • understanding technologies not as single or isolated "products" but as connecting nodes in larger social-technical and economic networks that follow network "laws"
  • the cumulative effects of combinatorial design: combining existing and prior technologies, methods, and functions into new designs
  • cognitive extension and off-loading onto artefacts: a primary function of all forms of symbolic expression -- from writing and images to computation and digital communication media -- is extending our cognitive abilities, intentions, and expressions through collective technical mediation

This framework also opens up larger and longer-term forces in human social history. Social and technical history over many centuries shows that there's a built-in "ratchet effect" in human systems of symbolic artefacts and communication technologies. Technologies like writing and material media for communication don't have to be reinvented by every generation of a society: the state of technical implementations holds their cumulative history of development in place (like locking a gear position on a ratchet) and, further, enables ongoing development by those who extend and combine functions as new technologies emerge. Digital media and graphical displays in all our computational devices are great examples of the results of cumulative development and of design principles for recombining technical mediation and social functions.

Introducing Key Concepts and Terms

  • Combinatoriality and cumulative combinations
  • Network effects in use of technologies
  • Cognitive artefacts and cognitive technology
  • Social construction of technology
  • Metamedium (media designed for representing other media; e.g., windows-based display interfaces, mobile device screens, tablets)
  • Deep Remix/Remixability (open recombinability as properties of digital media and software)

Introductory Video Lecture

Text of Week 2 Video Introduction


  • Martin Irvine, "Technology Theory: An Introduction" (Introductory essay; finish)
  • Brian Arthur, The Nature of Technology: What It Is and How It Evolves. Chapters 1, 2, 4, and 6 (others as time allows).
  • Lev Manovich, Software Takes Command. Read Introduction and pp. 43-49.
    [Attend to Manovich's development of the concepts of metamedium (a medium for representing other media, like a PC or tablet), hybrid combinations, and the deep remix principle (an extensible feature of software and digital media). We will study the background for these concepts and how to apply them throughout the course.]
  • Michael Cole, On Cognitive Artifacts, From Cultural Psychology: A Once and Future Discipline. Cambridge, MA: Harvard University Press, 1996. Connected excerpts.
    [A good summary of the cognitive psychology background that provided important concepts and assumptions in Human-Computer Interaction/Interface design theory as it developed from the 1960s-2000s.]
  • Donald A. Norman, "Cognitive Artifacts." In Designing Interaction, edited by John M. Carroll, 17-38. New York, NY: Cambridge University Press, 1991.
    [Read pp. 1-23 for the basic concepts. Norman is a leading guru of "user-centered" design philosophy and has a background in both cognitive science and computer science.]
  • Itiel E. Dror and Stevan Harnad. "Offloading Cognition Onto Cognitive Technology." In Cognition Distributed: How Cognitive Technology Extends Our Minds, edited by Itiel E. Dror and Stevan Harnad, 1–23. Amsterdam and Philadelphia: John Benjamins Publishing, 2008. Read pp. 1-5 and 23-24 for this week.
    [Good survey of topics and concepts that we will explore more fully in following weeks. Read and skim now for basic background.]
  • Jiajie Zhang and Vimla L. Patel. “Distributed Cognition, Representation, and Affordance.” Pragmatics & Cognition 14, no. 2 (July 2006): 333-341.
    [Brief intro to topics we will study in more detail in following weeks. This is a useful, short summary of central issues in this field of research, accessible to non-specialists.]
  • From: Carl Shapiro and Hal R. Varian. Information Rules: A Strategic Guide to the Network Economy. Boston, MA: Harvard Business School Press, 1998.
    [Short excerpt from Chap. 1 on "Network Effects" in technology adoption. Also Google Books version (read Chap. 1 to p. 17). The "network effects" principle is related to what provides value to "cumulative combinations" as analyzed by Brian Arthur.]
  • Rudi Volti, Society and Technological Change. 7th ed. New York: Worth Publishers, 2014. Excerpts from intro.
    [A useful textbook introduction to mainstream ways that technology history is studied and taught.]
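The "network effects" principle discussed in the Shapiro and Varian excerpt can be made concrete with a small, illustrative calculation. The sketch below is not from the readings: Metcalfe's heuristic is one common (and contested) way to quantify network value by counting the pairwise connections possible among n users.

```python
def pairwise_connections(n: int) -> int:
    """Number of distinct user-to-user links possible among n users: n(n-1)/2."""
    return n * (n - 1) // 2

# The nonlinear growth is the point: each new user adds more potential
# connections (and, on this heuristic, more value) than the last one did.
for users in [2, 10, 100, 1000]:
    print(users, pairwise_connections(users))
# → 2 1 / 10 45 / 100 4950 / 1000 499500
```

This super-linear growth is one way to see why "cumulative combinations" of technologies gain value as adoption spreads, as Arthur's analysis suggests.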

Discussion Questions
Use these questions to help reflect on your reading and thinking this week, and choose at least one to write about:

  • How would you describe the features and functions of a contemporary electronic device like a "smart" TV connected to a cable service? A PC? Even though our devices are like a black box, can you find the indications of the cumulative combinations of functions and designs that Arthur describes?
  • What kind of technical artefact is a tablet computing device or smart phone? What cultural/social forces compel us to keep devices "black-boxed"? (We will investigate this question more fully in the following weeks.)
  • Big questions to begin thinking about: What presupposed (unacknowledged) theory of technology do you find in the Apple i-device world? Utopian? Re-mediated? Instrumental? How do technologies get bundled with political or moral determinations?

Learning Objectives and Main Topics:

Learning the major concepts and design principles of modular design in systems thinking for understanding the design and implementation of media, communication, computation, and information technologies.

Key Terms and Concepts:

  • Modularity, modular design
  • Complex system
  • Decomposition, decomposable system (decomposing complex system/network into modules)
  • Abstraction, abstraction layer or level (in hierarchy of modules and system functions)
  • Combinatorial Design
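A minimal code sketch can make these terms concrete. The example below uses hypothetical names (it is not drawn from the readings): it shows decomposition into modules, an interface acting as an abstraction layer, and combination of modules that remain black boxes to one another.

```python
class Storage:
    """Module 1: how data is kept is an internal detail, hidden from other modules."""
    def __init__(self):
        self._data = {}            # hidden implementation
    def put(self, key, value):     # the module's public interface
        self._data[key] = value
    def get(self, key):
        return self._data.get(key)

class Cache:
    """Module 2: depends only on Storage's interface, never on its internals."""
    def __init__(self, backing: Storage):
        self._backing = backing
        self._hot = {}
    def read(self, key):
        if key not in self._hot:                     # abstraction layer:
            self._hot[key] = self._backing.get(key)  # delegate to the module below
        return self._hot[key]

store = Storage()
store.put("greeting", "hello")
app = Cache(store)
print(app.read("greeting"))  # Cache never touches Storage._data directly
```

Each class could be redesigned internally without changing the other, which is the practical payoff of decomposing a complex system into modules with stable interfaces.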

Introductory Video Lecture/Presentation

Text of Week 3 Video Introduction


  • William Lidwell, Kritina Holden, and Jill Butler. Universal Principles of Design. Rev. ed. Beverly, MA: Rockport Publishers, 2010. [Selections: Read Affordances, Hierarchy, Mental Model, and Modularity for this week.]
    Well-illustrated compendium of design concepts. Helps make the concepts and principles intuitively clear.
  • Richard N. Langlois, "Modularity in Technology and Organization." Journal of Economic Behavior & Organization 49, no. 1 (September 2002): 19–37.
    [Read sections 2-4, pp. 20-26 for an overview of concepts initiated by Herbert Simon (see below) on complex systems.]
  • Carliss Y. Baldwin and Kim B. Clark. “Modularity in the Design of Complex Engineering Systems.” In Complex Engineered Systems: Science Meets Technology, edited by Dan Braha, Ali A. Minai, and Yaneer Bar-Yam, 175-205. Cambridge, MA: Springer, 2006.
    [This is a summary and application of the main ideas in their excellent book: Carliss Y. Baldwin and Kim B. Clark, Design Rules, Vol. 1: The Power of Modularity. Cambridge, MA: The MIT Press, 2000.]
  • W. Brian Arthur, The Nature of Technology: What It Is and How It Evolves. Chapters 7, 9, 11 (others as time allows).
    [Compare Arthur's combinatorial model with the specific analyses above.]
  • Janet H. Murray, Inventing the Medium: Principles of Interaction Design as a Cultural Practice. Cambridge, MA: MIT Press, 2012 (excerpts from the Introduction. pp. 1-21).
  • Apple, Inc. User Interface Design Principles for Mac OS X: The User's "Mental Model"
    Note how the "official" design principles for Apple are mainly ways to "operationalize" and "productize" (procedures for implementing functions in specific instances) well-known design principles.

Deeper Background (for reference and as time and interest allow)

Discussion Questions
Use these questions to help reflect on your reading and thinking this week, and choose at least one to write about:

  • We will use the Apple iPhone (or other smart phone/mobile device that you may own) as a continuing case study for applying the concepts in the course. Begin applying the concepts and methods from this week's readings:
  • How do modular design principles and systems thinking help explain the complex design and development of smart phones and mobile devices? Do the software interfaces (icons, user controls, feedback, display output, etc.) reveal or hide modularity?
  • Can you begin thinking through a description of related modules in the iPhone design that must connect at a systems level while the details of what happens inside a specific module remain unknown to the others (abstraction layers/levels, handing off functions, orchestrated combinatoriality)?

Assessment 1

Learning Objectives and Main Topics

Learning the major conceptual frameworks for communication and information theory as defined and used in electronics and computation, and ways to critique inherited models in order to advance more useful concepts for our communication and media environment today. Learning the concepts behind the engineering definition of information and communication is essential for understanding how all information and media technologies work. But because the engineering definitions are necessarily "pre-semantic" (or non-semantic) and bracket off the meanings of digitally encoded information units, the engineering model cannot describe how we encode meaning or the many contexts of meaning that motivate and frame transmitted signals. We need other models of communication from linguistics, semiotics, and cognitive science to complete the description of human meaning-making in artefactual communication and media representations.

Key Terms and Concepts:

  • Information (as a unit of probability and differentiation from other possibilities)
  • The Transmission model of Communication and Information
  • Dominant metaphors: signal, conduit, container, channel, source, destination
  • The meaning contexts of messages and information
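The engineering sense of information as "a unit of probability" can be illustrated with Shannon's self-information formula, -log2(p): the less probable a message, the more information its occurrence carries. This sketch is for orientation only, not part of the assigned readings.

```python
import math

def self_information(p: float) -> float:
    """Information content, in bits, of an event with probability p."""
    return -math.log2(p)

print(self_information(0.5))    # → 1.0  (a fair coin flip carries one bit)
print(self_information(0.125))  # → 3.0  (one of 8 equally likely symbols)
```

Note that the formula says nothing about what the event means; it measures only how much the event differentiates itself from the other possibilities, which is exactly the "pre-semantic" character of the engineering model.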

Backgrounds: Why the Key Concepts of Information Theory are Important

Communication and information theory from the 1950s-80s is widely taken for granted in discussions of media, technology, and computation. The signal-code-transmission model of information theory has been the foundation of signal and message transmission systems in all communication technology from the telegraph to the Internet and digital wireless signals. Originally developed as models for transmitting error-free electronic signals in telecommunications, these theories and concepts have also informed models of meaning communication in culture more broadly (with severe limitations on how meaning can be explained).

It's essential to understand how the signal transmission model works and why it's important for all the digital technologies that we use. It provides an essential abstraction layer in all electronic and digital systems. We also need to understand that it is not a model for the larger sense of communication and meaning systems that our symbolic-cognitive technologies allow us to implement. We'll need to understand why meaning and social uses of communication are left out of the signal transmission model and how we use signals and coded information units within the systems of meaning that motivate them.
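To make the signal transmission model concrete, here is a toy sketch (illustrative only, not from the course materials) of the encode-transmit-decode pipeline, using a simple repetition code to survive channel noise. Nothing in the code knows or cares what the bits mean; that is exactly the "pre-semantic" point.

```python
import random

def encode(bits):
    """Sender: repeat each bit three times (a simple repetition code)."""
    return [b for bit in bits for b in (bit, bit, bit)]

def noisy_channel(signal, flip_prob, rng):
    """Channel: flip each transmitted bit with probability flip_prob."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in signal]

def decode(signal):
    """Receiver: majority vote over each group of three received bits."""
    return [1 if sum(signal[i:i + 3]) >= 2 else 0
            for i in range(0, len(signal), 3)]

message = [1, 0, 1, 1, 0, 0, 1, 0]   # these bits could encode anything at all
print(decode(encode(message)) == message)   # noiseless round trip → True
rng = random.Random(0)
received = decode(noisy_channel(encode(message), 0.1, rng))
print(received)  # usually the original message, despite flipped bits
```

The redundancy added by `encode` is what Shannon-style coding buys: reliable transmission over an unreliable channel. Meaning enters only when humans (or software acting for them) interpret the delivered bits as symbols.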

How do we get "meanings" into and from bits and data? The meaning networks of communicators and information users (for anything expressed in any medium) are understood, assumed, and encoded in signal mediums but are not properties of their material form. (This is the basic feature of human symbols, which is the main subject of semiotics: meaning is not a property of perceptible signals but is what is enacted by cognitive agents who use collectively understood material signs.) In any model of information and communication, we also need to account for contexts in two senses of the term: both the sender's and receiver's context (world of meanings and kinds of expressions assumed), and the message's contexts (its relation to other messages both near and far in time).

Examples: We know when information units/data have been transmitted and received successfully because we recognize the symbolic units (meaning units) that are being encoded! The background technologies are designed to send and receive signals (e.g., radio waves, bits and bytes) without error (or with errors reduced enough for reliable decoding). But we would only design this kind of system because we are encoding symbolic expressions or representations already understood as meaningful. Think about the fact that we recognize as meaningful:

  • the text in text messages (after the information is interpreted in software for rendering on our displays): meaning is what motivated the encoded information;
  • the spatial array of visual information in a photograph after display;
  • the sounds that we recognize as a musical genre when played through audio devices.

We see many other limitations in the linear one-way transmission models for describing communication: all of our communications have production and reception contexts that form networks of meaning, use, and purposes that must be accounted for in other ways. Further, what kinds of models can account for all the communication "modalities": one-to-one, one-to-many, many-to-one, many-to-many, and dialog (one-other, mutually assumed); synchronous (at the same time) and asynchronous (delayed reception and response, over short or very long time spans)?

The transmission model of information is essential to understand, but can't be used for extrapolating to a model for communication and meaning (as it often is in some schools of thought). The signal transmission theory is constrained by a signal-unit point-to-point model. It can't account for the fact that in living human communication there is never only one message to be communicated or one unit of communication/information but, rather, a message unit appears in a dense network of prior, contemporaneous, and future messages surrounding anything communicated. The motivation context of a message includes essential meta-information known to communicators using a medium (kinds/genres of messages, social conventions, assumed background knowledge). We will investigate how this works more fully next week.

Week 4: Introductory Video Lecture/Presentation

Text of Week 4 Video Introduction

Other online Video Lessons (for background on the signals engineering model)

  • Khan Academy Video Lessons: Information Theory in Scientific Descriptions (Basic Background)
    You can work through this introductory set of video lessons quickly; focus on modern information theory.
  • Optional: MIT, Digital Communication Systems (MIT OpenCourseWare videos).
    This is a good introduction to electrical/digital information theory and systems design. As is common, "communication" is used for signal "transmission" independent of semantic or conceptual value. View the intro lecture and Lesson 9, as time allows.

Readings: Models of Communication and Information and Their Consequences

  • Luciano Floridi, Information, Chapters 1-4.
    [For background on the main traditions of information theory, mainly separate from cognitive and semantic issues.]
  • James Gleick, Excerpts from The Information: A History, a Theory, a Flood. (New York, NY: Pantheon, 2011). Excerpts from Introduction and Chap. 7.
    [Readable background on the history of information theory.]
  • Peter Denning and Tim Bell, "The Information Paradox." From American Scientist, 100, Nov-Dec. 2012.
    Computer science leaders address the question: Modern information theory is about "pre-semantic" signal transmission and removes meaning from the equation. But humans use symbol systems for meaning. Can computation include meaning?
  • Ronald E. Day, "The ‘Conduit Metaphor’ and the Nature and Politics of Information Studies." Journal of the American Society for Information Science 51, no. 9 (2000): 805-811.
    Models and metaphors for "communication" have long been constrained by "transport", "conduit," and "container/content" metaphors that provide only one map of a larger process. How can we describe "communication" and "meaning" in better ways that account for all the conditions, contexts, and environments of "meaning making"? How do network metaphors disrupt the linear point-to-point metaphors?
  • Terrence W. Deacon, "What's Missing from Theories of Information?" In Information and the Nature of Reality: From Physics to Metaphysics, edited by P. C. W Davies and Niels Henrik Gregersen, 146–69. Cambridge, UK: Cambridge University Press, 2010.
  • James Carey, "Communication and Culture" (from Communication as Culture: Essays on Media and Society, 1992) [pdf]. Summary of Carey's Views (excerpts)
    [An influential essay by a leader in the modern field of "Communication" that repositions the study of communication and mediating technologies in the cultural, ideological, and economic context.]

Reference and Supplemental: Semantic and Pragmatic (Context) Dimensions

Discussion Questions
Use these questions to help reflect on your reading and thinking this week, and choose at least one to write about:

  • Describe the main features of the signal transmission theory of information. Why doesn't the signal-code-transmission model include a description of meaning (the semantic, social, and cultural significance of encoded signals)? What is missing in the "information theory" model in accounting for messages and meanings? Where are the meanings in our understanding of messages, media, and artefacts?
  • What are the conceptual consequences of using the content - container - transport/conduit metaphors when describing communication?
  • How do we know what a text message, an email message, or a social media message means? What kinds of communication acts are involved? What do senders and receivers know that isn't simply a matter of sending/receiving the signals (strings of text characters) correctly?

Learning Objectives and Topics:

Gaining a foundation in the significance of the research generated by the study of symbolic cognition in intersecting disciplines and sciences--linguistics, anthropology, evolutionary psychology, and cognitive science more broadly--and how this developing knowledge base allows us to ask better-informed questions about technology and mediated communication systems. Learning the main topics of research and debate in the cognitive science fields on mind-brain / machine-computer parallels, and why the models for computation and cognition have been so interconnected.

Research in many fields continues to discover more and more about the consequences of being the "Symbolic Species," in Terrence Deacon's phrase. This week, you will learn the main concepts from cognitive science research for describing the "cognitive continuum" from language and symbolic representation to multiple levels of abstraction in any symbolic representation (spanning writing, mathematics, symbolic media like images and their combinations in film and multimedia, and computation and code). The human ability to make meaning in any kind of symbolic expression, and to embody meaningful, collectively understood expression in media technologies, depends on the use of symbols and the advantages of symbolic cognition.

We will follow a major direction in recent research that studies media, communication, and computational technologies in a continuum of accumulating cognitive advances that have enabled a cascading series of human capabilities afforded by symbolic cognition: the capacity for abstract thought and the reflexive use of symbols, the off-loading of cognitive tasks and memory by extension in artefacts, externalized media storage, and automated computational processes.

Overview of Topics and Themes:
Symbolic Cognition, Sign Systems, Mediation > Cognitive Technologies

Within a broad cluster of fields--ranging from neuroscience to cognitive linguistics, cognitive anthropology, and computational models of cognition and artificial intelligence research--there has been a major convergence on questions and interdisciplinary methods for studying cognition, human meaning-making, and the "symbolic faculty" generally, including all our cumulative mediations and externalizations in "cognitive technologies." Cognitive science has been closely related to computer science in seeking computational models for brain and cognitive processes, proposing hypotheses that explain cognition as a form of computation (the "computational theory of mind/brain"), and attempting to model complex parallel software processes on what we know about the neural bases of cognition ("cognitive computing").

For the study of human symbolic cognition and meaning making in sign systems, many disciplines now converge around the major question of how our technically mediated meaning systems function with analogous and parallel "architectures" that must include (1) rules for combinatoriality of components (an underlying syntax for forming complex and recursive expressions of meaning units), (2) intersubjective preconditions "built-in" to the meaning system for collective and shared cognition, and (3) material symbolic-cognitive externalizations (e.g., writing, images, artefacts) transmitted by means of "cognitive technologies" (everything from writing to digital media and computer code) which enable human cultures and cultural memory. This recent interdisciplinary research is a "game changer" for the way we think about human communication and media technologies.

Synthesizing views, we can say that the human symbolic faculty has generated a continuum of functions from language and abstract symbolic thought to machines, media technologies, and computation:


The mainstream disciplines in communication and media studies are very conservative (remaining within a demarcated field in the humanities and social sciences) and have not yet incorporated recent advances in cognitive science fields that are directly relevant to core assumptions and research questions on language, symbolic culture, and media technologies. We therefore have an open opportunity to learn from cognitive science fields and reconfigure the inter- and transdisciplinary field in promising ways for both theoretical and applied work.

Related Principles in all Symbolic Systems and Technologies: Combinatoriality, Compositionality, Componentiality, Recursion, Externalized Memory in symbols and artefacts, Intersubjective and Collective foundation of meaning.

Video Introduction

Text of Week 5 Video Introduction


Discussion Questions
Use these questions to help reflect on your reading and thinking this week, and choose at least one to write about:

  • Building on the past two weeks' topics, discuss a way to describe the forms of our symbolic activity as organized in an everyday media form: a software app, a film/video/TV shot or sequence (not a whole movie/video), a series of photographs, a piece of music.
  • Can you describe the "distributed cognition" / "collective symbolic cognition" functions in our everyday interactions with interfaces and artefacts? Navigating by GPS? Offloading concepts, writing, images to digital storage technologies?
  • Does the view of computer and media technologies as "cognitive technologies" or "symbolic-cognitive artefacts" provide a better understanding of these technologies?

Assessment 2


Learning Objectives

Gaining a foundation in the influential theories, key terms, concepts, and interdisciplinary approaches for the study of media and symbolic, technical mediation that have shaped research and popular conceptualization from the 1960s to the present.

The cluster of terms for media and interface are used in so many ways that we need to unpack the history of the concepts and find useful ways of using the terms in descriptions and analysis. What is "new" about new media, software-controlled media, and network mediation, and what aspects can we understand as a continuum in re-combinatorial functions and processes? How can we best understand, describe, and analyze the concepts and implementations for interfaces and multimodal forms? How do interfaces go "meta" in combinatoriality and presentation frameworks (computer devices and digital display screens as a metamedium, a medium for other media)? How do media technologies mediate social agency, and how do they become major nodes in distributed agency and cognition?

Key terms and concepts:

  • Medium/media as social-technical implementations of communication and meaning functions maintained by roles in a larger cultural, economic, and political system
  • Mediation as a function of a medium (e.g., text/print, image technologies, media industries)
  • Interface as the physical-material contact point for technical mediation with users (social-cognitive agents) of media
  • Technical Mediation (in Latour's terms) as the means of distributing agency in a social-technical network
  • Media System as the interdependent social configuration of technologies and institutions
  • Communication vs. Transmission (in Debray's terms): differentiating media technologies and their functions for almost synchronous communication within a cultural group (e.g., telephone, email, TV, radio) vs. technologies for transmission of meaning and cultural identity over long time spans (e.g., written artefacts, books, recorded media, computer memory storage, museums, archives).



Key Traditions and Concepts in Media Theory

  • Martin Irvine, "Media Theory: An Introduction"
  • Denis McQuail, McQuail's Mass Communication Theory. Sixth Edition. London, UK: Sage Publications Ltd, 2010. Excerpt: "The Rise of Mass Media."
    [This is a chapter from a standard textbook in communication and media studies. Review this for a brief background on the history of mass media from print to television before the digital media transition.]
  • The legacy of Marshall McLuhan, "The Medium is the Message" (Excerpts from Understanding Media: The Extensions of Man. 2nd Edition).
    [McLuhan's main concepts and arguments for media are useful to rethink in the context of contemporary media. Note how he was trying to find a way to uncover the forces and influences of a medium that are embedded in its form and institutions.]
  • Lev Manovich, The Language of New Media. Cambridge: MIT Press, 2001. Excerpt: "What is New Media?"
    [Note the categories Manovich sets up for defining "New Media". This was Manovich's major work before Software Takes Command.]
  • Jay David Bolter and Richard Grusin. Remediation: Understanding New Media. Cambridge, MA: The MIT Press, 2000. Excerpts from the Introduction and Chapter 1.
  • Lisa Gitelman, Always Already New: Media, History, and the Data of Culture. Cambridge, MA: The MIT Press, 2008. Excerpt from Introduction.
    [Includes excellent bibliography of references.]

From Media to Media Systems and Mediation

  • Martin Irvine, Working with Mediology and Network Theory (introductory essay)
  • Regis Debray, "What is Mediology?" Le Monde Diplomatique, Aug., 1999. Trans. Martin Irvine.
  • Regis Debray, Transmitting Culture, trans. Eric Rauth. New York: Columbia University Press, 2000.
    Excerpts in pdf: From Chaps. 1-2; from Chap. 7, "Ways of Doing."
    Note important distinction between "communication" and "transmission" (over longer time spans).
  • Bruno Latour, “On Technical Mediation.” Common Knowledge 3, no. 2 (1994): 29-64.
    [This is a very accessible essay that outlines Latour's approach to the "distributed agency" in technologies as used and valued socially.]

Discussion Questions
Use these questions to help reflect on your reading and thinking this week, and choose at least one to write about:

  • Consider a digital interface (e.g., a Web browser and its "windows"; the "tiled" iPhone/iPad display; or a specific software interface) in as many dimensions of mediation as you can describe.
  • What are the consequences of software-driven screens implementing the "metamedium" function?
  • Thinking with Debray's mediology and Latour's Actor-Network Theory approach, describe as many as you can of the concealed ("invisibility cloaked") forces and agencies being mediated or "interfaced" in an everyday digital device (not the "content").

Learning Objective and Key Ideas:
Learning the key concepts in "computational thinking" and the design principles in computation and software code. We will focus on foundational concepts that can be extended for understanding today's environment of "computation everywhere" and "an app for everything."

This unit focuses on the key concepts in computation and core models for software, computer and information design, and all digital media. We will approach the questions from a non-specialist perspective, but it's important for everyone to get a conceptual grasp of the core ideas in computation because they are now pervasive throughout many sciences (including the cognitive sciences), and are behind everything we do daily with computational devices, information processing, and digital media (for example, the Google algorithms for searches, all the apps in mobile devices, the software functions for displaying and playing digital media).
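To make "computational thinking" concrete before the readings, here is a minimal Python sketch (Python is the language used in this unit's tutorial; the example itself is illustrative, not part of the course materials) of one classic algorithmic idea, binary search: solving a problem by repeatedly halving it.

```python
def binary_search(items, target):
    """Find target in a sorted list by repeatedly halving the search range."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2      # inspect the middle element
        if items[mid] == target:
            return mid               # found: return its index
        elif items[mid] < target:
            low = mid + 1            # discard the lower half
        else:
            high = mid - 1           # discard the upper half
    return -1                        # target not present

# A sorted list of 1,000 numbers needs at most ~10 comparisons.
print(binary_search(list(range(0, 2000, 2)), 42))   # → 21
```

The point is not the code itself but the pattern of thought behind it: decomposing a problem, exploiting structure (the list is sorted), and expressing the solution as a precise, repeatable procedure.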



  • Jeannette Wing, Computational Thinking (Video)
    [Introduction to a way to make computing accessible to non-CS students.]
  • Jeannette Wing, "Computational Thinking." Communications of the ACM 49, no. 3 (March 2006): 33–35.
    [Short essay on the topic; Wing has launched a wide discussion in CS circles and education.]
  • Daniel Hillis, The Pattern On The Stone: The Simple Ideas That Make Computers Work. New York: Basic Books, 1999. (excerpts from chaps. 1-2).
    [You can read these basic concepts quickly, and move to the next reading, which provides the core concepts to prepare you for the tutorial using the Python programming language.]
  • David Evans, Introduction to Computing: Explorations in Language, Logic, and Machines. Oct. 2011 edition. CreateSpace Independent Publishing Platform; Creative Commons Open Access:

    Read chapters 1-3 (Computing, Language, Programming); others as reference and as your time and interest allow at this point. You can always return to other chapters for reference and self-study.
    [This is a terrific book based on Evans' years of teaching Intro Computer Science courses at the University of Virginia. The book is open access, and the website has updates and downloads.]
  • Martin Irvine, An Introduction to Computational Concepts

Reference and Going Further (Supplementary)

  • Ron White, How Computers Work. 9th ed. Indianapolis, IN: Que Publishing, 2007. Excerpts.
    • The Basics (hardware and software architectures)
    • Software Applications
      [Use this book as a reference for the operational nuts and bolts of hardware components and the computational principles behind them. Note that the systems architecture described is applicable to the miniaturization and modular design of mobile devices as functions become integrated in fewer chips and smaller space.]
  • Denning, Peter J. "The Great Principles of Computing." American Scientist, October, 2010.
  • -----. "What Is Computation?" Ubiquity (ACM), August 26, 2010, and republished as "Opening Statement: What Is Computation?" The Computer Journal 55, no. 7 (July 1, 2012): 805-10.
  • Rosenbloom, Paul S. "Computing and Computation." ACM and The Computer Journal 55, no. 7 (July 1, 2012): 820-24.

Learning Project: Lessons on Codecademy

Discussion Questions

  • Describe what you learned from working through the Codecademy tutorial and making connections to the computing principles introduced this week. Did any key concepts become clearer?

Assessment 3

Learning Objectives and Topics:

Learning how we got from the model of computation and computers as general logical-symbolic processors to the idea of "personal" computers for any human-software interaction and computational processes for creating, combining, processing, displaying, storing, and transmitting of human symbolic representation in any digitizable medium.

Learning the symbolic and cognitive functions of media supports and interfaces in the history of their implementations, and the affordances of computationally (software-based) represented media. What can we learn about abstractable, re-implementable functions in the continuum of symbolically mediating material supports, from passive interfaces (writing and image surfaces, substrates, recordable media) to metamedia (computer screens, graphical and touch interfaces for controlling software) for representing other media? Can you describe and trace the substrate function (the symbolic use of supports, substrates, and surface materials for inscription, memory, images, and photographs) through to graphical "windows"-based computer displays (functioning as a meta-substrate or metamedium, a surface medium for representing and interpreting other media)? How can other, new combinations of media and software emerge from these key design principles?

Key Concepts and terms:

  • Interface (for human interaction with, and control of, software representations and outputs)
  • Graphical User Interface (GUI) and screen interfaces
  • Metamedium / Metamedia
  • Substrate Function (for symbolic representations)
  • Continuum of symbolic functions and the Meta function in computation and digital information/media

Introductory Videos:


Framing Concepts: From basic computation to the principles behind contemporary digital multimedia

  • Lev Manovich, Software Takes Command, pp. 55-239; and Conclusion.
    Follow Manovich's central arguments about the "metamedium," "hybrid media," and "interfaces," and the importance of Alan Kay's "Dynabook" metamedium concept.
    • For Manovich, the main differentiating feature of "new media" is software: media produced, controlled, and displayed with software. Digital media = software-mediated media.

How we get from computing machines to digital media and metamedia interfaces:

  • Vannevar Bush, "As We May Think," Atlantic, July, 1945. (Also etext version in pdf.)
    • A seminal essay on managing information and human thought by a leading engineer and technologist during and after World War II. Bush's pre-modern-computing extrapolations led to the concepts of the GUI (graphical user interface), hypertext, and linked documents. His conceptual model, though not yet implementable with the computers of the 1940s-50s, inspired Doug Engelbart and the computer designs that followed in the 1970s-80s, and on to our own hypermediated era.
    • Wikipedia background on this essay.
    • Background on Bush's concept of the "Memex", precursor to hypertext information systems.
    • Video of a working model of the "Memex".
  • Douglas Engelbart and "Augmenting Human Intellect"
    • Engelbart is best known for inventing the graphical interface, the "desktop computer metaphor," the mouse, and the hyperlink. His research and development teams at the Stanford Research Institute (SRI) in the 1960s-70s were influenced by Vannevar Bush's vision and motivated by a new conception of computers not simply as business, government, or military machines for instrumental ends but as aids for core human cognitive tasks that could be open to everyone. His approach to computer interfaces using interaction with a CRT display (an early TV-style screen) launched an extensive HCI engineering/design and computing community around user interaction and "augmenting" (not replacing or simulating) human intelligence and cognitive needs. This is one major thread in what is still an ongoing debate in computer science and AI, and this approach influences current models of extended or externalized cognition in artefacts and computation.
    • Engelbart, "Augmenting Human Intellect: A Conceptual Framework." First published, 1962. As reprinted in The New Media Reader, edited by Noah Wardrip-Fruin and Nick Montfort, 93–108. Cambridge, MA: The MIT Press, 2003.
    • Also available online at the Doug Engelbart Institute:
      HTML annotated edition of "Augmenting Human Intellect: A Conceptual Framework" (the Doug Engelbart Institute site).
    • See Engelbart's patent application (with diagrams) for an "X-Y Position Indicator for a Display System" (aka the mouse).
    • Background on Doug Engelbart at the Computer History Museum and the influence of Vannevar Bush's ideas.
  • Alan Kay and the Dynabook Concept as a Metamedium:
    • Alan Kay's original paper on the Dynabook concept: "A Personal Computer for Children of All Ages" (Palo Alto: Xerox PARC, 1972).
      [Wonderful historical document. This is 1972--years before any PC, Mac, or tablet device.]
    • Alan Kay and Adele Goldberg, “Personal Dynamic Media” (1977), excerpt from The New Media Reader, ed. Noah Wardrip-Fruin and Nick Montfort. Originally published in Computer 10(3):31–41, March 1977. (Cambridge, MA: The MIT Press, 2003), 393–404.
      [Revised description of the concept for publication.]
    • Interview with Kay in Time Magazine (April, 2013). Interesting background on the conceptual history of the GUI, computer interfaces for "interaction," and today's computing devices.
  • Janet Murray, Inventing the Medium: Principles of Interaction Design as a Cultural Practice. Cambridge, MA: MIT Press, 2012. Selections in 2 files: Introduction and Chap. 2: Affordances of the Digital Medium
    • An excellent recent statement of the contemporary design principles that expand the design tradition we are studying this week.

Discussion Questions
Use these questions to help reflect on your reading and thinking this week, and choose at least one to write about:

  • Describe the conceptual, technical, and design steps that enabled computers and computation to be used for information access and processing with any kind of medium by ordinary, nontechnical users.
  • How are most of our contemporary "computing devices" (PCs, smart phones, tablets) forms of metamedia that deploy complex modular and combinatorial technologies as a medium for processing, representing, interacting with, and transmitting other media?
  • What are we doing when we use an interface in a computing device? What is it an interface to/for?

Learning Objectives and Topics:

This unit will focus more on the concepts and designs behind the hardware of our computational devices. Understanding the cumulative steps in the development of the computer from military, university, and business research communities to a wide consumer base and multiple devices that take advantage of component miniaturization and networks. How did the computer "disappear" into consumer media appliances? The unanticipated convergence of telecommunications, computing, and methods for digitizing media (text, audio, graphics and images, photography, film/video). The transition to embedded computation and smaller processing chips in everyday use (examples: sensors, consumer devices, transportation systems).

Continuing with our ongoing case study of the iPhone:

Understanding the combinatorial path of technologies leading to the iPhone and other multipurpose mobile devices. Understanding the design demands of complex interdependent modular systems and the underlying non-technical and social-technical dependencies and agencies in a device like an iPhone. Analyzing technical components as interfaces to the larger dependency network of design principles and the social, institutional, and political-economic forces that have made the components and modular combination possible in this implementation.

Introductory Video Lecture

Text from Video Introduction

Readings: Intro to a design history of the computer up to PCs and mobile devices

History of Computing and Design Concepts from Mainframes to PCs:

  • Michael S. Mahoney, "The Histories of Computing(s)." Interdisciplinary Science Reviews 30, no. 2 (June 2005). [The different research and development communities behind concepts and applications for computing.]
  • Martin Campbell-Kelly, "Origin of Computing." Scientific American 301, no. 3 (September 2009): 62–69.
  • Martin Campbell-Kelly and William Aspray. Computer: A History Of The Information Machine. 3rd ed. Boulder, CO: Westview Press, 2014. Excerpts from Part 4 on the Personal Computer, Internet, and World Wide Web.

Technical Architecture:

  • Ron White, From How Computers Work (9th Edition): Part One: Boot Up Process (History and Architectures)
    Use for reference and conceptual understanding of basic PC computing architecture. This is extensible to mobile devices with further modularization and miniaturization of components.
  • More Advanced:
  • David A. Patterson, and John L. Hennessy. Computer Organization and Design: The Hardware/Software Interface. 5th ed. Oxford, UK; Waltham, MA: Morgan Kaufmann, 2013. Excerpts from Chapter 1.
    [Excellent overview of important concepts for system architecture from PCs to tablets. For beginning computer engineering students, but accessible.]

iPhone Architecture:

Sources for computer history

Mobile Telephony and Mobile Internet Devices: Statistics and Implications

Case Study:
De-blackboxing the iPhone and other modular computational devices

  • Deblackboxing Technology: Presentation, Prof. Irvine (view it in Presentation mode; click through animation)
  • Why it matters: Understanding a complex technical device like an iPhone means going beyond the consumer's view to the invisible story of bundled functions and the design principles and technical-social conditions that allowed this kind of convergence at this moment. Considering the many forms of technical mediation and how they intersect in a specific product. How consumers are socialized into functions and features that remain black-boxed and productized.

Discussion Questions
Use these questions to help reflect on your reading and thinking this week, and choose at least one to write about:

  • Big picture: "De-blackboxing" the development of computational, telecom, and digital media devices by connecting some of the main relationships, dependencies, and conditions that enabled the devices to have their power in our current "information age": in the related industries and institutions; policy and government funding; private investment and entrepreneurial culture; various states of the material technologies (hardware, and/or software); standards-making across institutions and manufacturers allowing development of markets and the creation of industry ecosystems; social conditions leading to the adoption of technologies, market conditions, and consumer uses.
  • Develop a specific case study: follow a "program of action" in a PC or mobile device and uncover as far as you can the technical components and software that can be interpreted not as objects or black-boxed products but as interfaces to the whole system that enables the functions we see and use: cumulative combinatorial dependencies (Brian Arthur's method), modular architectures of software layers and hardware components (systems method), and ecosystems of industry relationships (business and economic dependencies). "Who" or "what" is performing the actions that we observe? "Who" is "taking" or "making" the photo on your smart phone? Can you see some outlines of a pattern of distributed and delegated agency (Latour)? What stored or bundled agencies and prior functions do we trigger when we use an app or media feature?

Assessment 4

Learning Objectives:

Learning how to describe and understand digital media and the affordances and constraints of expressions and artefacts in digital form. We will focus on digital music and photography for understanding the technical specifications and formats, and the use of these media artefacts socially and culturally. By connecting the readings this week with earlier units in the course, you should be able to answer these questions:

  • What happens in digitization of media (text, images, music, film/video), and what are the consequences and affordances of digital media platforms, interfaces, metamedia devices (mobile, tablets)?
  • What are the differences between digitized media created in analogue platforms and "new" media "born digital"?
  • How are digital media formats and production/playback/display software interfaces designed to re-mediate prior forms and re-implement their social-cultural-political functions (texts, images, photography, film/video/TV, music, news media)?
  • How do we expose more clearly the analog-digital-analog continuum of contemporary media experience, and the significance of designing and programming "digital media" to simulate or emulate "analog media"?


The analog-digital-analog continuum
We've seen much debate across many disciplines over the past 25 years about the status, properties, and functions of digital media -- as objects of culture (differences in artefacts with various uses and distribution), law (status of creative expression as copyrightable intellectual property), and economics (business and commercial value, monetization of everything digital). Fixation on things digital obscures a fundamental reality: we now live in an analog-digital-analog continuum -- with ongoing looping back and forth, among and between, the material and symbolic states.

What is digitization? The need for a basic conceptual foundation in how digital sampling works
The first readings provide some basic background on the common technical means for encoding/representing light properties (like photographic images captured on sensors in a digital camera) and audio and music (encoding continuous acoustic waves in an audio file). The trick in digitizing is representing mathematically what occurs in perceptible forms like visual representations and sounds: taking many time-stamped snapshots or slices (samples) of these continuous time and energy states (known as "analog"), using software to calculate the energy values at each defined time slice (a sample, say, in milliseconds), and then encoding these slices of continuous states as discrete time representations with quantifiable values. (There -- you have the basic definition of a "digital medium"!) The details get technical and mathematical very quickly, but learn as much as you can to demystify the basics of digitization. Of course, when we display or play back the digitally encoded information (using software to reverse the process and hardware to render media in perceptible forms), we are back in our analog, perceptible-symbolic-material world. Wikipedia has useful explanations of these terms.
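The sampling process just described can be sketched in a few lines of Python (a hypothetical illustration; the sample rate, tone frequency, and bit depth are arbitrary choices, not values from the readings). A continuous sine tone stands in for the analog signal: the code takes time-stamped slices of it at a fixed rate and quantizes each slice to a discrete integer value.

```python
import math

SAMPLE_RATE = 8000    # samples (slices) per second of the continuous signal
FREQ = 440.0          # an A4 tone: a continuous (analog) acoustic wave
BIT_DEPTH = 16        # each sample quantized to a 16-bit integer
MAX_AMP = 2 ** (BIT_DEPTH - 1) - 1   # 32767, the largest representable value

def sample_tone(duration_sec):
    """Digitize a sine tone: take time-stamped slices and quantize each one."""
    n_samples = int(SAMPLE_RATE * duration_sec)
    samples = []
    for n in range(n_samples):
        t = n / SAMPLE_RATE                               # time of this slice
        analog_value = math.sin(2 * math.pi * FREQ * t)   # continuous value in [-1, 1]
        samples.append(round(analog_value * MAX_AMP))     # discrete integer value
    return samples

digital = sample_tone(0.01)   # 10 ms of continuous sound becomes 80 discrete numbers
print(len(digital))           # → 80
```

Note what is lost and gained: everything between two slices is discarded (hence higher sample rates and bit depths mean higher fidelity), but the result is a finite list of numbers that software can store, copy, transform, and transmit without degradation.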

Digital media expose the real power of our symbolic systems that was "black boxed" in analog!
Our creation and perception of digital media artefacts (songs, photographs, film/video, computer files, Web pages, mobile app displays) begins, ends, and is motivated by the humanly perceived, socially embedded, material-symbolic forms that we can now see not as forms tied to a material substrate like paper, film, or a recording medium, but as symbols capable of multiple implementations or instantiations. Writing, text, images, symbolic sounds, even 3D artefacts were never really functions of their traditional or historical media or their past substrates of representation, storage, and transmission. Writing, text, and images have now become more powerful and more widely distributed symbolic forms than ever before. Our digital technologies now expose what was always hidden in plain sight about our symbolic representations: they are empowered by social and cultural functions, not by the state of their material forms. There are definitely many "messages" in the material form of a medium: one of them is how media technologies can reshape and re-mediate symbolic forms and social uses of communication that preexist any specific, historical configuration of technical mediation systems themselves.

So, how do we define "being digital" or "digital being"? Is there a "digital culture"?
We almost always use digital media to simulate, emulate, or reproduce the experience of analog media, that is, to represent symbolic forms that can be created and received by human senses and perceptual organs. You frame and shoot a photo on your smart phone with the same eyes and assumptions used for film photos or for viewing photo prints or reproductions in print media. We read digital text with the same eyes and assumptions as for reading printed texts. Adobe Photoshop began with, and still incorporates, photo darkroom concepts, processes, and metaphors established in analog lens- and film-based photography. Similar analog music and audio conceptual extensions apply to digital audio workstation software (DAWs like Reason, Logic, and Pro Tools) used in music production and in DJ live performance software platforms (usually with an interface that incorporates a virtual turntable look). Your smart phone or PC couldn't play or display digital music or video without a DAC (digital-to-analog converter) and CODEC software for the specific file format to be interpreted for the hardware. On our devices, it all culminates in a transducer, a physical-material interface that converts electronic signals to perceptible images (in the humanly perceptible light spectrum and cinematic frame rate for video) and audio sounds (acoustic wave forms output in the humanly audible range).
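To see how a format's parameters (sample rate, bit depth, channels) travel with the encoded samples so that a player's codec and DAC chain can render them, here is a hedged sketch using Python's standard-library `wave` module to write a sine tone as 16-bit PCM in a .wav container (the tone parameters and file name are illustrative):

```python
import math
import struct
import wave

def write_tone(path, freq=440.0, seconds=1.0, rate=44100):
    """Encode a sine tone as 16-bit mono PCM in a WAV container."""
    n = int(rate * seconds)
    frames = b"".join(
        struct.pack("<h", int(32767 * math.sin(2 * math.pi * freq * t / rate)))
        for t in range(n)   # each frame: one quantized 16-bit sample
    )
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)     # mono
        wav.setsampwidth(2)     # 2 bytes = 16-bit samples
        wav.setframerate(rate)  # sample rate stored in the file header
        wav.writeframes(frames)

write_tone("tone.wav")
```

A media player reads the header, decodes the PCM samples, and hands them to the DAC and transducer (speaker), which convert the numbers back into an acoustic wave -- the analog-digital-analog loop in miniature.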

“Digital culture” can only mean our social and cultural experience with computational and software technologies embedded in the whole cumulative combination of technologies that we live with. While digital media and software agents often remediate prior functions and agencies and can automate many processes behind the scenes, these technologies all coexist with prior or “traditional” technical mediations and symbolic systems. Our "mediation continuum" includes, subsumes, and folds in, digitizable mediations that we now use to record, store, transform, and distribute all forms of past and present media from text and historical images to film/video and all forms of hybrid media. The real "transform" (in the algorithmic sense) is now found in two additive features: the ability for digital artefacts (any encoded unit of expression) to be open to unlimited additional software processing or interpretation, and the ability to off-load and store all cumulative media representations in digitizable forms for ongoing recall, processing, and interpretation.

Digital music, photography, and video thus begin and end in analog material, perceptible states (except when produced or composed in software as a media object for later output). If we're always in an analog world, what are the important differences in the media states or formats? As Manovich and many others argue convincingly, the difference in the digital artefact is its continual openness to software processing and transformations beyond any initial physical or recorded state. Further, digital media can be produced and output with many layers of digital/software production, and can be totally produced in software. We have post-photography photos and film/video that are either totally abstract (not formed by lens-based captured light and images) or HD hyperreal simulations of photo-optical representations (e.g., the movies Avatar, Life of Pi, and The Avengers). We have all kinds of music with software generated sounds mixed (in software!) with live acoustic (analog) instrument and vocal sounds, and all kinds of hybrid mixed combinations of sound sources. What, then, is the status of the sometimes fixed but mutable digital artefact? How do we best understand the layers and interfaces in all forms of mediation and remediation when our media are always already hybrids?

Key terms and concepts to be learned:

  • Analog / Digital
  • Digitization
  • Sample (sample rate)
  • Codec (analog-digital enCODing and DECoding hardware and software)
  • Digital artefact / digital object
  • Digital media format/standards (e.g., .jpg; .mp3; .pdf; .doc; .mov)

Key questions:

  • What are the differences in the properties of analog and digital media?
  • What does it mean to create, capture, or record a medium in a digital form, that is, what is involved in producing a digital representation of an analog medium?
  • What are the physical and computational steps in making a digital media form and then displaying or playing it back through our devices and equipment?
  • What are the key factors in digital media that determine the quality of the medium as we perceive it?

Video Introduction

Text from Video Introduction


Digital Photography and Audio: Technical Details

  • Ron White and Timothy Downs. How Digital Photography Works. 2nd ed. Indianapolis, IN: Que Publishing, 2007.
    [ Excerpts that cover the basics of the digital camera and digital image creation, memory, and processing.]
  • John Watkinson, Art of Digital Audio. 3rd ed. Oxford; Boston: Focal Press, 2000.
    [You won't have to read this straight through, but read around to see if you can get the basic concepts of digital audio encoding and how/why it works. This is more detail than you will probably want or need, but very useful for demystifying the well-designed engineering processes behind all the digital audio we use every day! Written at the level of systems abstraction, the author is not concerned with the standards, formats, and commoditization of digital media artefacts that come at the next levels of implementation.]

The Technologies and Social Conditions

  • Jonathan Sterne, MP3: The Meaning of a Format. Durham, NC: Duke University Press Books, 2012.
    Introduction: Format Theory (excerpt)
  • Raphael Nowak and Andrew Whelan. "Editorial: On the 15-Year Anniversary of Napster - Digital Music as Boundary Object." First Monday 19, no. 10 (October 5, 2014).
    [Useful discussion of the unpredictable convergence of forces that enabled the MP3 music format to become standardized through a base of users and gaps in software patent regulations. The entire issue of this journal is devoted to the digital media and music environment after music file sharing in standardized formats.]
  • Lev Manovich, "New Media: Eight Propositions." Excerpt from "New Media from Borges to HTML," from The New Media Reader, edited by Noah Wardrip-Fruin and Nick Montfort, The MIT Press, 2002.
    This is a different summary of Manovich's approach in a different context.
    Review also the main points in Manovich, Software Takes Command (from Week 8),
    and The Language of New Media (Cambridge: MIT Press, 2001), from chap. 1: "What is New Media?".

Social and Institutional Contexts of Digital Media

Case Studies: MP3, Streaming Media, Digital Photography, and Digital Convergence

Examples: Photographs [Presentation] and Music [Grooveshark]

Discussion Questions
Use these questions to help reflect on your reading and thinking this week, and choose at least one to write about:

  • What is a digital media object/artefact? Can you describe or define what is specific to digital media objects and what is part of the more general social and cultural functions of any type of medium? How do we conceive of a file, an object, a digital stream, an artefact? What is the digital artefact's relation to our systems of symbolic representation in any form?
  • How would you describe the analog-digital continuum that all digital media must be designed to implement? Does the background on the technologies this week help you further understand the "deblackboxing" of iPhone app processes that we studied last week?
  • What are the differences between media captured digitally (e.g., with a camera or with music recording equipment) and media created totally within a software environment (media "born digitally") and designed for playback and display through decoding equipment that produces perceptible analogue artefacts (music/sound, images/photographs/video; graphics)? Use examples.

Learning Objectives and Topics:

  • Understanding the background history and technical design of the Internet and the "Inter-networking" architecture concept embodied in the open protocols and standards of the Internet.
  • Understanding how the Internet is designed as a decentralized network based on a design for "end-to-end" connections.
  • Understanding the Internet as a paradigm case of "cumulative orchestrated combinatoriality" (Arthur) with intersecting histories of technical-social-political-economic development and interdependencies.
  • Understanding the consequences of the extensible design principles of the Internet and why these principles remain vitally important in the expanding global and international development of all the media, services, and apps that depend on the Internet architecture.

Orientation to the Internet as Socio-Technical System

As users, consumers, citizens, and workers with business functions depending on the Internet, we only see a small interface view of the Internet in the software on our computing devices. Socialization into media and computers, consumerist ideologies, and a focus on technology productization keep the Internet blackboxed and the deeper histories and dependencies closed off from awareness and understanding. The cumulative technologies and bundles of functions that are now so well-integrated in Internet design and digital media have deep histories in the affordances of symbolic mediation in technical artefacts. We see an essential continuum in the design history of symbolic substrates and systems for symbolic representation (from the development of ancient tablets up to modern writing technologies) and in the abstraction of symbolic message units into code in the telegraph, leading to the latest implementations in mobile devices and computational tablets. All the concepts and media implementations now rolled up into the Internet -- with its affordances for metamedia transmission and representation in multiple interfaces -- are parts of longer histories that we can point to, but only just begin to uncover, while studying the specifics of the technical architectures.

The Internet as a system is our "mother of all case studies" for deblackboxing social-technical systems and questions of distributed agency and distributed cognition. We can pose many useful questions by using the Internet and Web as a paradigm of mediation, agency, and transmission: the significance of the convergence of all digital (digitized) media forms across a global network using common protocols and standards; the industry, policy, and technology ecosystems that enable the Internet to function and grow with new innovations. Given that Internet protocols are "open," that standards are based on industry collaboration and consensus with confirmation by NGO policy groups (except in cases of monopoly dominance), and that networks are distributed systems with no one center, where are the real sources of power and authority? A major challenge today is working out how we "re-orchestrate" regulatory policy in the political-economy regimes for media, data, entertainment (TV, radio, cable), and communications after convergence on the common platform of the Internet.

Technical Design and Social Systems

To understand the interdependence of the technical and social, we also have to get the underlying technical concepts straight and the properties designed into the Internet as a global "network of networks." You will need to learn some "Internet speak" for how things work technically in the overall architecture.

From the socio-technical complex systems view, the Internet is a global, international "network of networks" connected by computing systems using Internet protocols (the core TCP/IP suite) and client-server architectures designed to support Internet protocols on any kind of computing device (all Web software, smartphone apps, cloud, etc.).

"The Network is the Computer": A large contributor to the success of the Internet network is the design principle for decentralized distribution of computing power through the client/server architecture: our local, small computing devices (called "clients" on a network) don't store a lot of data or have big software programs taking a lot of memory to do complex computations like sorting a Google search which would require even more memory and processing power. The large data storage and complex processing is done on many Internet "servers" that connect to our "clients" by software requests that we activate over the network with interfaces that have the software for Internet and Web protocols built-in (e.g., clicking a link and getting a Web page or screen of information to display, getting media from a server to play in our local device, triggering a network service on an app).

Actually implementing this system design in all the specific hardware and software we use would be impossible without the invisible institutional and political-economic agreements and policies (e.g., international telecom and data traffic agreements, cross-border long-haul data lines), hardware and software standards (defined in standards groups and implemented by major multinational companies like Cisco and Google), regional ISPs and wireless networks, and the multiple dependencies in "agency networks" for everything that happens between--and among--end users and the globally distributed system.

So, the "Internet" isn't a thing or external object that can have "effects" on us: the Internet is a social-technical-political-economic system enacted through distributed agency in one of our most complex "orchestrated combinations of prior combinations." Forming a useful answer to a simple question like "what is the Internet?" takes us into a system of relations that resists reification in an "it" -- no thing or object as a singular referent of the term. This decentralized, distributed "system" (technical - social - political - economic), though weighted at the nodal concentration centers of global power, has many consequences and challenges for simplistic (instrumental, operational) models of agency and power. A big macro, international question today is: how is the Internet--and everything done and enacted through it--to be controlled, governed, regulated, and accessed? Who "owns" the Internet? How can all the actors/agents and their interests be managed in a global and international decentralized system?

Key Concepts and Terms

  • Network Architecture
  • Protocol and the TCP/IP protocol suite
  • Data Packet
  • Packet Switching, Packet-Switched Network
  • Client / Server
  • Internet address (IP address)
  • Domain Name System (DNS)
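Two of these terms can be handled directly with Python's standard library: an IP address is ultimately just a structured number, and routers work with prefixes of that number. The address below is illustrative; an actual DNS lookup, which maps human-readable names to such addresses, would require network access (e.g., socket.getaddrinfo).

```python
import ipaddress

# An IP address is a number with a structured, routable representation
addr = ipaddress.ip_address("93.184.216.34")   # an IPv4 "dotted quad"
print(addr.version)      # 4
print(int(addr))         # the same address as one 32-bit integer

# Networks group addresses by prefix, which is what routers actually use
net = ipaddress.ip_network("93.184.216.0/24")
print(addr in net)       # True: this address falls inside the /24 block

# DNS maps names like "example.com" to addresses like these; with network
# access, socket.getaddrinfo("example.com", 80) would perform the lookup.
```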

Introductory Video Lecture

Text from Video Introduction

Readings: History, Basic Architecture, and Design Principles of the Internet

The Implications of Network Design Principles, Protocols, and Standards

Videos with Reliable Sources


Discussion Questions
Use these questions to help reflect on your reading and thinking this week, and choose at least one to write about:

  • Design principles and the Internet as metamedium:
  • The Internet and all its subsystems provide one of the most complex systems for thinking about mediations, interdependencies, and agencies. Most people only see functions presented at the interface level--and an interface works by making itself invisible (by design and by our cultural expectations). The complexity of the computer and network architecture is invisible.
  • From the reading this week, could you give a clear answer to the question, "what does it mean to be 'on the Internet'"? How can we resist talking about "the Internet" as a totalized, reified, or uniform "technology" and actively take into account the variety of subsystems, subcomponents, and social institutions that must be orchestrated to work together?
  • For discussion this week, take a case--like an app or digital media type--and investigate the network of socio-technical dependencies, histories of technological development, economic ecosystems, institutions of mediation (standards, policy and regulation, industry groups, patents?), markets and demographics.

Assessment 5


Learning Objectives and Topics:

Most of us experience "the Internet" or "the World Wide Web" through the graphical interface of a Web browser program (both on PCs and mobile devices) or through the graphical interface of a dedicated app that accesses specific content and services for presentation in the form factor of a mobile device (the screen dimensions and specific device properties). This unit will take you further in learning the key design principles and means of technical implementation for the World Wide Web and all interface "apps" that use Web and Internet technologies.

The designs for the Internet and the Web as an integrative platform are extensible and scalable as new developments and hybrid technologies emerge for the Internet/Web system. This unlimited extensibility results from underlying design decisions universally adopted as principles by all participants in the technologies: the Internet and the Web are designed in layers so that the overall architecture is always independent of applications that can run on it (e.g., email, social networking, searching, shopping, requesting and viewing information, playing media...).

With the convergence of telecommunications, computing, information science, hardware/software, and digital media, the Web is a common distributed platform that subsumes our prior media into a new mediasphere and metamedium, a system of ongoing reconfigurations of material technologies, software and algorithms, content, and the institutions and industries that enable and sustain the "Internet" as such.

Main learning objectives:

  • Learning how the extensible and scalable design principles of the Internet/Web architecture are continuing to be developed and expanded for any Internet device from large office computers to mobile devices to "smart" TVs.
  • Learning the important background history of information and text/media concepts that led to the design of the Web as a hypermedia system for linking files, media content, and accessing complex networked computer services over the Internet.
  • Understanding the "orchestrated combinatoriality" in the modular design of the Web.
  • Using the methods we are developing, uncovering the components of the technical-social system of the Web that are usually blackboxed.

Key terms and concepts

  • Client / server (in Web architecture)
  • Hypertext / hypermedia
  • HTTP: "Hypertext Transfer Protocol"
  • URL: "Uniform Resource Locator": the human-readable Web file "address" on a server
  • HTML: "Hypertext Markup Language"
  • Web "browser" or client program
  • App: short for "(software) application"; on a mobile device, PC, or smart TV = an interface with controls to use Internet and Web resources (content) and services (server-side software, transactions). Another example of "client" software for interacting with servers on the network.
  • HTML5: the current HTML cluster of evolving standards, interoperating Web "languages," and data server architecture for using the Web as an all-purpose platform for connecting any IP-enabled device, any OS, any screen size.
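Several of the terms above (protocol, server, file address) are visible if you take a URL apart with the standard library; the URL itself is invented for illustration.

```python
from urllib.parse import urlparse

# A hypothetical URL, taken apart into the pieces the browser uses
url = urlparse("https://www.example.edu/courses/cctp-798/syllabus.html?week=11#cloud")

print(url.scheme)    # 'https'  -> which protocol the client should speak
print(url.netloc)    # 'www.example.edu' -> the server to contact (via DNS)
print(url.path)      # '/courses/cctp-798/syllabus.html' -> the file "address"
print(url.query)     # 'week=11' -> extra data passed to server-side software
print(url.fragment)  # 'cloud'  -> handled by the client, never sent to the server
```

Every link click triggers this decomposition: the browser resolves the server name, speaks the named protocol, and requests the path, all invisibly behind the interface.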

Introductory Video Lecture/Presentation

Text from Video Introduction

The World Wide Web: Architecture and Interfaces

The Cumulative History of Information Concepts Leading to the World Wide Web

  • The history of classifying and linking books and texts as physical documents (meta-information)
  • Imagining a universal linked library: how we got to hypertext (linked text) and linked media in all digital forms
  • Background readings on Paul Otlet, Vannevar Bush, Doug Engelbart, J. C. R. Licklider, Ivan Sutherland, Ted Nelson, Tim Berners-Lee
  • Tim Berners-Lee, Robert Cailliau, Ari Luotonen, Henrik Frystyk Nielsen, and Arthur Secret. "The World-Wide Web." Communications of the ACM 37, no. 8 (August 1, 1994): 76–82.
    [This is an early description of the Web design; only the examples of applications are outdated.]
  • Tim Berners-Lee, Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web. New York, NY: Harper Business, 2000. Excerpts.
    [These excerpts reveal the concepts behind the design of the Web system and take us up to the point where the Web took off with the Mosaic and Netscape browsers. Berners-Lee joined MIT to begin the WWW Consortium in 1994.]
  • Tim Berners-Lee's original proposal for the Web (1989) (archive)

Consequences of Design Principles and Social-Economic-Policy Constraints

  • Apps and mobile device interfaces: channeling and fragmenting Internet/Web architecture for commerce and consumerism
  • The "effects" of Internet and Web technologies are outcomes of the system which they mediate and are mediated in.
  • Jonathan Zittrain and others: scenarios and futures: policies and industry models that can determine kinds of futures.

The Wider Social Implications of the Internet and Web

  • From Jonathan Zittrain, The Future of the Internet--And How to Stop It. New Haven, CT: Yale University Press, 2009. Excerpt from introduction and chap. 1. Entire book is available in a Creative Commons version on the author's site.
    [Zittrain's approach shows how the "effects" of present and future of Internet technologies are not hidden, black box operations, but emerge from the larger system of design principles, policy, and business/industry decisions.]
  • Siva Vaidhyanathan, The Googlization of Everything (and Why We Should Worry). Berkeley: University of California Press, 2011. (excerpts)

Reference Sources

Discussion Questions
Use these questions to help reflect on your reading and thinking this week, and choose at least one to write about:

  • With a better understanding of the Internet, the Web, and digital media design principles, how can we provide better-informed policy arguments on key issues; for example, debates about regulation of TV/media/entertainment industry "content" delivered over the Internet as something for the FCC to regulate on the model of cable TV or broadcast?
  • On the metamedium level, what is the difference between a "web browser" and an iPad app--both technically and philosophically as an interface for users?
  • What are the consequences of Google becoming the dominant gateway or access to Web content?

Learning Objectives and Topics:

Understanding the main concepts and design principles behind important Internet/Web technologies that will continue in many future developments and implementations: Cloud Computing, the Internet of Things (IoT), Big Data, and Ambient Computing. Deblackboxing these complex technologies with the accessible concepts and methods of this course so that we can (1) go beyond media hype, mystification, and misunderstanding, (2) go beyond productizing by companies large and small (everyone is marketing a "solution" in one or more of these technology clusters), and (3) become informed about the implications of, and the need for wider participation in, future developments.

Students will work toward:

  • Understanding the design principles for Cloud Computing, the Internet of Things (IoT), and Ambient Computing, their system of dependencies, and the possible futures of these technologies as they become implemented and adopted.
  • Being able to describe and analyze these recent implementations of network computing on the Web platform according to the principles for extensible and scalable design and as implementations of the ongoing cumulative, combinatorial modular design principles of computational and digital media technologies.
  • Being able to describe and analyze how mobile devices with client software and Internet connectivity like the iPhone are now designed to interact with this larger combined system and represent a node in the combinatorial modular design for Cloud and Ambient Computing.
  • Being able to describe how Internet and computational principles can be implemented to connect many kinds of "things" (= artefacts and/or physical locations) to IP networks (wired and wireless) for many kinds of automated processes (sensors for monitoring physical processes, geo-location, real-time interactive data).

The Cloud = "The Network is the Computer"

Cloud Computing can sound nebulous and mysterious, but it's an extension of principles in the distributed Internet/Web networked computing architecture for bundling many kinds of behind-the-scenes processes and functions so that they appear -- and are used -- as services and utilities. The "network as a bundle of services" works on many levels, both for end users and for companies providing the services (for example, streaming media and Cloud based apps that run in a Web browser or mobile device). The Cloud computing concept and ways of deploying it in actual systems architecture are other great examples of cumulative, combinatorial, modular design principles. As in other design principles for complex systems that we have studied, modular design in Cloud architecture takes the form of many software layers (abstraction layers) hidden from view that pass on the results of their processes, functions, or tasks to other layers in the system. Network software bundles up the outputs of all the process layers (many of which were done on computers widely distributed anywhere on the system network) to provide the unified, combined results that we experience when using Internet devices and digital media.
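The layering idea described above can be caricatured in a short sketch: each "layer" below is a function that performs one hidden task and passes its result up, and the user-facing call bundles the combined output. The layer names and data are invented purely for illustration.

```python
# Invented abstraction layers, each hidden from the one above it
def storage_layer(key):
    data = {"video:42": b"raw-bytes"}        # stands in for distributed storage
    return data[key]

def transcode_layer(raw):
    return raw.decode() + " (transcoded)"    # stands in for media processing

def delivery_layer(key):
    # The "bundled" service: composes the lower layers' results
    return transcode_layer(storage_layer(key))

# The end user triggers one request and sees only the combined result;
# in a real cloud, each layer could run on a different machine.
result = delivery_layer("video:42")
print(result)   # raw-bytes (transcoded)
```

The point of the sketch is that the caller of delivery_layer has no view into where or how the lower layers did their work, which is exactly the black box the Cloud presents to end users.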

There are many technical and social dependencies that enable the Cloud design concepts to be implemented now and in the future, but would not have been possible to implement during earlier phases of the Internet. The Cloud architecture of networked computers takes advantage of (1) faster computation in clusters of servers, (2) computer systems themselves designed specifically as servers to be clustered in data centers, (3) cheaper and almost unlimited memory capacity for software processing on Internet/Web servers, and (4) faster ubiquitous networks with more capacity and routing/switching "intelligence" designed for fiber optic networks. Of course, there are many more layers of dependencies in policies, standards, and industry ecosystems that are continually reconfiguring around new developments as all the conditions required for implementing new technology concepts become aligned.

As with everything in technology, so much depends on standards accepted across industries, institutions, and engineering research communities. Cloud architecture principles require that anything developed must interoperate with all existing Internet/Web systems and standards. Technical specifications for Cloud systems are based on an evolving set of standards for network services that extend the architecture of the Internet and Web as a platform for any network-deliverable service or data content. The Cloud concept re-models the Web and IP networks as a virtual utility or "always-on" service accessible anywhere with an IP-enabled device. The Cloud architecture distributes functions and resources across many kinds of computational processes that don't "live" or "run" on any single server or device. Here's a definition from the National Institute of Standards and Technology (NIST) that provides a reference point:

Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models.
(NIST, US Department of Commerce [Read further: NIST document])

The extensible, scalable, modular design principles of the Internet and Web, implemented on a platform of open architecture and standards, have enabled our current version of the "always on" Web of active behind-the-scenes network services, delivering content and software that runs in our apps and Web windows, and streaming media to any IP-enabled device. Instead of relying on the earlier Internet client/server model for everything, the Cloud principle distributes the network "intelligence" across many other networked computational processes on many computers simultaneously.

The End-User's Experience and Interfaces to Cloud Computing

If you use iTunes and Apple iOS apps, do Google searches, use Google apps, shop on Amazon or eBay, view Youtube videos and multimedia web pages, or use mobile device apps to get real-time data, you are interacting with many kinds of behind-the-scenes Internet/Web Cloud Computing architectures. Almost everything we experience when interacting with Web and Internet services today involves interoperating components of Cloud Computing architecture, which essentially bundles many networking, software, and server infrastructure functions into an "always on" utility without the limitations of individual computers in one physical location.

The Cloud = Black Box

Our user view is a seamless integration of results, but the Cloud is intentionally a very closed off and complex black box given a mystical, transcendental name, The Cloud. The Cloud metaphor has been around a long time in engineering for networks, but now the term designates a specialized implementation of Internet/Web architectures and software layers. (We will explore the implications of "The Cloud" = "Black Box".)

Cloud | Internet-of-Things | Big Data | Ambient Computing

The background of Cloud architecture is also presupposed in other Net/Web implementations like the Internet of Things (IoT) and Ambient Computing, a network environment where things with geo-location and sensing chips with wireless IP connectivity can send and receive information and interact with the background of computing processes with or without direct human control. The iPhone and other mobile devices are one kind of end node in the larger Cloud, Ambient, and IoT design concepts.
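The IoT pattern just described can be sketched without any real hardware: a "thing" packages a sensor reading as a small structured message that an IP network can carry to background computing processes. The device name, reading, and message fields below are all hypothetical, and no actual network send happens here.

```python
import json
import time

def sensor_message(device_id, reading, unit):
    """Package one sensor reading as the kind of small JSON payload an
    IP-connected 'thing' would send to a collection service."""
    return json.dumps({
        "device": device_id,
        "value": reading,
        "unit": unit,
        "timestamp": time.time(),   # when the reading was taken
    })

# A hypothetical temperature sensor reporting in; a real device would send
# this payload to a Cloud endpoint over a protocol such as HTTP or MQTT.
payload = sensor_message("thermo-042", 21.5, "celsius")
print(payload)
```

Everything interesting then happens in the Cloud layers that receive, aggregate, and act on millions of such messages, with or without direct human control.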

Introductory Video Lecture/Presentation

Text from Video Introduction


Background on Architecture and System Design:
Cumulative Combinatoriality, Extensible and Scalable Modular Design Principles

Business Cases: Companies you may have never heard of actually provide the network services that get data and media to your end devices:

The Possible Futures of the Internet/Web:
Cloud, Internet of Things, "Always On" Ambient Computing

Discussion Questions
Use these questions to help reflect on your reading and thinking this week, and choose at least one to write about:

  • Reviewing our earlier conceptual frameworks, how would you describe the current implementations of Cloud Computing and the Internet of Things with the key concepts and methods of the course:
    • cumulative combinatoriality and modular design,
    • extending and distributing individual and collective symbolic cognition and human agency
    • re-mediating our communication media and the social functions of communication
    • using computing and digital media interfaces as an extensible metamedium (a medium for representing and processing other media)?
  • What are the wider implications beyond the scope of this course for the future of the Internet/Web, digital media, and the connecting of many kinds of things to the Internet?
  • Returning to our course motto: "Technology is too important to be left to technologists." Do you think we've achieved our main learning objective: providing access to the key concepts in technology so people with non-technical backgrounds can participate in the important debates about technology in their own fields or professions?


Final Assessment (Assessing Your Ability to Work with the Key Concepts)

Final Capstone Project

  • Instructions for researching and writing a 10-12 page "white paper" on a major technology issue in an industry, sector, or field that can be better analyzed or understood through the approaches in the course. Will be submitted digitally in Blackboard. Can be customized as a study useful for your career and current position.