Information: Various points of view about the concept
Contents
• Information
• Information architecture
• Information theory
• Marshall McLuhan
• Nicholas Negroponte
Information
Information in its most restricted technical sense is an ordered sequence of symbols that record or transmit a message. It can be recorded as signs, or conveyed as signals by waves. Information is any kind of event that affects the state of a dynamic system. As a concept, however, information has numerous meanings.[1] Moreover, the concept of information is closely related to notions of constraint, communication, control, data, form, instruction, knowledge, meaning, mental stimulus, pattern, perception, representation, and especially entropy.
Etymology
The English word was apparently derived from the Latin stem (information-) of the nominative (informatio): this noun is in its turn derived from the verb "informare" (to inform) in the sense of "to give form to the mind", "to discipline", "instruct", "teach": "Men so wise should go and inform their kings." (1330). Inform itself comes (via French informer) from the Latin verb informare, to give form, to form an idea of. Furthermore, Latin itself already contained the word informatio meaning concept or idea, but the extent to which this may have influenced the development of the word information in English is not clear. The ancient Greek word for form was μορφή (morphe; cf. morph) and also εἶδος (eidos) "kind, idea, shape, set"; the latter word was famously used in a technical philosophical sense by Plato (and later Aristotle) to denote the ideal identity or essence of something (see Theory of forms). "Eidos" can also be associated with thought, proposition or even concept.
[Figure: The ASCII codes for the word "Wikipedia" represented in binary, the numeral system most commonly used for encoding computer information.]
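For illustration (this snippet is not part of the original article), the encoding shown in the figure can be reproduced in a few lines of Python, printing each character of "Wikipedia" together with its 8-bit binary ASCII code:

# Print the 8-bit binary ASCII code of each character in "Wikipedia".
def ascii_bits(text):
    """Return a list of 8-bit binary strings, one per character."""
    return [format(ord(ch), "08b") for ch in text]

for ch, bits in zip("Wikipedia", ascii_bits("Wikipedia")):
    print(ch, bits)
# W 01010111
# i 01101001
# k 01101011
# ... and so on for the remaining letters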
As sensory input
Often information is viewed as a type of input to an organism or system. Inputs are of two kinds. Some inputs are important to the function of the organism (for example, food) or system (energy) by themselves. In his book Sensory Ecology, Dusenbery called these causal inputs. Other inputs (information) are important only because they are associated with causal inputs and can be used to predict the occurrence of a causal input at a later time (and perhaps another place). Some information is important because of association with other information but eventually there must be a connection to a causal input.
In practice, information is usually carried by weak stimuli that must be detected by specialized sensory systems and amplified by energy inputs before they can be functional to the organism or system. For example, light is often a causal input to plants but provides information to animals. The colored light reflected from a flower is too weak to do much photosynthetic work but the visual system of the bee detects it and the bee's nervous system uses the information to guide the bee to the flower, where the bee often finds nectar or pollen, which are causal inputs, serving a nutritional function.
As an influence which leads to a transformation
Information is any type of pattern that influences the formation or transformation of other patterns. In this sense, there is no need for a conscious mind to perceive, much less appreciate, the pattern. Consider, for example, DNA. The sequence of nucleotides is a pattern that influences the formation and development of an organism without any need for a conscious mind. Systems theory at times seems to refer to information in this sense, assuming information does not necessarily involve any conscious mind, and patterns circulating (due to feedback) in the system can be called information. In other words, it can be said that information in this sense is something potentially perceived as representation, though not created or presented for that purpose. For example, Gregory Bateson defines "information" as a "difference that makes a difference".
If, however, the premise of "influence" implies that information has been perceived by a conscious mind and also interpreted by it, the specific context associated with this interpretation may cause the transformation of the information into knowledge. Complex definitions of both "information" and "knowledge" make such semantic and logical analysis difficult, but the condition of "transformation" is an important point in the study of information as it relates to knowledge, especially in the business discipline of knowledge management. In this practice, tools and processes are used to assist a knowledge worker in performing research and making decisions, including steps such as:
• reviewing information in order to effectively derive value and meaning
• referencing metadata if any is available
• establishing a relevant context, often selecting from many possible contexts
• deriving new knowledge from the information
• making decisions or recommendations from the resulting knowledge.
Stewart (2001) argues that the transformation of information into knowledge is critical, lying at the core of value creation and competitive advantage for the modern enterprise. The Danish Dictionary of Information Terms[2] argues that information only provides an answer to a posed question. Whether the answer provides knowledge depends on the informed person. So a generalized definition of the concept should be: "Information" = "An answer to a specific question". When Marshall McLuhan speaks of media and their effects on human cultures, he refers to the structure of artifacts that in turn shape our behaviors and mindsets. Also, pheromones are often said to be "information" in this sense.
As a property in physics
In 2003, J. D. Bekenstein claimed there is a growing trend in physics to define the physical world as being made of information itself (see Digital physics). Information has a well-defined meaning in physics. Examples of this include the phenomenon of quantum entanglement, where particles can interact without reference to their separation or the speed of light. Information itself cannot travel faster than light even if the information is transmitted indirectly. This suggests that all attempts at physically observing a particle with an "entangled" relationship to another are slowed down, even though the particles are not connected in any way other than by the information they carry.
Another link is demonstrated by the Maxwell's demon thought experiment. In this experiment, a direct relationship between information and another physical property, entropy, is demonstrated. A consequence is that it is impossible to destroy information without increasing the entropy of a system; in practical terms this often means generating heat. Another, more philosophical, outcome is that information could be thought of as interchangeable with energy. Thus, in the study of logic gates, the theoretical lower bound of thermal energy released by an AND gate is higher than for the NOT gate (because information is destroyed in an AND gate and simply converted in a NOT gate). Physical information is of particular importance in the theory of quantum computers.
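The heat cost mentioned above is usually made precise by Landauer's principle, which the article alludes to but does not state explicitly; as a brief aside not in the original text, erasing one bit of information at temperature T dissipates at least
$$E_{\min} = k_B T \ln 2 \approx (1.38 \times 10^{-23}\ \mathrm{J/K}) \times (300\ \mathrm{K}) \times 0.693 \approx 2.9 \times 10^{-21}\ \mathrm{J}$$
at room temperature, where $k_B$ is Boltzmann's constant. This is the sense in which destroying information in an AND gate carries a higher thermodynamic lower bound than merely transforming it in a NOT gate.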
As records
Records are a specialized form of information. Essentially, records are information produced consciously or as by-products of business activities or transactions and retained because of their value. Primarily their value is as evidence of the activities of the organization but they may also be retained for their informational value. Sound records management ensures that the integrity of records is preserved for as long as they are required.
The international standard on records management, ISO 15489, defines records as "information created, received, and maintained as evidence and information by an organization or person, in pursuance of legal obligations or in the transaction of business". The International Committee on Archives (ICA) Committee on electronic records defined a record as, "a specific piece of recorded information generated, collected or received in the initiation, conduct or completion of an activity and that comprises sufficient content, context and structure to provide proof or evidence of that activity".
Records may be maintained to retain corporate memory of the organization or to meet legal, fiscal or accountability requirements imposed on the organization. Willis (2005) expressed the view that sound management of business records and information delivered "...six key requirements for good corporate governance...transparency; accountability; due process; compliance; meeting statutory and common law requirements; and security of personal and corporate information."
Information and semiotics Beynon-Davies[3] [4] explains the multi-faceted concept of information in terms of signs and signal-sign systems. Signs themselves can be considered in terms of four inter-dependent levels, layers or branches of semiotics: pragmatics, semantics, syntax, and empirics. These four layers serve to connect the social world on the one hand with the physical or technical world on the other... Pragmatics is concerned with the purpose of communication. Pragmatics links the issue of signs with the context within which signs are used. The focus of pragmatics is on the intentions of living agents underlying communicative behaviour. In other words, pragmatics link language to action. Semantics is concerned with the meaning of a message conveyed in a communicative act. Semantics considers the content of communication. Semantics is the study of the meaning of signs - the association between signs and behaviour. Semantics can be considered as the study of the link between symbols and their referents or concepts; particularly the way in which signs relate to human behaviour. Syntax is concerned with the formalism used to represent a message. Syntax as an area studies the form of communication in terms of the logic and grammar of sign systems. Syntax is devoted to the study of the form rather than the content of signs and sign-systems. Empirics is the study of the signals used to carry a message; the physical characteristics of the medium of communication. Empirics is devoted to the study of communication channels and their characteristics, e.g., sound, light, electronic transmission etc.. Nielsen (2008) discusses the relationship between semiotics and information in relation to dictionaries. The concept of lexicographic information costs is introduced and refers to the efforts users of dictionaries need to make in order to, first, find the data sought and, secondly, understand the data so that they can generate information. Communication normally exists within the context of some social situation. The social situation sets the context for the intentions conveyed (pragmatics) and the form in which communication takes place. In a communicative situation intentions are expressed through messages which comprise collections of inter-related signs taken from a language which is mutually understood by the agents involved in the communication. Mutual understanding implies that agents involved understand the chosen language in terms of its agreed syntax (syntactics) and semantics. The sender codes the message in the language and sends the message as signals along some communication channel (empirics). The chosen communication channel will have inherent properties which determine outcomes such as the
speed with which communication can take place and over what distance. More recently Shu-Kun Lin proposed a simple definition of information: Information is the amount of the data after data compression.
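Lin's proposal can be illustrated informally with a general-purpose compressor; the Python sketch below is only an illustration of the idea (zlib is a crude stand-in for an ideal compressor, and the exact byte counts are not the point):

import os
import zlib

# A highly repetitive string carries little information and compresses well;
# the same number of random-looking bytes does not.
repetitive = b"abab" * 250        # 1000 very regular bytes
random_ish = os.urandom(1000)     # 1000 bytes of OS-supplied randomness

print(len(repetitive), len(zlib.compress(repetitive)))   # shrinks to a few dozen bytes
print(len(random_ish), len(zlib.compress(random_ish)))   # does not shrink (may even grow slightly)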
References
[1] L. Floridi, Information - A Very Short Introduction (Oxford University Press) (http://ukcatalogue.oup.com/product/9780199551378.do?keyword=floridi&sortby=bestMatches) provides a short overview.
[2] Informationsordbogen.dk (http://www.informationsordbogen.dk/concept.php?cid=902)
[3] Beynon-Davies P. (2002). Information Systems: an introduction to informatics in Organisations. Palgrave, Basingstoke, UK. ISBN 0-333-96390-3
[4] Beynon-Davies P. (2009). Business Information Systems. Palgrave, Basingstoke. ISBN 978-0-230-20368-6
Further reading • Alan Liu (2004). The Laws of Cool: Knowledge Work and the Culture of Information, University of Chicago Press • Bekenstein, Jacob D. (2003, August). Information in the holographic universe. Scientific American. • Gleick, James (2011). The Information: A History, a Theory, a Flood. Pantheon, New York, NY. • Shu-Kun Lin (2008). 'Gibbs Paradox and the Concepts of Information, Symmetry, Similarity and Their Relationship', Entropy, 10 (1), 1-5. Available online at Entropy journal website (http://www.mdpi.com/ 1099-4300/10/1/1). • Luciano Floridi, (2005). 'Is Information Meaningful Data?', Philosophy and Phenomenological Research, 70 (2), pp. 351 – 370. Available online at PhilSci Archive (http://philsci-archive.pitt.edu/archive/00002536/01/iimd. pdf) • Luciano Floridi, (2005). 'Semantic Conceptions of Information', The Stanford Encyclopedia of Philosophy (Winter 2005 Edition), Edward N. Zalta (ed.). Available online at Stanford University (http://plato.stanford. edu/entries/information-semantic/) • Sandro Nielsen: 'The Effect of Lexicographical Information Costs on Dictionary Making and Use', Lexikos 18/2008, 170-189. • Stewart, Thomas, (2001). Wealth of Knowledge. Doubleday, New York, NY, 379 p. • Young, Paul. The Nature of Information (1987). Greenwood Publishing Group, Westport, Ct. ISBN 0-275-92698-2.
External links • Semantic Conceptions of Information (http://plato.stanford.edu/entries/information-semantic/) Review by Luciano Floridi for the Stanford Encyclopedia of Philosophy • Principia Cybernetica entry on negentropy (http://pespmc1.vub.ac.be/ASC/NEGENTROPY.html) • Fisher Information, a New Paradigm for Science: Introduction, Uncertainty principles, Wave equations, Ideas of Escher, Kant, Plato and Wheeler. (http://www.optics.arizona.edu/Frieden/Fisher_Information.htm) This essay is continually revised in the light of ongoing research. • How Much Information? 2003 (http://www2.sims.berkeley.edu/research/projects/how-much-info-2003/ index.htm) an attempt to estimate how much new information is created each year (study was produced by faculty and students at the School of Information Management and Systems at the University of California at Berkeley) • (Danish) Informationsordbogen.dk (http://www.informationsordbogen.dk) The Danish Dictionary of Information Terms / Informationsordbogen
Information architecture
Information architecture (IA) is the art of expressing a model or concept of information used in activities that require explicit details of complex systems. Among these activities are library systems, Content Management Systems, web development, user interactions, database development, programming, technical writing, enterprise architecture, and critical system software design. Information architecture has somewhat different meanings in these different branches of IS or IT architecture. Most definitions have common qualities: a structural design of shared environments, methods of organizing and labeling websites, intranets, and online communities, and ways of bringing the principles of design and architecture to the digital landscape.
Historically the term "information architect" is attributed to Richard Saul Wurman. Wurman sees architecture as "used in the words architect of foreign policy. I mean architect as in the creating of systemic, structural, and orderly principles to make something work--the thoughtful making of either artifact, or idea, or policy that informs because it is clear."[1]
Definitions Information architecture is a specialized skill set that interprets information and expresses distinctions between signs and systems of signs. It originates, to some degree, in the library sciences. Many schools with library and information science departments teach information architecture.[2] Information architecture is the categorization of information into a coherent structure, preferably one that most people can understand quickly, if not inherently. It's usually hierarchical, but can have other structures, such as concentric or even chaotic. In the context of information systems design, information architecture refers to the analysis and design of the data stored by information systems, concentrating on entities, their attributes, and their interrelationships. It refers to the modeling of data for an individual database and to the corporate data models an enterprise uses to coordinate the definition of data in several (perhaps scores or hundreds) of distinct databases. The "canonical data model" is applied to integration technologies as a definition for specific data passed between the systems of an enterprise. At a higher level of abstraction it may also refer to the definition of data stores.
I.A.I. definition
Information architecture is defined by the Information Architecture Institute as:
1. The structural design of shared information environments.
2. The art and science of organizing and labeling web sites, intranets, online communities, and software to support findability and usability.[3] [4]
3. An emerging community of practice focused on bringing principles of design and architecture to the digital landscape.
References
[1] R.S. Wurman: "Information Architects"
[2] IAinstitute.org (http://www.iainstitute.org/en/learn/education/schools_teaching_ia.php), Schools Teaching IA
[3] ‘What is IA?’ Information Architecture Institute. IAinstitute.org (http://www.iainstitute.org/documents/learn/What_is_IA.pdf)
[4] "Information architecture for the World Wide Web" by Peter Morville & Louis Rosenfeld. O’Reilly, 2006.
Further reading
• Peter Morville and Louis Rosenfeld, Information Architecture for the World Wide Web (http://books.google.com/books?id=2d2Ry2hZc2MC&printsec=frontcover&dq=information+architecture&hl=en&ei=jXxyTc-6MpHCvgOF0eS9AQ&sa=X&oi=book_result&ct=result&resnum=1&ved=0CCcQ6AEwAA#v=onepage&q&f=false) (2006), ISBN 0596527349
Information theory
Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data. Since its inception it has broadened to find applications in many other areas, including statistical inference, natural language processing, cryptography generally, networks other than communication networks — as in neurobiology,[1] the evolution[2] and function[3] of molecular codes, model selection[4] in ecology, thermal physics,[5] quantum computing, plagiarism detection[6] and other forms of data analysis.[7]
A key measure of information is known as entropy, which is usually expressed by the average number of bits needed for storage or communication. Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes).
Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPGs), and channel coding (e.g. for DSL lines). The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering. Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields. Important sub-fields of information theory are source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information.
Overview The main concepts of information theory can be grasped by considering the most widespread means of human communication: language. Two important aspects of a concise language are as follows: First, the most common words (e.g., "a", "the", "I") should be shorter than less common words (e.g., "benefit", "generation", "mediocre"), so that sentences will not be too long. Such a tradeoff in word length is analogous to data compression and is the essential aspect of source coding. Second, if part of a sentence is unheard or misheard due to noise — e.g., a passing car — the listener should still be able to glean the meaning of the underlying message. Such robustness is as essential for an electronic communication system as it is for a language; properly building such robustness into communications is done by channel coding. Source coding and channel coding are the fundamental concerns of information theory.
Note that these concerns have nothing to do with the importance of messages. For example, a platitude such as "Thank you; come again" takes about as long to say or write as the urgent plea, "Call an ambulance!" while the latter may be more important and more meaningful in many contexts. Information theory, however, does not consider message importance or meaning, as these are matters of the quality of data rather than the quantity and readability of data, the latter of which is determined solely by probabilities.
Information theory is generally considered to have been founded in 1948 by Claude Shannon in his seminal work, "A Mathematical Theory of Communication". The central paradigm of classical information theory is the engineering problem of the transmission of information over a noisy channel. The most fundamental results of this theory are Shannon's source coding theorem, which establishes that, on average, the number of bits needed to represent the result of an uncertain event is given by its entropy; and Shannon's noisy-channel coding theorem, which states that reliable communication is possible over noisy channels provided that the rate of communication is below a certain threshold, called the channel capacity. The channel capacity can be approached in practice by using appropriate encoding and decoding systems.
Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions. Information theory is a broad and deep mathematical theory, with equally broad and deep applications, amongst which is the vital field of coding theory.
Coding theory is concerned with finding explicit methods, called codes, of increasing the efficiency and reducing the net error rate of data communication over a noisy channel to near the limit that Shannon proved is the maximum possible for that channel. These codes can be roughly subdivided into data compression (source coding) and error-correction (channel coding) techniques. In the latter case, it took many years to find the methods Shannon's work proved were possible. A third class of information theory codes are cryptographic algorithms (both codes and ciphers). Concepts, methods and results from coding theory and information theory are widely used in cryptography and cryptanalysis. See the article ban (information) for a historical application.
Information theory is also used in information retrieval, intelligence gathering, gambling, statistics, and even in musical composition.
Historical background
The landmark event that established the discipline of information theory, and brought it to immediate worldwide attention, was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948.
Prior to this paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability. Harry Nyquist's 1924 paper, Certain Factors Affecting Telegraph Speed, contains a theoretical section quantifying "intelligence" and the "line speed" at which it can be transmitted by a communication system, giving the relation $W = K \log m$, where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step, and K is a constant. Ralph Hartley's 1928 paper, Transmission of Information, uses the word information as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other, thus quantifying information as $H = \log S^n = n \log S$, where S was the number of possible symbols, and n the number of symbols in a transmission. The natural unit of information was therefore the decimal digit, much later renamed the hartley in his honour as a unit or scale or measure of information. Alan Turing in 1940 used similar ideas as part of the statistical analysis of the breaking of the German second world war Enigma ciphers.
Much of the mathematics behind information theory with events of different probabilities was developed for the field of thermodynamics by Ludwig Boltzmann and J. Willard Gibbs. Connections between information-theoretic entropy
and thermodynamic entropy, including the important contributions by Rolf Landauer in the 1960s, are explored in Entropy in thermodynamics and information theory.
In Shannon's revolutionary and groundbreaking paper, the work for which had been substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion that "The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point." With it came the ideas of
• the information entropy and redundancy of a source, and its relevance through the source coding theorem;
• the mutual information, and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem;
• the practical result of the Shannon–Hartley law for the channel capacity of a Gaussian channel; as well as
• the bit—a new way of seeing the most fundamental unit of information.
Quantities of information
Information theory is based on probability theory and statistics. The most important quantities of information are entropy, the information in a random variable, and mutual information, the amount of information in common between two random variables. The former quantity indicates how easily message data can be compressed while the latter can be used to find the communication rate across a channel.
The choice of logarithmic base in the following formulae determines the unit of information entropy that is used. The most common unit of information is the bit, based on the binary logarithm. Other units include the nat, which is based on the natural logarithm, and the hartley, which is based on the common logarithm.
In what follows, an expression of the form $p \log p$ is considered by convention to be equal to zero whenever $p = 0$. This is justified because $\lim_{p \to 0^{+}} p \log p = 0$ for any logarithmic base.
Entropy
The entropy, $H(X)$, of a discrete random variable $X$ is a measure of the amount of uncertainty associated with the value of $X$.
[Figure: Entropy of a Bernoulli trial as a function of success probability, often called the binary entropy function, $H_\mathrm{b}(p)$. The entropy is maximized at 1 bit per trial when the two possible outcomes are equally probable, as in an unbiased coin toss.]
Suppose one transmits 1000 bits (0s and 1s). If these bits are known ahead of transmission (to be a certain value with absolute probability), logic dictates that no information has been transmitted. If, however, each is equally and independently likely to be 0 or 1, 1000 bits (in the information theoretic sense) have been transmitted. Between these two extremes, information can be quantified as follows. If $X$ is the set of all messages $\{x_1, \ldots, x_n\}$ that $X$ could be, and $p(x)$ is the probability of some $x \in X$, then the entropy of $X$ is defined:[8]
$$H(X) = \mathbb{E}_{X}[I(x)] = -\sum_{x \in X} p(x) \log p(x).$$
(Here, $I(x)$ is the self-information, which is the entropy contribution of an individual message, and $\mathbb{E}_{X}$ is the expected value.) An important property of entropy is that it is maximized when all the messages in the message space are equiprobable, $p(x) = 1/n$—i.e., most unpredictable—in which case $H(X) = \log n$. The special case of information entropy for a random variable with two outcomes is the binary entropy function, usually taken to the logarithmic base 2:
$$H_\mathrm{b}(p) = -p \log_2 p - (1 - p) \log_2 (1 - p).$$
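As an illustrative sketch (the helper functions below are not from the article), the definition translates directly into Python, using base-2 logarithms so the results are in bits:

from math import log2

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution given as probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)   # p*log(p) treated as 0 when p = 0

def binary_entropy(p):
    """H_b(p): entropy of a Bernoulli(p) trial, in bits."""
    return entropy([p, 1 - p])

print(entropy([0.5, 0.5]))    # 1.0 bit: fair coin
print(entropy([1/6] * 6))     # about 2.585 bits: fair die
print(binary_entropy(0.1))    # about 0.469 bits: heavily biased coin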
Joint entropy
The joint entropy of two discrete random variables $X$ and $Y$ is merely the entropy of their pairing: $(X, Y)$. This implies that if $X$ and $Y$ are independent, then their joint entropy is the sum of their individual entropies. For example, if $(X, Y)$ represents the position of a chess piece — $X$ the row and $Y$ the column, then the joint entropy of the row of the piece and the column of the piece will be the entropy of the position of the piece.
$$H(X, Y) = \mathbb{E}_{X,Y}\left[-\log p(x, y)\right] = -\sum_{x, y} p(x, y) \log p(x, y)$$
Despite similar notation, joint entropy should not be confused with cross entropy.
Conditional entropy (equivocation)
The conditional entropy or conditional uncertainty of $X$ given random variable $Y$ (also called the equivocation of $X$ about $Y$) is the average conditional entropy over $Y$:[9]
$$H(X \mid Y) = \mathbb{E}_Y\left[H(X \mid y)\right] = -\sum_{y \in Y} p(y) \sum_{x \in X} p(x \mid y) \log p(x \mid y) = -\sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(y)}.$$
Because entropy can be conditioned on a random variable or on that random variable being a certain value, care should be taken not to confuse these two definitions of conditional entropy, the former of which is in more common use. A basic property of this form of conditional entropy is that:
$$H(X \mid Y) = H(X, Y) - H(Y).$$
Mutual information (transinformation)
Mutual information measures the amount of information that can be obtained about one random variable by observing another. It is important in communication where it can be used to maximize the amount of information shared between sent and received signals. The mutual information of $X$ relative to $Y$ is given by:
$$I(X; Y) = \mathbb{E}_{X,Y}[SI(x, y)] = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)},$$
where $SI$ (Specific mutual Information) is the pointwise mutual information.
A basic property of the mutual information is that
$$I(X; Y) = H(X) - H(X \mid Y).$$
That is, knowing Y, we can save an average of $I(X; Y)$ bits in encoding X compared to not knowing Y.
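To make these relationships concrete, the following Python sketch (the joint distribution is an invented toy example, not from the article) computes H(X), H(Y), H(X,Y), H(X|Y) and I(X;Y) for two binary variables and checks the identity I(X;Y) = H(X) - H(X|Y):

from math import log2

# Toy joint distribution p(x, y) for two binary random variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y).
px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

H = lambda dist: -sum(p * log2(p) for p in dist if p > 0)

H_X = H(px.values())
H_Y = H(py.values())
H_XY = H(joint.values())
H_X_given_Y = H_XY - H_Y            # chain rule: H(X|Y) = H(X,Y) - H(Y)
I_XY = H_X - H_X_given_Y            # mutual information

print(H_X, H_Y, H_XY, I_XY)
# With this table: H(X) = H(Y) = 1 bit, H(X,Y) is about 1.722 bits,
# so I(X;Y) is about 0.278 bits.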
Mutual information is symmetric:
$$I(X; Y) = I(Y; X) = H(X) + H(Y) - H(X, Y).$$
Mutual information can be expressed as the average Kullback–Leibler divergence (information gain) of the posterior probability distribution of X given the value of Y to the prior distribution on X:
$$I(X; Y) = \mathbb{E}_{p(y)}\left[ D_{\mathrm{KL}}\big( p(X \mid Y = y) \,\big\|\, p(X) \big) \right].$$
In other words, this is a measure of how much, on the average, the probability distribution on X will change if we are given the value of Y. This is often recalculated as the divergence from the product of the marginal distributions to the actual joint distribution:
$$I(X; Y) = D_{\mathrm{KL}}\big( p(X, Y) \,\big\|\, p(X)\, p(Y) \big).$$
Mutual information is closely related to the log-likelihood ratio test in the context of contingency tables and the multinomial distribution and to Pearson's χ2 test: mutual information can be considered a statistic for assessing independence between a pair of variables, and has a well-specified asymptotic distribution.
Kullback–Leibler divergence (information gain)
The Kullback–Leibler divergence (or information divergence, information gain, or relative entropy) is a way of comparing two distributions: a "true" probability distribution p(X), and an arbitrary probability distribution q(X). If we compress data in a manner that assumes q(X) is the distribution underlying some data, when, in reality, p(X) is the correct distribution, the Kullback–Leibler divergence is the number of average additional bits per datum necessary for compression. It is thus defined
$$D_{\mathrm{KL}}\big( p(X) \,\big\|\, q(X) \big) = \sum_{x \in X} p(x) \log \frac{p(x)}{q(x)}.$$
Although it is sometimes used as a 'distance metric', it is not a true metric since it is not symmetric and does not satisfy the triangle inequality (making it a semi-quasimetric).
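A short sketch of the coding interpretation just described (the two distributions are invented for illustration): encoding data drawn from p with a code optimized for q costs D_KL(p||q) extra bits per symbol on average.

from math import log2

def kl_divergence(p, q):
    """D_KL(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]     # the "true" distribution
q = [1/3, 1/3, 1/3]     # the assumed (uniform) distribution

print(kl_divergence(p, q))   # about 0.428 extra bits per symbol
print(kl_divergence(p, p))   # 0.0: no penalty when the model matches the source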
Other quantities
Other important information theoretic quantities include Rényi entropy (a generalization of entropy), differential entropy (a generalization of quantities of information to continuous distributions), and the conditional mutual information.
Coding theory
Coding theory is one of the most important and direct applications of information theory. It can be subdivided into source coding theory and channel coding theory. Using a statistical description for data, information theory quantifies the number of bits needed to describe the data, which is the information entropy of the source.
[Figure: A picture showing scratches on the readable surface of a CD-R. Music and data CDs are coded using error correcting codes and thus can still be read even if they have minor scratches, using error detection and correction.]
• Data compression (source coding): There are two formulations for the compression problem:
1. lossless data compression: the data must be reconstructed exactly;
2. lossy data compression: allocates bits needed to reconstruct the data, within a specified fidelity level measured by a distortion function. This subset of Information theory is called rate–distortion theory.
• Error-correcting codes (channel coding): While data compression removes as much redundancy as possible, an error correcting code adds just the right kind of redundancy (i.e., error correction) needed to transmit the data
efficiently and faithfully across a noisy channel. This division of coding theory into compression and transmission is justified by the information transmission theorems, or source–channel separation theorems that justify the use of bits as the universal currency for information in many contexts. However, these theorems only hold in the situation where one transmitting user wishes to communicate to one receiving user. In scenarios with more than one transmitter (the multiple-access channel), more than one receiver (the broadcast channel) or intermediary "helpers" (the relay channel), or more general networks, compression followed by transmission may no longer be optimal. Network information theory refers to these multi-agent communication models.
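As a toy illustration of the redundancy added by channel coding (the three-fold repetition code below is a standard textbook device, not a scheme discussed in this article), each bit is transmitted three times and decoded by majority vote, so any single flipped bit per triple is corrected:

def encode(bits):
    """Repetition code: send each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """Majority-vote each consecutive triple of received bits."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
sent = encode(message)          # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
sent[1] = 0                     # the channel flips one bit of the first triple
sent[6] = 0                     # ... and one bit of the third triple
print(decode(sent) == message)  # True: one error per triple is corrected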
Source theory
Any process that generates successive messages can be considered a source of information. A memoryless source is one in which each message is an independent identically-distributed random variable, whereas the properties of ergodicity and stationarity impose more general constraints. All such sources are stochastic. These terms are well studied in their own right outside information theory.
Rate
Information rate is the average entropy per symbol. For memoryless sources, this is merely the entropy of each symbol, while, in the case of a stationary stochastic process, it is
$$r = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \ldots, X_1);$$
that is, the conditional entropy of a symbol given all the previous symbols generated. For the more general case of a process that is not necessarily stationary, the average rate is
$$r = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n);$$
that is, the limit of the joint entropy per symbol. For stationary sources, these two expressions give the same result.[10] It is common in information theory to speak of the "rate" or "entropy" of a language. This is appropriate, for example, when the source of information is English prose. The rate of a source of information is related to its redundancy and how well it can be compressed, the subject of source coding.
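For a stationary Markov source, the limits above reduce to the conditional entropy of the next symbol given the current state. The following sketch (the two-state transition matrix is an invented example, not from the article) estimates the stationary distribution by iteration and evaluates the entropy rate in bits per symbol:

from math import log2

def binary_entropy(p):
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Two-state Markov source: P[i][j] is the probability of moving from state i to state j.
P = [[0.9, 0.1],
     [0.2, 0.8]]

# Stationary distribution pi satisfying pi P = pi, found by simple iteration.
pi = [0.5, 0.5]
for _ in range(1000):
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]

# Entropy rate: average over states of the entropy of the next-symbol distribution.
rate = pi[0] * binary_entropy(P[0][1]) + pi[1] * binary_entropy(P[1][0])
print(pi, rate)   # pi is about [0.667, 0.333]; rate is about 0.553 bits per symbol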
Channel capacity
Communications over a channel—such as an ethernet cable—is the primary motivation of information theory. As anyone who's ever used a telephone (mobile or landline) knows, however, such channels often fail to produce exact reconstruction of a signal; noise, periods of silence, and other forms of signal corruption often degrade quality. How much information can one hope to communicate over a noisy (or otherwise imperfect) channel?
Consider the communications process over a discrete channel. A simple model of the process is shown below:
[Figure: schematic model of communication over a discrete channel.]
Here X represents the space of messages transmitted, and Y the space of messages received during a unit time over our channel. Let $p(y \mid x)$ be the conditional probability distribution function of Y given X. We will consider $p(y \mid x)$ to be an inherent fixed property of our communications channel (representing the nature of the noise of our channel). Then the joint distribution of X and Y is completely determined by our channel and by our choice of $f(x)$, the marginal distribution of messages we choose to send over the channel. Under these constraints, we would like to maximize the rate of information, or the signal, we can communicate over the channel. The appropriate measure for this is the mutual information, and this maximum mutual information is called the channel capacity and is given by:
$$C = \max_{f} I(X; Y).$$
This capacity has the following property related to communicating at information rate R (where R is usually bits per symbol). For any information rate R < C and coding error ε > 0, for large enough N, there exists a code of length N and rate ≥ R and a decoding algorithm, such that the maximal probability of block error is ≤ ε; that is, it is always possible to transmit with arbitrarily small block error. In addition, for any rate R > C, it is impossible to transmit with arbitrarily small block error.
Channel coding is concerned with finding such nearly optimal codes that can be used to transmit data over a noisy channel with a small coding error at a rate near the channel capacity.
Capacity of particular channel models
• A continuous-time analog communications channel subject to Gaussian noise — see Shannon–Hartley theorem.
• A binary symmetric channel (BSC) with crossover probability p is a binary input, binary output channel that flips the input bit with probability p. The BSC has a capacity of $1 - H_\mathrm{b}(p)$ bits per channel use, where $H_\mathrm{b}$ is the binary entropy function to the base 2 logarithm:
$$C_{\mathrm{BSC}} = 1 - H_\mathrm{b}(p) = 1 + p \log_2 p + (1 - p) \log_2 (1 - p).$$
• A binary erasure channel (BEC) with erasure probability p is a binary input, ternary output channel. The possible channel outputs are 0, 1, and a third symbol 'e' called an erasure. The erasure represents complete loss of information about an input bit. The capacity of the BEC is $1 - p$ bits per channel use.
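Both closed-form capacities above are easy to evaluate; a brief Python sketch (values in the comments are rounded):

from math import log2

def binary_entropy(p):
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

def bec_capacity(p):
    """Capacity of the binary erasure channel with erasure probability p."""
    return 1.0 - p

for p in (0.0, 0.1, 0.5):
    print(p, bsc_capacity(p), bec_capacity(p))
# p = 0.0: both channels are perfect, capacity 1 bit per use
# p = 0.1: BSC about 0.531, BEC 0.9 bits per use
# p = 0.5: BSC 0 (the output tells us nothing), BEC 0.5 bits per use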
Applications to other fields Intelligence uses and secrecy applications Information theoretic concepts apply to cryptography and cryptanalysis. Turing's information unit, the ban, was used in the Ultra project, breaking the German Enigma machine code and hastening the end of WWII in Europe. Shannon himself defined an important concept now called the unicity distance. Based on the redundancy of the plaintext, it attempts to give a minimum amount of ciphertext necessary to ensure unique decipherability. Information theory leads us to believe it is much more difficult to keep secrets than it might first appear. A brute force attack can break systems based on asymmetric key algorithms or on most commonly used methods of symmetric key algorithms (sometimes called secret key algorithms), such as block ciphers. The security of all such methods currently comes from the assumption that no known attack can break them in a practical amount of time. Information theoretic security refers to methods such as the one-time pad that are not vulnerable to such brute force attacks. In such cases, the positive conditional mutual information between the plaintext and ciphertext (conditioned on the key) can ensure proper transmission, while the unconditional mutual information between the plaintext and ciphertext remains zero, resulting in absolutely secure communications. In other words, an eavesdropper would not be able to improve his or her guess of the plaintext by gaining knowledge of the ciphertext but not of the key. However, as in any other cryptographic system, care must be used to correctly apply even information-theoretically secure methods; the Venona project was able to crack the one-time pads of the Soviet Union due to their improper reuse of key material.
Pseudorandom number generation
Pseudorandom number generators are widely available in computer language libraries and application programs. They are, almost universally, unsuited to cryptographic use as they do not evade the deterministic nature of modern computer equipment and software. A class of improved random number generators is termed cryptographically secure pseudorandom number generators, but even they require random seeds external to the software in order to work as intended. These can be obtained via extractors, if done carefully. The measure of sufficient randomness in extractors is min-entropy, a value related to Shannon entropy through Rényi entropy; Rényi entropy is also used in evaluating randomness in cryptographic systems. Although related, the distinctions among these measures mean that a random variable with high Shannon entropy is not necessarily satisfactory for use in an extractor and so for cryptography uses.
Seismic exploration
One early commercial application of information theory was in the field of seismic oil exploration. Work in this field made it possible to strip off and separate the unwanted noise from the desired seismic signal. Information theory and digital signal processing offer a major improvement of resolution and image clarity over previous analog methods.[11]
Miscellaneous applications
Information theory also has applications in gambling and investing, black holes, bioinformatics, and music.
References [1] F. Rieke, D. Warland, R Ruyter van Steveninck, W Bialek, Spikes: Exploring the Neural Code. The MIT press (1997). [2] cf. Huelsenbeck, J. P., F. Ronquist, R. Nielsen and J. P. Bollback (2001) Bayesian inference of phylogeny and its impact on evolutionary biology, Science 294:2310-2314 [3] Rando Allikmets, Wyeth W. Wasserman, Amy Hutchinson, Philip Smallwood, Jeremy Nathans, Peter K. Rogan, Thomas D. Schneider (http:/ / www. lecb. ncifcrf. gov/ ~toms/ ), Michael Dean (1998) Organization of the ABCR gene: analysis of promoter and splice junction sequences, Gene 215:1, 111-122 [4] Burnham, K. P. and Anderson D. R. (2002) Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach, Second Edition (Springer Science, New York) ISBN 978-0-387-95364-9. [5] Jaynes, E. T. (1957) Information Theory and Statistical Mechanics (http:/ / bayes. wustl. edu/ ), Phys. Rev. 106:620 [6] Charles H. Bennett, Ming Li, and Bin Ma (2003) Chain Letters and Evolutionary Histories (http:/ / sciamdigital. com/ index. cfm?fa=Products. ViewIssuePreview& ARTICLEID_CHAR=08B64096-0772-4904-9D48227D5C9FAC75), Scientific American 288:6, 76-81 [7] David R. Anderson (November 1, 2003). "Some background on why people in the empirical sciences may want to better understand the information-theoretic methods" (http:/ / aicanderson2. home. comcast. net/ ~aicanderson2/ home. pdf) (pdf). . Retrieved 2010-06-23. [8] Fazlollah M. Reza (1961, 1994). An Introduction to Information Theory (http:/ / books. google. com/ books?id=RtzpRAiX6OgC& pg=PA8& dq=intitle:"An+ Introduction+ to+ Information+ Theory"+ + "entropy+ of+ a+ simple+ source"& as_brr=0& ei=zP79Ro7UBovqoQK4g_nCCw& sig=j3lPgyYrC3-bvn1Td42TZgTzj0Q). Dover Publications, Inc., New York. ISBN 0-486-68210-2. . [9] Robert B. Ash (1965, 1990). Information Theory (http:/ / books. google. com/ books?id=ngZhvUfF0UIC& pg=PA16& dq=intitle:information+ intitle:theory+ inauthor:ash+ conditional+ uncertainty& as_brr=0& ei=kKwNR4rbH5mepgKB4d2zBg& sig=YAsiCEVISjJ484R3uGoXpi-a5rI). Dover Publications, Inc.. ISBN 0-486-66521-6. . [10] Jerry D. Gibson (1998). Digital Compression for Multimedia: Principles and Standards (http:/ / books. google. com/ books?id=aqQ2Ry6spu0C& pg=PA56& dq=entropy-rate+ conditional& as_brr=3& ei=YGDsRtzGGKjupQKa2L2xDw& sig=o0UCtf0xZOf11lPIexPrjOKPgNc#PPA57,M1). Morgan Kaufmann. ISBN 1558603697. . [11] The Corporation and Innovation, Haggerty, Patrick, Strategic Management Journal, Vol. 2, 97-118 (1981)
The classic work • Shannon, C.E. (1948), "A Mathematical Theory of Communication", Bell System Technical Journal, 27, pp. 379–423 & 623–656, July & October, 1948. PDF. (http://cm.bell-labs.com/cm/ms/what/shannonday/ shannon1948.pdf) Notes and other formats. (http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html) • R.V.L. Hartley, "Transmission of Information" (http://www.dotrose.com/etext/90_Miscellaneous/ transmission_of_information_1928b.pdf), Bell System Technical Journal, July 1928 • Andrey Kolmogorov (1968), "Three approaches to the quantitative definition of information" in International Journal of Computer Mathematics.
Other journal articles • J. L. Kelly, Jr., Saratoga.ny.us (http://www.racing.saratoga.ny.us/kelly.pdf), "A New Interpretation of Information Rate" Bell System Technical Journal, Vol. 35, July 1956, pp. 917–26. • R. Landauer, IEEE.org (http://ieeexplore.ieee.org/search/wrapper.jsp?arnumber=615478), "Information is Physical" Proc. Workshop on Physics and Computation PhysComp'92 (IEEE Comp. Sci.Press, Los Alamitos, 1993) pp. 1–4. • R. Landauer, IBM.com (http://www.research.ibm.com/journal/rd/441/landauerii.pdf), "Irreversibility and Heat Generation in the Computing Process" IBM J. Res. Develop. Vol. 5, No. 3, 1961
Textbooks on information theory • Claude E. Shannon, Warren Weaver. The Mathematical Theory of Communication. Univ of Illinois Press, 1949. ISBN 0-252-72548-4 • Robert Gallager. Information Theory and Reliable Communication. New York: John Wiley and Sons, 1968. ISBN 0-471-29048-3 • Robert B. Ash. Information Theory. New York: Interscience, 1965. ISBN 0-470-03445-9. New York: Dover 1990. ISBN 0-486-66521-6 • Thomas M. Cover, Joy A. Thomas. Elements of information theory, 1st Edition. New York: Wiley-Interscience, 1991. ISBN 0-471-06259-6. 2nd Edition. New York: Wiley-Interscience, 2006. ISBN 0-471-24195-4. • Imre Csiszar, Janos Korner. Information Theory: Coding Theorems for Discrete Memoryless Systems Akademiai Kiado: 2nd edition, 1997. ISBN 963-05-7440-3 • Raymond W. Yeung. A First Course in Information Theory (http://iest2.ie.cuhk.edu.hk/~whyeung/book/) Kluwer Academic/Plenum Publishers, 2002. ISBN 0-306-46791-7 • David J. C. MacKay. Information Theory, Inference, and Learning Algorithms (http://www.inference.phy.cam. ac.uk/mackay/itila/book.html) Cambridge: Cambridge University Press, 2003. ISBN 0-521-64298-1 • Raymond W. Yeung. Information Theory and Network Coding (http://iest2.ie.cuhk.edu.hk/~whyeung/book2/ ) Springer 2008, 2002. ISBN 978-0-387-79233-0 • Stanford Goldman. Information Theory. New York: Prentice Hall, 1953. New York: Dover 1968 ISBN 0-486-62209-6, 2005 ISBN 0-486-44271-3 • Fazlollah Reza. An Introduction to Information Theory. New York: McGraw-Hill 1961. New York: Dover 1994. ISBN 0-486-68210-2 • Masud Mansuripur. Introduction to Information Theory. New York: Prentice Hall, 1987. ISBN 0-13-484668-0 • Christoph Arndt: Information Measures, Information and its Description in Science and Engineering (Springer Series: Signals and Communication Technology), 2004, ISBN 978-3-540-40855-0
Other books • Leon Brillouin, Science and Information Theory, Mineola, N.Y.: Dover, [1956, 1962] 2004. ISBN 0-486-43918-6 • James Gleick, The Information: A History, a Theory, a Flood, New York: Pantheon, 2011. ISBN 978-0375423727 • A. I. Khinchin, Mathematical Foundations of Information Theory, New York: Dover, 1957. ISBN 0-486-60434-9 • H. S. Leff and A. F. Rex, Editors, Maxwell's Demon: Entropy, Information, Computing, Princeton University Press, Princeton, NJ (1990). ISBN 0-691-08727-X • Tom Siegfried, The Bit and the Pendulum, Wiley, 2000. ISBN 0-471-32174-5 • Charles Seife, Decoding The Universe, Viking, 2006. ISBN 0-670-03441-X • Jeremy Campbell, Grammatical Man, Touchstone/Simon & Schuster, 1982, ISBN 0-671-44062-4 • Henri Theil, Economics and Information Theory, Rand McNally & Company - Chicago, 1967.
External links • alum.mit.edu (http://alum.mit.edu/www/toms/paper/primer), Eprint, Schneider, T. D., "Information Theory Primer" • ND.edu (http://www.nd.edu/~jnl/ee80653/tutorials/sunil.pdf), Srinivasa, S. "A Review on Multivariate Mutual Information" • Chem.wisc.edu (http://jchemed.chem.wisc.edu/Journal/Issues/1999/Oct/abs1385.html), Journal of Chemical Education, Shuffled Cards, Messy Desks, and Disorderly Dorm Rooms - Examples of Entropy Increase? Nonsense! • ITsoc.org (http://www.itsoc.org/index.html), IEEE Information Theory Society and ITsoc.org (http://www. itsoc.org/review.html) review articles • Cam.ac.uk (http://www.inference.phy.cam.ac.uk/mackay/itila/), On-line textbook: "Information Theory, Inference, and Learning Algorithms" by David MacKay - giving an entertaining and thorough introduction to Shannon theory, including state-of-the-art methods from coding theory, such as arithmetic coding, low-density parity-check codes, and Turbo codes. • UMBC.edu (http://research.umbc.edu/~erill/Documents/Introduction_Information_Theory.pdf), Eprint, Erill, I., "A gentle introduction to information content in transcription factor binding sites"
Marshall McLuhan
[Photo: Marshall McLuhan in the early 1970s]
Born: July 21, 1911, Edmonton, Alberta
Died: December 31, 1980 (aged 69), Toronto, Ontario
School: Media theory
Main interests: Media, Mass media, Sensorium, New Criticism
Notable ideas: The medium is the message, Global village, Metamedia, Media ecology, Figure and ground media, Tetrad of media effects, Hot and cool media
Herbert Marshall McLuhan, CC (July 21, 1911 – December 31, 1980) was a Canadian educator, philosopher, and scholar—a professor of English literature, a literary critic, a rhetorician, and a communication theorist. McLuhan's work is viewed as one of the cornerstones of the study of media theory, as well as having practical applications in the advertising and television industries.[1] [2] McLuhan is known for coining the expressions "the medium is the message" and "the global village", and for predicting the World Wide Web almost thirty years before it was invented.[3] Although he was a fixture in media discourse in the late 1960s, his influence waned in the years before and after his death, and he continued to be a controversial figure in academic circles.[4] In the Internet age, however, there was renewed interest in his work and perspective.[5] [6] [7]
Life and career McLuhan was born in Edmonton, Alberta, to Elsie Naomi (née Hall) and Herbert Ernest McLuhan. His brother, Maurice, was born two years later. "Marshall" was a family name: his maternal grandmother's surname. Both of his parents were born in Canada. His mother was a Baptist schoolteacher who later became an actress. His father was a Methodist and had a real estate business in Edmonton. When war broke out, the business failed, and McLuhan's father enlisted in the Canadian army. After a year of service he contracted influenza and remained in Canada, away from the front. After Herbert's discharge from the army in 1915, the McLuhan family moved to Winnipeg, Manitoba, where Marshall grew up and went to school, attending Kelvin Technical School before enrolling in the University of Manitoba in 1928.[8] At Manitoba, McLuhan's discomfort with religion[9] and his turn to literature to gratify his soul's hunger for truth and beauty[10] initiated a stage in his spiritual development which he would later refer to as agnosticism.[11] McLuhan earned a BA (1933)—winning a University Gold Medal in Arts and Sciences[12] [13] —and MA (1934) in English from the University of Manitoba, after a one year stint as an engineering major. He had long desired to pursue graduate studies in England and, having failed to secure a Rhodes scholarship to Oxford, McLuhan was accepted for enrollment at the University of Cambridge. Although he already had earned BA and MA degrees at Manitoba,
Cambridge required him to enroll as an undergraduate "affiliated" student, with one year's credit toward a three-year Cambridge Bachelor's degree, before any doctoral studies.[14] He entered Trinity Hall, Cambridge in the Fall of 1934, where he studied under I. A. Richards and F. R. Leavis, and was influenced by New Criticism.[15] Upon reflection years after, he credited the faculty there with influencing the direction of his later work because of their emphasis on the training of perception and such concepts as Richards' notion of feedforward.[16] These studies formed an important precursor to his later ideas on technological forms.[17] He received his bachelor's degree from Cambridge in 1936[18] and began graduate work. Later, he returned from England to take a job as a teaching assistant at the University of Wisconsin–Madison, which he held for the 1936–37 academic year, unable to find a suitable job in Canada.[19] While studying the trivium at Cambridge he took the first steps toward his eventual conversion to Roman Catholicism in 1937,[20] founded on his reading of G. K. Chesterton.[21] In 1935 he wrote to his mother: "[H]ad I not encountered Chesterton, I would have remained agnostic for many years at least".[22] At the end of March 1937,[23] McLuhan completed what was a slow but total conversion process when he was formally received into the Roman Catholic Church. After consulting with a minister, his father accepted the decision to convert; his mother, however, felt that his conversion would hurt his career and was inconsolable.[24] McLuhan was devout throughout his life, but his religion remained a private matter.[25] He had a lifelong interest in the number three[26] —the trivium, the Trinity—and sometimes said that the Virgin Mary provided intellectual guidance for him.[27] For the rest of his career he taught in Roman Catholic institutions of higher education. From 1937 to 1944 he taught English at Saint Louis University (with an interruption from 1939 to 1940 when he returned to Cambridge). At Saint Louis he tutored and befriended Walter J. Ong, S.J. (1912–2003), who would go on to write his Ph.D. dissertation on a topic McLuhan had called to his attention, and who would himself also later become a well-known authority on communication and technology. While in St. Louis, he also met his future wife. On August 4, 1939, McLuhan married teacher and aspiring actress Corinne Lewis (1912–2008)[28] of Fort Worth, Texas, and they spent 1939–40 in Cambridge, where he completed his master's degree (awarded in January 1940[18] ) and began to work on his doctoral dissertation on Thomas Nashe and the verbal arts. War had broken out in Europe while the McLuhans were in England, and he obtained permission to complete and submit his dissertation from the United States, without having to return to Cambridge for an oral defense. In 1940 the McLuhans returned to Saint Louis University, where he continued teaching and they started a family. He was awarded a Ph.D. in December 1943.[29] Returning to Canada, from 1944 to 1946 McLuhan taught at Assumption College in Windsor, Ontario. Moving to Toronto in 1946, McLuhan joined the faculty of St. Michael's College, a Catholic college of the University of Toronto. Hugh Kenner was one of his students and Canadian economist and communications scholar Harold Innis was a university colleague who had a strong influence on McLuhan's work.
In the early 1950s, McLuhan began the Communication and Culture seminars, funded by the Ford Foundation, at the University of Toronto. As his reputation grew, he received a growing number of offers from other universities and, to keep him, the university created the Centre for Culture and Technology in 1963.[17] He published his first major work during this period: The Mechanical Bride (1951) was an examination of the effect of advertising on society and culture. He also produced an important journal, Explorations, with Edmund Carpenter, throughout the 1950s.[30] Together with Harold Innis, Eric A. Havelock, and Northrop Frye, McLuhan and Carpenter have been characterized as the Toronto School of communication theory. McLuhan remained at the University of Toronto through 1979, spending much of this time as head of his Centre for Culture and Technology. McLuhan was named to the Albert Schweitzer Chair in Humanities at Fordham University in the Bronx, New York, for one year (1967–68).[31] While at Fordham, McLuhan was diagnosed with a benign brain tumor; it was treated successfully. He returned to Toronto, where, for the rest of his life, he worked at the University of Toronto and lived in Wychwood Park, a bucolic enclave on a hill overlooking the downtown where Anatol Rapoport was his neighbour. In 1970, McLuhan was made a Companion of the Order of Canada.[32] In 1975 the University of Dallas
hosted him from April to May, appointing him to the McDermott Chair. Marshall and Corinne McLuhan had six children: Eric, twins Mary and Teresa, Stephanie, Elizabeth and Michael. The costs of supporting a large family eventually drove McLuhan to advertising work and to accepting frequent consulting and speaking engagements for large corporations, IBM and AT&T among them.[17]
In September 1979 he suffered a stroke, which affected his ability to speak. The University of Toronto's School of Graduate Studies tried to close his research center shortly thereafter, but was deterred by substantial protests, most notably by Woody Allen, in whose Oscar-winning motion picture Annie Hall McLuhan had a cameo role: a pompous academic arguing with Allen in a cinema queue is deflated when McLuhan himself suddenly appears and says, "You know nothing of my work." This was one of McLuhan's most frequently expressed statements to and about those who would disagree with him.[33] [34] He never fully recovered from the stroke and died in his sleep on December 31, 1980.
Major works
During his years at Saint Louis University (1937–1944), McLuhan worked concurrently on two projects: his doctoral dissertation and the manuscript that was eventually published in 1951 as the book The Mechanical Bride, which included only a representative selection of the materials that McLuhan had prepared for it.
McLuhan's 1942 Cambridge University doctoral dissertation surveys the history of the verbal arts (grammar, logic, and rhetoric—collectively known as the trivium) from the time of Cicero down to the time of Thomas Nashe.[35] In his later publications, McLuhan at times uses the Latin concept of the trivium to outline an orderly and systematic picture of certain periods in the history of Western culture. McLuhan suggests that the Middle Ages, for instance, were characterized by a heavy emphasis on the formal study of logic. The key development that led to the Renaissance was not the rediscovery of ancient texts but a shift in emphasis from the formal study of logic to rhetoric and language. Modern life is characterized by the reemergence of grammar as its most salient feature—a trend McLuhan felt was exemplified by the New Criticism of Richards and Leavis.[36]
In The Mechanical Bride, McLuhan turned his attention to analyzing and commenting on numerous examples of persuasion in contemporary popular culture. This followed naturally from his earlier work, as both dialectic and rhetoric in the classical trivium aimed at persuasion. At this point his focus shifted dramatically, turning inward to study the influence of communication media independent of their content. His famous aphorism "the medium is the message" (elaborated in his 1964 book, Understanding Media: The Extensions of Man) calls attention to this intrinsic effect of communications media.[37]
McLuhan also started the journal Explorations with anthropologist Edmund "Ted" Carpenter. In a letter to Walter Ong dated May 31, 1953, McLuhan reported that he had received a two-year grant of $43,000 from the Ford Foundation to carry out a communication project at the University of Toronto involving faculty from different disciplines, which led to the creation of the journal.
Tom Wolfe suggests that a hidden influence on McLuhan's work is the Catholic philosopher Teilhard de Chardin, whose ideas anticipated those of McLuhan, especially the evolution of the human mind into the "noosphere". Wolfe theorizes that McLuhan may have thought that associating his ideas with those of a Catholic theologian, albeit one suppressed by Rome, would have denied him the intellectual audience he wanted to reach, and so he omitted all reference to de Chardin from his published work while privately acknowledging the influence.[38]
The Mechanical Bride (1951)
McLuhan's first book, The Mechanical Bride: Folklore of Industrial Man (1951), is a pioneering study in the field now known as popular culture. His interest in the critical study of popular culture was influenced by the 1933 book Culture and Environment by F. R. Leavis and Denys Thompson, and the title The Mechanical Bride is derived from a piece by the Dadaist artist Marcel Duchamp.
Like his 1962 book The Gutenberg Galaxy, The Mechanical Bride is composed of a number of short essays that can be read in any order – what he styled the "mosaic approach" to writing a book. Each essay begins with a newspaper or magazine article or an advertisement, followed by McLuhan's analysis thereof. The analyses bear on aesthetic considerations as well as on the implications behind the imagery and text. McLuhan chose the ads and articles included in his book not only to draw attention to their symbolism and their implications for the corporate entities that created and disseminated them, but also to mull over what such advertising implies about the wider society at which it is aimed.
Examples of advertisements
• A nose for news and a stomach for whiskey: McLuhan analyzes an ad for Time magazine in which a reporter is depicted as a romantic character from a Hemingway novel; McLuhan asks, "Why is it [his] plangent duty to achieve cirrhosis of the liver?"[39]
• Freedom to Listen – Freedom to Look: An ad for the Radio Corporation of America depicts a rural family going about their business with the radio on. Earlier in the Bride McLuhan notes "We still have our freedom to listen?" and here "Come on kiddies. Buy a radio and feel free—to listen."[40]
• For Men of Distinction – Lord Calvert: An ad for Lord Calvert whiskey depicts nine gentlemen holding a glass of their whiskey, while McLuhan notes the lack of non-artists amongst them: "Why pick on the arts? Hasn't anyone in science or industry ever distinguished himself by drinking whiskey?"[41]
• The Famous DuBarry Success Course: An ad for beauty creams, complete with a female model in a swimsuit, hawks itself as a "success course" complete with "tuition", to which McLuhan asks, "Why laugh and grow fat when you can experience anguish and success in a strait jacket?"[42]
The Gutenberg Galaxy (1962)
McLuhan's The Gutenberg Galaxy: The Making of Typographic Man (written in 1961, first published in Canada by University of Toronto Press in 1962) is a pioneering study in the fields of oral culture, print culture, cultural studies, and media ecology. Throughout the book, McLuhan takes pains to reveal how communication technology (alphabetic writing, the printing press, and the electronic media) affects cognitive organization, which in turn has profound ramifications for social organization:
...[I]f a new technology extends one or more of our senses outside us into the social world, then new ratios among all of our senses will occur in that particular culture. It is comparable to what happens when a new note is added to a melody. And when the sense ratios alter in any culture then what had appeared lucid before may suddenly become opaque, and what had been vague or opaque will become translucent.[43]
Movable type
His episodic and often rambling history takes the reader from pre-alphabetic tribal humankind to the electronic age. According to McLuhan, the invention of movable type greatly accelerated, intensified, and ultimately enabled cultural and cognitive changes that had already been taking place since the invention and implementation of the alphabet, by which McLuhan means phonemic orthography. (McLuhan is careful to distinguish the phonetic alphabet from logographic/logogramic writing systems, like hieroglyphics or ideograms.)
Print culture, ushered in by the Gutenberg press in the middle of the fifteenth century, brought about the cultural predominance of the visual over the aural/oral. Quoting with approval an observation on the nature of the printed word from Prints and Visual Communication by William Ivins, McLuhan remarks:
In this passage [Ivins] not only notes the ingraining of lineal, sequential habits, but, even more important, points out the visual homogenizing of experience of print culture, and the relegation of auditory and other sensuous complexity to the background. [...] The technology and social effects of typography incline us to abstain from noting interplay and, as it were, "formal" causality, both in our inner and external lives. Print exists by virtue of the static separation of functions and fosters a mentality that gradually resists any but a separative and compartmentalizing or specialist outlook.[44]
The main concept of McLuhan's argument (later elaborated upon in The Medium is the Massage) is that new technologies (like alphabets, printing presses, and even speech itself) exert a gravitational effect on cognition, which in turn affects social organization: print technology changes our perceptual habits ("visual homogenizing of experience"), which in turn affects social interactions ("fosters a mentality that gradually resists all but a... specialist outlook"). According to McLuhan, the advent of print technology contributed to and made possible most of the salient trends in the Modern period in the Western world: individualism, democracy, Protestantism, capitalism and nationalism. For McLuhan, these trends all reverberate with print technology's principle of "segmentation of actions and functions and principle of visual quantification."[45]
The global village
In the early 1960s, McLuhan wrote that the visual, individualistic print culture would soon be brought to an end by what he called "electronic interdependence": when electronic media replace visual culture with aural/oral culture. In this new age, humankind will move from individualism and fragmentation to a collective identity, with a "tribal base." McLuhan's coinage for this new social organization is the global village.[46]
The term is sometimes described as having negative connotations in The Gutenberg Galaxy, but McLuhan himself was interested in exploring effects, not making value judgments:
Instead of tending towards a vast Alexandrian library the world has become a computer, an electronic brain, exactly as in an infantile piece of science fiction. And as our senses have gone outside us, Big Brother goes inside. So, unless aware of this dynamic, we shall at once move into a phase of panic terrors, exactly befitting a small world of tribal drums, total interdependence, and superimposed co-existence. [...] Terror is the normal state of any oral society, for in it everything affects everything all the time. [...] In our long striving to recover for the Western world a unity of sensibility and of thought and feeling we have no more been prepared to accept the tribal consequences of such unity than we were ready for the fragmentation of the human psyche by print culture.[47]
Key to McLuhan's argument is the idea that technology has no per se moral bent; it is a tool that profoundly shapes an individual's and, by extension, a society's self-conception and realization:
Is it not obvious that there are always enough moral problems without also taking a moral stand on technological grounds? [...]
Print is the extreme phase of alphabet culture that detribalizes or decollectivizes man in the first instance. Print raises the visual features of alphabet to highest intensity of definition. Thus print carries the individuating power of the phonetic alphabet much further than manuscript culture could ever do. Print is the technology of individualism. If men decided to modify this visual technology by an electric technology, individualism would also be modified. To raise a moral complaint about this is like cussing a buzz-saw for lopping off fingers. "But", someone says, "we didn't know it would happen." Yet even witlessness is not a moral issue. It is a problem, but not a moral problem; and it would be nice to clear away some of the moral fogs that surround our technologies. It would be good for morality.[48]
The moral valence of technology's effects on cognition is, for McLuhan, a matter of perspective. For instance, McLuhan contrasts the considerable alarm and revulsion that the growing quantity of books aroused in the latter seventeenth century with the modern concern for the "end of the book". If there can be no universal moral sentence passed on technology, McLuhan believes that "there can only be disaster arising from unawareness of the causalities and effects inherent in our technologies".[49]
Though the World Wide Web was invented almost thirty years after The Gutenberg Galaxy was published, and ten years after his death, McLuhan prophesied the web technology seen today as early as 1962:
The next medium, whatever it is – it may be the extension of consciousness – will include television as its content, not as its environment, and will transform television into an art form. A computer as a research and communication instrument could enhance retrieval, obsolesce mass library organization, retrieve the individual's encyclopedic function and flip into a private line to speedily tailored data of a saleable kind (Marshall McLuhan 1962).[50]
Furthermore, McLuhan coined and certainly popularized the usage of the term "surfing" to refer to rapid, irregular and multidirectional movement through a heterogeneous body of documents or knowledge, e.g., in statements like "Heidegger surf-boards along on the electronic wave as triumphantly as Descartes rode the mechanical wave." Paul Levinson's 1999 book Digital McLuhan explores the ways that McLuhan's work can be better understood through the lens of the digital revolution.[3]
McLuhan frequently quoted Walter Ong's Ramus, Method, and the Decay of Dialogue (1958), which evidently had prompted McLuhan to write The Gutenberg Galaxy. Ong wrote a highly favorable review of this new book in America.[51] However, Ong later tempered his praise, describing McLuhan's The Gutenberg Galaxy as "a racy survey, indifferent to some scholarly detail, but uniquely valuable in suggesting the sweep and depth of the cultural and psychological changes entailed in the passage from illiteracy to print and beyond."[52] McLuhan himself said of the book, "I'm not concerned to get any kudos out of [The Gutenberg Galaxy]. It seems to me a book that somebody should have written a century ago. I wish somebody else had written it. It will be a useful prelude to the rewrite of Understanding Media [the 1960 NAEB report] that I'm doing now."
McLuhan's The Gutenberg Galaxy won Canada's highest literary award, the Governor-General's Award for Non-Fiction, in 1962. The chairman of the selection committee was McLuhan's colleague at the University of Toronto and longtime intellectual sparring partner, Northrop Frye.[53]
Understanding Media (1964) McLuhan's most widely known work, Understanding Media: The Extensions of Man (1964), is a pioneering study in media theory. Building on an idea he owed to his colleague Harold Innis ("We change our tools and then our tools change us"),[54] McLuhan proposed that media themselves, not the content they carry, should be the focus of study—popularly quoted as "the medium is the message". McLuhan's insight was that a medium affects the society in which it plays a role not by the content delivered over the medium, but by the characteristics of the medium itself. McLuhan pointed to the light bulb as a clear demonstration of this concept. A light bulb does not have content in the way that a newspaper has articles or a television has programs, yet it is a medium that has a social effect; that is, a light bulb enables people to create spaces during nighttime that would otherwise be enveloped by darkness. He describes the light bulb as a medium without any content. McLuhan states that "a light bulb creates an environment by its mere presence."[55] More controversially, he postulated that content had little effect on society—in other words, it did not matter if television broadcasts children's shows or violent programming, to illustrate one example—the effect of television on society would be identical. He noted that all media have characteristics that engage the viewer in different ways; for instance, a passage in a book could be reread at will, but a movie had to be screened again in its entirety to study any individual part of it.
Marshall McLuhan "Hot" and "cool" media In the first part of Understanding Media, McLuhan also stated that different media invite different degrees of participation on the part of a person who chooses to consume a medium. Some media, like the movies, were "hot"—that is, they enhance one single sense, in this case vision, in such a manner that a person does not need to exert much effort in filling in the details of a movie image. McLuhan contrasted this with "cool" TV, which he claimed requires more effort on the part of the viewer to determine meaning, and comics, which due to their minimal presentation of visual detail require a high degree of effort to fill in details that the cartoonist may have intended to portray. A movie is thus said by McLuhan to be "hot", intensifying one single sense "high definition", demanding a viewer's attention, and a comic book to be "cool" and "low definition", requiring much more conscious participation by the reader to extract value.[56] "Any hot medium allows of less participation than a cool one, as a lecture makes for less participation than a seminar, and a book for less than a dialogue." [57] Hot media usually, but not always, provide complete involvement without considerable stimulus. For example, print occupies visual space, uses visual senses, but can immerse its reader. Hot media favour analytical precision, quantitative analysis and sequential ordering, as they are usually sequential, linear and logical. They emphasize one sense (for example, of sight or sound) over the others. For this reason, hot media also include radio, as well as film, the lecture and photography. Cool media, on the other hand, are usually, but not always, those that provide little involvement with substantial stimulus. They require more active participation on the part of the user, including the perception of abstract patterning and simultaneous comprehension of all parts. Therefore, according to McLuhan cool media include television, as well as the seminar and cartoons. McLuhan describes the term "cool media" as emerging from jazz and popular music and, in this context, is used to mean "detached." [58] This concept appears to force media into binary categories. However, McLuhan's hot and cool exist on a continuum: they are more correctly measured on a scale than as dichotomous terms.[17]
The Medium is the Massage: An Inventory of Effects (1967) This book, published in 1967, was McLuhan's best seller,[6] "eventually selling nearly a million copies worldwide."[59] Initiated by Quentin Fiore,[60] McLuhan adopted the term "massage" to denote the effect each medium has on the human sensorium, taking inventory of the "effects" of numerous media in terms of how they "massage" the sensorium.[61] Fiore, at the time a prominent graphic designer and communications consultant, set about composing the visual illustration of these effects which were compiled by Jerome Agel. Near the beginning of the book, Fiore adopted a pattern in which an image demonstrating a media effect was presented with a textual synopsis on the facing page. The reader experiences a repeated shifting of analytic registers—from "reading" typographic print to "scanning" photographic facsimiles—reinforcing McLuhan's overarching argument in this book: namely, that each medium produces a different "massage" or "effect" on the human sensorium. In The Medium is the Massage, McLuhan also rehashed the argument—which first appeared in the Prologue to 1962's The Gutenberg Galaxy—that media are "extensions" of our human senses, bodies and minds. Finally, McLuhan described key points of change in how man has viewed the world and how these views were changed by the adoption of new media. "The technique of invention was the discovery of the nineteenth [century]", brought on by the adoption of fixed points of view and perspective by typography, while "[t]he technique of the suspended judgment is the discovery of the twentieth century", brought on by the bard abilities of radio, movies and television.[62] An audio recording version of McLuhan's famous work was made by Columbia Records. The recording consists of a pastiche of statements made by McLuhan interrupted by other speakers, including people speaking in various
phonations and falsettos, discordant sounds and 1960s incidental music in what could be considered a deliberate attempt to translate the disconnected images seen on TV into an audio format, resulting in the prevention of a connected stream of conscious thought. Various audio recording techniques and statements are used to illustrate the relationship between spoken, literary speech and the characteristics of electronic audio media. McLuhan biographer Philip Marchand called the recording "the 1967 equivalent of a McLuhan video."[63]
"I wouldn't be seen dead with a living work of art." – 'Old man' speaking
"Drop this jiggery-pokery and talk straight turkey." – 'Middle aged man' speaking
War and Peace in the Global Village (1968)
McLuhan used James Joyce's Finnegans Wake as an inspiration for this study of war throughout history and as an indicator of how war may be conducted in the future. Joyce's Wake is claimed to be a gigantic cryptogram which reveals a cyclic pattern for the whole history of man through its Ten Thunders. Each "thunder" below is a 100-character portmanteau of other words that creates a statement McLuhan likens to the effect that each technology has on the society into which it is introduced. In order to glean the most understanding out of each, the reader must break the portmanteau into separate words (many of which are themselves portmanteaus of words taken from multiple languages other than English) and speak them aloud for the spoken effect of each word. There is much dispute over what each portmanteau truly denotes.
McLuhan claims that the ten thunders in the Wake represent different stages in the history of man:[64]
• Thunder 1: Paleolithic to Neolithic. Speech. Split of East/West. From herding to harnessing animals.
• Thunder 2: Clothing as weaponry. Enclosure of private parts. First social aggression.
• Thunder 3: Specialism. Centralism via wheel, transport, cities: civil life.
• Thunder 4: Markets and truck gardens. Patterns of nature submitted to greed and power.
• Thunder 5: Printing. Distortion and translation of human patterns and postures and pastors.
• Thunder 6: Industrial Revolution. Extreme development of print process and individualism.
• Thunder 7: Tribal man again. All choractors end up separate, private man. Return of choric.
• Thunder 8: Movies. Pop art, pop Kulch via tribal radio. Wedding of sight and sound.
• Thunder 9: Car and Plane. Both centralizing and decentralizing at once create cities in crisis. Speed and death.
• Thunder 10: Television. Back to tribal involvement in tribal mood-mud. The last thunder is a turbulent, muddy wake, and murk of non-visual, tactile man.
From Cliché to Archetype (1970)
In his 1970 book From Cliché to Archetype, McLuhan, collaborating with Canadian poet Wilfred Watson,[65] approached the various implications of the verbal cliché and of the archetype. One major facet of McLuhan's overall framework introduced in this book, and one that is seldom noticed, is his provision of a new term that actually succeeds the global village: the global theater.
In McLuhan's terms, a cliché is a "normal" action, phrase, etc. which becomes so often used that we are "anesthetized" to its effects. An example McLuhan gives of this is Eugene Ionesco's play The Bald Soprano, whose dialogue consists entirely of phrases Ionesco pulled from an Assimil language book: "Ionesco originally put all these idiomatic English clichés into literary French which presented the English in the most absurd aspect possible."[66]
McLuhan's archetype "is a quoted extension, medium, technology or environment." "Environment" here also includes the kinds of "awareness" and cognitive shifts brought upon people by it, not totally unlike the psychological context Carl Jung described. McLuhan also posits that there is a factor of interplay between the cliché and the archetype, or a "doubleness":
Another theme of the Wake [Finnegans Wake] that helps in the understanding of the paradoxical shift from cliché to archetype is 'past times are pastimes.' The dominant technologies of one age become the games and pastimes of a later age. In the 20th century, the number of 'past times' that are simultaneously available is so vast as to create cultural anarchy. When all the cultures of the world are simultaneously present, the work of the artist in the elucidation of form takes on new scope and new urgency. Most men are pushed into the artist's role. The artist cannot dispense with the principle of 'doubleness' or 'interplay' because this type of hendiadys dialogue is essential to the very structure of consciousness, awareness, and autonomy.[67]
McLuhan relates the cliché-to-archetype process to the Theater of the Absurd:
Pascal, in the seventeenth century, tells us that the heart has many reasons of which the head knows nothing. The Theater of the Absurd is essentially a communicating to the head of some of the silent languages of the heart which in two or three hundred years it has tried to forget all about. In the seventeenth century world the languages of the heart were pushed down into the unconscious by the dominant print cliché.[68]
The "languages of the heart", or what McLuhan would otherwise define as oral culture, were thus made archetype by means of the printing press and turned into cliché.
The satellite medium, McLuhan states, encloses the Earth in a man-made environment, which "ends 'Nature' and turns the globe into a repertory theater to be programmed."[69] All previous environments (book, newspaper, radio, etc.) and their artifacts are retrieved under these conditions ("past times are pastimes"). McLuhan thereby meshes this into the term global theater. It serves as an update to his older concept of the global village, which, in its own definitions, can be said to be subsumed into the overall condition described by the global theater.
Key concepts
Tetrad
In Laws of Media (1988), published posthumously by his son Eric, McLuhan summarized his ideas about media in a concise tetrad of media effects. The tetrad is a means of examining the effects on society of any technology (i.e., any medium) by dividing its effects into four categories and displaying them simultaneously. McLuhan designed the tetrad as a pedagogical tool, phrasing his laws as questions with which to consider any medium:
• What does the medium enhance?
• What does the medium make obsolete?
• What does the medium retrieve that had been obsolesced earlier?
• What does the medium flip into when pushed to extremes?
The laws of the tetrad exist simultaneously, not successively or chronologically, and allow the questioner to explore the "grammar and syntax" of the "language" of media. McLuhan departs from his mentor Harold Innis in suggesting that a medium "overheats", or reverses into an opposing form, when taken to its extreme.[17] Visually, a tetrad can be depicted as four diamonds forming an X, with the name of a medium in the center. The two diamonds on the left of a tetrad are the Enhancement and Retrieval qualities of the medium, both Figure qualities. The two diamonds on the right of a tetrad are the Obsolescence and Reversal qualities, both Ground qualities.[70]
Using the example of radio:
• Enhancement (figure): What the medium amplifies or intensifies. Radio amplifies news and music via sound.
• Obsolescence (ground): What the medium drives out of prominence. Radio reduces the importance of print and the visual.
• Retrieval (figure): What the medium recovers which was previously lost. Radio returns the spoken word to the forefront.
• Reversal (ground): What the medium does when pushed to its limits. Acoustic radio flips into audio-visual TV.
Figure and ground
A blank tetrad diagram
McLuhan adapted the Gestalt psychology idea of a figure and a ground, which underpins the meaning of "The medium is the message." He used this concept to explain how a form of communications technology, the medium or figure, necessarily operates through its context, or ground. McLuhan believed that to fully grasp the effect of a new technology, one must examine figure (medium) and ground (context) together, since neither is completely intelligible without the other. McLuhan argued that we must study media in their historical context, particularly in relation to the technologies that preceded them. The present environment, itself made up of the effects of previous technologies, gives rise to new technologies, which, in their turn, further affect society and individuals.[17] All technologies have embedded within them their own assumptions about time and space. The message which the medium conveys can only be understood if the medium and the environment in which the medium is used—and which, simultaneously, it effectively creates—are analyzed together. He believed that an examination of the figure-ground relationship can offer a critical commentary on culture and society.[17]
Technological determinism
Media determinism, a subset of technological determinism, is a philosophical and sociological position which posits the power of the media to impact society.[71] McLuhan explains technological determinism as it relates to media: "the printing press, the computer, and television are not therefore simply machines which convey information. They are metaphors through which we conceptualize reality in one way or another. They will classify the world for us, sequence it, frame it, enlarge it, reduce it, argue a case for what it is like. Through these media metaphors, we do not see the world as it is. We see it as our coding systems are. Such is the power of the form of information."[72]
Legacy
A portion of Toronto's St. Joseph Street is co-named Marshall McLuhan Way.
After the publication of Understanding Media, McLuhan received an astonishing amount of publicity, making him perhaps the most publicized English teacher in the twentieth century and arguably the most controversial. This publicity had much to do with the work of two California advertising executives, Gerald Feigen and Howard Gossage, who used personal profits to fund their practice of "genius scouting." Much enamoured with McLuhan's work, Feigen and Gossage arranged for McLuhan to meet with editors of several major New York magazines in May 1965 at the Lombardy Hotel in New York. Philip Marchand reports that, as a direct consequence of these meetings, McLuhan was offered the use of an office in the headquarters of both Time and Newsweek, any time he needed it.
In August 1965, Feigen and Gossage held what they called a "McLuhan festival" in the offices of Gossage's advertising agency in San Francisco. During this "festival", McLuhan met with advertising executives, members of the mayor's office, and editors from the San Francisco Chronicle and Ramparts magazine. Perhaps more significant, however, was Tom Wolfe's presence at the festival, which he would later write about in his article "What If He Is Right?", published in New York Magazine and Wolfe's own The Pump House Gang. According to Feigen and Gossage, however, their work had only a moderate effect on McLuhan's eventual celebrity: they later claimed that their work only "probably speeded up the recognition of [McLuhan's] genius by about six months."[73]
In any case, McLuhan soon became a fixture of media discourse. Newsweek magazine did a cover story on him; articles appeared in Life Magazine, Harper's, Fortune, Esquire, and others. Cartoons about him appeared in The New Yorker.[6] In 1969 Playboy magazine published a lengthy interview with him.[74]
McLuhan was credited with coining the phrase "Turn on, tune in, drop out" by its popularizer, Timothy Leary, in the 1960s. In a 1988 interview with Neil Strauss, Leary stated that the slogan was "given to him" by McLuhan during a lunch in New York City. Leary said McLuhan "was very much interested in ideas and marketing, and he started singing something like, 'Psychedelics hit the spot / Five hundred micrograms, that's a lot,' to the tune of a Pepsi commercial. Then he started going, 'Tune in, turn on, and drop out.'"[75]
During his lifetime and afterward, McLuhan heavily influenced cultural critics, thinkers, and media theorists such as Neil Postman, Jean Baudrillard, Camille Paglia, Timothy Leary, Terence McKenna, William Irwin Thompson, Paul Levinson, Douglas Rushkoff, Jaron Lanier and John David Ebert, as well as political leaders such as Pierre Elliott Trudeau[76] and Jerry Brown. Andy Warhol was paraphrasing McLuhan with his now-famous "15 minutes of fame" quote. When asked in the 1970s for a way to quell unrest in Angola, McLuhan suggested a massive spread of TV sets.[77]
In 1993 McLuhan was named the "patron saint" of Wired Magazine, and a quote of his appeared on the masthead for the first ten years of its publication.[78] He is mentioned by name in a Peter Gabriel-penned lyric in the song "Broadway Melody of 1974". This song appears on the concept album The Lamb Lies Down on Broadway by the progressive rock band Genesis. The lyric is: "Marshall McLuhan, casual viewin' head buried in the sand." McLuhan is also jokingly referred to during an episode of The Sopranos entitled "House Arrest".
Despite his death in 1980, someone claiming to be McLuhan was posting on a Wired mailing list in 1996. The information this individual provided convinced one writer for Wired that "if the poster was not McLuhan himself, it was a bot programmed with an eerie command of McLuhan's life and inimitable perspective."[78]
A new centre known as the McLuhan Program in Culture and Technology, formed soon after his death in 1980, is the successor to McLuhan's Centre for Culture and Technology at the University of Toronto, and since 1994 it has been part of the University of Toronto Faculty of Information. The first director was literacy scholar and OISE professor David R. Olson. From 1983 until 2008, the McLuhan Program was under the direction of Dr. Derrick de Kerckhove, who was McLuhan's student and translator. Since 2008 Professor Dominique Scheffel-Dunand has been Director of the Program.
Notes [1] "Programming: Getting the Message" (http:/ / www. time. com/ time/ magazine/ article/ 0,9171,837382,00. html). Time. October 13, 1967. . Retrieved 3 March 2011. [2] "Television: Dann v. Klein: The Best Game in Town" (http:/ / www. time. com/ time/ magazine/ article/ 0,9171,909291,00. html#ixzz0zAieWZWT). Time. May 25, 1970. . Retrieved 3 March 2011. [3] Levinson, Paul (1999). Digital McLuhan: A Guide to the Information Millennium (http:/ / www. cyberchimp. co. uk/ U75102/ levinson. htm#ch3). Routledge. ISBN 0-415-19251-X. . [4] Stille, Alexander (14 October 2000). "Marshall McLuhan Is Back From the Dustbin of History; With the Internet, His Ideas Again Seem Ahead of Their Time" (http:/ / www. nytimes. com/ 2000/ 10/ 14/ arts/ marshall-mcluhan-back-dustbin-history-with-internet-his-ideas-again-seem-ahead. html). The New York Times: p. 9. . Retrieved 10 March 2011. [5] Beale, Nigel (28 February 2008). "Living in Marshall McLuhan's galaxy" (http:/ / www. guardian. co. uk/ books/ booksblog/ 2008/ feb/ 28/ livinginmarshallmcluhansga). The Guardian (UK). . Retrieved 21 March 2011. [6] Wolf, Gary (January 1996). "The Wisdom of Saint Marshall, the Holy Fool" (http:/ / www. wired. com/ wired/ archive/ 4. 01/ saint. marshal. html?pg=1& topic=& topic_set=). Wired 4.01. . Retrieved 2009-05-10. [7] Boxer, Sarah (3 April 2003). "CRITIC'S NOTEBOOK; McLuhan's Messages, Echoing On Iraq" (http:/ / www. nytimes. com/ 2003/ 04/ 03/ arts/ critic-s-notebook-mcluhan-s-messages-echoing-on-iraq. html?scp=12& sq=mcluhan& st=nyt). The New York Times: p. 1. . Retrieved 10 March 2011. [8] Gordon, pp. 99–100. [9] Edam, Tina (2003). "St Marshall, Mass and the Media: Catholicism, Media Theory and Marshall McLuhan" (http:/ / spectrum. library. concordia. ca/ 1977/ 1/ MQ77929. pdf), p. 10. Retrieved 2010-06-27. [10] Marchand (1998), p. 20. [11] Edam (2003), p. 11. [12] Gordon (1997), p. 34 [13] Marchand (1998), p.32 [14] Gordon, p. 40; McLuhan later commented "One advantage we Westerners have is that we're under no illusion we've had an education. That's why I started at the bottom again." Marchand (1990), p 30. [15] Marchand, p. 33–34 [16] Marchand, pp. 37–47. [17] Old Messengers, New Media: The Legacy of Innis and McLuhan (http:/ / www. collectionscanada. ca/ innis-mcluhan/ ), a virtual museum exhibition at Library and Archives Canada [18] Gordon, p. 94. [19] Gordon, pp. 69–70. [20] Gordon, p. 54–56. [21] Lewis H. Lapham, Introduction to Understanding Media (First MIT Press Edition), p. xvii [22] McLuhan, Marshall. "Letter to Elsie McLuhan", September 5, 1935. Molinaro et alia (1987), p. 73. [23] Gordon, p.74, gives the date as March 25; Marchand (1990), p.44, gives it as March 30. [24] Marchand (1990), pp. 44–45. [25] Marchand (1990), p. 45. [26] Gordon, p. 75 [27] Associates speculated about his intellectual connection to the Virgin Mary, one saying, "He [McLuhan] had a direct connection with the Blessed Virgin Mary... He alluded to it very briefly once, almost fearfully, in a please-don't-laugh-at-me tone. He didn't say, "I know this because the Blessed Virgin Mary told me," but it was clear from what he said that one of the reasons he was so sure about certain things was that the Virgin had certified his understanding of them." (cited in Marchand, p. 51). [28] Fitterman, Lisa (2008-04-19). "She was Marshall McLuhan's great love ardent defender, supporter and critic" (http:/ / www. theglobeandmail. com/ servlet/ story/ LAC. 20080419. OBMCLUHAN19/ / TPStory/ Obituaries). Globe and Mail. . 
Retrieved 2008-06-29. [29] Gordon, p. 115. [30] Prins and Bishop 2002 [31] During the time at Fordham University, his son Eric McLuhan conducted what came to be known as the Fordham Experiment, about the different effects of "light-on" versus "light-through" media. [32] Order of Canada citation (http:/ / archive. gg. ca/ honours/ search-recherche/ honours-desc. asp?lang=e& TypeID=orc& id=2180) [33] University of Toronto Bulletin, 1979; Martin Friedland, The University of Toronto: A History, University of Toronto Press, 2002 [34] http:/ / www. youtube. com/ watch?v=OpIYz8tfGjY#t=01m45s [35] McLuhan's doctoral dissertation from 1942 was published by Gingko Press in March 2006. Gingko Press also plans to publish the complete manuscript of items and essays that McLuhan prepared, only a selection of which were published in his book. With the publication of these two books a more complete picture of McLuhan's arguments and aims is likely to emerge. [36] For a nuanced account of McLuhan's thought regarding Richards and Leavis, see McLuhan's "Poetic and Rhetorical Exegesis: The Case for Leavis against Richards and Empson" in the Sewanee Review, volume 52, number 2 (1944): 266–76.
Marshall McLuhan [37] The phrase "the medium is the message" may be better understood in light of Bernard Lonergan's further articulation of related ideas: at the empirical level of consciousness, the medium is the message, whereas at the intelligent and rational levels of consciousness, the content is the message. This sentence uses Lonergan's terminology from Insight: A Study of Human Understanding to clarify the meaning of McLuhan's statement that "the medium is the message"; McLuhan read this when it was first published in 1957 and found "much sense" in it – in his letter of September 21, 1957, to his former student and friend, Walter J. Ong, S.J., McLuhan says, "Find much sense in Bern. Lonergan's Insight" (Letters of Marshall McLuhan, 1987: 251). Lonergan's Insight is an extended guide to "making the inward turn": attending ever more carefully to one's own consciousness, reflecting on it ever more carefully, and monitoring one's articulations ever more carefully. When McLuhan declares that he is more interested in percepts than concepts, he is declaring in effect that he is more interested in what Lonergan refers to as the empirical level of consciousness than in what Lonergan refers to as the intelligent level of consciousness in which concepts are formed, which Lonergan distinguishes from the rational level of consciousness in which the adequacy of concepts and of predications is adjudicated. This inward turn to attending to percepts and to the cultural conditioning of the empirical level of consciousness through the effect of communication media sets him apart from more outward-oriented studies of sociological influences and the outward presentation of self carried out by George Herbert Mead, Erving Goffman, Berger and Luckmann, Kenneth Burke, Hugh Duncan, and others. [38] Wolfe, Tom (February 2011). "MARSHALL McLUHAN SPEAKS CENTENNIAL 2011" (http:/ / www2. marshallmcluhanspeaks. com/ ?video=intro). www2.marshallmcluhanspeaks.com. . Retrieved 2011-02-22. "Introduction" [39] The Mechanical Bride, pg 9 [40] The Mechanical Bride, pg 21 [41] The Mechanical Bride, pg 56 [42] The Mechanical Bride, pg 152 [43] Gutenberg Galaxy 1962, p. 41. [44] Gutenberg Galaxy pp. 124–26. [45] Gutenberg Galaxy p. 154. [46] Wyndham Lewis's America and Cosmic Man (1948) and James Joyce's Finnegan's Wake are sometimes credited as the source of the phrase, but neither used the words "global village" specifically as such. According to McLuhan's son Eric McLuhan, his father, a Wake scholar and a close friend of Lewis, likely discussed the concept with Lewis during their association, but there is no evidence that he got the idea or the phrasing from either; McLuhan is generally credited as having coined the term. Eric McLuhan (1996). "The source of the term 'global village'" (http:/ / www. chass. utoronto. ca/ mcluhan-studies/ v1_iss2/ 1_2art2. htm). McLuhan Studies (issue 2). . Retrieved 2008-12-30. [47] Gutenberg Galaxy p. 32. [48] Gutenberg Galaxy p. 158. [49] Gutenberg Galaxy p. 254. [50] http:/ / www. utoronto. ca/ mcluhan/ marshal. htm [51] America 107 (Sept. 15, 1962): 743, 747. [52] New Catholic Encyclopedia 8 (1967): 838. [53] Gordon, p. 109. [54] Prins, Harald E.L., and John Bishop. "Edmund Carpenter: A Tricker's Explorations of Culture & Media." Pp. 207–45. In B. Engelbrecht, ed. Memories of the Origins of Ethnographic Film. Frankfurt am Main, 2007, p. 221. [55] Understanding Media, p. 8. [56] Understanding Media, p. 22. [57] Understanding Media, p. 25. [58] See CBC Radio Archives (http:/ / www. 
google. ca/ url?sa=t& ct=res& cd=1& url=http:/ / archives. cbc. ca/ IDC-1-74-342-1818/ people/ mcluhan/ clip4& ei=xCz1R9_zGIeIgAKDqNHDDg& usg=AFQjCNG9ty0kHZVJPTfHg8smU8NOiv4TIg& sig2=z7jj7loyK6wCawbvMxOPWA) [59] Marchand, p. 203 [60] McLuhan & Fiore, 1967 [61] According to McLuhan biographer W. Terrence Gordon, "by the time it appeared in 1967, McLuhan no doubt recognized that his original saying had become a cliché and welcomed the opportunity to throw it back on the compost heap of language to recycle and revitalize it. But the new title is more than McLuhan indulging his insatiable taste for puns, more than a clever fusion of self-mockery and self-rescue — the subtitle is 'An Inventory of Effects,' underscoring the lesson compressed into the original saying." (Gordon, p. 175.)
However, the FAQ section (http:/ / marshallmcluhan. com/ faqs. html) on the website maintained by McLuhan's estate says that this interpretation is incomplete and makes its own leap of logic as to why McLuhan left it as is. "Why is the title of the book The Medium is the Massage and not The Medium is the Message? Actually, the title was a mistake. When the book came back from the typesetter's, it had on the cover 'Massage' as it still does. The title was supposed to have read The Medium is the Message but the typesetter had made an error. When McLuhan saw the typo he exclaimed, 'Leave it alone! It's great, and right on target!' Now there are possible four readings for the last word of the title, all of them accurate: Message and Mess Age, Massage and Mass Age." [62] Understanding Media, p. 68.
Marshall McLuhan [63] Marchand (1998), p.187. [64] War and Peace in the Global Village, p. 46. [65] "Watson, Wilfred" (http:/ / www. thecanadianencyclopedia. com/ index. cfm?PgNm=TCE& Params=A1ARTA0008487). The Canadian Encyclopedia. . Retrieved 14 March 2010. [66] From Cliché to Archetype, p. 4. [67] From Cliché to Archetype, p. 99. [68] From Cliché to Archetype, p. 5. [69] From Cliché to Archetype, p. 9. [70] McLuhan, Eric (1998). Electric language: understanding the present. Stoddart. ISBN 0773759727., p. 28 [71] Media Determinism in Cyberspace, Regent University [72] Postman, Teaching as a Conserving Activity (1979), p. 39 [73] Marchand, pp. 182–184. [74] "Playboy Interview: Marshall McLuhan". Playboy: pp. 26–27, 45, 55–56, 61, 63. March 1969. [75] Strauss, Neil. Everybody Loves You When You're Dead: Journeys into Fame and Madness. New York: HarperCollins, 2011, p. 337–38 [76] "It's cool not to shave – Marshall McLuhan, the Man and his Message – CBC Archives" (http:/ / archives. cbc. ca/ IDC-1-69-342-1826/ life_society/ mcluhan/ clip8). CBC News. . Retrieved 2007-07-02. [77] Daniele Luttazzi, interview at RAI Radio1 show Stereonotte (http:/ / www. radio. rai. it/ radio1/ radiounomusica/ stereonotte. cfm), July 01 2007 2:00 am. Quote: "McLuhan era uno che al premier canadese che si interrogava su un modo per sedare dei disordini in Angola, McLuhan disse, negli anni 70, 'riempite la nazione di apparecchi televisivi'; ed è quello che venne fatto; e la rivoluzione in Angola cessò." (Italian) [78] Wolf, Gary (January 1996). "Channeling McLuhan" (http:/ / www. wired. com/ wired/ archive/ 4. 01/ channeling. html). Wired 4.01. . Retrieved 2009-05-10.
Works cited This is a partial list of works cited in this article. See Bibliography of Marshall McLuhan for a more comprehensive list of works by and about McLuhan.
By Marshall McLuhan • 1951 The Mechanical Bride: Folklore of Industrial Man; 1st Ed.: The Vanguard Press, NY; reissued by Gingko Press, 2002 ISBN 1-58423-050-9 • 1962 The Gutenberg Galaxy: The Making of Typographic Man; 1st Ed.: University of Toronto Press; reissued by Routledge & Kegan Paul ISBN 0-7100-1818-5 • 1964 Understanding Media: The Extensions of Man; 1st Ed. McGraw Hill, NY; reissued by MIT Press, 1994, with introduction by Lewis H. Lapham; reissued by Gingko Press, 2003 ISBN 1-58423-073-8 • 1967 The Medium is the Massage: An Inventory of Effects with Quentin Fiore, produced by Jerome Agel; 1st Ed.: Random House; reissued by Gingko Press, 2001 ISBN 1-58423-070-3 • 1968 War and Peace in the Global Village design/layout by Quentin Fiore, produced by Jerome Agel; 1st Ed.: Bantam, NY; reissued by Gingko Press, 2001 ISBN 1-58423-074-6. • 1970 From Cliché to Archetype with Wilfred Watson; Viking, NY ISBN 0-67033-093-0
About Marshall McLuhan • Gordon, W. Terrence. Marshall McLuhan: Escape into Understanding: A Biography. Basic Books, 1997. ISBN 0465005497. • Marchand, Philip. Marshall McLuhan: The Medium and the Messenger. Random House, 1989; Vintage, 1990; The MIT Press; Revised edition, 1998. ISBN 0262631865 (http://www.philipmarchand.com/mcluhan.html) • Molinaro, Matie; Corinne McLuhan; and William Toye, eds. Letters of Marshall McLuhan. Toronto: Oxford University Press, 1987, ISBN 0195405943
Further reading • Benedetti, Paul and Nancy DeHart. Forward Through the Rearview Mirror: Reflections on and by Marshall McLuhan. Boston:The MIT Press, 1997. • Carpenter, Edmund. "That Not-So-Silent Sea" [Appendix B]. In The Virtual Marshall McLuhan edited by Donald F. Theall. McGill-Queen's University Press, 2001: 236–261. (For the complete essay before it was edited for publication, see the external link below.) • Coupland, Douglas. Extraordinary Canadians: Marshall McLuhan. Penguin Canada, 2009; US edition: Marshall McLuhan: You Know Nothing of my Work!. Atlas & Company, 2011. • Daniel, Jeff. "McLuhan's Two Messengers: Maurice McNamee and Walter Ong: world-class interpreters of his ideas." St. Louis Post-Dispatch (Sunday, August 10, 1997: 4C). • Federman, Mark. McLuhan for Managers: New Tools for New Thinking. Viking Canada, 2003. • Flahiff, F. T. Always Someone to Kill the Doves: A Life of Sheila Watson. Edmonton: NeWest Press, 2005. • Levinson, Paul. Digital McLuhan: A Guide to the Information Millennium. Routledge, 1999. ISBN 0-415-19251-X; book has been translated into Japanese, Chinese, Croatian, Romanian, Korean and Macedonian • Ong, Walter J.: "McLuhan as Teacher: The Future Is a Thing of the Past." Journal of Communication 31 (1981): 129–135. Reprinted in Ong's Faith and Contexts: Volume One (Scholars Press, 1992: 11–18). • Ong, Walter J.: [Untitled review of McLuhan's The Interior Landscape: The Literary Criticism of Marshall McLuhan 1943–1962]. Criticism 12 (1970): 244–251. Reprinted in An Ong Reader: Challenges for Further Inquiry (Hampton Press, 2002: 69–77). • Prins, Harald E.L., and Bishop, John M. "Edmund Carpenter: Explorations in Media & Anthropology." Visual Anthropology Review Vol.17(2): 110-40 (2002). (http://media-generation.com/Articles/VAR.pdf) • Prins, Harald E.L., and John Bishop. "Edmund Carpenter: A Trickster's Explorations of Culture & Media." pp. 207–45. In Memories of the Origins of Ethnographic Film. B. Engelbrecht, ed. Frankfurt am Main: Peter Lang, 2007. • Theall, Donald F. The Virtual Marshall McLuhan. McGill-Queen's University Press, 2001.
External links
• McLuhan Program in Culture and Technology at the University of Toronto (http://www.mcluhan.utoronto.ca/)
• Useful introduction to some of McLuhan's ideas by Jim Andrews (http://vispo.com/writings/essays/mcluhana.htm)
• UbuWeb Marshall McLuhan (http://www.ubu.com/sound/mcluhan.html) featuring the LP The Medium is the Massage
• Official Site (http://www.marshallmcluhan.com/)
• CBC Digital Archives - Marshall McLuhan, the Man and his Message (http://archives.cbc.ca/300c.asp?id=1-69-342)
• McLuhan global research network: Liss Jeffrey's McLuhan bibliography free online (http://www.mcluhan.ca/bibliography.phtml)
• McLuhan Revisited (http://www.straightdope.com/columns/030725.html) by Cecil Adams
• Marshall McLuhan/Finnegans Wake Reading Club (http://venicewake.org/) Venice, Calif. Very active West Coast USA club & link to Yahoo McLuhan group
• McLuhan Tetrad Concept explained (http://www.anthonyhempell.com/papers/tetrad/concept.html)
• McLuhan facts, sources, and class (http://www.digitallantern.net/mcluhan/)
• McLuhan's Laws of Media (http://www.horton.ednet.ns.ca/staff/scottbennett/media/)
• Marshall McLuhan: "The Medium is the Message" by Todd Kappelman (http://www.leaderu.com/orgs/probe/docs/mcluhan.html)
• A Biographical Introduction to Marshall McLuhan (http://www.judithfitzgerald.ca/criticalmass.html)
• Marshall McLuhan's Enduring Visions & Values (http://www.judithfitzgerald.ca/thegospelaccordingtomcluhan.html)
• Philosophy and Application (http://www.terminalhex.com/essay.htm)
• MediaTropes eJournal (http://www.mediatropes.com/index.php/Mediatropes/issue/view/174) Vol. 1, Marshall McLuhan's "Medium is the Message": Information Literacy in a Multimedia Age
• Extensions of McLuhan: An Audio Album Visualization 1968/2009 - by Cultural Farming (http://www.culturalfarming.com/Ethnography/Extensions_of_McLuhan.html)
Nicholas Negroponte
Nicholas Negroponte delivering the Forrestal Lecture to the US Naval Academy in Annapolis, MD, on April 15, 2009
Born: December 1, 1943, New York City
Occupation: Academic and computer scientist
Spouse: Elaine
Children: Dimitri Negroponte
Nicholas Negroponte (born December 1, 1943) is a Greek-American architect best known as the founder and Chairman Emeritus of Massachusetts Institute of Technology's Media Lab, and also known as the founder of the One Laptop per Child Association (OLPC).
Early life
Negroponte was born to Dimitri John Negroponte, a Greek shipping magnate, and grew up in New York City's Upper East Side. He is the younger brother of John Negroponte, former United States Deputy Secretary of State. He attended Buckley School in New York City, Le Rosey in Switzerland, and The Choate School (now Choate Rosemary Hall) in Wallingford, Connecticut, from which he graduated in 1961. Subsequently, he studied at MIT as both an undergraduate and graduate student in architecture, where his research focused on issues of computer-aided design. He earned a master's degree in architecture from MIT in 1966.
Career
MIT
Negroponte joined the faculty of MIT in 1966. For several years thereafter he divided his teaching time between MIT and several visiting professorships at Yale, Michigan and the University of California, Berkeley. In 1967, Negroponte founded MIT's Architecture Machine Group, a combination lab and think tank which studied new approaches to human-computer interaction.[1] In 1985, Negroponte created the MIT Media Lab with Jerome B. Wiesner.[2] As director, he developed the lab into the pre-eminent computer science laboratory for new media and a high-tech playground for investigating the human-computer interface. Negroponte also became a proponent of intelligent agents and personalized electronic newspapers,[3] for which he popularized the term the Daily Me.
Wired
In 1992, Negroponte became involved in the creation of Wired Magazine as the first investor. From 1993 to 1998, he contributed a monthly column to the magazine in which he reiterated a basic theme: "Move bits, not atoms."
Negroponte expanded many of the ideas from his Wired columns into the book Being Digital (1995),[4] which made famous his forecasts on how the interactive world, the entertainment world and the information world would eventually merge. Being Digital was a bestseller and was translated into some twenty languages. Negroponte is a digital optimist who believed that computers would make life better for everyone.[5] However, critics such as Cass Sunstein[6] have faulted his techno-utopian ideas for failing to consider the historical, political and cultural realities with which new technologies should be viewed.
Negroponte's belief that wired technologies such as telephones will ultimately become unwired by using airwaves instead of wires or fiber optics, and that unwired technologies such as televisions will become wired, is commonly referred to as the Negroponte switch.
Later career
In 2000, Negroponte stepped down as director of the Media Lab as Walter Bender took over as Executive Director. However, Negroponte retained the role of laboratory Chairman. When Frank Moss was appointed director of the lab in 2006, Negroponte stepped down as lab chairman to focus more fully on his work with One Laptop Per Child (OLPC), although he retains his appointment as professor at MIT.
Mary Lou Jepsen, Alan Kay and Nicholas Negroponte unveil the $100 laptop.
In November 2005, at the World Summit on the Information Society held in Tunis, Negroponte unveiled the concept of a $100 laptop computer, The Children's Machine, designed for students in the developing world.[7] The price has increased to US$180, however, due to the falling US dollar. The project is part of a broader program by One Laptop Per Child, a non-profit organisation started by Negroponte and other Media Lab faculty, to extend Internet access in developing countries.
Negroponte is an active angel investor and has invested in over 30 startup companies over the last 30 years, including Zagats, Wired, Ambient Devices, Skype and Velti. He sits on several boards, including Motorola (listed on the New York Stock Exchange) and Velti (listed on the London Stock Exchange). He is also on the advisory board of TTI/Vanguard.
In August 2007, he was appointed to a five-member special committee with the objective of assuring the continued journalistic and editorial integrity and independence of the Wall Street Journal and other Dow Jones & Company publications and services. The committee was formed as part of the merger of Dow Jones with News Corporation.[8] Negroponte's fellow founding committee members are Louis Boccardi, Thomas Bray, Jack Fuller, and the late former Congresswoman Jennifer Dunn.
References [1] Negroponte, Nicholas (1970). The Architecture Machine: Towards a More Human Environment. Cambridge, Mass.: MIT Press. ISBN 0-262-64010-4. [2] Schrage, Michael (1985-10-07). "An MIT Lab Tinkers With the Future of Personal Computers". The Washington Post: pp. 13. [3] Negroponte, Nicholas (1991). "Products and Services for Computer Networks". Scientific American 265 (3): 76–83. ISSN 0036-8733. [4] Negroponte, Nicholas (1999). Being Digital. New York: Knopf. ISBN 0-679-76290-6. [5] Hirst, Martin and Harrison , John, (2007)Communication and New Media, Oxford University Press, p. 20 [6] Sunstein, C.R. (2001) Republic.com Princeton University Press [7] Kirkpatrick, David (2005-11-28). "I'd Like to Teach the World to Type" (http:/ / money. cnn. com/ magazines/ fortune/ fortune_archive/ 2005/ 11/ 28/ 8361971/ index. htm). Fortune. . Retrieved 2010-12-12. [8] Wall Street Journal, August 1, 2007. "Text of Dow Jones Editorial Agreement". Online edition (http:/ / online. wsj. com/ article/ SB118598565803884843. html) retrieved on October 21, 2007.
Video links
• TED: Nicholas Negroponte takes OLPC to Colombia (http://www.ted.com/index.php/talks/nicholas_negroponte_takes_olpc_to_colombia.html)
• TED: Nicholas Negroponte: From 1984, 4 predictions about the future (3 of them correct) (http://www.ted.com/index.php/talks/view/id/230)
• TED: Nicholas Negroponte: The vision behind One Laptop Per Child (http://www.ted.com/index.php/talks/view/id/41)
• Nicholas Negroponte keynote at NetEvents, Hong Kong, including the first production OLPC laptop (http://www.netevents.tv/docuplayer.asp?docid=75), December 2006
• Nicholas Negroponte Q&A at NetEvents, Hong Kong (http://www.netevents.tv/docuplayer.asp?docid=100), December 2006
• TEDxBrussels: Nicholas Negroponte on OLPC (http://www.youtube.com/watch?v=Q81TmwXe3ZM), November 2009
• Nicholas Negroponte about books and OLPC on NECN (http://www.necn.com/Boston/Arts-Entertainment/2009/10/21/Author-Nicholas-Negroponte-on/1256156542.html)
Article Sources and Contributors Information Source: http://en.wikipedia.org/w/index.php?oldid=433242144 Contributors: "alyosha", 5 albert square, A. B., AJackl, Aesopos, Al Lemos, Alan Liefting, Alansohn, Alasdair, Ale jrb, Alex.muller, Algorithms, Alphabravotango, Alphax, Amerique, Ancheta Wis, Andres, Andrewbadr, Andrewrp, Andrewrutherford, AndriuZ, Anne11jun, Antandrus, Apurvasukant, Ariaconditzione, Ariedartin, Arthur Rubin, Atroche, Auntof6, Avs5221, BD2412, Bart v M, Bbarkley, Bebenko, Beetstra, Ben-Zin, Bensaccount, Bertilvidet, BigDunc, Billreid, Blakkandekka, Bluerasberry, Bobblewik, Bongwarrior, Boxplot, Bradshow, Brian Dare, Brianjd, Browneze, BryanD, Bubba73, Bueller 007, CALR, CES1596, CIreland, COMPATT, COMPFUNK2, CRGreathouse, Calieber, Can't sleep, clown will eat me, Capricorn42, CardinalDan, Carlroller, Cassie, Catinator, Celcom, Cenarium, Chamal N, CharlesC, Chithrapriya, Chitvamasi, Choster, Christian List, Christopher Parham, Ciaran H, Ckatz, Cmsreview, Cobi, Cocomo-jp, Complete, Conversion script, Coywish, Crystallina, Csharpboy, Cuckoofridge, Cyan, Cyfal, D6, DalaoDy, Daniel.Cardenas, Danny lost, Daven200520, David Eppstein, David0811, Dawnseeker2000, Dehneshin, Delldot, Deltabeignet, Denisutku, DerHexer, Dessimoz, Diagonalfish, Dicklyon, Djmiller9975, Dorkysnorky123654, Download, Downtownee, Draccon136, Dreadstar, Dreap, Dreftymac, Dungsff, Eaefremov, EagleFan, Edgerunner76, Ellywa, Eloquence, Em3ryguy, EndlessWorld, Enigma55, EugeneZelenko, Eurobas, Everyking, Excirial, Exformation, Faustnh, Feinoha, Flammifer, FlavrSavr, Flcelloguy, Floridi, Fredrik, Frosted14, Fæ, Gamma2delta, GaryColemanFan, Gbbinning, Ged UK, GeeJo, George100, Gfoley4, Giftlite, Goatasaur, God of Slaughter, Gogo Dodo, Golgofrinchian, Gregbard, Grm wnr, Gscshoyru, Gtg204y, Haham hanuka, HalfShadow, Hallenrm, Hard Sin, Harmeetkaur09, Helix84, Hephaestos, Hesar, Hot torew, Hsarkka, Iancarter, Igiffin, Incnis Mrsi, Indon, InformationZone, Informatwr, Iridescent, Iseecubes, Isria, Ivan Štambuk, Ixfd64, J.delanoy, J04n, JFreeman, JLaTondre, Jakehall2, Jasonlums, Jayarathina, Jdunn0101, Jebba, Jerrch, Jheald, Jimothytrotter, JinJian, Joeblakesley, Jon Awbrey, JonesC-NC, Jorfer, Jose77, Joyous!, Jpgordon, Juliancolton, Jusjih, Jwdietrich2, Jwithers, K50 Dude, KYPark, Kby, Kentback, Kevinmon, Kgeza7, Khalid Mahmood, Khobler, Kimberleyhastly, Kimse, King of Hearts, Kirkjames, Kjells, Kku, Klausness, Ksanyi, Kuru, Kvng, Kyle Barbour, Kyle J Moore, La goutte de pluie, Lalalalalala, Ldonna, Leandrod, Leonariso, Levineps, Lexor, Linshukun, Lir, Little guru, LogicalDash, LonelyMarble, Loom91, Loren.wilton, Lotje, Lottamiata, LtNOWIS, Luapnampahc, Luvstar 17, MC MasterChef, MER-C, MK8, Maashatra11, Machine Elf 1735, Magdalena Szarafin, Magister Mathematicae, MagnaMopus, Mah Nah, Manuel Trujillo Berges, Maria Vargas, Mark Renier, MarsRover, Martpol, Masgatotkaca, Materialscientist, Maurice Carbonaro, Mayooranathan, Mdebets, Meelar, Meeples, Mentifisto, Michael Fourman, Michael Hardy, Michaelschmatz, Michal Jurosz, Mike J B, Mike Rosoft, MikeGasser, Mikeblas, Millahnna, MisterSheik, Mjb, Mkoval, Modster, Montag451, Mpfrank, Mr Stephen, Mr.Z-man, Mtz07, My mom is cool, NAHID, Naudefj, NawlinWiki, Nelbathy, Neo-Jay, Neparis, Nilmerg, Nixdorf, Njaelkies Lea, Nosferatütr, Obeattie, Oberst, Oicumayberight, Olathe, OleMaster, Olivier, Omnipaedista, Ott, Overix, PGWG, PaulHanson, Peak, Pelister, Pepve, PeterSymonds, Peterdjones, Philip Trueman, Piano non troppo, Pinkcyberheart, Pir, PlatonicIdeas, Poojamal, Poor 
Yorick, Pooryorick, Pratibha sharma, Prickus, Pvalozic, Pvm 02, R'n'B, R.Dhiyanesh, Ramobear, Ranveig, Rasmus Faber, Rattle, RexNL, Riana, Richard001, RichardF, Rizeg70, Rlitwin, Robert W. Wright, Roberto Gejman, Robinh, Robocracy, Romanm, Roozbeh, Rronline, Rspanton, Ruud Koot, ST47, SWAdair, Samanwith, Samboy, Sbamyani, Scope creep, Sct72, Sean.hoyland, Securiger, Semmelweiss, Sesel, ShadowRangerRIT, Shahidislam31, Sheizaf, Shindo9Hikaru, Shizhao, Silence, Simoes, Sir Nicholas de Mimsy-Porpington, SkepticalMetal, SkyMachine, Smartse, Smooth O, Smyth, Snowded, Solitude, Spinningspark, Spot, Sputnikcccp, Sssbbbrrr, Stebbins, SteinbDJ, Steipe, Stephenb, Stereotek, Stevenson-Perez, Stevertigo, Svetovid, Systemetsys, Systemizer, THF, Tad Lincoln, Taxisfolder, Tbhotch, Tcsetattr, Terence, Terrifictriffid, TestPilot, The Anome, TheAMmollusc, TheKMan, Thedosmann, Thingg, Think outside the box, Thomasda, Thriceplus, Tide rolls, Timir Saxa, Tnxman307, Tobias Bergemann, Togo, TomasBat, Tomos, Tpbradbury, Trevor MacInnis, Tritium6, Truman Burbank, Unmerklich, Vaishalimathur14, Vald, Versageek, Versus22, Vespristiano, Vishal.belawade, Voyagerfan5761, Wavelength, Wigren, Wikiborg, Wikieditor06, Wikiwlod, Wile E. Heresiarch, Wingsandsword, Wlodzimierz, Woohookitty, WpZurp, Xaphnir, Xdamr, Xyzzyplugh, YankeeDoodle14, Yekrats, Yettie0711, Yidisheryid, Yuma en, Z.E.R.O., Zarcadia, Zondor, Zzuuzz, Александър, 665 anonymous edits Information architecture Source: http://en.wikipedia.org/w/index.php?oldid=429939888 Contributors: A3RO, Aapo Laitinen, Aaronbrick, Abialek, Ajcheng, Angela, Asist, Atomiq, Bill.albing, BirgerH, Brick Thrower, Burkhard, Capricorn42, Captmondo, Chrisfurniss, Clayoquot, Craigb81, Creacon, DEddy, DanimovIA, Danimovski, DerHexer, Dobrien, Eddiexx77, Edward, Ehheh, Ekillaby, Emorrogh, Enric Naval, Erliptac, Fenice, Flyguy649, Frequencydip, GlassFET, Graham Berrisford, Greenmtncville, Harvardnet, Havanafreestone, Hbent, Hobartimus, Horatio Huxham, Icseaturtles, IvanLanin, J04n, Jakew, JimmycurN, Jmelody, Jomackiewicz, Jpbowen, Justin2007, KPH2293, Kaganer, Kauczuk, Kellylautt, KeyStroke, Koavf, Kostmo, LLarson, LeeHunter, LilHelpa, LockeShocke, Lostnumber747, Lotje, Luís Felipe Braga, Lycurgus, Mahanchian, Mandolinface, Marcok, Martarius, Masao, Mbria, Michael Hardy, MrOllie, Mymarpie, NicHB7, Nickmalik, Nighthawk2050, Novacough, Oconar, Of.information.architecture, Ohnoitsjamie, Oicumayberight, Onepd, Outriggr, Pavel Vozenilek, Pboersma, Pema, PhilipR, Phoebe, Prof 7, Pucciar, Quarty, Resmini, RichMorin, Ronz, S.K., Salvadorious, Scetoaux, SeanGustafson, Seth.lilly, Simon.raistrick, Skeejay, Smoken Flames, Socrates32, Spalding, Speedoflight, Studip101, Tankgrrl2000, The Thing That Should Not Be, Timber92, Timo Kouwenhoven, UXjam, Urhixidur, Veinor, Velvetsmog, Willking1979, Yworo, シ, 267 anonymous edits Information theory Source: http://en.wikipedia.org/w/index.php?oldid=430818866 Contributors: (, 213.122.18.xxx, APH, Alejo2083, AlexanderMalmberg, Algorithms, AllGloryToTheHypnotoad, Allchopin, Almkglor, Ammarsakaji, Ancheta Wis, Andres, Andrewbadr, Andycjp, Angela, Ann O'nyme, Annacoder, AnonMoos, Ap, Arthur Rubin, Arunkumar, AxelBoldt, Bemba, Bemoeial, BenjaminGittins, Bethnim, Bidabadi, Bobby D. 
Bryant, Bobo192, Brion VIBBER, BryanD, Buettcher, C9900, CALR, COMPATT, CRGreathouse, Calbaer, CapitalR, Cassandra Cathcart, Cazort, Cburnett, Chancemill, Charles Matthews, Chris Pressey, Chungc, Cihan, Commander Nemet, Constant314, Conversion script, Creidieki, Crunchy Frog, D, D6, DV8 2XL, Daggerstab, DanBri, Dani.gomezdp, David Eppstein, Deepmath, Dicklyon, Djcmackay, Dysprosia, Dziewa, E-Kartoffel, EPM, Eatsaq, Edchi, El C, Elektron, Ericamick, Eweinber, Expooz, Eyreland, Fleem, FrancisTyers, FrankTobia, Fredrik, FreezBee, Giftlite, Gnomehacker, Graham Chapman, Graham87, Grgarza, Grubber, Guaka, HSRT, Haham hanuka, Hannes Hirzel, HappyCamper, Harryboyles, Heidijane, Henri de Solages, Het, Hpalaiya, HumphreyW, Hyacinth, Iluvcapra, Imz, Informationtheory, Informationtricks, Isheden, Isomorphic, Isvish, Ivan Štambuk, Jahiegel, Jheald, Jimmaths, Jingluolaodao, Joeoettinger, Johnuniq, Jon Awbrey, Josh Parris, Jthomp4338, Jwdietrich2, Kanenas, Karada, Kevin Baas, Kjells, KuniShiro, Kusma, L.exsteens, L353a1, Lachico, Lee J Haywood, Light current, Linas, LittleDan, Loom91, Lordspaz, Lotje, LouScheffer, Lupo, Lyrl, MH, ML, Magmi, ManuelGR, Maria Vargas, Masgatotkaca, Masrudin, Materialscientist, Matthew Verey, Maurreen, Mayumashu, Mceliece, Mct mht, Mdd, Metacomet, Michael Hardy, Michael Ross, Michael Slone, Mindmatrix, Minesweeper, MisterSheik, Mitch Ames, Moe Epsilon, Momergil, Mpeisenbr, Msh210, Munford, MuthuKutty, Nabarry, Nageh, Nearfar, Netalarm, Neuromancien, NeuronExMachina, Nick Green, Noisy, Nothingmuch, Novum, Oldrubbie, OverlordQ, PHansen, Pax:Vobiscum, Pcarbonn, Pcontrop, Pearle, Peerc, Picapica, Poor Yorick, PoulyM, Pouya, Pulkitgrover, Pzrq, Radagast3, RainbowCrane, Rakesh kumar, Rbj, Rdsmith4, Reedy, Rich Farmbrough, Roman Cheplyaka, Ruud Koot, Rvollmert, SC, ScottHolden, Securiger, Seglea, SeventyThree, Siddhant, Sigmundg, Simon South, Sina2, Sir Nicholas de Mimsy-Porpington, SkyMachine, Smalljim, SoWhy, Spellchecker, Spoon!, Srleffler, Staffwaterboy, Starrymessenger, StephanWehner, SudoMonas, Tamer ih, Taxisfolder, Tedunning, Terra Novus, Thermochap, Thomas Keyes, TimeOfDei, Timo Honkasalo, Tiramisoo, Toby Bartels, Twohoos, Tyrson, Uncle Bill, Unnikrishnan.am, Useight, Varunrebel, Vegetator, Velho, Vinodmp, WikiIT, Wile E. 
Heresiarch, Wizard191, Woohookitty, Ww, XJamRastafire, Xezbeth, Xlasne, Yahya Abdal-Aziz, Ynh, 514 anonymous edits Marshall McLuhan Source: http://en.wikipedia.org/w/index.php?oldid=433410300 Contributors: 3finger, AKGhetto, Acracia, AgarwalSumeet, AlexWaelde, Alexius08, All Hallow's Wraith, Ameliorate!, AndrewFW, Andrewbadr, Andrewvdill, Andycjp, Apeiron07, Apparition11, AuxBuissonets, BD2412, BMF81, Balcer, Barrylb, Bearcat, Beckerb, Beek man, Being blunt, BelindaEdgeworth, Bellagio99, Bender235, BerryMAS214, BigFatBuddha, Birdsnare, Bireszter, Blotto adrift, BobJones77, Bobo192, Bookandcoffee, Bornintheguz, BrentS, Brett epic, Bryan Derksen, Buffyg, Bwark, Careless hx, CartoonRabbit, Chrisvls, ChrysJazz, Clarknova, Closedmouth, Cobaltcigs, Coffeepusher, Cola cola colo, Colonies Chris, Conversion script, Crysb, Csortanb, Cullinane, Curious melroy, Cyberchimp, Cybercobra, D6, DCDuring, DFRussia, Dan6hell66, Datandrews, Davinwave, Dbergan, Decumanus, DennisDaniels, Detriment626, Dhodges, Dibyajyotighosh, DimisNasis, Dl2000, Dominus, DrBakali, DrJ, Drinkybird, Dureo, Durova, Dysprosia, Earl Andrew, Eastwind, Edburns, Edonovan, Egyszer, Ehheh, Entheos, Eranb, Erich Schmidt, Everyking, Fabricationary, Facius, FactoryBoy, Felipetesc, Ffirehorse, Firsfron, Fizbin, FlyHigh, Folantin, Forcewhispers, Fredrik, Frédérick Lacasse, FullSmash26, Func, Föld-lét, Ganskop, Garnethertz, Garywill, Gase12, Gawaxay, Gene Nygaard, George Dance, George415, Ghostintheshell, Gilbert Lapointe, Gkerkvliet, Goethean, Gogobera, GoldenXuniversity, GoneAFK, Gracehoper, Grafen, Gregmcpherson, GregorB, Grstain, Hadal, Haham hanuka, Hall Monitor, Healersun, Hede2000, Heleenvanderklink, Hinto, Homagetocatalonia, Hu, Hyacinth, Iamasalamander, Iawas, Ibagli, Inuxx, Inwind, J.delanoy, Jahsonic, Jamesmorrison, Jason One, Javaweb, Jay Litman, JayHenry, JayJasper, Jboyd, Jd4v15, Jeff3000, Jenni gmas229, Jeysaba, Jgurreri, JillandJack, Jim Raynor, Jivecat, Jm34harvey, JoanneB, John, JohnKlax, Johnjosephbachir, Jonathunder, Josh Parris, Jpbowen, Judithfitzgerald, Jun Nijo, KF, Kchishol1970, Kelisi, Kellywatchthestars, Kewp, Khukri, Kicking222, Klundarr, Kneedles, Koenraad Cl, Kroh89, Kvng, Kwsherwood, Kww, Kyle Barbour, LAgurl, LOL, LOUCHAN, Lavintzin, Leandrooliveira, Lestrade, Lijil, Little guru, Lucidish, Lucy8, Lumos3, Lyotards pants, MMBBTT, Madmagic, Magioladitis, MagnesianPhoenix, Mamalujo, Mannafredo, Mark1000, MarkBuckles, Markbeaulieu, MarkkuP, Master Jay, Master son, Matthew Woodcraft, Mav, Mayumashu, Maziotis, Mcluhanprophecy, Meco, Medleyswimmer, MegX, Merosonox, Merphant, Michael khoo, Michaeladenner, Midnightdreary, Mikeymke, Mild Bill Hiccup, Mindmatrix, Mitrius, Modemac, Modernist, Mojaveman, Monsquaz, Morninj, Moshido, Moxy, MrOllie, Nabeth, NailPuppy, Natinja, Nectarflowed, Nekura, Newsroom hierarchies, Nguyenmas214, Nietzschekeen, Noebse, OlEnglish, Oli Filth, Onceler, OrgasGirl, Oxymoron83, P.T. Aufrette, Papercutbiology, Paradiso, PatrickFisher, Patstuart, PaulLev, Paxse, Pearle, Pedant17, Personalcomputer, Pgan002, Phil Sandifer, Philip ea, Phydend, Pinkville, Pmcray, Pneuhaus, Poor Yorick, Profangelo, QTCaptain, Quadell, Quercus, Questforneutrality, RHB, RJBurkhart3, Rais229, Rajah, Rdsmith4, RedJ 17, RedWolf, Redvers, Rich Farmbrough, Richard W.M. 
Jones, RickK, Rje, Rjwilmsi, Rlitwin, Roltz, Rothorpe, Rrawpower, SD6-Agent, SDC, Sam Hocevar, SanderSpek, SanguineX, Santa Sangre, Shadowjams, Shantavira, Shimeru, Shoreranger, Simanicmas229, SimonP, Sintagma, Sjmurray, Skadinadace, Smdo, Smitty, Smobri, Sokratesla, Somvanlig, Staatenloser, Stan Shebs, Stefanomione, SteinbDJ, Steve-o, SteveHFish, Stirling Newberry, Straw Cat, Sunray, Supergee, T. Matthew Phillips, TOR, Tagishsimon, Taylor.v.berry, Tcp1234a, Tdw1203, Tea Tzu, Ted Longstaffe, Tercross, Terrance Mockler, Tfarrell, The Alzabo, The Anome, The Cunctator, The wub, TheMadBaron, ThePatriot, Themusicgod1, Theresa knott, Thingg, Thortful, Tomsega, Triwbe, Tualha, Tvoz, Twas Now, Ulric1313, Vague Rant, Vassanjimenno, Vathek, Vicharam, Virbonusdicendiperitus, Viriditas, Vsb, Vzbs34, WFinch, Wassenberg, Wetwarexpert, Wikiklrsc, Wildpenny, Willking1979, Wknight94, Wolfman, Wordyness, WormwoodJagger, Xanderer, Y, YUL89YYZ, Yellow-bellied sapsucker, Yoninah, ZHUMAS214, Zapplepie, Zebraspot, ZimZalaBim, Zviki1, 1011 anonymous edits Nicholas Negroponte Source: http://en.wikipedia.org/w/index.php?oldid=424683547 Contributors: ARK, AVRS, Aashish950, Acmcie, Alansohn, Alienlifeformz, Arawn, Artist In Flight, AxelBoldt, BSveen, Badmoon36, Benjamin Mako Hill, Bevo, Billninio, CharlieNisser, Cherlin, Cho229, Classicfilms, Clement Cherlin, D6, Davepape, Davewho2, Deannerz, DerHexer, Djadvance, Dmrcambridge, Doristein, Dorit, Edward, Eloquence, Emyr42, Erickaakcire, Erud, Everyking, Fgrose, Flambergius, FrenchIsAwesome, Gaius Cornelius, GidsR, Gilliam, H0riz0n, Hroðulf, Hu12, IDD55, JForget, JMK, Jacoplane, James Kidd, Jawed, Jiang, Johnleemk, Jtmichcock, Jugander, Juri, KYPark, Kbdank71, Kevyn, Kozuch, Ksnow, Lightmouse, Madcoverboy, Mallaccaos, Markpeak, Martarius, Martey, Mateo LeFou, Mcsee, Meeples, Metlin, Micheldene, Moukas, Mschlindwein, MusiCitizen, Mydogategodshat, Nabeelo, Nationalparks, Negroponte, Ninio, No3mie, Nova77, Nynaeve22, Olivier, Oxymoron83, PTSE, Panthos304, Pegua, Pipedreamergrey, Pollox, Pvmoutside, Qohen, RJBurkhart3, Randhirreddy, Rockfang, Rossumcapek,
Rshangle, Ryulong, Sakhalinrf, Sannse, Sean864, Senu, Shermozle, Sina, Skopelos-slim, Slaniel, Slicky, Square87, Stefanomione, Stevenmc, Stirling Newberry, Sumahoy, Tassedethe, Tbolende, Tedster212, Teeeim, Themfromspace, Threeafterthree, Toyota prius 2, Ttyre, VandalCruncher, Varlaam, Viajero, Viksit, VivaEmilyDavies, Wernher, Wiccan Quagga, Xenovatis, Yorker, ZWSteinberg, 110 anonymous edits
Image Sources, Licenses and Contributors
File:WikipediaBinary.svg Source: http://en.wikipedia.org/w/index.php?title=File:WikipediaBinary.svg License: GNU Free Documentation License Contributors: User:Spinningspark. Original uploader was Dreftymac at en.wikipedia. Later version(s) were uploaded by Spinningspark at en.wikipedia.
Image:Binary entropy plot.svg Source: http://en.wikipedia.org/w/index.php?title=File:Binary_entropy_plot.svg License: GNU Free Documentation License Contributors: Brona and Alessio Damato
Image:CDSCRATCHES.jpg Source: http://en.wikipedia.org/w/index.php?title=File:CDSCRATCHES.jpg License: Public Domain Contributors: en:user:Jam01
Image:Comm Channel.svg Source: http://en.wikipedia.org/w/index.php?title=File:Comm_Channel.svg License: Public Domain Contributors: en:Dicklyon
Image:Binary symmetric channel.svg Source: http://en.wikipedia.org/w/index.php?title=File:Binary_symmetric_channel.svg License: Public Domain Contributors: David Eppstein
Image:Binary erasure channel.svg Source: http://en.wikipedia.org/w/index.php?title=File:Binary_erasure_channel.svg License: Public Domain Contributors: David Eppstein
File:MarshallMcLuhan.png Source: http://en.wikipedia.org/w/index.php?title=File:MarshallMcLuhan.png License: unknown Contributors: User:Bender235, User:Cydebot, User:FairuseBot, User:TreveX, User:YellowDot
File:MediaTetrad.svg Source: http://en.wikipedia.org/w/index.php?title=File:MediaTetrad.svg License: GNU Free Documentation License Contributors: Boozinf, Merosonox, Viriditas
File:Marshall McLuhan Way Toronto.jpg Source: http://en.wikipedia.org/w/index.php?title=File:Marshall_McLuhan_Way_Toronto.jpg License: GNU Free Documentation License Contributors: user:Hinto
File:NNegoponte USNA 20090415 .jpg Source: http://en.wikipedia.org/w/index.php?title=File:NNegoponte_USNA_20090415_.jpg License: Creative Commons Attribution-Sharealike 3.0 Contributors: Gin Kai, U.S. Naval Academy, Photographic Studio
Image:Kaye negroponte.jpg Source: http://en.wikipedia.org/w/index.php?title=File:Kaye_negroponte.jpg License: Creative Commons Attribution 2.0 Contributors: David Weekly
License
Creative Commons Attribution-Share Alike 3.0 Unported
http://creativecommons.org/licenses/by-sa/3.0/