
Wednesday, December 26, 2012

Curation and Research in Art and Science

Chicago's Field Museum is making drastic cuts to basic research in order to meet a constrained budget. Lukas has argued that this should be seen as a blow to scientists, historians of science, and members of the public, even while we acknowledge museums' complex roots in the cultural capital of the Gilded Age.

Source: http://upload.wikimedia.org/wikipedia/commons/8/89/Field_Museum_of_Natural_History.jpg
Both Lukas's analysis and his poignant tone feel spot on, and I take seriously the idea that we can't cleave them apart. Museums don't just conveniently blur analytical binaries (like public and private, internal and external, expert and lay) for historians of science; they're also sites with which people fall in love, and thus a hook for wider audiences.

People who study museums—like Lukas, Jenna Tonn, and others—know this well. But I think one thing the Field Museum episode reveals is that, even within the academy (indeed, even within history of science), there are some widespread misperceptions about today's museum curation—some will be surprised that curators are tenured, for example, and that "curation" is as much original research as preservation and display.

Sure, we "know" this. But I think if a scientific division at a major university—down the road, say, at the University of Chicago—were going through this (collapsing departments, breaking tenure), we'd hear more about it. Does that sound right? If so, why? And why do I get the impression that part of making the case for the Field is convincing people that its staff really does crucial research (rather than, say, simply enabling others to do it)?

Maybe a comparison with art museums will help bring what interests me into focus.

Last week, I was at the de Young Museum, which stares across a manicured lawn at the California Academy of Sciences in Golden Gate Park. Both buildings have recently been rebuilt in striking style, and their proximity got me thinking about what "research" in either setting might mean, both in terms of the objects they contain and the expectations of the public who visit them.

California Academy of Sciences (http://upload.wikimedia.org/wikipedia/commons/0/02/California_Academy_of_Sciences_pano.jpg)
Like the Field in Chicago, the CAS in San Francisco combines original research, education, and public exhibits on topics across natural history. Both attract millions of visitors, many of them families and school groups, and both are increasingly hands-on, interactive spaces (the sort of curation mentioned by Dan in a comment on Lukas's post).

What do visitors expect? Of course, there are the specimens for which natural history is famous—like Sue, the Field Museum's Tyrannosaurus rex (the most complete specimen of its kind)—of which only a tiny fraction are ever seen in public. But science, too, is on display: not just its products, that is, but its process.

Sue at the Field Museum (http://upload.wikimedia.org/wikipedia/commons/f/fc/Sues_skeleton.jpg)
Sue floats in a sea of signage, detailing not just her anatomy and life history, but the provenance of her fossilized bones and the process that brought them to light. Science went into her discovery, recovery, and display, and those efforts are explained alongside her striking skeleton. Visitors are invited to "discover the science of Sue"; prominent placards proclaim both her place in natural history and the practices that made it so.

It's a different story at the de Young—or, for that matter, at the Met or MoMA or any other flagship museum of the fine arts. It's not that art curators don't put in similar efforts—of course they do. And you do hear about feats associated with the recovery, restoration, and display of priceless paintings, much as you do about famous fossils. In both settings, curators study, as well as care for, the objects in their charge.

M. H. de Young Memorial Museum (http://upload.wikimedia.org/wikipedia/commons/c/c2/De_Young_Museum_pano.jpg)
And yet.

There seems to be a different relationship between what's on display and what's beneath the surface at the two institutions, something that goes beyond everyday distinctions between "art" and "science," that goes part of the way toward clarifying what's at stake in the Field Museum's recent troubles—and why we need reminding of the processes it brings to light.

It seems that art curators don't claim to "do" art in the same way that science curators claim to "do" science. There's a distinction between producing and displaying the works in the de Young that collapses when you cross the courtyard to the Academy of Sciences, one that jibes with what historians think goes into both fields.

Historians of science have convincingly shown that whatever "science" is, it's operating all the way from digging to displaying when it comes to dinosaur bones—a fact current curators have been happy to confirm. Art historians and curators, on the other hand, have been less likely to claim their crafts as "art" from top-to-bottom. If anything, they're more likely to see their work as scientific.

Curation? (http://upload.wikimedia.org/wikipedia/en/e/e1/Ecce_Homo_%28El%C3%ADas_Garc%C3%ADa_Mart%C3%ADnez%29.jpg)
The distinction I've just introduced has been (I think) invoked as advocates have made the case for the Field's preservation. Cut science museums and you cut science, which directly depends on such collections and institutions. Cut art museums and you cut art in a different sense: its public display more than its production—which major museums have been less directly involved with (to my knowledge) than have their scientific counterparts.

What might such a case elide? Well, it seems like one way to bridge the study of art and science curation—as practical aesthetics, as instantiations of judgment—gets lost. To the extent that historians of science have been paying more attention to matters of taste, the boundary I've traced above fails to see science as fundamentally art-like. Interestingly, seeing science curation that way might well hamper—rather than help—efforts on the Field's behalf.

That brings me to my final point. We might watch more warily than Lukas would like as historical work comes this close to advocacy. Crises remind us what matters—but we might well strive to transcend our current commitments as we describe those who came before us. Seeing science curators as taste-makers, for example, might well undermine efforts to paint their work (sorry) as a public good. So what? If Lukas's Bourdieuian approach to their early history does more harm than good precisely when museums like the Field are under pressure to change, what do we do?

Should this change what Lukas says? No (nor has it). Does Lukas think it should? No (as he himself suggests). But I'm curious to hear how he might bring his analysis (of the capitalist roots of museum culture) and his passion (for the preservation of those institutions) together.

In some cases, the latter (passion) might drive the former (analysis)—but the present seems a complicated one for such a synthesis. Might the former—a story about where museums come from, undergirded by a vision of how they reflect the cultural capital that created them—instead impact the latter? We need both passion and critique, but what happens when they don't fit together easily?

Lukas suggests "other dimensions of the history" will make clear how his history of museums and his fondness for their present incarnation go together, and that it will take a post to flesh them out. I hope he'll write it.

Sunday, December 23, 2012

. . . By Exemplars: Kuhn in Chicago

A few weeks ago, I attended a birthday party at the University of Chicago called "Celebrating the 50th Anniversary of Thomas Kuhn's The Structure of Scientific Revolutions." It was a stimulating event, and I left with many thoughts, problems, and puzzles. Below, I try to capture the gestalt of the presentations and discussions there. My post follows a nice summary that Michael Barany gave us of a sister Kuhn event at Princeton.

Tom Kuhn Wants His Theories Back, You Hippie Sociologist!!
If there was one theme that came through during the conference, it was a renewed interest in reasoning by exemplars, and the papers there suggested that a great deal of compelling work is being done on this topic and that a great deal more remains to be explored. At times, discussions of reasoning by exemplar took on the feeling of agenda-setting: some programmatic vision for the history of science being cast on the shores of Lake Michigan. We'll see what it nets. You'll see flashes of this theme throughout the summaries below.

This post is going to be damn long, and I apologize for that in advance. (Feel free to "tl;dr.") My goal is to summarize the talks with as much fidelity as possible. No doubt the University of Chicago will put videos of the event up in no time, and all of this will have been fruitless, but perhaps someone will find it of use. I apologize to any of the speakers whom I might paraphrase badly. (If I have gotten something egregiously wrong, please drop me a note, and I will change it.) I have broken the conference down by speaker. Lorraine Daston was first up.

Day 1

Lorraine Daston

Daston began her talk "History of Science without Structure" by bemoaning the vagaries of email. In the program, the word "structure" was printed in italics, which would have referred to Kuhn's book, but Daston wanted to discuss the more general social scientific notion of structure. "Structure" was a "word to conjure with in 1962," Daston claimed, as Lévi-Strauss and others were bringing the idea to the fore. Yet perhaps no other word strikes current historians of science as so dusty as "structure." Many of the themes that Kuhn identified in his work have come back around, but "structure" hasn't. The term "revolutions" remains a commonplace today, so why has "structure" fared so poorly?

Daston claimed that most current historians of science believe that no idea of structure could do justice to the nuance of their subject matter. Since the 1990s, the goal of history of science (HoS) has been to complexify, rather than simplify. Good papers are now "rich," not incisive. Kuhn's desire for history of science to go mainstream has been fulfilled, but his focus on historicism and the influence of that commitment undid ideas of structure. In this way, historians of science became historians by detaching themselves from philosophers. (During one Q&A session at the conference, an audience member joked something like, "As a philosopher of science, I'm annoyed by details.")

Historicism also undid distinctions between science and other human endeavors. "Normal Science" evaporated under examination. Kuhn saw "unparalleled insulation" around science (the old demarcation/boundary work problem), but historians undid this view in the 80s and 90s. The contextualist approach of the 80s and 90s was not the old school externalism of Marxism (Hessen), Weber (Merton), etc. Rather, contextualism's chops came from other social science fields: anthropology, literary theory, etc. The contextual pursuit of internal history led to "external sites"—outside the lab—where science was also conducted. In this way, HoS bid farewell to philosophy and sociology because those fields reached for the general claims that historical investigation resisted.

For this reason, HoS diverged strongly from science studies. (Daston has, um, famously written about this topic before; scope Hank's postmortem of that paper and an opposing one here.) In the 70s, you could ask any grad student in an HoS department what the Strong Programme was or about the "experimenter's regress," but this is no longer true. If you look at Isis and other major HoS journals, very few recent articles are related to science studies. Daston claimed that, at about the same time that HoSers* (Trademark LJV) moved away from science studies, controversies around Kuhn's Structure declined.

Kuhn would have been horrified that HoS has lost its theoretical underpinnings, even if this result came from his own historicism. Kuhn played with the idea that Gestalt psychology and other theories of visual perception were not simply analogies for paradigms in his work but were rather the way things really worked. Kuhn returned over and over to the visual. Throughout Daston's talk, she ran video of experiments on visualization in which, for some days, a man wore glasses that inverted his field of vision. (I missed where and when this experiment was conducted.) Daston's point in showing these videos was that, if you examine actual experiments on visualization, they don't fit Kuhn's view. In particular, they don't fit Kuhn's duck-rabbit analogy, in which an image can switch back and forth between two incompatible "pictures."  If you invert someone's visual field, they only adapt slowly and in a piecemeal way, over days and weeks.

Rules partly explain how we adapt to new ideas/perceptions (it was unclear to me whether she thought Kuhn embraced rules, or whether this was her own recommendation). In this way, "practice" is what we have left of Structure/structure. This idea—that only rules can give us structure—is itself a product of history. The very idea that we learn from exemplars is all over 20th century thought; therefore, "paradigms" are not unique to science. Like learning to deal with an inverted visual field, learning to reason via exemplar is a gradual experience. It is context sensitive. It can be taught. Once again, this brings down the barriers that spell out science's supposed uniqueness. Reasoning from exemplars can in fact be brought down to rule-following, though Kuhn thought it could not. This is where we should be looking and searching; this search may bring us back into contact with philosophy and sociology (she also later mentioned cognitive science).

During the Q&A period, Daston responded to a question by claiming that when historians of science focused on ideas, the internal/external divide made sense, but the focus on practices undid all of this. Peter Galison said that we should remember that the Vienna Circle and Wittgenstein were also looking at Gestalt psychology during this time; perception was the model for knowledge. Another audience member said, "Look, vision as knowledge is a general idea in Western culture going back to forever." A third audience member asked Daston, "Have historians of science really turned to history en masse? It seems that many just focus on more local theories." Daston responded that the history of biology seems to hold onto philosophy and questions of "what is a theory" in ways other subfields don't. The same audience member responded in turn, claiming that this might have to do with historians of evolutionary biology having to contend with creationists. These historians fall back on trying to distinguish what makes science "science."

George Reisch

The title of George Reisch's talk, “Aristotle in the Cold War: On the Origins of Thomas Kuhn’s Structure of Scientific Revolutions,” is about as descriptive as they get. Reisch sought to historicize Kuhn in his Cold War context by meditating on the meaning of Kuhn's so-called "Aristotle Experience." (You can read more than I am going to say about the experience here.) Kuhn recounted that he had been sitting in his room during graduate school in 1947. He had been reading Aristotle's Physics to determine how much mechanics in the later Newtonian sense the Greek philosopher knew. He quickly discovered that Aristotle knew no mechanics at all in the modern sense. In Aristotle, Kuhn found (what we often call) another "worldview." Kuhn believed that Aristotle's view of physics made sense, but only if you set aside the assumptions of Newtonian physics.

Kuhn sat in his room and flipped back and forth between the Aristotelian worldview of physics and the Newtonian one—duck, rabbit, duck, rabbit, duck. He described the Aristotle Experience like a kind of religious conversion. Most important, this notion of a history of science full of different, incompatible worldviews was much different than the progressive, accumulative picture held by Kuhn's advisor, James B. Conant. The essence of science, for Kuhn's emerging account, was "non-cumulativeness." This also led him to history as something that could radically change our understanding of science, the sentiment that opens up Structure. As Kuhn recalled, "Nothing prepared me for the way science looks when viewed through the writings of dead scientists."

Reisch sought to unpack the Cold War origins of the image of the scientific mind that is inherent in the Aristotle Experience. Reisch put it something like this: "It's a picture of a mind convinced not by proof but by persuasion." Similar worried pictures of the human mind abounded during the Cold War, as people feared communism's persuasive influence.

Conant was not much impressed with the first draft he saw of Structure. He felt that Kuhn overused the notion of paradigm and that it led to "needless problems about progress." Conant believed that Kuhn should get rid of the paradigm altogether. Someone has likened this idea of Kuhn getting rid of paradigm to someone viewing A Streetcar Named Desire and saying, "Look, this play has some good parts, but that character Blanche has got to go." Kuhn reacted to Conant by making paradigm even more central to his account. He also began to play up the conversion of the Aristotle Experience. Kuhn's account of his own conversion changed over time. At first, he focused on reading sources (the idea of not being prepared that I have already quoted); later, he said that the Aristotle Experience came first.

Yet, Reisch asked, what did the Aristotle Experience really mean? Was it something that could be generalized across the history of science? Reisch thinks that Kuhn's 1947 experience very quickly came to shape all of his later work. Kuhn's earlier student papers look downright positivist. Kuhn was not yet Kuhnian. In one of his student papers, Kuhn insisted that the "essential thing" is data not concepts. Of course, he later inverted this notion, as he came to embrace postpositivist notions of "theory-ladenness" and "underdetermination" and whatnot.

An important part of Kuhn's model is that the scientific mind is unaware of the different paradigms, including its own. The choice is not made by scientists; it is made for them by what in his early post-AE writings Kuhn calls "predisposition," which largely has an unconscious basis. By 1951, the pieces were in place for Kuhn, including the idea that experience underdetermines theory. But Kuhn did not have paradigms in 1951. He only arrived at that notion late in 1960. Paradigms play the roles that Kuhn earlier attributed to language and predisposition. In moving towards this view, Kuhn's picture was departing from Conant's, but also from mainstream logical empiricism, including Carnap's. Carnap agreed with Kuhn that the mind must simplify the world, but Carnap, unlike Kuhn, believed that we are aware of this simplification.

Reisch then pointed to other works that had this people-as-dupes theory of mind during the Cold War, such as The God that Failed (to which Arthur Koestler contributed) and Koestler's The Sleepwalkers. Another prime example was Czeslaw Milosz's anticommunist book, The Captive Mind, and both Milosz and Kuhn were at Berkeley in the 1960s. But Reisch wasn't saying that Kuhn took the view from Milosz; rather, he thinks this was the view that had already taken root at Harvard in the late 40s and 50s. For instance, Conant's 1947 essay on phlogiston chemistry makes misguided scientists sound like communist dupes.

Conant was facing constant worries about communists in the academy. In general, he maintained a philosophy of academic freedom, but Conant believed that communists were "out of bounds" as professors because they had sacrificed reason to politics. Conant did not create this picture of communists. It was largely the creation of NYU professor Sidney Hook. Communists had unfree minds. Hook began reviewing Conant's books in 1948, and the men later became friends. Hook saw Conant as a good anti-communist who understood this idea of the unfree mind.

During the 1950s, as Kuhn was working his way from predispositions to paradigms, he played with the era's idea of ideology: "the function of [scientific] theory as professional ideology." In Kuhn's view, "professional ideology" limited creativity (just as paradigms later did). Reisch asked, was Kuhn aware that his ideas were highly politicized? At the time, probably not. Shortly after writing Structure, Kuhn wrote an essay on dogma in science. It did not go over well. Others convinced him to stop using the term dogma. Yet Kuhn's focus on dogma may have led Imre Lakatos to pair Kuhn against the radically anti-dogmatic Popper years later.

During the question and answer period, Reisch said that he was led to this paper by debates around intelligent design. IDers like to cast biologists as closed-minded and brainwashed. Additionally, an audience member pointed out that an important factor in Kuhn's belief that paradigms were unconscious was his experience undergoing psychoanalysis. (James Poskett pointed out to me—via Twitter no less—that John Forrester wrote about Kuhn and psychoanalysis. Here for those interested.) Reisch agreed that psychoanalysis probably played a role in Kuhn's worldview but insisted that we have to be careful here. Psychoanalysis talks about "unconscious ideas," but paradigms are not ideas, so if the mind is captive to anything, it is captive to (unconscious) practices.

Daniel Garber

Daniel Garber's talk was called “Why the Scientific Revolution wasn’t a Scientific Revolution, and Why it Really Shouldn’t Matter to Kuhn.” In his slides, however, Garber had changed part of his title to "why it should matter to Kuhn." He copped to feeling ambivalence. Over the last 30 years, Garber said, a chorus has tried to deny that the Scientific Revolution was a revolution. As Steven Shapin begins his book on the topic, "There was no such thing as the scientific revolution and this is a book about it." But Garber asked, what lies behind this denial? Good history? Or a pomo attempt to undermine science and the Enlightenment? One colleague had warned Garber not to say that there was no such thing as the scientific revolution because it gives comfort to our enemies. He wondered aloud, "No doubt great things happened in the 17th century, but was it a revolution?"

The more fundamental question, for Garber, is this: is it illuminating to liken changes in science to political revolution? One problem with this analogy: political revolutions are fairly clear. There is often a transition period where things are hazy, but this doesn't go on forever. As Garber said, "Human nature abhors a power vacuum." In Kuhn's model, such change and transformation is also resolved clearly in a scientific revolution. (An aside: I once heard Paul Forman and Alexei Kojevnikov discuss someone who had written about the nature of revolutions either just before or during the time that Kuhn was in the Society of Fellows at Harvard. They postulated that Kuhn may have read this work. Kojevnikov also talked about how Soviet historians of science made heavy use of the revolution metaphor well before Kuhn.) For Kuhn, a choice between paradigms is like a choice between incompatible political regimes. But does this pattern fit the scientific revolution?

As Garber argued, we see lots of changes during the period of the "scientific revolution" (changes in practices, communication, education, instrumentation, social formations, etc., etc., etc.) But Garber chose to focus on one change: the eclipse of Aristotelian philosophy. Even contemporaries (historical actors) saw it as an essential change—the "New Philosophy."

So does Kuhn's model fit this case? Well, there were many attacks on Aristotelianism, including from Bacon and Galileo. Descartes explicitly presented himself as the new Aristotle, the new authority. This is what a scientific revolution looks like. But things are not so simple. Many other people challenged, complicated, and defended Aristotle's views. Some saw the challengers to Aristotle as a kind of club, and a small-minded club at that. If Aristotle was the great eagle of thought, they saw the challengers as a bunch of little chicks. Many terms were created for these "innovators," the challengers who arose against Aristotle. The English term was "Novelists." Major and minor figures were regularly listed as part of the club (Garber read a few of these lists aloud), but these thinkers held extremely varied views. Cartesianism as a movement may have fit Kuhn's view, but there were so many other (kinds of) actors.

Some of these other actors and movements simply do not fit the Kuhnian image. For instance, a number of conferences were held during this period, each dedicated to an individual topic. The conferences often featured as many as 7 or 8 opposing views, and they had an openness of spirit. Indeed, Renaudot, who staged some of these conferences, was worried about "schools" presenting at the conferences because members would be wary of speaking their own thoughts. In other words, members would toe the party line. As Garber pointed out, here Renaudot sounds a lot like Cold Warriors worried about persuasion and brainwashing. Renaudot embraced the open mind. Renaudot held that "we should be no more willing to embrace Aristotle's views than he was willing to embrace his predecessors'."

As Garber argued, while some pretended to the status of being the new Aristotle, thought never really gelled around a single philosophy during this time. If we are looking for a winner, Newton would be a good candidate. But Newtonianism never had the scope of Aristotle. The varieties of anti-Aristotelianism never simplified into an alternative to Aristotle. Garber's main punchline reminded American Science-er Lukas of something Janet Browne once noted about the persistence of variety, that scientific change was "not an eclipse so much as a Milky Way."

In the end, Garber asserted that Kuhn was wrong; there is no reason that scientists cannot live without resolution. Science is not like politics in this way. No doubt Kuhnian revolutions do happen in science. When such a revolution resolves, we return to what Kuhn calls "normal science."

During the Q&A, Daston wondered, "Could this level of eclecticism exist in theology?" Garber said that we have to be sensitive to the case. Theology would be very different. Part of the problem is that natural philosophy was so tightly tied to theology. To challenge a philosophy was to challenge theology. But we see increasing distance between these two spheres over time. He also said that few natural philosophers were willing to put forward a view as encompassing as Aristotle's. We should also pay attention to efforts by some to reawaken other ancient philosophies as alternatives to Aristotle during this time. Garber wanted to emphasize that things like scientific revolutions in the Kuhnian sense do occur: "I want to say that a transition from a geocentric to a heliocentric worldview really happened."

Norton Wise

Norton Wise's talk was a meditation on his mysterious title, "A Smoker's Paradigm." Wise painted a picture of Kuhn's '71-'75 graduate seminars at Princeton, which covered such topics as electromagnetic theory and quantum theory. The seminars used only primary sources, in original languages (English, French, German), about 100 pages of reading a week. He tried to have us imagine Kuhn sitting in the seminar room, saying slowly, "Since no assimilation is possible," before taking an impossibly long and deep drag of his cigarette, only to finish his sentence, "later analysts have dismissed his explanation as hand waving." Wise's metaphor is this: Kuhn was to the history of science as he was to his beloved smokes, deep and intense. He burrowed down into primary sources the way he ate cigarettes in a drag or two. (A friend, who also studied with Kuhn in the early 70s, calls bullshit on the idea that Kuhn burned up a whole cancer stick in only a few drags, though he did see Kuhn with more than one cigarette lit at a time. But even my friend would be entertained by Wise's metaphor, which is after all meant to be fun. And, of course, we have the story of Kuhn throwing an ashtray at Errol Morris's head, and Lukas's analysis of that tale here.)

Kuhn believed that previous history of science, especially history that scientists trafficked in themselves, tended to cover over problems with Whiggish explanations. The point of drilling down into primary sources the way Kuhn did was to uncover these problems and also to find the paradigms and paradigm shifts for which the man became so famous. But here we come to Wise's central point:
For Kuhn, paradigms were very narrow and specific things, and they were only accessible to professionals. In Kuhn's later book on black-body radiation, he talked of "paradigms" (he doesn't actually use that term there) being possessed by 100 members in one place, 25 in another. Paradigms are, to use Kuhn's words, precise, esoteric, and professional. (Earlier, I think, Wise discussed how he [Wise], Kuhn, and others actively involved in the seminars had serious scientific training [mostly all in physics, I imagine] and how they went extremely deep and in step-by-step detail into the scientific reasoning of the primary sources. Wise's recounting of this fact about the seminar, as well as his point about Kuhn emphasizing the "professional" nature of paradigms, recalls to mind the now moldy discussion about whether historians of science should be scientists.) Wise gave a brief example of how small and esoteric Kuhn thought paradigms were by pointing to what Kuhn saw as a significant change: Max Planck's very minute shift in counting procedures.

So, this is Wise's puzzle about Structure: what is with these references to community in the book? The references seem to lead to socio-cultural explanations, but Kuhn himself uses the internal-external distinction to create a buffer for science. Wise pointed to a rhetorical move that Kuhn made throughout his career: he would reference some factor that might be important to the story but then set it aside by arguing "but this isn't important in this essay." Such rhetorical gestures toward an idea suggest that you care about it. Wise claims that this is what Kuhn was doing in referencing the sociological aspect of paradigms: in fact, it precludes what it invites. For Kuhn, "community" extends no further than those individuals represented in the narrative ("100 members in one place, 25 in another"). Paradigms are about a set of deeply involved scientists making a change around some small thing. This isn't the "community" that sociologists discuss, nor the notion that historians of science have examined since the 1970s. For Kuhn, there are no networks, no exchange relationships, no movements of materials.

These are all things that historians have come to pay more attention to in the years since Structure. As an example, Wise pointed to Robert E. Kohler's Lords of the Fly, which examined how the exchange of materials was essential to how the community worked. Wise also argued that if Kuhn had examined the national lab that measured black-bodies in his black-body radiation book, it would have brought in the things he was trying to exclude.

In his 1970 postscript to Structure, Kuhn describes a set of scientists' "group commitments," which he calls the "disciplinary matrix." The matrix has three parts: symbolic generalizations, metaphysical paradigms, and values. But again, the matrix makes up the intellectual commitments of a small set of individuals, and it goes no further! Accordingly, Kuhn was extremely perturbed and distraught when sociologists and others in the 1970s began to use his theories to paint a much more "social" picture. Wise recalled Kuhn calling him into his office: "Norton, come in here. Look at this. Look what they're doing." Kuhn called the Edinburgh Strong Programme "absurd" and "deconstruction gone mad."

Kuhn remained skeptical about the turn to practice (which Daston outlined earlier) until his death. As Wise recalled, Kuhn said "he could not get practice." He ended his career with essentially the same view: the essential things (for the history of science) were the small things held by a small body of experts. Wise was therefore saying that everything that has been made of the "sociology" in Kuhn's work has been (I can't remember his exact phrasing, but it was something like) "put in there" or "stuck in there." Wise noted that, at a similar celebration of Kuhn's Structure in Europe, David Bloor said that he was going to send Wise a proof that the sociology is in Structure. He hasn't sent it yet.

The Q&A session for Wise's paper was a fascinating experience. Daston asked Wise about the experience of the Kuhnian seminar. Wise pointed out that Kuhn never analyzed community the way one might analyze the seminar itself—who met whom, and how. Daston also asked whether Wise had ever had an intellectual experience as intense as Kuhn's seminars. Wise said that he might have in the Probabilistic Revolution Group, which had also included people like Daston and Ted Porter.

Then things got very interesting as people tried to push back on Wise's assertion that Structure was mostly devoid of what we would consider sociology. To be frank, the room was filled with a tension and emotionality that left the younger (say, under 35) historians of science puzzled. One young historian of science leaned over and whispered into my ear, "This is weird, right?" And she was right. It was weird. I think one explanation for our perplexity is that we simply do not have the emotional (ahem, Oedipal?) relationship with Structure that older scholars do. For example, Wise, who went to study with Kuhn after reading the book, recalled his own personal encounter with it. Wise was a low-energy nuclear physicist when the NRC/AEC defunded 50 low-energy physics labs in a single year. Wise read Structure as CRISIS; he saw his own autobiography in it.

So, people took turns questioning Wise. One person pointed out that a certain set of ideas, like heliocentrism, were perhaps held by a large group of people, which we could call a community. Yes, said Wise, "But what you mentioned are ideas. They can be anywhere in the world. That is an extremely weak notion of community." Ken Alder pointed out that the one place where events external to science played an important role in Structure was calendrical reform. Wise agreed but thought that it didn't go to the heart of Kuhn's interest. Ian Hacking said that the real crux was "normal science": how does normal science itself function? That has to do with communities, education, etc.—lots of external sociological stuff. One audience member suggested that some scientists find Kuhn's account of scientific dogma appealing because of their own experiences with funding agencies like the NIH.

Peter Galison asked Wise, "What is your argument?" But I think Wise's argument was pretty clear: he thinks that Kuhn's definition of "paradigm" is deep and narrow in the ways listed above and that people have "added in" all of our post-Structure sociological reflections. Yet another younger historian didn't buy Wise's argument, writing to me, "This is insane. Imagine Kuhn was right (to a point). He inhabited a paradigm too, which is to say that he may not have been aware of all the dimensions of his own theory—he was a philosophically minded historian who inhabited a sociological paradigm..." In other words, "Kuhn didn't know what Kuhn was doing." Kuhn was simply the worst interpreter of Kuhn, and the people who headed in a more sociological direction after the man himself were right. I am not so death-of-the-author as to go along with this fellow. I think it's worthwhile getting straight what Kuhn was trying to do, and I think that Wise did a good job of it. That's not an argument against sociology, of course, but Wise wasn't making such an argument anyway.

Day 2

Between showing up late, having technical problems, and being toast, my notes from day 2 are not as detailed. Probably just as well.

Ian Hacking

(As luck and transportation arrangements would have it, I missed most of Hacking's talk. This summary is largely drawn from notes taken by historian of science Stephanie Dick, as well as what I heard in the Q&A period. Thanks, Steph!!)

In his paper "Paradigms," Hacking sought to embed the notion of "paradigm" as model-following and exemplars within the history of reasoning, from Aristotle's Rhetoric to the present. He pointed to such recent works as John Arthos's "Where There Are No Rules or Systems to Guide Us: Argument from Example in a Hermeneutic Rhetoric." Hacking asserted that no one has given us an explicit analysis of how reasoning by paradigm works; it has troubled us since Aristotle. Yet, he argued, if we didn't hold deductive logic up as the gold standard of reasoning, we wouldn't be troubled by reasoning by exemplar.

Hacking gestured towards evolutionary theories of reasoning, such as that of Sperber and Mercier, which holds that reasoning is a means to arguing with others and seeking to obtain agreement and shared opinion. It is not about getting to the truth. This partly explains why there are so many tensions and paradoxes in logic—it is not about deductive demonstration itself, but rather about obtaining agreement in a given situation.


Hacking also examined Reviel Netz's analysis of Greek culture and mathematics. He compared the repetitions found in Homer with those in Greek geometry: both are very, very repetitive because both were produced in a still largely oral culture. You can do math without symbols, Netz shows, but only because you remember the linguistic turns of phrase that make the proof move along.


Reasoning by exemplar was a real part of Greek and Roman philosophy and remained apparent up through Bacon, but it disappeared at the end of the Renaissance, when inductive logic came into being (with the Bernoullis, Bayes, Hume, etc.). Hacking wanted to assert the persistence of this form of reasoning: apprenticeship, common law, and other such fields are instances of learning by example (which is tied to tacit knowledge). Yet it is in the Greek world that we see the most explicit reflections on exemplars.

Hacking claimed that "Arguments [from exemplars] work in a community only when they are 'ritualized,' when there is a procedure, a routine." He turned once more to Netz, who showed the rote character of geometric prose in ancient Greece. Proof was invented in this moment, which rested at the cusp between oral and written argument. Netz showed how linguistic formulae are a rote procedure, a ritualized rule for going on. The culmination of rote is a rule, and the Euclidean postulates might be seen as rules for going forward from rote behaviors. Both rote behaviors and routines are communal. As argument becomes ritualized in a community, you can teach it. In this way, sociologists should be interested not in attending to little social interactions but in "observing how the rituals that underlie paradigms change. It is those rituals of reasoning that tie communities together."
 

In the Q&A period, Hacking discussed Mary Douglas's question: why do groups stay together, and why do they fall apart? For Hacking, exemplars are part of the answer. He claimed, "The class of examples partly determines what the community is." For instance, the common law tradition is based, in part, on a canon of examples: you have to pick the right exemplars to succeed in a common law argument. A set of exemplars does not guarantee agreement, of course. Examples—and their invocation and application—may be contested, but they remain in the canon. Finally, Hacking noted that we should attend to how related ideas such as "rule of thumb" and "heuristics" relate to exemplars.

Andrew Abbott

Andrew Abbott's paper “Structure and Sociology: Notes for a History?” examined citations to Kuhn's Structure in different fields. As is often the case with his work, his talk was strewn with funny one-liners. "Kuhn's references to Gestalt psychology feel as dated today as Coke in a bottle," he said, and Freakonomics is "mostly evolutionary psychology tarted up as economics." Abbott had earlier written a paper called "Varieties of Ignorance," which showed that references to his own book on the professions were usually either unnecessary or revealed that the citing author had not really read Abbott's argument.

Abbott analyzed citations of Kuhn and showed how they increased over time in fields other than history. Citation of Kuhn in the humanities rose sharply from '75 to '80, while citation of Kuhn remains high in education and other applied fields. Yet, Abbott argued, a citation to Kuhn is usually just a generic reference to the idea that people hold different worldviews. Which parts of Kuhn actually get cited? Abbott has done a detailed analysis of the pages cited from Kuhn. Kuhn is cited close to once a day, but only 6% of citations actually point to specific pages in the work. Of the articles that do cite pages, almost all cite the first pages of a chapter; the pages that are NOT cited are the ones dedicated to detailed examples. This lack of page citations shows something important. Earlier in the social sciences, two-thirds of journal articles cited exact pages in the works they referenced; now citations to Kuhn are general and vague rather than pointed and precise. In this way, Abbott made a fine-grained argument that scholarship today is sloppy and shoddy. As he claimed, the majority of people who have cited Kuhn's book have probably not read most of it, and many have probably never read any of it.
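The page-level approach Abbott describes can be illustrated with a toy sketch. To be clear, the records and field names below are hypothetical inventions for illustration, not Abbott's actual data or method: given a set of citation records, we compute what share include specific page numbers and which pages are cited most often.

```python
from collections import Counter

# Hypothetical citation records: each has a citing article and, optionally,
# the specific pages of the cited work it references. Real bibliometric data
# would come from a citation index; this only illustrates the style of analysis.
citations = [
    {"article": "A", "pages": [10, 11]},
    {"article": "B", "pages": []},
    {"article": "C", "pages": [10]},
    {"article": "D", "pages": []},
    {"article": "E", "pages": []},
]

# Share of citing articles that point to specific pages at all
# (Abbott reported roughly 6% for Structure).
with_pages = [c for c in citations if c["pages"]]
share = len(with_pages) / len(citations)

# Among page-level citations, which pages get cited most.
page_counts = Counter(p for c in with_pages for p in c["pages"])

print(f"{share:.0%} of citations give pages")  # → 40% of citations give pages
print(page_counts.most_common(1))              # → [(10, 2)]
```

With real data, one could then compare the most-cited pages against a map of the book's structure (chapter openings versus pages of detailed examples), which is the comparison Abbott's argument turns on.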

In the Q&A period, an audience member pointed out that Kuhn's book sells about 80 copies a day, "so it's still being read." Abbott responded, "Well, it's being purchased." The audience member came back, "It is being read because it's on tests." Abbott asked, "Have you heard of Wikipedia?"

Dave Kaiser

Dave Kaiser gave a neat talk titled “Kuhn among the Psychologists: Structure's Early Audience.” He opened by stating (or perhaps quoting), "There are two Kuhns. The first is the author of The Structure of Scientific Revolutions. The second wrote a book of the same title." He went on to describe how Kuhn responded to criticisms of Structure by examining some of the contents of a box in the Kuhn archive dedicated entirely to letters about the book, containing around 170 sets of correspondence. Kuhn dictated his letters, quite literally talking through the book's problems. Unfortunately, I was experiencing technical difficulties during this session, so I cannot give a detailed summary.

Angela Creager

Angela Creager's talk was titled “Paradigms and Exemplars Meet Biomedicine.” You could predict from the get-go that Creager was going to question Kuhn's famous, nearly exclusive focus on physics and his neglect of biology. Creager hoped to build on Lorraine Daston's statement that reasoning by exemplars pops up everywhere. For Kuhn, a paradigm was a shared example, and Creager wanted to reexamine this issue in the context of the life sciences. Specifically, Creager focused on the use of model organisms in biological research. The key aspect of such model systems is their typicality; in this way, Creager argued, these organisms are "exemplars" in Kuhn's sense. Kuhn thought paradigms allowed scientists to see problems as things they had encountered before, and Creager thinks this fits how life scientists use model organisms. Yet, whereas Kuhn focused on theory, model organisms lead us to focus on experimentation. And here is where Creager differs from Kuhn, a difference earlier identified in Wise's talk: her account centers on the movement and centrality of materials, especially the model organisms in question. This leads to several kinds of insights that are not ascertainable through Kuhn's picture or methods.

Peter Galison

Finally, Peter Galison gave his talk “Islands of Knowledge: Boas, Wittgenstein, Kuhn.” Apparently, somewhere between submitting the title and writing the talk, Wittgenstein fell out, as he was not mentioned. Galison began by examining Kuhn's early life as a physicist. Kuhn's work in physics was atypical for the time: while most physicists were enlisted in WWII research, Kuhn was doing applied quantum theory, and his work looked more like the research done from 1923 to 1938 than like the work of the war. Galison went over some of Kuhn's quantum work and concluded, "This is normal science if you've ever seen it." Pointing to a graph from one of Kuhn's physics essays, Galison said, "This is the paradigmatic paradigm." This would become important later in Galison's talk, because Kuhn's physics research suggests why his picture of physics was so out of touch with post-war physics as historians of science have uncovered it.

Galison then went on to describe a series of debates, via correspondence, between Kuhn and Paul Feyerabend. Feyerabend questioned Kuhn's notion of paradigms as monistic and mystical, covering over coexisting alternative theories. Kuhn saw the importance of holding some things still for science to progress, a view very different from Feyerabend's. Galison likened Kuhn's view to an airplane crash investigation, in which investigators look for the little point that made the whole thing fall apart. For Kuhn, it is the little thing that finally causes the edifice to fall. Feyerabend had a totally different picture, one of competing, conflicting, simultaneously existing theories vying for dominance.

Galison then explained a potential genealogical source for Kuhn's "monism" (as Feyerabend called it). In 1883, Franz Boas traveled to Baffin Island, where he began his ethnographic work with the Inuit, an experience he eventually published in The Central Eskimo. Boas cast cultures as basically coherent and separate—what scholars later called cultural relativism, the "island cultures" from the talk's title. Boas thought that culture had to be understood in its "fitting togetherness." Cultures live in distinct worlds. Boas's view eventually led to Benjamin Lee Whorf's idea that we live in different languages, and, as Galison described, Kuhn was reading Whorf when he was a Harvard Fellow. For Kuhn, the essential thing is how ontology changes in science. It's about how our ontologies map to the world. In Kuhn's view, scientific revolutions occur when our ontological boxes move, and as they do, we move between worlds.

Galison then returned to Kuhn's early life as a physicist. As noted before, Kuhn's scientific work belonged to the 1930s, so he did not participate in the major shift in physics during and after WWII. Science, at that time, became a world of federal granting agencies, highly politicized research agendas, movements of materials, and enormous, highly sophisticated experimental apparatuses and technologies. If Kuhn had participated in this change, Galison argued, he would have had a very different vision, perhaps one much more like Feyerabend's. Kuhn's island cultures—with a single governing ontology—make no sense in the context of post-WWII science.

 
Unfortunately, travel plans required me to leave before Ronald Giere's talk, "Kuhn as a Perspectival Realist," so I'm unable to report on it.

Saturday, 22 December 2012

An Experiment in Teaching Hiroshima to Tomorrow's Engineers

As many of our readers attempt to recover from the semester's end, I'm pleased to present a guest post by David Spanagel, reflecting on a just-completed pedagogical experience.


This past term, I had the rare pleasure of teaching the history of modern American science and technology survey course at WPI, an institution populated predominantly by engineering and natural science majors.  Despite the high opportunity costs involved, I selected just two books to “cover” the twentieth-century portion of this course, and both of them featured the role of physicists in developing the atomic bomb during World War II.

 
A Tale of Two Cities, produced by the War Department in 1946, online thanks to the Prelinger Archives -- and one of Dan's favorite teaching films


David Cassidy’s recently published A Short History of Physics in the American Century lived up to its title, providing my fact-obsessed but reading-averse engineering students with just 170 pages that introduce and contextualize all the important people, institutions, events, activities, and ideologies that transformed physics in America: from its small-scale, parochial, late-19th-century practical and experimental preoccupations, through its remarkable mid-century wartime theory-based revolutionary technological achievements, to its twin legacies as an international collaborative pursuit of ephemeral fundamental natural objects and a lavishly funded bastion of nationally competitive military scientific employment.

To complement Cassidy’s dense but dispassionate chronological account, I chose to assign a highly personalized analysis of the central episode in this larger trajectory – Mary Palevsky’s Atomic Fragments. Using both published documents and an extensive series of interviews, Palevsky crafts an account of her own interactions with several leading Manhattan Project scientists, seeking to learn about the roles they played in developing atomic weaponry, and to discern their retrospective feelings of responsibility for, and reconciliation to, exactly how the bombs were (or should have been) used in August 1945.

Educational policy debates trumpet the need to cultivate undergraduate engineering students’ ethical development (with an insistence and frequency that suggests that this need has not been fully addressed within the realm of engineering courses), so I figured: “Why not use history as a creative tool to support this important endeavor on behalf of my students?”  By pairing these two quite different styles of historical analysis, I hoped to inspire my students to see beyond the black and white orderliness of standard chronology-based cause and effect arguments, so that they might begin to develop some more refined sensibilities about the richness of history’s gray areas. The moral questions that Palevsky probed with her venerable interlocutors seemed especially promising for student discussion.

The upshot: this experiment of mine came out surprisingly well. Despite my students' distance from the Cold War, now so great that none of them were alive when the Berlin Wall fell, the history of the atomic bombings of Hiroshima and Nagasaki still provokes a compelling set of questions about the meaning of technical achievement in various contexts.  On her final exam, one student wrote: “This book [Atomic Fragments] humanized the scientists, so when we watched the Los Alamos documentary [I had shown the class Jon Else’s The Day After Trinity], they seemed more real.”

Another student was sparing in his praise for the readings, but he directly apprehended my whole purpose in assigning that pair of course texts: “On their own, these two books are nothing extraordinary. Palevsky’s book is full of subjectivity and the importance of her personal inquiry overcomes the relevance of her interviewees on the development of American science.  Cassidy’s book is not much more than a condensed summary of factual information about the ups and downs of American physics. Both books combined, however, provide the right balance of the scientific facts and the moral, ethical, and personal reasons behind the actions [which] introduced nuclear power to the world.”

To supplement these secondary-source reading assignments, I devote some class time to what I call “primary source” workshops: a packet of excerpts from historical documents is given to pairs of students to read on the spot, discuss with each other, and then report out to the class as a whole. For the atomic bomb topic, I gave the class the minutes of the Target Committee and Interim Committee meetings of May 1945, so that my students could figuratively eavesdrop on two of the rare instances in which a handful of atomic scientists actually participated in specific decisions about how the bombs would be used in the ongoing war against Japan. The timing of these meetings is significant: virtually all of the European refugee scientists had originally joined the Manhattan Project with some idea of preventing Nazi Germany from becoming the world’s first atomic power, but by the first week of May 1945, Hitler was already dead and Germany had surrendered.

The students were quite surprised to discover, from their analyses of these documents, the degree to which both scientists and policy-makers shared a deep concern that Japanese cities would soon be too damaged to adequately “see” the dramatic effects of an atomic explosion.  A scientific attitude of experimentation had the effect of rendering all the inhabitants of a target city into an abstract material substance.  That the committee members could simultaneously insist that the bombs be dedicated only to “military” targets, and still advise that city centers would be optimal locations for maximizing devastation (in order to achieve the intended “psychological” effect of totally disheartening a notoriously stoic enemy people), is an extreme paradox that young 21st-century readers found difficult but nevertheless imperative to sort through.

David Spanagel is an Assistant Professor of Humanities & Arts at Worcester Polytechnic Institute and the recently elected Chair of the Forum for the History of Science in America, which sponsors this blog.



Thursday, 20 December 2012

The Field Museum Cuts Basic Research

Carl Akeley's famous "Fighting African Elephants" being put on display at the Field Museum in Chicago, ca. 1905.


The Field Museum of Natural History in Chicago is one of the country's oldest, largest, and most respected institutions of its kind. It has played a leading role in the global effort to collect, study, and exhibit remnants of our world's biodiversity for over a century, but it looks as though this legacy may be nearing its end. According to articles in Nature and the Chicago Tribune, the museum's administration recently announced that it would cut spending on basic research by $3 million to help meet its goal of reducing the overall budget by some $5 million next year. Among other things, this decision will almost certainly require breaking tenure to lay off curators.

Last Tuesday, the museum's new president and CEO, Richard Lariviere, announced that all of its academic departments--Geology, Zoology, Anthropology, and Botany--would be eliminated as part of an effort to streamline its organizational structure. As of January 1, they will be replaced by a single department of Science and Education.

There are a few reasons I have decided to write about this development, which is extremely unsettling to say the least. One of them is that I have many fond memories of visiting the museum over the years (my father works there as a curator), and I share the sadness and frustration that many have voiced in the wake of this news. Whereas Lariviere reportedly told the Tribune that "if we wrestle these issues to the ground successfully, our future is rosy," others have been less sanguine in their views. James Hanken, the Director of Harvard's Museum of Comparative Zoology, for example, told Nature that "There’s no way the Field Museum will be able to maintain its position of prominence under those circumstances."

So I urge readers who agree that cutting basic research is a shortsighted move to sign this online petition urging Lariviere and other museum administrators to rethink their fiscal strategy.



But there's a second, somewhat less personal reason I wanted to write about these developments. As a historian of science, I have spent quite a lot of time wondering why American civic natural history museums have a research mission in the first place. The best answer that I have come up with centers on the fact that these institutions required the financial support of wealthy philanthropists to get started during the late 19th century. (The Field Museum, which is named after Marshall Field, a department store magnate who contributed a million dollars to its founding during the 1890s, is a case in point.) Entrepreneurs, financiers, and industrialists had many reasons to support the creation of a museum (or a park, concert hall, library, etc.), but one of the most interesting is that doing so helped to legitimize their wealth, power, and social standing. Adopting some terminology from the French anthropologist Pierre Bourdieu, we might say that investing in the collection of natural history specimens was one way to convert economic wealth into cultural capital.

Inscription above the Field Museum's main entrance.

Since most of these institutions were founded at a time of considerable labor unrest, their underwriters were keen to attract a large and diverse audience that included working class families. To do that, the exhibits had to be exciting and visually stimulating. But they were also supposed to instill bourgeois values and help legitimize a particularly ruthless and competitive mode of production, one which gave rise to increased social inequality. The decision to begin hiring curators and investing in basic research was, thus, among other things, an attempt to lend credibility and authority to popularizations of a particularly self-serving vision of the natural world.

Here, I have been deliberate in my attempts to put the argument in its starkest, most Marxian terms. In actual fact, modern civic museums--institutions that combine scientific research with popular entertainment and public instruction--are far more than a public relations campaign for the capitalist mode of production. For example, their creation during the Gilded Age was not just the activity of one social group. Rather, it required forging an alliance among private capital, municipal government, practicing naturalists, and a willing (though not yet paying) public. So I'd just like to signal that there is a lot here that's been left unsaid.

Still, I do basically subscribe to the kind of narrative I've tried to outline above. At the same time, now that the Field Museum's research mission is under serious threat, I feel an immense sadness, grief, and sense of loss. Why should this be so? If creating a research mission was just a way to make museums a better shill for capitalism, why would removing it be such a bad thing?

The answer obviously has a lot to do with all those other dimensions of the history that I've left out. But rather than actually start sketching out such an answer (a task that would require its own post), let me just close by reflecting on the question itself.

Events such as those with which I've started this post are useful reminders to us as practicing historians in that they provide some perspective on our work. We often talk about the need to gain a sense of critical distance from our subject matter. I think that's certainly true. But we should not forget that sometimes we also need exactly the opposite. Too much critical distance turns history into a mere parlor game. But when we are threatened with the tangible loss of something we love (or the creation of something that we abhor), its history takes on a fuller, richer, and more important meaning.

I do not want to give the impression that I advocate giving up on critique. As an historian, I am fundamentally committed to the critical enterprise. But I do want to suggest that critique for its own sake is not just shallow but potentially dangerous.  If it is not directed at a tangible goal, it can lull us into the false sense that we can do what is right without getting our hands dirty.

Tuesday, 11 December 2012

"Change or Die!": The History of an Innovator's Aphorism

I asked Matt Wisnioski to share something with our readers about the history of technological change and innovation in celebration of the release of his book, Engineers for Change. I'm extraordinarily happy to offer this guest post on the unexpectedly fascinating history of a modern slogan. Change or die!


Innovation advertisement from 1970.
Source: "Change or Die!" Electronic Design 18, no. 13 (1970), 64.

A sure sign that an idiom has become a meme is when journalists attract page clicks by speculating on what it would mean to take it literally. That was the opening conceit of Alan Deutschman’s 2005 article “Change or Die” for the magazine Fast Company. Summarizing IBM’s Global Innovation Outlook conference, where “the most farsighted thinkers from around the world” addressed seemingly intractable global problems, he argued that science has shown that only one time in nine, when faced with preventable conditions like heart attacks, are people able to change. The lesson translates across all realms of human activity. Confronted with radical changes from outside their walls, businesses find themselves unable to adapt. If they hope to thrive, corporate leaders need a “strategy for continuous mental rejuvenation and new learning,” he writes, quoting neuroscientist and entrepreneur Michael Merzenich. In his article and subsequent monograph of the same title, Deutschman had his finger on a pulse that he simultaneously helped create. A Google search of change or die combined with its parent term innovation generates over two million hits, including “change management” blogs, studies of the cable television industry, and policy analyses of biomedical research.

Alan Deutschman's book. Source: http://www.alandeutschman.com/books_change_061206.htm

Few aphorisms so pithily capture the ethos of contemporary technoscience. “Change or die” evokes the making of new technologies in an environment of rapid disruption. Entrepreneurial, goal-oriented research upends the administrative and financial structures of entire industries. Simultaneous advances in a diverse range of fields intersect to produce research opportunities and new markets. Hybrid teams of experts coalesce and dissolve across disciplinary, institutional, and national boundaries. The result is a chaotic engine of accelerating progress that brings great reward to the survivors. It is an expression of Darwinian logic that applies to economics, to knowledge making, and, most of all, to the knowledge workers charged with living in a state of creative flow.


Like “innovation” itself, “change or die” is a strikingly novel expression. One finds the smallest trickle of precursors. The phrase makes an appearance in a 1710 sermon that admonishes ministers to have fortitude when encountering the “proudest Worms on earth.” It again surfaces in a handful of 19th century poems and songs. The Massachusetts anti-slavery politician Charles Sumner and the adventure novelist Zane Grey also hit upon the idiom. In his 1939 book Patterns of Survival, the geologist John Hodgdon Bradley was one of the first to give “change or die” an evolutionary interpretation.

Charles Sumner's "Our Domestic Relations: Or, How to Treat the Rebel States," (1861).  Source: Google Books

In these earlier uses, "change or die" was neither admonition nor binary choice. Curiously, in 1961, it evolved into an axiom of sociotechnical Darwinism in an editorial titled "Change or Die" in Voice, the magazine of the Cement, Lime, Gypsum, and Allied Workers, which argued that "reactionary organizations" such as the Chamber of Commerce had been unable to adapt to the industrial economy and operated with an obsolete mindset suited to the "bygone era" of King George III. But its widespread manifestation as a Heraclitean axiom is due to an organized cadre of entrepreneurs in the 1960s and early 1970s.

In 1961, William Maass, a charismatic vice-president at Conover-Mast (publisher of Boating Industry and Volume Feeding Management) had a vision for a “new kind of publication” designed to help scientists, engineers, and research managers keep pace with the technological age. According to Maass, the “consumption of fundamental science by technology” had accelerated such that time from “idea” to “utilization” was reduced from decades to weeks, obliterating distinctions between scientists and engineers. Supported entirely by advertising, his magazine International Science and Technology was given away for free to 120,000 of the world’s top technoscientific practitioners. Produced by leading science journalists, including senior editor Robert Colborn—Dartmouth engineering grad, published novelist, and former editor of Business Week—and an august advisory board, it carried interviews with Nobel Prize winners, technical executives at Bell Labs, and science administrators from France, Pakistan, and the Soviet Union. It also offered intimations of an emerging ideology of innovation.

International Science and Technology, February 1962
Source: http://www.flickr.com/photos/bustbright/3516360061/in/photostream/

In November 1964, IST launched an advertising campaign in business magazines and newspapers, as well as a film oriented to corporate marketing divisions, with the slogan “Change or Die!” literally written on a wall. Maass assured his scientific readership that they were the drivers of technical advance, that it was the marketing and business side that did not yet understand the epochal revolution in which they were living, and that they needed to prepare themselves to adjust.

By 1968, a confluence of technological and political revolutions upended any semblance of stability in the life of knowledge workers, who could no longer consider themselves modernity’s agents of change. Maass, Colborn, and their growing team of like-minded technoscientists, anchored by Jack Morton of Bell Labs, positioned themselves as spokesmen and tour guides of a world “in a state of … rapid and intricate change” like “liquid in turbulent flow.” They left the established publishing realm in an entrepreneurial venture they called the “Innovation Group.” For $75 annually, an overlapping community of authors and audience that included venture capitalists, research executives, philosophers, and entrepreneurs would receive the magazine Innovation and could engage in networked conference calls with the likes of Milton Friedman or attend electronically mediated executive retreats with strong resemblance to TED talks. [See a sample cover of the magazine, designed by Chermayeff & Geismar.]

In 1970, the Innovation Group recycled Change or Die! as a philosophy for everyone. In the advertisement excerpted above, which ran in major business and technical journals, they presented their magazine’s core functions. Innovation first provided technoscientific practitioners an encompassing explanatory frame, “a unique picture of what is happening to your work environment.” It then promised to equip readers with an introduction to “the most powerful of the new management techniques,” including systems thinking and technological forecasting, so that they might impose order on an “ever-changing environment.” These were to be learned in the microelectronics industry through interviews with Robert Noyce and Gordon Moore, in the avant-garde of new media with primers on time-sharing, in accounts of struggles to adapt by traditional automakers, and in innovative solutions to environmental pollution that engaged business as much as government.

"Our article will prescribe no morals..." from the 1970 Innovation ad

The ideal that united Innovation and the select few for whom Change or Die! sparked enthusiasm rather than lament was the “change manager.” The concept of the change manager was the individualization of meaningful direction in a chaotic world. The process of innovation could never be controlled fully. With the right mentality and the right tools of augmentation, however, one could “see ahead” and “make things happen.” It was (returning to Deutschman’s terms) the ability to “frame change,” “support change,” and understand “your brain on change.”

In its second dip into the advertising well, however, the Innovation Group signaled its own difficulties of adaptation. The ad represented a major change in strategy, from speaking to rarified elites to addressing a broader audience of middle managers. Aesthetically, the ad, with its coupon, dramatic price drop, abundance of explanatory text, and overwrought justification for the change in business plan, was a pale reflection of the avant-garde quality of Innovation itself. Behind the scenes, Innovation would experience multiple tragedies. In 1970, Colborn died of cancer. Then Morton was found murdered in the charred remains of his car, the victim of a violent mugging. In 1972, Innovation was unceremoniously absorbed by the economics journal Business and Society Review, another young publication that had little in common with it stylistically or ideologically. Staff dispersed to do public relations for companies such as Xerox, corporate consulting, and freelance science writing.

"Coupon" from the 1970 Innovation ad

One need “prescribe no morals” but may “easily read the application” to see that when the Innovation Group faced its own advice, for reasons beyond its control, it did both.


Matthew Wisnioski is assistant professor of Science and Technology in Society at Virginia Tech. He is the author of Engineers for Change: Competing Visions of Technology in 1960s America, the first book in the MIT Press's new Engineering Studies series. He is at work on a new book titled Inventing Innovators that explores the rise of innovation discourses and design methods in post-WWII technoscientific identity.

Thursday, December 6, 2012

Should Online Communities Have Rights?

On November 30th, 2012, NCsoft, a South Korea-based video game maker, shut down one of its digital properties, a massively multiplayer online roleplaying game (MMORPG), City of Heroes. Its community of gamers, many of whom had played the game for several years, had earlier reacted with startled indignation when the closure was announced. Indeed, some held in-game protests in front of City Hall in the game's fictional capital; they called it Occupy Paragon City. (You can see video of one of the protests, along with live commentary, here.)

An In-Game Protest in the MMORPG City of Heroes

I never played City of Heroes, nor have I had much interaction with the MMORPG scene, but in this post, I want to consider some of the interesting developments around technology and community in the City of Heroes story. And I want to ask the question: should online communities have rights?


City of Heroes was released in 2004. While most MMORPGs, like Everquest and World of Warcraft, focused on a Tolkienesque fantasy setting, City of Heroes enabled users to become superheroes (and later supervillains). The game had its ups and downs, and present and former players believed that game designers had made missteps in developing the world. Still, many were shocked when they heard the news at the end of August that the game was going to die.

The players reacted in a variety of ways, beyond the protests already mentioned. Many shared stories. One former player, now a writer, claimed that the game "saved [her] life," and there are plenty of anecdotal tales about couples who met on the game and later got married, about babies who would not exist if not for City of Heroes, etc., etc. We've continued to see new forms of intellectual exploration around online communities as well, including this video on the pending killing of City of Heroes by the blogger MMOAnthropology. (He has some cool ideas, though I wish they were more deeply informed by academic anthropology; of course, others are already up to that intellectual "game"). Organizers also created an online petition to keep the game active.



For some time, I've been following these kinds of both in- and out-of-game activities from a distance (mostly with the help of an old friend and active gamer, Richard Piskur). For instance, Rich showed me how players within the Facebook-linked strategy game Battle Pirates hold in-game vigils for other players who are undergoing some hardship. There are even reported cases of hoaxes, wherein people feign sickness or death in order to have a vigil held in their honor.

Players Hold a Vigil in Battle Pirates by Surrounding a Sick/Dead Player's Base with Their Ships (Thanks, RP!)

With these new forms of communality arising, we can, and maybe should, wonder whether these online communities deserve some kind of protection, or at least the development of norms that would keep these kinds of closures from happening. In an interview with The Korea Times, the fantasy author and City of Heroes player Mercedes Lackey called the closure "unethical." As she argued, “I think canceling a game that is making a profit, along with destroying jobs and an online community, is entirely unethical. And I believe that companies that do that are going to get exactly what they deserve, as customers revolt over greed killing cool.” It's easy to mock online gamers. They are on the ass-end of a lot of jokes. But when it comes to thinking about rights, what marginalized community has not been ridiculed?

One pissed-off player and blogger wrote about the "costs" of closing the game, including a) "loss of revenue" (many people claim that the game was profitable), b) a "loss of faith" in NCsoft (lots of calls for boycotts), c) loss of development talent (NCsoft folded the game's development team), and, most important for my purposes, d) "loss of culture." He wrote, "CoH's closure is helping to eliminate the idea that an MMO is a home, someplace you stay through thick and thin."

Some of the venom is unsurprisingly being spat at real cultural divides, namely the oceanic one between the Korean firm NCsoft and its US users. Players have drawn attention to the poor morale among NCsoft's employees to suggest that the company is a bad actor. For instance, at this page, (supposed) former employees have left harsh (and racist/nationalist) criticisms of the firm's policies, like this: "Advice to Senior Management – After your meltdown please do not attempt to re-open to the United States. Stay in Korea. Thanks."  

It seems to me this cultural divide is beside the point, however; the issue of online communities and game closures is really about the role of capitalism in shaping the online world. Corporations are building online spaces that people want to inhabit. They fill those spaces with desirable things: nifty graphics, involved plots, fun. And in those spaces, people form communities; they make friends; they create inside jokes and a local culture. (One character in City of Heroes was a rather large rabbit named Watership Doom.)  But companies can kill the games whenever they want.

So what rights do players have to keep their communities and "their" games? We see some people beginning to ask questions about this. For instance, at pc.gamespy.com, a commenter called OOPManZA wrote:

"I think that all MMOs, including CoH, should operate under a policy whereby when the parent company decides to shut down the game the following happens:
1. Game server and client code is open sourced and placed on GitHub
2. Game assets are made available under a suitable license that allows for them to be used freely but not resold by any third-party.
I think this would be awesome as it would prevent the game from disappearing completely and also allow the community to become the maintainers of the game."

To which another commentator responded, "That would indeed be awesome, but it'll never happen. There's too much proprietary code and artwork in modern games for them to ever be released as open source." But OOPManZA came back to that, claiming, "Proprietary code that is abandoned is worse than useless, it's practically criminal..." As the users talk about community, they debate IP, the net's sine qua non.

Will we look back in forty years and shake our heads that online communities did not have rights? I'm not so sure. In a class I'm teaching this semester, Computers & Society, my students claimed that they have very little interest in taking part in deep online communities. Meeting strangers online does not appeal to them. They are even increasingly skeptical about Facebook. Of course, they might be the wrong sample for such a question. Perhaps these kinds of communities only become desirable once people have moved on to working life and adulthood, and loneliness threatens. At the same time, Second Life is a ghost town. Perhaps the whole notion of an online community will be a passing fad.

For all of the players' efforts to save City of Heroes, the closing day finally came. A small online movement will continue to push NCsoft to let players set up and run their own non-proprietary version of the game, and some players have asked Disney to pick up and continue the property. At the end, players recorded their last minute of play to video. When the plug was pulled, players saw an error message: "Lost Connection to Mapserver."


Error Messages Met Attempts to Log-in to the Community

On the final night of City of Heroes, an in-game DJ spun tracks as groups of superheroes danced below. (The song list for the night can be viewed here.) One of the tunes was the Goo Goo Dolls' "Iris." Some felt what the song's lyrics pleaded: "You're the closest to heaven that I'll ever be. And I don't want to go home right now."