Maps and the taxonomic style

Abstract

The race toward completing the human genome has become the new classic case of competition in science. We have several older classic cases, such as the 1960s race between Guillemin and Schally to elucidate the chemical composition of Thyrotropin-Releasing Hormone (TRH) (Latour & Woolgar, 1986). Contests like these share a number of characteristics. First, competition may devolve into personal antagonism, as in the Human Genome Project (HGP), which was characterized by rivalry between Craig Venter and the public consortium. Second, the criteria to be met are changed along the way. In the Guillemin-Schally case, for instance, one party enforced gas chromatography as the new quality standard for separating chemical fractions; in the HGP it was the shift from hunting individual genes to sequencing the entire human genome. And third, there are often demonstrations of brute-force approaches, as in the processing of the huge quantities of sheep brain tissue needed to extract a tiny amount of TRH, which required scouring entire slaughterhouses. Venter used a brute-force approach from beginning to end. But the differences are also very important. The HGP attracted a great deal of attention because never before had there been a contest between a public scientific consortium and a private group; this occurred after Venter resigned from the National Institutes of Health (NIH) in 1992 to work in the private sector. And never before had patent applications cast such long shadows over scientific research; once again Venter was involved, even while he still worked at the NIH. But one important difference concerning the HGP is that the decision to sequence the entire human genome involved an important shift, not only in the goal of the project and the criteria for its success, but also in the style of research, as Brian Balmer (1996) has argued.
It was a shift from theory-led questions concerning the functions of genes, involving focused experimenting, to mapping for mapping's sake, described by protagonists in the field as "extended natural history". Essentially it was the transition from a Popperian experimental testing of hypotheses to taxonomy. No single decision caused the shift, but it was nevertheless dramatic. In 1994, extrapolating from sequencing technologies that were not yet available, technicians in the HGP estimated that it would take about a century to map the full 3 billion bases of the human genome (Kitcher, 1995). Yet by 2000 the job was done, or nearly so. Why was it done so quickly? Why bother to sequence the "mysterious stretches of who-knows-what" (Venter's own words) that comprise 97% of the human genome? One answer is because it could be done, or rather, because it was such a formidable technological challenge: the same logic that led to climbing Mount Everest and putting a man on the moon (Shapin, 2008). But whatever it was, it was not the logic of peer-reviewed little science, as both Balmer and Shapin suggest. The criteria for large projects like the HGP are formed within state-funded bureaucracies and/or entrepreneurial cultures; in the case of the HGP and Venter's rival project, it was both. Shifts to taxonomic approaches are in no small part necessitated by the apparent priority of data gathering, which occurs in many large interdisciplinary projects. Cases in point are the Arabidopsis Information Resource, a plant genomics coordination project (Leonelli, 2007), the Human Brain Project (Beaulieu, 2001), and climate research (Kwa, 2005). Drowning in data is a danger faced by a wide array of sciences, one that can be averted only thanks to the development of information technology and computing facilities (Conway, 2006). In turn, the priority of data gathering is brought about by the availability of technologies producing data.
If a technology is present, there will be a drive to use it well, to not leave the machines idle. In the early part of his career, Venter had been an old-fashioned experimentalist. He devoted ten years to determining the sequence of just one gene, the receptor gene for the neurotransmitter adrenaline. He published his findings in 1987, and in the same year read about a newly developed machine that combined automated gene sequencing with automated computer recording. It cost $110,000. At the time, Venter was still at the NIH; he bought one on his own initiative, with funds from a personal grant. The machine's successor cost $300,000, and in 1999, at Celera Genomics, Venter had several hundred of these machines working day and night.

Citation
Kwa, C. (2009). Maps and the taxonomic style. In New Visions of Nature: Complexity and Authenticity (pp. 173–177). Springer Netherlands. https://doi.org/10.1007/978-90-481-2611-8_13
