
This essay examines the political complexities of competing modelling schemes through the work of the biologist Lancelot Hogben. Hogben engaged with biophilosophical and biopolitical predicaments: personal bias, political ideology, and the dangers of misapplying statistical concepts outside of their theoretical remit. His anti-eugenicist stance, along with his zoological and statistical knowledge, has been partially forgotten by history, yet it has become even more relevant today.
Keywords: eugenics; biological models; statistics; biopolitics; Lancelot Hogben; history of biology; genetics
The process of averaging the characters of a given group, of knocking the individuals together, giving them a good stirring, and then serving the resulting omelet as a race was essentially the anthropological process of race-making. It may have been good cooking, but it was not science, since it served to confuse rather than to clarify. When an omelet is done it has a fairly uniform character, though the ingredients which have entered into its making have been varied. So it was with the anthropological conception of race. It was an omelet that corresponded to nothing in nature: an indigestible dish conjured into being by an anthropological chef from a number of ingredients which were extremely varied in character. This omelet conception of race had no existence outside the statistical frying pan in which it had been reduced by the heat of the anthropological imagination; it was a meaningless concept because it is inapplicable to anything real. When anthropologists began to realize that the proper description of a group does not consist in the process of making an omelet of it, but in the analysis and description of the character of the variability of the elements entering into it — its ingredients — they discovered that the fault lay not with the materials but with the conceptual tool with which they had approached their study. In passing, it is a good idea not to accept any concept until the presuppositions upon which it is based have been thoroughly examined.
A wide range of modelling practices can be found in the biological sciences, broadly conceived. Predominant among these practices is the use of model organisms, which tend to be treated in two general ways. First, an organism may be taken as a means of exploring a more general biological function (such as genetic inheritance or ageing). Second, an organism may be investigated because it exemplifies a novel or rare capacity or trait. Here are examples of each:
General form: fruit flies were used in the early twentieth century to build maps of chromosomes because they offered practical advantages for answering related questions: their large salivary glands made them easy to sample genetically; their short generations meant mutations appeared quickly; and they lived wherever humans did.
Novel capacity: an organism might be studied because it has a presumably rare or interesting capacity, such as the antifreeze proteins in the blood of the Arctic fish Myoxocephalus scorpius, or the extreme phenotypic plasticity of Pristimantis mutabilis, a tiny tree frog which can reshape its skin from smooth to spiky in thirty seconds.
While knowledge gained from these studies can unveil genetic, molecular, or broadly applicable physiological functions and evolutionary histories that are relevant to human beings, human beings themselves are not considered ethically usable as model organisms.1 While human bodies have historically been subjected to biological experiments, this has, unsurprisingly, been bound up with eugenical programmes, endemic racism, and large-scale ethical violations. Humans are not ‘merely’ animals or organisms in this sense, yet some humans have historically been treated as more animal-like, or as human but ‘less human’, in order to justify torture in the name of pseudoscience. Scientific racism is not scientific but is, rather, a loose set of political beliefs that selectively borrows and twists scientific concepts and practices in order to appear as a legitimate science.
An illustrative case of this contradictory oscillation in racist discourse is clearly evident in the ways in which racist whites responded to the participation of Black athletes vis-à-vis medical discourse on and around Black bodies in the early twentieth century. Athletes like the legendary Jesse Owens were seen as more ‘animal-like’ in order to explain their athletic prowess while still retaining some degree of superiority for white nationals. At the same time, medical practitioners and experimenters might claim that Black bodies were physiologically equal to white bodies as justification for dissecting the former to benefit the latter.2
Here the two modes of approaching model organisms are presented in an occluded fashion: Black bodies were seen under the general rubric when they could be treated as ‘human enough’ for medical trials. On the other hand, Black bodies were treated as more animal-like, and hence under the rubric of organismal novelty, to set them apart from white bodies.3
In this regard the process of making races to some degree parallels the conceptual thinking involved in the practice of using model organisms. Taking a cue from Sylvia Wynter, then, one should always be wary of the assumed inclusiveness of ‘human’, as it has historically been constructed to strategically leave out some humans for the sake of others.4 Furthermore, analogizing the exploitation of animals to the exploitation of human beings without this genealogical awareness can reinforce the animalization/dehumanization of some peoples in the name of animal ethics.5 The two uses and formalizations of model organisms outlined above can serve to remind us of these limitations: biological commonalities do not always account for health disparities or other environmental differences, nor should purportedly ‘essential’ differences be deployed without an awareness of how those differences have been constructed by way of selective historical means.
However, in biological terms as applied to human beings, the more general and more specific approaches to model organisms, and how they conform to biological practice, do not address a specific mid-level of analysis, namely, the study of a population — something which is ‘below’ the species level and ‘above’ the individual level. A biological researcher may take a step back from the organism’s capacities, physiology, and developmental history, and instead view it as a bearer of inherited traits (genetic, epigenetic, or more vaguely hereditary). This approach to humans as organisms aligns with the discourse of anthropological race-making as described by Montagu in the epigraph above. For Montagu, the process of making races disappears, and races come to be accepted as straightforwardly empirical ‘facts’ about the world.6
Furthermore, in stepping away from the body as a physiological entity in its living and environmental engagements, it may seem that statistical approaches or population approaches then neutralize the racist assumptions on display in the experimentation with bodies; however, they in fact occlude them via statistical tools. Approaching evolution by way of populations ties directly into the approach of frequentism that this article will critique at length, namely, that by measuring traits in repeated random samples one can derive the frequency of those traits and their relative causes — in this case, genes.
This mid-level approach (viewing humans as populations and correlating them to races) was integral to the formation of the so-called ‘modern synthesis’ of the 1930s, in which statistical methods were used to synthesize Mendelian models of genetic inheritance with Darwin’s theories of natural selection. The emphasis on populations, frequencies, and traits could then be applied to humans and, furthermore, could be allied with eugenical experiments or, in the post-war period, with sociobiological research projects. While there is an assumption that current concepts and practices in biology (especially genetics) have moved beyond overtly race-based ideology, and check individual bias, there has been less of a concentrated effort to draw out the politically questionable threads of this third biometric level — of statistical analysis as justifying non-biological assumptions about uncritically utilized categories such as race, IQ, or combinations of assumptions to do with gender and sex. For instance, historical categories of race are uncritically correlated with gene samples from populations, such that a sample of a population living in a given region today is seen as representative of the gene pool of a race anthropologically constructed hundreds of years ago, which in turn truncates or ignores the history of human migration across the planet.
The point is that this utilization is not easily traceable to explicit political ideology, nor to personal bias, but is, rather, the result of a ‘thought collective’ (to borrow a term from Ludwik Fleck) utilizing abstract concepts in a way that allows for the confirmation and valorization of already existing socio-political categories.7 These statistical concepts, such as frequency, then behave in an amphibious manner by mutually reinforcing pre-existing confidence in categories and in the efficacy of newly constructed statistical or biological methods — that is, new science is ‘accurate’ because it justifies things one already knows to be ‘true’. This does not mean that all frequentist analyses are politically flawed; however, in situations in which one is neither tempted nor encouraged to rattle the status quo, they are expedient and simple.
To this end, this essay will outline how such warnings were issued almost one hundred years ago by examining the life and work of Lancelot Hogben (1895–1975). Hogben engaged with the biophilosophical and biopolitical predicaments outlined at each level: personal bias, political ideology, and, most importantly for our purposes here, the dangers of misapplying statistical concepts outside of their theoretical remit. Furthermore, Hogben’s anti-eugenicist stance intersected with his zoological and statistical knowledge, something which, despite having been partially forgotten by history, is, I wish to argue, even more relevant for recent attempts to decolonize biological and statistical knowledge within Western universities and within broader scientific practice. Hogben is also of particular importance because he understood, against the grain of the assumptions of the modern synthesis, the relationship and differences between working with model organisms and working with statistical models, as outlined above — that the concepts and the nature of the claims possible for each differed substantially. The modern synthesis was the alliance of Darwin’s theories of natural selection and Mendel’s theory of genetic inheritance, brought together largely by statistical models in the early 1900s, and taking traits, genes, and mutations as things to be measured within a population, rather than the capacities, aspects, and genealogies of particular organisms or their environments.
I will highlight two episodes from Hogben’s life that exemplify his multilevel engagement with the intersection of politics and biology, before engaging with the amphibious nature of the third level of statistical knowledge, and examining Hogben’s debate with Ronald Aylmer Fisher and the biometric school of genetics, an encounter that set the course of genetics and biology for at least the following forty years.
I wish to argue that these biographical features of Hogben’s life, rather than being incidental to his work, reveal how his position within the general political and economic situation of practising biologists, especially in the UK in the twentieth century, gives his critiques added weight, as he was able to view the genetic thought collective from the outside.
After being educated in the UK and working in the fields of zoology and comparative endocrinology in Edinburgh and then Montreal, Lancelot Hogben took up a professorship in Cape Town in 1927. As chair of his department, Hogben revitalized the laboratory spaces, updated the curriculum, and taught biology and mathematics to local teachers.8 Most notably, along with many other researchers, including his wife — the socialist demographer, mathematician, and statistician Enid Charles — he developed a reliable pregnancy test using the local frog Xenopus laevis.
Building on his knowledge of endocrinology, Hogben identified that the same hormones were present in amphibians and mammals. Before this time, the standard pregnancy test (which required special permission and could only be ordered by a doctor, and often only by the patient’s presumed husband) involved injecting a small mammal (typically a mouse or rabbit) with the urine of the woman undergoing the test, then waiting for physiological changes before killing and dissecting the animal to see if ovulation had occurred. Hogben co-authored numerous papers with Charles, who was pursuing a PhD in physiology while in Cape Town.9
The Xenopus test was not only faster (frogs would typically lay eggs within twelve hours of the injection) but also caused the frog only mild discomfort, as opposed to death for the small mammals.10 Part of the reason why Hogben could see the potential of the test in both scientific and ethical terms was his general mechanistic approach to biology. For Hogben, hormones were biochemical actors, and there was no reason to think that they would not function across species lines. So, while Hogben was a mechanist, he was not a reductionist, insofar as he thought that researchers should find the appropriate level of functionality, one that acknowledged the complexity of relations between species at levels deeper than anatomy while also being less harmful to the organism and more beneficial to the public. Put otherwise, Hogben’s mechanistic approach to organisms allowed him to see biological compatibility or sameness, which subsequently allowed him to highlight social or cultural disparities between species, and between members of the same species. This is borne out by Hogben’s disagreement with General Jan Smuts’s holism, which, on the face of it, might seem inclusive and organic, but which was in fact used to justify right-wing racist acts and policies.
Holism, the notion that one must treat organisms and the environment as an integrated system, and that this way of thinking should be expanded beyond the merely biological, is often seen as a politically liberal concept, as being proto-ecological. But how could this be squared with Smuts’s political beliefs and actions, given that Smuts helped sow the seeds of racial segregation in South Africa that would later bloom into apartheid, and that he also aimed to conquer Namibia, Burundi, Rwanda, and Tanzania, and went after whites whom he saw as ‘race traitors’ for defending the native peoples?11 In his work, Smuts increasingly blended his ecological principles with his defence of European involvement in Africa. Citing the ‘third event’ (the emergence of Homo sapiens in Africa), Smuts utilized the logic of recapitulation to claim that Africans remained the original but also the most ‘simple child-like form’ of human beings.12
Thus, holism could seem liberal in that it advocated for harmonious relations between all species and peoples, while it also emphasized that everything had its ‘proper place’ in nature, and hence stressed the cultural ‘superiority’ of Christian Europe, and its role in deciding where everyone else belonged.
While continuing his semi-clandestine political activities (including smuggling threatened people out of harm’s way), Hogben engaged in a debate with Smuts over the question of holism or mechanism. The debate occurred in 1929 as part of the meeting of the British Association for the Advancement of Science. The results of the debate were published some years later as part of a longer discussion on the relationship between biology and humanism in Hogben’s book The Nature of Living Matter, in which he sets his own views against those of Smuts, J. B. S. Haldane, Arthur Eddington, and Alfred North Whitehead.13
In the book, Hogben does not discuss biological theories or experimental results so much as he questions the spirit of holism as advocated by Smuts and Haldane. Hogben seems to think there is a basic misunderstanding by those who argue for the fundamental irreducibility or mysteriousness of living systems, given that, for Hogben, science is ultimately the collective labour of individuals to produce collective knowledge about the external world. He therefore does not necessarily reject holism at the level of explanation, but rather argues that it is expressed in a way that is unassailable by non-scientists and therefore risks being a form of science that is easier to abuse politically.
Hogben refers to this view as one of science publicism — the notion that something must be publicly expressible and collectively thought-through to be considered science. This is not meant to invalidate ethics or morality, or even the legitimacy of knowledge internal to a single person, nor is it to see science merely as a support beam for common sense. Authors such as John Bellamy Foster claim that Hogben was insufficiently Marxist, a ‘mere’ socialist or a humanist, or that he was all three of these things at different periods of his life.14 Yet Hogben consistently had no tolerance for class discrimination, nor for magical or religious thinking by which one could ignore serious social problems.
Thus, while Hogben’s mechanistic or functionalist perspective might be seen as reductive and hence politically limited, in fact it was this functionalist view that allowed him to make the argument that hormones would work across species, or to show that the assertion that there are hard biological differences between races simply does not hold up.
After Cape Town, Hogben went to the London School of Economics to lead a research group on socialist-motivated anti-eugenics. This led to further conflicts in the understanding of the relationship between politics and biology, at a different level: that of statistics.
While a researcher at the London School of Economics, Hogben read the work of the highly influential statistician R. A. Fisher and eventually had a correspondence with him. As James Tabery has documented, these letters soon turned into a drastic scientific and political rift.15 Fisher was of the biometric school, with an approach to statistics known as ‘frequentism’ (which I will discuss at length in what follows). But more importantly here, and for Hogben’s disagreement with him, was the fact that Fisher thought that environmental influences could be bracketed out of statistical analyses on the heritability of traits and the relation of this to an organism’s development. For Hogben, development could not be understood without taking into account not only the influence of the environment but also how changes in the environment at different times can affect the organism in a non-linear fashion that is more difficult to predict statistically.
It is important to define the term heritability, as it is somewhat complex. In genetics, heritability is the proportion of the variance in a trait within a population that is traceable to genetic causes. For instance, height has a high degree of heritability, since parental height appears to have a substantial effect on the height of the child. Number of fingers or toes, on the other hand, has a low or almost zero degree of heritability: nearly everyone inherits the same genetic plan for ten of each, so what variation there is stems largely from non-genetic causes.
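The variance-ratio sense of heritability can be sketched numerically. The following toy simulation (all figures invented for illustration, not drawn from Hogben or Fisher) builds a trait from a genetic component plus independent environmental noise and computes the share of the total variance that is genetic:

```python
import random

random.seed(0)

def variance(xs):
    """Population variance: mean squared deviation from the mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Invented example: a trait (say, height in cm) built from a genetic
# contribution plus independent environmental noise.
genetic = [random.gauss(170, 6) for _ in range(10_000)]   # genetic component, sd 6
environ = [random.gauss(0, 3) for _ in range(10_000)]     # environmental noise, sd 3
phenotype = [g + e for g, e in zip(genetic, environ)]

# Broad-sense heritability: genetic variance over total phenotypic variance.
h2 = variance(genetic) / variance(phenotype)
print(round(h2, 2))  # close to 36 / (36 + 9) = 0.8
```

On this picture a heritability near 1 means most observed variation tracks genetic differences, and near 0, environmental ones. Hogben’s objection, developed below, is precisely that genetic and environmental contributions need not add up so neatly.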
The ability to measure heritability is often predicated on measuring traits of parents and children while excluding or minimizing environmental factors. This concept is not only relevant to eugenical claims about ‘good genes’ or desirable traits, but also marks the beginning of the (still ongoing) discussion of nature versus nurture (and their entanglement) in the life and social sciences. In the late nineteenth and early twentieth centuries, eugenics applied the logic of animal breeding to human beings, not only in determining physical characteristics but also behavioural ones (intelligence, criminality, and so on). While these characteristics were initially correlated with ‘bloodline’ or ‘stock’, these terms slowly became bolstered by biological discourse, following the work of Francis Galton among others.
The stances of Fisher and Hogben were more specifically to do with gene/environment interaction and the relation of biometric frequency (how the number of genes is distributed within a population [Fisher]) to developmental synchronicity (how the timeline of development affects the activation or deactivation of genes within a singular organism [Hogben]). In broader terms, frequentism is the statistical view that repeated objective measurements of traits or events are sufficient for making true claims about the world — in other words, that one can say how probable something is after performing an agreed number of trials (such as coin flips).
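The frequentist identification of probability with long-run relative frequency can be shown in a few lines (the fair coin and the trial counts here are my own illustration, not taken from the debate itself):

```python
import random

random.seed(1)

def relative_frequency(trials: int, p_heads: float = 0.5) -> float:
    """Estimate the probability of heads as heads-count / trials."""
    heads = sum(random.random() < p_heads for _ in range(trials))
    return heads / trials

# As the number of trials grows, the relative frequency settles
# towards the underlying probability (here 0.5).
for n in (10, 1_000, 100_000):
    print(n, relative_frequency(n))
```

On the frequentist view, that settling is what the probability *is*; the dispute with Hogben turns on whether trait measurements in human populations behave like such repeatable, independent trials.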
Chapter two of James Tabery’s Beyond Versus sets out the stakes and outlines in great detail how the Fisher–Hogben debate unfolded. Fisher had already established himself by 1918, when he began to show how genetic variation could be measured within a population: rather than focusing on averages, Fisher introduced the concept of variance (the spread of trait values around the mean of a normal distribution, or bell curve) into biology.16 As Tabery shows, the conceptual upshot of Fisher’s work was that he could argue how inheritable a trait was by showing variance in traits between relatives, and could thus infer variation among genes.
Fisher treated populations like a series of coin flips (that is, as objective data) in which tall children give birth to tall children, thus suggesting that height is straightforwardly caused by genes. One issue that arises from this, however, is how to explain differences in height between two tall children who have tall parents. This is the question of standard deviation: if tall people have tall children, why would one be slightly taller than average and another far taller than average?
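The question can be restated with invented numbers: two sets of heights can share the same average while differing in how far individuals spread around it, which is what the standard deviation measures.

```python
import statistics

# Invented heights (cm) for two groups of children of tall parents.
clustered = [178, 179, 180, 181, 182]   # everyone near the mean
spread    = [170, 175, 180, 185, 190]   # same mean, wider variation

print(statistics.mean(clustered), statistics.stdev(clustered))  # 180, ~1.58
print(statistics.mean(spread), statistics.stdev(spread))        # 180, ~7.91
```

Averages alone would treat the two groups as identical; the disagreement between Fisher and Hogben concerns what causes the differing spread.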
For Fisher, as Tabery highlights, the answer is not environmental conditions, but rather the idea that some genes are dominant while others are recessive.17 Thus, variance in height is completely described by the source of genes at one level (the parents) and the interaction of genes at another (thereby explaining the difference in height between siblings).
One can see the immediate relevance of such concepts for eugenicists. If the causes of variation of human traits are genetic, then it becomes possible (and even desirable) to select breeding partners to raise those averages. Yet, as Hogben pointed out to Fisher in the early 1930s, this assumes a rather straightforward understanding of genes as causes and, importantly, as independent causes.
On the contrary, Hogben thought that one could not ignore environmental effects, especially when taking into consideration the temporal register of development — that it is not merely a question of what genes there are, but of which genes are active at specific developmental stages of the organism (meaning it is a question of gene activity, and not only frequency). Hogben suggested that Fisher was mistaken to assume that variation was to do with the ratios of genes, and that environment could affect the expression of genes — that is, a non-genetic or non-heritable factor could affect the expression of genes during the development of the organism. After all, Hogben’s research from a decade earlier, on fruit flies and heat shock experiments (in which pupae were subjected to environmental shocks at different points in their development), had demonstrated a non-linear entanglement between gene expression and environmental factors.
Tabery makes clear that he thinks the differences between Hogben and Fisher cannot be chalked up to ‘mere’ political, religious, or class differences (Hogben having been raised poor and as a Quaker, and Fisher having been middle class and within the Church of England), but rather were to do with a very different sense of explanatory frameworks (indexing Fleck’s concept of the ‘thought collective’, above). These differences concerned how each figure viewed causes and how they understood environment–gene interaction, with Fisher believing statistics could account for causes while, for Hogben, one needed to perform an actual experiment.18 These differing conceptual stances no doubt in part also reflected their disciplinary backgrounds — Fisher was a mathematician through and through, whereas Hogben knew numerous subdisciplines of the life sciences and was comfortable in the experimental laboratory setting.
The historical consensus is that Hogben lost and that, as a result, Fisher’s name was immortalized in the modern synthesis (which used frequentist methods to combine Mendelian genetics and Darwinian evolution). The sympathy towards Fisher, and towards the eugenical thinking that surrounded him, was also consolidated when UCL named several buildings after Karl Pearson and Francis Galton in the early 1900s. But given Hogben’s credentials and breadth of study, one might expect that his approach would have been more relevant to biological practice.
Part of the reason for the longevity of Fisher’s contribution is that it marked a shift and a collapse of warring paradigms in post-Darwinian biology. For decades prior to Fisher, there had been a strong sentiment that Darwin’s evolutionary theories could not be reconciled with the rediscovery of Mendel’s laws of inheritance. Darwin did not have a well-articulated theory of inheritance and, while he rejected any progressive force in the Origin of Species, he was far more sympathetic to proto-social Darwinian notions (especially those of Francis Galton) in The Descent of Man. In this regard it is not a stretch to ask how much the modern synthesis was driven by eugenics (which was a global tendency at the end of the nineteenth and beginning of the twentieth century). In part this was due to the fact that there were insufficient tools available for the study of genes (or heritable factors) in the early twentieth century, which is why in statistical analysis genes were conceptual placeholders or heuristics rather than material causes.
But what is also important is that Fisher’s approach got around the difficulty of whether the biological sciences could be considered ‘properly’ experimental sciences, since mathematical models made the question moot (especially as it applied to practices such as animal husbandry or the selective breeding of human beings). It was conceptually and technologically difficult in the late 1800s and early 1900s to understand genes as causes, since they did not seem to be straightforward physical causes and seemed very susceptible to noise, interruption, and mutation. This is why, as Arlin Stoltzfus has brilliantly shown, the debate between the mutationists and the biometricians was not about anti-Darwinian sudden changes within a species versus the more gradualist picture of thinkers like Fisher, but rather to do with the question of how genes cause anything to occur, and the question of how many types of cause were present in organisms.19 Fisher’s approach, and that of the biometricians, simplified genes as direct causes without saying whether genes were in fact theoretical entities or physical ones — they were just a way of measuring traits, which were themselves supposedly straightforwardly empirical. But the frequentist mode made it easier to claim that other traits (such as criminal tendencies, laziness, alcoholism, pauperism, and so forth) were also just as genetic, and that by rejecting environmental factors one could direct policy away from social projects and towards forced sterilization and other forms of negative eugenical control.
Before addressing how statistical concepts lend themselves to certain politics (however indirectly), there is one more episode of Hogben’s life to address, that of his direct activities against the Nazis, as well as the question of how we should understand the very idea of leftist science.20
In April 1940 Lancelot Hogben went to Oslo, Norway, to deliver a lecture criticizing the then-ascendant racial theories of the Nazi party. Having picked up his twenty-one-year-old daughter Sylvia from neighbouring Sweden (where she had been working as an au pair for a family friend), the two set off for the airport in Oslo, only to find it suddenly secured by Nazi storm troopers. The two hitchhiked to Sweden (back to where Sylvia had been staying), where they translated numerous texts to raise some quick cash, before making their journey home the long way around the planet: to Moscow, via the trans-Siberian railway to Vladivostok, by boat to Japan, then steamer ship to San Francisco.
This was not an atypical biographical episode for Hogben, for whom there was no separating the social and political ramifications of science from scientific practice itself. While this stance was not unusual among the British scientific left of the time, Hogben’s integration of the scientific and the political, especially with regard to his anti-eugenic posture, was far more uncommon. In his collective biography The Visible College, Gary Werskey groups Hogben with J. D. Bernal, J. B. S. Haldane, Hyman Levy, and Joseph Needham.21 These figures form part of a larger and longer tradition of scientists (in particular in the life sciences) who saw themselves not only as to the left of the status quo but also, often, as forging a third organicist path between neo-Darwinian mechanism (which by the 1920s and 1930s was over-represented by the biometricians) and neo-vitalist opposition (Wilhelm Roux, Hans Driesch).22
While J. S. Haldane (J. B. S. Haldane’s father), J. H. Woodger, the Needhams, and others can be classified as organicists (who sought to discover biology’s conceptual autonomy, often with a Whiteheadian flavour), situated between neo-mechanism and neo-vitalism, Hogben was rather a fairly committed mechanist (as evidenced in his debates and activities, especially while living in South Africa, as discussed above). Yet Hogben’s articulation of biological mechanism was resolutely non-teleological and non-metaphysical. An important aspect of the organicists’ legacy, as Donna Haraway argues, is not that their work was ‘more’ metaphysical than these other forms of biology, but rather that they made the metaphysical and/or metaphorical aspects of their work explicit.23
While Hogben had Marxist sympathies (or, at least, was sympathetic to the Marxists around him, like Haldane), he self-identified as a socialist and later in life as a scientific humanist — the latter term is perhaps best known through the work of the already mentioned J. D. Bernal. Compared to Haldane and Bernal, Hogben was far more aware of and outspoken about the racial and classist shortcomings of his contemporaries, as well as his political enemies.
While contemporary social and cultural critics fought (and still fight) eugenics on political, ethical, or religious grounds, Hogben was one of the few biologists (even among left-wing scientific humanists) to decry eugenics as conceptually flawed as well. This in part relates to the fact that biologists of the time, whatever their political allegiances, often shared a Promethean attitude rooted in a general scientific-humanist optimism (which of course had both capitalist-industrial and Marxist-Promethean versions).
Importantly, part of this critique was not only directed at the statistical details and model constructions of biometricians such as Fisher, Pearson, and Weldon, but in particular at the very idea of ‘neutral science’.
Neutral science should be striven for, but far too often it is assumed to be the starting baseline for scientific investigators, something violated only in obvious cases of personal bias. This latter point is particularly relevant today: leftist politics rejects those who would falsely claim their science to be neutral, yet this rejection too often extends to the very coherence of scientific study. For instance, it is not at all uncommon to find both a generally sceptical attitude towards scientific authority and tacit agreement between groups that would otherwise be politically opposed. Those who discuss biopolitics, for instance, might cite a critique of Darwinism from a scientific creationist despite the conservative politics of the latter. Neutrality requires the collision of political perspectives, not the assumption that science is in and of itself neutral.
At the same time, defenders of an apparently already existing scientific neutrality (such as Dawkins, Dennett, and Tyson) see any opposition to science as ignorant relativism, an unjustified scepticism in the face of glorious truth. Hogben’s attitude is to reject both stances in a way that requires being open about one’s political commitments while agreeing that scientific research can be a political good: ‘Science is the last defence of intellectual freedom in its perennial conflict with arbitrary authority.’24
Hogben is caught between being too political in one regard (not being properly neutral, as science nowadays is supposed to be) and not being political enough (because he was not a card-carrying Marxist in the manner of the Needhams, Haldane, or Bernal). In The Return of Nature, Foster seems almost disappointed in Hogben and reduces his politics to a kind of naive scientific humanism. But what Hogben represents, most importantly, is an unwillingness either to give over scientific authority to bad political actors, or to acknowledge that science can be carried out in a political vacuum. The difficulty is to make the politics of scientific practice explicit rather than automatically assuming it to be neutral; scientists should not be socially or politically reductive for the sake of a short-sighted scientific coherence, that is, science’s assumed neutrality.
But even with this degree of awareness of the situatedness of scientific knowledge, and even beyond individual bias or explicit ideological agenda (as in the groupthink of some scientific communities), there still remain the conceptual attachments (or baggage) of different approaches. What Hogben saw clearly was that treating humans as model organisms, but only in a statistical way (humans as statistical models and therefore mathematical members of a population), allowed many to hide and couch their biases as ‘empirically obvious’ observations about race, class, and sex. These were decided by the arbitrary authority that Hogben lamented, as matters of policy of control rather than as sets of modelling practices based on understanding. It is not that statistical models are inherently politically bankrupt, but that their perceived neutrality is a tempting tool in the justification of political authority.
Importantly, to understand the mid-level of modelling, one has to analyse the conceptual commitments of a science, especially when that science, or set of tools, is imported from one domain to another, as in our case here, with the transposition of statistics into biology.
Standard histories of biology often see the 1920s and 1930s as a time of celebration following a long dark night of false alternatives to Darwinism and pre-genetic uncertainty. With the so-called ‘modern synthesis’ the claim is usually made that Mendelian genetics and Darwinian neo-mechanism established biology as a ‘proper’ modern science. But the synthesis that was consolidated with the discovery of the structure of DNA did not eliminate underlying tensions that were equal parts political and statistical. Hogben is an interesting case in that he is sympathetic to the neo-mechanist and, hence, biometric methods, but also deeply suspicious of unacknowledged bias and the tendencies towards teleology in the work of figures like Francis Galton, R. A. Fisher, and Karl Pearson. In other words, frequentist statistics can, but need not necessarily, follow from a mechanist understanding of biology, namely, one in which we see empirical changes in organisms and populations and therefore assume that there exist straightforward and predictable causes of those changes.
Hogben writes:
We do not need biologists to tell us that any subject can be made dull enough to defy the efforts of any but a few exceptionally bright or odd individuals. By exploring individual differences human genetics might help us to find out how to adapt our educational technique to individual needs. It will do so, and gain prestige in consequence, when it ceases to be an apology for snobbery, selfishness, and class arrogance.25
Hogben is of course not merely opposing genetics or biology to the ethical and moral concerns of the humanities, but is arguing that those who claim to be conducting neutral science have in fact failed to ‘sterilize their own instruments’ before violently applying them to the social body. Hogben’s book on statistics, quoted above, places statistical methods within a conceptual history, and in some ways surpasses Ian Hacking’s Foucauldian genealogical approach in The Taming of Chance.26
Hogben outlines four modes of statistical thinking which are not always clearly discriminated:
We could otherwise represent these positions in terms of their illustrative figures:
It is in this contested deployment of statistical attitudes that Hogben’s work is particularly interesting, insofar as it points out what is veiled via mathematical-philosophical dressing. Hogben accuses Fisher and the other biometricians of uncritically searching for frequencies by using an idealized infinite population with specifiable distribution but without any a priori agreement about a necessary sample size.27 It should start to become clear how this is not only a dispute between choices of method but also between levels of investment in the readability and changeability of the structure of the living world.
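The target of this criticism can be stated schematically. In the frequentist picture that Fisher and the biometricians inherited, a probability is identified with a limiting relative frequency in an idealized, indefinitely extended series of trials (the notation below is a standard textbook rendering, not Hogben’s or Fisher’s own):

```latex
P(A) \;=\; \lim_{n \to \infty} \frac{n_A}{n},
\qquad \text{where } n_A \text{ is the number of occurrences of } A \text{ in } n \text{ trials.}
```

The definition presupposes the very infinite population Hogben objects to: no finite sample ever fixes the limit, and the scheme itself supplies no a priori criterion for how large $n$ must be before an observed frequency may stand in for $P(A)$.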
This double layer of formalizing effectively hides the fact that Fisher’s frequentism is coloured and classed. Seeing such bias is made even harder by the fact that the frequentist interpretation of statistics claims to be an extension of the aggregate form. The statistician and ur-biometrician Quetelet claimed that he could create a ‘social physics’ in which the tools of celestial mechanics could be transplanted onto the social body. This became mathematically feasible in Quetelet’s time as a result of the availability of demographic and statistical data on large populations of people (and was even further accelerated later on through the availability of military enlistment data), combined with a confidence in the existence of metaphysically immutable natural laws. This idea that total population could be treated as a generalized material quantity contributed to how deeply buried the biases of the frequentist position were, in a way that paralleled the unavowed teleology of the neo-mechanist position that was accelerated in the work of Galton, Fisher, and Pearson. Mechanism is not necessarily teleological, but when it emphasizes functions — defining things by what they do, not what they are — it is tempting to enlarge this definition to larger and larger scales, with function becoming the seed for purpose.
It is important to avoid the too-easy assertion that mechanism is therefore inherently wrong or that statistical measurement is either malicious or useless. The issue is about the role of revisability vis-à-vis the initial motivating concepts for constructing a model in the attempt to solve a problem. This, in turn, affects how one situates the concepts around the problem.
As James Tabery has argued, for instance, both Fisher and Hogben were interested in how variance in a population was related to the interaction between genes and environment, though their emphases were diametrically opposed.28 While Fisher famously synthesized Mendelian and biometric approaches, this largely excluded the environment as the cause of variance and incorporated the causal power of genes combining in a way that explained digressions from averages. For Hogben, development was the site in which one could measure and articulate the complex interaction between genetics and environment, emphasizing this as a temporal rather than a static model.
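Tabery’s contrast can be sketched in the now-standard notation of variance partitioning (a schematic gloss, not a formula either man wrote down in this form):

```latex
\underbrace{V_P}_{\text{phenotypic variance}}
\;=\; \underbrace{V_G}_{\text{genetic}}
\;+\; \underbrace{V_E}_{\text{environmental}}
\;+\; \underbrace{V_{G \times E}}_{\text{gene–environment interaction}}
```

Fisher’s analysis of variance lets the additive terms $V_G$ and $V_E$ carry the explanatory weight, with the interaction term $V_{G \times E}$ minimized or absorbed as noise; for Hogben, it is precisely $V_{G \times E}$, realized over developmental time, that a biology of populations ought to measure and explain.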
But this is not to say that Hogben was against the very idea of biology improving human life, in the sense of attempting to direct future human reproduction, or to improve collective life via biology. For a time, Hogben directed a programme of social biology and worked to create what he saw as a left-leaning form of eugenics to combat the prevailing conservatism of the time. It is perhaps not surprising that a more leftist politics proves difficult to separate from a more revisable model, whereas Fisher’s approach tended to minimize outliers that would require reassessment of the effects of environmental factors (and that would thereby expose the unacknowledged externality of terms like prominence or well-bred as clunky, classist biology).
But this does not guarantee a happy correlation between scientific concept and political aspiration. Most of Hogben’s Marxist colleagues or otherwise left-leaning scientists in the first half of the twentieth century were happy with classist and racist forms of eugenics. While the Prometheanism of Haldane and Bernal makes for more pyrotechnic reading than much of Hogben’s work, it also remains uncritical of its racist and classist assumptions, covering them in a science-fictional aesthetic of rocketing into a post-scarcity future. When Hogben worked with the experimental biology group in London, he was the only member to criticize the then-standard support for eugenics.
Why interlace Hogben’s biography with his battles against eugenics and the broader abuse of statistics and biological concepts? Hogben was, as I hope to have shown, a canary in the coal mine for the threat of racist science. In his early days, Hogben combatted eugenics in philosophical-theoretical terms, then at the level of statistical skewing, and then in direct actions against the rise of fascism and its theories. Unlike many of his international colleagues, Hogben experienced war, apartheid, and the guns of Nazi storm troopers, as well as their academic apologists both during and after their reign.
While Hogben was not a repressed minority, he was right on the edge of the genetic thought collective of the twentieth century in a way that few other scientists were. What is also rare, and an important lesson for the present, is that Hogben understood the mathematical and scientific details of the work he critiqued, something too often lacking in debates around the politics of biology, especially amid the return of eugenics in our current moment. In addition, rather than putting forward a Marxist science, like his companions Haldane and Julian S. Huxley, Hogben saw that science done well was already Marxist: it was a public-facing collective enterprise if it was to be considered science at all. This view is also sorely lacking in a world where the purported neutrality of science is taken as a given rather than as an ideal to be constantly pursued.29
Where do numbers stand in our scientific and political imaginary? What part do numbers play in how we explain biological and social processes? And why does this matter?
Numbers have often been associated with the modern obsession with counting objects, be they taxpayers, commodities, or pathogenic agents. Since the nineteenth century, numbers have been inserted into increasingly complex statistical models to predict behaviour and to help bureaucrats, scientists, and business consultants engage in a seemingly endless effort at ‘the taming of chance’.30 Scholars have long addressed statistics’ power to support modernity’s march towards ‘universal’ classifications, and its propensity to erase some differences while creating others. The gaze of the modern state and its instrumental interventions in reality required legibility structures and a politics of measurement centred around the simplification, quantification, and standardization of knowledge. This led in turn to an increased blindness to metis — any form of situated, practical, experiential, and improvisational kind of knowledge — both in politics and in science.31 Essential for the toolbox of those promoting ‘the sweet despotism of reason’,32 statistical analysis has been critically associated with the violence of averaging, eliminating outliers, or clustering, which often produce analyses with highly classed, gendered, and racialized undertones.
However, some scientists have assumed explicitly philosophical and political stances, from which they have questioned the tendency to consider statistical patterns as explanatory in themselves, where the modelling of populations is concerned. As Ben Woodard shows in his chapter, this was the case for Lancelot Hogben, in his strong stance against R. A. Fisher, the most important proponent of mathematical theories of population, a foundational figure of the biometric school of genetics, and a champion of theoretical claims concerning the supposed variability between ‘races’. Lancelot Hogben rooted his anti-eugenicist stance in specific epistemological critiques of the apparent moral and political neutrality of statistical models. He focused instead on working with model organisms as foundational for understanding the complex mechanisms behind this variability, mechanisms that in most cases depended on specific environmental and developmental factors. It is not that Hogben rejected the use of statistics; rather, he just used them for different purposes, and assigned them a different role and status in the research process. While Fisher postulated causal relationships simply by observing statistical patterns, Hogben used statistics as a starting point in a complex search for an explanation as to how and why things look a certain way. Both were trying to explain variation within and between populations, but they exhibited radically different models of what an explanation is, how causal relationships can be uncovered, and, simply put, what numbers can and cannot tell us.
Hogben’s efforts strangely resonate with the critical view of statistics on the Eastern side of the Iron Curtain, where politicians and technocrats considered commensuration and quantification as fully fledged political activities, and saw any number attached to planning as the crystallization of long-term social processes and relations.33 Centrally planned economies functioned according to what these political actors understood as ‘class statistics’, born in explicit opposition to ‘bourgeois statistics’, whose power to denature reality and hide the contradictions and historically contingent nature of capitalism was denounced at every step.34 Hiding wide variation behind averages, forming politically problematic categories for categorical variables, interpreting statistical regularities as historical trends — all were exposed as oppressive or, at least, ignorant scientific practices, widely used in capitalist societies but widely relegated to pre-socialist history in Eastern and Central Europe. Concrete examples were used in statistical and economic journals to illuminate this mistaken instrumentalization of numbers: statistics for the average monthly consumption of sugar in interwar Romania obscured the fact that wealthy factory owners consumed nine kilos of sugar per month, while workers consumed one; calculating infant mortality separately for children born out of wedlock concealed the impact of class inequality on early-life survival chances; while to claim a decrease in the number of workers participating in strikes in a given year as an indicator of decreased union activity was to fail to account for the long-term strike of railway workers.
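The sugar example turns on simple weighted-average arithmetic. Suppose, purely hypothetically, that factory owners made up five per cent of the population and workers ninety-five per cent (these shares are illustrative; the source gives only the two consumption figures):

```latex
\bar{x} \;=\; 0.05 \times 9\,\text{kg} \;+\; 0.95 \times 1\,\text{kg} \;=\; 1.4\,\text{kg per month},
```

a figure for ‘average consumption’ that describes neither group: it is forty per cent above what virtually everyone actually consumed, and less than a sixth of what the wealthy consumed.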
As we learn from Ben’s chapter, Hogben’s critique went a step further than the realization that statistical interpretation is political, the basic level at which the Soviet and East-Central European statisticians’ critique stopped. For Hogben, scientific practices could allow or foreclose specific lines of thought not only through the aggregation, analysis, interpretation, and communication of data, but from the moment that a research object is constructed. For a scientist, being political thus starts with the type of questions one asks and continues with how one answers them. While critiques of standardization and quantification have been abundant, Hogben’s case is unique because of his holistic vision of the scientific endeavour, and because his holism required a constant reflexivity about the wider societal implications of choosing one scientific practice over another.
© by the author(s)
Except for images or otherwise noted, this publication is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. © 2025 ICI Berlin Press