
ArchSoc

Needless to say, it was a very different semester for ArchSoc, with no in-person events possible. We moved our regular events online and developed alternative ones.


Our lecture series successfully moved onto Zoom with high attendances from students, staff, and the general public. One particular lecture had an attendance of just under 60 people, with attendees from the USA, Canada, Ireland, and Denmark. Our thanks to all of our speakers from first semester: Prof. Jim Crow, Dr. Jonny Geber, Dr. Guillaume Robin, Dr. Manuel Fernández-Götz, and Rich Hiden and Rachel Backshall, for providing interesting and insightful discussions. ArchSoc’s fortnightly lecture series will continue in second semester with talks from speakers such as Prof. Ian Ralston and Dr. Joanne Rowland.

We were still able to hold our traditional Halloween and Christmas social events, albeit via Zoom. Thank you to everyone who attended and made them so enjoyable. Group pumpkin carving via video-conferencing software was a personal highlight. We also developed our online coffee afternoons to provide a safe social meet-up in this time of isolation. These were run in collaboration with ArchPALS, who provided academic support during discussions of coursework.

A new event for us this semester was a Scavenger Hunt around Edinburgh and, virtually, further afield. Sites were released weekly and visited and recorded by participants, with a chance to win, all in line with government and university guidelines. Every site could also be visited virtually, so that students could take part if they were not comfortable going out in the current situation or were not residing in Edinburgh, and we included some sites, such as Stonehenge, that were for virtual visiting only. The scavenger hunt was much enjoyed and provided motivation to get out during restrictions whilst remaining safe. Our particular thanks to committee members Darcey Spenner, Becky Underwood, and Patrícia Hromadová for organising and producing the scavenger hunt across the semester. Once again, congratulations to the overall winner Megan Powell and to our runners-up Ross Morrison and Grayson Thomas.

In October, we held an EGM to elect members to the new committee positions of First Year Representative, Postgraduate Representative, and BAME Officer. We were happy to welcome Ross Dempster (First Year Rep), Katie Duke (Postgrad Rep), and Becky Underwood (BAME Officer) into these roles. Our thanks to all the candidates that ran for the positions.

We look forward, in second semester, to collaborating with the ArchSocs of the University of the Highlands and Islands and Glasgow University on the SSAS Conference 2021. We also look forward to our Fieldwork Fair, likely to be online, and to the continuation of our other events. We hope to see you there! You can keep up to date with ArchSoc via Facebook (Edinburgh University Archaeology Society), Instagram (@edinarch), or emailing us at edin.archsoc@gmail.com with any queries or to be added to the mailing list.

I would like to thank all of the committee for overcoming difficult times this semester and organising fantastic events. Thank you to our members and everyone who attended lectures or events this semester, and finally my thanks to Retrospect for the opportunity to share ArchSoc’s semester with you.

Sam Land, ArchSoc President (2020-2021)

Morality Vice: Combatting Venereal Disease in Progressive Era America

By Jack Bennett

Sex work and the spread of venereal disease were two entangled epicentres of the Progressive Era (1890-1920). Both were blamed on personal vice, an idea shaped by perceptions of gender, race, and citizenship. The social hygiene movement, which sought to eradicate diseases associated with “moral vice”, expanded. Of the many issues progressives confronted, venereal disease came to underscore the prejudicial social and political transformations of the United States during this period.

Historically, anti-vice campaigns had been one element of religious revivalism. During the Progressive Era these campaigns received growing institutional support from medical and social reformers. Rates of infection were highlighted by the inquiries of charitable investigators, settlement workers, and the emerging social work professions. Coinciding with this social hygiene reformism was the emergence of First Wave Feminism. Many in the movement blamed society’s moral ills, including venereal disease, on patriarchy. For both movements, moral reform required political reform. Vice became the causal factor, responsible for corrupt officials and the spread of disease. Exposing sex work thus became a regular weapon in the hands of groups advocating political reform.

Across major cities, sex work was ruthlessly suppressed during the Progressive Era. The Chicago Vice Commission was established in 1911 and dismantled the city’s red light district within a year. Legitimised as a campaign to curb the rampant spread of venereal diseases, such as syphilis, it fundamentally aimed to protect a prejudiced and narrow cultural ideal of womanhood by targeting sex workers and immigrants. The crusade was supported by a coalition of business leaders and professional altruists who argued that vice reform was an area where the municipal reform impulse and the movement for social justice coincided. These anti-prostitution campaigns were characterised by federal government intervention, local law enforcement, and non-governmental social reformer groups.

The social hygiene movement, beginning with the founding of the American Society of Sanitary and Moral Prophylaxis in New York City in 1905, also contributed to the growth of reform sentiment. Under the leadership of Dr. Prince A. Morrow, author of Social Diseases and Marriage (1904), the society sought to warn the public of the dangers of venereal disease and to educate young people “about the laws and hygiene of sex.” Similar societies followed and by 1910 coalesced under the American Federation for Sex Hygiene. Nonetheless, these campaigns failed to eliminate sex work. Unwilling to engage with the multifaceted problems confronting the individual women involved in the trade, reformers only “eliminat[ed] the most public examples of commercialised sexuality from view in urban spaces”. Anti-vice movements demonstrate the interaction, therefore, between public health, gendered prejudice, and US government policy.

The growing strength of government conviction in successfully curtailing sex work is reflected in the legislation of the period. From 1911 to 1915, twenty-one states passed laws enabling the closure of prostitution houses based on citizen action, alongside nineteen states introducing anti-pandering measures. Venereal disease was made reportable to health authorities in eight states, while North Dakota, Oregon, Pennsylvania, Vermont, and Wisconsin made venereal disease a bar to marriage. Sex work was not simply a matter of caveat emptor; rather, the threat of venereal disease had far-reaching impacts on anti-vice campaigns. Social hygiene activism provided a major forum for conservative discussion of sexuality from the turn of the century, rising to prominence with the establishment of the American Social Hygiene Association (ASHA) in 1913, accompanied by the Social Hygiene journal. Organised to combat venereal disease and sex work in new ‘scientific’ ways, it represented a professionalisation of the earlier social purity movement. The integration of morality with venereal disease was a foundational pillar in the framework for understanding the cause of venereal disease and the rehabilitation of sex workers during the age of Progressivism.

Across the US ‘Frontier regions’, venereal diseases were deemed societal ills to be cured, rather than the acceptable nuisances of a transient culture. In Kansas, Chapter 205 held that those accused of improper behaviour could be quarantined in the Kansas State Industrial Farm for Women (WIF). The institutionalisation of women accused of such behaviour represented the idea that society could be purged of ‘impure’ diseases. Chapter 205 was the culmination of two decades of Frontier transformation, representing a socio-political shift away from the established Frontier transience that fomented venereal diseases and towards altruistic humanism. When this failed, reformers turned to the regulation and quarantine of those with venereal disease. Quarantine proved less effective in solving the ‘impurity’ problem than was hoped, driving social reformers to more desperate attempts at ‘curing’ venereal disease through social eugenics. Reproductive sterilisations were introduced on a systematic level. Kansas thus served as a fertile breeding ground for Progressive Era experiments.

The case of San Antonio reveals the conflation of anti-vice campaigns with moral and political authority. The women’s movement sought to eradicate male sexual exploitation of women, yet women’s access to power through Progressive moral reform was severely limited. As Peggy Pascoe notes, female moral reformers had more success exerting moral authority over other women than over men. Their modest successes, such as the appointment of female police officers in San Antonio, were met with scorn and swiftly rolled back by male-dominated political machines. Nationally, anti-vice campaigns incarcerated over 15,000 predominantly working-class and minority women without trial between 1900 and 1920. Progressive efforts to protect women and to supply medical treatment, education, and vocational opportunities were overshadowed by the more punitive elements of social control, often hinging on a woman’s racial identity. White women arrested for sexual delinquency might expect rehabilitation and schooling, while non-White women frequently experienced incarceration. Similarly, privileged White women often sacrificed marginalised women to secure their own socio-political advancement. In sum, the anti-vice campaigns in the ‘Frontier regions’ can be characterised by the incubation of reform, increased state-level regulatory influence, and the independent pursuit of objectives.

World War One made venereal disease a national issue, as the wartime atmosphere of sexual freedom clashed with traditional morality and concerns around soldiers’ health. In establishing the Commission on Training Camp Activities (CTCA), the federal government revitalised the 1910s anti-prostitution movement. US soldiers’ health became an essential responsibility of the state. In the European battlefields, American reformers found a tangible arena in which to prove themselves. Women represented the largest perceived threat to the moral side of the war effort, perceived simultaneously as passionless angels, sexual victims, and threatening sexual aggressors. Through the development of institutions, attempts were made to combat the spread of venereal disease. While education and recreation for men were promoted to combat venereal diseases, detention and law enforcement were used against women. World War One exacerbated divisions between women based on class and racial identity. The modest gains made by White middle-class women, through collaboration with anti-vice crusades, were often achieved to the detriment of younger, poorer, more marginalised women.

In the spring of 1917, the federal government designated both San Antonio and El Paso as supply depots for the army. The Social Hygiene Bulletin warned that the “serious obstacle” facing both cities was the “presence of Mexican and Indian laborers who are unintelligent in these matters [of vice] and impatient of any regulative measures.”

Despite government pressure, El Paso retained a “zone of tolerance” for sex work. El Paso had made a name for itself at the end of the nineteenth century as a ‘sin city’ of the American Southwest. Historian Ann R. Gabbert highlights that the town’s quasi-regulated sex work industry helped fund police salaries and keep taxes low. City leaders received political contributions from brothel keepers and feared the alienation of voters. Fundamentally, the city’s officials believed efforts to close down vice in El Paso would merely push clients over the border into Juárez, Mexico. Efforts to curtail commercialised vice in El Paso had met with constant failure throughout the Progressive Era. Thus, in order to protect soldiers, the problem had to be stopped at the root: civilian life.

Anti-vice campaigns across the United States shaped the Progressive Era. With the global spread of disease directly informing domestic manifestations of prejudice, reform, and social advancement, sex, race, and class combined in this climate of transformation. To an extent, established systems of oppression were entrenched, even as others were broken down. The structure of Progressive Era society, founded on an idealised, intransient womanhood, was increasingly challenged by women working to establish greater socio-political legitimacy. In this reform movement, citizenship was questioned on the grounds of gender, race, and class, to be incrementally recalibrated through the lens of public health. From the victims of repression to active reformers, marginalised groups remodelled themselves for a ‘New Age’ at the outset of the twentieth century, demonstrating the impact of moral and health convictions on the political developments of the United States.

The Metic in the Wake of the Athenian Plague

By Justin Biggi

Pandemics, and their disruptive socio-political consequences, are nothing new. In the aftermath of the devastating Athenian plague of 430 – 426 BC, the concept of Athenian citizenship found itself in deep crisis. Approximately one third of the Athenian population had died of the plague, leaving a weakened city (Athens was, at that point, under siege by Sparta) even more vulnerable. Rather than make citizenship more easily attainable, however, Athens closed its ranks further in the aftermath, enforcing stronger limitations on the path to naturalisation. Furthermore, refugees who had come to Athens from the surrounding territory, and whose increased numbers contributed to the rapid contagion rate, were viewed with growing suspicion.

This paper will explore the shifting attitudes towards metics (non-citizens who were often active parts of the Athenian community) during and after the plague, especially concerning those metics who may have been illegally passing themselves off as citizens. By exploring the ways in which the recent epidemic changed Athenians’ ways of seeing themselves and the ‘Other’, my aim is to examine the intersection of pandemics and prejudice in Athenian history, demonstrating how the former very often led to a rise of the latter, both during and after the spread of contagion. Analysing the plague and its effects on Athens allows us to explore an already complex ideology, that of Athenian citizen identity, in the context of an incredibly difficult social, political, and public health crisis.

The plague of 430 – 426 BC could not have occurred at a worse time. Athens was already facing increased pressure due to the Spartan invasion of Attika, but, at the time, victory still seemed well within its reach. The plague quickly dashed those hopes, giving way to a domino effect that resulted in Athens’ eventual defeat and surrender to Sparta in 404 BC. Medically defining the plague has been difficult, despite Thucydides’ in-depth description of the symptoms. Recent studies seem to have identified the pathogen as most likely a form of typhus. Transmission seems to have occurred via a previously infected ship arriving in the port of Piraeus, which was also the city’s main source of food and goods imports. Typhus is a highly infectious disease with a high mortality rate. Its primary transmission is through physical contact with either rat fleas or body lice. Powell concludes that the Athenian plague was spread by the latter, which carry Rickettsia prowazekii. Body lice thrive in crowded, unsanitary spaces, and are transmitted primarily through cross-contamination of unwashed clothes, hands, and hair.

As a result of the Spartans’ “scorched earth” military policy, Athens saw an influx of refugees into the city. Pericles had agreed to allow those fleeing the advancing Spartans to settle within the so-called Long Walls of the city, including the section that connected the city to the port of Piraeus, the supposed origin of the outbreak. Thucydides describes in great detail the unsanitary conditions in which the refugees lived, especially after the plague began: “[t]he dead lay as they had died, one upon another, while others hardly alive wallowed in the streets”. Morens & Littman estimate that the Athenian population rose from around 150,000 people to 350,000 at the height of the war. Confined in unsanitary, overcrowded allotments, the refugee population was seen as one of the primary sources of the outbreak, despite not being the cause of it: “[t]he crowding of the people out of the country into the city aggravated the misery”.

Epidemics are, amongst medical phenomena, the ones which cause the most widespread unrest. Infection in these cases is transmissible, imminent, and invisible, pushing people towards hyperawareness. It often leads to increased reactionary sentiments, and marginalisation can occur due to “germ panic” – modern-day examples include the homophobic and racist reaction to the beginning of the HIV/AIDS epidemic or the increased rates of anti-Asian racism in the wake of the Covid-19 pandemic. Nor do any of these exacerbated sentiments appear out of the blue. Fear simply amplifies societal prejudices that are already there.

Twenty years before the Athenian plague, in 451 BC, Pericles’ citizenship decree declared that Athenian citizens were only those people born of two Athenian parents. This law is exemplary of a deeply-ingrained attitude of Athenian exceptionalism. Athenians “defined themselves, collectively, in opposition to noncitizens and slaves”. This definition was propped up by claims of a common autochthonous origin, that is, the belief that all Athenian citizens came from the same “good stock”.

Immediately following the Peloponnesian War, in 403 BC, the Periclean decree was reformulated and re-enacted. The war had been an unprecedented disaster for Athens, demographically, morally, and economically. Instead of pushing for the inclusion of new potential citizens through a relaxation of citizenship laws, Athens closed itself off, reinforcing the ideology of its exceptionalism by emphasising the autochthony upheld by the stringent citizenship decrees. This often found expression in an increased number of court cases against non-citizens, most often metics.

Against Neaira is one such case. Brought forward in 340 BC by Apollodoros and Theomnestos, it was written by the former. It is an attack against a woman, Neaira, who had been accused of feigning her citizen status. The case is, technically, against her husband Stephanos, a political rival of both Theomnestos and Apollodoros, but it results in an ad hominem invective against the woman and her daughter, Phano. Though it does not contain any explicit references to the plague, it is a text deeply rooted in the reactionary autochthonous tendencies that the plague and the crisis of the Peloponnesian War exacerbated. And, while the plague is not explicitly mentioned, the war is.

As part of his argument, Apollodoros recalls the efforts made by the Plataeans in aiding Athens, from the time of the Persian War onwards. This includes a stint defending Athens from a Spartan and Theban attack during the Peloponnesian War, and Sparta’s siege and defeat of Plataea in retaliation. For their efforts and bravery (a contingent of Plataeans had escaped to Athens to warn the city of the attack) they were granted honorary citizenship. Apollodoros’ emphasis on the selfless acts of the Plataeans is meant to contrast with Neaira and Phano’s supposed corruption of the citizen body. Citizenship was an exclusive right that had to be carefully guarded and conferred only on those who actually deserved it through acts “in the service of the people of Athens”.

In Against Neaira, the long-term effects of the Athenian plague and the Peloponnesian War are evident both in Apollodoros’ rampant xenophobia and in his emphasis on citizenship being something that must be earned through good deeds in the service of Athens, a direct consequence of widespread Athenian exceptionalism. The plague, a deeply traumatic event that had far-reaching consequences for Athens as a whole, saw refugees and immigrants often singled out as one of the principal causes of the illness’s spread. These sentiments were only reinforced after the conclusion of the Peloponnesian War, and pushed the Athenian polis to seek continuous confirmation of its otherwise failing exceptionalism.

How Manifest Destiny Fuelled Racial Prejudice

By Finlay Cormack

Manifest destiny became a prominent ideology among the newly independent American people, who believed it was their right to expand America’s territory further west. The term “manifest destiny” was first used by the journalist John L. O’Sullivan when arguing that Americans should strive to expand into the west. O’Sullivan specifically used the slogan in an editorial he wrote about ongoing disputes with Britain regarding Oregon and the annexation of Texas. United States officials, such as Andrew Jackson, ran on policies of expansion and of creating more land for White European settlers. The reasons for expansion towards the Pacific are a subject of controversy, with historians such as Mark Joy arguing that it was driven by slavery whilst others, including Patricia Limerick, believe it was a desire to exterminate the indigenous population. This article will go some way towards explaining why the US government expanded westwards, focusing on racial prejudice and economic motives.

Racial prejudice was the most significant reason for the American government’s policies towards westward expansion. In 1829, Andrew Jackson was elected on a platform built on prejudice against Native Americans and their forced removal from south-western parts of America. The New Echota Treaty of 1835 demanded the removal of all Native Americans, specifically the Cherokee, the dominant tribe in the region, from an area spanning eight million acres. After 15,000 refused to leave, they were immediately treated with hostility as US troops mobilized to expel people from the area by any means necessary: many were caught and taken to concentration camps. This episode exemplifies that, during periods of expansionism, especially under Jackson, White settlers treated the indigenous peoples with utter disdain, and US forces were heavy-handed and openly racist towards the tribes. Jackson, however, wasn’t the first president or even the first American to have the idea of forceful “Indian removal”. Policy towards Native Americans had always been to either move them further west or to “civilize” them by turning them Christian, teaching them English, and attempting to sever them from their culture entirely. The idea of ‘civilization’ was racially motivated, illuminating the fact that westward expansion was intended to exterminate natives not only in a physical sense but also to eradicate identities and cultures. Richard Pratt, a US army officer, had the initial idea, and claimed proof, that you could “civilize” natives: he slowly taught prisoners of war how to speak and act like White men, with the intention of making their actions and behaviour compatible with American beliefs. In 1830, physician Charles Caldwell wrote that “Civilization is destined to exterminate them [Native Americans], in common with the wild animals”.

Westward expansion was inevitable to a certain extent. The American Revolution left the colonists with thirteen states and a growing population that necessitated expansion to boost the economy of the newly independent country. Native American territories were initially settled by farmers looking for new areas of cheap and fertile agriculture. With the advancement of the American railway system, different settlers migrated to the west, looking to improve the wealth and status that they had already established in the eastern states, as money was generally worth more in rural and sparsely populated western regions. The “American dream” is widely recognized today as an ideology in which people strive to make a new life for themselves in the United States; however, even before the term was coined in the 20th century, the sentiment was alive and well, driving expansion for personal gain at the cost of indigenous peoples. It was US government policy before 1829 to remain at peace with indigenous Americans as far as possible, explaining why treaties were used so frequently to legitimise acquisitions. The historian Patricia Limerick explains how the perception of these treaties built the argument that Native Americans were greedy and that the White man was trying to liberate the land for its proper farming uses. In January 1848, gold was discovered in California: the previously small, new territory with a very small settler population was about to become a hub of global expansionism. The Gold Rush became the centrepiece for economic expansion as many rushed to San Francisco in search of the rare mineral. A consequence of this was its impact on the Native American peoples of the region, who, until this point, had avoided the Jackson era of expansion and forced assimilation. The indigenous population in California decreased by an estimated 80,000. This was the result of both the killing of buffalo, which were a primary food source, and the implementation of reservations. Expansion was always violent.

Westward expansion and manifest destiny became a way to legitimise anything. Political intentions combined to form the backbone of any acquisition or expansion of western land in the 1800s. Racial prejudice was fundamental to expansion, with Native Americans violently dehumanised and compared to animals and savages. American officials often went out of their way to cause harm. Andrew Jackson, for example, expanded into the west with the explicit purpose of removing indigenous populations. The need to grow the American economy, which was built on agrarian foundations, was another reason to expand further into the west. However, this led to the massacres of Native American populations. This clearly illustrates that the manifest destiny of White settlers was inherently destructive and fuelled racial prejudice against Native Americans, leading to the destruction of lives, identities, and cultures.

The Sovereignty Pandemic, the Paris Peace Conference, and the Contestation of National Space

By Inge Erdal

It was a bitter and violent contestation of territory throughout central and eastern Europe that confronted the peace conference that had assembled in Paris in 1919. Its mission was nothing short of reconstructing the massive wreck that was the international political system after the First World War. Tomáš Garrigue Masaryk, the first president of Czechoslovakia, went as far as to dub it a “laboratory built over a vast cemetery”. The laboratory metaphor is an apt one: a new vaccine was deemed necessary to aid the plague-ridden subject. It was Woodrow Wilson who came to formulate and largely personify this remedy, one in which states formulated around self-determination would cooperate and balance each other’s interests in a League of Nations. The question left unsaid was: who would have self-determination? Or rather, over what? After all, both the emerging and seasoned states of the region contested the same territories as integral parts of their national communities, stretching in a large belt from the Baltic to the Aegean Sea and from the Adriatic to the Black Sea. This resulted in a deluge of competing claims and arguments from local powerholders and their allies in the esteemed halls of Paris.

The principal error that could be made again, for it certainly has been, is assuming the naturality of these contestations: the enduring romantic image of ancient nations awaking from their slumber, only to find they are not alone in wanting a homeland. Rather, nations do not rise in a day; imagined communities as they are, they take time and effort to cultivate. In our case, the role of cartographers, statisticians, and their ilk in conceptualising their ethnic homelands, creating foundations to be argued and fought for in the years after 1918, is something that must be put under the closest scrutiny. For our purposes, we will largely restrict ourselves to Poland. It makes a good case study, since nearly the entire extent of the new-born state’s borders was contested, with internal and external forces pushing for their vision to be realised through the sovereign powers of the Paris Peace Conference.

Cartographers were certainly among those clamouring for a new Polish state, which, as a result of its particular history, had many competing interpretations of what it would look like. Most influential among them was Eugeniusz Mikołaj Romer (1871-1954), born in Lwów in the Kingdom of Galicia in Austria-Hungary, now Lviv in Ukraine. That detail highlights Romer’s origins as a descendant of an old family of the Polish aristocracy which had once run enormous estates in the eastern parts of the Polish-Lithuanian Commonwealth. A certain romanticism about what constituted Poland proved inescapable throughout Romer’s storied career as a geographer; he worked on glaciology and meteorology before compiling his most famous work, the Great Statistical and Geographical Atlas of Poland, in Vienna in 1916. Naturally, this included ethnic maps of Poles in the atlas’ Plate XI, which he pictured as a heartland surrounded by a sea of scattered ‘areas of majority’ stretching deep into Ukraine, Belarus, and Lithuania, along with over half of Prussia, the southern stretches of Silesia, and of course, Galicia. Romer’s atlas circulated widely not only among the Polish intelligentsia and would-be leaders but also in Switzerland, albeit published in English. The atlas became a key reference point from which Wilson’s committee of geographers, historians, and ethnographers formulated its response to the hotly contested territories the new Polish republic fought its neighbours over.

Romer was by no means alone.

Figure 1. The map made by Romer showing the border between Poland and Russia in red, as agreed in 1920. Image sourced from Pamięć Polski. The map is currently held at the Jagiellonian University Library in Cracow. [pamiecpolski.archiwa.gov.pl]

Another son of Lwów, Stepan Rudnytskyi (1877–1937), was, unlike Romer, Ukrainian, and this was reflected in his own maps, which found far more widespread Ukrainian populations in Galicia than Polish ones. These illustrations matched well with his own aspirations for a large, if multi-ethnic, Ukrainian state within what he considered its historical frontiers. Even in Polish nationalist circles, many people thought that Romer’s map was far too expansive to form the basis for a new nation-state. These rival currents, between a Polish-dominated multi-ethnic entity following in the steps of the old Commonwealth and a more homogenous ethno-state, ultimately reached a compromise with the Treaty of Riga in 1921, which ended the Polish-Soviet War and set Poland’s eastern boundary as relatively restrained, though the state still included minorities, primarily Ukrainians and Byelorussians, at over 30 per cent of the population. This recreated an imperial state of sorts, which would last until the violence and deportations during and after the Second World War, leaving us with the current anachronistically homogenous entities in central and eastern Europe. Of course, the divergence between the maps of Romer and Rudnytskyi stemmed not just from any conscious or subconscious distortion of data, but rather from the lacklustre nature of the statistics themselves. Sloppy methodology and a general lack of consistency created unstable and conflicting visions of reality, even as the statistics enjoyed far greater trust among the intelligentsia than in subsequent epochs. The categories themselves were not set or agreed upon, be it ethnicity, language, or religion, resulting in Romer marking nearly all Jews in the contested regions as ‘Polish’, something that was not popular among certain nationalist circles due to a pervasive antisemitism. Conveniently, this decision ensured that his home province of Galicia acquired a very Polish appearance.

Regardless of the clear faults of the actors involved, it is essential to recognise that no map can give an accurate representation of reality. It is, after all, an abstraction, in which certain things have been removed or given emphasis, zoomed into, or conveniently left out of the frame. As a result, maps themselves can act as agents, as Romer’s certainly did, being one of many influences on the currents and structures gravitating around the sovereignty epicentre of Paris. Foucault even argued that maps are representations of power-knowledge, creating and shaping discourse. This view can produce some odd insights, such as that maps are not really representing national space at all, but rather creating it by way of representation. After all, a general ambivalence was felt by the local populace towards their rulers, even while historic regions like Silesia, Carinthia, and the Banat were partitioned so they could join their ‘homelands’. These maps, therefore, and the faulty statistics on which they were largely based, helped transform ambiguous imperial spaces into defined national ones, even as lingering minorities disrupted the desired harmony.

That said, there is still a missing element. After all, the maps were but one instrument to justify the sortition of territories. Nor were they merely matters of direct military conflict, though in the east fighting remained an active force, seeing as all parties ultimately looked to the world leaders gathered in Paris to arbitrate. Arbitration is the keyword here, as it reveals the relationship to the question of sovereignty. This comes from the view of Carl Schmitt, who famously stated that “the sovereign is he who decides on the exception”. Sovereignty is, therefore, that which has the power to decide when and how exceptions are made, or rather arbitrates between competing interests and competing visions of reality.

There were certainly many would-be sovereignties in central and eastern Europe at the time, even multiple ones in the same would-be national territory. Ukraine, after all, flipped through competing states in the West Ukrainian People’s Republic in eastern Galicia, and the Ukrainian Hetmanate and the Ukrainian SSR in the Russian partition area, before eventually ending up divided between Poland and the Soviet Union. Yet it is clear that they acted in a system of scalar sovereignty to solve the “riddles”, as Leonard Smith has called them. Stephen Legg dubbed these situations “Sovereignty Regimes”, where sovereign power is divided between different state actors, with autonomy and room for manoeuvre for all parties involved. It is precisely arbitration from a perceived legitimate source, one which was exercised discursively, economically, and only militarily in a very limited sense, that the would-be nation-states within the plague zone so desired.

This arbitration was far from absolute, more of a suggestion to be negotiated than a command to be followed. Seemingly, even the world’s great powers in assembly demonstrated a rather limited reach. They were merely one, if the most powerful, set of actors trying to rein in the Sovereignty Pandemic plaguing not only Europe but increasingly more of the world as well; not all too dissimilar strains would spread to Africa and Asia after an even more destructive world war. In central and eastern Europe, the plague failed to be contained properly by the conference, the League of Nations, or the international system. That failure was not allowed to happen again, resulting in the brutal population transfers of millions of people commencing in 1945, for the most part putting an end to the contested national space of the region. In that way, the cartographers finally got what they wanted, with monochromatic maps to marvel at, the historic link a casualty of the pandemic.

The Sand Creek Massacre and the Death of Native American Culture

By Amy Hendrie

The white man has taken our country, killed all our game; was not satisfied with that, but killed our wives and children. Now no peace. We want to go and meet our families in the spirit land. We loved the whites until we found out they lied to us, robbed us of what we had. We have raised the battle axe until death.

– Leg-in-the-Water

The expansion of White settlers into the western territories of America in the nineteenth century led to the slaughter, displacement, and destruction of Native American nations and their traditional way of life. The word genocide can certainly be applied here. Propelled by factors such as manifest destiny, the discovery of gold, and government legislation such as the Homestead Act of 1862, White settlers encroached upon land already inhabited by Native American communities and took it for themselves, disregarding the Native American way of life and, indeed, quality of life entirely in the process. While the devastation felt by Native American nations cannot be adequately conveyed through an article, I will examine the 1864 Sand Creek Massacre and the subsequent displacement of the Cheyenne and Arapaho nations. This will articulate the way in which the prejudice of White Americans fundamentally ripped apart families, nations, and cultures, as is evident in the account of Leg-in-the-Water.

The Cheyenne and Arapaho were inhabiting a small encampment in Colorado at the time of the Sand Creek Massacre, which occurred on 29 November 1864. Black Kettle and fellow chiefs had engaged in talks with White American authorities, which resulted in the Cheyenne being instructed to remain at Sand Creek. Conditions at this time were peaceful, with efforts being made to maintain this tone. However, peace did not ensue. Headed by Colonel Chivington, US troops ambushed the Cheyenne and Arapaho encampment and slaughtered around 200 Native American inhabitants. According to Captain Silas Soule, only 60 of these Indians were not women or children. An account from Robert Bent reveals that, as women were on their knees begging for mercy, “the soldiers shot them all” and that there was seemingly “an indiscriminate killing of women and children”. Innocent people were killed and their bodies maimed by these troops – they were scalped, their genitals taken as trophies. The brutality of this massacre cannot be overstated. Such horrific brutality was encouraged by Chivington in a public speech given in Denver just prior to the massacre, in which he advocated the killing and scalping of all Native Americans, not excluding women and children. There is no valid justification or defence for this massacre. Even Captain Silas Soule, one of the troops present at the massacre, deplored the acts of his fellow soldiers, deeming the massacre a betrayal of peaceful Natives and condemning the killing of women and children, who had “their brains beat out by men professing to be civilised”.

The Sand Creek massacre undoubtedly destroyed both the lives and the power of the Cheyenne and the Arapaho. Nevertheless, it did not destroy their spirit. Chivington himself admitted that the aim of the ambush was to convince the tribes to relinquish Colorado and leave quietly; instead, it led to a thirst for revenge and justice. An alliance of Cheyenne, Arapaho, and Sioux raided along the South Platte in January of 1865, attacking wagons and military outposts. They scalped the White defenders of the town of Julesburg just as the US soldiers had scalped the women and children of Sand Creek.

Image: Depiction of the Sand Creek Massacre, c.1875. Ledger drawing by Howling Wolf. Source: Wikimedia Commons.

For many Cheyenne and Arapaho warriors, the massacre did not silence them; it encouraged them to grow louder.

For the chief Black Kettle, an advocate of peace, however, an attempt at compromise was preferred. Indeed, alongside allied chiefs, he engaged in talks with representatives of the US government. Unsurprisingly, the representatives did not have the tribes’ best interests at heart. They sought to encourage Black Kettle to relinquish all rights to Colorado, so that White settlers could assert ownership of the land. In addition to this, gold had been found in Colorado, leading to a swarm of settlers headed their way. Little Raven, present at the meeting, eloquently describes the way in which this effort on the part of the US government would prove detrimental to the livelihood and culture of these nations:

“[…] other chiefs lie there; our women and children lie there. Our lodges were destroyed there, and our horses were taken from us there, and I do not feel disposed to go right off to a new country and leave them.”

Here it is painfully clear that the displacement of Native American tribes happening across the Plains was not simply a matter of moving to a new house. Native American spirituality was irrevocably tied to the land. Indeed, when examining the case of the Black Hills in South Dakota, deemed sacred by the Lakota tribe, the disregard held by the US government for the preservation of Native American culture is illuminated. To the Lakota, the Black Hills were a site where they could spiritually connect with their ancestors, where they went to receive their adult names and come of age. The hills were an integral part of the spiritual life of the Lakota. Then, when gold was discovered in the hills, the agreements and the Fort Laramie Treaty which originally protected this area from White settlers dissolved at the prospect of a gold rush. The US government’s response was one of inaction; it took no measures to prevent White settlers from breaking the treaty and encroaching upon Lakota territory in search of gold. Essentially, the wealth of White Americans was deemed more important than the lifestyle and culture of the Lakota. The same was true for the Cheyenne and Arapaho as they were instructed to leave Colorado, to leave the site that anchored them to their dead, to their fallen loved ones. Black Kettle agreed to the move, seeing that no other path to peace was to be found. And so, the Cheyenne and Arapaho were uprooted, displaced, and torn from their spiritual and ancestral roots.

I would argue that, as well as the lives lost at the Sand Creek massacre, there was another death that occurred as a result of the confinement of Native Americans to reservations – that of a way of life. As briefly discussed in the previous paragraph, Native American nations were in many ways rooted to the land on which they lived; thus the displacement of tribes eradicated this relationship with their land and aspects of their spirituality. The destruction of the way of life of the Plains Indians does not stop there. A key and frightening reality of how Native American culture was whittled away was the removal of children from their families once they reached reservations and their placement in boarding schools. This was a process which aimed to ‘kill the Indian, save the man’, according to Captain Richard H. Pratt, an army officer who headed the Carlisle Indian School. At such schools, a child would be forbidden from speaking their mother tongue and required to speak only English, facing severe punishment for disobedience. The rationale here was to assimilate these children into White American society, reducing the child’s traditional culture to memory alone. This stripping away of identity was a cultural massacre.

The diminishing presence of Native Americans on the Great Plains was no accident. At every turn, hostile individuals such as Colonel Chivington and Captain Pratt aimed to destroy Native Americans, in body and in identity. The list of such criminals is long: General Custer of Little Big Horn fame, Colonel Sheridan, and many more deserve to be discussed and condemned in detail. Guilty too was the American government, at both the federal and state level. While the government openly deplored events such as the Sand Creek massacre, it either broke or ignored the treaties it brokered with Native American chiefs, as is clear in the example of the Black Hills and the Lakota. In this way, the government facilitated the demise of Native American culture. This area of history has been understudied and largely ignored. This is a mistake. The genocide of Native Americans is woven into the fabric of US history and deserves as much attention as the clichéd study of presidential rivalries.

Edinburgh’s Infrastructure, the Spread of Disease and the Plague Outbreak of 1645

By Melissa Kane

In the early modern era, the city of Edinburgh as we know it today was largely constrained to the Old Town and the Royal Mile, stretching from the castle to Holyrood Palace. Most of the population was confined within this approximate square mile, with the Flodden Wall to the south (remnants of which can still be seen in Greyfriars Kirkyard and the Pleasance) and the Nor Loch to the north (modern-day Princes Street Gardens). However, in the sixteenth century, a growing population of 12-15,000 individuals within the limited walls resulted in two unique features that defined early modern Edinburgh: the building of its ‘skyscrapers’, and the outbreak of deadly disease within them.

Although these buildings in no way fit our modern interpretation of skyscrapers, the limits of the city forced construction to move upwards instead of outwards, growing “taller and ever taller” throughout the period. These features can be seen most clearly in areas such as the Royal Mile, where tenements could reach six stories in height, with the highest behind Parliament Close, where buildings reaching down to the Cowgate could be as many as twelve stories. These developments are illustrated in a map produced by James Gordon of Rothiemay (fig. 1), which demonstrates the narrow tenements and complex, cramped infrastructure of the city, particularly towards the west of the Royal Mile and the Castle.

However, by the seventeenth century, the population had grown yet again, with an estimated increase of around 20,000 individuals cited in the 1630s ‘housemails tax’ records. The ‘housemails tax’ was a one-off tax on house rent, the records of which allow tenement topographies to be reconstructed house by house; the population extended to around 30,000 by the 1690s in the Canongate area alone. This placed a growing strain on city housing, with multiple-occupancy houses becoming more and more common. As one resident stated, “I am not sure that you will find anywhere so many dwellings and such a multitude of people in so small space as in this city of ours”. These households often contained large families sharing single rooms in tenements and “narrow shamble of timber and thatch”, along with steep, dark and dirty staircases that opened straight onto the urban streets and closes.

This was only the beginning of the hygiene issues of the city in the sixteenth and seventeenth centuries, with one of the most pressing being the state of the public streets. Throughout the medieval and early modern period, the structure of the city did not include sewers or drainage, and instead most waste was thrown directly into the street. In some areas, such as Fleshmarket Close, where many of the butcheries were placed, the streets were designed so that any blood, fluids or remnants from slaughter ran straight into the Nor’ Loch, or down what we know today as Cockburn Street. This often gave the impression that the “streets [were] almost obliterated in dirt”, the same streets where food was prepared, children played, and the markets were held. In June of 1634, an Englishman named Sir William Brereton visited the city, whereupon he made several critical comments about the lack of hygiene in Edinburgh’s streets (although his view may indeed have stemmed from English prejudice):

“The city is placed in a dainty, healthful pure air, and doubtless were a most healthy place to live in, were not the inhabitants most sluttish, nasty and slothful people. I could never pass through the hall, but I was constrained to hold my nose […] This street, which may indeed deserve to denominate the whole city, is always full thronged with people.”

These conditions were not only a matter of Edinburgh’s sanitary state but also a reflection of its demography. Many tenants living in these conditions were among the poorest of Edinburgh’s population, based towards the Castle end of the city, which, as seen in Rothiemay’s map, is where the building structure is most cramped and overpopulated. By contrast, in the Canongate area of the city and outside of the Flodden Wall, housing was much more spaced out, as seen in the map with its large and luxurious gardens. These areas were populated by many “noble and genteel families” who could afford to live closer to the Scottish court at Holyrood and away from the built-up landscape of inner Edinburgh.

The inevitable result of these conditions was the outbreak of plague. Edinburgh, like many other European cities, had faced outbreaks of disease from the fourteenth-century Black Death until the nineteenth-century outbreak of cholera, a direct result of inadequate living conditions, which unfairly affected those at the lowest end of society who could not afford large, clean spaces.

The 1645 plague, however, was one of the most devastating outbreaks and the worst ever to hit the city, killing up to 50% of the population. Corpses were said to have littered the closes as the infected fell in their tens of thousands, and numerous burial pits were commissioned to tend to the never-ending parade of the dead. The effects of the plague lay heavy on the city, with one scholar stating that “only sixty citizens were fit to carry arms” in the event of an English invasion. It is reasonable to infer that these tragedies would have disproportionately affected those living in the unhygienic conditions to the west, where people could not afford to leave their homes; many areas, such as Mary King’s Close, were shut up and “almost altogether buried” to slow the spread of infection and prevent further exposure, trapping those left behind inside.

It was not for another 80 years after the plague of 1645 that specific directions were brought in by authorities to solve the persistent issue of living conditions in these areas, stating that:

“streets and houses be […] diligently and carefully as may be kept. The streets washed and cooled [and] brimstone burnt plentifully in any room or place.”

However, it would be over 200 years before the introduction of sanitary districts by Henry Littlejohn would truly transform the hygienic reputation of the city, changes which were only fully implemented after the cholera outbreak of the 1830s. The mistakes of the Old Town’s infrastructure did have a great impact on the design of the New Town in the 1760s: instead of high and narrow closes, the roads of the New Town are wider and much more open, to allow for movement and cleanliness. However, once again these privileges were only available to the highest and wealthiest of society, leaving those of lower station and wealth to once more succumb to the prejudices of their lower social status and become the inevitable victims of disease.

Racialised Disease: The Bubonic Plague in Honolulu, 1899-1900

By Sofia Parkinson Klimaschewski

On December 12, 1899, the Board of Health publicly announced the first bubonic plague death in Honolulu. Yon Chong, a 22-year-old Chinese bookkeeper working out of Chinatown, had fallen ill three days earlier. Suspicions arose quickly when buboes – painful swellings of the lymph nodes occurring in the latter stages of infection – began forming on his body. To elucidate the situation, the attending physician asked for a further, jointly conducted diagnosis, which quickly confirmed his original hypothesis. The plague had re-emerged in 1855 in the Chinese province of Yunnan. Slow to spread, it ultimately reached the commercial cities of Guangzhou and Hong Kong in 1894. With an expanding global economy, the disease continued to advance uncontrollably along existing trade routes. When the Nippon Maru steamer, carrying a passenger thought to have succumbed to the plague en route, reached Honolulu in May 1899, the ship was immediately ordered to quarantine. This precaution, however, failed to contain the disease, as flea-infested rats readily disembarked.

At the turn of the twentieth century, Honolulu’s Chinatown was a diverse, predominantly non-white neighbourhood housing around 10,000 residents, most of whom were Chinese, Japanese, or Native Hawaiian. As the district harboured the city’s first plague-related fatalities, a military-imposed cordon sanitaire was quickly erected around it. Residents, however, could be exempted if given daily permits by the Board of Health (BOH) after receiving a medical inspection. These were almost exclusively granted to immigrants working as servants for affluent white families. Besides quarantine, the board introduced routine house-to-house inspections led by community volunteers. These were aimed at tracking down unidentified cases and their contacts for isolation, and at locating ‘infected premises’, which were then ‘required to be disinfected with 5% sulfuric acid solution and bichloride of mercury’ whilst outhouses were destroyed and new cesspools dug. Following the discovery that multiple sick people were being hidden by kin, the corps of inspectors was expanded, and the act of neglecting ‘to give information which would result in sickness being found’, thereby obstructing ‘a health officer or an agent in the performance of his duty’, was made a misdemeanour in law.

Chinatown’s quarantine was not borne simply out of epidemiological concerns. Rather, there was a widespread belief that Asian residents’ contagion might endanger the wider, especially white settler, community. The systematic scapegoating of Asian immigrants, seen most prominently in Pacific harbour towns affected by the Third Plague Pandemic, was prevalent throughout this period: the ‘Yellow Peril’ was directly linked to the spread of infectious disease. Further sanitary measures introduced during the pandemic in Honolulu – annexed by the United States in 1898 – directly mirrored public health authorities’ depiction of Chinese immigrants as ‘filthy and diseased’ in the mainland USA. This assertion extended to further judgements in relation to afflictions such as smallpox and syphilis. On December 24, the BOH entrusted a special Commission of Three with the task of investigating the conditions in Chinatown. Their report concluded with the statement:

‘Plague lives and breeds in filth and when it got to Chinatown, it found its natural habitat’ – Dr. C. B. Wood, territory of Hawaii.

This fallacy had originated in connection with concepts of white supremacy borne out of previous colonial encounters. Some had noted that ‘white people in Asia’, mostly British expatriates living in colonial India or Hong Kong, ‘were less likely to contract plague’ when compared to locals. They attributed this to their inherent racial superiority rather than to clear lifestyle advantages, with the expatriates living within closed compounds, removed from the crowded and deprived communities outside.

The commission’s recommended steps for the sanitisation of Chinatown were adopted more widely by the newly created Citizens’ Sanitary Commission. Non-white residents from ‘infected’ areas were taken to disinfection stations and subsequently moved to the wider quarantine district. Any valuables previously on their person were taken away, never to be returned. Furthermore, all – independent of age or sex – were forcibly stripped of their clothes and belongings, to be communally fumigated and brutally inspected for signs of plague as white guards watched.

With harbours closed, Honolulu was at a complete economic standstill. As pressure mounted to re-open the city, more drastic sanitation methods were introduced by the BOH: starting in January, every home or establishment where plague was found was to be burnt down. They had deliberated that:

‘fire would destroy the plague germs, kill rats, cleanse the soil and open it up to the purifying influences of sun and air, and would prevent any occupancy of the premises until a safe period of time had elapsed.’

In the first few weeks, these intentional fires remained controlled, with readers of the Honolulu Advertiser tracking the destruction through daily updated maps of Chinatown. However, on January 20, strong winds caused flying embers to land on the wooden steeples of a local church, resulting in a massive fire which rapidly spread across the neighbourhood. As panicked citizens tried to flee, the National Guard refused to break the cordon sanitaire. When one exit was finally opened, white residents sporting makeshift weapons ensured that the victims were all placed into detention camps, tightly controlled by armed guards. Nevertheless, new cases continued to appear until late March. Isolated fires were also identified on the island of Hawai'i and in Kahului on Maui, where another Chinatown was burned down in an effort to control contagion.

The fires left Chinese and Japanese immigrants homeless and deprived of 'many of their enterprises and livelihoods'. Mismanaged and inadequate compensation schemes led by the American government, paid out only eighteen months later, had long-term socio-economic implications and resulted in the widespread, permanent dispersal of Asian immigrant populations throughout Honolulu.

Despite the association between rats and plague having already been recognised (even if clear scientific evidence was still lacking), officials remained convinced that racial minorities were spreading the 'Asiatic plague'. This fearmongering continued throughout the early twentieth century, especially in the western United States, with catastrophic consequences during the subsequent plague outbreak in San Francisco, which saw the introduction of equally ineffective public health measures. The economic and political pressure to 'act quickly' upon the outbreak of the plague, when such action was informed by racialised constructions of disease, set in motion a civic disaster with social and political consequences extending far beyond the initial tragedy.

Plague in Bombay in the Late 19th Century: When Colonial Medicine Gets Political and Politics Meets the People

By Lucy Parfitt

The plague in Bombay - part of the third plague pandemic (1855-1960) - is seldom discussed in comparison to the Black Death and later European plague outbreaks, which inspire popular images of beaked plague doctors, a dirty, chaotic, and overcast London town, and the death of around half of Europe's population. No such motifs exist in the Western imagination surrounding the third plague pandemic, or the epidemics that touched the non-western world. Yet the bubonic plague did indeed revisit Europeans, not on their own soil, but in many of the lands they had colonised. The pandemic killed 12 million people worldwide and primarily impacted China, Hong Kong, Australia, South Africa and, crucially, India, where 10 million people succumbed to the disease. Bombay particularly, where the Indian plague outbreak began, experienced a high mortality rate, a mass exodus from the city into the Indian interior, and, arguably, political chaos. Whilst this can be told as a tragic story of death, the failure of a colonial state, violent colonial intervention and the colonies once more becoming the laboratory for western science, it also provides us with an example of how Western medicine was riddled with anxiety, uncertainty and disagreement and how – when colonial science met with colonial subjects – resistance and bargaining took place.

The outbreak of plague in Bombay, hesitantly acknowledged by the municipality's commissioner Mr. P. C. H. Snow in October 1896, was initially met with a thorough urban sanitation campaign focusing on the destruction of property suspected to be infected and the wider disinfection of the city through lime-washing and the scattering of carbolic acid. Snow, a senior member of the Indian Civil Service (ICS), had also had his powers under the 1888 Municipal Act extended, which allowed him to segregate, by force if necessary, any suspected plague victims in hospitals. However, Snow was reluctant to carry out widespread forced segregation, knowing full well that such measures flouted the inhabitants' varied caste, religious, and gender customs and that there was little mutual understanding about the intentions and benefits of Western biomedicine. He had internalised a fear, stemming from the 1857 mutiny, of direct intervention into Indian affairs and of causing offence - he did not underestimate the capacity of Bombay's population for collective resistance.

Snow's fears were confirmed when, on October 29, one thousand mill-hands attacked Arthur Road Hospital in protest at segregation measures, and his own sanitation staff, who were made up of low-caste imported labourers, seemed sympathetic to this resistance. Furthermore, between October 1896 and February 1897, up to 380,000 people fled the city in panic - not necessarily from the disease itself, but from increasingly draconian colonial intervention. Snow wanted to be especially cautious and avoid alarming the city's sanitation auxiliaries, claiming that "on their presence or absence, respectively, depended the safety or ruin of this vast and important city" - he predicted that their departure from Bombay could convert the rapidly expanding port city, central to the colonial project and colonial governance, "into a vast dunghill of putrescent ordure" within a fortnight.

However, Snow quickly came under fire for his 'soft' approach to the pandemic from a committee of bacteriologists and physicians, which included a delegate from the Indian Government, Professor Haffkine. This conflict reflected both the opportunism of individual struggles for professional recognition and fundamental disagreements in theories of health occurring at the time. Worboys and Arnold have both written about a paradigm shift in scientific discourses in this period, in which theories of health transitioned from spatially confining "disease to the tropics" in miasmatic, humoral and climatic theories of health, to considering certain diseases themselves to be tropical with the rise of Germ Theory. This manifested itself in different approaches to health intervention, with the senior sections of the IMS (Indian Medical Service) and ICS, including Snow, subscribing to sanitation policies and efforts to reform personal hygiene and 'undesirable' cultural practices. In comparison, the younger bacteriologists, who dominated the lower ranks of the IMS, lobbied for racialised policies of systematic segregation, access to Indian bodies to conduct autopsies, and forced inoculation, which they saw as necessary in their pursuit of pathogens at the site of the human body itself.

Controversially, Snow was replaced in his public health duties by Brigadier General Gatacre's commission in March 1897, and the Epidemic Diseases Act was passed by Viceroy Lord Elgin; a state of near martial law was established in Bombay, where the army organised medical interventions, being given the green light to do whatever it considered necessary in dealing with the pandemic. This committee focused on policing the movements of the Indian population, intensifying hospital segregation policies, quarantining passengers on ships suspected of carrying the disease, and intercepting railway travellers. The next year of intervention can be considered an example of how public health policy contributed to the intimate and violent colonisation of Indian bodies. Private property was destroyed, health surveillance increased, caste practices and customs like purdah were disregarded, and the population of Bombay was consistently subjected to the Western medical gaze – alien for many and often imposed without consent or mutual understanding of intent. Caste, gender, and religious considerations were dismissed as merely superstitious obstacles by Gatacre and his committee – the creation of private caste hospitals being considered the financially burdensome solution to Indian resistance to segregation – constructing the Indian population as if in opposition to the cool-headed rationality of Western science and biomedicine.

In addition, the initial consensus – united in a common dislike for Snow – between the bacteriologists and Gatacre's committee was short-lived, with Arnold arguing that bacteriology was not seamlessly converted into the language and methods of administration. Gatacre believed that something more than contagion was causing transmission, suggesting that, alongside the plague bacillus, the "generally insanitary conditions of person, clothing [and] habitation" were also to blame. Professor Haffkine remained adamant that his inoculation serum was the Government of India's best bet and remained committed to demonstrating his, and bacteriology's, professional worth to the colonial project. Therefore, not only had sanitarians contributed to discourses which imagined India and Indians as the site of filth and infestation – betraying their anxiety to justify and buttress colonial rule based on racial stereotypes – but bacteriologists and the new committee in Bombay – with their underestimation of Indian resistance, agency and traditional understandings of health – positioned Indians as opposed to scientific progress, and thus 'civilisation'.

Indeed, Bombay’s population did resist – in direct and outright rioting as well as through more covert acts of non-cooperation. For example, the vernacular press published exposés about hospitals disregarding pollution-related rituals and customs for high-caste Hindus and the molestation of Muslim women. Furthermore, many Indians concealed friends and family who were showing symptoms of plague. General discontent culminated in riots in Bombay throughout 1897, further attacks on Arthur Road Hospital, and the assassination of W. C. Rand, who oversaw plague policy in the nearby city of Pune.

The establishment’s attempt to regain formal control internally – between the army, the Government of India, IMS and ICS – and externally on the streets of Bombay, detailed painstakingly in Gatacre’s extensive 1897 Report, reveals the tensions within colonial science and western biomedicine. Despite western biomedicine being premised on its universality and relationship to natural law, these conflicts – professional, personal and theoretical – illustrate how scientific institutions are very much political and social and, when put under strain, represent sites for crises of confidence. Colonial medicine itself, supposedly the symbolic embodiment of Western benevolence, ironically undermined ideological arguments for British superiority and justifications for Britain’s right and duty to colonise.

It is also important to note that in February-March 1897, the International Sanitary Conference had taken place in Venice, where the international community requested the full quarantining of ships planning to dock at Bombay. Gatacre's new plague regime was therefore supposed to signal to the international community that the British were taking the outbreak seriously; ultimately, it was an anxiety-driven performance of public health intervention and indicated an awareness of the vulnerability of Britain's rule in India in its reliance on internal order and international sanction. By 1898, the India Office in London deemed Gatacre and his committee to have overstepped, the Government of India being compelled to recognise that force was counterproductive in controlling the plague pandemic and that unrest was too risky: 1857 loomed in the imperialist imagination. Measures were liberalised, hospital segregation and inoculation becoming voluntary, with the 1900 Indian Plague Commission encouraging the consultation of indigenous leaders.

This brief episode in British colonial history reminds the historian that medicine and scientific institutions should not be exempt from historical analysis and that medicine was never founded and formalised on simply collective benevolence. Colonial medicine was limited by its own internal politics and competing theories of health, international expectations and the global economy, as well as the material and social realities it was met with in its host society. It was this vulnerable institution of Western Science and biomedicine upon which Britain largely predicated its superiority and civilisation, thus their right and duty to colonise – and it hung in a fine balance.

Early Christian Responses to the Antonine Plague as an Illustration of Distinctiveness Within the Roman World

By Alex Smith

Scholarship is divided over what can be defined as Christian in the first few centuries CE. When the terms 'Christianity' and 'Christian(s)' are used in this article, they will refer to the movement that centred around Jesus of Nazareth and the people who would have recognised each other as fellow members of the movement that became what we now call Christianity. This movement has been defined by scholars, such as Larry Hurtado, as "proto-orthodox Christianity", which is the definition that will be applied here. The term 'pagan(s)' will be used in the same way as it was used by Roman historians at the time, referring to ways of worshipping that were not Jewish or Christian.

Scholars still debate to what extent the early Christians were distinctive in their ancient Roman context. Christianity was influenced by both the Greco-Roman world and its Jewish origins, and this has been well attested to. These influences are not the focus of this article, but for an in-depth analysis of the ways in which Christianity reflected the world it was born in, Arthur Darby Nock's Conversion: The Old and the New in Religion from Alexander the Great to Augustine of Hippo and Gillian Clark's Christianity and Roman Society are excellent books. These influences meant that there were similarities between Christianity and the culture around it. However, despite these similarities, the differences meant that early Christianity stood out in the Roman Empire. Larry Hurtado wrote in his book, Destroyer of the Gods:

“In the eyes of many of that time, early Christianity was odd, bizarre, in some ways even dangerous. For one thing, it did not fit what “religion” was for people then. Indicative of this, Roman-era critics designated it as a perverse “superstition.””

Within the context of the second and third centuries CE, the early Christians had very different motivations and ways of expressing their faith to the pagans. One of the ways in which this point is illustrated is in how the early Christians and the pagans responded to the Antonine plague.

The Roman world was full of gods. There were the Greek and Roman pantheons, city gods such as Artemis of Ephesus, local gods in areas such as Phrygia, Syria and Egypt, household gods, and spirits linked to places like bridges. These religions centred around practice, sacrifice, and divination, rather than doctrine and formal instruction. They had a major public aspect in which entire communities would take part. Religion was not a separate category from anything else, as it is often thought of today, but instead was interwoven into the rest of life. Communities would take part in sacrifices, professional guilds took part in rituals to their patron deities during meetings, and the imperial system rested upon claims of divine validation. There were also many new religious movements, including the "mystery cults", examples of which include the cults of Isis and Mithras. Among these, the Christians still stood out for many reasons. Christianity was the first properly "bookish" religion and it provided people with a completely new identity. More importantly, it was a new type of religion that led to a new type of behaviour. Pagan religions were more focused on rituals and religious observances than on how individuals lived. As Hurtado puts it, 'Roman-era religion did not typically have much to say on what we might term "ethics".' Instead, teachings on behaviour were in the realm of philosophy. In this respect, early Christianity could more easily be compared to contemporary philosophy than to contemporary religions. For example, the philosopher Musonius Rufus agreed with the early Christians on matters such as sexual ethics and drunkenness.

However, when it came to how they responded to the Antonine plague, the early Christian and pagan responses were very different. A plague is defined in the Collins English Dictionary as "a very infectious disease that spreads quickly and kills large numbers of people." Rooted in their differing religious traditions, this difference in response illustrates the argument made by Hurtado and others, such as Rodney Stark, that early Christianity was distinctive in the Roman Empire. The Antonine plague struck the Roman Empire in 165 CE and the first wave lasted until c.180 CE. R.J. Littman and M.L. Littman have made a conservative estimate that between 7 and 10 million people died. This included Lucius Verus, Marcus Aurelius' co-emperor, in 169 CE and Marcus Aurelius himself in 180 CE. There is evidence to suggest that it was caused by the arrival of smallpox – until this point unheard of in the Mediterranean – brought by Verus' soldiers from the East.

Dionysius, the Bishop of Corinth, gives us one of our only eyewitness accounts of the Antonine plague in his letters preserved by the historian Eusebius. He describes the Christians as showing ‘unbounded joy and loyalty’, that they were ‘never sparing themselves and thinking only of one another’. He talks about how they cared for those who were ill and nursed them as best they could. This resulted in more people who caught the plague surviving, although some of those who cared for them sacrificed themselves so that others might live. In contrast, Dionysius said that the pagans abandoned the sick and dying and tried to get away from those who were ill. He wrote:

“At the first onset of the disease, they pushed the sufferers away and fled from their dearest, throwing them into the roads before they were dead and treated unburied corpses as dirt, hoping thereby to avert the spread and contagion of the fatal disease; but do what they might, they found it difficult to escape.”

What he wrote sounds extreme, but it matches other things we know about the Roman Empire. The famous physician Galen fled Rome during the Antonine plague, and no one saw this as unusual or disreputable in any way. Rodney Stark goes into further detail about the reliability of Dionysius' account, which there is not space to discuss here, in his book The Rise of Christianity.

There were reasons why early Christianity had a much greater social impact, and responded differently to the plague, than contemporary philosophy. Musonius and those like him focused on a few students rather than trying to change larger groups of people or society in general. In contrast, the early Christians promoted a radical change of behaviour amongst all believers from the moment of baptism. There was also a difference in motivations. The Stoics appealed to an abstract concept of an individual's dignity as a human being and the Epicureans were motivated by an idea of "untroubled calm" in their lives. The early Christians appealed to divine commands and the responsibilities that believers had to each other. Arthur Darby Nock, classicist and theologian at Harvard, wrote that 'there is no doubt that this love of brethren was altogether more lively and more far-reaching in Christianity' when comparing the early Christians and their contemporaries. Their actions were rooted in their belief in what their God had done for them and their love for each other.

It is interesting to see what Galen, the famous physician who fled Rome, said about the Christians. His main criticism was that they were too rooted in assertions of divine revelation rather than in philosophy. At the same time, he admired their virtues and, in particular, their ‘keen pursuit of justice’. He was impressed that the early Christians, the majority of whom were from lower social classes, demonstrated philosophical virtues in a way that ‘matched genuine philosophers’, even though they had not had philosophical training. Galen’s comments show us two important things. Firstly, that early Christianity was not always unique in the behaviour it advocated for but was often better at applying it. Secondly, and more importantly for this article, when Galen’s words are matched with his actions, it is clear that the virtues he celebrated did not include the group responsibilities and self-sacrifice felt by the early Christians. This is not to say that the pagans did not have any concept of self-sacrifice, only that it was different to that held by the early Christians and that it was a lower priority within Galen’s worldview.

Overall, the Antonine plague and the responses to it illustrate the distinctiveness of early Christianity within the Roman Empire of the second and third centuries CE. The difference between the early Christians and important figures such as Galen adds further weight to the arguments made by academics such as Larry Hurtado and Rodney Stark that the early Christians stood out amongst their contemporaries. This does not mean that Christians and pagans shared no common values, but rather that there were fundamental differences in some areas, and in how these beliefs and values affected how people lived and related to each other.

COVID-19 and the Disease of Systematic Racism

By Grace Smith

As the COVID-19 pandemic has developed, analysis has demonstrated the extent to which COVID-19 has disproportionately impacted black people and people of colour (POC). In the UK, for example, a government report found that 36 per cent of patients who were critically ill with COVID-19 were part of an ethnic minority, despite ethnic minorities making up just 13 per cent of the population. To understand these disturbing statistics, we must look to the foundations of modern science itself, which, in combination with systemic racism, has created a reality where racism is embedded within the disciplines of science and medicine. This has caused a number of problems, leaving ethnic minorities vulnerable to the ongoing pandemic. The relationship between people of ethnic minorities and healthcare is deeply damaged by both past and present experiences, and black people and POC continue to face health disparities because of racism within healthcare. Finally, systemic racism's impact on factors such as housing has left black people and POC in disproportionately vulnerable positions regarding their health.

The rise of modern scientific thought can be traced back to the Enlightenment of the seventeenth and eighteenth centuries, an intellectual movement across Europe in which reason began to be stressed in all aspects of thought, including science. The Enlightenment and its enduring ideas in the West were evidently Eurocentric and based on the intellectual ideas of white men, and therefore reflected the biases and prejudices of these men. Within this so-called rational approach to science as a professional discipline, there was a desire to explain the supposed racial differences observed across the world. Consequently, there was a growth in racial science from the early eighteenth century and into the nineteenth century, based on its proponents' desire for a 'categorisation' of humans by race. They believed that humans are divided into separate races as a result of the natural order of things, with some being naturally inferior because of significant genetic differences between races. In reality, modern scientific studies have shown that all humans share 99.9 per cent of their DNA, meaning race is a social construct, consistently shaped by society's social and political ideals. The racial hierarchy that these pseudoscientific ideas created unsurprisingly placed the Caucasian race as superior to all others. An example of this pseudoscience is the work of mid-nineteenth-century American anthropologist Samuel Morton, who theorised that human intelligence was connected to brain size. He measured various skulls from across the world, concluding that white people were superior because they had larger skulls than any other race. These false arguments of biological difference are significant because they have persisted for centuries and remain harmful.

The idea that all non-white people were physically and mentally inferior was turned to at a time when the reality of slavery was being threatened by abolitionist ideas. In the eighteenth and nineteenth centuries, a form of supposedly scientific justification for slavery was sought by those who wanted to uphold the practice. Consequently, black people had to be shown to be inferior, even to the extent that they were presented as an 'untamed' people who had to be enslaved for their 'own good'. Thomas Jefferson made some deeply damaging and influential contributions to racial pseudoscience, despite his claim that all men were created equal, arguing that black people's inferiority was obvious from observing his own slaves. The deeply insidious nature of the attempt to justify slavery through a manufactured concept of inferiority was identified by Frederick Douglass, a formerly enslaved person and one of the US's most prominent opponents of slavery in the nineteenth century, when he stated that "the whole argument in defence of slavery becomes utterly worthless the moment the African is proved to be equally a man with the Anglo-Saxon". White people sought to dehumanise black people through science, and this racist concept of inferiority has continued to infiltrate today's world, even if the language has changed. Historian Ibram X. Kendi argues that "what black inferiority meant has changed in every generation ... but ultimately Americans have been making the same case". The language of dehumanisation against black people and POC, viewing them as separate, as if they were 'animals' who needed to be controlled, persists today in different forms.

There are various historic examples of black people’s bodies being used for the progress of medicine and science without their consent. An illuminating example is that of J. Marion Sims, a celebrated American physician who developed the vaginal speculum in the mid-nineteenth century.

Yet Sims developed this great achievement for gynaecology with the motivation of curing vesicovaginal fistula (VVF) in enslaved black women, to ensure they would continue to produce healthy children as slaves, and through experimenting on fourteen black women he himself had enslaved. He performed surgery on them without anaesthesia because of the commonly held belief that black people experienced less pain than white people. The impact of this can be seen even today in the experiences of black women, who are five times more likely to die during childbirth than white women in the UK, as highlighted by the Five Times More campaign. A relatively recent example is the Tuskegee Syphilis Study, which ran from 1932 to 1972, in which African American men were studied to measure the effects of untreated syphilis, despite not being informed of their diagnosis or offered treatment; instead, they believed they were receiving free healthcare.

Is it any wonder, then, that the racist foundations of modern science have a continuing impact on the scientific discipline as it exists today? This legacy is also felt in experiences where medical professionals receive racism from the patients they are treating, or from fellow white colleagues within their discipline. As aforementioned, the abuse of black bodies throughout history in medicine has understandably led many black people and POC to view the healthcare system with distrust and to avoid interaction with this sphere unless absolutely necessary. The medical system remains a product of systemic racism, and this is further highlighted by the under-representation of black Americans in U.S. clinical trials, occurring partly because of their concerns about the trustworthiness of medical professionals.

Secondly, systemic racism as a whole has contributed to health disparities because of the impact it has had on factors including housing, employment opportunities, wealth and pre-existing health conditions. Public Health England's report on the impact of COVID-19 on BAME communities demonstrates how racism and social inequality play a role in the disproportionate impact of the virus. Therefore, when the current Conservative government argue in the same breath that ethnicity should not be considered a main factor in COVID's impact, and that other factors such as housing should be focused on, they are actively ignoring the fact that these other factors are necessarily interlinked with race.

In terms of solutions, the medical sphere needs to rebuild trust with black people, POC, and their communities, enact anti-racism training for all medical staff, and adopt a zero-tolerance attitude towards racism to prevent the all-too-common mistreatment people have faced. The sciences and medical fields need to be diversified, because under-representation is clear and greater representation improves the discipline as a whole by challenging ingrained biases. In another vein, education, including in the UK, needs to be widely improved to give the general public far greater awareness of the racist roots of science, as well as of the systemic racism that continues to shape our world. There is a vast range of resources white people can learn from, including Edinburgh's own RACE.ED, a network dedicated to showcasing research and teaching on matters of race and decolonial studies. Evidently, systemic racism as a whole must be dismantled, including through recognising racism's impact on wealth, housing and job opportunities. Black people and POC are disproportionately represented in lower-income and consequently often frontline jobs, factors contributing to how COVID has impacted these groups.

World disasters have always served to illuminate the problems within our societies. We must grasp this opportunity to enact real systemic change in the hope of never repeating such loss of life based on racist structures. COVID-19 has demonstrated more than ever that the vastness and seriousness of systemic racism cannot be overstressed. It is, and has always been, a matter of life and death.

The Rise and Fall of the British Union of Fascists: Anti-Semitism and International Fascism

By Lucy Thomas-Stanton

In 1932, the British Union of Fascists (BUF) was the rising star of British politics. Nazi enthusiast and Daily Mail owner Lord Rothermere had pledged his support, and the Mail ran a slew of sympathetic articles, including 1934's unapologetic "Hurrah for the Blackshirts!". Yet this 'success story' was to be short-lived: by 1935, BUF membership had already peaked at 50,000, and terminal ideological contradictions within the party were beginning to reveal themselves. The BUF had made a conscious effort to distinguish itself from the German Nazis in 1932, yet the party was increasingly viewed as a "pallid imitation of a foreign creed". Desperate for attention and financial support from Hitler and Mussolini, it slowly adopted the violent and deeply prejudicial tendencies that had already marred fascist regimes. By the mid-1930s, the ideological elements that had rendered the party so specifically 'British' had been lost in a sea of generic, passionately racist, antisemitic and (ironically) internationalist fascism.

The confused outlook of the party is hardly surprising given the history of its leader, Oswald Mosley. Mosley was a complicated figure: intelligent, politically fickle, and a man who had "many Jewish friends" in the 1920s but, by the mid-1930s, was composing lengthy antisemitic diatribes. He favoured government control of economic markets, a throwback to his days in the pro-Keynesian Labour Party. By the time he founded the BUF, he hated traditional parliamentary politics, describing Stanley Baldwin's moderate Conservative government as a "legislation of old women". Mosley, however, was first and foremost an imperialist with deep concerns about the decline of the British Empire – or at least, the decline of the empire-building spirit.

Many prominent members of the BUF had a childlike fascination with the mythology of the British Empire. Rudyard Kipling ("imperial hero" and author of The Jungle Book) and Robert Baden-Powell (founder of the Boy Scouts, who extolled the frontier ethic) were held up as fascist exemplars. Conquerors and explorers were worshipped figures; these Britons of old were deemed intrepid, masculine, and willing to risk their necks in the name of empire-building. Post-war Britain was regrettably 'soft' by comparison, with a feminine concern for democracy and a supposed reluctance to exert power and aggression abroad. The white, manly English colonialist was also defined in relation to colonised peoples. Indians, for instance, were supposedly afflicted with "native hysteria", which left them unable to govern themselves. However, the BUF was not particularly desirous of reclaiming lost elements of the British Empire, or of extending it further, but rather of utilising all means available (including violence) to maintain its current condition. On this score, the BUF differentiated itself from the expansionist goals of the Nazis and Italian fascists.

This was not the only policy on which the BUF and continental fascism differed. In the meeting that preceded the creation of the BUF, the committee decided "not to attack Jews as such", or at least "not to get itself entangled in so unnecessary a side issue as anti-Semitism in England". Mosley had argued in 1932 and 1933 that Jews should not be persecuted on the grounds that they were "born Jews". In the process, he rejected the biological basis for antisemitism espoused by the Nazis. Furthermore, an early front-page article in the BUF newspaper, The Blackshirt, stated that "Jew-baiting in every shape and form was forbidden by order in the British Union of Fascists".

By denying that they were an antisemitic party, the BUF could maintain a degree of credibility with the press. Although there were strong racist and antisemitic undertones to the party's core ideology, the BUF leadership went to some lengths to keep the antisemitism implicit. This may have been a conscious attempt to differentiate the party from the Nazis, whose violent actions were being reported by the international press in increasingly shocked tones. While decrying antisemitic violence, the BUF was able to garner significant support from middle-class England, largely via the Daily Mail. It also allowed the party to claim the moral high ground when members were insulted or attacked by Jewish people. For instance, when two Jewish men were jailed for kicking a fascist in Leicester Square, the Blackshirt asserted that "we shall know that any Jew attacking a member of the British Union of Fascists does so, not as a Jew, but as a Red".

Image: The British Union of Fascists (BUF) formed by Oswald Mosley. (London, England: Bridgeman Art Library).

This disguise was difficult to maintain, however, and soon the party's submerged antisemitism rose to the surface. At a rally in Manchester in 1934, Mosley embarked on an antisemitic tirade after his opponents attempted to drown out his speech by singing 'The Red Flag', referring to his hecklers as "sweepings of the continental ghettos, hired by Jewish financiers". The party also stoked a radical and violent form of antisemitism that was emerging in the East End. The 1934 Olympia Rally quickly descended into bloodshed and disorder, and thereafter the press strongly associated the BUF with thuggery, violence, and intolerance.

The BUF's relationship with the Daily Mail came to an end in 1934, when Lord Rothermere, a staunch admirer of Hitler, told Mosley that he could "never support any movement with an anti-Semitic bias". Shortly after, BUF membership began to decline: the increasingly violent tendencies of the party had lost them significant support from 'middle-England'. In reality, the party had always been antisemitic, albeit not always so conspicuously. Only a year after the founding of the BUF, The Blackshirt ran an article entitled "Shall Jews drag Britain to war?" in a poorly concealed attempt to blame Jewish people for Nazi crimes. This headline is indicative of how anti-war sentiment and antisemitism were intertwined in BUF rhetoric.

The anti-war position taken by the BUF may seem unexpected; after all, the party regularly capitalised on patriotism and nationalism, and thought mediation and appeasement were weak, effeminate tools on the international stage. It also serves as a contrast to the war-hungry Nazis and Italian Fascists. Yet, opposition to war with Germany was essential to Mosley's belief in the Fascist World Peace, whereby the four pillars of European fascism (Italy, Germany, France and Britain) would engage in international cooperation to bring an end to war between fascist nations. This ideological stance was a money-earner for the BUF, which received significant funding from the Italian and German governments. By funding the BUF, Italy and Germany sought to buy political influence in Britain; and they appear to have succeeded. As well as remaining resolutely opposed to war with Germany, the BUF increasingly embraced the divisive, prejudiced, and violent tactics utilised by the Nazis. By 1936, the party had devised a policy of active antisemitism.

In the end, it was Mosley's 'international fascism' that spelt the downfall of the BUF. The BUF had represented a 'different kind of fascism' in 1932: the party was introspective, with detailed economic policies. By the mid-1930s, it had merged into a support group for European fascism, complete with Nazi jackboots and Italian fascist uniforms. With Germany increasingly viewed as a clear enemy of British interests, this look was fast falling out of favour with the public. After the outbreak of World War II, the British government examined BUF funding with a critical eye for treachery, and, as a result, forced the party to disband.

Post-Pandemic England: ‘Golden Age’ or ‘Grey Age’ for Women in the Late Medieval Period

By Sophie Whitehead

When we attempt to create the dreaded 'new normal', it is important to look back to pandemics of the past to see how we can strive to create a more harmonious, egalitarian society in the wake of pandemic and tragedy. A pandemic, much like a war, calls for a country to be rebuilt - and this was no different for the Black Death of 1348. The number of deaths is disputed; however, estimates for England suggest that as many as 45 per cent of the population died. This drop in population would, unsurprisingly, result in a long recovery time. Indeed, the records from the 1377 poll tax reveal that the population of England was just over half of its pre-pandemic level. This population deficit was societally problematic, most notably through its role in the great labour deficit, as casualties of the Black Death were evenly spread across the generations. Historians including E. Thorold Rogers argue that this labour deficit led to the 150 years after the Black Death serving as a Golden Age for female labour, noting that women's wages doubled in post-Black Death society.

This picture is complicated by pioneering historian J. Bennett's observation that, although 'there ha[d] been much change in women's lives', it is contested whether there was any transformation of 'women's status in relation to men'. More recent interpretations, like Bennett's, have argued that the concept of a Golden Age for women is more of an unsubstantiated illusion than a reality, with historian S. Bardsley stating that 'English women experienced the post-plague period as "more grey than gold"'. So, to what extent is it fair to see the post-pandemic society as one striving towards equality, and what can we learn from the social changes that occurred? In order to assess whether this equality is illusion or reality, it is necessary first to understand that the female experience of social mobility varied with social status within the patriarchal framework of the 14th and 15th centuries.

It is unsurprising that the agency and power felt by women depended upon their socio-economic position. However, which group possessed the most agency within these structures is contested. Some historians have argued that the improvement in the position of women was felt most strongly within the rural working classes. In the case of the reapers and workers employed for 'Autumn Work' in the East Riding of Yorkshire in the years 1363 to 1364, female harvesters earned on average 98 per cent of their male counterparts' wages: a negligible level of wage inequality in the grand scheme of things. Similarly, female reapers and binders in Minchinhampton in Gloucestershire in 1380 were earning 4d a day, the same rate as their male counterparts. One explanation for this perceived equality between men and women is that factors other than sex, notably age and disability, also affected wage disparities. This is true in the 1331-2 pre-Black Death case of Ebury Manor, in which the male workers earned half the wage of female workers. However, as Bardsley explains, where women were earning the same as, if not more than, men:

"They were overlapping with male laborers at the bottom end of the wage scale, who were probably boys, old men, or men with disabilities."

This perceived equality was more 'confined by patriarchal structures than it was changed by demographic structures'. Women still worked within largely female-dominated, lower-paid spheres. Confined within these structures, they were unable to experience the level of equality and agency that historians have previously emphasised.

An undisputed outcome of the Black Death was the rise of urbanisation and the subsequently increased value placed upon craftsmen's and craftswomen's work. Nowhere are these outcomes better exemplified than in London and York, where a new social group began to emerge: the metropolitan workers. Their class position was in some cases so distinct from that of rural workers that their experience must be assessed separately. In her seminal text on medieval women, historian E. Ennen argues that urbanisation tended to benefit women. Whilst her work focuses upon women in Germany, the same can be said for women in England too. Arguably the most famous laywoman of medieval England is not a real person but a fictional character: Chaucer's Wife of Bath. The Wife, like other women in the post-plague period, reaped the rewards of a more monied, more consumerist society, herself working as a cloth-maker. Did real women access the same level of financial and professional freedom as the Wife? In short – yes.

From 1350 to 1400 the number of female apprenticeships soared: of the 30 surviving apprenticeship records from medieval London, about a third related to women. However, this increased freedom and professional mobility mainly applied to widows. From 1465, the widow of a London citizen would be made a citizen in her own right, so long as she never remarried. The same disparity between married women and widows is further evident within the crafts. Notable examples of this principle are Alice Holford, who took on her husband's profession of bailiff upon his death in 1433, and Ellen Laingsworth, who, after her husband's death in 1488, trained three female apprentices of her own. Historian C. Barron argues that 'the Golden Age was golden only briefly and was most apparent in the economic capital, London.' However, the Golden Age was arguably further constrained: it was golden only for women who were no longer 'confined by the patriarchal structure', in this case of marriage.

Much like the metropolitan craftswomen, women within the nobility were also often 'confined within patriarchal structures'; however, unlike craftswomen and rural women, they could set their own precedents and define their own rules far more easily than those at the mercy of legislation, landowners and, in some cases, husbands. Noblewomen were able not only to exercise agency within political structures but also to dismantle them. Lady Margaret Beaufort (LMB) passed her own act of Parliament to make herself the only woman able to hold property in her own right whilst still married and to act as a justice. Although women of the period such as Cecily of York did experience similar levels of power and social gravitas, with claimants writing to her for pardons, few women would be able to access the same levels of power and prestige as LMB. Bennett argues that the periodisation of male history and female history are distinct, and indeed, it is possible that the periodisation of noblewomen's history should fall into yet another category. Noblewomen's history was more affected by individual ambition, as seen through LMB, and by international politics than by a 'demographic crisis'. Whilst it is true that post-plague England was a Golden Age for noblewomen, the periodisation does not apply to them in the same way as to laywomen: their golden period did not begin with the Black Death, nor, for that matter, end in 1500; it was instead a constant.

Whilst some women did experience greater levels of equality during the post-pandemic period, they were, for the most part, in a minority, and the post-pandemic period was one of continued prejudice rather than new possibilities. In contemporary society, we are constantly seeking signs that tomorrow will be better; that there will be more racial and gender equality; that there will be less inequality in wages; and that this war on the virus will unify us as a nation and as a world. With this sense of hopefulness for the future, it would be tempting to see pandemics of the past as having similar trajectories. However, this is not the case. Bennett warns historians that 'our preference for history as transformation might limit our ways of seeing past lives', and this warning is even more poignant in the present day, when we look increasingly to the past for comfort. Unfortunately, the conclusion to this essay cannot provide that comfort: the patriarchal system did not transform in the late medieval period. However, the study of women's labour after the Black Death does demonstrate that, had the desire for a more egalitarian society existed amongst the ruling elites, the Black Death and the labour deficit would have facilitated mass societal transformation. What this study teaches us is that there is a way for pandemics to lead to an end to prejudice, if the people with power want that prejudice to end.

The Self-Slaying Epidemic: Historical Mental Illness and the Case for Psychohistory

By Jess Womack

‘Make sure you look after your mental health’ is a phrase we hear all too regularly in 2020. Demand for services provided by charities such as Young Minds has soared due to isolation and new anxieties about the future, while a study published by The Lancet Psychiatry journal in November suggests that one in five COVID-19 survivors will develop a mental illness. Meanwhile, the resurgence of the Black Lives Matter movement has brought increased attention to both the everyday and intergenerational trauma experienced by members of BME communities. Whether in lockdown or in taking part in protests, we have been encouraged to look after ourselves, seek help when needed, and develop mechanisms to deal with the psychological stress of mass change to our lives.

When historians of the future come to study this period, they will have to develop ways to account for new psychological trends in order to gain any real understanding of what happened. The same could be said for any period or event, and this is something that has gone largely unaccounted for in historical practice. Historians must learn how to examine the mental as well as the physical health of populations across history, embrace a more scientific understanding of social trends, and lean into the under-developed field of psychohistory.

Unsurprisingly, military history is the most developed area in this field. This is partly because the effects of war were so central to the development of scientific understanding of mental health. Post-Traumatic Stress Disorder was officially recognised by the American Psychiatric Association in the 1970s, in large part due to the legacy of the Vietnam War. However, the failure of history to engage with matters of mental health has resulted in a tendency since the 1970s to view the Vietnam veteran as uniquely affected and worthy of historical study. Eric Dean, for example, writes that they are perceived as 'alone in history in allegedly being under-appreciated, troubled, rejected, and blamed for war'.

The last decade has seen increased attention to the psychological effects of the American Civil War from historians including Eric Dean, David Silkenat, and Reid Mitchell. Silkenat in particular highlights the 'self-slaying epidemic': a perceived spike in suicide rates in post-war North Carolina in the context of rapid social transformation. Although it is hard to determine exact numbers, there was certainly an increase in newspapers reporting deaths by suicide, and the victims appear to have been predominantly white men. There are several possible explanations for this, the most obvious being the traumatic experiences of the war. An estimated 97 per cent of white men of military age served in some capacity during the conflict, often described as the 'first modern war in history'. Others have argued for a process known as the 'contagious suicide phenomenon', in which a personal connection to victims, or even hearing about suicides remotely, encourages others to take their own lives. Oral history traditions suggest that this was certainly a perception at the time, with one North Carolinian in the 1890s stating that 'Suicides never come singularly. One is always followed by another'. The increased reporting of suicide cases in local papers thus potentially contributed to the 'self-slaying epidemic' as well as providing evidence for it.

It is also interesting that increased suicides were largely coded as a white phenomenon at this time. It is possible to explain this as deaths being under-reported in African American newspapers or dismissed by the white population, and it is certainly true that black veterans suffered similar trauma to their white counterparts. However, Silkenat's study of African American sources reveals strikingly few references to suicide, suggesting that 'many African Americans believed that suicide primarily affected whites and was rare in their community'. Instead, we can look to the context of the post-war South. There was genuine hope for economic and political advancement among African Americans in the Reconstruction period. Violence and significant inequalities still remained, of course, but Jim Crow legislation was not yet established, and thousands born into slavery were experiencing freedom for the first time. Additionally, community and kinship networks among black populations were far more significant than among whites. For all the horrors of separation under slavery, or perhaps because of them, social ties were viewed as highly important and were openly celebrated. These ties were also strengthened by community organisations and networks for mutual aid, education, and religious celebration. Of course, these are only suggestions, and greater attention to psychohistory is needed in this, and all other areas, to better understand people and communities of the past.

There is an unsurprising argument that modern psychology is fundamentally unsuited to studying the past. Many facets of mental illness are specifically tied to society and context, and contemporary scientific models are formed for their own societies. For example, the experience of those living with psychosis was altered greatly by technological advancements like radio and television. As such, there is a danger of overanalysing historical mental illness through a contemporary lens, and misrepresenting history as a result. However, while this deserves acknowledgement and careful attention, I argue that looking at mental health and psychological trends within different contexts will enhance both our understanding of that period, and of the ways in which people respond to different situations.

2020 has made us all deeply aware that we are living through history. And perhaps, as we try to come to terms with the changes to our lives, we can learn from that history. It is, ultimately, our relationships with other people that will see us through this.

How Unique Was the 2020 US Presidential Election?: Prejudice, Pandemics and US Politics

By Lucy Cowie

The 2020 Presidential race was every bit as divisive and corrosive to American politics as was expected, but two issues were undoubtedly at the top of everyone's minds: persistent prejudice and the Coronavirus pandemic. During the summer, these two issues peaked in influence and tipped the election in Biden's favour. The killing of George Floyd, an unarmed black man, in the summer sparked a series of protests across the United States and beyond. Meanwhile, the Coronavirus pandemic has killed more Americans than five of the country's major conflicts combined: the First World War and the wars that the US waged in Korea, Vietnam, Afghanistan, and Iraq. We know that history repeats itself, and these two issues are no exception. Racism is not an issue of the past, and its role in US election cycles continues to be incredibly important and worthy of analysis. The pandemic has reminded the modern world that, despite technological advancement, globalisation and the wonders of modern medicine, we remain vulnerable to new strains of disease.

The Black Lives Matter movement that took hold of all our attention in the summer reminded us of the deep-rooted racism in American society. Racist rhetoric has been used as a tool throughout US election history and, for the past 50 years, has been aligned more with the Republican party than with the Democrats. America is known as a melting pot, and an election held over 50 years ago demonstrates that it is necessary to cater beyond white Americans: race was a defining feature of the resounding defeat of Republican candidate Barry Goldwater in the 1964 election. Goldwater suffered one of the biggest losses in US history, in part due to his failure to denounce right-wing extremism such as the John Birch Society. David Farber has written on race politics and Goldwater's defeat, and he suggests that the candidate knew 'the politics of racism' and chose to align himself with racists. Democratic candidate Lyndon B. Johnson was following JFK's legacy as the 'civil rights president', and building a coalition which included African Americans, particularly in the North. Of the six states Goldwater won in 1964, five were in the Deep South, and the other was his home state of Arizona, which he won by half a percentage point. Civil rights were a defining feature of the 1964 campaign, and Goldwater's appeal to a purely white base was a huge failure. This reflects the beginning of the strong alignment between white southerners and the Republican party, but also marks the beginning of African Americans having significant voting influence. On average across different surveys, 90 per cent of black voters sided with Biden in the 2020 election. Even though African American votes for Trump increased in 2020 compared to 2016, Trump has never had much success with African American voters. This became a major issue for Trump when, in the 2020 election, African American voters showed incredibly high turnout after debates over black voter suppression had raged prior to election day. In a nation where the white voter majority is shrinking, Trump's racist rhetoric and failure to condemn white supremacy limited his electoral success. In 1964, white southerners were not enough to create a successful coalition to win the White House. In 2020, Trump tried to appeal to more minorities prior to election day, namely socialism-fearing Latinos in Florida; but this was still not enough to prevent the overwhelming support Black and Latino voters gave to Biden. The elections of 1964 and 2020 demonstrate that populism and appealing to a racist base do not hold electoral success for Republicans, and the protest movements over the summer
