Manchester Historian Issue 36 - Ideology and Faith


Ideology and Faith

Issue 36

May 2020



Contents

Editor's Note

State Shinto and Nationalism in Meiji Japan 3

James Baldwin's Existential America 4

The History of Eco-Socialism 5

Mary Wollstonecraft and Early Feminism 6

bell hooks and Intersectionality 7

Nietzsche, Modernity and Progress 8

Mao Zedong and the Sinification of Marxism 9

Islamic Art 10

Holocaust Trauma and Israeli Identity 11

History of Utopian Societies 12

Interview with Katherine Clements 13

Legalist Ideology in China 14

Native American Environmentalism 15

Zionism 16

Manifest Destiny - US expansionism 17

Acid House and Thatcherism 18

Jihadism 19

Whitechapel Victims 20

The Wilmington Massacre 21

The Racialised War on Drugs 22

Dietrich Bonhoeffer 23

Review of The Cheese and the Worms

Editor's Note

Welcome to Issue 36 of the Manchester Historian! Ideologies, creeds, and belief systems have always played a key role in human history, and have defined how we see the world around us, and our place in it. Much of the twentieth century has until quite recently been perceived as a battleground between the competing forces of capitalism, fascism, and communism. With the fall of the fascist regimes in Italy and Germany in 1943 and 1945, and of most communist regimes in the Eastern bloc by 1991, the political scientist Francis Fukuyama proclaimed the end of history with the victory of capitalism. This is part of a long tradition in historiography, in which the lenses of ideology are used to provide a framework for understanding history. From such monoliths as capitalism and communism, feminism and race, to more modern approaches such as our relationship with our environment, ideologies have fundamentally transformed the world and the way we live.

This is more true than ever in a world defined by COVID-19. Many ideas which had seemed radical only a few months ago, such as Universal Basic Income, rent strikes, and the semi-deliberate collapse of airline companies around the world, have come under serious consideration. The upending of how our societies have been organised until now has removed many of these lenses through which we have understood the world. Events like pandemics and revolutions might be catalysts for ideological change, but belief systems have always governed human behaviour and have developed dramatically over time. This is what we hope to explore in the articles in this issue.

In that spirit, in this issue we have articles on Japanese nationalism (p.3), the history of feminism (p.6 and 7), and on Baldwin's existentialism and Nietzsche's nihilism (p.4 and 8). Two articles explore Israeli identity and Zionism (p.11 and 16), and two articles focus on environmentalism (p.5 and 15). The ideology of the Chinese state features in two articles on different periods (p.9 and 14), and other articles cover topics such as jihadism (p.19), US expansionism (p.17), and the history of the idea of utopia (p.12). Finally, do make sure to read our interview with Katherine Clements, an author of historical fiction and writing coach at the University (p.13).


Many thanks to: Francesca Young Kaufman; the University of Manchester History Department; and the University of Manchester Graphics Support Workshop.




State Shinto and Nationalism in Meiji Japan

Lafcadio Hearn, a travel writer who lived in Japan from 1890 until his death, like many other Western travellers remarked on the curious spectacles of a Japanese culture and tradition which had been cut off from the outside world for over 200 years. Embedded in his account, somewhat unknowingly, is a narrative of intense social and religious upheaval. Indeed, Hearn wrote extensively about the Shinto and Buddhist practices of the Japanese population where he lived, on the island of Kyūshū, writing: "for in this most antique province of Japan all Buddhists and Shintoists likewise utter the Shinto prayer: Harai tamai kiyome tamai to kami imi tami", which, loosely translated from Hearn's romanised transcription, means "The distant gods, smile [upon us] we pray; drive out [evil], we pray; cleanse us, we pray". And while this may depict a harmonious relationship between the two religions, it belies the conflict and reform of faith in Meiji-period (1868-1912) Japan. These dynamics of faith were carefully designed to support the rise of militant nationalism, which would come to a head on the global stage of world war in the mid-twentieth-century Shōwa period (1926-89).

After the Meiji Restoration of 1868, in which the Shogunate was overthrown and the role of the Emperor, or Tennō, was restored, there was a reorganisation of Japanese society. A powerful tool used by the Meiji government was the reassertion of Shinto rites and beliefs, especially those concerned with the divine status of the Imperial family and Emperor. Buddhist temples were targeted in a movement historically called haibutsu kishaku, in which nearly eighteen thousand temples were destroyed as a symbol of a wider transition that aimed to separate Shinto and Buddhist divinities completely: shinbutsu bunri.

However, one thing to consider when analysing this subject is the Japanese understanding of religion. Shinto was fundamentally a belief system that informed the Japanese worldview and allowed its followers to comprehend the realities of life (e.g. natural disasters, sickness, and the divine rule of the Imperial family). The Meiji government implemented laws allowing for religious freedom, but created a loophole of faux-secularisation whereby Shinto was converted into an ideology rather than something comparable to a religion. Hence the adoption of Murakami Shigeyoshi's theory of "State Shinto". Murakami argues that the term also implies that the Japanese populace were active members of this nationalistic idea, 'supporting and rallying' the cause, rather than undergoing the top-down process of ideological assimilation that can be witnessed in other nations' histories.

The overall success of Shinto is contested by Fukuzawa Yukichi, a famous enlightened thinker of the Meiji period renowned for his views on Japan's modernisation and educational practices, who wrote about his suspicions of Shinto in the modern world. He claimed that 'Shinto was always a puppet of Buddhism' (1883), and thus that it used Buddhist principles instead of having its own coherent set of beliefs. The Great Promulgation Campaign (1870-84) sought to overcome this issue by establishing a new modern doctrine, and under it the "imperial edict on the promulgation of the Great Teachings" sent missionaries across the country to proselytise Japan with a state-sponsored comprehension of what Shinto actually was.

Another obstacle the Meiji government had to contend with in solidifying a modern definition of Shinto was the power of local shrines and their relationship with local governance. During the restorative era of government and into the first decade of the twentieth century, the state recognised that centralisation of religious institutions was necessary for the adoption of State Shinto, and began to invest significantly in Shinto shrines as 'vehicles for national initiatives'. Consequently, the government gained further-reaching control that could overcome the historically fragmented and localised nature of the country.

Additionally, as with any ideological movement, education was an institution that had to be captured, and it signposted ideological change in the morals and ethics of the population. The Meiji period witnessed significant educational reform which placed Shinto, and thus the Emperor, at the heart of Japanese life. National Learning (Kokugaku) was a school of thought that exemplified the departure from all religions in favour of strengthening Shinto principles and mythos, championed by enlightened thinkers such as Fukuzawa. Furthermore, from the 1880s onwards, school events and trips to Shinto shrines, rituals, and festivals calling back to ancient Japanese traditions were organised for the purpose of revering the Emperor. The Imperial Rescript on Education was also implemented from 1890 and sent to every school in Japan, accompanied by a portrait of Emperor Meiji. The image we can conjure of a Japanese classroom post-1890 parallels those seen in authoritarian states during the twentieth century such as the Soviet Union, Nazi Germany, and Communist China. Combined with his divine provenance, the Emperor had become a god in the eyes of the Japanese, and this is highlighted in classroom practice.

These educational and religious reforms nurtured nationalism from an early age and normalised the reverent role of the Emperor within Japanese society - children now grew up with an intense awareness of their moral duty to the Emperor and, by proxy, the state. This process developed a new "modern" Japanese national identity characterised by an ideological shift towards radical, militaristic nationalism. It would later encompass Japan's national image during the Pacific War, leaving long-term echoes of Japan's modern history on the global stage. For example, even today the phrase "Tennō heika banzai" evokes images of Japanese soldiers running towards the enemy on a suicide mission, completely hypnotised by the rhetoric of the Emperor's divine provenance asserted during the Meiji period. As expressed by Helen Hardacre, the history of State Shinto is an uneasy concept to approach in contemporary Japan as it attempts to re-contextualise national values in a modern world.

Emma Donington Kiey

Illustration of Grand Festival at Yasukuni Shrine by Shinohara Kiyōki, 1895




James Baldwin’s Existential America

James Baldwin was born in Harlem in 1924, raised by his mother Berdis Jones and stepfather Reverend David Baldwin. Baldwin's relationship with his stepfather would shape his understanding of life as an African-American in the twentieth century, their biological and emotional distance offering the space to question his own identity and to observe the conditions suffered in American society as epitomised by his stepfather.

David Baldwin would not hide the fact that James was illegitimate, often bullying him for his looks and calling him ugly, but this became a useful metaphor for Baldwin, who would describe himself as a 'bastard of the West.' The metaphor of illegitimacy would lead Baldwin to the conclusion that there was a shared struggle amongst white America and African-Americans, an endemic crisis of identity. From this position, Baldwin captured the essence of what W.E.B. Du Bois referred to as 'twoness': 'One ever feels his two-ness,—an American, a Negro; two souls, two thoughts, two unreconciled strivings; two warring ideals in one dark body, whose dogged strength alone keeps it from being torn asunder.'

James Baldwin, 1948

For Baldwin, it may well have been four-ness; as a black, illegitimate, queer American he was especially isolated from the society he found himself in. Insisting that he was 'not merely a Negro writer', Baldwin was stifled by what he saw as essentialist and holistic views that limited his existence and perpetuated oppression in American society. The fractured nature of Baldwin's identity transposed into his work, as he developed an approach to America's racial conflict founded on a radical existential phenomenology, encouraging his readers to escape and transcend structures of oppression and ideas that encouraged mauvaise foi (bad faith), and to embolden the autonomous lived experiences of African Americans.

Religion was a principal focus for Baldwin, and a central obstacle to black liberation. Baldwin's involvement in the Black Pentecostal Church as a young preacher is a central theme of his work. Based on his early experiences, Baldwin offered a critique which highlighted the misguided and restrictive elements of the church: the bloodless theatre and ritualistic illusion which severed the African American experience from reality and instilled an inauthentic existence. Baldwin linked these church practices to an inherent form of black self-hatred, inspired by the curse of Ham, son of Noah. Organised religion was merely 'a mask for hatred and self-hatred and despair.' Baldwin also criticised the more radical growth of religion. The Nation of Islam, perceived by many as a radical departure from conservative and repressive Christianity, was criticised by Baldwin too. In his view, the binary fallacy of Elijah Muhammad's position, which espoused the white man as devil, only enforced a rigid and racially exclusive dichotomy of black versus white existence. Baldwin believed that to enforce this dichotomy was to perpetuate an essentialist idea of race that was built in order to oppress. In his critique of the Nation, Baldwin stipulates that the 'negro' was created by white people, built solely to subjugate. Therefore to define themselves in relation to 'the white man' was to preserve the oppressive dichotomy, tantamount to remaining within that original definition of a 'negro.' In the same sense, the definition 'negro' has served to inform and embolden the white man in America as the 'negro's' diametric opponent, creating an almost symbiotic relationship between the two concepts, reduced solely to an obsession with 'race' or colour. To escape from these limited definitions, it is important to focus on the phenomenological experience of Black people rather than on abstract and essentialist definitions of colour.

Leaving the church, and America, Baldwin began a cosmopolitan expatriate lifestyle that would last for the rest of his life. His formative years had exposed him to the peculiar relationship between Christianity and the African American; then, in his European travels, Baldwin was exposed to the peculiarity of 'race.' In 'The Discovery of What It Means to Be an American', Baldwin most explicitly shows his existential approach toward defining his identity. He states how he sought to 'find out in what way the specialness of my experience could be made to connect me with other people instead of dividing me from them', and describes what is essentially an existential crisis during his convalescence in Switzerland: 'I began to try to re-create the life that I had first known as a child and from which I had spent so many years in flight.' Here Baldwin turned towards phenomenology, drawing upon his experiences to understand who he had become. For Baldwin, this was the most significant political paradigm for dealing with what he would describe as 'the White problem.' By drawing a line under the paradigm of 'race', Baldwin was beginning to clarify that the White American had created this dichotomy to define his essence as a 'negro' - but, as Sartre says, existence precedes essence. Through this theory, Baldwin's definition of 'the White problem' re-evaluated perceptions of race with significant political implications. By developing the philosophical notion that race is an arbitrary concept, originally developed to benefit white supremacy, Baldwin saw philosophical 'color-blindness' as the best way not only to combat racism within America, but also to overcome the shortcomings of apparent developments in race relations. Baldwin's fractured queer identity and complex relationship with American society placed him in a unique position to assess the problem of race in America. Throughout his work, Baldwin offers a framework for liberation which emboldens the lived experience of African Americans, while simultaneously exposing the deep-set roots of structural oppression in America.

Wilf Kenning



The Origins and Rise of Eco-Socialism

Merging aspects of different ideologies is not uncommon in the history of the world. Political leaders have always twisted, blended, and combined many ideas to impose their own schemes on their citizens. This is often done through the revamping of Marxist thought into some new variant of socialism. Perhaps one of the most significant of these variants is eco-socialism, whose relevance and novelty pose the biggest threat to mainstream political ideologies. In terms of the metaphorical political colour spectrum, eco-socialism blends the green with the red – not to produce a murky brown, but rather a refreshing and cutting-edge ideology combining two prevalent schools of thought. Essentially, its main premise is that the expanding capitalist system is the sole cause of environmental damage to the world and that, in order to save the planet, we must rid ourselves of it once and for all – to ensure 'system change, not climate change'. This mirrors socialism's ingrained condemnation of the destructive capitalist system, but equally represents the 'green' political view that heavily prioritises the preservation of Mother Earth above all other social justice or economic issues. And in the modern world, where catastrophic weather events and rising heat levels are fiercely escalating, the green-and-red concoction that eco-socialism provides is becoming ever more attractive.

But when did this all begin? The ecological movement reached its climax in a completely different century to socialism. The establishment of International Mother Earth Day in 2009, the worldwide climate strikes that took place throughout 2019, and a general increase in environmental awareness worldwide are all twenty-first-century phenomena. This is years after the prime of socialism, which historians say peaked between the mid-nineteenth and early twentieth centuries. The true origins of eco-socialism lie somewhere in the middle of all this, but some historians give credit to the father of socialism, Karl Marx, for sowing the seeds of eco-socialism long before. Despite being focused primarily on revolution and seizing the means of production, Marx did point out the "metabolic rift" between man and nature, and discussed how society should take care of the planet for future generations, as elucidated in Das Kapital, Vol. 3. And much like modern eco-socialists, he blamed worsening environmental degradation entirely on capitalism, the exploitative system that ruins human lives and nature. Marx thus shared the belief that capitalism must be dismantled and replaced with a system of common ownership of the means of production. The so-called 'origins of the origins' therefore demonstrate how traditional Marxist thought contributed to the rise of eco-socialism in the late twentieth century, namely the 1970s.

But before we move on to the most pivotal decade in eco-socialist and environmentalist history, significant credit should be given to another so-called "early prophet" of eco-socialism: William Morris. Morris was a British anti-imperialist, revolutionary, and socialist of the latter part of the nineteenth century. His distinctive and once unparalleled socialist outlook plays a central part in the origins of eco-socialism, and some historians argue that he helped to construct the ideology altogether. Morris' view was ahead of his time: even in the 1880s and 1890s, he consistently acknowledged how damaging the impact of industrialised capitalism was on the environment, long before the recognition of the present ecological crisis. His 1884 lecture, 'Art and Socialism', shone a new light on this impact and ominously warned that we would eventually be "choked by filth" because of the destructive capitalist system and its ravaging of nature. His bleak attitudes mimic the rhetoric of today, over one hundred years later, showing just how radical his thinking was – radical, but indeed right. Thus, William Morris was an important figure in the early history of eco-socialism, as was Karl Marx himself.

William Morris, an "early prophet" of eco-socialism

But the rise of eco-socialism, as distinct from its origins, was defined by the environmental boom of the 1970s. The formation in that decade of both the Environmental Protection Agency in the United States and Greenpeace represents a new surge in the strength of ecological engagement, due to a growing concern about the health of the environment. Yet most socialist movements of the twentieth century, such as Soviet communism and the Labour movement in the West, had largely overlooked these newfound issues surrounding environmental health. Thus, the red and green 'blend' only originated in the 1970s, with the term 'eco-socialism' itself coined in the following decade - notably used in the key 1980 pamphlet 'Eco-socialism in a Nutshell'. This became an important and central work in the history of the ideology, one that provided an alternative to the 'doomster' image that the new environmentalism had generated, and subsequently popularised the school of thought. From here, scholars began to treat eco-socialism as a serious political theory; in 2001, Joel Kovel and Michael Löwy wrote "An Ecosocialist Manifesto", and the following year Kovel published the renowned book "The Enemy of Nature: The End of Capitalism or the End of the World?" – a seemingly self-explanatory work, but one nonetheless advocating the transformation of the entire capitalist system so that the human race could survive the threat of climate change. Since then, eco-socialism has continued to ascend, gaining more and more popularity worldwide. That brings us to the final question: what is the future of eco-socialism? What will happen, and when – is an ecological revolution even likely? There are many questions like this to be asked, but generally speaking, its remarkable history, clear message and promising progress over the past fifty years suggest it is an ideology not to be dismissed.

Emily Hunt



Mary Wollstonecraft and Early Feminism

Mary Wollstonecraft is one of the most famous people you've never heard of. Her presence in the school curriculum is minimal; her only appearance comes as a contextual note for 'Frankenstein' – the popular book written by Mary Shelley, her daughter. Born in 1759 in Spitalfields, London, Wollstonecraft's early life was complicated by an erratic family. Her father was a violent drunk, who was said to have been abusive towards her mother. He constantly moved the family around England in pursuit of entrepreneurial success, once notoriously trying to establish himself as a farmer in Epping. This erratic behaviour had an economic impact: the sizeable fortune inherited by the Wollstonecraft family was steadily squandered, reducing their status and rank.

As a result, Wollstonecraft's brother was the only one of the seven siblings to receive a formal education. Mary Wollstonecraft had received only a few days of proper schooling during a short stint in Yorkshire; just enough time for her to learn how to read and write. However, it is important to note that Wollstonecraft's education, albeit largely informal, was beyond that of many other women of her age. She had an extensive knowledge of Shakespeare and Milton through her own love of reading, and by the time she was a teenager she was set to enter a respectable profession. The modern definition of feminism – the ideology of equality between men and women – was something Wollstonecraft seemed, at least in part, to embody her entire life. Her earliest feminist tract was 'Thoughts on the Education of Daughters', published in 1786 by the radical Joseph Johnson. The tract promoted Wollstonecraft's idea that women's oppression stemmed from a poor education system. Joseph Johnson was also instrumental in aiding Wollstonecraft's move into the male-dominated political sphere. Whilst political activity was incompatible with 18th-century ideals of women as elegant and passive, female literary authors were ultimately able to exhibit some creativity. Wollstonecraft explored her identity through her works of fiction. In her book 'Mary: A Fiction' (1788) she explored the obstacles faced by women who were self-made and orphaned. This drew from her own experiences with the death of her mother in 1782, and the ever-absent nature of her father. Perhaps her ability to infiltrate the male-dominated world of politics owed something to this sense of undetermined identity. Her work writing for the 'Analytical Review' gave her a platform through which she could contribute to the literary genre and expand her knowledge, without appearing to be acting outside of her sex. In 1790, 'A Vindication of the Rights of Men' was published; Wollstonecraft's famous rebuttal of Edmund Burke's negative analysis of the French Revolution was the first of many responses, a privilege gained through her position at the 'Analytical Review'. Evidently, Wollstonecraft was able to use her literary skill to access the highest levels of academic society, something not previously seen. Writing over 200 articles for the review proved her strength as a writer, irrespective of her sex.

Mary Wollstonecraft - portrait by John Opie, 1797

Wollstonecraft published her most overtly feminist work, 'A Vindication of the Rights of Woman', in 1792. Immediately, the tract was very popular, selling out three times over. By further emphasising the importance of education in ensuring equality for women, Wollstonecraft directly opposed popular philosophers of the day such as Jean-Jacques Rousseau, who believed that women should be educated for the 'pleasure of men'. Perhaps Wollstonecraft took inspiration from the jumbled education she had received, concluding that women were not incapable of reason, but were simply disadvantaged by a society which prioritised literate and privileged males. Wollstonecraft's tract seemed radically progressive in the late 18th century, and was rapidly translated into both German and French, whilst also becoming a hit across the Atlantic. Contemporary reactions to the book varied: esteemed male authors such as Horace Walpole condemned the book, as did female literary figures such as Hannah More. More, now viewed by some scholars as a 'conservative feminist', responded to the book by arguing that women were in fact the animal most 'indebted to subordination'. Clearly, the idea of gender equality was still far from being accepted.

Wollstonecraft's branch of feminism may seem somewhat unimpressive to a modern audience. Amongst the pragmatic and eloquent phrases lies a darker, almost misogynistic undertone. She continually condemns her sex, calling women 'weak beings' who more often than not are found to be 'irrational, indolent and superstitious'. For this reason, accepting Wollstonecraft as the 'Founder of Feminism' is problematic. Wollstonecraft, through her fortune and privilege, was able to access the male-dominated political sphere, usually through her close connections, unlike the vast majority of the women she criticised. It is interesting that Wollstonecraft's legacy suffered so much in the period following her death; instead of being remembered as a pioneer of equal education, she was hastily condemned by contemporary audiences as someone who had transgressed gender norms. The publication of her husband's Memoirs in 1798 did nothing to improve her image posthumously, instead portraying her as irreligious and erratic. We should be cautious in awarding Wollstonecraft the title of 'Founder of Feminism'; indeed, feminism should be regarded as a movement towards gender equality that certainly predates Wollstonecraft. However, it is true that European feminists have continually been inspired by Wollstonecraft's penetration of the male political sphere. We will never know whether Wollstonecraft's writings were intended to draw attention to women's social oppression, or were solely a product of her philosophical education. However, Wollstonecraft's early emphasis on a fundamentally equal education system is an unequivocally important part of feminist history, and more generally of the history of women's struggle against oppression.

Natasha Parsons



Intersectionality in Western Feminism

First wave feminism, which was and is viewed as pivotal in the fight for women's rights for giving around 8.4 million women the vote, only claimed the vote for two in every five women in the UK; similarly, in the US, the Nineteenth Amendment of 1920 brought the vote only for white women. First wave feminism therefore largely ignored social cross-sections by focusing almost exclusively on middle-class white women.

The second wave feminism of the 1960s and 1970s brought a fight against systematic social sexism in the West, including that which was rooted in the anti-racist and anti-capitalist civil rights movements. However, women of colour were largely alienated from the central, mainstream platforms of the movement. It was this movement that spurred the writing of bell hooks, born in segregated Kentucky, who published her first book Ain't I a Woman: Black Women and Feminism in 1981. She was one of the earliest voices within the second wave to critique the racism in the feminist movement and the sexism in the civil rights movement. hooks sought to incorporate the differences between women into feminist practices, claiming that 'people can be fully aware of one form of domination and then be completely blind to other forms', and effectively paving the way for intersectional feminist thought and the third wave of feminism. It was the discussion popularised by hooks that led to the coining of the term 'intersectionality' by Kimberlé Crenshaw in 1989 to address the marginalisation of black women within feminist and anti-racist dialogues. Crenshaw defines intersectionality as a framework for understanding the interconnected nature of social and political identities and the way in which these create interdependent systems of discrimination, a 'many layered blanket of oppression' – this, like the work of hooks, stemmed from a lack of acknowledged diversity within Western feminist movements. Both hooks and Crenshaw continue to write on intersectional feminism, with Crenshaw acknowledging that scholars and activists have 'broadened intersectionality to engage a range of issues, social identities, power dynamics, legal and political systems and discursive structures'. The scholars' work has been pivotal in discussions of power, exclusion and diversity; but what does this mean for the feminism of now and for the feminism of the future?

bell hooks, one of the early proponents of intersectional feminism

It is often argued we are in a fourth wave of feminism, characterised by its digital nature. However, this wave is not homogenous, just as previous waves have not been homogenous and just as women are not homogenous. Abrahams claims that 'as the target has moved from legal parity to real social equality, debates about what justice for women means and how to achieve it have become ever more difficult to unpick', and consequently Steiner's claim that 'we cannot say there is only one feminism' becomes apparent; feminism is splintered.

Zimmerman argues that fourth wave feminism is deeply entrenched in the values of intersectionality and, while this is evident online, for example in the hashtag #solidarityisforwhitewomen, it could be seen as exclusionary to focus predominantly on the fourth wave as the current form of feminism because it is primarily online. Perhaps instead we should focus on the 'multiple feminisms' named by Sizemore-Barber, acknowledging that there are interconnected movements in the 21st century. Indeed, hooks asserted that 'we [cannot] see gains for feminism distinct and separate from other struggles' and that 'we have to look at things more globally'. Therefore, true intersectionality within current Western feminism is linked more to movements that bring to light the coexistence of social identities as creating layers of discrimination, such as the #WhyWeCantWait campaign, than to a focus on women as a homogenous group. Currently, intersectionality needs to be an 'international movement within and across disciplines', always with a new direction for concern, in order to bring inaudible voices into earshot and 'invisible bodies into view'. The goal of intersectionality is not exclusively to understand relations of power but to bring these dynamics forward in order to reshape them.

Over 100 years since the acquisition of the vote, Manchester, as 'the suffragette city', could be seen as largely inclusive and aware of intersectionality. Indeed, in 2011, one third of the population of Greater Manchester was non-white, with a higher percentage of LGBT+ people than the English national average and a near equal proportion of men and women. However, a lack of awareness surrounding intersectionality continues to pervade this diverse city, as can be seen in hate crime legislation. While Greater Manchester Police acknowledge hate crime categories such as 'disability, race, religion, sexual orientation, transgender identity and alternative sub-cultures', they make little reference to how these intersect, and entirely omit gender and misogyny from these categories. It is therefore clear that intersectional feminism needs to continue to permeate Western ideas of disadvantage and discrimination. TIME argues that the 'core of intersectionality then…is coming to appreciate that all women do not share the same levels of discrimination just because they are women'. Historically, then, intersectionality has been hugely symbolic in a movement towards the acknowledgement of the different components of power within Western society, and it remains symbolic due to its never-exhausted nature. In our discussions of the past and our actions of the present it is necessary that we are aware of the role that intersectionality plays, in order to promote true social and political equity.

Hannah Baldwin



“God is Dead!”

There are few bigger questions than that of the meaning of life. Why do we exist, possess aspiration, and abide by certain ethics? For centuries, the answers to these questions were provided by something many now regard as simplistic and irrational: religion. In 1882, this orthodoxy was challenged by a new philosophical movement symbolised by Friedrich Nietzsche's exclamation, 'God is Dead!'. This article will explore the foundation of Existential Nihilism in a historical framework. It will argue that Marxism, Capitalism and Modernisation led to the erosion of historically accepted values, principally religious determinism, which in turn led to a crisis in morality. Nietzsche began to formulate an answer to this problem, and his work was then developed by Jean-Paul Sartre.

Nietzsche's (often-misinterpreted) dramatic hypothesis was an expression of a fundamentally moral argument which characterised public debate for the following decades. A central concern for thinkers at the time was whether or not a society could peacefully operate without Christian morals. For Nietzsche, in line with Hobbes and Locke, the significance of religion did not derive from its virtues and spiritual teachings, but rather from its role in consoling hearts and minds in a period in which the government was powerless, or otherwise failed, to alleviate physical and mental suffering. The rise of Nihilism, popularised in Ivan Turgenev's Fathers and Sons, further influenced Nietzsche's thinking. Nietzsche asserted that Christianity was life-denying and placed a heavy burden of guilt upon individuals who sought levels of perfection they could not possibly reach. At the same time, Nietzsche seems to be paradoxically arguing for the value of religion in holding society together. However, these two positions are not as paradoxical as they may seem. It is true to say that Nietzsche saw the value of religion in holding society together, yet central to his argument is the idea that the bonds provided by religion had unjust grounds. Nietzsche's view developed from religious ideas such as evil, which distorts human behaviour; this derives from an overarching Kantian view of morality in which morals are not naturally possessed by individuals. However, Nietzsche saw himself as a moralist: in rejecting traditional forms of morality, he was creating an ethical system which was in the process of overcoming morality and its societal value. Consequently, much of his thought was dedicated to finding true moral values. This attempt to find a remedy or replacement for the declining force of religion and its moral binding of society allowed the nature and content of morality to become a defining debate across western societies.

Portrait of Nietzsche in a Melancholic Pose, 1882, by Gustav Schultze

The decline of religion and the ensuing moral debate was and remains prevalent across the western world. In the forty years following the turn of the nineteenth century, the number of registered Christians in Britain dropped by 12%. However, even in 1851, 40% of the population still regularly attended church; according to the latest census, only 722,000 do today. Moreover, this is nothing compared to the crisis in religion seen in states such as Russia, a traditionally more spiritually orientated society. Russia has produced some of the greatest writers of any society, yet even the most ardently religious, such as Leo Tolstoy and Fyodor Dostoyevsky, found themselves writing about a quest for morality within society. In both Anna Karenina and Resurrection, the main protagonists, Levin and Nekhlyudov, attack traditional social values. There appears to be a paradox here: Orthodox writers, in an ardently religious society, writing about a quest for morality, something one would assume God provided to them. Perhaps, then, Nietzsche's diagnosis of a crisis in morality resonated in countries that remained heavily theist, as well as in places with rising levels of atheism. Having said that, it is striking that by 1941, only 500 Orthodox churches remained across Soviet territory. Yet larger historical and political forces must be involved to provoke such a profound change in religious observance and moral and spiritual understanding. Vital to understanding this is the fact that the emergence of mass capitalism created a new form of mass morality. Max Weber argues that capitalism led to the bureaucratisation of society; that is to say, new questions were asked of governing institutions which religion could no longer answer. Karl Marx's damning insight that religion is the 'opium of the masses' has clear resonance here. The emergence of a radically new system, which transformed and shaped the everyday life of citizens, slowly began to replace religion in defining the purpose of daily existence. Capitalism involved itself in individuals' lives to such an extent that a new form of societal morality was required.

If we accept Nietzsche's diagnosis - the view that Capitalism and Modernisation created a new foundation of society, which in turn eroded the role of religion - we arrive at Jean-Paul Sartre to provide an existential analysis. For Sartre, Marx's theory resonated: the 'ideology of existence' was merely an alienated form of the deeper social and historical reality provided by Marx's dialectical approach. However, Sartre did not fully accept Marx's writing and viewed aspects of his work as historically specific. Central to Sartre's work is the notion of life possessing no abstract meaning. Accepting Nietzsche's diagnosis, Sartre then argued that there was no moral solution: 'existence precedes essence'. That is to say, there can be no formal account of what it means to live, as life can only be given meaning through existing itself. The dominance of Existentialist thought in philosophy today must be traced back to the work of Nietzsche and the development provided by Sartre. Moreover, these ideas would not have such resonance were it not for the dramatic historical changes resulting from the development of capitalism. The growing acceptance of the idea that there is no abstract meaning to life should be viewed in conjunction with the mass decline of religion and the emergence of capitalism, which held the lives of individuals in a vice-like grip. Viewing morality and religion in historical perspective leads us to the conclusion that perhaps Nietzsche's dramatic hypothesis is not as dramatic as we might first have assumed.

Oscar Tapper



Mao and the Sinification of Marxism

Mao was deeply Marxist in his convictions, but he heavily sinified Marx's theory, applying it to the Chinese situation and adapting it from its European context. Mao 'sinified' his own political actions within the framework of the Marxist theory of scientific inquiry called dialectical materialism. Insofar as this can be interpreted as sinification, it represents a crucial characteristic of dialectical materialism, one that arises out of its emphasis on the authority of reality.

In his book Karl Marx's Theory of Ideas, John Torrance calls Marx a 'scientific realist', someone with the belief that 'if observation is to yield new truths it must be guided by scientific theory'. This conviction demands an inductive method of law derivation, whereby new opinions are formed from real-life observations, which are then checked against existing scientific ideas. Should new observations not integrate into the framework of existing theory, the theory has to be changed in accordance with the new observations. Marxist materialism not only ensures the accuracy of one's observations, but also the accuracy of the laws against which they are tested, with empirical reality being the ultimate authority of truth. From this follows the fact that alterations of doctrine, be they scientific or political in nature, are not only deeply embedded in Marxist theory but, due to the importance placed on the correspondence between theory and reality, even demanded. When Marx applied his dialectical materialism to historical development, he identified the productive powers as the most fundamental driver of advancement, which is inevitable as 'implied by the very nature of human productive activity'. How it occurs will not be discussed here; it suffices to note that it moves in stages driven by the development of the productive powers, following which a corresponding moral, political and social superstructure is formed. A move from one stage to the next is characterised by a change in the formation of this superstructure, but never caused by it. In a society with capitalist powers of production, capitalism's inherent characteristics create the potential for a socialist revolution. However, Marx affirms that 'history is not a closed process, in which the foreordained has only to be acted out'. Furthermore, while the causal relationship between productive powers and the arising superstructure is a general derivation holding true for universal situations, 'in each social formation, more specific laws govern the precise nature of this general derivation', granting some variability of connection and interdependence between the two. Within this lie both the utility of and the need to 'sinify': by adapting his policies to fit the Chinese situation, Mao could align theory with practice, satisfying his materialist convictions and, in doing so, becoming a better Marxist leader whose actions were more effective at overcoming the 'uncertainty of history' and ushering in a socialist utopia.

Chinese poster stressing the roots of Mao Zedong Thought

The above analysis aims to show that 'sinification' is compatible with Marxism. Let us now examine Mao's take on this. In On Practice, Mao reflects on dialectical materialism. For him, it starts with perception, the process of experiencing the world and observing phenomena. After the initial observation, it is of the utmost importance to make sense of the experiences by putting them in order and collecting further evidence. To put observations in order, one needs to test them against existing theories. Should existing theories not coincide with new observations, one needs to return to the ultimate testing ground of reality, where one 'draw[s] his lessons, correct[s] his ideas to make them correspond to the laws of the external world', he argues. Just like Marx, Mao gave reality ultimate authority concerning scientific inquiry, justifying this with the conviction that 'all genuine knowledge originates in direct experience'. This conviction concerned not only the derivation of theory, but also its purpose. Theory is of no use if it does not make 'the leap from rational knowledge to revolutionary practice' and, in doing so, succeed in changing reality according to the theory's aims. He maintains that during this revolutionary practice, the effectiveness of existing theory in achieving desired outcomes should never be taken for granted but should remain under constant scrutiny, subject to change if one finds that it no longer addresses the characteristics of one's particular reality.

Mao Zedong, 1963

What follows from this analysis is that Mao and Marx both valued reality's authority above all else, out of which came Mao's need to adapt his theory to it, to 'sinify' it according to Chinese reality. While accepting the universal truth of Marxist ideology, his dialectical materialist nature prohibited him from blindly accepting all of it in the formulation of his own policies. This allowed him to create a form of Marxism that was true to itself, while still possessing 'specific national characteristics and acquiring a definite national form.' Ultimately, sinification is a specific term coined for a phenomenon fundamental to Marxist ideology. In a Marxist framework, it describes nothing less than the process of scientific inquiry led by dialectical materialism producing deviations from general theory due to the unique nature of different realities. It is deeply Marxist in essence and would occur in any situation where Marxist ideology is applied to a national context.

Matthaeus Laml



Iconoclasts and Iconophiles

Representation and Rejection of the Divine in Islamic Art

On May 28th 1453, when the Byzantine emperor Constantine XI entered the "Church of the Divine Wisdom", Constantinople was under siege. Perhaps the emperor knelt to pray before the Apse Mosaic of the Virgin and Child, looking up at the gloriously gilded icon of one of Christianity's most famous images – a young mother sitting on a throne, holding upon her lap the saviour of mankind. What would this mortal man at the feet of the almighty have felt? Perhaps it was hope, the relief of salvation in Christ; or maybe it was fear of what the fate of the Eastern Orthodox Church would be if the Muslim Ottomans were to storm the ancient city. One can only speculate what the final Roman emperor felt. Yet he must have been touched when kneeling at the feet of this beautiful Byzantine icon. That is the power of religious imagery: its ability to evoke an array of emotions, to touch the soul and mind. For the iconophile, the lover of sacred imagery, religious imagery serves to illuminate the beauties of God's creation. For the iconoclast, however, depicting the divine is an act of idolatry and sin, since no earthly materials or artists can adequately depict the divine. Iconoclastic disputes have run deeply within both the Christian and Islamic faiths for centuries. As the youngest faith within the Abrahamic fold, Islam and its artistic movements were influenced by exterior contexts. Yet the Muslim attitude towards religious imagery mostly stemmed from the teachings and practice of the prophet Muhammad. In 630 CE, when the prophet entered Mecca, he expelled all pagan idols from the Kaaba.

“There were three hundred and sixty idols around the Ka’ba. He began to thrust them with the stick that was in his hand saying: “Truth has come and falsehood has vanished.”

[Hadith, quoting Quran 17:81]

This act of iconoclasm birthed, for many historians and art critics, the conventional line within Islam of a firm rejection of any form of idols (known as aniconism). The traditional Western historiographical interpretation of Islam's approach to idols in religious art focuses on Islam as a primarily iconoclastic religion which aggressively pushes the removal of idols. Yet this elides the distinction between different types of cultural practices within Islam. This article serves to briefly survey such pluralities in the approach to art within Islam.

Islamic art is a rich tapestry of representations of the divine. The conventional line tends to reject the use of idols (any depiction of sentient beings) based on the teachings of the prophet laid out in the Hadith – the traditions of the words and deeds of the prophet Muhammad.

“He who creates pictures in this world will be ordered to breathe life into them on the Day of Judgment, but he will be unable to do so.”

Hadith, Sahih Muslim (818-875)

The purpose of the prohibition was initially to avoid idolatry. Yet the Quran provides no specific guidelines for the use of images, and iconoclastic practice was never uniform. The conventional prohibition has been interpreted in various ways. Consequently, Islamic art has typically been characterised by extensive use of calligraphy and of geometric and abstract floral patterns in its move away from figurative painting.

Entrance iwan of Shah Abbasi Mosque, Isfahan, Iran

Muslim artists used geometrical shapes and calligraphy to make repeated patterns as a form of decorative art. Geometric patterns in Islamic art and architecture are regarded as a manifestation of divine and rational thought. In the Islamic worldview, the sanctity of mathematics has been more apparent in art than in many other fields. It is in art that the substance of the divine may be found: sacred spaces are created with the aid of geometry and arithmetic, in which the complete presence of God is reflected. Mimar Sinan, one of the greatest architects and engineers of the Ottoman Empire, adhered to these core values. His masterpiece, the Süleymaniye Mosque, is a grand example of geometric synchronisation. For example, the minbar, the platform used by the imam to deliver sermons, is adorned by delicate patterns that run along both sides, originating from eightfold stars in the middle.

The removal of idolatrous images, however, did not end an interest in figurative art within Islam. When looking at the splendid buildings and palaces of the Umayyad caliphs, one can see an abundance of images decorated in the style of Christian Late Antiquity. Moorish caliphs similarly utilised paintings, figurative stone reliefs and sculptures in the adornment of buildings. It is important to note, however, that these were rarely used for the purpose of worship. Still, it is in Persianate manuscript depictions that the religious worship of icons is made more complex. The private medium of Persian and other miniature book illustrations is a small yet rich source of rare depictions of the prophet Muhammad. Depictions have been found ranging across medieval Persian, Timurid, Safavid and Ottoman manuscripts. These images complicate the traditional view that Muslim societies pursued a strict iconoclastic stance. Made for both Sunni and Shia worshippers, manuscripts discovered from the 13th century show almost every episode of Muhammad's life as recounted in the Quran and other texts. These images laid the foundation for a minor tradition of devotional images which exists to this day, from icons adorning homes to a five-storey government-commissioned mural in the heart of Tehran, and even to revolutionary street art in Cairo – although the prophet's face is obscured in both of those public artworks. Just as Constantinople straddles East and West, so too the Hagia Sophia connects the Islamic world with the Christian. When Constantinople fell to Sultan Mehmed II of the Ottomans on 29th May 1453, Mehmed immediately went to the Hagia Sophia. He bent to pick up a handful of earth and poured the soil over his turban as an act of humility before God. The basilica of Hagia Sophia became the mosque of Aya Sofya, yet the beautiful Byzantine icons were not destroyed. Instead, ornate Islamic calligraphy and geometric art were added to the walls, beside the Christian art. Today, if you go to Istanbul and visit the Hagia Sophia, you can still see a splendid coexistence of Christian and Islamic art, united in the purpose of worshipping God.

Piotr Kardynal



Trauma and Israeli Identity

The state of Israel was founded in 1948 to provide a home for a people who had not only suffered centuries of oppression, but had just survived one of the worst genocides in history. It is no surprise, then, that at the core of Israeli identity is an unresolved sense of trauma - in particular, Holocaust-induced trauma. Israeli identity has formed around two, at times contradictory, responses to this trauma. On the one hand, having witnessed the unimaginable horrors of the genocide, first-generation Israelis felt an innate duty to uphold the highest moral standards. On the other hand, having faced a meticulous and dedicated plan to eradicate all Jewish people from the face of the earth, Israel maintains an understandable desire to survive at any cost. Not surprisingly, these two principles frequently clash as Israel attempts to uphold Western human rights standards while simultaneously defending itself from multiple hostile neighbours. This article looks in more depth at the latter aspect of this potentially unsolvable dichotomy: the sometimes irrationally manifested desire to avoid what many have tried, and luckily failed, to do for centuries.

As thousands of traumatised Jewish people migrated to Israel in the wake of the Holocaust, they were greeted by a new state intent on building a modern and prosperous society safe for its people. The Zionist vision for Israel focused on working the land and re-constructing Jewish identity around a new hyper-masculinity. The orientation to a reformed masculinity, generally focused on ‘Sabras’ (Jews born in Israel), reflected a desire to overcome centuries of medical stigmatisation about the weakness of the Jewish male body as well as the self-perceived failure to protect Jews from the Nazis. The problem with this blind commitment to a homogenous vision was that the traumatised voices of those who had directly survived Nazi oppression were often muted. The traumatised refugees were forced to either conform or become outcasts. In many ways, trauma came to represent a reminder of a perceived weakness that Israelis, particularly Sabras, were keen to forget. What is particularly interesting was the utilisation of cultural tools such as cinema (an example being Tomorrow is Another Day, 1948) to help formulate the image of the ‘New Jew’.

New York Jews take part in a ‘Celebrate Israel’ parade

An obsession with masculinity, combined with the subduing of the traumatised voice, has meant that subsequent generations of Israelis have at times interpreted their parents' trauma in particular ways. Within Israeli politics and the Zionist mythology of the modern Israeli state, there is an emphasis on rectifying past wrongs and an acceptance of violence as necessary. While this was initially targeted at Nazi perpetrators (as was the case with the capture and trial of Adolf Eichmann in 1960), more hard-line Zionists have targeted anyone who might, in theory, pose a threat to the state. While it is important to recognise the immense threat Israelis still face, one should also recognise the questionable ways in which narratives of Israeli identity, victimhood, and purpose have operated in the geopolitical arena. Of noticeable concern has been the historic treatment of Palestinians. I am sympathetic to both sides in the Israeli-Palestinian conflict and have no intention of asserting the moral legitimacy of one side over the other. That said, it is hard to deny that at times Israeli actions have presented a serious contradiction. Of significant importance has been the complex blurring of the boundaries between victim and victimiser status. Put more bluntly, there lies a concerning hypocrisy between Israel's justifiable claim to victimhood and its at times morally contentious treatment of opposition. One event that instantly comes to mind is the 1982 Sabra and Shatila massacre, in which Israeli soldiers not only allowed but potentially aided - by lighting flares - Phalange soldiers in massacring Palestinian refugees in Beirut.


The contradictions in this particular atrocity were captured in the tactful animated documentary Waltz with Bashir. By shifting through multiple states of being - memory, hallucination, and real life - in the form of animation, the film follows the director Ari Folman's confrontation with his personal role in the massacre. The key message of the film lies in the tensions between Folman's contradictory roles as both the descendant of a Holocaust survivor and an accomplice in the violence. His awareness of the victim-victimiser paradox is made even more powerful when his friend Sivan tells him, 'you see yourself in the Nazis who persecuted your parents'. With a keen eye for nuance, the film leaves the final judgement up to the audience, as Sivan reminds Folman that he could not have known the massacre was taking place. Films like Waltz with Bashir present glimmers of hope that Israeli society is becoming increasingly self-aware of the contradictions in its trauma-related legacy. Israel's history throws up interesting questions about trauma and its relationship with identity and politics. National narratives are shaped by traumatic events, and these can filter down generations, influencing politics and decision-making for years to come. Recent national traumas, including 9/11, indicate that we have learnt little from history. As we strive to resolve conflicts in places like Syria, Afghanistan, Iraq, Pakistan, Sudan, Somalia, Nigeria, Mexico, and Colombia, we must be prepared to implement the necessary peacetime measures to help people process what has occurred and thereby mitigate the long-term implications of their trauma. This can only be achieved with a nuanced understanding of history and an acceptance of the labyrinthine moral dilemmas of the international arena.

Frankie Vetch



Utopian Theories of Society

Ideals of a ‘utopia’ occur frequently throughout recorded history, with traces even seen in a variety of myths and legends. Famous figures have suggested many ways of reaching a utopian society, and yet there is no real consensus on what this perfect world looks like, except, perhaps, for a common focus on mass happiness and prosperity. Even this is contested - are not all opposing political figures, parties, and campaigns arguably presenting ideologies of striving towards a utopian society? Otherwise, after all, what do we have to believe in? Or, from a more depressing perspective, are all these utopian ideologies merely temporary knee-jerk reactions to ever more frequent instances of dystopian reality? Utopian theorising peaked in the 18th and 19th centuries, and many of its thinkers, surprisingly, promoted societies free of governance or the state. Robert Owen, the British mill owner, criticised organised religion and rejected all forms of authority. He vehemently argued for self-sufficient work communities, linked by unions, in which the workers benefited from the profits and were provided with education and social protection from childhood. His New Lanark Mill ensured maximum 8-hour days, youth education, reduced illegitimacy and limited alcohol, and formed the basis of co-operative shops and nursery schools - all of which sounds, to many of us, literally utopian - yet the experiment failed economically. Interestingly, union power has since the 20th century decreased to almost negligibility, and the state overwhelmingly controls (and increasingly withholds) welfare.

Owen’s vision of a utopian town included workers’ housing

Henri de Saint-Simon advocated for the rights of the ‘working class’ as a guarantee of efficiency, a category he expanded to include all who work for society, such as bankers and scientists. He suggested that utopia required a meritocracy, whilst calling for the reduction of military and hereditary society, challenging Church doctrine, and attacking the ‘idleness’ of the ‘underclass’ as a moral fault. These proposals, however, suggest a society arguably not that different from an all-powerful state: rearranged, but still retaining hierarchy, and therefore a suffering underclass. Charles Fourier, an advocate not of self-sufficiency but of humanity’s natural cooperation as the secret to social success, based utopia upon utilising our natural passions and allowing them to dictate to us - prominently our innate passion for work - creating a personalised, attractive labouring society. He vehemently attacked the very existence of civilisation as the cause of all inequality and unhappiness. Indeed, he even created the questionable idea of social indexes of personality types for choosing casual sex partners! Finally, following this tide of influential thinkers was Henry George, an American political economist, who proposed that a single tax on land, alongside anti-monopoly reforms, would solve all problems of inequality by slowing the exorbitant accumulation of wealth by landowners via rents. He also envisioned an unconditional basic income drawn from the land surplus, as a universal right, which - judging by the huge debate even over minimum wages, and the opposition to welfare schemes - appears years away from our current society. Can these figures, therefore, really be seen as utopian revolutionaries in hindsight? All their philosophies, despite their appeal, have either failed when applied, or did not address issues of inequality to a ‘utopian’ extent. All of them, equally, had an underlying deistic approach, and ‘utopias’ have therefore often been based on the concept of a ‘divine social order’. Since industrialisation, however, societies have become more and more secular, demonstrating the decline of spiritual certainty in ‘something better’. Equally, despite these thinkers’ abolitionist stances towards organised religion, the Churches in England and America still hold a large share of national wealth and are primary investors in oil firms destroying the planet, while ISIS has declared a borderless Islamic State over all Muslims globally.


Similarly, opposition to governments appears ironic when, in today’s society, the most similar political ideology (democratic socialism) calls for state intervention to reach an equal society, whilst the concept of higher taxation on property has had decreasing public traction recently, as seen in election results. Furthermore, in historical perspective these were all time-specific, reactionary ideals rather than long-lasting social foundations, or even debates. Owen, for example, acted in response to the increasing exploitation of workers during the Industrial Revolution, including child labour, which we have since both adjusted to and legislated for within our capitalist system. Similarly, Saint-Simon was inspired by the acceleration of science and technology, advocating that they replace organised religion - which has indeed occurred, but without utopian principles; arguably, technology has contributed to social decline. Finally, Fourier, whose ideas were popularly invoked in the 1848 Revolutions, clearly represents tensions within certain regimes, whilst Henry George’s late 19th century work responded to the mass movement into American cities (specifically New York), and the consequent mass inequality visible in smaller spaces - one of the greatest phenomena of industrialisation, and now our typical geopolitical background. Notwithstanding the prominence of these utopian thinkers, who inspired countless political and cultural movements and philosophers, viewing them through society today places a down-heartening lens on utopian possibilities. With similar political concepts suffering across the globe, and our lives increasingly ruled by banks and corporations rather than governments and welfare benefactors, the recent fascination with dystopian literature and film perhaps best illustrates the gradual death of utopia in a modernised world. We have no famed equivalent spokespeople in our society, perhaps emphasising the theorised ‘end of history’ and the new, unknown era we are entering. Nonetheless, important concepts of happiness through sufficiency and equality can still be drawn from these thinkers, and should, I believe, be applied timelessly to humanity.

Connie Lane



Interview with Katherine Clements

Katherine was born in Lancashire and studied Ancient History and Archaeology at Manchester University. Her debut novel, The Crimson Ribbon, was published in 2014 and her third novel, The Coffin Path, was nominated for the HWA Gold Crown Award and The Guardian's Not the Booker Prize. Katherine was editor of Historia, the online magazine of the Historical Writers’ Association, and recently led the development and launch of the UK’s first A-Level qualification in Creative Writing. She is a Royal Literary Fund Fellow currently based at the University of Manchester’s School of Arts, Languages and Cultures.

Katherine Clements, author and writing coach

The Manchester Historian: What do you think is the connection between historical fiction and history as an academic study?

Katherine Clements: I think the clue is in the word: hiSTORY. Human beings create stories to make sense of the world and our place in it. In simple terms, part of a historian’s job is to use evidence and context to create historical narratives. No matter what the origin – primary sources, archaeological finds, legal documents etc. – a historian makes choices about which evidence to use and how to contextualise it to best communicate its meaning. This is exactly what a fiction writer does. Historical fiction writers and historians alike, we are all creating stories, aiming to better understand the people of the past.

TMH: Can historical fiction make history more accessible to people?

KC: Absolutely. My own interest in history began with historical fiction, and I’ve been told countless times that my work has sparked an interest in a certain period or subject for readers. Those who would never read a non-fiction history book will pick up a novel or watch a historical film – just look at this year’s Oscar nominations for best film, several of which are historical. Aside from our schooldays, fiction is most people’s window into history.

It’s human stories that connect with modern audiences. The emotional resonance with characters is what draws us in, regardless of time or place. Fiction writers are free to imagine our way into places that historians cannot, especially in the realm of emotion. As Hilary Mantel has said, fiction ‘can sit alongside the work of historians – not offering an alternative truth, or even a supplementary truth – but offering insight’.

TMH: Do you think that writing historical fiction limits you creatively? Or alternatively, does having a historical framework inspire you and enhance your writing?

KC: It does both! When dealing with real events or people, you’re sometimes constrained by the historical record, but I’ve found it can work the opposite way too. History can provide the ‘hooks in the wall’ from which to hang a story. For example, my second novel, The Silvered Heart, is based on the legend of the female highway thief ‘The Wicked Lady’. This folklore is most often associated with a real person – Lady Katherine Ferrers, who lived during the English Civil Wars. I researched the lives of Katherine and her prominent family and pieced together a story that fitted both the facts and the legend. It’s total fabrication, but based on a historical framework, which was a lot of fun to figure out. I do believe novelists have a responsibility to be respectful of the historical record, and to be clear when facts have been altered, but ultimately the story has to come first.

TMH: Do you think historical fiction can play a role in giving a voice to marginalised groups throughout history? For instance, The Crimson Ribbon focuses on the narratives of women in a very male-dominated period of history.

KC: Attempting to give a voice to marginalised people can be a major motivator. I set out to tell stories that I wasn’t reading in the history books. Kings and queens have been done to death – I think there is a hunger now for stories of ordinary people, and minority voices are a big part of that. There has been such progress in social history over the past few decades that such stories are easier to trace and easier to imagine, but they also contain a lot of blank spaces – that’s gold dust for a novelist.

TMH: How do you go about researching for a historical novel?

KC: I read as widely as possible. I start with secondary sources to get a sense of the big picture and establish the areas I need to focus on. After that it gets more specialised. I’ve visited plenty of archives and rare book collections. It’s important to me to visit the places I write about, if possible – I always find fascinating details that enrich the book. The best bit is what I call ‘method research’: visiting the archives at the Royal Armouries to handle 17th century pistols, because I needed to know what that felt like; spending a weekend birthing lambs at a North Yorkshire hill farm for The Coffin Path; a long, hot kayak trip through the Louisiana swamplands for my next book, set in the early days of New Orleans. Writing research has taken me to some unexpected places.

TMH: Do you have any advice for history students? Or for students who are interested in pursuing creative writing?

KC: Read. Read a lot. And don’t be afraid of writing terrible first drafts. Most of writing is actually editing.

TMH: As the Royal Literary Fund Writing Fellow, can you describe your role at the university? What writing services do you offer to students?

KC: The Royal Literary Fund places writers in universities around the country, to help students (and staff) with academic writing skills. I’m available for one-to-one appointments, during which we’ll discuss whatever aspect of writing you’re concerned about. It might be anything from planning and structure, to grammar and punctuation, or how to create an argument or make your writing more engaging. I can help with any kind of writing, at any stage, so long as it’s relevant to your university career. It’s a chance for people to get confidential, constructive feedback from a professional writer. It’s not scary (I promise) and it’s completely free.

The Coffin Path, published 2018



Li Kui and Legalism in China

Legalism has had a somewhat chequered history, given its place in the lineages of pre-Han philosophy. The ideas espoused by the paradigmatic legalists of the Qin dynasty, such as Shang Yang (390-338 BC) and Han Fei (280-233 BC), were the subject of vitriolic attack by the Confucian-orientated scholars of the Han era. Lu Jia (d. 170 BC), for instance, saw the short-lived Qin dynasty’s (221-206 BC) adherence to these figures’ harsh legalism as the reason for its collapse, and a model of what not to follow. However, this criticism takes the doctrine the Qin followed out of its context, ignoring the reality of the chaos of the Warring States. Legalist reform was often an answer to the question of how to establish order in an age of constant political upheaval and warfare. This article aims to establish the pervasive nature of the legalism of the Zhanguo era, and how the often ignored Li Kui was fundamental in laying the foundation of the legalist doctrine which transformed the era. Despite contemporary recognition of Confucianism as the philosophical basis of the imperial Chinese state, legalism is arguably the more important philosophy for understanding the era. Before discussing Li Kui’s reforms and their impact, it is first important to establish the principles of legalist philosophy. Whereas Confucius saw the construction of a hierarchical moral order, and a return to the days of the Western Zhou (1045-771 BC), as the solution to the chaos that followed the collapse of Zhou royal authority, the legalists took a more pragmatic, forward-thinking view, wishing not to be ‘shackled’ by antiquity. For them, morality and humaneness were not the way to bring about order; stern law was. For legalists, people were naturally self-serving and licentious, needing punishment and incentive to control them. In this way they opposed Mencius, who thought that all people were naturally good. Their rhetoric is perhaps most aptly summed up in the third century BC legalist text the Shang Jun Shu, which states ‘To benefit the people of All under Heaven, nothing is better than orderly rule’ and ‘The way of establishing the ruler is nowhere broader than in relying on laws… and eradicating the licentious’. When Li Kui was appointed chief minister by Marquis Wen of Wei (445-396 BC), the Zhou world was at an important crossroads of history, entering the Warring States era (475-221 BC). The states were searching for a solution to the crisis of leadership that had emerged in the preceding Spring and Autumn era.

This crisis was the result of the enfeoffment of ministers in the 7th century, whose power eventually superseded that of the rulers of the regional states. It was from this process that the state of Wei, in modern Shanxi, emerged: the aristocratic houses of Wei, Han and Zhao partitioned the regional state of Jin - the hegemonic force of the Zhou world in the middle of the Chunqiu period - after annihilating the ruling Zhi clan in 453. Intent on avoiding the same crisis, Marquis Wen recognised the need for reform and appointed Li Kui as chief minister. The reforms of Li Kui would transform the state of Wei and create a model of drastic socio-political reform that was then followed by the majority of the other states. His reforms covered three main areas. First, he established a bureaucracy in which officials were selected on ability, not lineage, and promoted or demoted based on performance. This included major figures such as Wu Qi - who would lead the military expansion of Wei and later engineer similar reforms in the state of Chu - and Ximen Bao. The titles they held were not inheritable, and for the first time records of officials’ performance were kept. Second, he established a comprehensive penal law system known as the Fa Jing (Canon of Laws), to which everyone was subject, regardless of social status. Finally, he strengthened the military, introducing mandatory military service for all males and rewarding soldiers based on military performance. Overall, these reforms brought about a drastic transformation in Wei, with the demise of the old aristocracy and the rise of a bureaucratic state that would become the first superpower of the Warring States era. His measures established a means of effectively controlling the population through restrictive laws, and of maximising state control over, and the efficiency of, personnel at all levels of society. In the immediate aftermath, Wei expanded drastically, invading Qin and creating the Xihe commandery. It remained the predominant superpower in the region until its defeat by Qi at Maling in 342 BC. The impact of these reforms was not confined to Wei: Li Kui had created a model of socio-political reform that would be followed by the entirety of the Zhou world. His reforms created a pattern of geopolitics followed by the majority of polities of the Warring States era, in which legalist reform was undertaken first, and military expansion then followed.


Statue of Li Kui, Chinese legalist, in Beijing

This resulted in an era of escalating warfare, with bureaucratic states that were able - because of the reforms - to effectively harness their populations, raise massed armies of hundreds of thousands, and wage devastating wars against their neighbours for the purpose of territorial gain. By the end of the fourth century, all states had undertaken legalist reform based on the model created by Li Kui, and legalist philosophy had truly engulfed the Zhou world. Li Kui had systematically brought about a new era. Legalism became the modus operandi of the states, with the death of the old elite and the rise of a new bureaucratic state apparatus revolving around a ruthless legalist doctrine. His Fa Jing served as the foundation for the paradigmatic reforms of Shang Yang in Qin, which eventually enabled that state to found China’s first imperial dynasty. It also contributed to the death of the interstate order. No state, in its new form of self-perceived primacy following the reforms, would be willing to acknowledge the superiority of another state and form an alliance unifying the Zhou world, as had been the case in the Spring and Autumn period. This was the true impact of Li Kui, the first true legalist statesman, and of the ideology he espoused: an ideology revolving around a highly centralised state, able to exert huge political power over its people and to summon fearful military power against its neighbours; a ruthless doctrine of socio-political transformation in which, for the first time, law was the binding force of society.

James Carlin



The First Environmentalists?

The trope that Native Americans were the ‘first environmentalists’ is put forward predominantly by non-Native people, and is constructed on the idea that the indigenous tribes of North America are in some way closer to nature. Today, referring to someone as an ‘environmentalist’ is not considered an insult. Broadly speaking, an environmentalist is a person committed to preventing - or at the very least stalling - rapid environmental degradation caused by global warming and climate change. Environmentalists are determined activists who stand up and expose global hegemonic powers for the damage they are doing to the planet. They are considered admirable. If environmentalists are benevolent, why would it be problematic to brand Native Americans as the first, or original, environmentalists?

Firstly, the idea that Native Americans are closer to nature has its roots in colonialism. At the turn of the sixteenth century, European colonialists justified their attack on American soil and its inhabitants through the dehumanisation of its indigenous peoples. The presentation of Native Americans as fearsome savages living wildly amongst nature legitimised colonial projects of dispossession and genocide. It was the relationship that Native Americans had with their natural environment, starkly contrasted with the violent one Europeans had with theirs, that the colonialists used to categorise them as animals. It is true that Native Americans tended to have a different relationship with the earth compared to western populations. Whilst the former’s involved complex, sophisticated and sustainable practices of land management, the latter’s was centred on what could be extracted to fuel rapid industrialisation and imperialism. Even so, to label Native Americans as the ‘first environmentalists’ is to, on the one hand, misunderstand the core principles of environmentalism, and on the other, to misrepresent Native American cultural traditions with regard to the land.

One example of a Native American land management practice is cultural burning. It has been carried out for fifteen thousand years by the Yurok, Karuk, Hupa and Miwok tribes, as well as hundreds of others in California. It has, until recently, been misunderstood by the US government and environmentalist groups, which has led to its suppression. There are many benefits of cultural burning. The considerate application of fire, and the meticulous use of the different smokes it creates, can increase the production of food and of seeds for medicinal use, and help sustain diverse landscapes of grasslands, savannas and shrublands. In the Sierra Nevada, cultural burns have not been permitted for 120 years, and as a result the region has experienced severe droughts that have led to the loss of large quantities of vegetation. The high density of the forests in these areas means that without cultural burning there is a higher risk of large, uncontrollable wildfires. In an area just south of Yosemite National Park, people from the North Fork Mono Tribe and the Cold Springs Rancheria of Mono Indians have been working to reintroduce fire to the land.

It was in 1850 that the US government passed the Act for the Government and Protection of Indians, which outlawed intentional burning in California in the belief that it was a primitive and brutal cultural tradition that destroyed the land. Then in 1968, with the realisation that no new sequoias - an endangered species of redwood tree - had grown in the unburned forests, the National Park Service changed its fire policy to make prescribed burning a central feature of its land management strategies. In 1978, the Forest Service followed in its footsteps. Before 1968, the policies of both the National Park Service and the Forest Service had been informed by the dominant voices in the conservationist and ecological movement, who maintained that cultural burns were unnatural and warranted suppression. Fire, from their point of view, was a dangerous, destructive force to be eliminated rather than utilised for environmental flourishing. Indigenous knowledge was overridden by western theories and academics: a continued form of modern colonialism. This is where the problem of the ‘first environmentalists’ trope lies: it is deeply contradictory. Native Americans and environmentalists had very different ideas about what was beneficial for the earth in order to protect it.

One of the key philosophies of environmentalists at the beginning of the 1960s was that of conservation and preservation of the land. For the environmentalist movement, any human interference with the land was unnatural and made it vulnerable to degradation.

Natural environments needed to be left alone, and they would prosper independently. This directly contrasted with Native American practices such as cultural burning. What environmentalists ignored were the significant historical processes of land management by Native American populations that had gone into sustaining the landscapes of North America. As a result, in their fight for environmental justice, and without realising the importance of indigenous voices, they suppressed the rights and sovereignty of Native Americans. In 1933 President Hoover declared Death Valley, the homeland of the Timbisha Shoshone, a national monument, ending Native management of the landscape. In 2000, when the Timbisha Shoshone Homeland Act returned some areas of Death Valley to the hands of the Timbisha Shoshone tribe, they were met with opposition from groups of environmentalists. This was due, again, to their practice of controlled burns, which the environmentalists took issue with. Fire in these areas helped to clear springs of dead vegetation and increased water flows. The non-Native environmentalist idea that the most beneficial thing for land is to leave it as an untouched wilderness meant that they objected to Native Americans managing their own land, and in particular to the practice of cultural burning, despite them having done so for thousands of years before 1933. It is important to recognise how harmful stereotypes can be, even when we attach positive ideas to them. Whilst the term ‘first environmentalists’ may seem harmless on the surface, it is a stereotype grounded in a lazy and colonialist mentality, which generalises and carelessly misinterprets Native American cultures. As we move forward with environmental struggles, many of which do involve Native American activists, we must take into account the history of colonialism and indigenous anti-colonial struggle, and the role that ideas about the environment have played in it.

Francesca McGregor



Zionism: The Divisive History Of Israel

For over 100 years, the concept of Zionism has sparked heated international debate, which shows no indication of diminishing. Nowadays, most people are somewhat familiar with the concept of Zionism – the movement which seeks to unify the Jewish race into one nation and return them to the Holy Land of Israel. However, for nearly 2000 years, until relatively recently, there had been no Jewish control of the region; thus the full implementation of Zionism would come with its controversies. Of all the movements of the 20th century, Zionism remains one of the most divisive. The late 19th century saw the emergence of the modern Zionist movement. Popularised in Theodor Herzl’s 1896 book, Der Judenstaat (The State of the Jews), Zionism has always argued that Jewish protection from persecution can only be attained if Jews take action themselves, without relying on outside help. Herzl argued for a nation for Jewish people but did not suggest where. Israel was quickly adopted due to its religious significance, disregarding the Arab Palestinians who already lived there. Whilst Jews had been migrating back to the region and reviving the Hebrew language since the mid-1800s, the end of the century granted Zionism the recognition it desired. The 20th century changed the discourse for Zionism, as it moved from a grassroots movement to a nationalist one that received increasing support internationally, particularly from European states. The Balfour Declaration, issued by the British government in 1917, supported Palestine as a home for the Jewish people. This exacerbated Arab-Israeli tensions, as increasing Jewish immigration into Palestine, with British assistance, appropriated Arab land. Whilst Arab resistance intensified, brutal British rule solidified the Jewish position. This eventually paved the way for the United Nations to give official recognition to a Jewish state in their 1947 Partition Plan, which suggested splitting the region. The plan was accepted by the Jews but rejected by the Arab communities. The end of the civil war which erupted during this time led to the establishment of Israel in 1948, with 26% more land than the UN resolution had suggested, and left ongoing tensions even further from being resolved.

As Israel was founded on conflict and tension, it is important to question its acceptance amongst Western nations and the UN, which seem to have justified a new colonialism and aggressive expansion. It is likely that their failure to defend the Jews from recent persecution – such as the pogroms in 19th century Russia and the devastating Holocaust – led to disproportionate support for Zionism, fuelled by guilt. For Zionist scholars, the Jewish race was eternally unsettled, serving as the primary ‘Other’ against the rest of the world, and needed protection through a nation of its own. These ideas, derived from ancient Judaism, were catalysed by the persecution of Jews during the Holocaust.

Palestinian refugees leave their homes, 1948

Secular nations supported Zionists as they feared the blood of another six million Jews on their hands. Support for Israel became synonymous with support for Zionism, as many Zionist institutions became part of Israel’s infrastructure during its creation. This includes the Israel Defence Forces, which combined three Zionist militias into a new army. Thus, support for Israel was continually driven by guilt, and the new Jewish state had unwavering Western allies. Israel’s relationship with the West, however, would not always be perfect. Support for Israel – and therefore implied support for Zionism – had become the norm for Western politicians. It is therefore surprising that in 1975 the United Nations adopted a resolution which declared ‘Zionism a form of racism and racial discrimination’. It is unsurprising that most large Western powers, such as the USA, France and the UK, voted against this resolution and then spearheaded its repeal nearly twenty years later, no doubt seeking to assist their strong ally in the region. The resolution declaring Zionism racism was sponsored by Middle Eastern states, opposing the illegal military occupation to which the indigenous Palestinian population had been subjected for nearly thirty years.


Between 1947 and 1949, over 750,000 Palestinians were displaced into neighbouring states, almost half the population at the time. Zionists justify these actions by manipulating Jewish trauma, particularly the Holocaust, to excuse aggression towards what they perceive as second-class citizens living on land which rightfully belongs to the Jewish race. This exploits the Jewish faith to condone the same type of targeted violence which forced the Jews out of Europe in the 1940s. Palestinians refer to this time as ‘Al Nakba’, or ‘The Catastrophe’, whilst their backlash against aggression is deemed unjust and met with ever stronger invasion. Since the Middle East became the arena of the Cold War, there has been continuous disregard for indigenous populations, of which the Palestinians were one of the first to be targeted. The Israeli occupation of Gaza has been labelled by Human Rights Watch as an ‘open-air prison’, with 1.6 million Arabs in need of humanitarian aid. These present-day struggles are rooted in Zionist thought, as consecutive Israeli governments pursue bloodshed in order to secure ‘their’ homeland. As Zionism continues to be a hot topic in the media, it is crucial to remember its origins in violence. Ultimately, the Zionist movement has been successful where it needed to be - at the international level - gaining assistance whilst conducting brutal policies against the indigenous people. The founder of modern Zionism, Theodor Herzl, once said that the solution of the Jewish question would normalise the Jews and remove them from the pages of history. The ideas that he helped to popularise would, ironically, continue to make headline news over 100 years after the publication of his flagship book.

Hannah Speller



The Acid Rave Revolution vs. Thatcher

With Britain gripped by Thatcherism, the growing rave sub-culture of the 1980s was an expression of alienation and youth opposition

When Thatcher came to power on the 4th of May 1979, she promised in her leadership speech: “where there is discord, may we bring harmony”. Although it wasn’t exactly the ‘harmony’ she had in mind, the British public grouped together to form their own, home-made ‘harmony’ in the form of illegal raves backdropped by the genre of acid house. Any pre-existing ‘discord’ in the UK deepened immensely under Thatcher’s leadership. Once-thriving areas became ghost towns, most notably Toxteth, where class relations came to a boiling point in the riots of 1981, during which the town was left to rot and burn. This economic and social upheaval led to a deprived, rejected, and poverty-stricken Britain. Amongst the political turmoil of the endless riots, the miners’ strikes of ’84 and ’85, the Battle of Orgreave in ’84, a peak in football hooliganism, and mass unemployment, the British public were desperate to find a haven of happiness. This loss of national identity created a vacuum, one which was filled with the prospect of a burgeoning youth revolution, quietly blossoming amidst the mayhem. The late 80s saw the second Summer of Love blossom as the UK embraced, with open arms, a new hypnotic, trance-like genre of music wedded to fields, warehouses and basement locations. There was a reorientation of nightlife, from fights outside of pubs to a hub of peace and unity – and drugs. It was specifically the drugs that allowed for a new type of peace, leaving ravers with little inclination towards violence or aggression. Instead, blows were swapped for fraternal hugs, regardless of football allegiance. Suits and heels were ditched for dungarees and trainers. Everyone was equal, and conspicuous consumption was no longer necessary as class no longer mattered inside the raves. This equality allowed acid house and rave culture to transcend previously rigid divisions along the lines of black and white, north and south, rich and poor. It successfully united one ecstatic generation of young people whilst triggering a moral panic in the police, politicians and parents. Although Thatcher infamously stated that there is “no such thing as society”, a warehouse filled with hundreds of people who could all come together, regardless of social background or position, in being treated poorly by the government, seems like a community to me. The raves had an inherent binding factor, acting as a bubble of unity in an otherwise hostile and divided social setting. Regardless of race, class, gender or football loyalty, the ravers embraced each other with an ecstasy-induced togetherness. Due to this unity amongst the ravers, media and press outlets found it increasingly difficult to tunnel-vision in on or condemn a singular scapegoat or demographic of society, as the sheer volume of ravers allowed for anonymity; they were an all-inclusive faceless mass. The right-wing press didn’t seem interested in covering the war on unemployment, yet found sufficient time to hammer in on the war on acid. It whipped up an anti-ecstasy hysteria, yet was so far removed from the rave scene that its comments didn’t go beyond moral condemnation, a call for an increased police force, and complete ignorance of drug use and its effects. On the one hand, acid house culture saw a Thatcher-fuelled unity amongst ravers, yet it simultaneously created a divide between the youth and the older generation which had not been as prevalent since the Sexual Revolution of the 1960s. Unlike its 1970s predecessor, punk, acid house was completely depoliticised in its origins. Inversely, young people were trying to flee from the negativity and downward spiral of the politics of the time. Notions of political activism were initially redundant and superfluous; the raves were merely a commune for the pilled-up public, who were looking for a way out from the chaos around them. Yet this social wave, once rooted in pro-hedonist foundations, quickly transformed into an anti-authority political protest by the default of police brutality. Fun lovers and thrill-seekers looking for a place to dance and unite were treated as rioters, with state violence acting as the catalyst for converting a group of hedonists into heretics. Some of the last people in Britain who weren’t completely disillusioned with politics - the white middle class, high on drugs - were eventually politically galvanised. A push-back against Thatcherism was sparked, and every rave was a testament to the youth’s resistance. The battle between the blaring police sirens and the championing 120BPM music alone demonstrated the redundancy of the government. Is it fair to say that rave culture was merely a vehicle for young people to engage in cheap thrills and exercise their teenage kicks? Or can it be argued that these raves were a defibrillator, bringing a unifying heartbeat back to the disparate and disenfranchised youth of Britain, who were otherwise struggling to find happiness amongst the degradation and decay brought about by the Thatcher government? Regardless of your stance, the Second Summer of Love and the Acid Rave Revolution pretty much revived youth culture in an otherwise bleak and uninspiring social climate. Its legacy set up the joyful hedonism which we still indulge in and witness in contemporary British youth culture to this day.

Ravers in London, 1985

Rhiannon Ingle



‘Manifest Destiny’

How US expansionism shaped borders and the people living within them

The 1783 Treaty of Paris concluded the American Revolutionary War between the British Empire and the United States of America. Stretching from colonial settlements along the Atlantic Coast in the east, to the banks of the Mississippi river in the west, the borders of the new republic extended across a vast expanse of land. The boundaries, however, did not remain static for long; over the course of the next century the expansion of the American frontier followed a pattern of migration, settlement, and displacement. These changing borders forced people from their ancestral lands, and re-determined national identities. By the 1820s the boundaries had extended far beyond their original limits, as the US amassed territory across the continent. This expansionist yearning became a principal feature of American foreign policy, and was articulated in the popular 19th century concept of ‘Manifest Destiny’ - the belief that Providence preordained and justified the expansion of the USA across the North American continent. The extension of American borders throughout the century was not merely the adjustment of lines on a map: it had an indelible impact on cultures, identities, and experiences for entire populations.

Cherokee Girl, late 19th C.

Throughout the 19th century, sprawling communities of European settlers flooded into the lower south, as arable land in Georgia, Alabama, and other southern states became increasingly coveted for the production of cotton.

Pressure on the government to facilitate the acquisition of this land led to the Indian Removal Act of 1830 - the authorisation of the forced relocation of Native American tribes from their ancestral lands to a designated ‘Indian territory’ west of the Mississippi river. Though at the beginning of the 1830s the south-eastern states were home to nearly 125,000 Native Americans, by the end of the decade all but a few had been forced to relocate outside of the US border. 19th century political opinion justified this relocation as a regrettable, yet necessary, measure to allow for white expansion into desirable territory – or alternatively as a ‘sincere’ attempt to preserve Native American cultures by preventing the assimilation of indigenous tribes into European settlements. Native American attitudes towards the earth, which revolve around the collective ownership of land and appreciation of seasonal produce, were swept aside in favour of western monocultures and land privatisation as indigenous people were forced further west. The trek of the Cherokee, who were forced to relocate to present-day Oklahoma, became known as the infamous ‘Trail of Tears’ because of the loss of life on this devastating journey, and the relinquishment of a home that would never be reclaimed. The expansion of white US settlers into Native American land, and the subsequent forced relocation of these groups across the US border, irrevocably altered indigenous communities and their ties to the land. The Mississippi river wasn’t just the westerly border; it was a reminder of incredible loss. In 1848, along the southern border of the US, the Treaty of Guadalupe-Hidalgo ended the Mexican-American War, abruptly changing the geography of North America once again. Mexico relinquished all claims to Texas, which had been annexed by the United States in 1845 after a rebellion of American colonists had succeeded in breaking the state away from Mexico. The treaty gave the US the Rio Grande as a new border for Texas, and 525,000 square miles of land that had previously belonged to Mexico – which now makes up considerable portions of California, Arizona, and Colorado, amongst other western states. This changing border displaced almost 100,000 indigenous people on the Mexican side, who were now swept up into the expanding territory of the United States. Entire communities became immigrants on the soil upon which they had always lived. These displaced people were given a year to decide whether to retreat to the new border of Mexico, and retain their Mexican citizenship, or stay on their land, and assume American citizenship. Over 90% chose to stay where they were: where the community and culture were known to them.


Map of the New World, 1800

Though the treaty promised that the property rights of Mexicans living in these transferred territories would be respected, they were often not honoured as the land became assimilated into US settlements. Identities and cultures were transformed as the land moved into the ownership of the United States, and the residents became Americans. Though communities remained on their ancestral homelands, the border change profoundly affected their lives; as the popular slogan maintains: ‘We didn’t cross the border, the border crossed us’. Further efforts towards the expansion of the US continued until 1848, when the dream of ‘Manifest Destiny’ had finally been realised and the territory of the United States stretched from the east coast to the west, from the 49th parallel to the Rio Grande. The borders of the country had been solidified, but the people remained divided in their experiences and perceptions of this expansionism. Within the land that the United States had assumed, discord was rife. The innumerable experiences, cultures, and languages that had been shaped by the United States’ expansionist policies correspondingly created a country with infinite ‘American’ identities. As contemporary America attempts to understand its own national identity, it is important to reflect upon the experiences of the people affected by its expansionist policies, and the persistence of these memories today.

Ayla Magness-Jarvis



Understanding Jihad

Jihad is practiced by Muslims every day across the world. It translates as ‘striving in the way of Allah’ and represents a deeply personal and spiritual effort to contest with one’s ego in order to get closer to God. Islam has overwhelmingly gained the reputation of the ‘Religion of the Sword’ in the West due to the media’s presentation of jihad as a wholly militaristic concept. However, Islam varies greatly from this often violent western interpretation. Although both these notions of jihad have their roots in truth, the reality is far more complex, as they are emphasised and played down by different religious interpretations, sects and laws across different times and cultures. The Qur’an defines the term as “striving with one’s self and one’s money in the cause of Allah”, leaving the extent to which one strives, and the true cause of Allah, open to interpretation. Viewing how jihad has been practiced through the lens of history gives a pragmatic understanding of this ambiguous concept.

“The difficulty of examining greater, spiritual jihad in its historical context is the subtlety and intimate nature of the concept”

The distinction between spiritual and militaristic jihad dates back to a passage in the Hadith, Islam’s second holiest book after the Quran, where Al-Khatib al-Baghdadi quoted the Prophet Muhammad (PBUH), returning from one of his battles, as saying: “We have returned from the lesser jihad (al-jihad al-asghar) to strive in the greater jihad (al-jihad al-akbar).” Interestingly, the lesser form is the combative and the greater the personal. This has continued to define the concept to this day for a majority of Muslims. Nonetheless, Muslim groups continue to disagree on the meaning - from the Ahmadiyya, who believe intimate, personal and internal struggle is the essential ingredient of jihad, to the Wahhabis, who believe violent armed struggle is its primary function - showing its elusive variation. The difficulty of examining greater, spiritual jihad in its historical context is the subtlety and intimate nature of the concept.

The lesser form, by contrast, is preserved in the record as overt, sometimes murderous, events. The loud nature of lesser jihad has eclipsed the spiritual significance and modern-day understanding of the concept.

There are conditions and limitations that apply to even the most violent interpretations of jihad. The Quran states: “Fight in the cause of Allah those who fight you, but do not transgress limits, for truly Allah loves not transgressors” (2:190). According to this verse, fighting is acceptable presuming it is in the cause of Allah and within the undefined limits. However, it may not even be a reference to jihad at all, as it does not mention it by name. The conditions for jihad, as stated in the Qur’an, are that it must be in self-defence against an oppressing force, but others argue that the ‘sword verses’ of the Qur’an have abrogated these conditions. The sultan Saladin is perhaps the most notable protector of Islam against foreign aggression. Saladin’s mostly peaceful capture of Jerusalem in 1187 from the Crusader forces of Guy of Lusignan, King Consort of Jerusalem, and Raymond III of Tripoli solidified his place in Islamic legend, as it was done within the conditions, and in the name, of jihad. Saladin’s repossession of Jerusalem came at the end of a brutal military expedition through the Levant, modern-day Palestine, which saw mass casualties among Saladin’s men. The successful recapture of Jerusalem was expected to be followed by a massacre of the Franks inside the city, as vengeance for the massacre of 1099; however, Saladin did not kill a single Christian inhabitant, and many were given safe passage to the coastal enclave of Tyre. The Abbasid period is referred to in Islam as the ‘Golden Age of Islam’, as Islamic art and literature thrived and Islamic jurisprudence (fiqh) was debated and finalised. The Islamic laws finalised during this period are still respected in the Muslim community today; viewing how jihad was practiced at this time therefore gives a more legitimate understanding of the concept. Caliph Harun al-Rashid was defending early Abbasid territory against the Byzantines. As the casualties of Harun’s army increased, he reached out to civilians to defend their land.


This resulted in the Islamic ulema (scholars) volunteering for jihad, and in an increasingly military interpretation of the concept gaining ground. Despite their claims of righteous militant jihad, it never appealed to the wider ummah (community), ‘where the frontier was a distant reality almsgiving and solicitude for the poor were still seen as the most important form of jihad’. During the Abbasid period there was greater emphasis on greater jihad than on its lesser form.

“the conditions for jihad, as stated in the Qur’an, are that it must be in self-defence against an oppressing force”

A modern-day misconception of jihad is its potential to rally Muslims in order to fight battles; that jihad is exploited and ‘political actors use religion for their purposes’. However, this was not the case during the Abbasid period. After the First Crusade successfully captured Jerusalem in 1099, the preacher and scholar al-Sulami wrote his Kitab al-Jihad (Book of Jihad), in which he attempted to rouse Muslims to righteous anger in order to drive out the Franks, but it was in vain. It was only one hundred years later, when Saladin, an established ruler with considerable power, made use of jihad, that it was successful in mobilising large numbers of Muslims. The danger of historical interpretations of jihad is that societal standards change over time, as does religion, which is why contemporary scholars and imams present Islam in a peaceful light. Whether lesser or greater, jihad is a struggle for all Muslims.

Jack Moon



The Whitechapel Victims

The infamous murders of Jack the Ripper have dominated the criminal landscape since the 1880s and have sparked the imagination of many budding sleuths. So much so that in 1931 the journalist Fred Best forged letters supposedly signed by Jack the Ripper, to maintain interest for his readers. This continuum of interest reached its pinnacle in 2015 with the opening of the Jack the Ripper Museum in East London, whose creation cemented the Ripper’s place within modern popular culture, celebrating his brutal crimes rather than condemning them. We have defined Mary, Annie, Elizabeth, Catherine and Mary as the Ripper’s victims, not as women who struggled to survive in a patriarchal Victorian society. The murders of the canonical five were the first crimes against prostitutes to be taken seriously by the male-dominated justice system, and they highlight that system’s failure to protect the most vulnerable within society. The murder of Martha Tabram, who failed to present all the trademarks of a Ripper victim, therefore becomes symbolic of masculine prejudice towards prostitution in the late 1880s.

Martha Tabram, murdered aged 39

Martha Tabram was the earliest victim in a wave of violence towards prostitutes in Victorian Whitechapel. Brutally murdered at the age of thirty-nine, Martha is defined by her links to the Ripper - a definition made increasingly prominent by a historical debate which judges whether Martha meets the criteria of the Ripper’s Canonical Five.

Historians like Philip Sugden and Sean Day view Martha as a victim of the Ripper because of the similarities between her death and the Ripper’s later murders. However, the creation of such criteria adds an element of unwarranted privilege to Martha’s status as a potential Ripper victim. Modern popular culture and the wider historical debate have failed to explore the lack of justice for vulnerable women like Martha - a vulnerability exacerbated by the everyday battle against severe poverty and the judgemental society in which she lived. Her murder simply marked her final condemnation by a male-controlled Victorian society. However, to judge the extent to which Martha’s murder reflects the condemnation of prostitution, we need to revisit the events of her murder. I will begin with Mrs Hewitt, a fellow resident, who was woken by ‘screams of murder’ but failed to be alarmed because of the commonality of domestic violence. It was not until John Reeves passed by on his way to work early one morning in 1888 that Martha’s mutilated body was discovered on the first-floor landing. As the inquest, and the initial examination of her body by Dr Timothy Roberts, revealed, her body had been lying on the landing unnoticed for three hours. The idea that a severely mutilated body, with thirty-nine stab wounds, could lie unnoticed in such a public place seems unimaginable. That murder and violence had become such common practice highlights not only society’s failings in protecting vulnerable women like Martha, but an inherent societal ignorance towards domestic and sexual violence. This ignorance is mirrored in Martha’s injuries to her stomach and genitals, where she was stabbed six times. Horrific, sexually motivated mutilation was also present within the Ripper’s ‘Canonical Five’, and became symbolic of the wider societal acceptance of masculine dominance and control. This wider societal acceptance was also present within the male-dominated Victorian justice system, which was driven by the thrill of the chase. Jack the Ripper, a term coined by the press in the 1880s, intrigued the police with his new methods of murder. The Ripper’s methodical dissection of his victims added a professional and strategic element which had never been seen in the East End. So much so that a twisted level of excitement arose from this new spectacle as new Scotland Yard detectives like Abberline and Moore were introduced.


This led to the formation of a cult around the Ripper, which hindered the efficiency of the police investigation. The police were only able to identify Mary Nichols from the mark on her petticoat, and the main suspect in the case of Annie Chapman was arrested based upon a clean leather apron. This level of incompetence only grew as the Ripper continued his reign of terror. Chief Inspector Swanson ordered the distribution of eighty thousand leaflets after the murder of Elizabeth Stride, showing a clear mismanagement of manpower - a mismanagement which facilitated the murder of Catherine Eddowes and the brutal mutilation of Mary Jane Kelly. This placed blame not only upon the murderer himself, but also upon the Victorian justice system.

“a fellow resident was woken by ‘screams of murder’ but failed to be alarmed because of the commonality of domestic violence”

In the space of just four months, then, eight vulnerable women had been murdered, and the Victorian police had failed to bring anyone to justice. These were a series of crimes committed by a man and investigated by men. The assertion of masculine dominance is evident in the murders and in the police themselves, who were intrigued by the new methodical approach to murder rather than by the acts of violence that had been committed. The Victorian justice system was rife with prejudice towards the most vulnerable women of society, and it took the ritualistic dismemberment of eight women for the police to take domestic violence seriously.

Sian Jones



The Wilmington Massacre

In the late autumn of 1898, the city of Wilmington in North Carolina witnessed the only coup d’état in the history of the United States. The massacre of African-Americans and the removal of elected officials from office that occurred on the 10th of November was the culmination of four years of deliberate planning on the part of white supremacists, and led ultimately to the institution of the racial caste system of Jim Crow. Despite the unprecedented nature of the events in Wilmington, the coup remains a strangely under-explored stain on the American pursuit of the ideals of freedom and democracy. Yet in many ways Wilmington, and not the Plessy vs. Ferguson court ruling of 1896, truly marked the beginning of racial segregation and black disenfranchisement in the South. Orchestrators of the massacre, such as Charles Aycock, were consulted by white supremacist leaders in other states on how best to ensure the suppression of the black vote. Looking to Wilmington as a model, by 1910 every state in the South had disenfranchised African-Americans and ensured the unrivalled dominance of the Democratic Party for the next fifty years.

One of the many tragedies of Wilmington is the tantalising prospect of what could have been. With a prosperous black middle class, higher literacy rates than their white counterparts, and a bustling port shipping cotton to the rest of the world, the largest city in the south-eastern state of North Carolina stood as a testament to the advancement and achievement of African-Americans since emancipation. As well as being a centre of economic prosperity, the city possessed significant cultural and intellectual importance as home to the Daily Record - the only black-owned daily newspaper in the country.

The massacre in Wilmington also marked the end of a relatively successful period of inter-racial democracy in North Carolina. Shifts in political allegiances had begun after the recession of 1893, as poor white farmers abandoned the Democrats in favour of the newly-formed People’s Party. The populist movement had made electoral gains in the 1892 elections, standing on a platform of nationalization, progressive income taxation, and democratic accountability. In North Carolina, the Populists had gone a step further than their comrades in other states, co-operating with black Republicans to secure office in the upcoming 1894 elections. This new ‘Fusionist’ movement successfully merged Populist economic policies with African-Americans’ search for fair political representation. The alliance proved to be a winning one, and 1894 saw the party win a majority in the state legislature, as well as the election of Fusionist governor Daniel Russell.

The only coup d’état in the history of the United States: the Wilmington insurrection, 1898

The success of the Fusionists presented an existential threat to white supremacists and Democrats, who began a concerted effort to recover lost political ground by coordinating meetings of business leaders, newspaper editors, and political activists. Because the Fusionist movement was predicated on a delicate alliance, anxieties about relationships between black men and white women in particular were used to drive a wedge into it. Democrat campaign posters stoked fear of the rise of ‘Negro Rule’ and of the free pass supposedly being given to African-American men to attack white women. The campaign intensified after Alexander Manly, editor of the Daily Record, wrote an article condemning the lynching of African-Americans and arguing that the vast majority of sexual relations between white women and black men were consensual. Coupled with this weaponization of gender, armed militias organised by the Democrats patrolled the streets of Wilmington and other cities in North Carolina in the run-up to the state elections of 1898. This divide-and-rule campaign, sustained by violence, intimidation, and the stuffing of ballot boxes, led to significant Democrat victories on the 9th of November 1898.


But electoral victory was not sufficient for white supremacists in Wilmington, where Fusionist politicians retained control of local offices. Former congressman and future mayor Alfred Waddell called a meeting at which a ‘White Declaration of Independence’ was revealed and read to leading figures of the black community in Wilmington, demanding resignations and admissions of subordination. The following day, a large organized crowd of white farmers, soldiers, businessmen, and political leaders marched to the offices of the Daily Record and set it ablaze. Whipped into a frenzy, the crowd swarmed the homes and offices of prominent black members of the community. Specific targets, such as Daniel Wright, were located, beaten, and murdered. What started as organised militia activity descended into indiscriminate killing and violence. Reports of the exact number of deaths vary from nine to three hundred, but over a thousand African-Americans were forced to flee their homes permanently after being escorted out of the city by armed soldiers. For the first and only time in American history, political leaders were violently forced out of office; Waddell, the primary organizer of the violence, was installed as the new mayor. In a pattern to be repeated across the South, the first act of the new legislature was to disenfranchise most African-Americans via the introduction of a poll tax and literacy tests - measures from which whites were effectively exempted.

Amid the shattered glass of storefronts, the swinging hinges of front doors forced open, and the burning office of the Daily Record, rose Jim Crow. The main historical importance of the massacre was that the legal codification of separate racial spheres, sanctioned in 1896, now carried an underlying warning to African-Americans about the consequences of any act of self-assertion. This symbiotic relationship between black agency and white resentment in America remains pertinent today - as do the concerted efforts of rich white supremacists to defeat working-class alliances via the weaponization of anxieties about identity, be they based on race or on gender. The history of Wilmington needs much greater exploration and amplification; however, the innocent victims whose lives were destroyed on the 10th of November should always remain at the centre of it.

Abraham Armstrong



The Racialised War on Drugs

On September 14th, 1986, President Ronald Reagan and First Lady Nancy Reagan made a national address to declare a ‘war on drugs’ within America. This declaration, and the initiatives implemented as a result, directly impacted communities with large working-class ethnic minority populations, the effects of which we are still seeing today. In this address, the Reagans made hyperbolic statements that inflated the extent of drug use in society. By stating that drug use was ‘killing our children’, ‘threatening our communities’ and ‘undercutting our institutions’, the Reagans firmly drew a line in public opinion surrounding issues of drug abuse. Those who were for the wellbeing of American society were anti-drug use, and anyone who disagreed with this sentiment was pro-drug use and complicit in the demise of American values. Following the speech, public concern about drug use skyrocketed: polling of the American public listed drug use as the country’s biggest problem, at a time when poverty was steadily increasing and Reagan was drastically cutting funds to inner-city communities. Reagan’s strict political rhetoric throughout his presidency consistently demonised drug use and positioned specific social groups as the perpetrators of the problem. The effects of this were felt most by inner-city communities, who were subject to increased police presence on the streets and coercive methods of state intervention. The arrival of the new drug ‘crack cocaine’ in the mid-1980s caused a national epidemic, and its association with working-class ethnic minorities (specifically those of Black and Latino descent) saw a sharp change in the way drug use was perceived and controlled in American society.

Reagan passes a pen to his wife Nancy after signing a 1988 anti-drugs bill

‘Crack’ cocaine is the street name given to cocaine that has been processed with sodium bicarbonate or ammonia, allowing it to be smoked rather than snorted (then the most popular mode of consumption). Due to its relative inexpensiveness and its ability to be sold in small quantities as ‘rocks’, crack cocaine was seen as a more lucrative and cost-effective way to consume and sell the drug than powder cocaine, which was comparably more expensive. It is significant to point out here that crack cocaine and powder cocaine contain the same active drug, rendering their effects virtually identical. However, crack cocaine was policed far more heavily during the 1980s: its presence in inner-city communities and its association with ethnic minorities stigmatised its usage in a way that powder cocaine use was not. Reagan’s Anti-Drug Abuse Act (1986) mandated more severe sentences for the possession of crack cocaine than of powder cocaine. It introduced the 100:1 ratio, meaning that possession of five grams of crack cocaine carried the same mandatory minimum sentence as possession of five hundred grams of powder cocaine: anyone caught with one gram of crack cocaine faced the same penalty as someone carrying one hundred grams of powder. As use of crack cocaine was far more prevalent in areas with high populations of ethnic minorities, this sentencing disparity resulted in a rapid increase in the number of black men in particular entering the American criminal justice system. By 1990, the average drug sentence for African Americans was 49% higher than that of their white counterparts.


The Anti-Drug Abuse Act was implicitly racialised: Reagan’s methods did not tackle the issue of drug abuse itself but instead sought to punish those who took drugs, and when the drugs more frequently taken by Black and Latino populations are policed more heavily, those populations predominantly face the punishment and come to embody the societal issue. Furthermore, the Reagan administration approved an increase in the number of police officers in neighbourhoods and an expansion of police powers to use force. This resulted in officers using more intrusive methods of arrest, such as raiding people’s homes and using physical violence against civilians on mere suspicion of drug possession. These tough-on-crime policies escalated racial tensions within neighbourhoods, as the escalation of violence saw people resisting arrest and police officers acting on racial stereotypes, arresting people they believed were more likely to be carrying drugs without any established evidence. In one instance in 1985, a SWAT team used a battering ram to tear into a house out of which they believed drugs were being sold; when they entered, they found only two women and three children inside, eating ice cream. Rates of drug seizure were often quite low (only 35% of raids actually found drugs), suggesting that most of Reagan’s efforts in the war on drugs were fruitless. Indeed, many of these practices were scaled down towards the end of his presidency in 1989.

The impact of the Reagan era is still felt throughout America, and the overrepresentation of ethnic minorities in American prisons is now at an all-time high. By labelling drug use as detrimental to the fabric of American society, Reagan instigated a national moral panic, one that called for swift, remorseless punishment of a problem framed as the work of criminals, rather than offering rehabilitation to drug abusers. Race relations in America have always been fraught and contentious, and the Reagan era is no exception. Disproportionate policing based on racial profiling, together with sentencing disparities, has heavily contributed to this, and remains a leading factor in mass incarceration today.

Dara Coker



Dietrich Bonhoeffer: “The will of God will only be clear in the moment of action.”

Dietrich Bonhoeffer was a German Protestant theologian who lived and taught in Germany under the Nazi dictatorship. His teachings questioned the position of religion within a secular society, and he used them to oppose political authority in favour of devotion to God. He is an undeniably important figure, who used his beliefs to affirm basic principles of freedom and human rights even in a society overtaken by fascism, and who continued to teach such messages even when doing so risked his life. He was executed by the Nazis in 1945 for his involvement in a plot to overthrow Hitler, which rested on his belief that the damnation of an individual through the ultimate sin of murder was justified for the protection of innocent people.

Bonhoeffer opposed the reluctance of the church to openly oppose Nazism in Germany. In Catholicism, Pope Pius XII attempted to maintain the neutrality of the Vatican in political matters, and despite expressing concern over the Holocaust, did little to oppose the actions of the Nazi Party. In 1932 the German Christian movement was founded, with the aim of pushing Protestant doctrine to conform to Nazi principles of racial purity and the Nazis’ treatment of Jewish people. In response to such teachings, Dietrich Bonhoeffer, along with numerous other theologians, established the Confessing Church to create organised opposition to the antisemitism shown by the German Church. After considerable oppression from the Nazis, the Confessing Church turned to underground seminaries to continue the teachings of resistance and preserve traditional Christian beliefs and practices.

Bonhoeffer’s involvement in a plot to depose Adolf Hitler is amongst his most famous acts, and it ultimately led to his execution. Even in the extremity of this act, he was able to justify violence, for “the will of God will only be clear in the moment of action”. He united the concepts of religious devotion with civil disobedience and resistance to corrupt authorities, with an emphasis on action, drawing on his understanding of the teachings of Jesus Christ in the New Testament and declaring, in The Cost of Discipleship, that devotion to God came with an expectation of suffering that all Christians should accept.

Ultimately, the teachings of Bonhoeffer are too often overlooked because of their dependence on Christian doctrine. However, his work transcends the limits of religious belief, and is applicable to anyone in the event that political authority oversteps its moral boundaries and intervenes too far into the lives of individuals. His life should be remembered as one lived with the purpose of expressing devotion to his ideology and to human rights through the practice of civil disobedience and resistance.

Rebecca Boulton

The Cheese and the Worms: The Cosmos of a Sixteenth-Century Miller

‘Everybody has his calling, some to plow, some to hoe, and I have mine which is to blaspheme.’ Thus spake Domenico Scandella, also known as Menocchio, the protagonist of Carlo Ginzburg’s influential work of early modern microhistory, The Cheese and the Worms. Menocchio was a Friulian miller, a guitar player, and a schoolmaster in the sixteenth-century village of Montereale. Despite his apparently mundane status as a historical figure, we find that in The Cheese and the Worms the study of Menocchio offers a profound insight into the nature of early modern society in an especially tumultuous era of religious upheaval. For Ginzburg, the story of his protagonist reveals more general truths about popular consciousness in pre-industrial Europe.

In 1584 and 1599, Menocchio faced trial for his blasphemy before the Roman Inquisition, and it is the records of these inquisitions, held at the Vatican Library, that provide the central focus of Ginzburg’s essay. Menocchio had attracted the attention of the Inquisitors for his radical and scathing views on the organized religion of the Catholic church. His charges were multiple: that the ‘Holy Scripture has been invented to deceive men’, that all were lies designed to oppress, and that indulgences and sacraments represented mere merchandise. Menocchio’s most famous claim, however, was his radical cosmogony, which promoted a belief in materialism and a religious tolerance deemed heretical by his Inquisitors. It is to this cosmogony that we owe the title of Ginzburg’s work:

“I have said that, in my opinion, all was chaos, that is, earth, air, water, and fire were mixed together; and out of that bulk a mass formed - just as cheese is made out of milk - and worms appeared in it, and these were the angels”

Ginzburg uses the records of the inquisition to investigate the social origins of Menocchio’s knowledge and world-view. For early modern historians, Menocchio’s views are fascinating: where did he get them from, and how widely were they held? This era, and this region of Italy, had previously been neglected by ecclesiastical historians, yet Ginzburg’s findings pose a profound challenge to established views of the conformity and homogeneity of religious thought in this area of Italy in this period. Ginzburg assesses Menocchio’s esteemed and well-networked position as a miller to offer a theory of the cultural spheres of early modern Italy, drawing important conclusions about the processes of popular consciousness and about domination by, and resistance to, the church. Since its publication and translation, Ginzburg’s work has provoked widespread reaction - both positive and negative - with many critics finding it difficult to support his methods and conclusions. Regardless, this widespread discussion of and critical engagement with Ginzburg’s work is a testament to an intriguing and exciting text. Whether or not you agree with his conclusions, this seminal contribution to the field of microhistory is of great methodological value to all scholars of the early modern period, and beyond.

Wilf Kenning


Editors
Daniel Johnson
Reuben Williamson

Head of Design
Marton Jasz

Design Team
Francesca Bradley
Georgia Dey
Beatriz Da Costa Honrado
Rebecca Knight

Head of Copy Editing
Kate Jackson

Copy Editing Team
Ayesha Patel
India-Rose Channon
Savannah Holmes
Kristen MacDonald
Will Kerrs

Head of Online
Wilf Kenning

Online Team
Natasha Tai
Amy Dwyer

Head of Marketing
Polly Pye

