HENRY SMITH STUDIES
PRIZE WINNERS 2017
REIGATE GRAMMAR SCHOOL
Images by: Amber Rothera - Governors’ Prize Winner (Astrophotography)
“HSS allowed me to combine my passions and get to know each of them on a different level. It was massively exciting and liberating to discover and invent new ideas for myself.”
Ashwin Bhat, Headmaster’s Prize Winner

“Starting HSS, there was such a vast scope of areas to delve into and write about; I started off looking into the evolution of women’s rights - even looking as far back as the time where women were persecuted for being witches! The ability to focus on any topic meant that each of us created a truly personal project. Researching and writing a critical analysis of a moral practice has been extremely interesting, even shocking at times; learning what is not in the usual school syllabus has been more than worthwhile.”
Jo Welsh, Headmaster’s Prize Winner
FROM THE HEADMASTER
In our inaugural year of Henry Smith Studies, I am so impressed by the calibre of the projects that have been produced. Across the board, students have applied themselves diligently and enthusiastically to the creation of a project which is completely their own. The range of creative and ambitious project areas has been truly remarkable: from a study of prosthetics and their enhancement of sport, to an exploration of how Hellenistic values have ruined women’s rights, all students have explored their own interests far beyond any syllabus. The task of awarding the Headmaster’s Prize was challenging. There were many deserving projects which were nominated, reflecting the quality of research and academic writing across the board. Whilst the projects contained within this book are those which stood out as being particularly outstanding in their creativity, rigour and ambition, I wish to extend my congratulations to every single student in the Lower Sixth Form.
All of you should be proud of your achievements. Independently researching, writing and creating such academic and ambitious projects is impressive and will serve you very well as you move on to future challenges. Particular congratulations, however, must go to both the winners of the Headmaster’s Prize (contained within) and the winners of the Governors’ Prize. Your projects are
outstanding, reflecting the time, energy and enthusiasm committed to them, and I have no doubt that they will inspire future students of Henry Smith Studies to aim high and think critically. Congratulations!
Mr Shaun Fenton, Headmaster
AN INTRODUCTION TO HENRY SMITH STUDIES
Henry Smith Studies was created to encourage all students to broaden their academic horizons beyond the confines of A Level study, capturing a spirit of curiosity in learning. Its purpose is to encourage students to engage in intellectual debate, discourse and research in areas of academic life which pique their interest. All Lower Sixth students at Reigate Grammar School complete the programme, which is personally tailored to their interests and personalities. In doing so, Henry Smith Studies gives all students tangible and credible evidence of independent study interest, intellectual ambition and undergraduate-quality skills. Our namesake, Henry Smith, was a great philanthropist, born in Wandsworth in 1548. He went on to accumulate great wealth through the acquisition of land and estates across the City and the south of England. Henry died in 1627 and declared in his Will that a gift of £1,000 be used for the relief of the poor and to educate local children in Reigate. In 1675, Reigate Grammar School was founded to educate poor boys in reading, writing and simple calculations and has remained on the same site
to this very day. Henry Smith Studies seeks to capture Henry’s belief in the importance of learning, inspiring young people to strive to be the best that they can be, to pursue their curiosities and interrogate their questions.
This cohort of projects is truly outstanding. The resulting pieces are wide-ranging and imaginative: creations include a set of architectural drawings, paintings, a songwriting app created from scratch, an adaptation of ‘The Boy in the Dress’ as a script for children, an essay on using 3D printing to treat burn victims and a study of life with epilepsy. All students should be immensely proud of their achievements. They are fascinating and engaging, demonstrating real skill and rigour.
Special mention should go to our Henry Smith Studies prize winners. The Headmaster’s Prize is awarded to projects of outstanding academic value. These prize-winning essays are included within this book. The Governors’ Prize is awarded for creativity, ambition and endeavour.
“HSS was a wonderful opportunity to explore the subject that I was interested. It also gave me a good guide-line for how to write complicated academic essay.”
Edward Nam, Governors’ Prize Winner

The Headmaster’s Prize is awarded to:
Aaron Gilchrist: How can neural networks and machine learning be used to improve video games?
Adam Mendoza: Is the thin-skull rule justifiable or should it be altered?
Ashwin Bhat: The physics of a swinging ball, and how to predict its trajectory using 3D vectors, force variations and turbulent air flow models
Joanna Welsh: Is bride price in the African culture a morally acceptable tradition?
Leo Nasskau: How was the writing of Friedrich Nietzsche interpreted around the world and how is it considered today?
Milo Stephens: Opus Alpha song writing app
Tiger Fry-Stone: A diorama of the Battle of the Bulge during the Ardennes offensive.
Tomos Nutt: To what extent was the ‘financial revolution’ and development of England’s economy caused by William III’s personal actions?

The Governors’ Prize is awarded to:
Amber Rothera: Astrophotography
Amelia Knibbs: Why are teenage girls more prone to depression in the present day?
Ava Warriner: Is Western Buddhism more of a social movement than a religion?
Ben Dunn-Flores: Brain-computer interfaces
Charlotte Hart: What would Thomas Jefferson think of the politics of modern day America?
Edward Nam: Why is it that the countries containing the best universities are not the countries with the highest performing high school students?
Manish Seeruthun: The mathematics of physics in the chemical world of nature

Congratulations!
Mrs Leck
HOW CAN NEURAL NETWORKS AND MACHINE LEARNING BE USED TO IMPROVE VIDEO GAMES? Aaron Gilchrist
Introduction
The inevitable future rise of Artificial Intelligences (AI) over the human race has been widely envisioned within popular culture over the last few decades; only recently has the technology predicted to cause our demise started to be developed. In the future, we may encounter AIs in every part of our daily lives; currently, AI is limited to rudimentary machine learning systems within search engines and data analysis. The idea of Artificial Intelligence conjures up the futuristic image of a humanoid robot personal assistant; in reality, in the next five years it could start appearing within personal computer environments, especially video games. This essay aims to investigate and evaluate the viability and suitability of machine learning in some of these contexts, and predict how we will be using it in the future, especially regarding neural networks in the context of video games.
Figure 1: We are still a long way from world domination by robots as predicted in the films such as Terminator 2.
A neural network, in this context, is a computational model used in machine learning which processes information in ways analogous to a biological nervous system, using the connections between so-called ‘neurons’. The networks are trained by giving them sample input and output data, and they attempt to work out the relationship between the two; this is often thought of as ‘learning’. They are especially efficacious with large sets of data, so have found use within the scientific community, and researchers are currently studying ways to use them in different contexts. Video games are unique in their reliance on real-time interactivity, which makes neural networks especially useful, due to their ability to ‘learn’ from past events and predict likely things to happen in the near future.
As the limit of Moore’s law1 is reached, the optimisation of software is becoming increasingly important as the problems to compute become much more complex. The further development of AI will allow computers to understand the issues that they are solving, as opposed to obeying clever algorithms determined by the programmer. In 1997 IBM’s ‘Deep Blue’ supercomputer defeated Russian chess grandmaster Garry Kasparov (IBM, n.d. a.) using an algorithm which considered 200 million positions per second to search for the next optimum move by considering probabilities of select positions on the chessboard (IBM, n.d. b.). This is more efficient than brute-forcing (testing the possible outcomes of all possible moves); however, it was not perfect: after much analysis of the moves in the game by chess experts, it is widely thought that Deep Blue beat Kasparov as a result of an unintended move due to an error (Latson, 2015). An AI could observe its opponent’s previous games and truly understand how they play; this should be more successful than a clever algorithm
Figure 2: The match in 1997 when Deep Blue defeated Garry Kasparov
written by a programmer. Additionally, the development of Artificial Intelligences in computers will allow programmers to focus on adding more features to software, as opposed to having to write specific algorithms for different conditions. The true extent of the possibilities of AI was aptly demonstrated in 2016 by Google’s ‘AlphaGo’ (Hassabis, n.d.), which defeated the South Korean Go Grandmaster Lee Se-dol. Go is a Chinese game, invented over 2500 years ago, played by two players on a 19x19 grid who take turns to place down stones and form territories on the board; the winner is the player with the largest total territory. The vast number of moves possible in the game has given it a reputation for being
1 In 1965 Gordon Moore, the co-founder of the microchip maker Intel, observed that the number of transistors per square inch on integrated circuits had doubled every year since their invention. Moore’s Law predicts that this will continue into the foreseeable future.
Figure 3: The match in 2016 when AlphaGo defeated Lee Se-dol.
extremely difficult for computers to solve; whereas the number of possible games in chess is of the order of 10¹²³, the number of possible Go games is of the order of 10³⁶⁰, and so it does not readily yield to brute force2. The person or organisation that produces the first true artificial intelligence appears likely to become very wealthy as a result of the huge aid to humanity that it will produce. As the speed of computers has increased, the approach of programmers has been to use the extra power to do more calculations. In 2014, leading microprocessor manufacturer Intel Corporation announced that it was adding a ‘refresh’ phase to its ‘Tick-Tock’ model of architecture development3, signalling that the scaling down of transistors is becoming more and more difficult and that the limit of Moore’s law is soon to be reached (Bright, 2016). It is fair to suggest that once humans develop quantum computing technologies Moore’s law could resume being true, at least in spirit; however, a new definition of it would be needed as a result of the different apparatus involved. While remaining in the super-atomic arena, to increase the efficiency of computing, the efficiency of programming must be improved, and ultimately the way to do this is through using AI.
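As a rough, back-of-the-envelope check on the orders of magnitude quoted above (and worked through in footnote 2), the branching-factor arithmetic can be reproduced in a few lines of Python. The figures of 35 moves over 80 turns for chess and 250 moves over 150 turns for Go are the footnote's approximations, not exact values:

```python
from math import log10

# Order-of-magnitude estimate of game-tree size: branching_factor ** game_length.
# 35 moves over 80 turns (chess) and 250 moves over 150 turns (Go) follow footnote 2.
chess_exponent = 80 * log10(35)     # log10(35 ** 80)
go_exponent = 150 * log10(250)      # log10(250 ** 150)

print(f"Chess: roughly 10^{chess_exponent:.1f} possible games")   # about 10^123
print(f"Go:    roughly 10^{go_exponent:.1f} possible games")      # about 10^360
```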
Google, the almost omnipotent search engine, has recently started a programme of research and development of more advanced neural networks as a way to make their search algorithm much more powerful and useful. Their ‘DeepDream’ (Tyka, 2015) program uses machine learning to identify patterns in images; this has already been used in the ‘Search by Image’ feature of their website to allow customers to find images similar to the ones that they upload. This regime has also been extended to the company’s translation software, Google Translate, allowing the website to move beyond its reputation for providing incorrect and maladroit results in most languages. Google describes the change as moving from breaking a sentence into words and phrases, which are then individually translated, to considering the entire sentence and translating this using a neural network which can call on usages it has seen before and learn from new usages. After the full integration of neural networks into Google Translate in late 2016 (Turovsky, 2016), a journalist reported the results of applying it to translate a short newspaper article into English from Hebrew (Spungin, 2017), a language with a limited vocabulary where there can be dozens of correct translations for a word. He concluded that the overall translation was adequate, with some sentences in acceptable English and others which lost all sense of their original
meaning. Clearly this development, compared to its previous reputation of producing almost nonsensical results, is a significant step in the right direction. The creation of a computer system which can learn in the same way as a human is the key to developing correct and realistic language translation; there is a natural progression from the principles used here to simulate organic linguistic nuances to the more demanding real-time calculations required in the improvement of video game environments. In the age of almost photorealistic video games, it would be expected that the actions of Non-Player Characters (NPCs) should appear human-like. This is not yet the case, but neural-network-controlled NPCs look like they will change this.
How do neural networks work?
A neural network is best understood through an example4. Below I describe a neural network that aims to mimic visual recognition, something that a realistic opponent in a video game would need to do. The task is to build a network that will recognise handwritten digits (0 to 9). We have 60 000 pixelated monochrome pictures of handwritten digits; each picture is split into 784 (28 × 28) pixels, the greyness of each pixel is provided, along with confirmation of the digit that has been written. The data set is split into 50 000 pictures (written by 250 people) to be used to train the neural network,
2 Typically, there are 35 different next moves in chess, and the game lasts for 80 moves: so the number of possible games is approximately 35⁸⁰, or about 10¹²³. In Go, there are typically 250 possible next moves and the game lasts for 150 moves, so the number of possible games is approximately 250¹⁵⁰, or about 10³⁶⁰.
3 This refers to innovations in the process of manufacturing microprocessors (a “tick”) followed by innovations in the structure of microarchitecture (a “tock”) in alternating cycles (tick followed by tock followed by tick and so on).
4 I have synthesised this example from “Neural Networks and Deep Learning” by Michael A. Nielsen (Nielsen, 2015)
and the remaining 10 000 (written by a different set of 250 people) to be used to test the network’s ability to recognise handwritten digits.
Building the network
The network is shown in Figure 4 (not all the neurons are shown). There are 784 inputs to the network (input neurons): one recording the shade of greyness of each pixel of a picture. Although these inputs are shown as neurons, they are really just data items. The greyness of each pixel is an input to each of the 30 hidden neurons in the next layer of the network. Hidden neurons are called that for no other reason than they are not input or output neurons, so are not interacted with by the user5. There are 10 output neurons. Once the network has been educated, the first output neuron will indicate how likely it is that a handwritten digit in a picture is a zero. The second output neuron will indicate how likely the digit is a 1, and so on. If the likelihood is high then the value will be close to 1 and if the likelihood is low then the value will be close to zero.
Figure 4: The network built to recognise digits from 0 to 9.
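For readers who prefer code, the structure just described can be sketched in a few lines of Python with NumPy. The layer sizes (784 inputs, 30 hidden neurons, 10 output neurons) and the random starting values between 0 and 1 come from the description above; the variable names and the use of NumPy are simply illustrative choices, not part of the original example:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n_inputs, n_hidden, n_outputs = 784, 30, 10   # 28 x 28 pixels -> 30 hidden neurons -> 10 digits

# Each hidden neuron has 784 weights (one per pixel) and one bias.
hidden_weights = rng.random((n_hidden, n_inputs))    # shape (30, 784), values between 0 and 1
hidden_biases = rng.random(n_hidden)                 # shape (30,)

# Each output neuron has 30 weights (one per hidden neuron) and one bias.
output_weights = rng.random((n_outputs, n_hidden))   # shape (10, 30)
output_biases = rng.random(n_outputs)                # shape (10,)

# 23 520 + 300 = 23 820 weights and 30 + 10 = 40 biases, matching the counts given later.
print(hidden_weights.size + output_weights.size, hidden_biases.size + output_biases.size)
```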
What work is done at each neuron?
Figure 5 shows how each hidden neuron looks (only three of the 784 inputs are shown in it). Each hidden neuron uses the 784 (pixel greyness) inputs to produce a single output; all 30 of the single outputs from the hidden neurons are then used by the 10 output neurons. The single output value from a hidden neuron is calculated from the weighted sum of the input values and the bias. Each hidden neuron has its own set of 784 weights and one bias. Initially the weights and biases, each of which lies between 0 and 1, are determined
Figure 5: A simplified view of each hidden neuron.
randomly. Their values are changed through the learning process. If the weighted sum of the inputs plus the bias is close to 1 then the output value will be close to 1, and if the weighted sum of the inputs and the bias is close to zero then the output value will be close to zero. The output neurons are exactly the same as the hidden neurons, except that they have 30 inputs (the output values of the 30 hidden neurons) rather than 784, and each has its own set of 30 weights and one bias. Figure 6 shows an example output neuron (not all 30 inputs are shown). The formula used to calculate the output value from the weighted inputs
Figure 6: A simplified view of each output neuron.
5 The number of neurons in each hidden layer, and the number of hidden layers, are not fixed. In this example one layer of 30 neurons performs very well.
and bias is the same as that used for hidden neurons.
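In code, the calculation performed by a single neuron looks like the sketch below: a weighted sum of its inputs plus its bias, squashed so the result lies between 0 and 1. The essay does not spell out the squashing formula, so the sigmoid function used in Nielsen's book (the source of this example) is assumed here:

```python
import numpy as np

def sigmoid(z):
    """Squash a weighted sum into the range 0 to 1 (assumed form of the squashing function)."""
    return 1.0 / (1.0 + np.exp(-z))

def neuron_output(inputs, weights, bias):
    """One neuron: weighted sum of the inputs plus the bias, squashed to between 0 and 1."""
    return sigmoid(np.dot(weights, inputs) + bias)

# A hidden neuron looking at one 784-pixel picture (random values stand in for real data).
rng = np.random.default_rng(seed=1)
pixels = rng.random(784)                                       # greyness of each pixel
value = neuron_output(pixels, rng.random(784), rng.random())   # this neuron's weights and bias
print(value)                                                   # a single number between 0 and 1
```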
Educating the Network
In this manner, you have constructed a network with 23 820 weights6 and 40 biases7, each of which has a value between 0 and 1 that has been randomly generated8. Now the 50 000 training pictures are run through the network. For each picture, you will have a pair of vectors of length 10. Here is an example: (1,0,0,0,0,0,0,0,0,0), which indicates that the handwritten number was a zero9; and (0.150,0.366,0.918,0.411,0.090,0.562,0.229,0.444,0.876,0.468), the “guess” generated by the network10. Not a very good first guess from the trainee. Each pair of vectors represents two points in 10-dimensional space, and the aim of educating your network is to keep making small changes to the weights and biases that move the guesses closer and closer to the target points. Of course, no one can imagine 10-dimensional space, so suppose the pictures can only be of the numbers 0, 1 and 2 and you reduce the number of output neurons to three. After running the model the first time, you will have 50 000 pairs of points (each a target and a guess) that fit in a cube with sides of length 1.
Now you want to make small changes to the weights and biases to improve the fit of the guesses to the targets, but first you have to decide how to measure how good the fit is. It turns out you can focus on reducing the sum of the squares of the distances between each of the guesses and its target11. Call this measure F.
One way to incrementally reduce F (i.e. improve the fit) is to use the gradient descent methodology. You make a small increase to just one weight or bias (call it x1), leave all the others unchanged, and rerun all 50 000 pictures through the network to calculate the resulting change in F12. If the result is a decrease in F of 1%, say, then in a minute you will increase x1 by 1% of h (a constant known as the learning rate). If the result is an increase in F of 0.5%, say, then in a minute you will decrease x1 by 0.5% of h.
Now you set x1 back to its original value and make a change to another weight or bias (call this x2), leaving all others unchanged. Suppose that after increasing x2 and rerunning all 50 000 pictures through the network F increases by 0.2%. Then in a minute you will decrease x2 by 0.2% of h. Once you have determined how to change each and every weight and bias, you make all the changes in one go and recalculate F: call its new value F1, which will be smaller than F. That completes the network’s first lesson. Now you repeat the lesson using F1 instead of F to determine whether to increase or decrease each of the weights and biases, and to determine the size of the changes (the method of determining the size does not change and nor does h). Then you make all the changes in one go and calculate F2 (which will be smaller than F1). The next lesson uses F2 to calculate F3 and so on.
Basing each lesson on 50 000 pictures is going to be quite time consuming, so a shortcut, known as stochastic gradient descent, is taken. This just involves basing each lesson on a small, randomly chosen, subset of the pictures.
You randomly sort the 50 000 pictures and then you train the network using just the first 10 pictures (known as a mini-batch)13. Then you train it on the next 10, and so on. When the 50 000 have been exhausted by lessons, an epoch of training is said to have been completed (perhaps they should have referred to “an academic year”). When an epoch is completed you randomly sort the pictures again and repeat the process of training using each successive mini-batch. (Using a mini-batch size of 10 means a 5 000-fold reduction in training time). The challenge is deciding how few epochs you can get away with, how small a mini-batch can be, and how large a learning rate (h) you can use. The table in Figure 7 shows the proportion of the 10 000 testing
6 784 for each of the 30 hidden neurons and 30 for each of the 10 output neurons.
7 One for each of the 30 hidden neurons and one for each of the 10 output neurons.
8 Actually, pseudo-randomly, as the random number generator of the programming language is used.
9 If the “1” was in the second position it would indicate that the handwritten number is 1, and so on.
10 This isn’t a real result of the first training of a network; however, it is representative of a result after one training run.
11 Other measures are possible provided they depend on all the weights and biases.
12 In effect, you are calculating the first derivative of F with respect to x1.
13 The weights and biases are changed so that F calculated using just the 10 pictures reduces in value.
Figure 7: A table showing the progress of the network’s training.
samples where the network fails to identify the digit. It also shows what happens if you choose too small a learning rate (i.e. the changes in weights and biases are too small, so slow progress is made towards the targets), or too large a rate. The result is stunning. We have taught a network, which knew nothing about handwritten digits, to recognise 95% of the new digits it is given. If the number of hidden neurons is increased to 100 the success rate increases to 96.6%, and accuracy in excess of 99.5% is possible if other changes are made. Also, once the network has been built14, it is quick to use15 and adding new data is straightforward. The updating process will start from the current weights and biases used by the network, rather than from randomly generated values, and does not need access to your original training data; otherwise the updating process is the same as the building process.
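A scaled-down sketch of the training procedure described above: shuffle the pictures, take mini-batches of 10, nudge each weight and bias to see how F responds, then make all the changes in one go, and repeat for several epochs. To keep it quick to run, it uses a deliberately tiny network and random stand-in 'pictures' rather than the real 784-pixel data; the settings for h, the batch size and the number of epochs are arbitrary, and real implementations estimate the slope analytically (backpropagation) rather than by re-running the data for every single parameter, so the numbers it prints are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(seed=2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(params, pics):
    """Tiny two-layer network: inputs -> hidden -> outputs, each layer squashed with sigmoid."""
    w1, b1, w2, b2 = params
    return sigmoid(sigmoid(pics @ w1.T + b1) @ w2.T + b2)

def cost(params, pics, targets):
    """F: the sum of squared distances between the network's guesses and the targets."""
    return np.sum((forward(params, pics) - targets) ** 2)

# Deliberately tiny stand-in data: 6 'pixels' per picture, 5 hidden neurons, 3 possible 'digits'.
n_in, n_hid, n_out, n_pics = 6, 5, 3, 50
pictures = rng.random((n_pics, n_in))
targets = np.eye(n_out)[rng.integers(0, n_out, n_pics)]        # rows like (1, 0, 0)

params = [rng.random((n_hid, n_in)), rng.random(n_hid),
          rng.random((n_out, n_hid)), rng.random(n_out)]

h, batch_size, epochs, nudge = 0.5, 10, 40, 1e-5               # arbitrary illustrative settings

for epoch in range(epochs):
    order = rng.permutation(n_pics)                            # shuffle, then mini-batches of 10
    for start in range(0, n_pics, batch_size):
        batch = order[start:start + batch_size]
        bx, bt = pictures[batch], targets[batch]
        f_before = cost(params, bx, bt)
        slopes = []
        for p in params:                                       # nudge every weight and bias...
            slope = np.zeros_like(p)
            for idx in np.ndindex(p.shape):
                p[idx] += nudge
                slope[idx] = (cost(params, bx, bt) - f_before) / nudge
                p[idx] -= nudge                                # ...and put it straight back
            slopes.append(slope)
        for p, slope in zip(params, slopes):                   # then make all the changes in one go
            p -= h * slope

guesses = forward(params, pictures).argmax(axis=1)             # output neuron with the largest value
print("F after training:", round(cost(params, pictures, targets), 3))
print("Proportion recognised:", np.mean(guesses == targets.argmax(axis=1)))
```

The final two lines echo Figure 7: accuracy is simply the proportion of pictures for which the largest output value sits at the position of the correct digit.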
How might this translate to video gaming?
Neural networks have been used successfully in, for example, the software used by sorting offices to read addresses written on envelopes, and by banks to process cheques automatically. However, these tasks have a commonality: a readily available voluminous source of training data. If neural networks are to be used to improve NPCs then video game writers are going to have to capture large volumes of training data. In theory, this is possible: online games could be programmed to capture the actions of gamers as they interact with the environment and with algorithmically driven NPCs. (Maybe some of that data is already being captured for this purpose). However, it is not possible to count the number of different circumstances that an NPC might encounter, so this seems to be quite a challenge. Here are some examples of previous uses of neural networks, linking to video gaming.
14 This means that the values of all the weights and biases have been determined.
15 Just run the pictures through the network and examine the output neurons.
Grand Theft Auto
The lack of training data, and both the time and monetary expense of obtaining it, is a major hurdle to the further use of neural networks. Researchers at Princeton University investigating alternative sources of data were able to train a neural network to recognise stop signs by using the extremely popular open-world video game Grand Theft Auto 5 (GTA V) (The Economist, 2017). The game’s ultra-realistic city and countryside environments provided varied data for the network, and it is hoped that this and future data obtained in the same way will be used in the development of self-driving cars. There are, of course, many other aspects of the game, which reportedly cost over US$137 million to develop (Sinclair, 2013), that can produce training data, and many other games of a similar quality which can provide comparable data. Although this study was designed to provide data for self-driving cars, it would be easy to use the same approach to train video game NPCs controlled by AI.
Figure 8: The Atari 2600 games console.
Atari 2600 games console
In 1977, Atari, an early pioneer of the video game industry and creator
Figure 9: A table showing the outstanding performance of a neural network in various Atari 2600 games.
of some of the most famous arcade games, released the Atari 2600 games console. Ahead of its time, it was advertised as a way to play arcade games at home, and played some of the most well-known games of all time, such as Pong (a rudimentary two-dimensional simulation of tennis) and Breakout (a game where the player bounces a ball to destroy bricks on a wall), which were supplied on cassettes and played using simple joysticks. Researchers from the University of Toronto trained the same neural network to play 7 of the games offered by the system (Mnih, et al., n.d.), and compared its results to those of random movements, of algorithms based on substantial prior knowledge of the games’ workings through the distinctive colours of each object in the game environment (different objects are coloured differently, so high-scoring targets can be identified), and of a seasoned expert at Atari 2600 games. The table in Figure 9 shows the results of this test. The neural network clearly outperforms the algorithm based on prior knowledge in all of the games chosen, and the expert is beaten in Breakout, Enduro and Pong. It struggles to beat the expert on Q*bert, Seaquest and Space Invaders as these require a long-term strategy throughout the game, which the expert is much more able to handle. It is key to note that the network in the
experiment had much less training than the Atari 2600 expert, and that the expert, being a human, works in much the same way as a neural network, albeit on a larger scale, by learning from their mistakes. A key feature of this successful implementation is that the change in score arising from the gamer’s action (or inaction) is available to the neural network in the training data, so that it can learn which actions are most effective. So, this example, although encouraging, does not readily extend itself to the operation of NPCs in video games, as there is not an instantaneous arcade-style ratchet up or down of points; instead it may take some time before the impact of a particular action can be assessed (e.g. being friendly to an NPC may only benefit you much later on). Programmers need to find effective action-assessment criteria for neural networks if they are to control NPCs in a realistic manner. The next example illustrates some progress being made in this area.
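The point about arcade games handing the learner an immediate signal in the form of the change in score can be illustrated with a toy example. The sketch below is not the researchers' actual system (which used a deep network reading the screen pixels); it is a minimal tabular Q-learning loop on an invented one-dimensional 'game', written only to show the score change being used as the reward for each action:

```python
import random

random.seed(0)

# Invented toy 'game': the player sits on a track of 6 cells and scores a point for reaching
# the right-hand end, which ends the round. Actions: 0 = move left, 1 = move right.
TRACK_LENGTH = 6

def step(position, action):
    """Apply one action and return (new_position, change_in_score, round_over)."""
    new_position = max(0, min(TRACK_LENGTH - 1, position + (1 if action == 1 else -1)))
    scored = new_position == TRACK_LENGTH - 1
    return new_position, (1 if scored else 0), scored

# One value per (position, action): the learner's estimate of the long-run score from there.
q = [[0.0, 0.0] for _ in range(TRACK_LENGTH)]
learning_rate, discount, exploration = 0.5, 0.9, 0.1    # illustrative settings

for episode in range(200):
    position, over = 0, False
    while not over:
        if random.random() < exploration:               # occasionally try something at random
            action = random.randrange(2)
        else:                                           # otherwise pick the better-looking move
            action = 0 if q[position][0] > q[position][1] else 1
        new_position, score_change, over = step(position, action)
        # The change in score plays the role of the reward in the update rule.
        target = score_change + (0.0 if over else discount * max(q[new_position]))
        q[position][action] += learning_rate * (target - q[position][action])
        position = new_position

print("Learned preference for moving right at each cell:",
      [round(q[i][1] - q[i][0], 2) for i in range(TRACK_LENGTH)])
```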
Using neural networks to make game characters move more realistically
Traditionally, character animation is based on pre-recorded motion clips that are cycled repeatedly through a single game, which is often unrealistic and can become monotonous. A team including Daniel Holden, from the University of Edinburgh, has been looking into the use of a neural network to generate natural-looking movements through extremely complex terrain (Orland, 2017). 1.5 GB of motion data captured in a single two-hour session was synthesised into a few dozen megabytes of memory as a neural network. Unlike other animation methods, the network does not require ‘scenes’ of data stored in large databases in order to blend movement, which can slow a system. From the information it has learned, the network directly generates animations based on the user input and the terrain.
Figure 10: The character crouching on a steep ascent.
The character slows to a stop when approaching walls, rather than running full pelt into the wall and then stopping. It even swings its arms for momentum before jumping. Unexpected results include that the network allowed the character to crouch when walking over rough terrain, whereas the input data only included the character crouching on flat terrain, and walking or running on rough terrain.
The animation is unrealistic if the terrain is too steep. This is because if you do not give relevant data to the network (in this case, how to manoeuvre in very steep terrain) then there is no guarantee of the result in unexpected circumstances.
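In outline, this approach replaces a database of motion clips with a single learned function from the current situation to the next pose. The sketch below shows only the shape of such a function (inputs: the player's control stick, the heights of the terrain ahead and the current pose; output: the next pose); the layer sizes, the input layout and the untrained random weights are illustrative assumptions and not Holden's actual network:

```python
import numpy as np

rng = np.random.default_rng(seed=3)

N_JOINTS = 20            # a pose here is just one value per joint (illustrative)
N_TERRAIN_SAMPLES = 10   # heights of the ground sampled ahead of the character

def make_layer(n_in, n_out):
    """Random, untrained weights and biases for one fully connected layer."""
    return rng.standard_normal((n_out, n_in)) * 0.1, np.zeros(n_out)

# Inputs: 2 control-stick values + terrain heights + current pose.  Output: the next pose.
n_inputs = 2 + N_TERRAIN_SAMPLES + N_JOINTS
layer1 = make_layer(n_inputs, 64)
layer2 = make_layer(64, N_JOINTS)

def next_pose(control, terrain_heights, current_pose):
    """One animation step: the network maps the situation directly to the next pose."""
    x = np.concatenate([control, terrain_heights, current_pose])
    hidden = np.tanh(layer1[0] @ x + layer1[1])
    return layer2[0] @ hidden + layer2[1]

# One frame of animation on made-up inputs (a trained network would have learned its
# weights from the captured motion data instead of using random values).
control = np.array([0.0, 1.0])                     # push the stick forward
terrain = rng.random(N_TERRAIN_SAMPLES) * 0.2      # gently uneven ground ahead
pose = np.zeros(N_JOINTS)                          # neutral starting pose
print(next_pose(control, terrain, pose)[:5])       # first few joint values of the next frame
```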
Simulation of the human brain
Fujitsu’s ‘K Computer’ in Japan, consisting of more than eight thousand processors and ranked as the seventh fastest computer in the world on the Top500 supercomputer index (as of November 2016) (TOP500.org, 2016), took about 40 minutes to emulate just one second of accurate human brain activity. The computer has 705,024 CPU cores and 1.4 million GB of RAM (Sparkes, 2014), far more than the 4 CPU cores and 8 GB of RAM found in normal mid-range desktop computers. The program simulating the brain was the Neural Simulation Technology (NEST), which allowed it to emulate the equivalent of a neural network with 1.73 billion neurons and 10.4 trillion connections between them. The massive computing power needed for this simulation is indicative of how advanced the brain is compared to the computers we have built. So, while neural networks may bring some improvement in the performance of NPCs, it appears unlikely that they will truly simulate the workings of the human brain to emulate their behaviour.
Given their power, it appears likely that the use of neural networks in real things will become much more widespread. How far should we be willing to cede control to such systems?
Conclusion
In the network example described previously there are 30 hidden neurons. There is no way of placing a meaningful interpretation on what any of those neurons are actually doing, or of understanding how those actions are combined together to produce the desired result. Whatever linkages the network has identified in the training data are unknown to us. Unlike an algorithm, we cannot be sure how the network will operate in all circumstances, in particular if unexpected data is received (such as very steep terrain). For a video game this does not matter so much (unless you are the game developer): the worst that can happen is that the program does something odd (e.g. the NPC defies gravity) or crashes, and there are some disgruntled customers. Of far more concern is the use of neural networks in operating real
things. Would you want to travel in an automated car driven by a neural network, or live near a nuclear power station operated by such a network?
Figure 11: An advanced algorithm classified the image on the left as a truck; however, the image on the right was misclassified.
On the left in Figure 11 is an image of a truck that was correctly classified by a very sophisticated neural network of the sort that is becoming widely used. The middle image shows minor changes (interference) made to the pixels of the picture, and the right-hand image shows the effect of applying those changes. The image on the right was incorrectly classified by the network. Far from being an exception, the researchers indicate that it is possible to identify minor interference for each of the images in their data set that produces the same outcome (Nielsen, 2015). So, the wider use of neural networks will need to be tempered with adequate consideration of the checks and balances that must surround their use, and the circumstances in which their control can be overridden.
In a game of the racing genre, neural networks could be trained on data from famous real-life drivers to create NPCs with driving styles similar to theirs, for players to race against and try to defeat. Role Playing Games (RPGs) could be made more immersive through the use of neural networks by tailoring the side-quests
and the main story to the play style of the player; this, coupled with the use of procedural generation (algorithms that create game environments that are random, but fit certain constraints), could enhance the gaming experience significantly. The gamer will no longer be able to anticipate his or her opponent’s next actions once they have played the game a few times, making it much more difficult to play and therefore requiring much more skill. AI-controlled opponents in First
Person Shooter (FPS) games could become much more challenging to beat, particularly if they learn from the gamer’s moves, creating an opponent which knows the player’s flaws. At the time of writing, Neural Networks and Artificial Intelligences are at too primitive a stage to make a huge difference to the world of video gaming. However, a few years from now, when quantum computers speed up processing considerably, allowing for larger networks to be simulated
on smaller hardware, the simulation of the basic thoughts of every NPC, procedural generation of more realistic game environments, and provision of more challenging opponents will vastly increase the immersion of video games, and the enjoyment of players. In popular culture AIs are stereotyped as humanoid robots with the intention of world domination; in reality, they will be programs within our computers, helping our productivity and aiding our leisure through video games.
Bibliography
Bright, P. (2016, March 23). Intel retires ‘tick-tock’ development model, extending the life of each process. Retrieved from Ars Technica: https://arstechnica.co.uk/information-technology/2016/03/intel-retires-tick-tock-process-architectureoptimisation/
Hassabis, D. (n.d.). Exploring the mysteries of Go with AlphaGo and China’s top players. Retrieved from Google DeepMind: https://deepmind.com/research/alphago/
IBM. (n.d. a.). Deep Blue. Retrieved from IBM 100: http://www-03.ibm.com/ibm/history/ibm100/us/en/icons/deepblue
IBM. (n.d. b.). How Deep Blue Works. Retrieved from IBM Research: https://www.research.ibm.com/deepblue/meet/html/d.3.2.html
Latson, J. (2015, February 17). Did Deep Blue Beat Kasparov Because of a System Glitch? Retrieved from Time: http://time.com/3705316/deep-blue-kasparov/
Mnih, V., Kavukcuoglu, K., Silver, D., Graves, A., Antonoglou, I., Wierstra, D., & Riedmiller, M. (n.d.). Playing Atari with Deep Reinforcement Learning. (U. o. Toronto, Producer) Retrieved from https://www.cs.toronto.edu/~vmnih/docs/dqn.pdf
Nielsen, M. A. (2015). Neural Networks and Deep Learning. Determination Press. Retrieved from http://neuralnetworksanddeeplearning.com/
Orland, K. (2017, May 4). The AI revolution is making game characters move more realistically. Retrieved from Ars Technica: https://arstechnica.co.uk/gaming/2017/05/how-neural-networks-are-making-for-more-lifelike-video-game-animation/
Sinclair, B. (2013, February 1). GTA V dev costs over $137 million, says analyst. Retrieved from GamesIndustry.biz: http://www.gamesindustry.biz/articles/2013-02-01-gta-v-dev-costs-over-USD137-million-says-analyst
Sparkes, M. (2014, January 13). Supercomputer models one second of human brain activity. Retrieved from The Telegraph: http://www.telegraph.co.uk/technology/10567942/Supercomputer-models-one-second-of-human-brain-activity.html
Spungin, S. (2017, April 7). What Suddenly!? Google Translate Isn’t About to Replace Humans. Retrieved from Haaretz: http://www.haaretz.com/israel-news/.premium-1.781219
The Economist. (2017, May 13). Shall we play a game? Why AI researchers like video games. The Economist.
TOP500.org. (2016, November). November 2016. Retrieved from Top 500 The List: https://www.top500.org/lists/2016/11/
Turovsky, B. (2016, November 15). Found in translation: More accurate, fluent sentences in Google Translate. Retrieved from The Keyword: https://blog.google/products/translate/found-translation-more-accurate-fluent-sentences-google-translate/
Tyka, M. (2015, June 17). Inceptionism: Going Deeper into Neural Networks. Retrieved from Google Research Blog: https://research.googleblog.com/2015/06/inceptionism-going-deeper-into-neural.html
IS THE THIN-SKULL RULE JUSTIFIABLE OR SHOULD IT BE ALTERED? Adam Mendoza
Abstract
In this essay, the current use of the thin-skull rule in modern law in England is examined and it is considered whether the current interpretation of the law requires alterations to be an effective and fair judge of criminal guilt as well as responsibility for any harm caused by the defendant. This is judged by considering the criteria of the intent of the defendant to cause harm, whether the outcomes of their actions were foreseeable and whether the defendant can be found guilty within the chain of causation using the ‘but for’ or ‘sine qua non’ test.
Introduction
The thin-skull rule, a doctrine of the law of causation, is a principle of common law that states the defendant must take their victim as they find them and that the defendant is responsible for all harm caused as a consequence of their actions: they cannot use a pre-existing vulnerability of the victim as part of their defence, even if a normal person might not have suffered the same degree of harm. The principle appears to originate from a case in England in 1901, Dulieu v. White and Sons1, in which it was stated: “If a man is negligently run over or otherwise negligently injured in his body, it is no answer to the sufferer’s claim for damage that he would have suffered less injury, or no injury at all, if he had not had an unusually thin skull or an unusually weak heart.”
It is therefore implied that any physical weaknesses of the victim that alter the outcome of the defendant’s actions cannot be seen to have broken the chain of causation and so cannot be used as a defence. Furthermore, the same principle also extends further to cover any of the victim’s actions influenced by religious beliefs: as seen in the case of R v Blaue (1975)2, in which the act of the victim in refusing a blood transfusion that would have saved her life was not seen as a novus actus interveniens (new intervening act) and therefore did not break the chain of causation. The outcomes of these cases have demonstrated the way in which courts have interpreted who is to blame in matters relating to the thin-skull rule, with the defendant being blamed as blame cannot be placed on the victim and the defendant is responsible for any damage stemming from his original tort.
1 Dulieu v White and Sons, 1901
2 R v Blaue, 1975
3 Herring, Great Debates in Criminal Law, p. 37
Areas for debate
The main area for debate around the thin-skull rule is whether it can be deemed to be fair that a defendant
should be liable for a consequence of their action that was unforeseeable to them, and if it is judged that they should be responsible for the outcomes, then to what extent? An outcome not being foreseeable to the defendant may be a relevant defence to mens rea; however, it is not a defence to causation3, therefore possibly resulting in the defendant being deemed guilty, but given a reduced punishment as they are less culpable. The use of reasonable foreseeability (whether or not a normal person of normal intelligence would have foreseen the outcome) could be used to judge the guilt of the defendant more fairly; however, it is criticised for not adequately defending the interests of the victim. Another area of debate is whether a victim intentionally taking a decision not to save their own life is an action that should break the chain of causation, given that making the same decision for religious beliefs would
not break the chain of causation. How far should a defendant have to take their victim as they find them, and why should a defendant have to take physical disabilities and religious beliefs as part of the victim but not the victim’s decision making? It is the view of P. J. Rowe that “Now is the time, once and for all, to jettison the thin skull rule” and that “the thin skull rule is a red herring in stormy waters” that should be forgotten, as it is his belief that the thin skull rule no longer serves a useful purpose because of the issues surrounding the foreseeability of the crimes that are committed4. This view, while not commonly held, will also be explored to determine whether the thin skull rule is both justifiable and useful in today’s society.
Arguments opposing the thin skull rule
Reasonable Foreseeability
The ‘thin skull’ rule says that the defendant must take his victim as he finds him. Therefore, even if injury or death is not reasonably foreseeable, the law still considers the defendant liable if the victim suffered from some physical or mental condition that made him or her vulnerable. In the case of R v Corbett [1966] CA5, a fight broke out between the defendant and victim which resulted in the victim fleeing. The victim fell into a gutter as he attempted to escape and was subsequently hit by a passing vehicle, sustaining a fatal injury. It was deemed that in this particular
occasion the response of the victim was foreseeable to the defendant and, therefore, the defendant’s attack was a cause of the victim’s death: the defendant was charged with manslaughter. While this may seem harsh, it is also understandable because the defendant chose to act despite realising that the eventual outcome was a possibility; however, the same charge can be handed out to someone who did not possess the necessary knowledge to acknowledge the eventual outcome as a possibility. Similarly, in the case of R v Hayward (1908)6 the defendant chased his wife out of their house while yelling threats at her, but did not physically touch her. The wife had a rare thyroid condition which, as a result of the physical exertion combined with the emotions of fright and panic, caused her to collapse and die. The defendant was found to be guilty of constructive manslaughter for the death of his wife, despite both his wife and himself being unaware that she had the condition that resulted in her death, making the outcome completely unforeseeable to both of them. The reason that the defendant was found guilty was that his actions were deemed to be unlawful and the cause of death when the ‘but for’ test was applied (but for his actions, his wife would not have suffered death from her thyroid condition), and therefore, when the thin-skull rule was applied, the unforeseeable nature of the outcome was not a valid defence. Once more, in the case of Smith v Leech Brain (1962)7 a widow brought a claim against her husband’s employer
following the death of her husband due to an injury sustained at work. As a result of the negligence of the defendant the widow’s husband had incurred a burn to his lip, which triggered pre-cancerous cells within his lip, causing him to die from cancer three years later. It was adjudged that the burn was a foreseeable consequence of the defendant’s negligence and it was not necessary to show that death by cancer was foreseeable. Neither was it a valid defence that an ordinary person would not have died from the same injury. Thus, the thin skull rule was applied, with the defendant being forced to take his victim as he finds him, and the defendant was found to be liable for the death. It can often be seen as greatly unfair that a defendant can be held responsible for a result that was unforeseeable to them, principally due to the lack of intent. If a person was to consider the possible outcomes of their actions, foresee a bad outcome and yet still act in this way, then there would be little complaint if they were then held responsible for those actions. However, if the same person was to do this and could not have expected the eventual outcome, then it seems far more unfair to hold them responsible and punish them for the outcome. Therefore, the use of reasonable foreseeability of consequences has often been called upon to give a fairer judgement upon the accused. For instance, if we consider the following hypothetical case: “Dan lends Susan his bicycle. She leaves it chained to a pillar outside
4 Rowe, The Demise of the Thin Skull Rule, p. 377-388
5 R v Corbett, 1966
6 R v Hayward, 1908
7 Smith v Leech Brain, 1962
her house. A bolt of lightning comes down and destroys the bike. Violet lends Wendy her bicycle. Wendy’s garden is particularly prone to flooding. She leaves the bike in her garden. There is a flood and the bike is destroyed.”8
In this example, it seems unfair to say that Susan caused the damage to the bike, but less so to say Wendy caused the damage to the bike, due to the issue of the outcome being foreseeable. The lightning and the flood are both uncontrollable by the defendant; however, because the flood was foreseeable while the lightning was not, we issue different levels of blame. Given that this is the response we take to ‘freaks of nature’, such as acts of God, then why is it different or wrong to take the same approach to a victim’s hidden vulnerabilities?
The victim’s actions
It needs to be considered whether the actions of a victim following the original tort should be considered when contemplating the appropriate punishment in cases of causation and the thin-skull rule. If an action is taken by the victim which purposely and knowingly causes their death, possibly just to spite the defendant, should this be seen as a novus actus interveniens and should reduced liability for the outcome be given to the defendant?
8 Herring, Great Debates in Criminal Law, p. 37
9 California v Lewis, 1899
10 R v Holland, 1841
11 R v Dear, 1996
Previous cases, such as California v Lewis (1899)9, have shown that the victim’s own actions are not regarded as breaking the chain of causation, and therefore, so long as the wound remains an “operative and substantial cause of death”, the defendant is guilty. In the case of California v Lewis (1899) the defendant shot his brother-in-law, causing a wound that would have been fatal within a short period; however, the victim hastened his death further by cutting his own throat. Using the ‘but for’ test, even if the victim had not cut his throat, the victim still would have died from the gunshot wound and therefore the wound was an “operative and substantial cause of death”. Likewise, in the case of R v Holland (1841)10 the defendant injured one of the victim’s fingers when assaulting him. The victim chose to go against advice given to him to have the finger amputated and subsequently died from tetanus. Again, it was deemed that the actions of the defendant were the cause of death and therefore the defendant was guilty. Similarly, in the case of R v Dear (1996)11 the defendant attacked a man with a Stanley knife after the victim had apparently molested the defendant’s 12-year-old daughter. The defendant argued that the chain of causation had been broken because, two days after the stabbing, the victim had committed suicide either by reopening his wounds or because he had failed to take steps to limit the blood flow after the wounds had reopened spontaneously, and the lack of action
to protect himself constituted a novus actus interveniens. It was held that the real question was whether the injuries inflicted by the defendant were an operating and significant cause of or contribution to the death. Distinctions between the victim’s mere self-neglect (no break in the chain of causation) and the victim’s gross self-neglect (break in the chain of causation) were not helpful. The victim’s death resulted from bleeding from the artery severed by the defendant. Whether or not the resumption or continuation of that bleeding was deliberately caused by the victim, the defendant’s conduct remained the operative and significant cause of the victim’s death. In the aforementioned cases the victims acted in such a way that the final outcome of the defendants’ actions was altered, and hence it could be suggested that a re-evaluation is required to judge who bears responsibility for the ensuing outcomes. Currently, due to the phrase that classifies the defendant as guilty so long as the wound, or wounds, inflicted by them are the “operative and substantial cause” of death, the defendant is found guilty, despite the influence that the victim had on the final outcome. As with an outcome seeming harsh if it is not foreseeable, it additionally seems harsh that a defendant can be prosecuted for an outcome which was largely the result of the victim’s own actions and could have been far less severe had the victim not acted in a way which increased the harm. In contrast to the previous cases, the unlawful act must be criminal,
not just tort, for the defendant to be deemed guilty, as seen in the case of R v Arobieke [1988]12, in which the defendant was judged to be not guilty. The defendant went to a railway station looking for the victim following some animosity between them, leading to the victim fearing serious violence if the defendant found him. When leaving his train, the victim was electrocuted by the tracks and died. Since there was no unlawful act in standing on a platform looking into trains and there was no evidence that the defendant had threatened the victim to cause him to believe that he was at risk, there could be no conviction, so the defendant was not guilty. In R v Kennedy13 the appellant had his conviction of constructive manslaughter overturned after the victim (Mr Bosque) had died after injecting himself with heroin supplied by the defendant. Upon the appeal, the Court of Appeal certified the following question of law: “When is it appropriate to find someone guilty of manslaughter where that person has been involved in the supply of a class A controlled drug, which is then freely and voluntarily self-administered by the person to whom it was supplied, and the administration of the drug then causes his death?” The answer to this question was: “In the case of a fully-informed and responsible adult, never”. The charge was overturned because the victim had the choice, knowing the facts, as to whether to inject himself or not and then proceeded to self-administer the drug, meaning the defendant neither administered the drug nor caused the drug to be administered and, therefore,
the defendant was not guilty under the thin skull rule, as it was not his actions that directly caused the death of the deceased, but instead the actions of the deceased himself. In the case of R v Dalby (1982)14, which shared many characteristics with R v Kennedy in that the defendant was found not guilty of manslaughter after supplying drugs to a victim who injected them himself (the direct cause of death), Waller LJ stated: “The difficulty in the present case is that the act of supplying a scheduled drug was not an act which caused direct harm. It was an act which made it possible, or even likely, that harm would occur subsequently, particularly if the drug was supplied to somebody who was on drugs. In all the reported cases, the physical act has been one which inevitably would subject the other person to the risk of some harm from the act itself. In this case, the supply of drugs would itself have caused no harm unless the deceased had subsequently used the drugs in a form and quantity which was dangerous. ... In the judgment of this Court, the unlawful act of supplying drugs was not an act directed against the person of O’Such and the supply did not cause any direct injury to him. The kind of
harm envisaged in all the reported cases of involuntary manslaughter was physical injury of some kind as an immediate and inevitable result of the unlawful act, e.g. a blow on the chin which knocks the victim against a wall causing a fractured skull and death, or threatening with a loaded gun which accidentally fires, or dropping a large stone on a train…. or threatening another with an open razor and stumbling with death resulting…”. Consequently, it is clear that, for the thin skull rule to apply, the offence must be the main cause of death, not just a small contributing act, so if the victim freely chooses to do something that results in their death then they are the responsible party. So, it could be said that although the defendant must take his victim as he finds him, he does not have to take responsibility for the decision making of the victim which results in their death, if it is done freely. This idea creates the problem that in cases such as R v Blaue, in which the victim chooses not to take the normal medical help due to a religious belief, the victim is freely choosing their action, therefore making the defendant not guilty of their subsequent death. However, this problem is solved by claiming that when a victim is acting due to a religious belief it is not truly a free
12 R v Arobieke, 1988
13 R v Kennedy, 2007
14 R v Dalby, 1982
choice. As stated in R v Kennedy, only ‘free, voluntary and informed’ acts can break the chain of causation15, so if the victim’s decision was involuntary it would not be a novus actus interveniens and the defendant would be held to have caused the death. Following a religious belief can be seen as involuntary, because to deny a victim the right to make this decision as a result of their religion would be to deny their human right to hold a religion. Previous cases, such as R v Benge16, have shown that the negligence of a victim is also not a relevant or valid defence in cases of causation, such as the thin skull rule, because it does not constitute a novus actus interveniens. However, following an injury that has been sustained, the plaintiff must take reasonable care to protect their thin skull from further injury; failure to do so may result in a finding of contributory negligence or possibly of volenti non fit injuria, or snap the causal link17, as in McKew v Holland18. Henceforth, the actions of a
15 R v Kennedy, 2007
16 R v Benge, 1865
17 Rowe, p. 380-381
18 McKew v Holland, 1969
19 Souper, Principles – Strict Liability
victim can constitute a defence in the cases where they are voluntary and informed, or in which the injury has not been sufficiently looked after, but in most cases the actions of the victim are not deemed to be a valid defence and therefore, the majority of the time, the defendant will still be found guilty.
Mens rea vs causation
Mens rea is a necessary element of many crimes; however, this prevalent requirement does not apply in cases relating to the thin skull rule. The standard common law test of criminal liability is expressed in the Latin phrase ‘actus reus non facit reum nisi mens sit rea’, meaning ‘the act is not culpable unless the mind is guilty’. Resulting from this is the idea that the defendant cannot be deemed liable for their actions if they lack the intent to cause them. The lack of intent naturally excludes the possibility of the defendant having the mens rea; however, mens rea is not required for a prosecution in cases of causation, meaning the defendant can be deemed guilty without intent and given a reduced punishment.
In jurisdictions with due process, for a defendant to be found guilty of a crime both actus reus (guilty act) and mens rea are required. Crimes in which a defendant can be liable without mental fault are known as strict liability crimes; the thin-skull rule is included within strict liability to act as a deterrent to committing offences and as an attempt to prevent harm to members of society. The liability is strict because ignorance of one or more of the factors which made their actions criminal will not prevent a conviction, even if the defendant is not culpable in any way. There need not even be criminal negligence (the least blameworthy level of mens rea) for a prosecution in cases featuring strict liability, such as cases of the thin skull rule. Strict liability laws were initially created in the 19th century in an attempt to increase prosecutions of factory owners at a time when working and safety standards in factories were being improved. Outside of cases featuring the thin skull rule, common offences in the modern day which are covered by strict liability are vehicular traffic offences. For instance, in a speeding case, it is irrelevant whether the defendant was aware that they were exceeding the speed limit; instead, it only matters that the defendant was operating the vehicle faster than the speed limit for a prosecution to occur.19 It is widely accepted that the thin-skull rule falls under the area of strict liability, which highlights the difference
between mens rea and causation in law, and therefore demonstrates why it is deemed fair within the law that a defendant can be found guilty without mens rea (as a result of a lack of intent to commit the crime that they did). However, the imposition of strict liability may operate very unfairly in individual cases. For example, in Pharmaceutical Society of Great Britain v Storkwain20 (1986), a pharmacist supplied drugs to a patient who presented a forged doctor’s prescription, but was convicted even though the House of Lords accepted that the pharmacist was blameless. The justification is that the misuse of drugs is a grave social evil and pharmacists should be encouraged to take even unreasonable care to verify prescriptions before supplying drugs. Similarly, where liability is imputed or attributed to another through vicarious liability or corporate liability, the effect of that imputation may be strict liability, albeit that, in some cases, the accused will have a mens rea imputed and so, in theory, will be as culpable as the actual wrongdoer.
Arguments supporting the thin-skull rule
Without intent
Whether it is fair that a defendant can be punished for an outcome they did not intend is a question that is often raised. However, this is the main area covered by the thin skull rule: the statement that ‘you must take your victim as you find them’ means that the defendant is liable for all of the
consequences of their actions, including the ones that they did not intend. Therefore, it is still both possible, and seen as fair, to be prosecuted without any intent for your crime; however, the charge will usually be lowered in order to reflect the lack of intent: the charges for manslaughter can carry far lesser sentences than those for murder. Likewise, the victim cannot be blamed for their own vulnerability, as that would be unfair, so the accused must accept any enhanced consequences that occur as a result of that vulnerability, despite their intentions not being that severe, in order to protect the interests of the victim. A prosecution, following a death, under the thin-skull rule where intent is not present can often carry the charge of constructive manslaughter (also referred to as unlawful act manslaughter), as happened in the case of R v Hayward21. In cases of constructive manslaughter an unlawful killing has taken place in which the defendant lacked the mens rea of murder. Constructive manslaughter is one of two forms of involuntary manslaughter, being used in cases where the defendant commits an unlawful act which results in the death of the victim. There are three necessary elements for constructive manslaughter: there must be an unlawful act; that act must be dangerous; and that dangerous act must result in a death. The other form of involuntary manslaughter is gross negligence manslaughter, where the defendant has committed a lawful act, with gross negligence, which has still caused a death.
Due to an obiter statement by Lord Denning in a civil case22, it was once thought necessary to prove that the defendant possessed the intention to frighten or harm a person, or alternatively that the defendant could foresee the potential for their actions to cause harm. However, in the case of DPP v Newbury (1977)23 it was determined that those statements had no relevance in criminal cases. Consequently, the prosecution is only required to establish that the defendant had the mens rea for the unlawful act that they committed, rather than for the death, for the charge of constructive manslaughter. From this it can be taken that it is not necessary for the defendant to have the mens rea in relation to the eventual death to be deemed guilty of it, but instead only the mens rea for the actions which caused it.
Rights of the victim
Under Article 9 of the European Convention on Human Rights:
“Everyone has the right to freedom of thought, conscience and religion; this right includes freedom to change his religion or belief, and freedom, either alone or in community with others and in public or private, to manifest his religion or belief in teaching, practice, worship and observance. Freedom to manifest one’s religion or beliefs shall be subject only to such limitations as are prescribed by law and are necessary in a democratic society in the interests of public safety, for the protection of public order, health or morals, or for the protection of the rights and freedoms of others.”24
20 Great Britain v Storkwain, 1986
21 R v Hayward, 1908
22 Gray v Barr, 1971
23 DPP v Newbury, 1977
24 Article 9: Freedom of Thought, Belief and Religion | Equality and Human Rights Commission
This article denotes that the religion of a victim is considered part of who they are; consequently religions and religious beliefs are especially protected under the law and under principles such as the thin-skull rule. This means that a victim’s religious beliefs cannot be used as a form of defence under law, as seen in cases such as R v Blaue (1975)25. Regarding the case of R v Blaue (1975), Klimchuk argues: ‘Freedom of religion is in a sense an equality right, for it not only permits persons the right to hold and manifest whatever religious beliefs they choose so far as such manifestations do not violate the rights of others, but (correlatively) it forbids us from discrimination on terms of
religious beliefs in matters of the public realm. Thus, to claim that Blaue may have been said to have killed Woodhead only if her beliefs (or some beliefs that lead persons who were assaulted to commit suicide) were commonly held is to deny her equal status, for to treat her on terms of equality in this context is precisely to disregard the question of whether her religious beliefs were popular ones.’26 As a result of this, the defendant is forced to accept the victim’s religion to the same degree as any physical disability that they could have had, leaving them fully responsible under the thin-skull rule for any harm caused to the victim even when the victim
25 R v Blaue, 1975
26 Klimchuk, Causation, Thin Skulls and Equality, p. 115
had chosen not to take available help on the grounds of it violating their religious beliefs, as happened in the case of R v Blaue, since they have the right to exercise their beliefs under the European Convention on Human Rights so long as they do not harm others.
Conclusion
Although the thin-skull rule can sometimes be considered extremely harsh on the defendant, especially in cases where punishments are handed out for unforeseeable outcomes because the law is implemented differently here than in most other areas of law, there is currently no fairer alternative that still prioritises
protecting the victim and their rights while also being fairer to the defendant than the thin-skull rule currently is. While the use of reasonable foreseeability can initially appear to be a fairer judge than the use of strict liability, it fails to protect the victim effectively and equally with others, particularly when the victim is one of the more vulnerable members of society. Additionally, if the thin-skull rule were adapted to use reasonable foreseeability rather than strict liability, it would lose the main feature that differentiates it from other areas of law and would, therefore, become redundant. This was the foundation of P. J. Rowe’s argument that the thin-skull rule should be disposed of: he believes that judges are reluctant to find a defendant guilty when they lack intent, and thus that judges already apply a principle of reasonable foreseeability rather than strict liability, meaning that the thin-skull rule is already redundant; previous cases, however, show that this is frequently untrue. Moreover, regarding P. J. Rowe’s view that the thin-skull rule should be removed, that decision would leave a gap in modern law which is not effectively covered by any other area of law, and it would leave the victim unfairly exposed. While the sentences handed down under the thin-skull rule are often harsh, they are also fair and understandable, and are necessary for the concept of causation in law to work. The issue of reduced punishment and liability for outcomes that lack intent is already addressed by the charge of constructive manslaughter being given in cases with a lack of intent, as this carries a far lesser sentence and recognises the reduced role of the
defendant in the outcome. Therefore, no further alterations or amendments are required in this area to make the use of the thin-skull rule any fairer. The largest existing problem surrounding the thin-skull rule is the degree to which the defendant can still be held guilty despite a large influencing factor arising from a decision taken by the victim, such as the conscious decision to reject medical treatment, following the original tort. While it is understandable that decisions influenced by religious beliefs are seen as involuntary and, therefore, not a novus actus interveniens, as in R v Blaue, it still seems unfair that in cases such as R v Holland, where the victim takes a conscious and voluntary decision not to be helped, the actions of the victim do not amount to a novus actus interveniens and the defendant is still held solely responsible for the final outcome. This could leave a defendant unfairly exposed when they have harmed a person of a spiteful nature, especially when they need not even have intended to hurt the victim in the first place. The judgement in cases such as R v Dalby and R v Kennedy seems to reflect more fairly the role of the defendant in cases where the actions of the victim have led to increased harm to themselves, so that the defendant is not held responsible for the full outcome but only for what their own actions caused. To resolve this, the thin-skull rule should be adapted so that if a plaintiff consciously and voluntarily chooses not to accept help, or instead makes their own tort worse than it originally was, then the defendant should only be deemed guilty of the most severe of the likely outcomes that
would have occurred had the victim accepted the help or not increased the harm. In this adaptation of the thin-skull rule, all of the additional harm following the original tort would instead be blamed on the victim themselves and, therefore, be seen as a novus actus interveniens, because the victim must take reasonable care to protect their thin skull, and in several cases, such as R v Holland, they have failed to do this. This remains fair to the victim because they would still be covered for all of the harm which was either not their fault or not greatly heightened by their own actions. Torts in cases involving decisions influenced by religion, such as R v Blaue, would remain solely the responsibility of the defendant because the decision is involuntary and only ‘free, voluntary and informed’ acts can break the chain of causation. Meanwhile, in a case such as R v Holland, where the victim did not have his finger amputated, the defendant would be responsible for the original harm, the amputation and any other likely medical or mental problems which might stem from that, but not the death itself, as the decision not to have the amputation should constitute a novus actus interveniens. This adaptation would therefore also have to override the ‘operative and substantial cause of death’ criterion which is currently used when attributing blame in cases of causation. In a case where a victim accepts help and takes reasonable care to protect their thin skull but still dies, for a slightly different but related reason (such as dying from a complication during surgery to repair the harm caused by the original tort), the defendant should still be deemed guilty as, in this case, the ‘operative and substantial cause’ criterion should still apply, as there is no novus actus
interveniens, due to the victim doing what was expected of them to care for their thin-skull. Overall, the thin-skull rule continues to serve a useful purpose in modern law
and its use is still justifiable; however, alterations could possibly be made whereby voluntary decisions by the victim which result in increased harm to themselves are deemed new
intervening acts and, thus, less blame is placed on the defendant for the final outcome since the victim contributed to it.
Bibliography
Primary
California v Lewis (Superior Court of Shasta County (California) 1899)
DPP v Newbury, AC 500 (1977)
Dulieu v White and Sons (High Court King’s Bench 1901)
Gray v Barr, 2 QB 554 (Court of Appeal 1971)
Great Britain v Storkwain, 2 All ER 635 (House of Lords 1986)
McKew v Holland, 3 All ER 1621 (CA 1969)
R v Arobieke (CA 1988)
R v Benge (4 F. & F. 504 1865)
R v Blaue (Court of Appeal 1975)
R v Corbett (Court of Appeal 1966)
R v Dalby, 74 Cr App R 348 (Court of Appeal 1982)
R v Dear (CA 1996)
R v Hayward (21 Cox 692 1908)
R v Holland (2 Mood. & R. 351 1841)
R v Kennedy, 3 WLR 612 (House of Lords 2007)
Smith v Leech Brain, 2 QB 405 (CA 1962)
Secondary
“Article 9: Freedom of Thought, Belief and Religion | Equality and Human Rights Commission”. Equalityhumanrights.com. N.p., 2017. Web. 3 Mar. 2017.
Best, Arthur, and David Barnes. Basic Tort Law: Cases, Statutes, and Problems. 1st ed. Aspen Publishers Online, 2007. Print.
Causation in (Criminal) Law. 1st ed. Law.uchicago.edu. 2017. Web. 2 Feb. 2017.
Duhaime, Lloyd. “Thin Skull Rule Definition”. Duhaime.org. N.p., 2017. Web. 16 Jan. 2017.
Herring, Jonathan. Great Debates in Criminal Law. 1st ed. Palgrave Macmillan, 2009. Print.
Herring, Jonathan. Criminal Law. 1st ed. [S.l.]: Palgrave Macmillan, 2017. Print.
“Involuntary Manslaughter - Constructive Manslaughter”. E-lawresources.co.uk. N.p., 2017. Web. 2 Mar. 2017.
Klimchuk, Dennis. Causation, Thin Skulls and Equality. 1st ed. 1998. Print.
Meliá, Manuel Cancio. “Victim Behavior and Offender Liability: A European Perspective”. Buffalo Criminal Law Review 7.2 (2004): 513-550. Web. 3 Mar. 2017.
“Mens Rea Lecture | Criminal Law Lecture Notes | Law Teacher”. Lawteacher.net. N.p., 2017. Web. 16 Jan. 2017.
Rowe, P. J. “The Demise of the Thin Skull Rule?”. The Modern Law Review 40.4 (1977): 377-388. Web.
Souper, M. “Principles - Strict Liability”. Sixthformlaw.info. N.p., 2008. Web. 21 Nov. 2016.
“What Is an Eggshell Skull? | What Is an Eggshell Plaintiff?”. Rotlaw.com. N.p., 2017. Web. 1 Feb. 2017.
THE PHYSICS OF A SWINGING BALL, AND HOW TO PREDICT ITS TRAJECTORY USING 3D VECTORS, FORCE VARIATIONS AND TURBULENT AIR FLOW Ashwin Bhat
Introduction
Batting development in cricket today relies heavily on statistical analysis to repair technical faults and create more efficient means to improve and refine one’s dexterous skills. In general, the role of a batsman - especially in Test cricket - is to score as many runs as possible on a consistent basis while not losing his wicket, both of which are more likely when a batsman regularly hits the ball with the middle of his bat. Knowledge of projectile motion in terms of vector force models has been studied extensively in the past century of physics, yet it has never been implemented in a consistent model for the trajectories of cricket balls (Latchman and Pooransingh, 2016). However, the sheer complexity of the sport in terms of the number of external conditions on which each ball’s trajectory depends - including location, air pressure, climate, seam orientation, back spin and speed - means cricket analysts prefer to treat these factors as random. Herein lies a large dilemma: knowledge of how these factors operate can give a more accurate estimation of how the ball travels, and this could potentially make a batsman more successful. This project aims to create a mathematically rigorous model for how a swinging cricket ball travels in three dimensions using measurable conditions, and to calculate whether its final position can be estimated within the uncertainty of the width and the height of a cricket bat. A positive result for this hypothesis could lead to large steps in understanding swing bowling and decision making by batsmen of the upcoming generation.
Hypothesis: Given enough factors, it is possible to estimate the final position of a swinging ball pitched at good length to within the uncertainty of the height and width of a cricket bat’s centre.
Representation as Vectors
In order to describe a model of the trajectory of the ball, it must be broken down into its three vector components: height or bounce (x), lateral motion (y) and forward motion (z). The z component will be represented as a function of time to describe the duration for the ball to reach the popping crease, while the (x, y) coordinate will describe the ball’s position in space with respect to time.
Bounce
The vertical motion of the ball results from the total downward force: the weight due to gravity as well as the initial downward force applied by the bowler’s arm. After contact with the pitch, it rises to a certain height that is normally lower than the initial launch height. The assumption made in this study is that the ball lands in the “good length” region, which is between 5 and 8 metres from the batsman. The critical angle is the angle at which the bowler releases the ball with respect to the horizontal plane; the size of this angle (between 0 and 90 degrees) is positively correlated with the proportion of the bowler’s force that goes into pushing the ball downwards. The vertical downward force on the ball can be derived from the formula:
FV = mg + Fsin(θ),
where m is the mass of the ball (0.16 kg), F is the force applied by the bowler on the ball, g is acceleration due to gravity (9.81 m s⁻²) and θ is the critical angle. The angle θ for a bowler bowling at speeds between 60 and 100 mph (26.82 to 44.70 m s⁻¹) is approximately 5 degrees1 if the ball is to land in the good length area. This can be ascertained from the following calculation of perpendicular
velocity ratios versus perpendicular distance ratios:
VH·DV = DH·VV,
where VH is the average horizontal velocity and VV is the average vertical velocity*.
VH = 0.94Vcos(θ)
VV = Vsin(θ) + 0.5gt
Assuming DV = 2.20 m, DH = 13 m and t = 0.325 s,
2.068Vcos(θ) = 13(Vsin(θ) + 0.1625g)
2.068Vcos(θ) - 13Vsin(θ) = 20.7025
Using the addition formulae for trigonometric functions, this can be phrased in the form Rsin(x - θ) = 20.7025. With the solved values for R and x replaced:
(4.276624V² + 169V²)^½ sin(9.0387° - θ) = 20.7025
∴ θ = 9.0387° - sin⁻¹[20.7025 / (V(4.276624 + 169)^½)]
If V = 40 m s⁻¹, θ = 6.785°; if V = 20 m s⁻¹, θ = 4.510°.
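As a quick numerical check of the release-angle relation just derived, the short Python sketch below evaluates θ for a given bowling speed. It is an illustration of my own rather than part of the original study; the function name and the printed speeds are arbitrary.

```python
import math

def release_angle_deg(v):
    """Release angle (degrees) for a good-length delivery, from
    theta = 9.0387 deg - arcsin(20.7025 / (V * sqrt(4.276624 + 169)))."""
    r = math.sqrt(4.276624 + 169)  # sqrt(2.068^2 + 13^2)
    return 9.0387 - math.degrees(math.asin(20.7025 / (v * r)))

for v in (35, 40):  # bowling speeds in m/s
    print(f"V = {v} m/s -> theta = {release_angle_deg(v):.3f} deg")
# At 40 m/s this gives ~6.79 degrees, matching the value quoted above.
```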
By the time the ball reaches the pitch, it would have lost approximately 12% of its horizontal velocity due to drag4. *It is accurate to use average values for these velocities because drag decreases with horizontal velocity, and over 13 metres the exponential relationship can be generalised as a proportional one; in addition, vertical velocity increases proportionally downwards as a result of acceleration due to gravity.
To model the bounce of a ball after contact with a plane, it must be individualised for each case, since different pitches offer different extents of upward motion after landing. In principle, the energy of a system must be conserved, so the way in which each pitch differs is based on how much energy is dissipated from the ball’s kinetic energy on contact with the surface. The ratio of kinetic energy after and before the collision is known as the coefficient of restitution, or CR:
CR = v²/u²,   (1)
where u and v are the ball’s speeds immediately before and after impact. Making again the assumption that energy is conserved, the points at which kinetic energy is 0 represent maximum gravitational potential energy; the CR can therefore be derived from the loss of energy in terms of the bounce height h and the drop height H:
CR = h/H.   (2)
CR therefore changes from pitch to pitch, depending on the energy losses caused by its material composition. The velocity off the pitch is directly correlated with the value of µ associated with that pitch. With drag causing the ball’s velocity to decrease by 12% regardless of the initial speed1, the value for u will be 0.88V0, where V0 is the bowling speed. The final speed, v, is dependent on friction, and is therefore given by v = u - Ft/m, where F = µR, t is the time that the ball is in contact with the pitch (≈ 0.001 s)1 and m is the mass of the ball (0.16 kg). Equating (1) and (2), eliminating CR, we get:
√(h/H) = (u - µRt/m)/u
Taking R = mg + mV0sin(5°)/thand,
h/H = (1 - µ(g + V0sin(5°)/thand)t/0.88V0)²
Therefore, h = H(1 - µ(g + V0sin(5°)/thand)t/0.88V0)².
It is important to note that h is the maximum height of bounce off the pitch, and is not the height of the ball as it passes the batsman. However, the assumption that the ball bounces on a good length means that h is reached as the ball passes the batsman, adjusted for uncertainty considering that the angle of impact with the pitch is not exactly vertically downwards.
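To see the bounce-height expression in use, here is a minimal Python sketch. It is my own illustrative code, not the author’s, and it plugs in values of the same order as those assumed later in the study (release height 2.0 m, µ = 1.45 for a dry wicket, bowling speed 38 m s⁻¹).

```python
import math

def bounce_height(H, mu, v0, t_pitch=0.001, t_hand=0.001, g=9.81):
    """h = H * (1 - mu*(g + v0*sin(5 deg)/t_hand)*t_pitch / (0.88*v0))**2."""
    slowing = mu * (g + v0 * math.sin(math.radians(5)) / t_hand) * t_pitch
    return H * (1 - slowing / (0.88 * v0)) ** 2

# Release height 2.0 m, dry pitch (mu ~ 1.45), delivery at 38 m/s
print(round(bounce_height(H=2.0, mu=1.45, v0=38), 3))
# ~1.465 m; the study quotes 1.457 m using g = 9.8 and a rounded slowing term
```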
Lateral Motion
The lateral motion of a cricket ball for a fast bowler is mainly contributed by swing, the phenomenon by which turbulent and laminar air flow on opposite sides of the ball interact to cause deviation mid-air. Due to the tilt of the seam, both sides of the ball unequally share the layer of air surrounding it, known as the “boundary layer”3. The seam’s roughness causes particles of air to flow around it at varying speeds, slowing that side down, while the other, smoother side of the ball allows air to flow in streamlines. This pressure difference causes the ball to follow a swerved trajectory. (Bandara and Rathnayanka, 2012.)
Diagram of the physics behind swing6
Mathematically, the “shear” that swing causes on the ball’s trajectory may be represented as a horizontal vector, and is often described as a function of the ball’s forward component of velocity. Prior experiments, often modelled on arbitrary spherical debris in wind tunnels, have determined the side force on cricket balls under a variety of conditions, and have clarified the primary mechanisms of the different types of swing behaviour that have been observed10. Unfortunately, most of these investigations are not valid in practice, in that the determined forces are not related to actual swing trajectories that are of the utmost importance to the batsman. In real world scenarios, there are deliveries that do not swing conventionally as a result of unaligned seam position or weather conditions, so this study involves a control of no lateral movement. Also to keep in mind is the deviation caused during contact with the pitch, known as seam movement, due to the structure of the ball. When the ball lands at an angle on the threaded seam, it changes the axis on which it moves by a subtle
amount, often leading to the downfall of batsmen because of the lack of time they have to react to the change in trajectory.
Bentley et al.2 used the assumption that the side force coefficient of the ball stayed the same throughout transit - that the speed of the ball didn’t vary - to make calculations of swing motion. CJ Baker’s paper (A Calculation of Cricket Ball Trajectories, 2010) refutes this as being invalid since the ball’s velocity would fall by around 14 per cent before landing due to drag, and thus, for constant force coefficients, the side force will fall by around double that. However, assumptions of constant force coefficients give reasonably accurate estimations for the extent of swing, although they do not ascertain the lateral coordinate at specific times. Mehta8 also states that the predicted trajectories at high bowling speeds are very similar.
Baker’s representation of the ball’s vector velocities with respect to air was a clever manipulation of its motion, and concluded with this summary:
y = 0.0077CSx²,
where y is the lateral coordinate of the ball with respect to the mean release position, CS is the side force coefficient, and x is the longitudinal coordinate. From wind tunnel experiments, Pooransingh measured that deliveries displayed distinct values of CS, irrespective of velocity, based on the angle of inclination of the seam from the vertical9. Logically, this would make sense, because a greater angle would cause a greater imbalance of air flow between the two points of separation on the boundary layer. Roughly speaking, we can derive a formula from this data, because it is a continuous variable over a small range:
CS = sin(3r/5), v ≠ 33.3,
where r is the seam angle.
The observable anomaly is at 75 mph (33.3 m s⁻¹), where the side force coefficient inexplicably increases. This is the only case where we will consider that CS doesn’t follow the sinusoidal relation with seam angle. We may therefore conclude: y = 0.0077sin(0.6r)x².
Forward Motion
Based on the hypothesis, the constraint for the forward motion of the ball is that the ball travels from the bowler’s hand to the popping crease of the batting side (17.68 m). As a result, this study will provide a representation of the ball’s final position on the z-axis with respect to time. Theoretically, previous studies have used vast generalisations of the relationship between the initial and final speeds of the ball; in practice, drag in the air and friction on contact with the surface contribute drastic opposing forces to the forward motion of the ball. Studies at USYD1 have shown that a ball bowled at 123 km/h (34.2 m s⁻¹) experiences a drag force equal to its weight. This is derived from the formula:
D = 0.5ρCDAv²
It is noticeable that the drag increases with the square of the ball’s forward component of velocity. The silhouette area, A, over which drag acts for a cricket ball is equal to the area of a circle with radius 3.6 cm (0.036²π m²). The density of air ρ varies with temperature and pressure by a significant amount, so a particular value may be assumed to two significant figures.
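Pulling the lateral-deviation and drag formulas together, the following Python sketch is a rough illustration of my own (not taken from the study). It assumes an air density of about 1.2 kg m⁻³ and CD ≈ 0.5, the value discussed below.

```python
import math

BALL_RADIUS = 0.036             # m
A = math.pi * BALL_RADIUS ** 2  # silhouette area of the ball, m^2
RHO = 1.2                       # assumed air density, kg/m^3

def lateral_deviation(seam_angle_deg, x):
    """Swing deviation y = 0.0077 * sin(0.6 * r) * x^2 (r in degrees, x in metres)."""
    return 0.0077 * math.sin(math.radians(0.6 * seam_angle_deg)) * x ** 2

def drag_force(v, cd=0.5):
    """Drag D = 0.5 * rho * CD * A * v^2, in newtons."""
    return 0.5 * RHO * cd * A * v ** 2

print(round(lateral_deviation(30, 18), 3))  # ~0.77 m of swing over 18 m at a 30-degree seam
print(round(drag_force(34.2), 2))           # ~1.43 N, comparable to the ball's weight (~1.57 N)
```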
Table 2: Side force and side force coefficients for the varying seam angles and ball velocity modelled in the k-epsilon interface.
As mentioned in prior points, the drag force coefficient (CD) changes with velocity, but it may be generalised, as 25
it is less than 1. The value of CD differs based on the Reynolds number (Re), a dimensionless quantity describing the transition from laminar to turbulent flow. Having made calculations of the Reynolds number with respect to drag and side force coefficients, Baker assumed CD was 0.5, which intuitively seems reasonable and logical. A standard spherical object at Re = 10⁴ theoretically takes the value of 0.47 for CD, but because a cricket ball has a stitched seam, it experiences more drag, as it has a wider silhouette area and a rougher surface. More significant than the drag acting on the ball is the friction between the ball and the pitch, which slows the ball down by approximately 30 to 40 per cent of its initial z-component speed. Each pitch is described subjectively, by how much speed the ball loses on contact with it, as a “quick” or a “slow” wicket. The friction of any surface acting on an object is given by F = µR, where µ is the coefficient of friction of the surface and R is the perpendicular reaction force exerted by the pitch on the ball. Initially, the ball is delivered with a downward velocity component of Vsin(θ), which is very small when θ is between 4 and 7 degrees. As a result, this may be ignored by the point of contact with the pitch due to nullification by air resistance. Therefore, the reaction force is the component of the weight of the ball acting downwards at impact. Modelling the projectile using the formula5,
where d is the horizontal distance and y0 is the vertical distance, we get the result for the angle immediately before impact.
Replacing values of θ and v (g = 9.8 m s⁻²), the impact angle is very small - between 0 and 3 degrees. As a result, the value for R can be assumed to be exactly equal to the ball’s weight (0.16g), and the frictional force acting on the ball is 0.16gµ. In modelling the ball’s forward velocity against distance, there is a levelling curve which initially decreases very quickly, then settles as the drag force decreases with the ball’s velocity. The gradient of the curve of drag versus velocity can be calculated as:
dD/dv = ρCDAv
Baker applied this concept to his study to get:
u = (1 - CDρAx/2m)V0
v = (CSρAx/2m)V0,
where x is the longitudinal coordinate, V0 is the bowling speed, CS and CD are the respective coefficients, and u and v are the longitudinal and lateral ball velocities respectively. In this study, the ball’s forward velocity is modelled as an exponential decay curve against time, with a sharp instantaneous drop on impact. The drop is dependent on the value of µ for the pitch:
V1 = V(after impact) = 0.88V0 - µ(g + V0sin(5°)/thand)t,   [1]
where thand is the time through which the ball accelerates in the bowler’s hand, t = 0.001 s and g = 9.8 m s⁻²
So V = V0e^(-kt). Previously ascertained experimental data1 says that the ball’s velocity drops by 12%, and the time it takes to reach the pitch is, to three significant figures, 0.325 s:
0.88V0 = V0e^(-0.325k)
Cancelling V0 on both sides, we get k = -ln(0.88)/0.325 = 0.393. Integration is very useful for deriving the time taken for the ball to reach the batsman, because the area under each part of the velocity-time curve is simply the distance the ball travels (13 m and 5 m respectively). Using this, we can make the time (t2) a function of V0, µ and t. Here, t1 represents the time taken to reach the pitch, and t2 is the total time to reach the batsman. Between the bounds t = 0 and t = t1,
∫V dt = -V0e^(-kt1)/k + V0/k
The area under the curve from 0 to t1 is equivalent to the distance the ball travels before pitching, which for this study is 13 metres.
13 = -V0e^(-kt1)/k + V0/k
Making t1 the subject, we get:
t1 = ln[(1 - 13k/V0)^(-1)]/k   [2]
Using integration again, one can phrase the area bounded by t1 and t2 as:
∫V1e^(-kt) dt = 5
Let x = e^(-kt2) - e^(-kt1) and let n = µ(g + V0sin(5°)/thand)t, the frictional slowing term from [1]:
x = 5k/(n - 0.88V0)   [3]
∫V1e^(-kt) dt = ∫V0e^(-kt) dt - n(t2 - t1)
(V1 - V0)x = kn(t2 - t1)
Replacing [1] and simplifying:
kn(t2 - t1) = (-x)(n + 0.12V0)
Substituting x from [3], and substituting t1 from [2], we get:
t2 = ln[(1 - 13k/V0)^(-1)]/k + 5(n + 0.12V0)/(n(0.88V0 - n))
APPROXIMATE SOLUTIONS
As a three-dimensional matrix A, the problem can be expressed as:
A = (h   y   t2)
The first column represents the height of the ball off the ground, the second the lateral coordinate of the ball with respect to the mean point of release, and the last the time taken for the ball to reach the batsman from release. For the sake of this study, we will assume the bowler to be James Anderson bowling outswing to a right-hander at Trent Bridge. His average initial speed may be assumed to be 38 m s⁻¹ and his standard angle of seam inclination 30°. His arm height, H, would be the sum of his height and the length of his arm, which would be approximately 2.0 m. Since the distance of the ball’s motion is considered from crease to crease, x would take the value of 18 m. Finally, n is the factor of slowing of the ball by the pitch, given by the expression µ(g + V0sin(5°)/thand)t. Acceleration due to gravity is 9.8 m s⁻², and thand and t, the time through which the ball is accelerated from the hand and the time through which the ball decelerates on the pitch respectively, can both be assumed to be of the order of 10⁻³ s. The coefficients of friction of the pitch and the ball can be summed to give µ; a valid estimate for a dry wicket and the seam of a cricket ball is around 1.45. From these values, we get a value of n of 4.90. The constant k has previously been calculated to be 0.393 in the exponential decay curve. Replacing all values, we get:
A = (1.457   0.771   0.706)
Taking footage from the first two wickets of the Third Test between New Zealand and England at Trent Bridge (2008)11, and running it through Tracker with a calibrated scale, we get estimated measurements of:
A1 = (1.23   0.73   0.63)
A2 = (1.42   0.71   0.58)
The maximum percentage uncertainties noticed between each of the columns are:
(15.6%   8.6%   17.8%)
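For completeness, the model assembled above can be collapsed into a few lines of code. The Python sketch below is my own illustrative implementation of the expressions for h, y, t1 and t2, evaluated with the assumed Anderson/Trent Bridge parameters; it is not the author’s original method, and the function name is arbitrary.

```python
import math

def predict(v0, seam_deg, H=2.0, mu=1.45, x=18.0,
            t_hand=1e-3, t_pitch=1e-3, g=9.8, k=0.393):
    """Return (bounce height h, lateral deviation y, total time t2) for one delivery."""
    n = mu * (g + v0 * math.sin(math.radians(5)) / t_hand) * t_pitch  # slowing on the pitch
    h = H * (1 - n / (0.88 * v0)) ** 2                                # bounce height
    y = 0.0077 * math.sin(math.radians(0.6 * seam_deg)) * x ** 2      # swing deviation
    t1 = math.log(1 / (1 - 13 * k / v0)) / k                          # time to reach the pitch
    t2 = t1 + 5 * (n + 0.12 * v0) / (n * (0.88 * v0 - n))             # total time to the batsman
    return h, y, t2

h, y, t2 = predict(v0=38, seam_deg=30)
print(f"h = {h:.3f} m, y = {y:.3f} m, t2 = {t2:.3f} s")
# roughly (1.47, 0.77, 0.71); the essay's quoted values are (1.457, 0.771, 0.706)
```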
Across such a small data set, it is difficult to ascertain exactly how precise the model is, but ultimately there is a wide range across which each of these parameters varies from ball to ball. Although a maximum percentage difference of under 16% in the physical criteria is achieved, it does lie within the uncertainties of the dimensions of a cricket bat, as mentioned in the hypothesis. The width of a cricket bat is 4.25 inches (0.108 m), and the height of the sweet spot is between 6 and 12 inches (0.15 and 0.30 m), within which the x and y values lie. It must be said that despite this model working for two deliveries, unevenness in bounce and swing means that the range of x and y values will necessarily be larger than the aforementioned parameters, but for reasonably consistent pitches (µ - µavg < 0.1), this model is rigorous to one decimal place in all criteria.
Conclusion
This study satisfied the initial hypothesis to a reasonable degree of accuracy, giving a good estimate reliant on initial speed, coefficient of friction of the pitch, angle of seam release and height of release. This is the first comprehensive study that has allowed a measurable set of parameters to predict how a swinging cricket ball moves. With some more refinement and newer variables being added, it will be possible to predict its position to greater degrees of accuracy. It also may be more accurate to model the time component as a curve proportional to the inverse square of the initial velocity, although finding a constant of proportionality for such a graph may only be attainable from observed data. Ethically, one’s initial concern with this study was that a positive result for this hypothesis would spoil the dynamics of the sport, aiding betting and match fixing, but each delivery depends on a huge number of factors that are not instantaneously measurable. For example, the pitch will have varying values for the coefficient of friction in different regions, and the ball may not be delivered with a perfectly aligned seam position. Each ball is delivered with slightly different speeds depending on the momentum of the bowler’s arm, and changes in body position on release can change the way the ball moves. The sheer number of uncertainties in all these factors is too large to calculate precisely, but the matrix representation of resultant position could be useful to further studies, hopefully being utilised to understand batting technicalities, in turn improving foot movement, bat swing and shot selection in future years.
REFERENCES
1 Baker C. J., ‘A calculation of cricket ball trajectories’. Proc. IMechE Vol. 224 Part C: J. Mechanical Engineering Science, 2010, 1948-1949.
2 Bentley K., Varty P., Proudlove M., Mehta R. D., ‘An experimental study of cricket ball swing’. Imperial College Aero Technical Note, 1982, 82–106.
3 Cross R., http://physics.usyd.edu.au/~cross/cricket.html, 08/11/16
4 Cross R., physics.usyd.edu.au/~cross/publications/BallBounce.pdf, 08/11/16
5 Cross R., https://en.wikipedia.org/wiki/Projectile_motion, 08/05/17
6 Gupta, A., https://www.scienceabc.com/pure-sciences/the-science-of-swing-how-do-bowlers-swing-the-ball-in-mid-air.html, 07/12/16
7 Mehta R. D., ‘A review of cricket ball swing’. Sports Eng., 8, 2005, 181–192.
8 Mehta R. D., ‘Aerodynamics of sports balls’. Annu. Rev. Fluid Mech., 17, 1985, 151–189.
9 Pooransingh A., Latchman R., https://www.comsol.fr/paper/download/258421/pooransingh_paper.pdf, 07/05/17
10 Shelquist, R., Equations - Air Density and Density Altitude (2009)
11 Wood O., https://www.youtube.com/watch?v=B75tNVWh_sg, 08/05/17
IS BRIDE PRICE IN THE AFRICAN CULTURE A MORALLY ACCEPTABLE TRADITION? Joanna Welsh
Introduction
The widespread custom of bride price in African culture has come into question recently. In essence, bride price, known as “lobola” in the southern parts of the continent, “mahari” in east Africa, or “wine-carrying” among tribes in west Africa, involves a man and his relatives transferring goods to the relatives of his bride at the time of marriage. With the rise of women’s rights in the past century, global attitudes have been reformed; and so a problem arises, from a western perspective, because of bride price’s apparent undermining of the freedoms and rights of women. With the resurgence of liberalism due to economic woes in the early 21st century, evident in the election of Barack Obama in the 2008 presidential election, ideas about equality and mutual respect between genders have significantly developed, partly due to the exponential growth in social media. The fundamental liberal principles of ‘liberty, equality, and community’ (a slogan used by Liberal International), as well as the globalisation of business and the availability of information, have all contributed to a rise of modern women’s rights movements. In this paper I will be outlining the benefits and disadvantages of bride price, both looking at evidence and also considering the issue from differing philosophical perspectives. With these antithetical philosophies about bride price (the West perceiving and witnessing its negative effects, while many practise the centuries-old custom willingly), it is necessary to find a satisfactory answer to whether or not bride price is a morally acceptable practice, and by extension whether we should discourage a culture from following such a tradition. I would affirm that although there is difficulty in being objective when it comes to morality, the ability to judge a culture’s ethical practices is necessary for the moral evolution of the global society as a whole. This practice of some African cultures seems alien to western eyes for a reason: it is our duty to protect vulnerable people. Does this therefore extend to attempting to eradicate bride price?
Evidence and philosophical arguments in favour of bride price
Even though bride price is simply a gift from the family of the groom to the bride’s family, it can be seen as an integral part of the traditional marriage, used for ceremonial purposes during and after the original marriage ceremony. Bride price is sometimes revered as a symbol of sincerity and good faith that brings together the bride’s and groom’s families. Although it seems more akin to a business deal or a job interview to a person outside this culture, in sub-Saharan Africa the negotiations over bride price between the two families that will soon be joined are considered traditional customs, integral to African marriage and society. By many, these negotiations are seen as a crucial component of the practice; important familial bonds and mutual respect are forged between the sides, and the price itself is an emblematic “token” of this new-founded relationship, giving it true value and sobriety (The World
Heritage Encyclopaedia). In this way, the following perspective has authority: “In some communities, bride price legitimises a marriage. Marriage cannot be valid without it. So it is not a burden or insult for men and women respectively, it is a proud cultural practice” (Daga Said, Seattle USA/Ethiopia).1 Bride price can also be seen to have further positive effects: for instance, it can increase women’s education and therefore the development of Africa, reducing gender inequality in education. In a study evaluating the effects of an INPRES school construction program,2 ethnic groups that traditionally engage in bride price payments at marriage increased female enrolment in response to the program. Within these groups, higher female education at marriage is associated with a higher bride price payment received, providing a greater incentive for parents to invest in girls’ education and take advantage of the increased supply of schools. An important consequence of the bride price is that it provides an additional monetary
incentive for parents to invest in their daughter’s education. Plus, if daughters cannot pay back their parents for educational investments made, the bride price compensates for this. The bride price provides a shorter-term and more certain monetary benefit to educating daughters. In reality, the only true way to see whether bride price is a beneficial practice is by listening to the opinions of individuals who have experienced both bride price and no bride price. An interviewee for BBC News, Kennedy Chama, Chililabombwe, Zambia, thought: “Bride price symbolises a woman’s value. Biblically, Adam had to be put to sleep and a part of him used to create Eve. He somehow paid for it. The payment also helps to preserve marriages. In Europe, where dowry payment is not common many marriages are falling apart.”1
To further this point, a second interviewee, Martin Curtis, Phoenix, USA, stated: “It’s misleading to argue that bride price is a ‘license’ for men to abuse their wives or to treat them as purchased goods. I know a great number of men who never paid bride price but still treat their wives like pieces of garbage. Are we saying there are no wife beaters in places where bride price is not common, say the West?”1
It could be seen as a fallacy to assume that bride price causes domestic abuse; after all, the World Health Organisation estimates that 35% of women worldwide have experienced such violence. There could be some correlation between bride price and abuse, but it may be unfair to say that it directly causes it. These views evidently suggest that bride price is not the only contributing factor to such consequences and that putting all the blame for inequality on it is unhelpful.
The USA has a 53% divorce rate, whilst in Spain, Portugal, Luxembourg, the Czech Republic and Hungary divorce rates are higher than 60%.3 However, in countries such as Benin, Ghana and Niger there has in fact been a 10% reduction in divorce rates, with the highest rate of divorce being only 47.1%, in Congo - significantly lower than the West.4 Although this may not be directly related to bride price, it could be seen as hypocritical for the West to dictate marital practices when the evidence is clear that they know no more than Africa about how to sustain a marriage.
This map illustrates the distribution of bride price (and dowry) practices globally. This is from the Woman Stats Project concerning the current practice of exchanging goods at the time of marriage. It is clear to see the contrast between western marriage ideals and those in Africa and other more Eastern countries.
This leads the questioning individual to the conclusion that it may not be morally acceptable to make sweeping judgements of another culture’s customs, as you have not been brought up as they have, but entirely separately. Thus, the arising thought is that moral truths are not universal and immutable but subjective and personal. In this way, one might take global relativism as the most suitable perspective. Global relativism is the claim that ‘all claims - not just claims about right and wrong - are only true or false relative to the standards of a person or group’ (Jeff Speaks)5. Moral dilemmas can show different truths for different people. It is not possible to find moral absolutes from experiencing the world; a moral absolutist commits a fallacy by ‘deducing’ such principles from a seemingly amoral world.
1 ‘Bride price: an insult to women, a burden to men?’, BBC News (Aug 2004)
2 N Ashraf, N Bau, N Nunn, A Voena, ‘Bride Price and Female Education’, Scholars at Harvard (July 5, 2016), pp. 1-2
3 P Engel, ‘Divorce Rates Around the World’, Business Insider, May 2014
4 S Clark, S Brauner-Otto, ‘Divorce in sub-Saharan Africa’, Population and Development Review, Dec 2015, Volume 41, pp. 583-605
5 J Speaks, Professor of Philosophy, Moral absolutism, moral nihilism, moral relativism, University of Notre Dame, https://www3.nd.edu/~jspeaks/courses/2014-15/10100/lectures/17-value.pdf (2009)
Cultural relativism illustrates that individual groups can and do have
varying ethical principles, none of which hold greater truth or value over the others. Although none is superior to the others, any individual will most certainly choose their own ethical principles as the preferable basis. Herodotus, the Ancient Greek historian, affirmed that “If anyone, no matter who, were given the opportunity of choosing from amongst all the nations in the world the set of beliefs which he thought best, he would inevitably - after careful considerations of their relative merits - choose that of his own country. Everyone without exception believes his own native customs, and the religion he was brought up in, to be the best.” Therefore, bride price, as seen from a British viewpoint, may well be immoral; however, this does not automatically entail that we should abolish it wherever we come across it. An equally valid opinion from within Africa could be established indicating that bride price is ethically acceptable. And so, in the case of bride price, saying it is categorically immoral is like denying that anyone should be allowed to have a different moral perspective, which is an injustice. For some, especially in African culture where it is considered a structural foundation of marriage, it is unfair to label people who follow bride price negatively. A development of Herodotus’ principle was attempted in historical particularism (a term coined by Marvin Harris in 1968). It argues that each society is a collective representation of its unique historical past. With a rejection of parallel evolutionism (the idea that all societies are on the same path and have reached their specific level of
development the same way all other societies have), historical particularism showed that societies could reach the same level of cultural development through different paths. This is similar to the distinction some religious people draw between exclusivism and pluralism - only one path or many paths to God - except that here what is sought is moral perfection. The world is not so vastly different, but cultures are shaped by diffusion, trade, their surrounding environment and historical accident, which can make them diverge from or resemble others. Franz Boas described cultural customs with three key features: environmental conditions, psychological factors, and historical connections. In Africa, bride price is carried out in a less-developed environment, where economic stability is rare (as can be seen in the current drought and food shortage crisis in East Africa). Such surroundings foster the need for strong investment between marrying families, such as the bride price, so that neither can be abandoned without financial security. This is less necessary in the West, where governmental benefits give security to single-parent households, or indeed to anyone unemployed. The lack of necessity for a system to give financial stability in the West means it can survive without bride price, but such a luxury is not so available to the LEDCs (Less Economically Developed Countries) in Africa. The psychology of African nations compared to Europe or North America is startlingly different too, evident from their large families who continue in the profession of their parents. Equally, the historical evidence indicates that the joining of two people is really the joining
of families, which links back to the early times of tribal rivalry and the dissolution of such hostility in times of marriage and children. It can be seen that stripping Africa of its tradition of bride price would be equivalent to wiping out its history, leaving its people stranded in a volatile economic environment and at a loss of a cultural mind-set. To a relativist, this outlook of Western superiority is in fact the opposite of moral, and therefore, in their opinion, should not be acted upon.
Evidence and philosophical arguments opposing bride price
Bride price has been described as “the license of owning a family in the African institution of marriage.”6 In some African cultures, the prescribed value of a bride is directly connected with her reputation and esteem in the community, an aspect that has been criticised as demeaning to women. This is evident in the admittedly sarcastic, but equally telling, new ‘bride price app’ by a Lagos-based creator. By asking a series of questions about looks, cooking skill, education and nationality, it will calculate a woman’s literal monetary value in Nigerian Naira, thus illustrating effectively how even the modern movements in equality haven’t shaken the enduring patriarchy and uninterrupted patrilineal lineage of some African countries. Although this is a satirical representation, the basis for it is fundamentally truthful: that women are commodities to buy and sell, between their relatives and a prospective husband. Further still, to the Fang people, the majority ethnic group in Equatorial
6 H Waweru, The Bible and African Culture: Mapping Transactional Inroads, 2008, p. 170
Guinea, the price is considered the literal purchase price of a wife, and the husband exercises economic control over her. The Fang people practice the bride price custom in a way that subjugates women who find themselves in an unhappy marriage. Divorce has a social stigma among the Fang, and in the event that a woman intends to leave her husband, she is expected to return the goods initially paid to her family. If she is unable to pay the debt, she can be imprisoned. Although women and men in theory have equal inheritance rights, in practice men are normally the ones to inherit property. This economic disadvantage due to the tradition of bride price reinforces women’s lack of freedom and lower social status. Similarly, the rights of each spouse are drastically different to western law. In Western culture it is hoped
that rights are equal, and the rights over children are not pre-set at a marriage’s initiation. However, in some African cultures, there is an obligation of the bride to provide children for her husband, and he holds the rights over these children. In illustration, the Swazi people, a traditional kingdom of Southern Africa, hold the view that a marriage is a union between two families, as well as between the bride and groom; the payment of bride price to a woman’s relatives inaugurates the husband’s rights over his wife. A woman’s main duty to her husband is to provide him with children. Should she be unable to fulfil such a commitment, her relatives must either give back the bride price, which could now be gone, or provide a second wife to the husband for free.7 Such an injustice seems deplorable to Western
Pastor Abasiubong Tom took this picture with the bride price items of a woman from Mbaise, Imo state, on Sunday, August 7, 2016. Bride price items like pigs, ram, goat, fish, beer and rice can be seen in the photo and are goods commonly given. It is immediately obvious that bride price is not just a minimal expense, in the majority of cases, but a major sacrifice on the groom’s side, thus leaving his family in a weak economic position.
7 J Peoples, G Bailey, Humanity – An Introduction to Cultural Anthropology, 1988, p. 188
eyes and so suggests the immorality of bride price. Further, less obvious consequences of bride price are significant in seeing the issue as it currently stands. Firstly, marital rape is a severe problem in African culture, and for many, bride price is a major cause of this. ‘Marital rape is any unwanted sexual act by a spouse or ex-spouse, committed without consent and/or against a person’s will, obtained by force, or threat of force, intimidation, or when a person is unable to consent’ (United Nations Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW)). South Africa is the rape capital of the world, where “a woman is raped every 17 seconds” (Solidarity Helping Hand Women in Action, 2010). Numbers and figures do not illuminate the true epidemic at hand. It is unquestionable that the country has laws that clearly make rape an offence, but they are not reducing the prevalence of the issue. The situation is even more dire regarding marital rape, which is often ignored or unrecognised, and is worsened by the practice of bride price. The Open Society Initiative for Southern Africa (OSISA) affirms that the law may not be enough to deal with what appears to be an attitudinal problem more than anything; the attitude, not just in South Africa but across southern Africa, and perhaps the whole continent, is that it is not rape as long as the perpetrator is one’s husband or known partner. The myth that rape is only committed by strangers is dangerously widespread. Yet marital rape is included in the definition of gender-based violence
given by the United Nations CEDAW, which states that ‘violence against women shall be understood to encompass…marital rape’. J Alupo8 cited occurrences where bride price has a credible, direct link to domestic violence within relationships in Uganda. To him, too many women in Uganda have suffered and had their rights violated because of bride price payment. Women are forced to live in hostile environments, conducive to servitude and slave-like conditions, hence leading to violence against women. In theory, bride price could be interpreted as unambiguous acknowledgement and valuing of women’s productivity and input to marriage; in reality, it often achieves nothing more than the unlawful controlling of women’s bodies. High bride price has long been linked to domestic violence, owing to women’s fear of returning to their birth home without being able to repay the bride price. This problem extends more generally than just violence; bride price contributes to the idea that women are inferior to men and can be sexually exploited with little to no negative consequences for the man and major health consequences for the woman. In Uganda, women face a wide range of challenges including discrimination, low social status, lack of economic self-sufficiency, and greater risk of HIV/AIDS infection. In some districts, HIV prevalence among 13–19 year old girls is at least 10 times higher than in males of the same age, as older men choose younger partners.9
8 J Alupo, ‘Bride Price and Gender Violence’, A Paper Presented to the Participants at the International Conference on Bride Price and Development, Makerere University Kampala, Uganda (2004)
9 Foundation for Sustainable Development International, Gender Equity Issues in Uganda, 2008
Herds of cattle are often given as bride price. In some countries cash transfers are illegal and, therefore, trading in goods is a more secure way of fulfilling the custom. This can be a problem because, if there is a dissolution of the marriage, goods such as cattle will most likely have been sold off or killed. If the bride price must then be handed back, this can leave women and their families in a state of severe debt.
As in many African countries, the patriarchy and inequality of genders mean that women must be subordinate to men; this reduces their power to act independently, become educated, avoid poverty, or escape reliance upon abusive men. Young women are easily coerced into sex due to the attitudes prevalent in the culture; many need to in order to survive economically. Ten single-sex focus group discussions and 14 in-depth interviews were conducted in Wakiso and Nangabo from July 2003 through March 2004 by the US NCBI. Their conclusion was that bride price was felt by women as being ‘bought’ into the man’s household, reducing their rights and decision-making power. ‘It limited women’s independence and perpetuated unequal gender power relations, especially regarding health-seeking behaviour’, and bride price was ‘associated with domestic violence, with serious sexual and reproductive health implications.’ According to one respondent, a traditional midwife, bride price was a ‘commoditization of sex.’ To acquire wealth from bride price, in order to have economic independence, girls are forced to marry at a young age, and many are forced to leave school for marriage. Yet this only reduces their freedom even more. A local council chairman noted that “women lack power to make decisions in the home. The culture does not allow them to stand up to the men. The laws are also (supporting) men, especially over
sexual matters. If (women) want to leave, they pay back the bride price first.” As women have little to no economic power, they are at a disadvantage in negotiating sexual relations. This is illustrated by the interview with a midwife: “Some women did not want to have sex with the spouses, yet the men demanded for it. But since many may not want (sex) yet their men want, they have nothing to do but accept, or else the men will force them…. (men) may not accept any reasons after all, they think it is their right at any time”. Bride price is a threat to the safety and rights of many African women: instead of having the freedom to be educated and become successful working citizens, they are stripped of this opportunity and many instead become trapped in unhealthy relationships.10 The bride price tradition can even have destructive effects for men. When young men cannot afford the extent of the bride price, or do not have the means to get what is being requested, they cannot marry. In strife-torn South Sudan, for instance, many young men steal cattle for this reason, often risking their lives. In Uganda, the effects of high bride price on men, discovered in interviews conducted with male respondents in the Members of the Public interview data-set, included poverty, stress and marital instability. The couples are forced to enter into marriages with the bride price paid,
incurring severe debt. This heaps huge pressure on young, inexperienced men who have no resources, and young people starting marriage and adult life in financial trouble has serious implications for the stability of marriages. The threat of a high bride price can also lead to co-habiting men and women being separated due to non-payment: parents come and take their daughters back once it is clear the bride price will not be paid, breaking up the new family.11 The relativist’s view appears, to me, to ignore the facts of what bride price is and what it leads to. Allowing this sort of ethical anarchy provides no basis for us to judge morality. Such a perspective can lead to simple antinomianism, the rejection of any sort of moral law. Peter Singer asserted that ‘at the descriptive level, certainly, you would expect different cultures to develop different sorts of ethics and obviously they have; that doesn’t mean that you can’t think of overarching ethical principles you would want people to follow in all kinds of places.’ Surely two opposite opinions on whether bride price traditions should be followed cannot both be equally true; that is irrational. There must, therefore, be a single objective truth to be found. In the case of bride price, such a truth would be that it should be abolished, as is evident from the negative consequences it causes. Avenues such as moral absolutism, the concept that there are certain absolute standards, regardless of context, that
actions can be judged against, can offer an answer to the dilemma. Such absolutes, like the articles of the Universal Declaration of Human Rights12, justify attempts to ban, or at least regulate, bride price, and to safeguard against marital rape. Alasdair MacIntyre (b. 1929, Scottish philosopher) also affirms that cultural relativism is lacking and that tackling the issue from a perspective of moral absolutism is the only possible solution.13 Cultural relativism, MacIntyre suggests, implies that cultures disagree greatly; however, the level of ethical opposition between cultures has been systematically exaggerated, and the level of ethical agreement between them systematically underestimated. For instance, Western marriage ceremonies tend to have their own versions of bride price: the prenuptial agreement, the gifts given to the couple at the reception, and the meeting of the in-laws are all reminiscent of it. By separating one’s own culture from that of another, one is in fact worsening the issue. Surely we should be attempting to become more inclusive and to reduce the ‘us and them’ attitude? What seems especially poignant is that we have all evolved from the same creatures and therefore, ultimately, we should not divide ourselves over our supposedly differing moral values. What are all cultures attempting to reach by improving and developing, other than an absolute perfection of ethics? This leads to the conclusion that absolutism should be used in all cases as it means that we create a global united society,
10 D K Kaye, F Mirembe, A M Ekstrom, G B Kyomuhendo, A Johansson, ‘Implications of bride price on domestic violence and reproductive health in Wakiso District, Uganda’, National Center for Biotechnology Information, U.S. National Library of Medicine, Dec 2005
11 N Sambe, M Y Avanger, S A Agba, ‘The Effects of High Bride-Price on Marital Stability’, IOSR Journal of Humanities and Social Science, Nov–Dec 2013, p. 69
12 General Assembly, Universal Declaration of Human Rights, December 1948
13 J Speaks (2009)
which evolves together, rather than leaving less ethically evolved cultures floundering in immorality. With the custom of bride price, it is not right to leave them, especially in their fragile position as developing nations, to battle the unfavourable results. A second issue with moral relativism, MacIntyre asserts, concerns the relativist’s claim that a culture’s morals apply only to that culture; yet, when we look in detail, MacIntyre says, one thing we do not find is the view that these ethical values are binding only on members of the relevant culture. Instead, the moral views of cultures are easily understood as how that group believes it is best for human beings in general to live, not just members of that culture. This, MacIntyre says, ‘reveals the incompatibility of any coherent relativism with the beliefs and practices of all the major moral cultures and in so doing it suggests that relativism, although it does not entail moral scepticism, does entail a rejection of the standpoints of all those cultures.’ MacIntyre even suggests that the only truth relativism reveals is that no moral principle from any culture is absolutely true: ‘for the only judgment taken by moral relativism to be unqualifiedly true is the judgment that no judgment advanced from the standpoints of those cultures is unqualifiedly true.’ If we used moral relativism as a standpoint, then in every decision one makes, one would believe it would not be the morally right action if one had grown up in a different culture. If this were so, then almost everyone throughout history has had a mistaken view of the status of their own moral beliefs.14 This is disastrous for any decision, let alone one concerning marriage. For a relativist, bride price is neither right nor wrong, but both – a sort of Schrödinger’s cat of morality – which is fundamentally unhelpful, as it is like a chess player refusing to make the first move so that they can never lose.
The handing over of bride price from the groom’s family to the bride’s is often a formal occasion with many people gathered. Here large tubers of yam and other assorted items have been brought together to be given to the bride’s family. The men and women are wearing formal but celebratory clothing.
Therefore, if we are to maintain tolerance for diverse views while seeking a universal good, we should clearly choose moral absolutism, not moral relativism - after all, moral relativism denies that there is any such thing as a universal good. Even though relativism presents itself as open-minded towards all moral values, it seems, in fact, to make some kinds of intolerance immune from criticism – for instance, how is the relativist to explain why discrimination, in a culture which promotes it, is wrong, not only for us, but also for members of that culture? And so we can conclude that moral absolutism shows the greater utility in all cases of moral conflict, and therefore we should apply it to the case of bride price. Even if bride price is accepted within some African cultures, this in no way means it is ethically valid. As Westerners, although we must be cautious about regarding our own moral code as superior, the evidence of this specific custom’s effects on the social equity and health of both men and women shows that something must change in the African cultures that practise bride price. Given the disastrous consequences of bride price, the seemingly obvious solution would be to ban it, as it only perpetuates the negative stereotype of women as baby-making machines who
14 J Speaks, pp. 2–3
remain at home without proper education. Shouldn’t we be seeking to change the definition of a woman’s ‘duty’ in Africa, from bearing children and maintaining the home to the equal and unbiased playing field we (are attempting to) have in the West? Instead of a woman’s path being set from birth, it could be objectively, absolutely and universally free and equal. Therefore, I would argue, bride price is not morally acceptable.
What change can be made? I assert that it is not enough simply to announce that bride price is morally incompatible with equality and libertarian ideals. The current situation must be actively eradicated. Long-term solutions are required, not just short-term compromises, and this comes from tackling issues at their source, namely bride price. Both governmental and non-governmental groups have taken measures to attempt to eradicate this discrimination through legislation and active charitable support. MIFUMI, an international non-governmental women’s rights organisation based in Uganda, also came to learn that traditional and cultural practices across many societies are the main drivers of violence against women and girls. They agree that bride price ‘perpetuates male dominance, power and control over women and girls, denying women and girls their rights entitlements and enjoyment’. Thus, MIFUMI continuously campaigns for changes and reforms in these cultural practices. MIFUMI successfully petitioned the Government of Uganda and was authorised to hold the first social referendum in Uganda in December 2001, which resulted in 60% saying yes to the reform of bride price. In 2007 MIFUMI took the Ugandan Government to the Constitutional Court, asking the court to rule that the practice of bride price is unconstitutional; in particular, it complained that bride price, once paid, should not be refundable if the couple divorce. MIFUMI has most recently pushed for legal reform, resulting in the enactment of the Tororo Bridal Gifts Ordinance in 2009, and a no-refund ruling on bride price put in place by the Uganda Supreme Court on 6th August 2015.15
15 ‘Bridewealth’, Anthropology of Kinship, World Heritage Encyclopaedia
16 MIFUMI, International NGO, May 30th 2016
However, MIFUMI is far from protecting all Ugandan women, let alone all women stuck in places that practise bride price. Customary law cannot easily be changed by a few activists from an international NGO like MIFUMI, but only from within the indigenous society as a whole. Bride price is interwoven into the fabric of African life and therefore extracting it is not as easy as it would first seem; even if some people support the change, an authoritarian figure dictating cultural practice would be unwelcome to many. To change traditional law on bride price, therefore, is difficult, as it is guarded by society; this is especially evident in rural areas, where it is held in high regard. For example, the whole culture of the people of Ankole, a traditional kingdom in Uganda, is deeply connected to the institution of bride price. Its custom connects families for a lifetime and women are proud of the extremely high value they receive compared with the Baganda or the Rwandese. Of course, alongside constitutional changes, changes in customary law would be necessary to abolish the practice. And customary law is not changeable by decision, but develops over time as social attitudes evolve.16
Conclusions
Both the weight of the evidence and the philosophical arguments point to the conclusion that bride price should not and cannot be continued. The only way this can be achieved is by concluding that Western moral values are superior to those of the African cultures that sustain bride price, and therefore by actively attempting to change their system. Using bride price as common practice is directly comparable to buying one’s wife, which is against the personal freedoms laid out in the UDHR and so should be abolished. Marriage customs should not subjugate women’s rights. While moral relativism concludes that we should not interfere with a contrasting culture’s practices, this denies the true gravity of the repercussions if we let this continue. Just as we educate the young on moral law, we must educate each other’s societies to ensure a collaborative and universal ethic. From analysing these problems, there is only one suitable solution. We can infer that there is a drastic need to change African society on both a small and a large scale, taking a morally absolutist view. This is because moral relativism
allows these consequences and denies the necessity to combat them. Surely we should be allowed, obligated even, to judge another culture ethically to ensure justice for all?
Bibliography
Adewimni, B., ‘Bride price app controversy’, http://www.independent.co.uk/news/world/africa/bride-price-app-that-allows-women-to-calculate-their-dowry-has-caused-controversy-in-nigeria-9456861.html Accessed 28/11/16 (2014)
Alupo, J., ‘Bride Price and Gender Violence’, a paper presented to the participants at the International Conference on Bride Price and Development, Uganda (2004)
Ashraf, N., Bau, N., Nunn, N. and Voena, A., ‘Bride Price and Female Education’, http://scholar.harvard.edu/files/nunn/files/bride_price_manuscript_01.pdf (2016)
BBC News, ‘Dowry in Africa’, http://news.bbc.co.uk/1/hi/world/africa/3604892.stm Accessed 12/12/16 (2004)
BBC News, ‘Bride price practices in Africa’, http://www.bbc.co.uk/news/world-africa-33810273 Accessed 12/12/16 (2015)
BBC News, ‘Bride price: an insult to women, a burden to men?’, http://news.bbc.co.uk/1/hi/world/africa/3604892.stm Accessed 21/11/16 (2004)
Clark, S. and Brauner-Otto, S., ‘Divorce in Sub-Saharan Africa’, Population and Development Review, Vol. 41 (Dec 2015)
Encyclopaedia Britannica, ‘Dowry’, https://www.britannica.com/topic/dowry Accessed 21/11/16
Engel, P., ‘Divorce Rates around the World’, Business Insider (May 2014)
FSD Foundation for Sustainable Development, ‘Gender Equality Issues in Uganda’, http://www.fsdinternational.org/opportunities?tid%5B37%5D=37&tid_1%5B42%5D=42 Accessed 28/11/16 (2008)
Kaye, D.K., Mirembe, F., Ekstrom, A.M., Kyomuhendo, G.B. and Johansson, A., ‘Implications of bride price on domestic violence and reproductive health in Wakiso District, Uganda’, National Center for Biotechnology Information, US National Library of Medicine (2005), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1831942/ Accessed 28/11/16
Chabata, T., ‘The commercialisation of lobola in contemporary Zimbabwe: A double-edged sword for women’, OSISA (October 23rd, 2012)
Peoples, J. and Bailey, G., Humanity: An Introduction to Cultural Anthropology (1988)
Polavarapu, A., ‘Uganda’s marriage and divorce bill on the table again’, https://ilg2.org/2013/03/10/ugandas-marriage-and-divorce-bill-on-the-table-again/ Accessed 21/11/16 (2013)
Sambe, N., Avanger, M.Y. and Agba, S.A., ‘The Effects of High Bride-Price on Marital Stability’, IOSR Journal of Humanities and Social Science, pp. 65–69 (Nov–Dec 2013)
Waweru, H., The Bible and African Culture: Mapping Transactional Inroads (2008)
World Bank, ‘Engendering Development through Gender Equality in Rights, Resources and Voice’, http://documents.worldbank.org/curated/en/512911468327401785/pdf/multi-page.pdf Accessed 12/12/16 (2001)
HOW WAS THE WRITING OF FRIEDRICH NIETZSCHE INTERPRETED AROUND THE WORLD AND HOW IS IT CONSIDERED TODAY?
Leo Nasskau
The philosopher Friedrich Nietzsche has been scorned and misappropriated by philosophical commentators and amateurs alike. He is often considered immoral and anti-Semitic, but are these accusations warranted? What did he really believe and what impact did he have upon the world? Nietzsche was a German philosopher who lived from 1844 to 1900. Exerting a profound influence on society and Western philosophy, his works touched on existentialism, nihilism, perspectivism, and atheism. He developed concepts such as the master-slave morality, the übermensch, and eternal recurrence in a renowned, evocative style. By using modern-day publications and sources from the period, I will investigate not only how his ideas were interpreted around the world and how these interpretations evolved, but also the events which influenced Nietzsche’s own way of thinking and shaped his chains of thought. The impact of his sister Elisabeth, a Nazi sympathiser, and his untimely descent into mental illness provide an important contrast to his own stated “anti-anti-Semitism”1 and opposition to nationalism, whilst his legacy and influence in today’s world is also an aspect that warrants analysis and evaluation. Nietzsche lived a largely nomadic life which, along with his fervent passion for French scientific (Descartes) and Enlightenment (Voltaire) thought, influenced his own ideas. Other thinkers from whom he drew inspiration include Goethe and Schopenhauer, whilst the composer Richard Wagner was one of Nietzsche’s largest influences. Additionally, his life experiences, such as his bias against resurgent German culture – which developed due to his rejection of contemporary religion and ‘herd-like’ society – and time spent with Wagner, also played a role in shaping his publications. The influence of Ralph Waldo Emerson was just as strong as that of Wagner but far more consistent; it influenced him such that Nietzsche became an intellectual free spirit who sought to counter and disrupt the passage of history, of tradition, and of convention. Over time, he came to identify himself as a sovereign self who refused to exalt inherited ideals.2 This theme grows as Nietzsche ages, and is most profound in Thus Spoke Zarathustra. In this essay, I will examine the changing message of Nietzsche’s publications and how this came to be. Furthermore, I shall demonstrate how Nietzsche should be deemed innocent of the later abuse of his philosophy and show what he really represents.
1 Yovel, Yirmiyahu, Dark Riddle: Hegel, Nietzsche, and the Jews (University Park, PA: Penn State University Press, 1998)
2 Ratner-Rosenhagen, Jennifer, American Nietzsche (Chicago: University of Chicago Press, 2010)
Friedrich Nietzsche
Nietzsche is a philosophical figure who divides opinion to greater extremes than most. Although the subject of much work by both mere enthusiasts and notable philosophical commentators, Nietzsche is often associated with racist and anti-Semitic ideas. However, the reality could not be more different. After his appetite for war, and the nationalistic fervour that accompanied it, was diminished following his involvement in the Franco-Prussian War in 1870, he became a changed thinker, nurturing the evolution of his ‘will to power’ concept. Granted leave from his post in Basel for ‘medical services’ with Prussian forces in August, he worked as a nursing orderly until September 7th, when he collapsed with dysentery and diphtheria after having tended casualties for three days and nights continuously.3 By the end of October he was back in Basel – whilst the war provided impetus for his ‘will to power’ idea, it was his last taste of combat and indeed of any strenuous physical exertion, barring perhaps his contraction of syphilis in a brothel in 1867, as supposed by Thomas Mann.4 Following this, Nietzsche had a principled opposition to nationalism and the new German Reich, reaffirmed in many letters he wrote.5 Indeed, Nietzsche was more influenced by his own experiences than by contemporary or past philosophers. Appointed as a professor at the University of Basel aged just 24, he renounced his Prussian citizenship and remained officially stateless for the rest
of his life.6 However, in 1879, following a decline in his health, Nietzsche had to resign his position at Basel. As a soldier he had suffered a riding accident in 1868, and throughout his life he experienced various disruptive illnesses such as violent indigestion and migraine headaches. These persistent conditions were aggravated during his time as a professor, forcing him to take ever longer holidays until work became impractical. Living off his pension, Nietzsche spent much of his time travelling after leaving his post and seems to have lived a “life of utter isolation.” He split his time between the northern coast of the Mediterranean (namely Nice and Genoa) and the mountains of Switzerland, where he spent most of his summers. Unlike some philosophers, many of Nietzsche’s ideas were not drawn from the publications of his peers. Rather, he was inspired, in particular, by the lifestyle and life experiences he subjected himself to. Of course, Nietzsche did his fair share of philological and philosophical learning - in his youth he was particularly fond of Schopenhauer and Emerson - and undeniably aspects of their ideas can be seen in his. However, he diverted from the path taken by most philosophers by both chance and design. His illness played a part, along with his turbulent upbringing; his father died when he was five, and this was followed by the death of his brother Ludwig six months later. He adopted uncommon attitudes, such as those to Christianity and the übermensch, as well as an aphoristic writing style. This is both a reflection of the environment in
which he wrote and of the style of the French moralists and the Russian philosopher Afrikan Spir. One figure to whose influence Nietzsche was susceptible was Richard Wagner. As Walter Kaufmann puts it, “It was Wagner’s presence that convinced Nietzsche that greatness and genuine creation were still possible, and it was Wagner who inspired him with persistent longing first to equal and then to outdo his friend.”7 It is in Wagner’s Ring cycle and his famous opera Tristan and Isolde that Wagner’s influence on Nietzsche is most evident. From the Ring cycle, Wotan’s desire for power through acquiring Alberich’s ring contributed to the construction of Nietzsche’s will to power concept. Additionally, Wagner’s rejection of Christian-imposed art, together with Tristan and Isolde, inspired in Nietzsche the hope that a world which embraced Greek tragedy and ontology could be revived. Nietzsche went on to write about these beliefs and hopes of revival in The Birth of Tragedy, which he dedicated to his then mentor Wagner. However, in 1876 Nietzsche’s friendship with the controversial composer fell apart. Visiting that year’s Bayreuth festival, Nietzsche was sickened by the banality of the shows, and the baseness of the public repelled him. He was also alienated by Wagner’s championing of “German culture”, which he felt was a pandering to Christian pieties and a surrender to the new German Reich. Nietzsche expressed his displeasure with the later
3 Hollingdale, Nietzsche: The Man and his Philosophy (Cambridge University Press)
4 Mann, Thomas, ‘Nietzsches Philosophie’ in Neue Studien (Stockholm: Berman)
5 On November 7th 1870, writing to life-long friend Carl von Gersdorff, Friedrich Nietzsche maintained “I regard the Prussia of today as the greatest dangers for culture”; on February 24th 1887, Nietzsche wrote to artist Reinhart von Seidlitz, stating “for present-day Germany, I no longer have any respect.”
6 Hecker, Hellmuth, ‘Nietzsches Staatsangehörigkeit als Rechtsfrage’, Neue Juristische Wochenschrift, Jg. 40, 1987, nr. 23, pp. 1388–1391
7 Kaufmann, Walter, Nietzsche: Philosopher, Psychologist, Antichrist (N.Y.: Vintage, 1968)
Wagner in “The Case of Wagner” and “Nietzsche contra Wagner”. During his life, Nietzsche was himself woefully unsuccessful as an author and his fame was negligible. Now, with the new style of Zarathustra, his work became even more alienating and the market received it with scarce recognition. Although in the early 20th century Nietzsche’s work enjoyed a respectable growth in popularity, it was hugely discredited as his ideas were adjusted to the needs of fascism and anti-Semitism. Since then, the study of Nietzsche and his works has been rather limited, and in Europe especially he has remained a discredited figure, despite the work of Kaufmann and others to change this. In modern times, the relevance and fame of Nietzsche vary throughout the world. Asian nations, notably Japan, have a far more positive view of the man with a “pan-European eye”8 than those in Europe do, but even in Asia the style and content of his work have drawn criticism and censorship; Mao Zedong prohibited Nietzsche’s texts in China during parts of his rule. Indeed, Nietzsche was influenced by other philosophers, but part of what sets him apart is the extent to which his own life experiences had a grip over his ideas. In 1865, Nietzsche moved to Leipzig to study at the university. There, he was introduced to the works of Arthur Schopenhauer in a local bookshop. Paul Strathern writes that the younger Nietzsche, then 21, was so captivated by Schopenhauer’s philosophy that he “broke his rule on book buying”
and bought the book without reading through its pages beforehand. Later that night he spent hours avidly poring over it. Nietzsche appears to have had a great fondness for Schopenhauer. Schopenhauerian views held great influence over him for at least a decade, demonstrated by Nietzsche’s 1876 publication Untimely Meditations, which included a positive essay titled Schopenhauer as Educator; it suggested how Schopenhauer’s ‘philosophic genius’ might well revolutionise German culture. Whether Nietzsche ever truly departed from Schopenhauer is still debated. Of Schopenhauer’s works, ‘The World as Will and Representation’, first published in 1818, was of particular importance to Nietzsche because it introduced to him the idea of the will. Schopenhauer believed that the will to life drives human actions, and Nietzsche at first adopted this idea, soon modifying it in light of his appreciation of ancient Greek culture and Richard Wagner’s Ring cycle. Ancient Greek culture focused on tragedy and how it relieved some of the terrors of life; Nietzsche deemed this culture to have fallen from favour following Socrates’ introduction of rationalism. Ancient thinkers such as the 5th-century Pseudo-Dionysius the Areopagite (Dionysius) emerge as sources for Nietzsche’s beliefs, and Nietzsche valued the Gesamtkunstwerk (total artwork) that Greek tragedy embodied. Dionysius wrote of catechumens, penitents, and the demon-possessed (people of lesser ecclesiastical orders) who were unable to appreciate wholesome ‘contemplation’ of God9. Whilst
Nietzsche believed that a redemption of the arts was necessary, he was opposed to the “anti-artistic vibes coming off Christianity”10 and to a revitalisation of the laity able to receive ‘intelligible truth’. This view was shared by Richard Wagner, prolific composer, conductor, and one-time closest friend of Friedrich Nietzsche. After meeting in 1868, Nietzsche was drawn into the Wagners’ closest circle; he greatly admired the couple and in 1870 he gave Cosima Wagner, Richard’s wife, the manuscript of “The Genesis of the Tragic Idea” as a birthday gift. Wagner’s influence over Nietzsche can be seen most clearly through Nietzsche’s reactions to Wagner’s Ring cycle and Tristan and Isolde. Three of the characters, Wotan, Alberich, and Siegfried, are all driven by a desire to acquire the ring and a need for power. Nietzsche noted how, when Wotan stole the ring, he became unwilling to give it up - even for the representation of love, Freia. In particular, Nietzsche acknowledged Wotan’s desire to expand and conquer. According to Mark Berry, Wotan, in Nietzsche’s mind, “[stood] as an unacknowledged progenitor of his noble-race typology”. Nietzsche’s view of Wotan as a ‘progenitor’ indicates that he saw him as a leader figure.11 These characters all contributed to Nietzsche’s will to power concept. Charles Darwin’s Origin of Species prompted Nietzsche to coin the phrase will to power. Darwin suggested that species succeeded in accordance with the ‘survival of the fittest’ maxim; those most able to dominate their environment
8 Strathern, Paul, The Essential Nietzsche, Nietzsche’s letters (Ebury Publishing)
9 Corrigan, Kevin and Harrington, L. Michael, ‘Pseudo-Dionysius the Areopagite’, The Stanford Encyclopedia of Philosophy (Spring 2015 Edition), ed. Edward N. Zalta, https://plato.stanford.edu/archives/spr2015/entries/pseudo-dionysius-areopagite
10 Chaturvedi, Rahul, Richard Wagner’s Influence on Friedrich Nietzsche
11 Berry, Mark, The Positive Influence of Wagner upon Nietzsche, p. 14
will in turn dominate the gene pool. However, Nietzsche was unable to link this biological theory to the characters in the Ring cycle. Many would argue it folly to attempt to link fictional characters with biological theory, and misleading to draw conclusions from it – Nietzsche’s admiration for Wagner somewhat clouded his judgement – yet Nietzsche sought to justify the actions of Wotan and Siegfried. Nietzsche’s ideal, the will to power, identified the will to expand and conquer as the primary motivation of certain species. Thus, the extent of Wagner’s influence over Nietzsche can be seen in how the Ring cycle persuaded Nietzsche to ignore – and indeed, be critical of – the thoughts of contemporary biologists and instead base his own ideas on characters of Wagner’s imagination. In the American philosopher Ralph Waldo Emerson, Nietzsche found his “brother-soul”12; Emerson was to Nietzsche more than a profound thinker; he was a consistent influence. Emerson
Ralph Waldo Emerson
was Nietzsche’s first contact with philosophy, having first read Emerson’s The Conduct of Life aged 17. It was not until two years later that he expressed interest in Plato, and another two years until he discovered Schopenhauer in Leipzig. His very first philosophical texts were a direct response to his reading of Emerson. He was inspired to produce two essays based on Emerson’s “Fate”: “Fate and History: Thoughts” (1862) and a follow-up, “Freedom of Will and Fate”. These essays outline many of the major tenets of his later writings. Jennifer Ratner-Rosenhagen writes that it was ‘Emerson who taught Nietzsche a way forward, a way to begin imagining life as a process of thought’13. Nietzsche’s appreciation of Emerson is evident. He possessed four volumes of Emerson’s essays riddled with annotations and underlinings. In Nietzsche’s personal library, they appear to be his most worn volumes, which suggests their high importance to him. George Kateb tells a story to demonstrate how Nietzsche was ‘Emerson’s best reader’: on a return trip from holiday in 1874, Nietzsche’s suitcase containing these essays was stolen. Unable to proceed without it, he promptly procured another copy.14 Emerson’s reach is evident throughout Nietzsche’s work. His book The Gay Science was so named in a nod to Emerson’s assertion that he was the ‘professor of the joyous science’, whilst it was a rereading of Emerson’s Spiritual Laws (1841)
which gave him the name Ecce Homo for the title of his autobiography. Where Emerson calls life ‘a search after power’, Nietzsche says ‘life is the will to power’, and the infamous phrase ‘God is dead’ is far too similar to ‘as if God were dead’ (Emerson referring to ‘bankrupt spirituality’) to be considered unlinked. While Emerson said ‘every evil to which we do not succumb, is a benefactor’, Nietzsche concludes ‘what does not kill me makes me stronger’ and calls Emerson a ‘master of prose’, ‘the most fertile author of this century’. In Twilight of the Idols, he remarks that ‘Emerson possesses that good-natured and quick-witted cheerfulness that discourages all earnestness; he has absolutely no idea how old he is or how young he [is still to] be.”15 All of these connections have led many, such as Harold Bloom, to infer the deep relationship between the pair. Bloom’s statement that ‘Nietzsche loved Emerson’ accurately identifies the consistent and profound influence Emerson had on Nietzsche. There were three particular individuals who encouraged Nietzsche to adopt aphorisms – short, poetic phrases used to convey his ideas – rather than essays: Afrikan Spir, Georg Lichtenberg, and Paul Rée. Afrikan Spir, described by Nietzsche as “an outstanding logician”, and his 1873 Thought and Reality “had a lasting impact” on Nietzsche.16 In his 1878 book Human, All Too Human, Nietzsche presented propositions originally from Spir.17 M.S. Green observes that Spir began to exercise significant influence
12 Ratner-Rosenhagen, Jennifer, American Nietzsche (Chicago: University of Chicago Press, 2010)
13 Ratner-Rosenhagen, American Nietzsche
14 Kateb, George, Emerson and Self-Reliance (SAGE Publications, 1995)
15 Nietzsche, Friedrich, Twilight of the Idols
16 Safranski, Rüdiger, Nietzsche: A Philosophical Biography (W. W. Norton & Company, 2003), p. 161
17 in Section 18 (2,38; HH I §18)
on him18, whilst Thomas H. Brobjer identifies the 18th-century scientist Georg Lichtenberg as one who “may well have contributed an important influence to Nietzsche’s use of aphorisms”, labelling Nietzsche’s friend Paul Rée as the strongest direct influence.19 In 1875 Rée published Psychologische Beobachtungen, in which he explicitly defended such a use of aphorisms – this encouraged Nietzsche to follow suit. The theologian and philosopher Dionysius influenced Nietzsche’s übermensch theory as well as his will to power. As early as the late 5th century, Dionysius wrote of an ‘over-being’ in his Divine Names. While it is difficult to prove this caused, or is linked to, Nietzsche’s übermensch, we know that Nietzsche was familiar with Dionysius; in The Birth of Tragedy Nietzsche talks of the fusion of Dionysian (representing passion) and Apollonian (creating structure) forms essential for real Greek tragedy. He also planned the publication of the poems making up his collection Dionysian Dithyrambs. Additionally, Nietzsche sent short letters (known as the Wahnzettel, meaning ‘madness letters’) to his friends following his 1889 collapse, signing most of them ‘Dionysos’ – this demonstrates his connection with Dionysian themes. Indeed, Nietzsche would later go on to create his own ‘over-being’ in his übermensch, which he believed would recreate a culture combining
Dionysian and Apollonian themes and thus restore to the world ancient Greek culture and wholesome artistic expression. It was the philosophers that Nietzsche had once read and learnt from that he came to criticise in later years. While he once noted that Immanuel Kant was “the true son of his century”20, Nietzsche later criticised Kant as “inferior in his psychology and knowledge of human nature; way off when it comes to great historical values (i.e: the French Revolution); a moral fanatic” in his Nachlass, published posthumously in The Will to Power.21 Nietzsche himself believed ‘great historical values’ were misguided; in Beyond Good and Evil, he attacked Kant’s thing in itself and Descartes’ cogito ergo sum as unfounded beliefs based upon previous perceptions. Nietzsche’s atheism – his account of “God’s murder”22 – expresses his condemnation of all traditional perspectives and values23. Additionally, he describes Plato “as a bore”24, Mill as a “blockhead”, and of Spinoza, whom he once called his “precursor” in a letter to Franz Overbeck (July 1881), he asked, “How much of personal timidity and vulnerability does this masquerade of a sickly recluse betray?”25 The influence of these figures is diminished by their lack of continuity
and consistency. Indeed, Nietzsche saw fit to distance himself from almost everyone, both from the past and in the present. This was in part due to Nietzsche’s personality and the demands it brought. People did not like Nietzsche the person; he had no real friends, in the words of Paul Strathern. This is only partly true. Those such as Peter Gast, although very few in number, stayed with him throughout his life. Nonetheless, it is accurate to say that Friedrich Nietzsche lived a lonely, nomadic life, which was largely to the detriment of his quality of life and work. Even of Richard Wagner, who Paul Strathern argues was the “only one man Nietzsche knew properly throughout his life”, did Nietzsche ultimately become disenchanted26 – after the 1876 Bayreuth festival. He lived in “utter isolation”, frequently spending his summers in Sils Maria (in the eastern Swiss canton of Grisons), in his own words “1500 metres above the world and even higher above all human beings”. This quote, from one of his letters, can easily be interpreted as a consequence of his self-dominated lifestyle and a symptom of his growing arrogance - in Ecce Homo he describes Thus Spoke Zarathustra as the ‘highest and deepest book in existence’. Throughout the 1880s he found his utter solitude and lack of recognition ever more unbearable. His meagre lifestyle of inexpensive rooms, continuously working and eating in cheap restaurants, not only embodied
18 Green, M.S., Nietzsche and the Transcendental Tradition (University of Illinois Press, 2002)
19 Brobjer, Thomas H., Nietzsche’s Philosophical Context: An Intellectual Biography (University of Illinois Press, 2010), pp. 63, 41
20 Nietzsche, Friedrich, ‘Preface (1886)’, Daybreak: Thoughts on the Prejudices of Morality, trans. R.J. Hollingdale (Cambridge: Cambridge University Press, 1982)
21 Nietzsche, Friedrich, The Will to Power, § 101, 64. See also § 382, 206
22 Nietzsche, Friedrich, The Gay Science (1882), §125
23 Wicks, Robert, ‘Friedrich Nietzsche’, The Stanford Encyclopedia of Philosophy (Fall 2016 Edition), ed. Edward N. Zalta
24 Nietzsche, Friedrich, Twilight of the Idols, or, How to Philosophize with a Hammer (1888): What I Owe to the Ancients
25 Russell, Bertrand, A History of Western Philosophy (Routledge, 2004)
26 Strathern, Paul, The Essential Nietzsche, Nietzsche’s letters (Ebury Publishing)
his often negative and gloomy conclusions, but also helped provoke conclusions such as this, from On the Genealogy of Morals27: “To live is to suffer, to survive is to find meaning in the suffering”. Indeed, Colin Ellard, an international expert in the psychology of navigation and psychogeography (the study of the relationship between the places in our everyday life and the effects that those places have on our minds), maintains that “everybody knows that where you are influences what you do and how you feel at some level”. Meanwhile, the contemporary commentator F.C.S. Schiller writes that “the mark of [Nietzsche’s] emotional personality is over everything he wrote”28. It seems that the effect of Nietzsche’s surroundings on his work is evident not just to modern-day readers, but was apparent to commentators of the 19th century as well. Strathern describes the imagery at Sils Maria thus29: “Mountains rise sheer from the lake shore. You can take the remote paths of the mountainsides where Nietzsche used to walk. Lonely crags and foaming mountain torrents. The atmosphere of this region – the remote peaks, the sweeping views, the sense of isolated grandeur – creeps into the tone of his writings.” While Nietzsche spent his summers in the mountains of Switzerland, his whole life was obstructed by ‘blinding headaches and debilitating illnesses’ which regularly rendered him incapacitated for the majority of a week. Although the environment in
which he chose to live compounded this issue, there were other factors which caused his premature physical decay. His 1868 riding accident and the developing stages of syphilis served to exacerbate his condition. Nonetheless, the surroundings in which Nietzsche worked can be deemed to have had a significant influence on him.
The ‘Nietzsche rock’ which inspired Thus Spoke Zarathustra
Nietzsche’s experiences as a child fostered convictions which stayed with him throughout his life. As a child, Nietzsche earned a scholarship to attend the internationally recognised Schulpforta school. In spite of this privileged opportunity, he found the regimented system unsatisfactory and at times boring. Towards the start of his school career he was committed to the idea of education, labelling schooling the “foundation of all knowledge” (Grundveste alles Wissens), but he soon changed his view as his learning of the historical sciences shook that foundation. Indeed, in his essay Fate and History (1862), he argued that historical research had discredited his views of Christianity – at this point Nietzsche was 18. Jennifer Ratner-Rosenhagen observes that his poetry of this year echoes feelings of ‘intellectual waywardness and spiritual crisis which grew throughout his life’.30 Whilst possessing characteristics of the poetic style he would go on to use, the following verse also demonstrates that Nietzsche’s nihilistic views on life have their roots in his youth. It provides a key insight into his attitudes towards life and education:
I know not what I love,
I have neither peace nor rest,
I know not what to believe,
What life am I living, why?
So, where was he to gain an education to inspire him to pursue his life the way he did? It was with school friends Wilhelm Pinder and Gustav Krug that Nietzsche was to find literary satisfaction, for the trio set out to subvert the sense of regimentation he felt Schulpforta embraced by establishing a literary group, Germania. In this, the three were each expected
Nietzsche during his time at Schulpforta
27 Nietzsche, Friedrich, On the Genealogy of Morals
28 Schiller, F.C.S., ‘Nietzsche and His Philosophy’ (Book Buyer, August 1896)
29 Strathern, Paul, The Essential Nietzsche (Ebury Publishing)
30 Ratner-Rosenhagen, Jennifer, American Nietzsche (Chicago: University of Chicago Press, 2010)
to produce a work of criticism, poetry, a musical composition, or some original research during their quarterly “synods”. Thus, at the age of 16, he was producing essays on Cicero, Byron, and Napoleon, as well as musical compositions and poems. Nietzsche was of the opinion that he “saved [his] private inclinations from the uniform law [and] lived a concealed cult of the certain arts.”31 Carl Pletsch has shown that Nietzsche regarded his work with Germania as his true education; “it fostered a conviction that true knowledge cannot be found within an institutional setting, but is found through creative aspiration on one’s own terms.” This attitude of subverting the traditional path is a clear theme of Nietzsche’s work and is perhaps the sole constant in his life. In this light, Nietzsche is shown to be committed to the development of his knowledge and education, particularly in his youth. Although this commitment was at times sidelined in his life – for example, during the overriding influence of Richard Wagner – it remains one of the key tenets of Nietzschean thought; controversial thought is something Nietzsche’s books, and indeed the philosopher himself, have always embodied. Illness was a constant feature of Nietzsche’s life and it eventually caught up with him in the winter of 1889. On a Turin street in northern Italy, the most controversial philosopher of the 19th century collapsed in a mental fit. While Paul Strathern acknowledges the role of excessive work and solitude as a factor, he identifies syphilis, now in its tertiary stage, as the primary cause of his mental illness32. He was cared for by his mother until her death in 1897,
after which he was put under the care of his sister, Elisabeth Förster-Nietzsche. Elisabeth, widow of the notorious anti-Semite Bernhard Förster, was the worst person who could have been put in charge of him. Elisabeth began editing her brother’s unpublished notebooks and weaving anti-Semitic ideas into them. The result was published as The Will to Power in the early 1900s (after Friedrich Nietzsche’s death at the turn of the century) and emerged as the primary source of contemporary thought about Nietzsche. Despite the work of the Nietzsche scholar Walter Kaufmann to purge Nietzschean works of the taints of his sister, Nietzsche himself has often been labelled a supporter of Nazi and anti-Semitic ideas. The influence of his sister has been profound, and whilst Nietzsche’s books spoke for themselves in the years preceding the Second World War – he enjoyed a period of popularity in the Western world – following the horrors of Nazi rule, Nietzsche has been among those scapegoated for Hitler’s rise to, and actions in, power. Contemporary Europeans were firmly anti-Nietzsche. Even during his life, Nietzsche was socially resented due to his views, mainly those regarding Christianity, which originated from his reading of Greek and Roman texts. These taught him that contemporary religion was a shadow of its former self; society had lost its passion and enthusiasm for artistic expression and understanding. In 1883 he applied for a lecturing post at Leipzig University. However, he was informed that any “attempt at Leipzig would have been a failure owing to the fact that the Faculty would never dare recommend
31 Ratner-Rosenhagen, Jennifer, American Nietzsche (Chicago: University of Chicago Press, 2010)
32 Strathern, Paul, The Essential Nietzsche, Nietzsche’s Letters (Ebury Publishing)
33 Nietzsche, Friedrich, Letter to Peter Gast (August 1883)
34 Ratner-Rosenhagen, Jennifer, American Nietzsche (Chicago: University of Chicago Press, 2010)
[Nietzsche] to the Board of Education in view of [his] attitude towards Christianity and the concept of God.”33 At a time when beliefs such as Nietzsche’s about religion were socially reprehensible, Nietzsche became an outcast from society as a consequence of his religious convictions. The sales of his books underline Nietzsche’s unpopularity. In 1885, he printed just 40 copies of the fourth part of Zarathustra; most of these went to friends and family. By the early 1880s, Nietzsche had earned a small (that is to say, tiny) readership in the USA. Two letters from fans, in early 1881 and late 1882, are all that attests to growing popularity across the Atlantic34. The first of these came from three admirers from Baltimore, Maryland: Elise Fincke, her husband, and a friend. Upon the back of this letter, Nietzsche wrote ‘Erster amerikanischer Brief. Initium gloriae mundi’ (‘First American letter. Beginning of world fame’). Another, Gustav Dannreuther, informed Nietzsche that he had translated his essay on Schopenhauer “no less than three times” and that he had particular admiration for his “Inopportune Reflections” (“Untimely Meditations”). Whilst there is evidence of growing fame – the influential Danish critic Georg Brandes hosted popular lectures about Nietzsche at the University of Copenhagen in 1888 – this in no sense meant his ideas were popular, and Nietzsche’s predictions of ‘world fame’ were, in all respects, far from the mark. During the last decade of Nietzsche’s life and the first decade of the 20th century, support for his
thought was largely limited to avant-garde artists who placed themselves on the edges of established social norms35. Admittedly, Nietzsche’s ideas were not given an equal shot at spreading throughout Europe. Until 1907, all work about Nietzsche was in German, and thus its ability to appeal to wider audiences was diminished. However, this also implies there was little demand for translations of Nietzsche’s books, which undermines evidence attesting to his popularity. It is evident that the meagreness of Nietzsche’s reception in the 19th century was consistent throughout Europe and the world; his books had nothing like a wide readership, yet if they had, the scorn and criticism he would have attracted would have been astounding, simply because of the huge degree to which they contrast with traditional, accepted concepts. In the decades before Nazi rule, Nietzschean thought enjoyed a steady growth in popularity. According to Robert Wicks, “Nietzsche’s advocacy of new, healthy beginnings, and of creative artistry in general stood forth.” In particular, Nietzsche became associated with, and was greatly popular among, anarchist movements. Spencer Sunshine implies there was a large collection of things which drew anarchists to Nietzsche, including but not limited to ‘his hatred of the state; his disgust for the mindless social behaviour of ‘herds’; (and) his anti-Christianity’36.
Furthermore, translations of Nietzsche’s work were published in the anarchist journal Liberty, edited by the American Benjamin Tucker. Robert Holub suggests that ‘Tucker preferred the strategy of exploiting his writings’ and quotes Tucker as saying ‘Nietzsche says splendid things, often, indeed, Anarchist things’37. This disputes the claim that Nietzsche was actually an anarchist, whilst suggesting that it was anarchist groups too, not just Elisabeth Förster-Nietzsche, who lifted phrases from Nietzsche’s work to fulfil their own agendas. Nonetheless, it is clear that Nietzsche was frequently linked to, and used to inspire, anarchist thinking. Other anarchists he resonated with include the French individualist anarchist Emile Armand, who cited Nietzsche as ‘important to his way of thinking’38. In the years leading up to the First World War, Nietzsche was associated with both left-wing politics and right-wing militarism, and he was regularly caught between political conflicts. The Dreyfus Affair in France between 1894 and 1906 is an example of the politicisation of Nietzsche’s work. In 1895, Alfred Dreyfus was wrongly imprisoned for leaking French military secrets to the German embassy in a scandal which divided France into two camps: the pro-Army camp, and the anticlerical, pro-republican camp labelled ‘Nietzscheans’ by the former faction39. German soldiers were given editions of Thus Spoke Zarathustra
during the war, which implies that Nietzsche was highly regarded at the forefront of philosophical thinking. In Germany at least, Nietzsche was now widely known; Heidegger remarked that everyone in ‘his day’ was either ‘for’ or ‘against’ Nietzsche, and Alan D. Schrift writes that “the gesture of setting up ‘Nietzsche’ as a battlefield on which to take one’s stand against or to enter into competition with the ideas of one’s intellectual predecessors or rivals has happened quite frequently in the twentieth century.”40 While it is apparent that Nietzschean thought was more popular in the first three decades of the 20th century than in any previous period, the evidence does not suggest that he had notable fame in the wider world beyond Germany. Elisabeth Förster-Nietzsche was responsible for publicising her brother’s work in the guise of Nazi propaganda and her own German nationalist ideology. After Friedrich’s death, Elisabeth, who had recently returned home from a failed Germanic colony in Paraguay established by her and her husband Bernhard Förster, seized control of the rights to his work, which gave her the ability to edit it as she pleased – she changed her name to “Förster-Nietzsche” to further legitimise her right to do so. She withheld the publication of Ecce Homo until 1908, thereby preventing readers from understanding Nietzsche in the correct context, which contributed to the misinterpretations of Nietzsche’s writings. Walter Kaufmann writes that
35 Wicks, ‘Friedrich Nietzsche’
36 Sunshine, Spencer, Nietzsche and the Anarchists (2005)
37 Holub, Robert C., Nietzsche: Socialist, Anarchist, Feminist (University of California at Berkeley)
38 Armand, Émile, The Anarchism of Émile Armand (1907-64)
39 Schrift, A.D., Nietzsche’s French Legacy: A Genealogy of Poststructuralism (Routledge, 1995)
40 Schrift, A.D., Why Nietzsche Still? (University of California Press, 2000), pp. 184–185
“the content of Ecce Homo was critical to distinguishing Nietzsche’s philosophy from being Darwinist”41. Elisabeth emphasised sections stressing the need for breeding and, in manipulating his later published and unpublished works, portrayed Nietzsche as a supporter of eugenics – traits which became recognisable in Nazism. Furthermore, she wrote introductions to Nietzsche’s books which established her tainted view as the commanding one. Yet Elisabeth was hugely unqualified to comment on, let alone edit, Nietzsche’s books. Kaufmann cites a paragraph by Rudolf Steiner, a philosopher hired by Elisabeth to teach her about her brother’s ideas, to illustrate her philosophical incompetence:
The private lessons…taught me this above all: that Frau Förster-Nietzsche is a complete laywoman in all that concerns her brother’s doctrine… [She] lacks any sense for fine, and even for crude, logical consistency; and she lacks any sense of objectivity. …She believes at every moment what she says. She convinces herself that something was red yesterday that most assuredly was blue.
Elisabeth chose the Nazi Party to be the patron of the financially troubled Nietzsche Archive in Weimar. With Adolf Hitler and Albert Speer, Elisabeth planned the building of a Friedrich Nietzsche Memorial Hall, demonstrating her strong connections with them. Hitler visited seven times in all, which Ben Macintyre attributes to “the propaganda value of Nietzsche as a Nazi prophet”42. In associating Nietzsche with the Nazi
Hitler admires a bust of Friedrich Nietzsche during one of his visits to the Nietzsche-Archiv in Weimar, Germany, 1934.
Party, Elisabeth unknowingly discredited Nietzsche for future generations. Motives to explain why Elisabeth chose such a route – other than financial trouble – include her own political leanings and her desire to inflate her own fame. The woman Friedrich Nietzsche once fondly referred to as ‘Llama’ sent Mussolini a 50th birthday telegram, and Hitler attended her funeral after her death in 1935. Jenny Diski writes that she had a “girlish, giggling, delight in Fascism” – this drove her to back up her own convictions with her brother’s work. Elisabeth also sought to promote herself and rise beyond her conventional life. Before the opportunity to work on her brother’s writings arose, she behaved conventionally, living with her mother as an unmarried woman. In 1865, she wrote ‘I just come up with nonsense … I prefer not to think’ to her brother – but she came to resent this role as a conventional citizen. After she began to abuse Nietzsche’s work, her life changed dramatically. She held soirées on the ground floor of the Nietzsche-Archiv, entertaining visitors from around the
world, whilst she kept her brother upstairs. She offered guests a ‘viewing’ of Nietzsche, whom she dressed in a white pleated robe “like a Brahmin”. One journalist was permitted to watch the philosopher sleep and eat, such was the celebritisation and monetisation occurring. Elisabeth’s work reaped rewards. She was put forward for the Nobel Prize for Literature in 1911 and 1923, which demonstrates the dramatic change in lifestyle that her role in managing Nietzsche’s work heralded, whilst at her funeral (attended by Hitler) she was eulogised as the ‘priestess of eternal Germany’. It is clear that the fame Elisabeth Förster-Nietzsche won must have acted as a motive to continue misappropriating her brother’s work. From the already tainted The Will to Power, the Nazis drew an even more distorted interpretation. The Nazis’ conclusion focused on biological terms and introduced aspects of social Darwinism whereby the survival of the fittest (in Nazi eyes, the survival of the Aryan race) and the submission of the “half-man” were justified. In practice, this led to the persecution of the Jews, as well as of gypsies, the disabled, and ethnic minorities. The Nuremberg Laws and the concentration camps into which “mongrel races” would be packed are physical examples linking Nietzsche’s work to Nazi and fascist action. Elisabeth’s edits helped develop the idea of the übermensch, which the Nazis embraced as inspiration. The übermensch is a concept which Nietzsche describes as ‘the goal’. He describes it as ‘the meaning of the earth’, drawing parallels between man’s superiority over the ape and the
41 Kaufmann, Walter, Nietzsche (New Jersey: Princeton University Press, 1974)
42 Macintyre, Ben, Forgotten Fatherland: The Search for Elisabeth Nietzsche (New York: Farrar Straus Giroux, 1992)
The residence of Nietzsche’s last three years, along with the archive in Weimar, Germany, which holds many of Nietzsche’s papers
übermensch’s superiority over man43. William Shirer argues that Nietzsche’s “rantings must have struck a responsive chord in Hitler’s … mind” and that “Hitler considered himself the superman of Nietzsche’s prophecy”44. Evidence that Hitler was familiar with and supportive of Nietzsche’s work is that he often echoed phrases found in the latter’s publications. ‘Lords of the Earth’ is a frequent expression in Mein Kampf (1925), which also emphasised the importance of questioning the social values Nietzsche so abhorred, as well as the ‘terribleness’ needed to refine the cultural masses. Where Nietzsche uses the phrase ‘blood poisoning’ to illustrate the contamination of the ‘master’ or ‘whole-man’ by the ‘slave’, Hitler accuses the Jews of “(contaminating) the higher races and (weakening) the culture of the Aryan race with their blood”.45
“A daring and ruler race is building itself up… the aim should be to prepare a transvaluation of values for a particularly strong kind of man, most highly gifted in intellect and will. This man and the elite around him will become ‘lords of the earth’” – Nietzsche describing the ‘Superman’ in The Will to Power. Although Nietzsche made no allusion to the übermensch being Germanic, the Nazis expropriated this concept as a means to justify their policy of Aryan supremacy. Furthermore, phrases such as ‘blond beast’ are often used to link Nietzsche to the Nazis’ breeding policies, but the validity of this particular link is undermined by the fact that Nietzsche was describing a lion. Nonetheless, Hitler used the image of a blond beast to represent the Aryan master race, which he believed was responsible for all human culture. He wrote that the ‘whole man’ was to conquer the world: “If the power to fight for one’s own health is no longer present, the right to live in this world of struggle ends. This world belongs to the forceful ‘whole’ man and not to the weak ‘half’ man”46. By glorifying the übermensch, Nietzsche highlighted his disgust with democracy, which was reflected in Hitler’s Mein Kampf. Hitler wrote that “social activity must never and on no account be directed toward philanthropic flim flam, but rather toward the elimination of the basic
deficiencies in the organization of our economic and cultural life”47. Hitler’s distaste towards democracy is proven by history, yet since he had experienced the ineffectiveness of the Weimar government first-hand, Nietzsche cannot be deemed the primary inspiration for his conclusions. Even though Hitler’s reading rarely matched a rightful interpretation of Nietzsche’s philosophy, it is certain that he was influenced by the philosopher. However, it is unlikely that Nietzsche was the inspiration behind Hitler’s racist ideas; rather, he served as philosophical justification for beliefs already nurtured by the future German dictator. Following the Second World War, Nietzsche’s popularity was reset amid the stigma attached to Nazism. By this stage, as a direct consequence of the propaganda of Elisabeth Förster-Nietzsche and the Nazi Party, the German philosopher Alfred Bäumler had worked for the Nazis to ‘shamelessly’ attach Nietzschean quotes to Nazi collectivism.48 However, the work of Kaufmann and others in the 1950s successfully refuted the image of Nietzsche as a Nazi inspiration.
43 Nietzsche, Friedrich, Thus Spoke Zarathustra (1885) 44 Shirer, William, The Rise and Fall of the Third Reich (New York: Simon and Schuster, Inc., 1960) 45 Hitler, Adolf, Mein Kampf (Boston: The Riverside Press, 1943, translated by Ralph Mannheim) 46 Hitler, Adolf, Mein Kampf, pg. 257 47 Hitler, Adolf, Mein Kampf, pg. 30 48 Golomb, Jacob; Wistrich, Robert S., Nietzsche, Godfather of Fascism? On the Uses and Abuses of a Philosophy (Princeton University Press, 2002)
Walter Kaufmann
Kaufmann portrayed Nietzsche as a secular humanist and a forerunner of the existentialist movement49 in his books on Nietzsche, including Nietzsche: Philosopher, Psychologist, Antichrist (1950). Later, in 1965, Arthur Danto’s Nietzsche as Philosopher cast Nietzsche as a forerunner of analytic philosophy. While this claim is questionable – Nietzsche’s extensive use of aphorisms hardly justifies such a label – Kaufmann’s Nietzsche remains the common light in which Anglo-American readers regard Nietzsche in modern times. In the latter half of the 20th century, Nietzsche’s work appealed to artists and writers rather than to academic philosophical studies. Robert Wicks
maintains that Nietzsche’s influence was notable amongst French philosophical circles between the 1960s and the 1980s. His famed “God is dead” declaration, his perspectivism, and his emphasis upon power as the real motivator and explanation for people’s actions revealed new ways to challenge established authority and launch effective critiques upon society. Nietzsche influenced numerous Europeans from all walks of life to various extents. Many of these influences amount to little of note, yet those listed below are just some of the figures, highly relevant in their respective fields, who cite or display the influence of Nietzsche in their work. They range from playwrights and poets – W.B. Yeats described Nietzsche as the intellectual heir to William Blake50 (as did W.H. Auden51 and Arthur Symons52) – to dancers and musicians, to psychologists, sociologists, historians and fellow philosophers such as Albert Camus, Sigmund Freud, and Jean-Paul Sartre. Those influenced by Nietzsche also include Alfred Adler, Georges Bataille, Martin Buber, Jacques Derrida, Michel Foucault, Hermann Hesse, Carl Jung, Martin Heidegger, Rainer Maria Rilke, Lev Shestov, Richard Strauss, Mary Wigman, and Stefan Zweig.53 The earliest Anglo-American commentators on Nietzsche created around him the image of a celebrity. They seized upon his dynamic life story, personal idiosyncrasies, and his mental collapse in 1889 to drive interest in
the figure so at odds with so many ‘American values’. When English translations of his work reached America, they tended to reflect his later, more dramatic works, written in the final years before his 1889 collapse, rather than his more nuanced earlier publications – Allan Bloom writes that American society was given Nietzsche without having experienced the culture that brought him about54. Thus Spoke Zarathustra and The Antichrist were among the first of his books to reach America in the English tongue. Readers interpreted his philosophy through the context in which they understood him, and because the translated works were drawn from his most extreme period, Nietzsche in America became ever more radical to the American eye – in general, Americans took an ‘unwholesome, lighthearted, and softheaded approach’2. Accordingly, Nietzsche appealed to the outsiders in society, those who considered themselves different from contemporary American culture, and his popularity “grew year on year” throughout the 1920s55. Radicals idolised Nietzsche to the extent that they saw him as the intellectual Adam3. His idea of the übermensch ignited a firestorm in US radical periodicals, literary magazines, and journals. Radicals explored it in depth in order to further their understanding of the conflicts in society, the internal duel between the individual and the self. While religious commentators also sought to manipulate the übermensch and Nietzsche’s critiques of Christianity into one of secular
49 Wilkerson, Dale, Friedrich Nietzsche, a peer-review biography (Internet Encyclopedia of Philosophy) 50 Cullingford, Elizabeth, Yeats, Ireland and Fascism (Palgrave Macmillan UK, 1981) 51 Auden, W.H., New Year Letter (Faber and Faber, 1941) 52 Symons, Arthur, The Symbolist Movement in Literature (New York Dutton, 1899) 53 Wicks, Friedrich Nietzsche 54 Bloom, Allan, The Closing of the American Mind: How Higher Education Has Failed Democracy and Impoverished the Souls of Today’s Students (New York: Simon and Schuster, 1987) 55 Ratner-Rosenhagen, Jennifer, American Nietzsche, (Chicago: University of Chicago Press, 2010) By one of Kaufmann’s colleagues
culture – they identified Jesus as the ‘Christian Superman’ – it was the radicals who most keenly adopted Nietzsche. In Nietzsche, radicals found someone who thought the way they did; with critiques of Western society and his martial temperament and disciplined commitment, it was not only Nietzsche’s ideas which became idealised, it was his very nature. It was partly his endless commitment to pursuing his philosophy which attracted such a following. William English Walling, who founded the National Women’s Trade Union League in 1903, wrote “what concerns us is Nietzsche’s actual work … the activity itself”56. While Nietzsche’s publications imply that as a philosopher he was woefully unsuccessful, the furious and constant work behind it was of more importance to the radicals. This explains how among such groups, Nietzsche found a welcoming home. Emma Goldman complains that “shallow philosophers” are awed by Nietzsche’s “giant mind” which causes them to misconstrue his message,2 while George Cram Cook claims that “Nietzsche saved my soul from Tolstoi [and] Jesus” 2. This demonstrates the extent to which Nietzsche influenced the minds of these radicals, convincing them that few other works equalled his own. Letters of radical fans, addressed to the Nietzsche Archiv in Weimar refer to the way “He” (with a capital ‘H’) saved them from despair and they found new meaning in “His” prophetic writings2. Margaret Anderson supports this view,
claiming that Nietzsche was a ‘prophet’ of the radicals57, whilst Agnes Boulton observes that the pessimist playwright Eugene O’Neill regarded Zarathustra as his ‘sacred book’58. Nietzsche’s influence on O’Neill is showcased in his plays; Lazarus Laughed and The Great God Brown both have particularly clear pessimistic themes – the quote ‘life is for each man a solitary cell whose walls are mirrors’, from the former, reinforces this view. From him, radicals learnt that even though their contributions might not be recognised and their toils bore little fruit, their work meant something. Yet while Nietzsche served as motivation for many radicals, he also illustrated the perils of pursuing such a path. A life of loneliness and social estrangement played a significant role in his mental decline. Thus, Nietzsche can be seen as a warning, a warning of the danger of the addictive pursuit of knowledge and understanding. It was the terror of two world wars that decimated Nietzsche’s stock in America to the extent that some regarded him as ‘dead as a doorknob’.59 Both Mussolini and Hitler exalted Nietzsche in their respective dictatorships and although the links between Nietzsche and fascism or national socialism were unclear across the Atlantic, the fact that he was glorified by both (to a greater extent in Germany) was enough to significantly lessen his popularity; his discomforting relation to fascism was his undoing60. After being the philosophical spirit behind German imperialism during the
First World War, Nietzsche became the “philosophical mastermind of totalitarianism”, according to Ratner-Rosenhagen. Indeed, there were very few people who even considered the possibility of reviving Nietzsche’s reputation, let alone thought it doable. Nietzsche was consigned to the sidelines and, as a result, he was greeted with disdain, while American society turned a blind eye to those he did influence. Following his association with a second world war, such a revival was seen as insurmountable. Whilst Nietzsche was in no way dead, he was widely considered an abomination by the American readership. Around this period, academics preferred philosophy more analytical in style, a category in which Nietzsche was severely lacking. As a result, Nietzsche became
Elisabeth Förster-Nietzsche
56 Huneker, Egoists, pg.241; Walling, Larger Aspects of Socialism, pg.194 57 Anderson, Margaret, The Little Review Anthology (New York: Hermitage House, 1953), 18. 58 Boulton, Agnes, Part of a Long Story (New York: Doubleday, 1958), pg.61. 59 Ratner-Rosenhagen, Jennifer, American Nietzsche, (Chicago: University of Chicago Press, 2010) By one of Kaufmann’s colleagues 60 Bloom, Allan, How Nietzsche conquered America (Wilson Quarterly, Summer 1987), 80.
a minor philosopher, and one scarcely worthy of the name at that. Walter Kaufmann took on the task of reviving and reinventing Nietzsche for the Anglo-American public. From completing a PhD at Harvard in 1947 to lecturing at Princeton from then until his death in 1980, Kaufmann became a most reputable Nietzsche scholar in the West. The Nietzsche academic Tracy Strong observes that Kaufmann’s work “effectively gave him and his supporters control over Nietzsche studies in America.”61 This is certainly the case. Kaufmann identified some of the more questionable phrases in Nietzsche’s work and set them aside, and while he recognized that Nietzsche was a philosopher for those “more favoured by nature”, he attempted to simplify many of Nietzsche’s concepts. Often, he asserts the simplicity of Nietzsche’s writings, maintaining that one passage certainly is ‘in line with’ other sections. Kaufmann created a completely new Nietzsche, and in doing so, created one who transcended the scholarly discourse which had once scorned him. A German émigré, Kaufmann translated all but three of Nietzsche’s published works between 1966 and 1974. Along with significant other titles on the subject of Nietzsche, this has helped shape Kaufmann’s Nietzsche. Since this time, the Nietzsche that Anglo-American readers experience has almost entirely been crafted by Kaufmann. For all his faults, Nietzsche should not be seen as a supporter or proponent of fascist or racist ideas. While his use of phrases such as “blond beast” creates
opportunities for the manipulation of his work, Nietzsche had no intention of supplying such ideologies. Nietzsche did indeed observe such abuse “in [his] writings”. Nietzsche states that his “disgust with [anti-Semitic parties] … is as outspoken as possible” and that he is “namely opposed, regarding anti-Semitism”. He also cites his anti-Semitic publisher, Schmeitzner, as being responsible for bringing “the adherents of this disagreeable party back to the idea that I must after all belong to them”62, and saw his work “completely buried and unexhumeable in this anti-Semitic dump”. His use of the word “outspoken” serves to underline the anti-Semitic sentiment prevailing at the time. Consequently, Nietzsche can seem far more anti-Semitic to modern readers than his writings, read in their context, actually suggest. Context is important when reading Nietzsche, more so than with many philosophers, yet Nietzsche understood this and accepted that his concepts were easily misappropriated. In Ecce Homo, he penned this damning assessment of the future of his name: “I know my fate. One day there will be associated with my name the recollection of something frightful, of a crisis like no other before on earth, of the profoundest collision of conscience”. Although Nietzsche expresses little regret for this fate, deeper examination
disputes the validity of the claim that he was a Nazi precursor (arguably the ‘crisis like no other’ that Nietzsche alludes to). From his resigned acknowledgement of this fate, and his use of the word ‘frightful’, we can see that Nietzsche in no way sought to encourage the Nazi state and system and cannot be considered a willing fascist or anti-Semite. In a letter to his sister about her marriage to the anti-Semite Bernhard Förster, he described her relationship as ‘one of the greatest stupidities you have committed’. Additional analysis of Elisabeth further highlights Nietzsche’s antipathy (and Elisabeth’s enthusiasm) in this regard. Dr Michael Tanner, a fellow of Corpus Christi College, Cambridge, details the effect of her doctoring of her brother’s work: “[Elisabeth] tailored [The Will to Power] by the title that she gave the various sections…in such a way that it suggested that [Nietzsche] was very much in favour of eugenics, breeding an ideal Master Race…it was a perversion of his message and to some extent a forgery.”63 The identification of Elisabeth, rather than her brother, as the driving force behind many of the controversial publications issued under Friedrich Nietzsche’s name is significant in diminishing the strength of the accusation that he himself was racist or supported fascism. Nietzsche’s status as a radical and a revolutionary is undoubted. Troy Southgate claims that ‘Nietzsche was more revolutionary [than] Karl Marx’64
61 Strong, Tracy, Friedrich Nietzsche and the Politics of Transfiguration (University of Illinois Press, 1975) 62 Kaufmann, Walter, Nietzsche pg.45 63 Strong, Tracy, Friedrich Nietzsche and the Politics of Transfiguration (University of Illinois Press, 1975) 64 Southgate, Troy, The Radical Tradition: Philosophy, Metapolitics and the Conservative Revolution (Primordial Traditions, 2011)
and this assessment has merit; while Marx took aim at the capitalist system (yet arguing that it was part of a process), Nietzsche launched tirades against not only the capitalist system, but Christianity and western culture. Furthermore, just months after Nietzsche’s death, the attorney Sigmund Zeisler concluded that Nietzsche was ‘the most radical thinker of the [19th] century’65. This contemporary evidence is useful and important to learning about Nietzsche because it demonstrates views not from modern times, but those of the period which accurately reflect the mood and attitudes towards Nietzsche. However, to judge an individual on how ‘revolutionary’ they are, one must consider their physical and lasting impact. Marx’s impact is most obvious in the Communist states of Russia
Karl Marx
(formerly the USSR) and Cuba, whilst China, Laos, and Vietnam also subscribe to Marxist-Leninist ideals. Other states such as Zimbabwe, Namibia, Albania, Angola, and the People’s Republic of the Congo are just some of the numerous recent examples of Marxist Communism in state ideologies. Contrastingly, Nietzsche’s impact has been far more subdued. The word ‘Nietzscheanism’ means far less to people than ‘Marxism’ and, while he had a profound impact on sections of society – Allan Bloom goes as far as to say that his ‘call to revolt against liberal democracy was ultimately more powerful than Marx’s’66 – it is difficult to conclude anything other than that Marx had the more physical and longer-lasting impact. Therefore, even though Nietzsche was the more radical in his ideas, it was Marx who was more able to see his ideas put into action, thus suggesting he was the more revolutionary.
“We are a bit like savages who, having been discovered by missionaries, have converted to Christianity without having experienced all that came before and after the revelation. The fact that most of us never would have heard of Oedipus if it were not for Sigmund Freud should make us aware that we are almost utterly dependent on our intermediaries for our knowledge of Greece, Rome, Judaism, and Christianity; that, however profound that knowledge may be, theirs is only one interpretation; and that we have only been told as much as they thought we needed to know.” Allan Bloom writing in the Wilson Quarterly, 1987: “How Nietzsche conquered America”
Overall, Nietzsche was the most radical philosopher, and certainly one of the most revolutionary, of the 19th century. Nietzsche’s influence upon society is indeed marked. Robert Wicks writes that Nietzsche’s readiness to explain commonly accepted social values in terms of base animal instincts had a great impact on society as a whole. Breaching the boundaries observed by most philosophers, Nietzsche was ‘crucial to Sigmund Freud’s development of psychoanalysis’67. Nietzsche was taken up by Zionists and greatly admired: Chaim Weizmann called Nietzsche’s books “the greatest and finest thing”68 whilst Israel Eldad translated most of Nietzsche’s books into Hebrew69. Other Zionists to whom Nietzsche’s work had great
65 Zeisler, Sigmund, “Nietzsche and his philosophy” Dial (October 1st 1900) 219 66 Bloom, Allan, How Nietzsche conquered America (Wilson Quarterly, Summer 1987) 80 67 Wicks, Robert, Friedrich Nietzsche 68 Kaufmann, Walter, Nietzsche: Philosopher, Psychologist, Antichrist (Princeton University Press) 69 Golan, Zev, God, Man and Nietzsche (iUniverse, 2007) p. 169: “It would be most useful if our youth climbed, even if only briefly, to Zarathustra’s heights...”
appeal include Ahad Ha’am, Hillel Zeitlin70, and Martin Buber, who saw fit to idolise Nietzsche as a ‘creator’ and ‘emissary of life’71. Overall, Nietzsche had a notable influence in a variety of fields, ranging from literature to philosophy. However, despite the work of Kaufmann and others, Nietzsche’s name remains tainted by long-standing and tenuous links to fascism, Nazism, and anti-Semitism. It is Nietzsche’s repeated opposition to these same ideals, along with the life story behind his work, which makes such an image less justified and all the more frustrating. Allan Bloom’s claim that society is ‘hardly aware of the great pasts that Nietzsche was thinking of’ raises the question of whether this image will ever be fully cleared. Is an inadequate education limiting our capability of understanding the true Nietzsche? Bloom observes that time has left Nietzsche behind; nowadays, readers are ill-equipped to understand him. This has contributed to the way misinterpretations of his work have remained in society; there is little else with which these inferences can be replaced. When these ‘intermediaries’ are lost, our link to the past follows a similar route. It is this that makes Nietzsche’s philosophy seem out-dated, out of touch, and erring on the wrong side of controversy and of history. Nietzsche’s pure ability as a philosopher should be recognised as commendable, if not remarkable. Whilst his very nature undermined his ability to experience the real world – Will Durant calls him “abnormally weak in the social instincts”72 – this trait can also serve to demonstrate his aptitude and ability as a philosopher; despite limited experiences, he was able to produce insightful and foretelling comments in his work. Indeed, that Nietzsche was able to write so profoundly in his final years, in a condition of dreadful ill health, and throughout his life, during times of physical pain, underlines his superb mental capacity as a philosopher and as a thinker. Many of less strength or a weaker mind would not have found the willpower to write so prolifically, with such a powerful ability to “influence an entire century”.
70 Golomb, Jacob, Nietzsche and Zion 71 Golomb, Jacob, Nietzsche and Jewish Culture (Routledge, 1997) 72 Durant, Will, Philosophy and the Social Problem (Promethean Press) pp.173-4
Bibliography
Primary Texts
Nietzsche, Friedrich, Twilight of the Idols
Nietzsche, Friedrich, ‘Preface (1886)’, Daybreak: Thoughts on the Prejudices of Morality, trans. R.J. Hollingdale (Cambridge: Cambridge University Press, 1982)
Nietzsche, Friedrich, The Will to Power
Nietzsche, Friedrich, The Gay Science (1882)
Nietzsche, Friedrich, On the Genealogy of Morals
Nietzsche, Friedrich, Letter to Peter Gast (August 1883)
Nietzsche, Friedrich, Thus Spoke Zarathustra (1885)
Secondary Texts
Armand, Émile, The Anarchism of Émile Armand (1907-64)
Anderson, Margaret, The Little Review Anthology (New York: Hermitage House, 1953)
Auden, W.H., New Year Letter (Faber and Faber, 1941)
Berry, Mark, The Positive Influence of Wagner upon Nietzsche
Bloom, Allan, How Nietzsche conquered America (Wilson Quarterly, Summer 1987)
Bloom, Allan, The Closing of the American Mind: How Higher Education Has Failed Democracy and Impoverished the Souls of Today’s Students (New York: Simon and Schuster, 1987)
Boulton, Agnes, Part of a Long Story (New York: Doubleday, 1958)
Brobjer, Thomas H., Nietzsche’s Philosophical Context: An Intellectual Biography (University of Illinois Press, 2010)
Chaturvedi, Rahul, Richard Wagner’s Influence on Friedrich Nietzsche
Corrigan, Kevin and Harrington, L. Michael, Pseudo-Dionysius the Areopagite (The Stanford Encyclopedia of Philosophy (Spring 2015 Edition)), ed. Edward N. Zalta, https://plato.stanford.edu/archives/spr2015/entries/pseudo-dionysius-areopagite
Cullingford, Elizabeth, Yeats, Ireland and Fascism (Palgrave Macmillan UK, 1981)
Durant, Will, Philosophy and the Social Problem (Promethean Press) pp.173-4
Golan, Zev, God, Man and Nietzsche (iUniverse, 2007)
Golomb, Jacob; Wistrich, Robert S., Nietzsche, Godfather of Fascism? On the Uses and Abuses of a Philosophy (Princeton University Press, 2002)
Golomb, Jacob, Nietzsche and Jewish Culture (Routledge, 1997)
Golomb, Jacob, Nietzsche and Zion
Green, M.S., Nietzsche and the Transcendental Tradition (University of Illinois Press, 2002)
Hecker, Hellmuth, Nietzsches Staatsangehörigkeit als Rechtsfrage, Neue Juristische Wochenschrift, Jg. 40, 1987, nr. 23, pp. 1388–1391
Hitler, Adolf, Mein Kampf (Boston: The Riverside Press, 1943, translated by Ralph Mannheim)
Hollingdale, R.J., Nietzsche: The Man and his Philosophy (Cambridge University Press)
Holub, Robert C., Nietzsche: Socialist, Anarchist, Feminist (University of California at Berkeley)
Huneker, Egoists, pg.241; Walling, Larger Aspects of Socialism
Kateb, George, Emerson and Self-Reliance (SAGE Publications, 1995)
Kaufmann, Walter, Nietzsche: Philosopher, Psychologist, Antichrist (N.Y: Vintage, 1968)
Kaufmann, Walter, Nietzsche (New Jersey: Princeton University Press, 1974)
Macintyre, Ben, Forgotten Fatherland: The Search for Elisabeth Nietzsche (New York: Farrar Straus Giroux, 1992)
Mann, Thomas, ‘Nietzsches Philosophie’ in Neue Studien (Stockholm: Berman)
Ratner-Rosenhagen, Jennifer, American Nietzsche (Chicago: University of Chicago Press, 2010)
Russell, Bertrand, A History of Western Philosophy (Routledge, 2004)
Safranski, Rüdiger, Nietzsche: A Philosophical Biography (W. W. Norton & Company, 2003)
Schiller, F.C.S., Nietzsche and His Philosophy (Book Buyer, August 1896)
Schrift, A.D., Nietzsche’s French Legacy: A Genealogy of Poststructuralism (Routledge, 1995)
Schrift, A.D., Why Nietzsche still? (University of California Press, 2000)
Shirer, William, The Rise and Fall of the Third Reich (New York: Simon and Schuster, Inc., 1960)
Southgate, Troy, The Radical Tradition: Philosophy, Metapolitics and the Conservative Revolution (Primordial Traditions, 2011)
Sunshine, Spencer, Nietzsche and the Anarchists (2005)
Strathern, Paul, The Essential Nietzsche, Nietzsche’s letters (Ebury Publishing)
Strong, Tracy, Friedrich Nietzsche and the Politics of Transfiguration (University of Illinois Press, 1975)
Symons, Arthur, The Symbolist Movement in Literature (New York: Dutton, 1899)
Yovel, Yirmiyahu, Dark Riddle: Hegel, Nietzsche, and the Jews (University Park, PA: Penn State University Press, 1998)
Wicks, Robert, Friedrich Nietzsche, The Stanford Encyclopedia of Philosophy (Fall 2016 Edition), ed. Edward N. Zalta
Wilkerson, Dale, Friedrich Nietzsche, a peer-review biography (Internet Encyclopedia of Philosophy)
Zeisler, Sigmund, “Nietzsche and his philosophy”, Dial (October 1st 1900) 219
(Footnotes)
1 Shirer, William, The Rise and Fall of the Third Reich (New York: Simon and Schuster, Inc., 1960)
2 Bloom, Allan, How Nietzsche conquered America (Wilson Quarterly, Summer 1987), 80.
OPUS ALPHA Milo Stephens
This written commentary accompanies the creation of an artefact to address the following project focus: Explore the impacts of technology upon the collation of ideas and inspiration in the popular music industry.
The Brief (Written Prior to the Creation of the Artefact)
In brief, the task at hand is to create a comprehensive songwriting scratchpad application for iPad, aimed specifically at songwriters (and other musicians). The app will be designed to act as a companion tool for musicians, built to assist with the collation and organisation of ideas and inspiration throughout the songwriting process. In an ideal world, users would be able to feed all data (of any type) relating to their current songwriting project into the app, and would be instantly provided with an intuitive way of structuring that data in order to build up a detailed breakdown of the song structure and sources of inspiration. The final target would be to produce a ‘map’ of the song suitable for use in the studio, at live performances and in several other performance situations. Although technical restrictions, such as constraints relating to the iOS ecosystem, may inhibit the development of the ideal application, the overall aim for the project is to produce a fully working piece of software as close as possible to the one described above.
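Purely as an illustration of the kind of data such a song ‘map’ implies, the short Swift sketch below models a song as an ordered list of sections. The type and property names (Song, SongSection, Instrument and so on) are assumptions made for this commentary, not identifiers from the actual Opus Alpha source.

import Foundation

// A hypothetical model of the data the app collates for each song.
// Names and fields are illustrative assumptions, not the app's real API.

enum Instrument: String, Codable {
    case vocals, guitar, bass, keys, drums
}

struct SongSection: Codable {
    var title: String                // e.g. "Verse 1", "Chorus"
    var lengthInBars: Int            // section length shown in the song builder
    var instruments: Set<Instrument> // which parts play in this section
    var lyrics: String = ""
    var chordSequence: String = ""   // e.g. "C  G  Am  F"
    var notes: String = ""
}

struct Song: Codable {
    var title: String
    var sections: [SongSection]      // ordered: this ordering is the song 'map'

    // Rearranging sections mirrors the drag-and-drop gesture in the UI.
    mutating func moveSection(from source: Int, to destination: Int) {
        let section = sections.remove(at: source)
        sections.insert(section, at: destination)
    }
}

Because the structure is Codable, a ‘map’ built in this way could also be saved or shared as JSON, which suits the studio and live-performance uses described in the brief.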
Post-Project Evaluation The idea for the project developed out of a personal passion for songwriting and arranging. I discovered a need for
Figure 1
a method by which song structures (including important details about each song section) could be displayed clearly; however, I initially resorted to the rather primitive technique of creating a table in Pages (Figure 1 shows an example). Although creating tables fulfilled the defined purpose, I felt that a specialist piece of software would open up so much more potential for musicians, and so decided to take on the challenge of building one myself. As someone who favours adaptive software development methodologies, I chose not to create any written documentation before beginning the project. Prior to the formal introduction of HSS, several prototypes were created in order to model the basic functionality of the application. Figure 2 is a screenshot of the first of these; it is evident that, despite this being the first iteration of the project, much of the content visible in this example has been carried over to the latest beta release of the application. I designed the first few iterations of the application for iPhone, however I soon
Figure 2
realised that the system would benefit from access to the much larger, and therefore more flexible, screen of the iPad, hence the definitive use of the iPad as the target device in my brief. Development of the current, evolutionary version of the application began with a focus on the ‘song builder’; this is effectively the ‘map’ of the current song. As the remainder of the solution is centred around this element, it was clearly a natural starting point. For each song section, the title, length and instruments playing are visible; I opted for a clean, minimalistic
design approach to ensure that the most important information is instantly visible. Bold colours are used to distinguish between section categories. Users are able to perform a drag-and-drop gesture to rearrange the sections; double tapping on a section presents the options of duplication and deletion. Figure 3 is a screenshot of a section of the code, which is executed in order to configure each section’s cell, from the Swift file associated with the ‘song builder’.
Figure 3
Xcode, the IDE in use, employs syntax highlighting and automatic indentation to ensure that the code is readable; the presence of these features, along with my prior knowledge of the language, assured me, upon beginning the project, that I had chosen a suitable development environment for the solution. Tapping on a song section displayed in the ‘song builder’ presents the user with a far more detailed view of that specific section, as shown in Figure 4.
Figure 4
Here, the section title, lyrics, chord sequence, notes and section length can be inserted; there are also the options to choose the section colour and the instruments playing in the section (indicated using minimalist icons). The first iteration of this view was a fixed sized canvas divided into sections for each category of content; however, when developing this design, I struggled to create a layout which could sufficiently adapt to the changing quantities of content, and to the reduction in available screen space due to the onscreen keyboard. As a result, I rebuilt the view, utilising a static table view which contains sections for the content categories. The use of a table is incredibly beneficial, as the ability to scroll allows for more content to be present than can be displayed on a single screen; automated scrolling moves the content upwards when the onscreen keyboard is displayed, so that the text currently being edited is visible. Furthermore, the lyrics and notes cells employ dynamic sizing, which is dependent upon the content, maximising the efficiency of use of screen space.
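The behaviour described above – self-sizing lyric and note rows, and content that scrolls clear of the onscreen keyboard – can be achieved with standard UIKit mechanisms. The sketch below illustrates one such approach; the class name and details are assumptions made for illustration and are not taken from the app’s real source code.

import UIKit

// Minimal sketch of a section-detail screen: dynamic row heights plus
// automatic scrolling when the onscreen keyboard appears.
// Class and selector names are assumptions for illustration only.
class SectionDetailViewController: UITableViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // Let Auto Layout size the lyrics/notes cells from their content.
        tableView.rowHeight = UITableView.automaticDimension
        tableView.estimatedRowHeight = 60

        // Observe the keyboard so the text being edited stays visible.
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(keyboardWillShow(_:)),
            name: UIResponder.keyboardWillShowNotification,
            object: nil)
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(keyboardWillHide(_:)),
            name: UIResponder.keyboardWillHideNotification,
            object: nil)
    }

    @objc private func keyboardWillShow(_ notification: Notification) {
        guard let frame = notification
            .userInfo?[UIResponder.keyboardFrameEndUserInfoKey] as? CGRect
        else { return }
        // Inset the table so its content can scroll above the keyboard.
        tableView.contentInset.bottom = frame.height
        tableView.verticalScrollIndicatorInsets.bottom = frame.height
    }

    @objc private func keyboardWillHide(_ notification: Notification) {
        tableView.contentInset.bottom = 0
        tableView.verticalScrollIndicatorInsets.bottom = 0
    }
}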
In addition to the collation of ideas, another key element of songwriting addressed by my project is the collection of inspiration. Being an iOS application, my solution is readily portable and therefore well suited to hosting a method of storing the inspiration behind the songwriting process. I developed the ‘inspiration library’ to act as a home to the audible and visible motives behind users’ songs; it is a bank of their own personal content which can be added to whenever, wherever. Figure 5 depicts how the ‘inspiration library’ categorises content, forcing chronological organisation for ease of use. One question which arose during the exploration of ideas phase of
Figure 5
the project concerned the stage (of the songwriting process) at which musicians should seek the opinions of others; although I believe that the answer to this query is truly personal and certainly not defined, I chose to introduce a quote of the week into the ‘inspiration library’. The current week’s quote, and all previous quotes, are accessible to users at any time, and aim to inspire confidence throughout the songwriting process. A PHP script and database hosted online push the content into the application, allowing me, as the developer, to update the selection of quotes at any time, without requiring any device-side changes. This is incredibly beneficial when future maintenance of the project is considered.
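Since the quotes live in a server-side database fronted by a PHP script, the app presumably retrieves them over HTTP. The following Swift sketch shows one plausible client-side fetch; the URL, JSON shape and type names are invented for this example rather than taken from the real service.

import Foundation

// Sketch of how the app might pull the 'quote of the week' from the
// server-side PHP script. The endpoint and JSON format are assumptions.
struct WeeklyQuote: Codable {
    let week: Int
    let text: String
    let author: String
}

func fetchQuotes(completion: @escaping ([WeeklyQuote]) -> Void) {
    // Hypothetical endpoint returning a JSON array of quotes.
    guard let url = URL(string: "https://example.com/opusalpha/quotes.php") else {
        completion([])
        return
    }
    URLSession.shared.dataTask(with: url) { data, _, _ in
        let quotes = data.flatMap { try? JSONDecoder().decode([WeeklyQuote].self, from: $0) } ?? []
        DispatchQueue.main.async { completion(quotes) }
    }.resume()
}

With an arrangement along these lines, updating the selection of quotes only ever means editing the server-side data, which matches the maintenance benefit described above.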
A part of my mental plan for the project was to add the ability to link inspiration to the song section(s) it influenced during the writing process; unfortunately, due to time constraints, the current iteration of the solution does not include this feature. Along with the ability to save and share the song structures created, I wish to add this feature to the application in the first major update after public release. Furthermore, I had initially planned to have released the solution onto the App Store by the final project deadline; the app is currently in the beta testing stage. I put this time delay down to the fact that I was forced to totally redesign sections of the solution quite a way into the development process, meaning if I were to undertake the process again then I would very likely create some initial design sketches to prevent this from happening. To conclude, I believe I have surpassed my initial brief of creating a far more effective solution than the use of a Pages document for songwriting, as the application which has been created not only fulfils this purpose but also allows musicians to collate inspiration, all in a single piece of software. In response to my initial project focus, I can
safely say that technology has a huge impact upon the collation of ideas and inspiration in the popular music industry, streamlining the songwriting process. Hopefully many other musicians will benefit from my solution in the near future. Look out for Opus Alpha, coming to the iOS App Store soon.
Appendix: What’s in the name?
In the past, the majority of composers assigned Opus numbers to their compositions; these effectively acted (and continue to act) as unique identifiers, giving an indication of the order in which works were created. Despite this practice being rather uncommon in the twentieth and twenty-first centuries, I felt that it was only fitting to select a traditional piece of terminology for the name of my application. Instead of following the word Opus with an arbitrary number, I selected the Greek letter alpha as the suffix, due to its association with projects which are currently under development, much like songs being crafted with the help of my solution. And with that, the name Opus Alpha was born.
THE BATTLE OF BASTOGNE: OTHERWISE KNOWN AS ‘THE BATTLE OF THE BULGE’ OR THE ‘ARDENNES OFFENSIVE’ Tiger Fry-Stone
Inspired by Band of Brothers and Battle of the Bulge. Recreated using Airfix™ and Plastic Soldier Company Ltd.
Key Facts
Date
• 16 December 1944 – 16 January 1945
Location
• The Ardennes – Belgium, Luxembourg, and Germany
Result
• Allied victory
German attack force:
• 200,000 men
• 1,000 tanks and assault guns (including the new 70-ton (King) Tiger II tanks)
• 1,900 artillery pieces
• 2,000 aircraft, including some Messerschmitt Me 262 jets
• The German force had many battle-hardened veterans of the tough fighting on the Eastern Front – however, it also had ‘green’ units made up of boys and older men.
Allied forces:
• 80,000 men
• Fewer than 250 pieces of armour and about 400 artillery guns
• Many of the American troops were inexperienced, with only a few specialist units.
About the battle:
• Largest battle fought on the Western Front in Europe in WWII
• Largest battle ever fought by the US Army
• It was a German offensive intended to split the American and British armies in France and the Low Countries, cut their supply lines, and recapture the port of Antwerp, Belgium.
How did the battle end?
• Harsh winter weather was wanted by the Germans for the attack – it would slow the Allies on the ground and in the air.
• This also hindered the Germans, but was accounted for in their timetables.
• Narrow and poor roads in the Ardennes ruined the Germans’ strict timings.
• The weather improved – Allied planes could provide support again and counter-attack to push back the Germans.
• January 1st – a Luftwaffe attack in Holland and a ground attack in Alsace were unsuccessful.
• January 16th – the battle officially ended.
• Fighting continued until early February, when the front lines returned to their original positions of December 16.
TO WHAT EXTENT WAS THE ‘FINANCIAL REVOLUTION’ AND DEVELOPMENT OF ENGLAND’S ECONOMY CAUSED BY WILLIAM III’S PERSONAL ACTIONS? Tomos Nutt
Introduction
Europe in the seventeenth century was hounded by international war. A few powerhouses were constantly vying for small scraps of land which held enormous prestige for those respective countries. Treaties and alliances were constantly being signed and broken in a never-ending stream of struggle for power. England, France, Spain, the Netherlands, and the Holy Roman Empire were the main competitors, but some had very different aims and views to others. Complex networks of trade in both goods and ideas facilitated scientific and financial revolutions in the modern world, and England was rivalling the Netherlands (realistically, the merchants of Amsterdam) for global expansion in both trade and territory. France and Spain were rivals more in military might and influence, with economies far more substantial and stable than any others in Europe1. In relative terms, the Netherlands had only recently been acknowledged by all the major powers in 1648, but had exploded in terms of wealth and influence over trade routes, Amsterdam swiftly earning the status of the commercial heart of Europe. Into this chaotic dynamic William of Orange, stadtholder of the Dutch Republic, was thrust, and his ideas and policies would go on to shape England for generations. William of Orange became King William III of England during the “Glorious Revolution” of 1688-1689. William was already at war, fighting the Nine Years’ War as part of a Grand Alliance with Austria and the Holy Roman Empire, Spain and
Savoy, intended to curb Louis XIV’s and France’s ambitions2. The most
William III
immediate reason for this takeover of England was to bring England into this coalition against France and, more specifically, to harness England’s maritime power and potential. William, along with Mary, signed the Declaration of Rights, which declared them joint monarchs of England, although they struggled to be recognised for years. William was faced with several mammoth tasks, namely that of trying to restore some of England’s wealth and self-reliance. Into this context arrives the aim and centre of this essay’s question. There is no doubt that the ‘financial revolution’ was a relative success and that England rooted itself with the capabilities and potential to grow into a substantial empire, but to what extent William was personally responsible for this growth remains ambiguous and
1 W. A. Speck, Stability and Strife: England 1714-1760, p.56 2 Geoffrey Treasure, The Making of Modern Europe: 1648-1780, p.266 3 J. R. Jones, Country and Court: England 1658-1714, p.103
clouded by historical determinism and stubborn modern thinking. The true nuanced dynamics of politics and economy often get overshadowed by conflict and the initial motives of said conflict, rather than the consequences and domestic implications of such events. There were many Acts and attempts made to try to solve England’s dire economic state, and while the context of the war and the need for money is the most important factor in the motives for these Acts, we need to consider William’s influence in each of them and analyse to what extent he might have been personally responsible, successful and revolutionary or not. A lot of these changes greatly concern money, and that is a diverse beast which will be addressed at greater length, but for now it is more important to present some of the ideologies that fuelled the very concepts of the changes made and how they might impact future England. Mercantilism was a main driving force in Europe between the sixteenth and seventeenth centuries, influencing domestic policy; it was the principle of the government having more control of the country’s economy to better control state power at the expense of rival powers3. This might seem like a logical concept, but at the time it seemed radical compared to the actions of kings in previous centuries. It had previously been supposed that the king’s job was to win wars and act as an international icon of ruling, but the actual nitty gritty of trade was all but delegated to merchants who had
no strong patriotic ties or influence. The increase in importance of bullion and minted currency somewhat forced European powers to take much stronger control of the balance of trade, to try to steer as much bullion as possible into the coffers of their respective countries. This was the main measure used to rank power and so had enormous emphasis placed on it. Colonialism went hand in hand with mercantilism, that being the simple notion of expanding into areas outside of Europe and placing these territories under European jurisdiction. Colonialism was much newer and was hindered because the recurring wars in Europe prevented the powers from focusing their military resources on distant conquest in the Americas and India. This was aligned with mercantilism because the more colonies you had, the greater the chance of finding precious metals to be mined and shipped back to Europe as bullion. Spain took advantage of this and was among the first to set up gold mines in the American islands. This international concern with bullion and money would impact William III’s reign like no other, as under him England underwent significant transformation in the relation between the Crown and Parliament in terms of economy. Traditionally the kingship would always entail two sorts of revenue: ordinary and extraordinary. Ordinary revenue was made up of the king’s personal wealth and assets, mostly from the lands owned by the monarch and the material money and items the king possessed. This was always seen as the primary source of income for the king and one with which he should be able to comfortably run the country. Extraordinary revenue was
Bank of England
money that had to be raised through Parliament using taxes, duties and the like4 in times of financial need, normally to fund a war. In previous generations, extraordinary money was always a single sum, never asked for in consecutive years. Yet when William came on to the throne, he saw the need to call Parliament ever more frequently to raise loans for his seemingly never-ending wars in Europe against France. The set of circumstances William found himself in was unique, with England now at his disposal but also tied up in many traditional constitutional practices and in the factions fighting in Parliament, the Tories and the Whigs. To try to establish the direct influence and control he had, we must look at the very nature of the changes that occurred, the motives behind them and the potential underlying leaders behind each change. Money itself was an evolving thing and must also be explained and considered if William III’s personal influence on the ‘financial revolution’ is to be coherently analysed.
Money and War in Europe
In order properly to appreciate the environment in which William III and Parliament were making changes, we need first to understand the very nature of money and the context of war in which it sat. These are the two factors which contributed most significantly to the changes made in England, because without the fundamental flaws in the money of the time and the massive financial pressure of constant war, there might never have been the creation of a national debt or the Bank of England. The most deceptively easy of all the ideas behind the question of William III and how much he was personally responsible for the ‘financial revolution’ is money. Money as a concept has evolved so much over the last few centuries that money now is incomparable to the money used in the seventeenth century. Money was, in very simple terms, just the same as any other trade good. Gold and silver were used as this medium of trade, and each coin was worth its weight in whatever metal it was made from. Anybody could use any coin from anywhere, because the value of metal did not change all
4 L. W. Cowie, Hanoverian England: 1714-1837, p.23
that much depending on whichever country you were in (assuming we are just referring to Europe and her extremities). This was important, because each country paid primarily in its own coinage, and you needed the seller to accept these coins for the transaction to be a success, because exchanging coins for other coins was not a common practice. This allowed relatively free trade with minimal taxes between all countries and allowed exporters of goods to achieve great profits5. Theoretically. In practice, it was a lot more complicated and nuanced, especially concerning the modes of currency and the external factors which greatly affected the values of goods. This ideal of internal and international trade was heavily muddied by business owners who had personal stakes in certain goods being traded. Take the King of France as a victim of this system. He would like to make more ships, and so needs timber for his ships. He would like to buy French timber, because it would keep the money
Old English Coins
within France’s economy, promote French business and eventually return more tax to the crown through small import and export taxes. There are plenty of forests just south of Paris, so moving timber from there to Dunkirk to build ships does not seem like an enormous task. However, the timber planks were so large and heavy that they would have to be moved by barge along a series of canals. The next problem arises on learning that each time the barge passes between water gates, the individual business owner who owns that particular stretch of canal will ask for a toll fee. Combined with the slow traffic, it is both quicker and cheaper to buy Baltic timber from Scandinavia and sail it miles down the sea into Dunkirk. This counter-intuitive system demonstrates how convoluted even the simplest demand becomes. It also relies on the Scandinavian countries accepting your currency. The heart of this problem is whether your own currency is trustworthy and respectable enough to conduct international trade. Unfortunately, England had made the mistake of butchering her currency to the point where very few would accept English coinage. By the late 17th century, much of the money in England looked scabbed, decrepit, and decidedly dull. This dullness came primarily from other metals put into the coins to take a portion of the precious metal out and use it as government funding, a practice known as debasement. This started with Henry VIII beginning the debasement of the coinage in 1542, to the point where silver comprised only around 25% of supposedly “silver” coins6,
5 Barry Coward, The Stuart Age, England, 1603-1714, p.45 6 Henry Kamen, European Society, 1500-1700, p.22 7 Geoffrey Treasure, The Making of Modern Europe: 1648-1780, p.82
Henry VIII Coin
the rest being made up of other metals such as copper and tin. Elizabeth I had worked hard to considerably improve the ratio of silver and gold, but the reputation had already been damaged and England was now notorious for untrustworthy currency. On top of this, William III would have to deal with the even more fundamental problems of basing coinage on silver and gold standards. The main issue with the silver standard in particular was that in England the new golden guinea was overvalued compared to the price of the same quantity of silver. What this meant was that English silver coins were regarded as less valuable, and gold coins more so7. In practice, an English merchant would buy goods from abroad with silver, as in those countries the same weight of silver was worth more, and sell goods for gold, as in England the same amount of gold was worth more. Silver started to flow out of England and gold in, essentially putting England on a gold standard in all but name. This was emphasised by the fact that silver was almost always worth
more melted down into bullion than struck as coins, so even more silver made its way out of England as travelling profit for merchants. The final issue William III had to deal with was the massive expenditure on the wars. Arguably the main reason for William III even deciding to become the King of England was to bring the country into a coalition against France in the Nine Years’ War. England held a naval stronghold against France: the distance across the Channel was relatively small, and England would have the ability to significantly choke trade coming into French ports. However, first he would have to deal with the enormous inadequacies of the existing English army and navy. Between 1689 and 1697, England built up forces of some 40,000 sailors in the navy and 70,000 men in the army, and average expenditure reached about £2,500,000 a year during the Nine Years’ War8. At the insistence of Parliament, in 1697 the standing army in England was reduced to 8,000 men. This was a tiny number, and ultimately a more expensive one, because if a future war were to happen it would take a lot more money to hire and raise new soldiers than it would to maintain a standing army. The Nine Years’ War cost nearly £50,000,0009. The War of the Spanish Succession was even more expensive, on average costing Britain more than £4,000,000 a year in supporting an army of 90,000. The combined cost of these wars exceeded £72,000,00010. This was an enormous amount of money for a slow war of attrition that extended far longer than necessary.
Yet it was vital for William III to continue the struggle in these wars, otherwise France and Spain would become too powerful and might threaten the safety of the whole of Europe. This massive expenditure demanded funds, and new ways of raising them.
Financial Revolution William III defined a new relationship between the monarchy and Parliament that would see an indestructible link tied between the legislative and executive powers. Unlike today, previously in England a monarch would be the one to call Parliament to session, and the one to decide when it would be dissolved (be it after a time or after a decision is made). Parliament had very little power in that respect, other than the fact that they were the only ones who could create and pass bills of legislation. This is why the monarch needed Parliament, and ordinarily the motive behind calling Parliament to session was to raise taxes for war, or for the monarch to pass his own ideas for the country (almost always religious, as seen in Henry VIII’s manipulation of Parliament to legalise the break with Rome). This was all changed by the Triennial Act of 1694. This reaffirmed the principle of the 1641 act which stated that a Parliament must be in session every 3 years, but more importantly stated that each Parliament could not have a life span of more than 3 years.11 William III of course hated this idea, as it would restrict his executive actions that he could enact when Parliament was not
in session, and prevent him getting lots of bills that might be favourable to him passed through the same Parliament, no matter how much the public might dislike those bills. But, as we have seen, he desperately needed money, as this was in the build-up to the War of the Spanish Succession, and this bill also provided further tax grants from Parliament to the government, which William III could simply not afford to refuse. This would become embedded even deeper when new avenues of debt became available, and these Parliamentary grants became crucial to servicing that debt. This was only the start of Parliamentary intervention in the crown’s affairs, and without it the ‘financial revolution’ probably would not have been possible. The tacking of bills like the Triennial Bill on to Parliament’s taxation grants left William virtually powerless to prevent these bills, even though he did have a veto. The power Parliament had over William III can be seen in the first two Parliamentary sessions (March to May 1690 and October 1690 to January 1691)12, when it granted just less than the predicted £4,600,00013 that the government had asked for. William III was obviously very pleased by this apparent generosity, until in December of 1690 an Act was passed that established a commission of public accounts. The members of this commission would examine the state of public finances, which gave Parliament a foothold from which to point out departures from the best use of funds in the King’s spending; this could harm grants to the monarch, which William
8 David L. Smith, A History of the Modern British Isles: 1603-1707 p.308 9 J. R. Jones, Country and Court: England 1658-1714, p.65 10 David L. Smith, A History of the Modern British Isles: 1603-1780, p.312 11 J. R. Jones, Country and court: England 1658-1714 p.38 12 J. A. Downie, The commission of public accounts and the formation of the country party, p.33 13 Barry Coward, The Stuart Age: England, 1603-1714, p.388 61
III desperately needed. This went further, as in November of 1691 the commission presented a report indicative of great waste in expenditure, persuading a Commons committee to recommend cuts to the estimates for the army and navy for 1692.14 Never before had Parliament possessed such a tool against the monarch and his spending, and it continued to use it effectively. All of this expenditure had of course turned into extraordinary expenditure. William III had invested all of his own personal fortune in the Nine Years’ War, and the expectation that the monarch should live off his own wealth had deteriorated, to the point where William III and Mary were granted £700,000 each year, starting in 1689,15 to live off and to finance all official duties such as state visits. The civil list, as it became known, set yet another precedent for future monarchs of England. This personal reliance on Parliamentary grants added even more fuel to the involvement of Parliament in government and reduced even further the power William III had over legislation. All of this shows how attached the monarchy had become to Parliament, and no amount of vetoing by William III could prevent it. Even after all these grants, between 1688 and 1702 the gap between royal income and expenditure was nearly £11,300,00016. Many things were tried in an attempt to fill this gap, and ultimately only two successfully worked. The first was the creation of a National Debt, which had its origins in the
Million Loan Act of January 1693. This act authorized William III to borrow £1,000,000, to be repaid through annuities funded by Parliamentary excise duties. What this essentially meant was that investors and syndicates could buy a piece of the government debt, and would eventually be repaid what it cost plus interest. So many people invested in this scheme because repayment was not at the whim of the monarchy and whatever it could pay out after its other costs; it was instead underwritten by Parliament, meaning that your annuity was always going to be paid back, thanks to Parliament maintaining a separate fund that would only be spent on repaying these annuities. This was the true start of the ‘financial revolution’, because there was now almost no limit to the size of the National Debt, funded as it was by the very people who had invested in it. This debt was always going to be paid back and could always remain: a permanent debt with no downside. In 1815 the National Debt peaked at £1,000,000,00017, showing how secure and how stable this scheme was to become in the future. The National Debt would facilitate almost the entire industrial revolution as Britain expanded through colonialism and technological advancement. The importance of this scheme could only be eclipsed by the second successful attempt to fill the gap in expenditure, the Bank of England itself. The establishment of the Bank of England was first devised by Charles Montagu, 1st Earl of Halifax, in
14 Barry Coward, The Stuart Age: England, 1603-1714, p.389
15 W. A. Speck, Stability and Strife: England 1714-1760, p.14
16 Barry Coward, The Stuart Age: England, 1603-1714, p.471
17 Peter Mathias, The First Industrial Nation: An Economic History of Britain 1700-1914, p.131
18 Barry Coward, The Stuart Age: England, 1603-1714, p.471
19 J. Brewer, The Sinews of Power: War, Money and the English State, 1688-1783, p.114
1694. A loan of £1,200,000 would be raised, and the subscribers would be incorporated as the Governor and Company of the Bank of England. Those subscribers would gain privileges at the Bank, including the issue and handling of notes, and loans with interest assured by Parliament. The Bank of England was given a Royal Charter in the Tonnage Act of 1694, giving London a central hub of international trade and allowing vast sums of money to come into the coffers of the state without the worry that it would not be paid back. The rate of interest was set at 8% and would be paid from specific duties allocated by Parliament¹⁸. The £1,200,000 was raised in twelve days. This Parliament-underwritten debt was again secure, and between 1697 and 1720 the unredeemed public debt rose from £16,700,000 to over £50,000,000¹⁹. The creation of the Bank of England and the sudden rise in spending on public business (mostly through secondary hiring in order to achieve rapid naval growth) had a large unforeseen impact: without ever intending to, it started to help English businesses through demand for timber, wool, iron and the like. However, we must return to the question of how far William III was personally responsible for all of this. In the short term he had very little influence or sway over the legislation and Acts passed by Parliament, as shown by Parliament’s gradually growing control over his expenditure; but in the longer term he did have some impact, for without the economic
incentive of war and the demand it created for new ways of raising money, the ‘financial revolution’ might not have taken hold in quite the same way, and would very probably not have happened as fast. However, not all attempts to raise funds and debt for the government were quite as successful. The Million Lottery Act of 1694 created the first English state lottery, comprising 100,000 £10 tickets for sale with prizes of up to £1,000 to be drawn²⁰. It was not very popular, and in 1697 a lottery loan for £1,400,000 raised only £17,630²¹. This shows how many of these endeavours were new and risky, and not all of them worked. The same could be said of the later South Sea Company, set up to run prospective trade with South America after the peace that ended the War of the Spanish Succession. Many invested in it, and when an Act passed in 1720 put the National Debt under its control, its share price soared ten-fold. A classic economic bubble followed: as people put more money into the enterprise, they realised it had yet to own a single ship, and the market crashed, leaving many, Isaac Newton among them, much poorer. Above all this, William III had set the precedent for such ventures, and the failures show how fortunate he was that the National Debt took off and that he was able to obtain large sums of money from Parliament, even if he was not directly the one to start the ‘financial revolution’.
Isaac Newton
Currency and Economy
English currency, by the time William III claimed the throne in 1689-1690, was the laughing stock of Europe. As I have already established, previous debasing, clipping, forging and other exploitation of Sterling had left it distrusted and devalued within Europe. This had happened to such an extent that the value of silver bullion in Amsterdam and Paris was greater than the value of silver coins in England, and this arbitrage was causing a river of silver, and the value it held, to flow out of England. To prevent this loss of capital and potential Crown revenue, William III recognised that he would have to improve the value of, and trust in, English currency. However, several fundamental underlying factors inevitably prevented policies such as recoinage from being successful. The silver standard naturally fluctuated, often meaning that silver was more valuable as bullion than as coinage, which prevented the recoinage of
silver coins from ever being a viable solution to improving English currency. Entanglement in European wars also inhibited internal development, as international trade with Europe was stifled by naval battles of attrition and might. The Strait of Gibraltar became particularly strangled by the navies of all the major European powers (England, France, the Holy Roman Empire, Spain and the Netherlands), as it was the only route in and out of the Mediterranean (the Suez Canal had yet to be built). Blockades such as these severely limited the potential for trade closer to home, forcing expansion into much more distant markets such as the Americas. Even though this would eventually contribute to the development of the British Empire, little of it trickled back into the English economy, and it therefore meant less direct investment from the government in English currency and the economy. It is important to bear in mind that William III was not short of money. The creation of the National Debt, which transformed the Crown’s personal debt into a national one, along with other means of procuring loans such as the Bank of England, enabled borrowing on a scale never before seen in Europe. This increase in stable and assured debt allowed considerable investment in the navy, in both the quality and the quantity of its ships. Not only would such investment aid potential victory in European war, it would also open up the potential for international, even global, trade. This coincided with the initial rise of trading companies such as the East India Company, which especially
20 J. R. Jones, Country and Court: England 1658-1714, p.65
21 J. R. Jones, Country and Court: England 1658-1714, p.66
exploited trade routes around Africa and the Cape of Good Hope²². Yet this again was an indirect source of investment in the English economy, and it raises the question of whether the development of English currency and economy was really the prime motivation for naval expansion. Culture and science also saw considerable growth, both in output from England and in interest from foreign countries, most notably the Dutch. Military success generated a national mood of confidence and optimism, fostering new waves of philosophical and cultural endeavour. Britain felt less threatened by the cultural influences of the Continent.²³ What became known as the ‘Augustan Age’ was mostly fuelled by investment from the state and personal commissions from the wealthy, such as John Churchill. Esteemed architects such as Christopher Wren received most of this commissioned work, and science boomed under the spearheads of Newton and Leibniz, with Newton publishing his ‘Opticks’ in 1704²⁴. Developments from abroad, such as wrought-iron working from France, also entered England and proved integral to future growth. However, the King himself had very little to do with the patronage of scientists and other great figures outside the circle of artists he employed, such as Antonio Verrio from Italy²⁵. Even though this flowering of culture brought in numerous investments from foreign powers, William III appears
not to have held it in high regard. For example, rather than allow Isaac Newton to finish his scientific works, he instead made Newton Warden of the Royal Mint. This demonstrates how William III was more concerned with raising funds for his European conquests and personal wealth than with developing a cultural powerhouse. Even then, William III’s decisions, such as the appointment of Newton, were only minimally focused on improving the currency and the English economy directly, and rather left the results of war to contribute indirectly to the development of England and her economy. William III did, however, make some direct efforts to improve the English currency. The Great Recoinage of 1696 is ample evidence of his desire to help Sterling. English coins had been notoriously devalued for decades before William III came to the throne. Previous debasement, in which a government recalls coinage to replace a percentage of the gold or silver in a coin with a base metal such as lead or copper, had caused a serious decline in the metallic value of English coin. Debasement had been used as an easy way to gain excess profits for the Crown, as the precious metal that was removed went directly into the Crown’s coffers. The ‘Great Debasement’ between 1542 and 1551 was the worst period of this devaluing of English coinage. By March 1545, coins in England had a silver content worth only 50% of their face value. The silver content of each coin
22 Peter Mathias, The First Industrial Nation: An Economic History of Britain 1700-1914, p.238
23 David L. Smith, A History of the Modern British Isles: 1603-1707, p.324
24 David L. Smith, A History of the Modern British Isles: 1603-1707, p.323
25 Euan Cameron (ed.), Early Modern Europe, Oxford University Press, p.162
26 Ling-Fan Li, After the Great Debasement, 1544-51: Did Gresham’s Law Apply?, p.9
27 Samuel Knafo, The Making of Modern Finance: Liberal Governance and the Gold Standard, p.78
28 Euan Cameron (ed.), Early Modern Europe, Oxford University Press, p.161
fell to only 25% of its face value by 1551.²⁶ Although this had been addressed and rectified to a certain degree under Elizabeth I and subsequent rulers, the damage had been done to trust in English coins.²⁷ Sterling had earned the reputation of being debased, forged, clipped and subject to many other value-damaging practices. Not only had the coinage been damaged internally; external movements of money had also affected the quantity and types of money within England. Eastern countries such as India and China had substantially increased trade with companies such as the East India Company²⁸, which was enormously benefiting the economy of England and of other European countries. However, these countries accepted only silver as payment for wares. This contributed considerably to the outward flow of silver from England, while the wares being brought in were traded for gold and notes in trading centres such as Amsterdam, and that gold was in turn exchanged with merchants within England itself, producing an influx of gold. Fundamentally, this put England on a pseudo-gold standard. The Great Recoinage of 1696 was designed to increase the value of silver Sterling as coins, not bullion, and therefore allow greater trade and growth of the economy within England. This shows that William III was personally interested in the welfare of English currency and economy. The
government would take in any silver coins, compensate holders for the weight of silver in each coin, melt down this silver, mint new coins with a much higher silver content and introduce these new coins back into circulation. The undertaking of an accurate and successful recoinage was so great that William Lowndes, Secretary to the Treasury, turned to Isaac Newton (at the time Warden of the Royal Mint) for help. Branch mints were established at Bristol, Chester, Exeter, Norwich and York to assist with the work of recoinage. Between 1696 and 1700 the value of silver struck was £5,106,019, compared with the £3,302,193 coined in the previous 35 years. Financially, the recoinage was a resounding failure. The quantities of silver flowing out of England only increased, and production had to be stopped in 1700 at a great monetary loss to the government. Although it was not known at the time, the only way to prevent silver coins in England being melted into silver bullion for European and wider continental markets was to raise the face value of these silver coins until they were worth more than the equivalent weight in bullion. This would not happen until the Great Recoinage of 1816. Thus, although William III showed personal interest in the dire needs of the English economy and currency, and acknowledged the need for reform, his attempts to increase the trust in and value of English currency only perpetuated the problems of the silver standard and failed to prevent the continual drain of silver coinage out of the English economy, and so William III’s personal
actions in this respect did very little to help the development of England’s economy. Not all of William III’s focus in the economic sphere was directed towards currency. Other policies, such as the Wool Act, also showed his interest in, and potential to advance, the English economy. The focus not just on raising money in the short term but on attempting to strengthen imports and exports does show that William III understood the need to encourage local business, especially in sectors rapidly diversifying and expanding in the late 17th century. There had been an overall stagnation of growth across the country, leading to prices of everything from bread to clothes to houses also stagnating. This slowdown also made manual labour much more sought after, as the English population started to decrease because food prices were disproportionately high compared with current wages. Larger estates could capitalise on their wealth and their ability to hire considerable amounts of labour by adopting more advanced farming techniques, such as specific grazing fodder for livestock²⁹; this allowed a wider array of draperies to be made from wool, the most important being Devon serge, a mixture of Irish and Spanish wool. Production of manufactured textiles outstripped that of Scotland and Ireland. However, this production did need help and encouragement. The Wool Act of 1699 especially helped the growth of Devon serge, as it placed high tariffs on textiles exported from Ireland and instead promoted the export of raw wool from Ireland. This
benefited both countries, as the Irish wool required for Devon serge was pushed towards England and trade between Ireland and England was maintained. This demonstrates how canny William III could be concerning foreign and domestic trade, something made all the more important by the constant European wars perpetually disrupting international trade to Asia and the Americas. Yet there was a second motive behind this example of clever agricultural policy. Other parts of the Wool Act affected not only Ireland but all English overseas territories and colonies. It prevented finished textiles from being exported and instead virtually forced all traders in these colonies to sell their wares directly to English markets, which then sold the products on individually to buyers, inserting a middleman between the producer and the purchaser of these textiles. This generated government taxes on all goods sold in domestic markets and, with the success of local textiles in places like Devon, which made up one quarter of all England’s cloth exports in 1700³⁰, William III had found another way of raising money for his European conquests. So even in the most seemingly well-meaning of Acts, one that did genuinely benefit the English economy and agricultural sector, William III saw the opportunity to increase his personal wealth and gain in Europe. His gaze also began to turn further outwards, as the Act affected colonies as far away as the Americas, demonstrating an increased interest in mercantilism and the benefits it could bring to kings and queens. William III was no doubt deeply interested in the dynamics of the economy in England.
29 Barry Coward, The Stuart Age: England, 1603-1714, p.471
30 Barry Coward, The Stuart Age: England, 1603-1714, p.474
The commissions and investment in English culture (science, art, music and so on)³¹ show a man with modernisation in mind and a desire to preside over as many new discoveries as possible. The attempt at recoinage, although ultimately futile, does demonstrate William III’s intent to actively aid and improve the English economy. Colonisation and mercantilism had broadened his gaze to potential trade in new lands. Yet his expansionist ambitions remained focused on Europe and on more personal conquest. The War of the Spanish Succession demanded vast investment and expense, and William III would fuel that expenditure whatever the cost. All of William III’s actions therefore had the underlying motive of simply raising more revenue and taxes to fund his campaigns in Europe. To this end William III personally had little to do with the improvement of English currency and economy, and rather left it to others, such as Isaac Newton, to try to fix the inherent problems of the silver standard and English coinage.
Conclusion
William III was an ambitious man. He had willingly accepted the crown of England to bring the country into a war against the mighty power of France and to protect the borders of the Netherlands along the Rhineland into the Holy Roman Empire. He had invested an enormous amount in the potential of England, staking all his political capital on the English kingship, which is why he spent so much trying to make the country a worthy contender against the other European powers, both militarily and economically. This massive demand for investment had, however, placed a strain on all other areas of the country, above all on the currency and the means of raising money. Fundamental problems with the silver standard and the country’s coinage added to this strain, as silver left the country and what remained was distrusted, forcing William III and Parliament to try to support the English economy. Legislation aimed at merchants, such as the Wool Act, tried to keep money within England and its government, and the recoinage attempted to improve the condition of the coinage and encourage more international trade (although that inevitably failed).
31 Henry Kamen, European Society: 1500-1700
The ‘financial revolution’ and the transfer of power over expenditure to Parliament, facilitated by the creation of the Bank of England and the National Debt, allowed for much greater spending power, but it forever tied the monarch’s spending to the will of Parliament through his desperate need for funds, as William III had to make more and more concessions to obtain the level of money needed to rival Spain and France. Overall he did not have a primary, direct impact on the economy and the ‘financial revolution’, because he did not have the executive authority to prevent such measures passing through Parliament without giving up the taxes he so desperately needed. However, in a much longer-term sense he was the very instigator of the age: without his drive for European war and his desire to spend so much money defending European lands, there would never have been such demand for taxes and funds, and the creation of institutions like the Bank of England might not have happened for decades or even centuries. He failed in his attempt to vastly improve the state of the English coin, but he showed plenty of interest in it, as sound coinage was needed in order to stabilise the overall economy. William III fell victim to fundamental flaws in the coinage standards and the mechanics of the economics of the time, and while he was probably just lucky to be the monarch who oversaw such a revolution, he certainly accelerated the process.
Bibliography
Barry Coward, The Stuart Age: England, 1603-1714
Geoffrey Treasure, The Making of Modern Europe: 1648-1780
J. R. Jones, Country and Court: England 1658-1714
Henry Kamen, European Society: 1500-1700
Euan Cameron (ed.), Early Modern Europe, Oxford University Press
L. W. Cowie, Hanoverian England: 1714-1837
Peter Mathias, The First Industrial Nation: An Economic History of Britain 1700-1914
W. A. Speck, Stability and Strife: England 1714-1760
Ling-Fan Li, After the Great Debasement, 1544-51: Did Gresham’s Law Apply?
Samuel Knafo, The Making of Modern Finance: Liberal Governance and the Gold Standard
David L. Smith, A History of the Modern British Isles: 1603-1707
J. Brewer, The Sinews of Power: War, Money and the English State, 1688-1783
J. A. Downie, The Commission of Public Accounts and the Formation of the Country Party
www.bankofengland.com
www.greatbritishmint.com
Collection of state papers, including the 1694 Triennial Act
“I feel incredibly grateful to have been given the opportunity to spend some of my invaluable lesson time engaging in a project I am truly passionate about. The process of developing an iPad application from scratch has been an especially steep learning curve, but also one which has enlightened me significantly.” Milo Stephens Headmaster’s Prize Winner “HSS did not only give us the opportunity to express our creativity, but to also show the world what can be achieved when we are driven by passion and motivation from the bottom of our hearts.” Manish Seeruthun Governors’ Prize Winner
Reigate Grammar School, Reigate Road, Reigate, Surrey RH2 0QS 01737 222231 info@reigategrammar.org reigategrammar.org
facebook.com/reigategrammarschool @rgsheadmaster