The Brunswick & Greenwich Academy
Magazine of History 2018
Brunswick School 100 Maher Avenue Greenwich, CT 06830 (203) 625-5800 Brunswickschool.org
Greenwich Academy 200 North Maple Avenue Greenwich, CT 06830 (203) 625-8900 Greenwichacademy.org
Vol. 15
Cover Image: Soviet poster honoring motherhood. Note the ten children, two of them pictured as serving proudly in the Soviet armed forces.
The Brunswick & Greenwich Academy Magazine of History 2018
Editor
Dr. John R. Van Atta
Editorial Office
Department of History, Humanities Wing, Brunswick School, 100 Maher Avenue, Greenwich, CT 06830
e-mail: jvanatta@brunswickschool.org
(Please submit manuscripts for review and any correspondence regarding editorial matters by e-mail to the editorial office.)
Editorial Board
Mrs. Margot Beattie
Ms. Kristine Brennan
Ms. Kristen Erickson
Mr. Chris Forester
Christian Hartch
Gordon Kamer
Mr. Steve Mandes
Jaclyn Mulé
Wesley Peisch
Ms. Rachel Powers
Mr. Andy Riemer
Jane Watson
Contents
The Editor’s Page
Articles
Revolutionary in Their Own Right: Female War Correspondents in the Vietnam War, by Katherine Dawson
Paving the Path to Prosperity: The History of the Connecticut Turnpike, by Wesley Peisch
Smallpox Epidemics in New York City, 1890 to 1910, by Annabelle Raine
A Multifarious Pestilence: The Black Death and its Effects on Medieval Europe, through a Sociocultural Lens, by Rachel Dong
Fortifying Farmland: World War II in Little Compton, Rhode Island, by Ben Shore
“The Submarines Were Here”: A Short Story, by Ben Shore
Unraveling the Socialist Sisterhood: The Conflict Between Ideology and Reality in the Soviet Liberation of Women, by Jaclyn Mulé
Melody Without Humanity: The Soviet Union’s Fifty-Year Musical Repression, by Gordon Kamer
The Editor’s Page
Dr. John Van Atta, Editor
In the pages that follow, there is one particular image that I have not been able to shake from my head. It is the one on page 11: Henri Huet’s November 4, 1965, photograph of news correspondent Dickey Chapelle as she received the last rites from a Navy chaplain just before she died. A free-lance photojournalist, Chapelle was accompanying a Marine platoon on a search and destroy mission just south of Chu Lai, Quang Ngai Province, in Vietnam, when the lieutenant in front of her accidentally walked into an enemy tripwire. The wire ignited an attached mortar shell and sent a chunk of shrapnel into her neck, severing one of the carotid arteries.
When somebody once tried to tell Chapelle that a war zone was no place for a woman, she replied, “That’s true, it is not a woman’s place. There’s no question about it. It’s no place for men either. But as long as men continue to fight wars, I think observers of both sexes should be sent to see what happens.” She had been one of the few women covering American troops in combat since World War II, when she started taking photos and writing stories from the frontlines, including the Pacific island battlefronts of Iwo Jima and Okinawa. She ventured alarmingly close to danger then too, describing snipers’ bullets fired at her as sounding like the buzzing of wasps. In 1962, she published an autobiography entitled What’s a Woman Doing Here? A Combat Reporter’s Report on Herself. Chapelle was the first American female war correspondent to be killed in action. There is no fuller, or braver, measure of dedication to one’s craft—or one’s country. She was just forty-eight years old.
As the lead article in this issue, we are lucky to have Katherine Dawson’s essay on the role of women war correspondents in Vietnam—Dickey Chapelle and many others like her. It is a story of their uncommon courage and professionalism. And there is plenty more great material to follow. Consider Wesley Peisch’s account, based on primary source research, of how the builders of the Connecticut Turnpike in the 1950s had to overcome staunch opposition from influential Greenwich residents. For readers who might be interested in the ways that disease epidemics can impact culture, society, and politics, we have two examples: Annabelle Raine’s study of New York City’s response to smallpox around the turn of the twentieth century and Rachel Dong’s article on the impact of bubonic plague in mid-1300s Europe. In addition, for this year’s magazine, Ben Shore provides us with something unusual—an original historical study presented in two different ways. First there is his local historical study of the World War II fortifications installed on the coast of Little Compton, Rhode Island, which he fashions as “Fortifying Farmland.” Then, alternatively, we have Ben’s short story version, “The Submarines Were Here,” written on the basis of the same historical information. Finally, we have a pair of unusual pieces resulting from senior independent study projects in twentieth-century Russian history: Jaclyn Mulé’s essay on the ideological role of women in the decades following the Bolshevik Revolution, and then Gordon Kamer’s on the story of artistic repression (especially in the field of music) during that same era.
Beyond that, we encourage students of both of our schools to try their luck in submitting their work for this magazine. If you have an essay, written perhaps for a history class, that you think might be pretty good, we would like to see it. We employ anonymous refereeing, a common practice in the history profession, which means simply that every submission is blind-reviewed by members of the editorial board. We receive more top essays than we have room to publish, which makes the final decision-making always tough, but that is no reason for any brave soul not to try.
Thanks, as usual, to school heads Tom Philip and Molly King, and also department chairs Kristine Brennan and Kristen Erickson, for their steady support. My good friend and erstwhile office mate John Pendergast helped me with various technical details of my laptop. Margot Beattie, with her now familiar but still rare expertise, offered valuable time and talents to the job of laying out the magazine text and conducting all the business negotiations with the printer. As I have said many times, there would be no history magazine without her. Most of all, thanks to all who may venture to read this issue and perhaps join me in appreciating the great work that our authors have done.
*
Revolutionary in Their Own Right: Female War Correspondents in the Vietnam War
By Katherine Dawson ‘18
Female war correspondents, specifically those who covered Vietnam, journeyed into the depths of war, braving harsh conditions and constant condescension, in pursuit of a story even before other women of the time demanded equal rights at home. The Vietnam War allowed both male and female correspondents to make a name for themselves. Despite the extensive amount written on all aspects of the Vietnam War, female war correspondents are often excluded from the narrative. This unfortunate lack of recognition became apparent when Stanley Karnow, a correspondent for Time and Life magazines, once asked, “Who was there besides [main correspondents] Gloria and Frankie?”1 To answer that question, approximately 470 women travelled to Vietnam—267 of whom were American. The Vietnam War brought about a time of change in America. Limited censorship resulted in distrust of government, allowing the antiwar and civil rights movements to flourish. The unique characteristics of the Vietnam War presented an unprecedented opportunity to aspiring female correspondents, who quickly proved themselves the equal of their male counterparts, and with aid from shifting American values they were able to sustain that success, leaving a lasting legacy for their modern counterparts.
Journalism, historically dominated by men, had long resisted the combination of women and war. Before America’s involvement in World War II, women covering war were few and far between. In 1846, Margaret Fuller was one of the earliest female war correspondents to make a name for herself, reporting on the Italian revolution for The New York Tribune.2 During the Spanish-American War, Anna N. Benjamin travelled to Cuba to report on American troops. Martha Gellhorn, who went on to report in Vietnam, started her career reporting on the Spanish Civil War. Peggy Hull became the first female war correspondent recognized by the U.S. State Department, reporting on World War I.3 These women paved the way for aspiring female war journalists in World War II, and yet their legacy failed to eradicate the stigma surrounding women and war.
During World War II, by building munitions and playing professional baseball, women showed aptitude at something other than
homemaking. Following this trend, World War II marked the first time in history when a significant number of women reported on war. Despite proven capability and relentless effort, these so-called “Gal Correspondents”4 were not allowed in press briefings at the start of World War II, instead being restricted to writing only on the “women’s angle.” Yet many did not adhere to this restriction. In 1943, Margaret Bourke-White, a photographer for Time-Life, became the first woman to fly with a U.S. combat mission. Marguerite Higgins covered the 1945 liberation of the Dachau concentration camp. Dickey Chapelle drew attention when she landed in Okinawa. Regardless of the great strides these women made, female journalists, like many other women, found themselves relegated to their old roles once the war ended and the men came home. In the case of female foreign correspondents, most of those who remained in the industry were assigned to culture and style sections of their respective newspapers.5 This trend continued throughout the 1950s and 1960s, and in 1968 there were fewer women working as foreign correspondents than in the 1930s.6
Pursuing a higher education or a career, especially once married, became taboo. Enormous weight fell upon women to look pretty, dress correctly, and find a husband. As Betty Friedan wrote in The Feminine Mystique:
In the fifteen years after World War II, this mystique of feminine fulfillment became the cherished and self-perpetuating core of contemporary American culture. Millions of women lived their lives in the image of those pretty pictures of the American suburban housewife, kissing their husbands goodbye in front of the picture window. . . Their only dream was to be perfect wives and mothers; their highest ambition to have five children and a beautiful house…they gloried in their role as women, and wrote proudly on the census blank: “Occupation: housewife.”7
Accommodating men became the priority throughout American society, including the newsroom. If a woman did earn a job at a respected newspaper, she would most likely serve as a fact checker, never to be promoted to the coveted positions of reporter, writer, or editor.8 While earning a job at a respected newspaper was difficult for many women, the stigma surrounding women and war made it nearly impossible for a woman to get a job as a war correspondent. Marguerite Higgins was the only woman to report on the Korean War, and even she was eventually forced out of South Korea.9
Like women in society generally, female journalists had lost the progress they had made during World War II. Distinct aspects of the Vietnam War, however, would offer women a new chance to assert their talents as war correspondents. Nicknamed the first television war, the Vietnam War brought about a time of significant change in the way warfare was covered. As Vietnam was an undeclared war, “there was unprecedented access, even to combat areas, and virtually no censorship on reporting,” writes Maureen Ryan.10 Without the shackles of censorship, journalists gained the ability to report on whatever they pleased. This new-found “access gave women reporters a chance to show that they could cover combat bravely and honorably, holding their own even under the most frightening and stressful circumstances.”11 It also allowed the American people to see the brutal and gruesome effects of the Vietnam War.
UPI reporter Kate Webb at work in Vietnam.
Television and photography bombarded the American people with images of dead soldiers, American and Vietnamese. In particular, the ability of television to display graphic content, as Kyle Hadyniak notes, “showed the uncensored nature of warfare in a way print or radio could not.”12 Realistic coverage shocked Americans at home, and the brutality of the war shocked the journalists who covered it. Coverage was contradictory and muddled. Victories were reported as defeats and vice-versa. The resulting credibility gap only strengthened the large-scale anti-war movement.
Polarization plagued America, and controversy between the government and anti-war protesters persisted. The news media seemed to be at the center of this fight. In 1963, President John F. Kennedy attempted to have The New York Times’ reporter David Halberstam removed from Vietnam because of his negative coverage of the War.13 Eight years later, the Pentagon Papers would be published on the front page of the same newspaper, inciting more anger and distrust towards the American government. As the war progressed, many Americans began to feel Vietnam was an unwinnable war. In 1968, Walter Cronkite, the CBS evening news anchor and “the most trusted journalist in America,” stated his clear disapproval of the War.14
Vietnam attracted thrill seekers, aspiring journalists, and those looking for an escape. “If you were a journalist,” remembers Virginia Elwood-Akers, “and if you were adventurous, sooner or later you went to Vietnam. And the mere fact that you were a woman surely could not stand in your way.”15 Americans’ views on the Vietnam War contrasted radically with what they had been in prior wars. Shifting views at home and greater access in Vietnam encouraged women, especially freelancers, to go to Vietnam. Their presence would make a difference for war reporting in Vietnam and influence views about female reporters at home.
Despite their eagerness, women had to “fight like hell to get the assignment in Vietnam,” Joyce Hoffman observes. At the start of American involvement in the Vietnam War, the feminist movement had yet to make a real impact. Betty Friedan had only just published The Feminine Mystique, in 1963. There was no Equal Employment Opportunity Commission, and Title IX was not yet implemented. Many female journalists at the beginning of the war faced the reality that war reporting was considered too dangerous for young women and remained the boys’ club it had become in the 1950s and 1960s. Gloria Emerson, a fashion writer for The New York Times, urged her editor to send her to Vietnam. She did not get his approval but decided to go as a freelancer in 1956.16 The popularity of freelance journalism and loose military restrictions during this era made Vietnam an especially popular war for women to cover, and several women, including Denby Fawcett and Jurate Kazickas, paid their own way to Vietnam.17 As the war developed, however, the feminist movement made significant progress at home, which in turn would help women to gain official recognition as war correspondents.
The Supreme Court’s legalization of birth control in 1965 and of abortion in 1973 allowed women to determine for themselves when they had children. This freedom permitted women to pursue careers with less societal criticism. During the late 1960s and early 1970s, women used The Civil Rights Act of 1964, which among other things prohibited discrimination against women in the workplace, to sue news organizations for greater opportunities. News organization executives felt pressure to send more and more female journalists to Vietnam, resulting in several women earning roles as war correspondents for major publications. The United Press International (UPI) hired several women, including freelancer Kate Webb, political correspondent Margaret Kilgore, and Tracy Wood, who covered the ceasefire in 1973. In 1970, Gloria Emerson finally received the green light to cover the war for The New York Times.18
Though the feminist movement did push news organizations to send women to Vietnam, sexism still followed female correspondents there. “Hey lady, what are you doing here,” was the line flung at Emerson when she arrived in Vietnam.19 Female correspondents were often dubbed “husband hunters” in order to trivialize their motives.20 Among the most appalled by the appearance of female war correspondents was United States General William Westmoreland, who tried to prohibit women from staying in the field overnight.21 One woman was refused entry to a combat area because an officer felt that she looked too much like his daughter.22
Though many men did not believe women had a place in war reportage, and had no problem showing it, women proved they could provide unique viewpoints that their male colleagues lacked. While insisting there was no difference between male and female correspondents, they did offer something new: a female perspective. When Emerson first arrived in Vietnam she felt “convinced . . . that the Vietnamese could and should be enlightened by the west,” contends Hoffman.23 Emerson’s time as a correspondent for The New York Times changed her opinion. The young reporter ultimately developed a very anti-American stance on the War, often writing from a sympathetic angle regarding the Vietnamese or anti-war protesters. On October 1, 1971, she published an article entitled “Survey of Attitudes Made by Americans,” in which she reported on Vietnamese refugees. The survey found that most refugees considered living conditions under communist control to be good and felt neutral towards the Viet Cong.24 The article later added that “death and destruction caused by frequent military activities by allied forces . . . were major reasons why many had fled their homes.” In another article, “Some Americans Who Don’t Want to ‘Play God,’” Emerson highlighted the anger of young Americans who wrote to President Richard Nixon protesting the war.
Reporting on the Vietnamese people and on the anti-war movement shifted Emerson’s perception of the War. Many journalists had not considered looking at the war from those viewpoints. Yet, as Elizabeth Pond pointed out, “correspondents who went to Vietnam with a [certain] point of view could always find evidence to support that point of view.”25 Emerson was not the only female reporter to grow skeptical of United States involvement in Vietnam. Another was Martha Gellhorn. While covering a children’s hospital in Vietnam, Gellhorn was furious to discover that “uncounted” Vietnamese children were wounded and killed by American weapons.26 Denby Fawcett remembers with disgust seeing a GI tell a Vietnamese peasant woman, begging to stay with her husband as they took him away, to “shut the fuck up.” Fawcett realized, “the longer I stayed in Vietnam, the more cynical I became . . . each death, American or Vietnamese, was senseless . . . we were hurting the Vietnamese more than we were helping them.”27 Even Emerson, who hated being set apart because of her gender, knew “it was important for women to cover wars, because men were boys at heart. They got dazzled by guns and uniforms.”28 While men can often get caught up in the grandeur of war, women “tend to try harder to understand what is really happening to people on the ground,” wrote Marie Colvin, a war reporter who died in 2012 covering Syria for The Sunday Times.29
While the tendency of women to cover the human-interest side of war is impressive and important, those who covered Vietnam also showed they had the necessary skills to cover every aspect of war. The narrative that women are too fragile or cowardly for war has historically prevented women from reporting on the front lines, but female Vietnam War correspondents proved they could cover the grittier side too. Dickey Chapelle was a veteran photojournalist by the time the Vietnam War began. She had earned respect from many of her male colleagues, fighting for her right to cover both World War II and Vietnam. In 1961, Chapelle received the Overseas Press Club’s highest honor, the George Polk Memorial Award, the citation stating she could “hold her own with men twice her size when it comes to covering a war.”30 Chapelle was extremely patriotic and a fierce anti-communist,31 and she refused to follow rules prohibiting her from reporting on wars. “It took defiance like Chapelle’s to open the doors to women covering war,” Meg Clayton argues.32 On November 4, 1965, Chapelle died after being hit by shrapnel from an enemy booby trap while on an operation with United States Marines.
Henri Huet’s photo shows her lying on the ground during her last moments. Nearby, two Marines look stunned at the sight of an American woman killed in front of them. Her final words reportedly were, “I guess it was bound to happen.”33 Several newspapers reported on the incident, including the New York Times, saying, “[Chapelle’s] aggressiveness . . . and her willingness to take a chance had set a pattern that she maintained all her life.”34 This and other widely shared images showed women facing the same risks as men.
Henri Huet’s gripping photograph of reporter Dickey Chapelle receiving last rites in Vietnam.
Chapelle was not the only woman to provide such an example. Kate Webb, originally reported dead by The New York Times, emerged 23 days after she was captured by the Viet Cong on April 7, 1971.35 The “tough, hard-drinking, at times foul-mouthed, and ever unwilling to suffer fools” correspondent was captured because the Viet Cong thought she was a spy. Webb’s time as a prisoner of war, reported by The New York Times, did not deter her; she had covered the Tet Offensive earlier in the war and continued reporting after her release.36 Chapelle’s and Webb’s willingness to wade into the thick of war, despite the clear danger, helped prove women could report on war with the same fervor and bravery as men.
The hundreds of women who wrote about the Vietnam War allowed their editors back home, and the American public, to see that they belonged in the world of war reportage. Among them was Frances FitzGerald, one of the most well-known female journalists of the last century, who published an article for the Atlantic that she would later turn into a Pulitzer Prize-winning book, Fire in the Lake: The Vietnamese and the Americans in Vietnam.37
FitzGerald was one of the first journalists to predict that Vietnam was an unwinnable war, claiming that America’s lack of knowledge of Vietnamese culture made American “victory” impossible.38 At first reluctant to include women at all, news organizations ultimately came to depend on them as war correspondents in Vietnam. When Tracy Wood, a reporter for UPI, heard she would be heading to Vietnam, a male staffer told her, “I don’t believe women should cover wars.” Less than a year later, sitting in North Vietnam, Wood negotiated the final details of the United States coverage of the release of American prisoners of war. Thirty journalists, one of whom was Walter Cronkite, waited for the results outside.39 Marlene Sanders was the first female television correspondent in Vietnam for ABC. Liz Trotta of NBC was the second. Elizabeth Becker filled the role for The Washington Post. Hundreds of American women “flung themselves in a war for the adventure of the unknown,” Emerson said.40 Though it may be impossible to give each due credit, they all had an impact. Many excelled at their jobs, winning prizes and captivating audiences, and as Maureen Ryan adds, “[b]y the end of the war, female journalists had defied male resistance at all levels, as well as imprisonment, danger, and death, to emerge as full-fledged representatives of major newspapers, wire services, and television networks.”41
Reporter Frances FitzGerald’s Department of Defense identification card.
Unlike in the aftermath of World War II, female war correspondents were able to sustain their success after Vietnam, continuing as correspondents and writing memoirs of their experience. Some of their ongoing success owed to the strength of the feminist movement. But much of their prominence should be attributed to their passion and ability. Emerson, fueled by her hatred of the Vietnam War, demanded justice on the pages of Newsweek. She later wrote a book entitled Winners and Losers, which won the National Book Award.42 She went on to cover the Israel-Palestine conflict, where she portrayed the Palestinians as “people whose yearning and dreams of a homeland were no less valid than those of the Israelis.”43 FitzGerald went on to write about Fidel Castro and the Iran-Contra scandal. The fiery Kate Webb continued to cover unrest in places like South Korea, Indonesia, and Afghanistan. Marlene Sanders became the first female vice president of ABC News in 1976. In the end, as Hoffman concludes, female Vietnam War correspondents’ display of “skills, courage, and fortitude entitled them to be considered for any newsroom assignment.”44
These female correspondents were not only able to sustain their own success; they paved the way for others to follow. A door had been flung open, and “[b]y the 1980s, the number of female foreign correspondents had tripled since pre-Vietnam War days to 33 percent,” according to Robin Ewing. Fifty percent of the correspondents in Bosnia were women, and by the time the United States got involved with Afghanistan, “it was routine to have women covering all aspects of the war.”45 Kim Gamel covered Iraq as a news editor for the Associated Press. Tina Susman was held hostage in Somalia and came to realize that “as more women have been attacked and lived to tell about it and return to the field, they have earned the respect of their male colleagues, who can no longer fret that females might be too delicate for the job.”46 Anne Garrels was one of the sixteen reporters to stay in Baghdad during the United States bombing campaign.47 Editors no longer question a woman’s ability to cover war, at least not openly.
The 1960s and 1970s brought about a time of change in the United States. The feminist movement allowed women to pursue careers on terms more equal with men. The Vietnam War allowed female war correspondents to report their own point of view to a receptive public. Hundreds of women flocked to report on the war, and their journey proved that women were equal to men in terms of reporting. Some showed themselves willing to die for a story. The more that women covered the war, the more Americans accepted that they could. Sexism in the newsroom has not been fully eradicated, however. Women hold only 25 percent of top-level news positions, and only an average of 36 percent of articles printed by major news organizations are written by women annually.48 The New York Times, generally considered a liberal newspaper, is in fact the largest offender, with nearly 70 percent of its articles being written by men.
Today, women represent only 34 percent of journalists reporting in the field.49 Clearly, some stigma has survived through the last century. But this does not erase the exemplary work of the 267 American women who covered the Vietnam War. Rarely recognized for the opportunities they created for women, these reporters truly exemplified the grit, skill, and perseverance that any war correspondent must have.
Notes
1. Joyce Hoffman, On Their Own: Women Journalists and the American Experience in Vietnam (Boston, 2008), 4.
2. Robin Ewing, “Women Reporting War: The History and Evolution of the Woman War Correspondent” (M.A. thesis, The University of Texas at Austin, 2005), 11.
3. Ewing, 6.
4. Mark Jenkins, “‘Gal Reporters’: Breaking Barriers in World War II,” National Geographic News, December 10, 2003. http://news.nationalgeographic.com/news/2003/12/1210_031210_warwomen.html (accessed January 5, 2017).
5. Hoffman, 4.
6. Ibid., 6.
7. Betty Friedan, The Feminine Mystique (New York, 1963), 18.
8. Lynn Povich, Good Girls Revolt (New York, 2012), 17-18.
9. Natalia Haller, “Female War Correspondents in Vietnam: A Turning Point for Women in American Journalism” (M.A. thesis, Humboldt State University, 2006), 10.
10. Maureen Ryan, “Interpreters of Violence: Novels and Memoirs about Female Journalists of the Vietnam War,” Wlajournal.com, January 1, 2012. http://wlajournal.com/wlaarchive/24_1-2/Ryan.pdf, 3 (accessed November 16, 2016).
11. Ibid.
12. Kyle Hadyniak, “How Journalism Influenced American Public Opinion During the Vietnam War: A Case Study of the Battle of Ap Bac, The Gulf of Tonkin Incident, The Tet Offensive, and the My Lai Massacre” (M.A. thesis, University of Maine, 2015), 2.
13. Ryan, 3.
14. Hadyniak, 2, 7.
15. Virginia Elwood-Akers, Women War Correspondents in the Vietnam War, 1961-1975 (Metuchen, NJ, 1988), 9.
16. Hoffman, 1, 13.
17. Ewing, 7.
18. Haller, 15.
19. Elwood-Akers, 2.
20. Hoffman, 9.
21. Haller, 10.
22. Hoffman, 5.
23. Ibid., 13.
24. Gloria Emerson, “Survey of Attitudes Made by Americans,” New York Times, October 26, 1971. http://www.nytimes.com/1971/10/26/archives/survey-of-attitudes-made-by-americans.html (accessed January 2, 2017).
25. Hoffman, 7.
26. Ibid., 7-8.
27. Fawcett, 25.
28. Scott Simon, “War Correspondent Gloria Emerson Dies at 75,” National Public Radio, August 7, 2004. http://www.npr.org/templates/story/story.php?storyId=3838631 (accessed January 6, 2017).
29. Meg Clayton, “The Women Who Fought to Be War Correspondents,” LA Times, November 10, 2015. http://www.latimes.com/opinion/op-ed/la-oe-clayton-female-war-correspondents-20151110-story.html (accessed January 3, 2017).
30. Ibid.
31. Hoffman, 13.
32. Clayton.
33. Emerson, xx.
34. “Dickey Chapelle Killed in Vietnam; Mine Fatally Injures Woman Photographer-Reporter,” New York Times, November 4, 1965, http://query.nytimes.com/gst/abstract.html?res=9406E3DC1730E33ABC4C53DFB767838E679EDE&legacy=true (accessed December 21, 2016).
35. Kate Webb, “Kate Webb Tells of Her 3-Week Captivity,” New York Times, May 13, 1971, http://www.nytimes.com/1971/05/13/archives/kate-webb-tells-of-her-3week-captivity.html?_r=0 (accessed November 17, 2016).
36. Hoffman, 383, 14.
37. Ryan, 7.
38. Haller, 36.
39. Wood, 224.
40. Emerson, xix, xx, xix, xvii.
41. Ryan, 7.
42. Hoffman, 379, 380.
43. Ibid., 380.
44. Ibid., 381, 383, 5.
45. Ewing, 7, 8.
46. Clayton.
47. Ewing, 4.
48. “The Status of Women in U.S. Media 2014,” Women’s Media Center, https://wmc.3cdn.net/2e85f9517dc2bf164e_htm62xgan.pdf, 6 (accessed November 21, 2016).
49. Women’s Media Center, 9, 8.
*
Paving the Path to Prosperity: The History of the Connecticut Turnpike
By Wesley Peisch ‘18
The era following World War II was a period of great change in America, characterized by economic growth and a newly emboldened government. One key component of this national trend was massive federal government investment in highways facilitating trucking and commuting. Although Greenwich is generally regarded as a quaint, historical town, it proved no exception to this political and cultural shift. In Greenwich, citizens received their new highway as a coastal alternative to the then-packed Route 1. Although many of them opposed construction of the Connecticut Turnpike, the fact that it was built in spite of delays, evictions, and general traditionalism attests to the enthusiasm for progress that characterized many Americans in the modern postwar era. The proven success of this major transportation artery shows the value of long-term investment in the face of conservative opposition.
Before the Connecticut Turnpike’s 1958 opening as part of the modern-day I-95, the interaction between Greenwich and New York was relatively limited. Greenwich was primarily a commuter and farming town relatively out of reach of the big city. The two key links to New York were the New Haven Line and Merritt Parkway, which both served Greenwich with relatively little intrusion.1 During the baby-boom years of the late forties and fifties, a jump in population and the suburbanization of city dwellers precipitated the push for large highways to facilitate the daily swell of people in and out of American cities, including New York. As for Greenwich, the town needed a new way to get these newly suburban residents in and out of the Manhattan urban core.2 But few highways like the Turnpike as we know it today existed in the early postwar period, so it was up to Connecticut to define exactly what it meant to build a “modern” highway.
In the early stages of planning, this modern interstate road faced much opposition from numerous figures on the Greenwich political scene. The Connecticut Turnpike entered Greenwich politics in a 1944 report by Greenwich’s Traffic Commissioner J. W. Cone. The document outlined the benefits of such a road passing through Greenwich.3
Route of the Connecticut Turnpike (line in red).
Cone, among others, saw that Route 1 was overcrowded and wanted a Boston-Washington freeway cutting through Greenwich to relieve Route 1 traffic. The Post Road, which had become Route 1, had served as the main thoroughfare between New York and Boston since the colonial era, but commercial traffic generated by the booming postwar economy strained it to its limits, causing long traffic jams and heavy air pollution.4
Much of the opposition originated from skepticism concerning the practicality and the local impact of this new type of road. Planning for the Connecticut Turnpike began twelve years before the Interstate Highway and Defense Act of 1956, so the largest road many Greenwich residents had ever seen was the 1939 Merritt Parkway. A massive concrete “All-Purpose Throughway”5 would be a huge step forward, but at what cost? As a result, the idea for a cross-New England freeway went nowhere for ten years. When the plan for the road resurfaced in the early 1950s as the “Greenwich-Killingly Expressway,” part of a single continuous freeway to eventually link New York City and Boston, local citizens pushed back hard.
Politically, Greenwich split into two groups over the proposed freeway. The first group, which supported the coastline route as state officials recommended, found a broad range of support, especially from citizens living north of Route 1. The second group, which wanted the Turnpike to arc north of Greenwich as an inland route, consisted of citizens who lived in the freeway’s path and locals who wished to preserve the old-time quaint character of the town.6 They worried that the state of New York wanted simply to exploit Greenwich as a trucking corridor.7
The battle between these two groups came to center on the 1951 election of Greenwich’s First Selectman. One candidate, Republican C. Carleton Gisborne, supported the coastal route. His opponent, Democrat John F. Sullivan, opposed the freeway as a whole, especially the coastal plan.8 Gisborne had the advantage from the beginning. The Republican party dominated state and local politics, and both New York’s and Connecticut’s state governments supported a coastal route. Additionally, the coastal recommendation had already been approved by the Connecticut General Assembly on June 6, 1951. Sullivan, the minority candidate, ran a vociferous anti-freeway campaign in Greenwich. He labeled himself the “ghost that stands behind every Republican speaker and official from here to Hartford,” committed to obstructing Turnpike construction by any means.9 He claimed that New York’s highway engineer had “made it clear that New York is building a thruway for trucks only and has never pretended otherwise.”10 The decision, however, was practically made even before the Sullivan-Gisborne election; Greenwich was overwhelmingly Republican, and most residents shared the country’s vision of an automobile future. Gisborne won by 5,613 votes, a record for the GOP in Greenwich.11
Even post-election, the fight for a “Yankee Freeway,” that is, the inland route, continued. Despite the fact that New York state, Connecticut governor John Davis Lodge, First Selectman Gisborne, and a majority of Greenwich residents supported the coastal route, anti-freeway activists still attempted to obstruct construction in every way. Harold P. Vose, a Belle Haven resident, claimed that the turnpike construction vote had been “rushed through on June 8, the last day of the session, under suspension of the rules.” (In fact, it had not, and had passed through Connecticut’s General Assembly with only three dissenting votes.) He claimed that it would necessitate destroying 16,000 Fairfield County homes (the actual number being less than a fifth of that prediction). Some anti-freeway advocates even quadrupled the $130 million projected cost. Opposition to the Freeway from Fairfield County became so passionate that Governor Lodge had to issue an open letter to Greenwich citizens in the Greenwich Time refuting all the unfounded claims about the road. Though passionate, opposition to the coastal route of the Connecticut Freeway would prove in the end not enough to overcome the zeal of state and local officials for a modern highway.12
Although the Connecticut Freeway had been a prime example of postwar top-down planning, construction did not commence without local oversight and even the occasional victory for freeway-opposing Greenwich citizens.
One neighborhood threatened by the Turnpike’s construction took action. Riverside, a peninsular coastal community east of the Mianus River, responded to the threat of the Turnpike by convincing officials to reroute the freeway slightly to the north. When residents George Armstrong and Clifton Hipkins found surveyors’ plans for the turnpike, they discovered that it would go directly over Riverside, possibly destroying the Riverside Yacht Club. The two men raised $25,000 from neighbors and traveled to Hartford and Washington to lobby for a new route. It was not easy; according to Hipkins, “I don’t know how we ever did it, but we were out maybe five nights a week till twelve, one o’clock at night, and then I was up and in the office at 8:30 in the morning and didn’t get home till 8:00, 8:30 at night. How I did it, I don’t know.” The men were successful, and engineers added a bend to the freeway bypassing Riverside to the north.13
Local residents also intervened successfully in the blasting of rock, a crucial part of any freeway construction project, especially in Byram, the western end of Greenwich, which once supplied rock to the Brooklyn Bridge and the base of the Statue of Liberty.14 The Slattery Construction Company needed to blast through 800,000 cubic yards of rock,15 so it is no surprise that an accident occurred. On July 23, 1956, chunks of rock launched by an improperly executed blast hit a children’s party in Armstrong Court,16 injuring ten children and three adults, none fatally. Two days later, State Highway Commissioner Newman E. Argraves decreed that all Turnpike blasting operations would receive government oversight.17 That October, two construction workers were convicted and fined for not taking sufficient precautions in the July 23 blast, with a warning from the judge of jail time for perpetrators of another similar blasting incident.18
For the most part, however, local residents lost out in feuds with the construction companies and with government. For example, when the Slattery Construction Company came to Byram, it routinely ignored local measures to protect Old Greenwich residents. It used overweight, road-damaging trucks and blocked sidewalks used by commuters walking to the Old Greenwich train station with construction waste, forcing residents to walk through construction mud.19 It blocked crucial thoroughfares like Field Point Road, which “marooned a thousand persons,” according to a government complaint. When asked to clean up, the company obliged for only a few months, earning only a warning from the Board of Selectmen that condemned their “public be damned attitude.”20
An interchange along the Turnpike route (I-95) through Connecticut.
The most affected group in Greenwich was the homeowners in the direct path of the road whose properties, under the state’s power of eminent domain, were condemned and destroyed, a practice common to most major infrastructure projects. The real tragedy of property condemnation in Greenwich was the thoughtlessness with which it was carried out. For example, two Byram families still living in their homes received only a 24-hour notice to move out before “at 8 a.m. . . . cranes, trucks, and other gear wheeled onto the land of [the property owners] and began moving trees and shrubs.” Harder still, the required compensatory payment had not reached some of the families before they were forced out of their homes.21
In many ways, the differences between the older Merritt Parkway and the Connecticut Turnpike represent a fundamental change in American city planning. Although both projects were highly ambitious, Greenwich residents received much gentler treatment during the 1938 construction of the Merritt than during the Connecticut Turnpike project two decades later. The Merritt Parkway was located in backcountry, naturally decorated with trees, and had frequent exits and beautiful, custom-designed bridges. In the name of efficiency, the quaint aspects of the Parkway mattered little in the plans of the Turnpike designers. The given reason for the absence of trees was the potential for casualties, such as on the Merritt, resulting from falling trees.22 The Merritt Parkway compromised safety and efficiency in favor of aesthetics and local approval, whereas the Turnpike reflected more a headstrong government and a “public be damned” point of view.
An old Connecticut Turnpike toll booth, in Greenwich.
And yet, to equate big government with efficiency is often a mistake; construction almost stopped because of a lack of funding from the Connecticut state government. Eager to get started, the state tried to begin construction in Greenwich before an appraisal had been determined and before 111 parcels of land had been purchased.23 Fortunately, the town pushed back until an appraisal had been reached, but the procrastination of state government in purchasing homeowners’ land resulted in the aforementioned sudden evictions of Greenwich residents. Another consequence of the state’s eagerness to begin construction was the depletion of funds, extending construction through June 1957, after the building costs of the road had escalated from $200 million to $445 million24 and bond sales to meet those costs had dwindled. Because the Connecticut Turnpike was planned and begun before the Interstate Highway and Defense Act of 1956 promised 90 percent federal funding for interstate highways, the state had to foot the bill. To raise money, Connecticut sold highway bonds at a 4 percent interest rate to investors. State law prevented the interest rate from increasing beyond 4 percent, so when bond sales completely dried up, the Connecticut Department of Transportation’s hands were tied.25 Faced with termination of the Turnpike project, the road’s funding became the first priority for the January 1957 session of Connecticut’s General Assembly.26 The Assembly’s new bill allowed highway bonds to be sold at an interest rate greater than 4 percent until the Turnpike was finished.27 Because of these budget issues, the cost of the Turnpike became a subject of public scrutiny. In June 1957, Republican John C. Donaldson accused the state of “waste and inefficiency” for paying $446,945 to the Turnpike’s architect.
The story made the front page of the Greenwich Time that day.28
Ultimately, the turnpike was completed despite its many problems, opening on January 2, 1958, as a segment of the Interstate Highway System at a total cost of $464 million.29 But construction left Greenwich residents irritated with Governor Lodge. Despite the Republican history of Greenwich, the town voted Democratic (against Lodge) in the 1954 gubernatorial race. Because the margin between Lodge and his Democratic opponent, Abraham Ribicoff, was just over 3,000, it is likely that Fairfield County’s resentment over the Turnpike cost Lodge his re-election.
Although it faced numerous difficulties during construction, the Connecticut Turnpike has proved a powerful economic engine for Greenwich, allowing for thousands of residents to commute to and from New York City every day and for easy movement of goods across the Northeast. The fact that it was paid for by the state rather than the federal government (all interstate highways received 90% federal funding after 1956) meant that the Connecticut Turnpike became one of the few interstate roads with tolls, which were eventually removed following a fatal truck crash at a Stratford toll plaza. The highway became the center of attention once again in 1983 after the Mianus River Bridge collapse, which killed three people. For six months, trucks from the turnpike had to be rerouted onto local roads in Cos Cob. The angry reaction from locals in the area recalled the earlier protests of Mianus residents during the highway construction. In both scenarios, large overweight trucks disrupted neighborhood traffic and damaged roads.30
Today, Connecticut finds itself in a position similar to that of 1951. Due to population increase in Fairfield County, I-95 is under immense traffic strain. When it was built, its intended ADT (Average Daily Traffic) was 30,800 cars. Today, the road routinely accommodates 143,000 vehicles. Recently, the state has considered numerous solutions to this traffic issue, including the re-implementation of tolls31 and widening of the road to include the breakdown lanes.32 The most contentious—and ambitious—of these solutions is a new high-speed rail line linking Washington, D.C., and Boston through Greenwich. In all, the story of the Connecticut Turnpike is more relevant now than ever; history seems to be repeating itself in the growing debate over high-speed rail transportation, between private interests, historic-preservation groups, and the federal and state governments.33
Most recently, a meeting to discuss the effects of high-speed rail occurred on May 23, 2017, at Greenwich Town Hall.34 Over the next few years, Greenwich will likely be torn between two futures, just as it was before. In one, the town preserves historical sites and neighborhoods. In the other, Greenwich takes the route its forebears did with I-95, stepping boldly and controversially into a more modern, better-connected Connecticut. Time will tell whether the choice Greenwich makes is the right one, but if I-95 is any indication, even a road paved with evictions, injury, and resentment can lead to a prosperous future.
Notes
1. Bill Young, “Kent House Removal Marks End of Era When Town Was Quiet Farm Community,” Greenwich Time, 4 November 1955.
2. “Town’s Population Increase from ’50-’55 Greater Than in Decade Between ’40-’50,” Greenwich Time, 23 May 1956.
3. J.W. Cone, “Proposed New England All-Purpose Throughway Relative to Greenwich, Connecticut” (Public Works Department, Greenwich, Connecticut, 1944).
4. “Fairfield To Seek New Truck Route,” The New York Times, 4 March 1951, p. 76.
5. Cone, “Proposed New England . . . .”
6. “Selectmen Will Hold Open Meeting Oct. 17 When Hill Reveals His Throughway Route,” Greenwich Time, 10 October 1951.
7. “Briggs Denies Sullivan Statement N.Y. Plans ‘Trucks Only’ Thruway,” Greenwich Time, 24 October 1951.
8. “Thruway Dominates Discussion at League Candidates’ Session,” Greenwich Time, 2 November 1951.
9. Ibid.
10. “Briggs Denies…,” Greenwich Time.
11. “Thruway Dominates…,” Greenwich Time.
12. “Governor Says He Will Not Interfere in Present Plan for Thruway Construction,” Greenwich Time, 22 May 1952.
13. Clifton Hipkins, interview by “F”, “Early Days in Riverside, Sailing on Long Island Sound,” Greenwich Library Oral History, 22 August 1975.
14. Hollis Burke, “805,000 Cubic Yds. Of Rock To Be Blasted As Turnpike Cuts Through East Of River,” Greenwich Time, 9 April 1956.
15. Ibid.
16. Hollis Burke, “Turnpike Granite Blast Injures 3 Adults, 10 Children, Windows, Homes Smashed,” Greenwich Time, 24 July 1956.
17. “Argraves Says Dynamiting Sites On Turnpike To Get Special Committee Review,” Greenwich Time, 25 July 1956.
18. “2 Workers Fined $50 Each for Explosion on Turnpike; Judge Warns of Jail Terms,” Greenwich Time, 5 October 1956.
19. “Slattery Firm is Warned for its Attitude Toward Public in Turnpike Work,” Greenwich Time, 3 January 1957.
20. Ibid.
21. Hollis Burke, “Byram Family is Given a Choice: Save Pine Tree or Porch Steps,” Greenwich Time, 10 May 1956.
22. “Engineer Explains Turnpike Effects in Old Greenwich,” Greenwich Time, 18 May 1952.
23. “State to Ask Town Approve Starting of Turnpike Here Before Offering Appraisals,” Greenwich Time, 5 December 1955.
24. “Estimated Turnpike Cost Boosted To $445 Million; Base Cross-State Toll $2,” Greenwich Time, 1 June 1956.
25. “Budget, Turnpike Finances, Court Reform Issue 3 Biggest Problems,” Greenwich Time, 8 January 1957.
26. Ibid.
27. “Assembly Passes Turnpike Bill, Ribicoff Signs It, Lauds Action,” Greenwich Time, 31 January 1957.
28. “Argraves Defends $446,945 Paid to ‘Pike Architect,” Greenwich Time, 19 June 1957.
29. Jeremy Plant, ed., Handbook of Transportation Policy and Administration (New York, 2007), 114.
30. Frank M. D’Addabbo, P.E. to Peter H. Conze, 27 July 1983, Historical Society of the Town of Greenwich.
31. John Burgeson, “With Half-Hour Delays Common on I-95, Experts Giving Tolls a Look,” Connecticut Post, 5 June 2014.
32. Ken Dixon, “Commission Approves Study for I-95 Widening in SW CT,” Connecticut Post, 2 February 2017.
33. Ana Radelat, “Feeling Heat from CT, Feds Say They May Alter High-Speed Rail Plan,” The Connecticut Mirror, 13 February 2017.
34. Ken Borsuk, “Neighborhood Notes: News From Around Your Neighborhood,” Greenwich Time, 20 May 2017.
*
Smallpox Epidemics in New York City, 1890 to 1910
By Annabelle Raine ‘18
For centuries, smallpox ravaged the world, killing approximately one third of its victims. Smallpox is caused by the virus variola major, a member of the Poxviridae family. The disease presents with welts on the skin, which often leave noticeable scars. With no known cure, there were multiple outbreaks of smallpox as well as major epidemics in the 17th and 18th centuries, notably the 1614 pandemic that spread from Europe to the Middle East. The disease also decimated indigenous populations in the Americas, as European settlers and explorers brought the virus over from Europe. In 1796, Edward Jenner created the smallpox vaccine, which revolutionized medicine. By 1949, smallpox was officially eradicated in the United States, and by 1980 it was gone worldwide. However, from 1796 to 1980, many individuals died from the disease, despite the prevalence of Jenner’s vaccine.
In the United States, many large cities mobilized to vaccinate their populations. New York and Brooklyn had a long history with smallpox. Health officials in those cities struggled, however, to get residents vaccinated, as many urban dwellers doubted the benefits of vaccination and feared the possible side effects. Further difficulty arose from conflicting beliefs on how to control the epidemic and whether forced vaccination was lawful. The handling of the smallpox epidemics during the 1890-to-1910 period, specifically the forcing of vaccinations, resulted in a diverse range of opinions and perspectives. Finally, a 1905 Supreme Court decision declared a limit to individual rights when the medical safety of the public was at stake.
The smallpox epidemic of 1893-1894 in Brooklyn and New York City required governmental control via quarantines, which resulted in many legal actions against the two cities. In order to combat the epidemic, health officials developed a method consisting of increased vaccinations, quarantines, and pesthouses, applied in various ways across all sectors of society. Smallpox had infected the city many times before 1893, but the city had no way to forcibly vaccinate until then. Prior to 1893, the plan for controlling the disease involved vaccinating the surrounding residents of an affected area. The strategy changed, however, on February 1, 1894, when the Brooklyn Health Department named a new health commissioner, Z. Taylor Emery.1
Smallpox warning, from the New York Times, March 24, 1894.
Taking a more active stance on vaccination, Emery decided to implement quarantines to control the outbreak, declaring that “‘In case persons are found who have never been vaccinated, every effort should be made to induce them to accept it. . . . When the inmates of infected houses refuse to be vaccinated, the vaccinator may, at his discretion, direct the Sanitary Police to maintain a quarantine until all are vaccinated.’”2 After quarantine, infected individuals were sent to pesthouses. The quarantines and pesthouses met legal requirements, but they led to lawsuits filed against the Brooklyn Health Department. In one example, Robert Goggin sued the Brooklyn Board of Health for ten thousand dollars in damages incurred as a result of the deaths of his wife and two children.3
In addition to quarantines, Emery organized free disease-control clinics in more than twenty-four locations in order to increase the number of vaccinations. He understood the importance of increased public vaccinations as the only way to prevent smallpox. The clinics also hoped to combat the negative attitude toward vaccinations that many Brooklyn and New York citizens held. Many residents did not want to get vaccinated, as they could not see the positive effect that the vaccinations had. In addition to the free vaccination clinics, therefore, Emery worked to increase public awareness of the benefit of smallpox vaccinations by advocating them in the Brooklyn Daily Eagle, the most widely circulated local newspaper. There, he declared, “If every inhabitant of the city were to be thoroughly vaccinated today, the disease would at once die out from lack of material upon which to feed.”4 Since Emery’s method consisted of multiple elements, it affected people in various ways.
People who were quarantined against their will tended to view vaccination negatively, while people who attended a free clinic might give a more positive response.
Z. Taylor Emery, New York Public Library Digital Collections (accessed March 17, 2018).
In addition to increasing the number of vaccinations and introducing quarantines, Emery and health officials made vaccinations compulsory for students attending public schools. In 1893, the New York state legislature passed a law that stated, “Persons not vaccinated shall not be admitted to the public schools. . . Boards of Education may employ a physician to vaccinate pupils who have not been vaccinated.”5 If the student did not have the vaccination scar, then he or she would be sent home. By April 1894, Emery had sent out 56 vaccinators, who had issued 27,000 vaccinations in New York City schools.6 Emery faced considerable backlash against this requirement, especially from the Kings County (Brooklyn) Anti-Compulsory Vaccination League. Led by Dr. Charles Waters, the League challenged the law in court. Justice Willard Bartlett of the New York Supreme Court adjudicated the case. After hearing lengthy arguments, on April 3, 1894, Bartlett declared that since public education was a part of the state, the city health officials could bar unvaccinated children from attending public schools. Additionally, he declared that public school education was a “privilege, not a right.”7 This ruling confirmed Emery’s compulsory vaccination plan; however, it led to an increase in opposition sentiment, especially from the Anti-Compulsory Vaccination League.
Among that opposition, Charles Higgins, the treasurer of the Anti-Vaccination League of America, wrote numerous pamphlets and books asserting that vaccination should not be a requirement for public school attendance. In one such publication, “The Case Against Compulsory Vaccination,” Higgins declared that the law requiring children to be vaccinated was ridiculous and that lawmakers left out key statistics regarding multiple cases of people who had died from receiving the vaccinations. Another main argument that Higgins advanced was that “vaccination, like inoculation, spreads and continues smallpox.”8 While this claim was false, it indicated the stigma and false information surrounding vaccinations. Many residents of New York City and Brooklyn shared this mindset, which contributed to why smallpox continued to come back.
A cartoon from a December 1894 anti-vaccination publication, Historical Library of the College of Physicians of Philadelphia.
As smallpox continued to spread, Higgins published more pamphlets that he hoped would serve as warnings against vaccinations to lawyers, doctors, and other public health officials. Even though his attempts were mostly unsuccessful, the arguments of the Anti-Vaccination League are still present today.
Because of its handling of the 1893-1894 epidemic, the Brooklyn Health Department and the City of Brooklyn faced multiple lawsuits, representing the range of opinions about the management of the epidemic. In one, William H. Smith sued Emery in May 1894 for being quarantined because he had refused vaccination.9 After a series of appeals, the case came to an end in 1896 with the judge siding with Emery. In another, Mary A. Ferrer sued Brooklyn for being confined against her will at the Flatbush hospital (pesthouse). Doctors diagnosed her with smallpox and thus sent her there, but she actually had malaria. At the hospital, she was “subjected to the danger and jeopardy of catching said disease.”10 Still another case against the City of Brooklyn involved John Salmon, who sued the city for “damages for being vaccinated with impure virus.”11 Salmon was wrongly informed that receiving a vaccination was compulsory. He complied with the vaccination but soon developed painful sores over his body, resulting in his being confined to a hospital for three months in order to recover from his adverse reaction. Due to limited record keeping, the results of Ferrer’s and Salmon’s cases were not recorded. Both cases revealed, however, that the methods followed by Emery were not perfect and resulted in unsatisfied residents.
Since the vaccination plan was not always accurate and affected people in various ways, a wide range of experiences and opinions arose. In the smallpox epidemic of 1900-1902, by contrast, health officials tried to achieve widespread vaccination through a tone of voluntarism, alongside efforts to win passage of a law that made vaccinations compulsory. While the epidemic originated on All Nations Block, both health officials and residents blamed the outbreak on the African American community, reflecting the racism of that time. Prior to 1900, less blame had been placed on who started an outbreak; rather, the focus had been on how to vaccinate individuals who had been in contact with infected persons. After 1900, however, the tendency to blame blacks, says historian Michael Willrich, denoted “an American culture of race that scorned black bodies as vessels of moral and physical danger.”12 Health officials this time referred to the original smallpox carrier as the “negress.”13 It is unknown who spread the illness, but the first cases did appear on West 69th Street, in the All Nations Block. It was at that time the most heavily populated area in New York City, thus allowing smallpox to spread rapidly. Additionally, many individuals lived in crowded homes that lacked proper sanitation and ventilation, making the disease easier to contract. By 1901, there were 2,000 diagnosed cases and by 1902, a full epidemic.14 The epidemic of 1900-1902 also marked a change in the method of controlling smallpox outbreaks, reflecting a shift in official thinking as well. The initial mindset of most health officials in 1900 was that most New Yorkers had been vaccinated and that the outbreak indicated that their immunity had begun to diminish. In order to prevent further cases and the spread of smallpox, health officials devised a three-part plan. First, they would stabilize All Nations Block at the start of the outbreak, then vaccinate residents of neighboring streets and in the schools, and finally establish a quarantine on All Nations Block. The quarantine policy was not new to residents of New York and Brooklyn, as that approach had been put in place in the previous epidemic of 1893 through 1894. But this time, multiple new cases of smallpox occurred, prompting New York Health Commissioner Ernst Lederle to step up his plan in 1902. He assembled a vaccination force of 200 men and sent notices to large businesses offering free vaccinations.15 Additionally, the department instituted a rule that all lodgers residing in boarding houses must show
evidence of vaccination or provide an agreement to get vaccinated, if deemed necessary. Health Department officials vaccinated 6,000 additional people after this new regulation was passed. Furthermore, there were two main but vastly different beliefs on how to handle the outbreak. Some believed that since so many people had been vaccinated, the outbreak must be under control. By the end of 1902, over 800,000 New York residents had been vaccinated, approximately a quarter of the city’s total population.16 Others believed, however, that since smallpox was still present and a further outbreak had occurred, the policy was not working. In the end, Lederle used public appeal to combat the epidemic. He and the Health Department insisted that it was every individual’s public duty to become vaccinated. The New York Times called becoming vaccinated “not only a wise measure of personal precaution, but it’s a public duty which every citizen owes to those with whom he comes in a daily contact.”17 Additionally, Lederle advertised “free vaccination for all” in newspapers, in order to increase public awareness.18 This tactic enabled Lederle to achieve the appearance of voluntarism in his vaccination plans. Even so, the 1900-1902 strategy introduced a new element: targeting the most endangered part of the public in order to increase the number of vaccinations. Furthermore, in Lederle’s plan there was also a hidden element of forced vaccination, revealing some officials’ belief that it was appropriate for members of the public to be vaccinated against their will. Lederle sent out vaccination squads made up of agents of the health department’s Bureau of Contagious Diseases. The four-member vaccination squads wore military outfits to underscore their importance. At one point they were vaccinating people at a rate of 1,500 per day all over the city.19 Lederle also instituted a new pesthouse, Willard Parker Hospital, on North Brother Island, where infected individuals were sent. Additionally, there were efforts to create a new law that would require compulsory vaccination. James H. McCabe of Brooklyn, a state senator, proposed a bill that would force any individual to be vaccinated when the health department deemed it necessary. If the person failed to comply, it would be a misdemeanor, punishable by a $50 fine and a minimum ten-day jail sentence.20
Ernst Lederle in 1902.
The proposal sparked a debate among health officials and residents over whether compulsion was the best option for controlling the epidemic. The issue divided the medical field as well. Lederle was opposed, even though he already had his vaccination squads at work. The bill eventually died because the epidemic started to wind down. Even though the bill was never passed, it showed that some strongly believed in required vaccination as the most effective way to control an epidemic, while others still strongly opposed legal compulsion. Finally, the 1905 United States Supreme Court ruling in Jacobson v. Massachusetts established that the state had the power to take control over epidemics through mandatory vaccinations, thus bringing a legal end to such resistance. Even though the case did not originate in New York City, the events that occurred in Massachusetts had been similar. It turned out, in fact, that the health commissioner of New York, Lederle, had been deeply influenced by what was happening in Massachusetts. When a compulsory bill was proposed, Lederle opposed it because he did not want to deal with the kind of upheaval that was taking place in Boston. Further, Massachusetts, specifically Boston, had a long history with smallpox epidemics, just like New York City. In the early 1900s, Massachusetts had been hit badly as well. From 1901 to 1903, Boston documented 1,600 smallpox cases and 270 deaths. Health officials there vaccinated in accordance with a state law that proclaimed vaccination could be compulsory if required for public safety. In 1902, a resolution was passed declaring that anyone who had not been vaccinated in the past five years must be vaccinated again.21 That resolution led to the Supreme Court decision in Jacobson and finally placed vaccination policy in the national spotlight. When in 1905 the debate on vaccinations came to an end, the Supreme Court had ruled that in matters of public health, state power took precedence over individual rights. In support of the seven-to-two decision in favor of Massachusetts, Justice John Marshall Harlan declared, “there are manifold restraints to which every person is necessarily subject for the common good.” Further:
A fundamental principle of the social compact is that the whole people covenants with each citizen, and each citizen with the whole people, that all shall be governed by certain laws for 'the common good,' and that government is instituted 'for the common good, for the protection, safety, prosperity, and happiness of the people, and not for the
profit, honor, or private interests of any one man, family, or class of men.22
The ruling had a major impact on a question still debated today: how far states may use their powers to control epidemics. Even though some thought that anti-vaccination sentiment would disintegrate, the 1905 ruling would fuel further controversy over vaccination in the years to follow. Even though smallpox has been eradicated, there are other diseases, preventable by vaccinations, which still infect and kill many people today. Health officials can look back at the 1893-1894 and 1900-1902 epidemics to consider what worked and what did not work in controlling similar diseases. Most public schools require pupils to be vaccinated, and they cite the 1905 Supreme Court case; however, there are exceptions for religious or medical purposes. Recently, anti-vaccination sentiment received a temporary boost from a fraudulent study that correlated vaccinations with autism. Additionally, some people must be exempt from vaccination because of allergies or compromised immune systems, and thus rely on the vaccination of others to guard their safety. There have been outbreaks of measles and whooping cough in California, and people who cannot get vaccinated are in danger of contracting these possibly life-threatening diseases. And so, the smallpox epidemics in New York City and Brooklyn from 1890 to 1910 may offer an important cautionary tale on the value of modern medicine in the prevention of deadly disease.
Notes
1 James Colgrove, State of Immunity (Berkeley, CA, 2006), 21.
2 Ibid., 22.
3 Robert Goggin, Flatbush Hospital small pox deaths, 1893-1894, Brooklyn, NY, Department of Law, Corporation Counsel records, 2013.015, Box 15, Folder 9, Brooklyn Historical Society.
4 “Vaccination is Safe: Says Health Commissioner in an Interview,” Brooklyn Daily Eagle, March 26, 1894, https://bklyn.newspapers.com/image/50351296/?terms=smallpox%2Bvaccinations (accessed January 8, 2017).
5 Frederick Scrimshaw and Charles A. Walters, Public School admittance and vaccination disputes, 1894-1895, Brooklyn, NY, Department of Law, Corporation Counsel records, 2013.015, Box 37, Folder 3, Brooklyn Historical Society.
6 Colgrove, 25.
7 John Zarrillo, “On Vaccinations and the Small Pox epidemic of 1894,” Brooklyn Historical Center, http://www.brooklynhistory.org/blog/2014/07/21/on-vaccinations-and-the-small-pox-epidemic-of-1894/#_ednref4 (accessed November 16, 2016).
8 Charles M. Higgins, The Case Against Compulsory Vaccination, 1907, Charles M. Higgins Papers, 1978.114, Brooklyn Historical Society, 29.
9 Colgrove, 26.
10 Mary A. Ferrer, Small Pox Quarantine, 1893-1894, Brooklyn, NY, Department of Law, Corporation Counsel records, 2013.015, Box 15, Folder 7, Brooklyn Historical Society.
11 John Salmon, Vaccination Injury, 1894, Brooklyn, NY, Department of Law, Corporation Counsel records, 2013.015, Box 36, Folder 14, Brooklyn Historical Society.
12 Michael Willrich, Pox: An American History (New York, 2011), 7.
13 Ibid., 3.
14 Colgrove, 34.
15 Ibid., 34-35.
16 Ibid., 37.
17 Willrich, 9.
18 “Forty Smallpox Cases,” New York Times, December 5, 1900, http://query.nytimes.com/mem/archive-free/pdf?res=9901E6DD1E3BEE33A25756C0A9649D946197D6CF (accessed November 16, 2016).
19 Willrich, 8.
20 Colgrove, 35.
21 Ibid., 38.
22 Henning Jacobson v. Commonwealth of Massachusetts, No. 70, February 20, 1905, https://www.law.cornell.edu/supremecourt/text/197/11 (accessed December 31, 2016).
*
A Multifarious Pestilence: the Black Death and its Effects on Medieval Europe, through a Sociocultural Lens By Rachel Dong ‘19
A plague leaves not only death, but immense social change in its wake. From 1347 to 1350, during the late medieval period, the infamous Black Death devastated Europe, decimating entire villages and communities and generating widespread public hysteria as it swept through the continent.1 The total death count numbered a daunting 20 to 50 million, approximately 60 percent of Europe’s population at the time.2 Faced with a pandemic that killed unbelievable numbers without discrimination, Europe suffered an unparalleled breakdown of social structure as well. As Giovanni Boccaccio wrote in his introduction to The Decameron, a book chronicling the first-hand accounts of people fleeing from the plague: “This scourge had implanted so great a terror in the hearts of men and women that brothers abandoned brothers, uncles their nephews, sisters their brothers, and in many cases wives their husbands.”3 In this period of mass panic, with neither physicians nor the Church able to identify any reasonable cause of the disease, specious rumors spread with ease—and horrifying consequences.4 As a result, European people began to shift away from traditional values and toward a loss of faith in the Church, a scapegoating of the Jewish populace, an acceptance of greater socioeconomic freedoms for the working class and women, and a rise of peasant rebellions on an unprecedented scale. All of these changes had lasting effects on European society for hundreds of years to come. During and after the plague, the Church became wealthier but also less confident and short-staffed, which contributed to the decline of faith in religion and an inclination for citizens to turn to other sources for guidance. In medieval times, people naturally looked to the Church as a central force in their lives. But religious authorities could not provide medical help for the perishing multitudes. During the years of the plague, in fact, cathedrals actually raised their service fees to draw in more money, and priests stopped administering last rites to the dying for fear of contracting the disease themselves. Such changes caused the Church to lose some of its credibility all across Europe. Anti-clerical sentiments sometimes found expression in violent incidents, a prime example occurring in England, where public discontent with Church officials rose especially high. In 1349, at
the height of the plague, a group of Englishmen broke into the monastery attached to St. Mary’s Church in Worcester, chasing out the monks and priests with weapons and setting fire to the building. Similar events occurred in France, Italy, and Spain, demonstrating a strong dissatisfaction with religious authority, which only increased over time.5 The growth of the Flagellants—masses of frenzied people processing through the streets, whipping themselves and hysterically pleading to God—constituted another example of people losing faith in the institution.6 Since many interpreted the plague as God’s divine wrath for the sins of humanity, the Flagellants believed they were “redeeming humanity” by shedding their own blood in martyr-like acts of repentance and penance.7 Their self-punishing philosophy became increasingly attractive as the plague dragged on, and it was not unusual for crowds of sympathizers to march or congregate behind them as they paraded through towns all across Europe, most prevalently in France, Spain, Italy, and England.8 Rejecting the value of the clergy, the Flagellants regarded themselves as mediators with God, the role that priests normally played. In addition, they openly denounced the
Church, took over cathedrals, disrupted services, and proclaimed priests to be incarnations of the Devil.9 All of these actions contributed to declining allegiance to the Church during and after the plague.10 Furthermore, the Black Death brought about a severe shortage of priests in Europe during the 14th century, as approximately 40 percent of the clergy died in the pandemic, in some places at a rate comparable to that of the general populace, leading the Church to accelerate the training of replacements. New practices enabled young priests to be ordained at age twenty instead of twenty-five, and holy vows to be administered to adolescents at fifteen years of age instead of twenty. These alterations resulted in a younger, more inexperienced generation of clergy, causing the Church to seem less omnipotent and more untrustworthy and prompting people to look elsewhere for spiritual help.11 During the plague years of the late 1340s, some of the harshest persecutions of Jews in history occurred in European countries, setting the stage for later events targeting the Jewish community. The Jewish population had been oppressed in Europe long before the plague, but during the pandemic, persecutions worsened drastically, becoming brutally inhumane as anti-Semitism intensified to a radical degree. Because of the decline of faith in the Church’s statements, ordinary people desperately sought an alternative explanation for the plague. Many found this in one of the most commonly outcast groups in medieval society: the Jewish population.12 Largely due to an incident in 1348, when a Jew named Agimet confessed under torture that he had poisoned wells in Venice, Toulouse, and elsewhere around the Mediterranean, Jews fell under growing suspicion of having intentionally caused the disease. Persecutions first started in southern France and Spain, where large numbers of Jews lived in constant tension with their Christian neighbors, especially as more rabbis began to follow a new mystical philosophy named Kabbalah. Kabbalah was very secretive and had witchlike connotations, which increased distrust of the Jews, especially after Agimet claimed to have been following the instructions of a Kabbalist rabbi when he poisoned the wells. These tensions led to the first of the violent incidents in 1348, when, in Spain, twenty Jews were murdered in cold blood and many more had their houses looted and burned. From there, the situation escalated as anti-Jewish rumors circulated throughout Europe, and many more common folk began to use the Jews as a scapegoat for the plague.13
Depiction of people praying for relief from the bubonic plague, circa 1350.
Anti-Semitism was spread further by the Flagellants, who carried it to every town they visited. Accompanied by frenzied mobs, they aggressively targeted Jewish communities, slaughtering and torturing entire families.14 A particularly horrifying incident occurred in Strasbourg in 1349, when approximately 900 Jews were burned to death on Valentine’s Day and the rest of the town’s Jews were exiled. This appalling event truly demonstrated the harsh brutality of this public mania; yet it was only one of a multitude of persecutions that happened in this period. The persecutions officially stopped after 1350, but this wave of unparalleled anti-Semitism had extensive aftereffects.15 In the aftermath, European towns remained extremely anti-Semitic, but most importantly, a clear social segregation between Christians and Jews started to occur. The persecutions reinforced the idea that Jews were filthy and immoral beings, unworthy of being treated by humane standards. The few Jews who escaped the purges fled to the borders of Europe, living in small, huddled communities on the edges of territories. Few attempted resettlement after that, fearing a repeat of the horrific persecutions that had befallen them before. In the wake of the plague and these persecutions, European anti-Semitism remained prevalent among Christians, as demonstrated by the Expulsion of 1394, which exiled all Jews living in France at the time.16 Although anti-Jewish feeling waned considerably over the next hundred years, it would have been impossible to completely erase the legacy of violent anti-Semitism that had so thoroughly pervaded the continent.
The Black Plague generally improved the standard of living for the non-Jewish European working class, however, introducing a more balanced economic dynamic between landowners and their serfs, as well as granting women increased autonomy. In the period following the pandemic, peasants found their rents reduced, or sometimes waived entirely, as lords scrambled to find enough workers to replace the large number that had perished from the plague.17 Because of the high demand for laborers, serfs gained new freedom and were able to ask for inflated wages and rights. If their demands were not met, they could easily find work on another lord’s estate.18 These conditions led many nobles to accede to their serfs’ requests in order to retain them. Ordinary laborers also took advantage. In Saint-Omer, France, textile workers at a certain factory gained three successive wage increases after rioting against their bourgeois employers.19 The augmented salaries naturally led to an improved standard of living for peasants and working-class laborers, shown in the shift from clay kitchenware to metal in many lower-class households in the years after the plague. A new social division also arose from these new economic conditions: the yeomen, a class of wealthy peasants in England that flourished during the 15th century, taking advantage of increased wages to buy their own land and cultivate it. As a further irony, the Plague also improved life for surviving women of all classes because of the large numbers of men who had died. Among the nobility and gentry, dowagers led fine lives; they were able to remarry easily, or live well on their own if they so desired, since they kept all the property and wealth accumulated by their late husbands. Working-class women took on more productive roles in the household, expanding their labor into a variety of new industries. By 1350, the business of beer brewing prominently featured women. The textile industry, too, became female-dominated, opening a whole new range of opportunities for women, who could now use their skills to become financially independent craftspeople and merchants.20 Because of the Black Plague, and the improved socioeconomic conditions it brought for the working class, nobles became fearful of losing their influence and tried to enact more severe wage regulations, further dividing the classes and resulting in a multitude of peasant rebellions across Europe. The revolutionary reworking of economic arrangements in the aftermath of the epidemic also prompted monarchs to decree regulations on the extent to which wages could increase. One example was the Statute of Laborers of 1351 in England, which set strict maximum wage amounts for all working-class positions. The English Parliament further
attempted to control laborers’ demands by imprisoning workers who overstepped their boundaries by demanding drastically increased wages. These harsh regulations fostered working-class resistance not only in England but also in other countries such as France and Italy, where similar laws appeared. European peasants and wage workers found the new edicts extremely unfair, a product of greedy upper-class interests, which led to an even greater social divide than before.21 As time went on, increasing discontent led to outright rebellion. In Florence, the Ciompi Revolt of 1378 occurred when raucous crowds of underrepresented workers marched into the streets.22 The Ciompi were independent artisans, craftsmen, and other workers, typically lower class, who were barred from representation in the Florentine government because they were not part of a guild.23 The rebellion demonstrated a radical new willingness among the lower classes to make a stand against their oppressors. Although it was not ultimately successful (the government the rebels created had only a brief reign), the Ciompi Revolt represented only one example of the growing tensions between social classes resulting from the Black Plague. The Peasants’ Revolt of 1381 in England further showed this trend of lower-class mobilization in the aftermath of the Black Death. The English working class, extremely displeased with the Statute of Laborers, violently rioted and killed prominent individuals such as the Archbishop of Canterbury, the royal treasurer, and wealthy merchants, lawyers, and royal servants. In all, 60,000 men, led by Wat Tyler, a blacksmith turned radical revolutionary leader, gathered to protest at Blackheath, outside London, forcing royal officials (including the adolescent monarch Richard II) to flee for their lives. After negotiation with the young king, English workers received increased labor privileges, but these were revoked soon afterward. Both of these revolts were completely unprecedented in the medieval period, as the lower classes had never before united against the wealthy estates in these types of large-scale insurrections.24 Even though the rebellions that came in the wake of the Plague rarely succeeded in the short term, their true historical importance lay in the fact that they planted the seeds of dissent against the class system and demonstrated peasants’ restless desire for equality and some element of democracy, aspirations that would help define later historical developments such as the ultimate decline of the feudal system in the 1500s.
The Black Plague brought a profusion of social change to all parts of Europe. Still, there remains no way to categorize the pandemic’s varied effects on medieval society as definitively “positive” or “negative.” The Black Plague slightly narrowed the economic gap between rich and poor, and it lessened the influence of the previously overbearing Catholic Church.25 However, the plague also brought greater class division and unrest and caused intensified discrimination against the Jewish community.26 The failed working-class revolutions that came out of the pestilence only bred more dissent, falling short of their goal of reducing the large social gaps that continued to exist in medieval society. Furthermore, the Jewish persecutions during the Black Plague represented a significant augmentation of anti-Semitic sentiments in Europe and provided the foundation for a trend that would span centuries: the continued segregation and scapegoating of the Jews. It can be concluded that although the Black Plague formed a new Europe, more secular and economically balanced, it was also a Europe that remained starkly divided.
Notes
1 See especially Norman F. Cantor, In the Wake of The Plague: The Black Death and The World It Made (New York, 2001).
2 “History,” Centers for Disease Control and Prevention, September 14, 2015, https://www.cdc.gov/plague/history/index.html (accessed May 3, 2017).
3 See Giovanni Boccaccio, The Decameron (translated by G. H. McWilliam, Harmondsworth, Middlesex, UK, 1972).
4 See especially Barbara W. Tuchman, A Distant Mirror: The Calamitous 14th Century (New York, 1978).
5 Ibid., 122.
6 Ibid., 114.
7 Gilles Li Muisis, Chronicle, 1350 (manuscript, Benedictine monastery, St. Giles, Tournai).
8 Tuchman, A Distant Mirror, 114.
9 Ibid., 115.
10 Ibid., 122.
11 Cantor, In the Wake of The Plague, 206.
12 Tuchman, A Distant Mirror, 122.
13 Cantor, In the Wake of The Plague, 148, 151, 155, 152, 153.
14 Tuchman, A Distant Mirror, 114.
15 Cantor, In the Wake of The Plague, 153-58, 156.
16 Tuchman, A Distant Mirror, 116.
17 Ibid., 119.
18 Cantor, In the Wake of The Plague, 203.
19 Tuchman, A Distant Mirror, 120.
20 Cantor, In the Wake of The Plague, 203.
21 Tuchman, A Distant Mirror, 120-121.
22 The Editors of Encyclopædia Britannica, “Revolt of the Ciompi,” Encyclopædia Britannica, https://www.britannica.com/event/Revolt-of-the-Ciompi (accessed May 4, 2017).
23 Ibid.
24 Dan Jones, “The Peasants’ Revolt,” History Today 59, no. 6: 33-39, History Reference Center, EBSCOhost (accessed May 3, 2017).
25 Tuchman, A Distant Mirror, 119, 123.
26 Ibid., 113, 121.
*
Fortifying Farmland: World War II in Little Compton, Rhode Island By Ben Shore ‘18
Historic USGS Map of New England, Sakonnet Point.
Most Americans think of World War II as having unfolded overseas. But in fact, the fighting nearly washed onto the East Coast of the United States, and the preparations for that possibility transformed both the land and the lives of those who lived there. World War II was, in large part, a war over resources. As Axis forces tried to prevent the shipment of American goods to Britain, the United States needed to protect its shores and ships from German harassment. In 1938, Little Compton, Rhode Island, a quiet, rural peninsula with a long history of strategic importance, was chosen to help guard United States ships and soil from German U-Boats, planes, and invaders. Little Compton was transformed by the resulting physical and social changes, leaving impressions of the war that remain evident today. Little Compton sits on the eastern side of the Sakonnet River, which gives ships access to deep-water ports such as Providence and Bristol. Sakonnet Point is the southernmost tip of Little Compton, jutting into the Atlantic Ocean. Due to the strategic importance of its geographic situation,
Little Compton has had a role in history that is disproportionate to its small size and sleepy, agricultural nature. Centuries ago, the small peninsula was inhabited by the Native American Sakonnet tribe. This land was important to local Native American tribes both spiritually, because it was the subject of creation legends, and practically, because it was blessed with fertile farmland surrounded by rich fishing waters.1 In the spring of 1676, during King Philip’s War, Awashonks, the female sachem of the Sakonnet tribe, signed a peace treaty with Benjamin Church, the captain of the first Army Ranger force in America, that protected her tribe from King Philip’s Native American forces.2 This treaty added strength to Church’s position, helping him to capture and kill King Philip, which led to the English settlement of Little Compton. English farmers and fishermen came to revere the place for its fertile farmland and ample fishing stocks; but it remained (and remains today) a rural place with no major commerce or industry, just farmers and fishermen plying their trades on land and sea. Two centuries later, as the Industrial Revolution transformed much of the East Coast into centers of industry, Little Compton became a summer haven for rich city dwellers. By the 1890s, Little Compton, or “the point,” was a quietly wealthy summer community. Its main attraction was a grand hotel, along with an exclusive fishing club on a tiny island off the point. The club was legendary for its unparalleled fishing and was visited by robber barons and statesmen alike, including Grover Cleveland and Chester A. Arthur.3 During Prohibition, a rum-running operation smuggled alcohol from vessels outside the three-mile limit onto the empty beaches, satiating the needs of vacationers and supplying the nearby city of Fall River, Massachusetts.4 During World War II, Little Compton proved its disproportionate strategic worth once again. Its geographic position jutting south into the Atlantic Ocean made it valuable to the United States military when the U.S. East Coast became a potential war zone. As World War II raged in Europe, naval strategies focused on access to war-related goods. This led to the Atlantic Ocean becoming a naval theater and placed the East Coast of the United States potentially at risk. After Russia joined the Allies, the food supply in Germany was drastically limited, and the Allies focused on preventing German attempts to import the resources necessary to sustain its war effort. The Allies attempted a blockade of Germany, hoping to limit its food and industrial resources. In response, the Germans tried to starve Britain, a highly populated island nation which needed millions of tons of imports every week to survive.
On the ocean, this conflict translated into a tonnage war involving shipping convoys. Under this system, it became immaterial what a given ship was carrying; regardless of its contents, if it were sunk, it could be replaced by another from its home country. German Grand Admiral Karl Dönitz, the creator of the theory of tonnage war, led Operation Drumbeat, which sought to disrupt shipping on the East Coast of the United States.5 Accordingly, the United States military took precautionary measures to defend against this German menace. In order to protect its merchant ships and its people from German attack, the United States developed naval weapons and built land defenses to protect the East Coast from German U-Boats, surface ships, and planes. To defend against Unterseeboote, or U-Boat submarines, the Navy developed depth charges, which sank and exploded at set depths, and later the Hedgehog system, which fired many charges, each set to explode on contact with a submarine.6 To defend merchant ships from attacking surface vessels, the Navy escorted transatlantic convoys with armed destroyers and built about thirty forts (some may not have been completed or declassified by the military) on the East Coast, armed with massive artillery. Finally, to defend against planes, the military installed anti-aircraft guns, even though it was unlikely that German planes would ever reach the United States. Because of its strategic position, sticking south into the Atlantic, Little Compton was chosen as a location for these land fortifications. Three reservations built on the Sakonnet peninsula made up Fort Church, a fort designed to protect some 145 degrees of ocean from Block Island to Martha’s Vineyard.7 Along with a similar fort on Point Judith, Fort Church would protect Narragansett Bay and Newport Harbor, which at the time featured the largest United States naval base, from German boats and landing parties. While Fort Church never fired a shot in anger, close encounters occurred. In one case, a submarine was detected on radar within half a mile of a Little Compton beach, prompting concerns about an impending landing party,8 and German submarine U-853 was confirmed sunk just ten miles offshore on the day after Germany surrendered.9 According to Ian Macdonald, a wartime resident of Little Compton, “the submarines were here.”10 Fortunately, by the time of any likely sea encounters, the United States military had heavily bolstered the point’s defenses against German vessels.
Starting in 1939, shortly after a devastating hurricane had leveled most of the buildings on the point, the army built batteries, bunkers, and other auxiliary buildings to mount guns, watch the sea, and house soldiers in Little Compton. The military seems to have foreseen the need for coastal defenses, because it started to prepare for transporting large artillery as early as 1934.11
Photos of Battery Gray, 1946.
Fort Church included the West Reservation, the East Reservation, the South Reservation, and an observation station.12 Because Little Compton was so rural, the army took great care and expense to camouflage its battlements as agricultural and residential structures. Huge guns were mounted in bomb-proof concrete batteries, covered with earth and vegetation. Battery Gray housed two casemated 16-inch-in-diameter guns, fortified under concrete and earth.13 The 16-inch guns were originally destined for a battleship in 1918, but the Navy had too many ships at the time, so the guns went into storage until 1939, when they were brought to Little Compton.14 These huge weapons, mounted 500 feet apart and connected by an underground tunnel, measured 68 feet long and weighed 143 tons.15 They were loaded with blue silk gunpowder bags and took several seconds to fire a shell weighing nearly one and a half tons a distance of up to 26 miles. Battery Reilly housed two 5-inch guns under additional reinforced concrete. These batteries were constructed with 8,000 psi concrete (most industrial concrete is no more than 4,000 psi) with one-inch steel reinforcement. (Years later, when residents wanted Battery Reilly’s tunnels destroyed for liability reasons, the construction company failed in attempts to blast the concrete and had to resort instead to filling around it with earth.)16 Battery 212, in the South Reservation, had one 8-inch gun, two 155-millimeter guns, and some anti-aircraft installations. The anti-aircraft guns proved impractical, as German planes could not reach the East Coast and the technology of the anti-aircraft guns was limited.17 There were also machine guns mounted every fifty yards or so in a long eight- to twelve-foot-deep trench
running along the shore of the Sakonnet River. The U.S. military had prepared for a landing force coming off a submarine and sabotaging the fortifications or redirecting the 16-inch guns at Newport or Point Judith. They also limited the big guns’ fields of fire so that, in the event of enemy capture, they could not be aimed to fire on other forts or on Newport. Watching the sea was an important function at Little Compton. To do so, lookouts sat on top of the batteries and in lookout houses on the East Reservation and on Warren’s Point. Some lookout posts were fitted with binoculars that could transmit information to the bunkers at the push of a button, helping to pinpoint submarine locations. On Warren’s Point, a radar station helped to spot surface ships. Like the rest of the outbuildings in
Harbor defenses of Narragansett Bay, fields of fire from Little Compton.
New England Division Corps of Engineers, Fort Church layout.
and around Fort Church, the lookout houses were designed to blend in with the rural New England architecture. To house soldiers, the army built barracks with wood from trees knocked down in the 1938 Hurricane. At the height of operations, about a thousand military personnel lived in Fort Church.18 For a town of Little Compton’s size, this was a huge influx of people. The army also built a movie theater and a gym to keep the soldiers occupied. For safety there were several gas- and bomb-proof bunkers with double doors and ventilation systems.19 There were also a number of fire control structures, stations for directing the aim of the fort’s guns. While the bunkers and fire control structures were off-limits to the public, the army gave town residents access to the gym and movie theater. The soldiers and the activity they brought with them provided a welcome distraction for a town still reeling from the effects of the Hurricane of 1938, which had destroyed what little infrastructure had been there before. During World War II, Little Compton experienced a period of economic upswing, social harmony, and integration with the military. After the Great Depression and then the 1938 storm, Little Compton inhabitants endured severe financial difficulties. In 1939, when soldiers arrived with salaries of thirty to forty dollars per month, much more money flowed into Little Compton, which helped the economy.20 The town experienced a social upswing as well, thanks to the influx of young men and the presence of military recreational buildings. Townspeople went to see movies for a quarter21 and enjoyed playing ping pong in the gym.22 A small fisherman’s bar had been built at the tip of Sakonnet Point on the site of the grand hotel that had been destroyed by the hurricane. The bar, called “The Foc’s’le,” changed from a dreary fisherman’s watering hole to a lively bar with music and dancing. It held an annual Fisherman’s Ball, where fishermen, soldiers, and townspeople gathered. They even held a contest for the largest fish caught.23 World War II brought prosperity and activity to the town at a time when it badly needed both. Judging by newspaper articles of the time, Little Compton residents took pride in their role in the war, though they did not particularly fear being attacked by Axis powers.24 After the war ended, Little Compton continued to feel its effects. A few soldiers, like Mogens Eskelund, married women they met in Little Compton and settled down on the point.25 Most soldiers left, however, leaving Little Compton with decreased economic and
social activity. The land that the army had bought from townspeople to build the fort was returned to the former owners. For some families, like the Taylors, this transition was easy. The European grape grower that the Taylor family had brought into the United States to work for them was able to continue his work thereafter on their property, which had briefly become a commander’s headquarters during the war. But other families were not so lucky, as their land had been substantially transformed. Ian Macdonald, who later became a submariner, found inspiration as a boy in the continuing presence of military equipment. To this day, he remembers “what a great playground it was.” He and other boys spent their summers playing tennis, going to the beach, and horsing around with the still fully operational, but poorly guarded, 16-inch guns. As a boy, Macdonald recalled, he attended a blindfolded mystery party in the dark caverns of Battery 212. He currently lives in an old World War II army barracks.26 Other townspeople might not have been as happy with the military presence in Little Compton. The fort occupied significant space, changed the landscape, and cost millions of dollars, but it never became necessary for combat. After 1949, the gunpowder was destroyed and the guns removed, but the remaining fortifications leave an indelible imprint of the war on Little Compton. In an attempt to cover up what some now see as an ugly and expensive blunder,27 much of the remains of Fort Church have been physically buried. Little Compton’s role in the war is rarely mentioned now in conversation. It seems almost as if the town has forgotten the huge mounds of earth and concrete that once bristled with guns and defended American water and soil. Those who were young men and women living in Little Compton at the time remember World War II as a moment when the town enjoyed an increased sense of importance and activity. Recently, Battery Gray has been cleared of undergrowth. It remains to be seen whether Fort Church, long buried under bittersweet, buckthorn, and wild roses, will become a point of pride for Little Compton once again. Because World War II affected transatlantic trade, Little Compton had a strategic role to play in U.S. military preparations. Its geographic position, sticking south into the Atlantic Ocean, became valuable again, as it had once been to local Native American tribes, early European settlers, fishermen, rum runners, and others. Fort Church played a large hypothetical, if not actual, role in defending Newport and Narragansett Bay. The army’s presence changed the nature and layout of the town, livened up the economy, and transformed the social scene during the war.
After the war ended, Little Compton resumed its identity as a coastal backwater, a place of farms surrounded on three sides by the sea. Still, physical evidence of the fortifications remains, part of a World War II landscape that lives on in the hearts and minds of a few of its townspeople. With luck, the stories that accompany the now-overgrown wartime structures will continue to be passed down from generation to generation so that Little Compton’s role in the war might be remembered.
Notes
1 James C. Garman and Michelle G. Styger, Sakonnet Point Perspectives (Sheridan, PA, 2011), 8.
2 Samuel G. Drake, The history of King Philip's war; also of expeditions against the French and Indians in the eastern parts of New-England (Boston, 1825), 155.
3 Garman and Styger, 28.
4 Garman and Styger, 80.
5 Timothy Mulligan, Neither Sharks Nor Wolves (Annapolis, MD, 2011), 53.
6 “Hedgehog,” Destroyer Escort Historical Museum, <http://www.ussslater.org/tour/weapons/hedgehog/hedgehog.html> (accessed 25 May 2017).
7 U.S. Army, Fort Church, Little Compton Historical Society, World War II Archive: Brownell Library, Little Compton, RI.
8 Janet Lisle, A Home By The Sea (Little Compton, RI, 2012), 325.
9 Adam Lynch, “Kill and Be Killed? The U-853 Mystery,” Naval History Magazine, June 2008, <https://www.usni.org/magazines/navalhistory/2008-06/kill-and-be-killed-u-853-mystery> (accessed 25 May 2017).
10 Ian Macdonald (interview by the author), May 23, 2017. Ian Macdonald came to Little Compton in 1946, but everything in Fort Church was still completely operational at the time, and the two 16” guns at Battery Gray remained active for many years. During this time as a young boy, Ian learned everything there is to know about 16” guns, to the extent that he was inspired to go into the Navy. Due to his fascination and experience with the Navy and all things military in Little Compton, he is an authoritative voice on the history of Fort Church.
11 Ibid.
12 Lisle, 319.
13 U.S. Army, Fort Church.
14 Ian Macdonald, May 23, 2017.
15 Untitled 1939 Newspaper Clipping, Little Compton Historical Society World War II Archive: Brownell Library, Little Compton, RI.
16 Ian Macdonald, May 23, 2017.
17 Ibid.
18 Walter Elwell (interview by the author), May 24, 2017. Walter Elwell was born in 1934, and his father was the head of plumbing for Fort Church. His father’s position meant he was able to enter secured military areas, so he has great memories of what the operational military structures looked like and how they functioned.
19 U.S. Army, Fort Church Report of Completed Works, Little Compton Historical Society World War II Archive: Brownell Library, Little Compton, RI.
20 Ian Macdonald, May 23, 2017.
21 Walter Elwell, May 24, 2017.
22 Lisle, 327.
23 Garman and Styger, 105.
24 Untitled 1944 Newspaper Clipping, Little Compton Historical Society World War II Archive: Brownell Library, Little Compton, RI.
25 Mogens Eskelund, Little Compton Remembers World War II, 1945-1995 (Fall River, MA, 1995), 47.
26 Ian Macdonald, May 23, 2017.
27 Lisle, 329.
*
“The Submarines Were Here”: A Short Story By Ben Shore ‘18
Historic USGS Map of New England, Sakonnet Point.
FACT: Fort Church was a World War II military complex in Little Compton, Rhode Island. It was equipped with two 16-inch-in-diameter guns, among other large artillery, and a radar station. While Fort Church never fired a shot in anger, a number of close encounters occurred. A submarine was detected on radar within half a mile of a Little Compton beach, prompting concerns about an attack by an enemy landing party,1 and, among other encounters, the German submarine U-853 was confirmed sunk by a navy destroyer just ten miles offshore, after it had sunk a U.S. coal collier, on the day after Germany surrendered.2 According to Ian Macdonald, a wartime and current resident of Little Compton, “the submarines were here.”3
0715 June 10, 1950, FORMER WEST RESERVATION OF FORT CHURCH, LITTLE COMPTON, RHODE ISLAND
Walter crouched down tentatively, careful not to slip on the cold, slick metal as he sat, straddling the end of the gun barrel. Tucking his hands into his dad’s waxed coat, he shivered. From his perch, Walter could see the peachy morning sunlight spread across the horizon in front of him. He looked out over the golf course and farmland to the Atlantic
Ocean, squinting at what he thought must be Mr. Mataronas’s lobster boat, pulling traps by the lighthouse, and the Alison Rose, tied up to the dock, unloading fish after a night offshore. The harbor was beginning to buzz, the way his tube radio slowly warmed up when he turned it on. As a small boy, Walter had heard the gun he was now sitting on fire in training exercises. He had become fascinated. He knew of the silk bags of gunpowder, and how long it would take a 2,400-pound shell to leave an eighty-foot-long barrel. He knew how the aiming systems in the bunker could direct the gun to hit a dinghy two miles off the coast—farther than he could see this hazy morning. But all that—the gunpowder, the shells, the electronics, the soldiers—was gone now. In 1949, four years after Germany’s surrender, lines of trucks rumbled down West Main Road, right past Walter’s grandparents’ house, transferring equipment and shutting down all operations at Fort Church. The fear of German U-Boats had by then subsided. As Walter sat on the gun barrel on that peaceful morning, the threat of war seemed long past. So, when he heard the crackling of a radio coming from within the bunker, he was convinced it was some saltwater stuck in his inner ear. After twisting a finger in his ear, he still heard the radio; now it sounded to him like angry German. He was so startled he lost his balance, slid off of the barrel, and flopped onto the wet turf below. Suddenly, his twelve-year-old imagination running wild, he became certain it must be “the Germans” in the bunker. Walter scrambled to get up and then sprinted home, as if he had been shot from the barrel over the golf course. He didn’t slow down until he got to the break in the stone wall behind his house. Meanwhile, Mrs. Cluett eyed the telegram on the dinner table. It stated the decision of her oldest son, Alden, to stay in Cape May with the Navy instead of returning home for the summer. She was proud of his dedication but missed the days of watching him play with his friends at the beach and on the tennis courts. She picked up her empty coffee mug and walked to the sink to clean it. From the kitchen window she saw her son Walter bound around the corner of the barn. He looked just like his older brother, she thought. He had his same sandy mop of hair that bounced as he jogged up the terrace.
Seeing him approach the back door, Mrs. Cluett called out, “Wipe your feet!” She cocked her head when she did not hear Walter’s usual exasperated response. She put down the mug and dishtowel she was holding and stepped back to see Walter, wide-eyed, standing frozen on the doormat. No stranger to Walter’s wild imagination, she asked him, “What’s up?” Walter, still a little out of breath, and still half-convinced that he had just been hearing things, responded, “I think there are Germans in Battery Gray.”
0730 June 10, INT’L SHIPPING LANES, 40 MILES OFF THE RHODE ISLAND COAST
“Vierteldrossel in Richtung Newport sofort,” said the captain, expressing his desire to proceed towards Newport at quarter-throttle. The captain, moments ago, had heard the same command from his superior, via radio message from a high-altitude plane. “Ja, Kapitän,” responded the man in the captain’s quarters, responsible for relaying the information to the four men in the conning tower. This lieutenant, who was missing home dearly, realized with great cynicism that, no matter the outcome of this mission, he would not have to live aboard this submarine any longer.
0812 June 10, LITTLE COMPTON, RHODE ISLAND
Lieutenant Elwell braked hard, seeing the driveway of 25 Sakonnet Point Road at the last second. His coffee spilled onto the metal floor of his Jeep, and he cursed. Rumbling down the straight gravel driveway, he saw Mrs. Cluett waving to him on the porch, prompting him to speed up though she was waving for him to slow down. She had called him while he was reading the paper, and he was not eager to spend time attending to the tip of some twelve-year-old. In the Cluetts’ kitchen, Walter, his mother, and Lieutenant Elwell quickly discussed the events of the morning. Lieutenant Elwell, convinced it was nothing but radio interference, said he would check it out and offered
to bring along the boy for fun. Walter and he climbed back into the Willys jeep and cautiously pulled out of the driveway. After arriving at the bunker, Walter, trying not to let on how often he had been playing around the old gun, guided the Lieutenant through his old haunt, to the exact spot where he had heard the disturbing noise earlier that morning. On a hunch, the Lieutenant led Walter around the back of the bunker, through a year’s worth of overgrowth. He took out a hefty key and unlocked the one-inch steel personnel door to the left of the main gates. To his great surprise he saw the flickering lights of the intercept radio. “Holy mackerel. That’s only supposed to turn on when it intercepts a long wave radio communication—the kind the U-Boats used to use,” said Elwell, as he felt the familiar rise in heart rate that accompanied the submarine scares of his army days. Walter felt like he was in a dream; in his mind’s eye, he saw a giant black submarine steaming right toward him as he swam helplessly in the ocean. “This radio was hard-wired into the underground power lines to this bunker, so I guess it must have stayed on the whole time,” Elwell explained to Walter, as he walked up to the radio. He swiped some dust off of it and looked at the ticker, which showed the last intercepted message. It read 0742. “We must have missed an interception,” said Elwell, looking off into the distance and pausing before he asked Walter, “Do you remember anything of what you heard?” “I don’t speak German at all! All I can remember is, um, I think I heard the word Newport.” “That’s impossible, the Krauts would never say a real place; they would encode it,” said Elwell, beginning to think Walter had been hearing things. “Unless whoever or whatever that came from thinks they aren’t being listened to anymore.” His already elevated heart rate now went up a few more beats per minute. He gathered himself. Without letting Walter know about his state of near panic, Elwell guided him back to the Willys and drove him back home to his fretting mother.
1920 June 10, SAKONNET POINT ROAD, LITTLE COMPTON, RI
By now, Walter was starting to tell himself that he had imagined the whole crazy morning. He sat eating supper and told his mother about his day at the beach. His best friend, Ian, hadn’t believed his story. It only heightened Walter’s frustration over no one believing him when his mother told him that Mr. Mataronas was missing a whole string of lobster pots, even though the ocean had been as calm and windless as a pond for days. That was such an extraordinary occurrence, she said, it was all anyone at the harbor could think or talk about. Walter imagined eight twenty-foot bronze propeller blades twisting up and snapping the lobster gear the way he wiped a spider web off his face when he was playing in the hayloft of their barn. “Did he tell you where he lost it?” “Just south of the lighthouse.” Walter wouldn’t sleep much that night. He needed to let Lieutenant Elwell know about those lobster pots.
0018 June 11, WARREN’S POINT ROAD, LITTLE COMPTON, RI
Walter laid down his bike and flipped off his flashlight, leaving the two in the long grass in front of Lieutenant Elwell’s house. He had snuck out of his own house that night, unable to bear the anxiety any longer. Two long, black Cadillac Series 62s lounged in the driveway. Through summery linen curtains, he could see the shadows of several men sitting around Lieutenant Elwell’s kitchen table. Approaching the window, he could just make out their saying that the radio signal was most likely from hundreds of miles offshore, and that they had time to prepare defenses. Walter thought of the lost lobster pots only a couple of miles offshore. Considering that he was already in trouble for sneaking out, he decided to go into the house and tell them about the missing lobster pots. The men, who had made the drive from the Newport Navy base, jumped at the sound of the screen door creaking open but were relieved—and not too surprised—to see a Cluett boy snooping around on a summer’s night. When Walter told them about the missing lobster gear, they felt naturally apprehensive about his theory. Who would put his reputation on the line to support the crazy notions of some wild-eyed young boy? They appreciated his efforts to help, however, and, as they had little other
evidence to go on, started to become increasingly convinced by Walter. At nearly 0200, they called the radio room at the Cape May Naval Base, knowing the consequences of staying quiet and risking the lives of Rhode Islanders. They delivered a message to the Chief of Naval Operations, Harry Yarnell, about the intercepted radio transmission and the mysterious harbor gossip. The next thing that Walter, who had fallen asleep on a couch, could remember was being carried up the porch of his house to his perpetually fretting mother. He was furious with himself for getting caught and, more importantly, for having fallen asleep at such an important moment. He hated falling asleep. Nonetheless, he was sleepily content with his service as his mother tucked him into one of the two beds in his room.
0900 June 13, 7 MILES SOUTH OF RHODE ISLAND COAST
Walter’s older brother Alden stepped out onto the sunny deck of the USS Mark. He had left Cape May the night before with the rest of the crew, headed for Narragansett Sound, off the coast of Rhode Island. Newport had made radar contact with a submarine in the middle of the night of June 11th, confirming the tip of a retired lieutenant who had intercepted a German radio message. Alden was now on a mission to hunt a U-Boat, something the men had been prepared to do for months, but it was nonetheless a daunting task. With a pang of homesickness, he realized that this could be the closest to Little Compton he would get for another nine months, when he was scheduled to return home. By now he could just make out the slender, windswept tip of Little Compton, and its pristine white lighthouse. Alden was surprised to see them headed this close to shore, knowing recreational vessels were all around them. He was deep in a daydream, about his days sailing Sunfish out of Sakonnet Harbor, when a blaring siren shattered the quiet shushing of the hull piercing the mirror-flat water. That meant the sonar must have picked up a submarine, and the men needed to get to battle stations. It also meant the submarine could know where the destroyer was. Now would begin the hunt. The destroyer lured the U-Boat valiantly, knowing the submarine would not miss the opportunity to sink a
U.S. warship in open water. The Navy did not want to put any civilians or their boats in harm's way. By 1500 that afternoon they had unloaded over a hundred depth charges, discharged the hedgehog device five times, and missed the sub every time. Whoever the German captain was, he must certainly have been skilled. At 1630 the destroyer pinged the U-Boat again. Realizing the ship was running low on munitions, the destroyer's captain decided to unload all it had in a last-ditch effort to sink the sub. They had survived several torpedo misses now and did not want to risk the enemy getting another shot. At the captain's command, Alden pulled the iron bar out from the chute, releasing twenty-five depth charges into the oily water. One after the other, they fell from the ship, dutiful mercenaries seeking black German steel.

1628 June 13, FORMER FORT CHURCH WEST RESERVATION, LITTLE COMPTON, RHODE ISLAND

Walter had no idea whether the superiors that Lieutenant Elwell had contacted had taken any action, or whether there was ever a submarine at all. Nevertheless, he had skipped sailing class and was avoiding the water entirely. This afternoon, Walter was sitting on the gun barrel again. He imagined hunkering down in the bunker, loading a shell from the long hanging tracks, packing in the silk bags of gunpowder, and finally aiming the gun at a pinpoint target—a U-Boat. Walter looked at his friends out sailing their Sunfish, their once-colorful sails turning to black silhouettes against the sliding sun. Shifting his eyes upward, he saw the plumes of clouds on the horizon, like billows of smoke from some far-off fire.

Notes
1 Janet Lisle, A Home By The Sea (Little Compton, RI, 2012), 325.
2 Adam Lynch, "Kill and Be Killed? The U-853 Mystery," Naval History Magazine, June 2008, <https://www.usni.org/magazines/navalhistory/2008-06/kill-and-be-killed-u-853-mystery> (accessed 25 May 2017).
3 Ian Macdonald (interview by the author), May 23, 2017.
Unraveling the Socialist Sisterhood: The Conflict Between Ideology and Reality in the Soviet Liberation of Women

By Jaclyn Mulé '18

When Marxist feminist Alexandra Kollontai excitedly returned to Russia from exile upon the outbreak of the Russian Revolution in 1917, she could not have known how communism in Russia would ultimately affect the role of women. She and Vladimir Lenin envisioned an ideal communist society that had freed women to participate in the workforce by alleviating all domestic burdens. The recent sequence of wars, however, had left the early Soviet Union with a gender imbalance that demanded women's labor, and as the state prioritized industrial work over the domestic sphere, the birthrate inconveniently fell. Soon enough, a fundamental question confronted Soviet feminism in practice: how could the state at once encourage women to reproduce and fully utilize their labor potential? Determined to revolutionize Russia's economy, the government began to decrease its emphasis on women's individual liberties, shifting away from the original Bolshevik feminist ideals. Lenin's Marxist ideology had rooted women's oppression fully in the institution of private property and claimed that communism would eradicate gender inequality. Yet, ironically, social and wartime economic realities resulted in a plummeting birthrate, which caused Soviet premier Joseph Stalin to exploit women as devices for the goal of economic advancement, doubling their workload and diminishing their freedom.

In the nineteenth century, Marxists argued that communism would liberate women by eradicating patriarchal power, which they identified as symbolic of capitalism. Karl Marx and Friedrich Engels, the most influential communist intellectuals, had long considered women's subjugation an example of class oppression and saw the husband as representative of the bourgeois institution of private property. The bourgeois family, they claimed, rested on a foundation of female enslavement, as the husband expected his wife to perform domestic tasks without pay.

[Image: Karl Marx]

The husband
embodied the "exploiting capitalist," while his wife symbolized the "oppressed proletarian."1 Claiming that economic resources cemented patriarchal power, Marx and Engels held that only the complete eradication of capital could revolutionize the family. In order to perpetuate the proletarian revolution, they sought to change the "sexual ties" that composed the foundation of the household by limiting the "family's functions to that of social production and consumerism."2 A perfect communist world would nationalize production and "expand public education, childcare, and the social provision of housework," transferring tasks from family to society. Disintegrating the family as an economic unit would liberate women from housework, allowing them to overcome their husbands' oppression and contribute to public life. Moreover, the equalization of economic resources and legal rights would create "sexual symmetry" in families.3 As a result, communism would free women legally, economically, and domestically.

In the years preceding the Revolution of 1917, Vladimir Lenin and the rest of the Bolshevik Party upheld the beliefs of Marx and Engels and firmly considered the family a symbol of capitalism. According to the Bolsheviks, the family rested on a base of unpaid domestic labor, performed solely by women. Marx, Engels, and Lenin argued that only the destruction of capital could change the "economic dependency relations" within the household that prevented its reformation. As an individual economic unit, the family represented capitalist inequity, and women's oppression reflected workers' oppression. Men easily usurped the dominant position because they were the primary breadwinners.4 Women, on the other hand, found themselves confined to "petty, stultifying," and "unproductive" work in the home, as Lenin disdainfully described domestic tasks in his 1919 speech to the Fourth Moscow City Conference of Non-Party Working Women in Moscow.5 The Bolsheviks believed that a communist government would overcome domestic labor, which they saw as fruitless, by transferring it to the larger society and utilizing the wasted womanpower in the labor force. The wages earned by women would inevitably shift the power dynamics within the family and deprive husbands of the economic superiority necessary for patriarchal control.6 Convinced that capitalism stood at the root of women's subordination, Lenin believed that its replacement with communism would transform sexist attitudes. The family stood as another bourgeois barrier that the Bolsheviks had to overcome in order to achieve an ideal communist society.
Building on these precepts, the Bolsheviks envisioned idealistic policy to relieve women from the domestic burdens of motherhood and empower them to join the workforce. In 1919, Lenin declared his beliefs in a speech to the Fourth Moscow City Conference of Non-Party Working Women in Moscow: "Private property . . . keeps . . . women in a state of double [poverty and wage] slavery." To Lenin, both husbands and Russian law demeaned women as "private property," easily controlled by their oppressive husbands and enslaved by housework. Later in his speech, Lenin argued, "Even when women have full rights, they still remain factually downtrodden because all housework is left to them."7

[Image: Vladimir Lenin]

In the eyes of Lenin and the Bolshevik Party, only the communalization of domestic services, as well as the reorganization of the individual family, could emancipate women from "household bondage."8 These services would provide women with flexibility and choice, breaking the chains of domestic servitude. Lenin planned to establish large-scale state domestic services to enable women to join the labor force as productive individuals dedicated to the development of a supreme industrial economy—the Soviet Union's foremost goal. Lenin also believed that the success of the revolution rested on the support of proletarian women, especially since he viewed them as an undervalued economic force. Despite his support of women's rights, he disdained feminism for fear that any particular group might threaten the collective Bolshevik identity.9 Moreover, the Bolsheviks claimed that feminism promoted capitalism by seeking to liberate only propertied women.10 At first, this argument appears to contradict the Bolshevik focus on liberating women; according to communism, however, collective identity and dedication to the state should always prevail over individual liberties, even those of subjugated women.

Immediately after the revolution, Lenin strove to implement policies that reflected Marxist ideology and emancipated women. One of Lenin's first actions was to legally establish women's equality and invalidate sexist laws through the far-reaching Code of 1918. Primarily, he changed the laws regarding religious marriage, joint property after marriage,
and barriers against divorce. First, he replaced religious marriage—which he considered a patriarchal expression of capitalism—with civil marriage, trying in general to destroy the power that the Russian Orthodox Church held over the public. In addition, the code legalized divorce at the "request of either spouse," serving to equalize the sexes as marital partners under the law. The code also reversed the power held by husbands over their wives' property after marriage, establishing that "neither spouse had any claim on the property of the other." These laws executed the Marxist desire to destroy the family as a legal and commercial unit, thus annihilating a major component of capitalism in the newly communist society. Even so, many Bolsheviks viewed the code as "transitional" because they assumed that the capitalist institution of marriage would soon disintegrate altogether.11 The revolutionary laws concluded in 1920 with the Decree of October 18, which legalized abortion.12 Six years later, the Code of 1926 continued the communist campaign against the traditional family by decreeing de facto (cohabitation) marriage equivalent to registered marriage.13 Together, these laws represented the shift from Russian traditions to a new communist order, as well as the crusade against all customs deemed bourgeois.

Although Marxist ideology did indeed motivate Lenin and his successor, Joseph Stalin, to alleviate women's domestic responsibilities and mobilize them for the labor force, the depletion of the male population following recent wars served as the foremost incentive. World War I, the Revolution of 1917, and the Civil War in Russia that followed had left the Soviet Union with a deficit of male workers, which inhibited Stalin's vision of cultivating the communist country into an industrial superpower. In order to fill the vacant positions with women workers, Stalin enacted the First Five-Year Plan of 1928, a list of economic goals that encouraged the mobilization of women for the workforce. This 1928 Plan immediately increased the number of women in heavy industry. Stalin sought to utilize female workers, whom he recognized as an overlooked resource, on communal farms and factories.14 Over a decade later, World War II worsened the labor crisis, driving Stalin to replace men who became soldiers with female employees. In fact, by the mid-1940s, the success of the Soviet economy rested on womanpower.15 Throughout the first twenty years of the Soviet Union, women came to participate in a wide range of professions, and by 1945 composed 56 percent of the national workforce.16 For example, women had a near monopoly in the realms of medicine, dentistry, and pharmacy. Whereas in 1913, they composed 10 percent of physicians, by 1959 that number had leapt to 79 percent.17 Once confined to
the domestic sphere, women had grown critical to the labor force and economic success of the Soviet Union, primarily due to Marxist policies.

Remarkably, although Lenin's and Stalin's idealistic laws had positive effects on the proportion of workingwomen, they also diminished the motivation to procreate, depleting the future labor force and alarming the regime. Determined to heed the tenets of Marxism, the early Soviet Union failed to consider the difficulty of reconciling family work and freedom of divorce with the responsibilities of parenthood.18 Suddenly, the policies appeared to backfire.

[Image: Joseph Stalin]

The abuse of the divorce and de facto cohabitation laws eroded marital and family stability, as husbands began to neglect their marriages and families, deserting impoverished wives.19 By 1925, half of marriages in Moscow ended in divorce, which proved a "special hardship" for Soviet women searching for work.20 Meanwhile, increased numbers of juvenile delinquents and deserted children roamed the streets. The most pressing issue, however, was the economic consequence of disintegrated family stability: the drastic drop in the birthrate, to which the legalization of abortion had also contributed. The lack of a future Soviet labor force loomed ominously in front of Stalin, who had attempted to center policy on industrialization. Although the social programs echoed Marxist ideology, upholding the goal of communism, they had also reaped negative results with a significant impact on the present and future economy of the Soviet Union.

The failure of promised social programs to materialize compounded the birthrate problem, as workingwomen no longer had sufficient time to dedicate to their children, and state institutions did not offer adequate domestic support. The Bolsheviks dreamed of fully utilizing womanpower in the workforce while simultaneously encouraging reproduction. They intended to fulfill this ambition by providing socialized programs that would assuage domestic burdens. In theory, these programs would enable
women to hold full-time jobs without diminishing their incentive to reproduce; women would have time and energy to devote to both labor and procreation. The few extra-familial agencies that actually materialized, however, proved unable to deliver the already-limited material resources demanded by workingwomen. Even programs that successfully offered services had drawbacks. For example, long lines for public dining and laundry hindered women from utilizing the services.21 Furthermore, the government simply disregarded its plans to establish kindergarten facilities and birth control information centers.22 As a result of these realities, women largely remained responsible for the same domestic tasks as they had under capitalism. Their new job responsibilities exacerbated their overall workload, deterring them from reproducing. Consequently, the birthrate plummeted throughout the late 1920s and early 1930s.23

[Image: Soviet poster honoring the contribution of women in industrial development.]

Recognizing the mass manpower required for industrialization, Stalin began to panic. In order to encourage childbearing, the Soviet premier now backtracked on his original policies, instituting a cult of domesticity and prioritizing the welfare of the state over a woman's individual liberties. The Bolsheviks had previously attempted to dismantle the family, which they criticized as embodying capitalism, but, beginning in the mid-1930s, they enacted a sudden reversal of strategy. The regime now exalted the family as a "microcosm of the new Soviet order."24 A flurry of legislation passed in 1936 and 1944 abolished common-law marriage, erected barriers to obtaining a divorce, and outlawed abortion. Stalin also sought to dictate the nature of his budding labor force; he made the Soviet education of children a parental duty, so that future citizens would exhibit the Soviet ideals of "self-discipline" and "punctuality."25 Moreover, the Soviet Union
staunchly supported large families and domestic motherhood—a complete reversal of Marxist thought. Marital and family relationships evolved into a public concern. The government prioritized societal consequences of marriage, such as reproduction, over the need for individual freedom, viewing equal treatment of women as less important than family stability. Despite the fact that Soviet women participated in the workforce at higher rates than women in any other country on Earth, the state saw physical reproduction as a woman's most valuable purpose.26 Indeed, encouragement of reproduction swiftly developed into a cult of motherhood. The state sought to provide incentives for reproduction by awarding women extravagant titles for raising a certain number of children. For example, if a woman raised ten children, the government endowed her with the honor of Mother Heroine; if she raised five or six, she received a motherhood medal. The regime declared that socialism liberated women to focus on their most useful function: instilling communist values in their children.27 Even as the century progressed and a new political ideology held dominion over Russia, women remained enchained to the very domestic sphere from which Marxism had sought to liberate them.

By the early 1940s, the government expected women to participate fully in both the domestic sphere and the workforce, tasking them with a double burden and shaping them into tools for reproduction and manual labor. Prior to the revolution, women's responsibilities revolved mainly around the household, which, Marx and Engels had argued, made them slaves to capitalism and served as the root of their subjugation. While communism had successfully broadened women's labor options, it had failed to emancipate them from household responsibilities. As a result, women became less free, especially as the government encouraged more childbirth, which in turn created even more domestic tasks.

[Image: Poster celebrating Russian womanhood after the Bolshevik Revolution of 1917.]

Moreover, husbands viewed
any male interference in traditionally female responsibilities as offensive to their masculinity and refused to relieve their wives of some of the domestic burdens.28 Yoked to the domestic sphere and yet also forced to perform manual labor at the same rate as men, women found themselves with a doubled workload. This additional burden prevented women from achieving the same levels of productivity as men, which, in turn, justified traditional gender prejudices and led to lower pay.29 They were free to participate in all careers but without equal opportunities for advancement. Soviet leaders such as Stalin wanted women to bear labor burdens but refused to give them high political office, and very few women held prominent positions, especially within the government.30 Despite the participation of women in the labor force and the diversification of their professions, they remained at the lowest rungs of the job ladder.31 In addition, women received on average two-thirds of the wages enjoyed by men.32 A woman's uterus, childbearing ability, and labor capacity belonged to the state. Instead of being the private property of a capitalist husband, women became objects exploited by Stalin's regime for economic advancement—both as laborers and as the mothers of valuable future generations.

As the father of communism, Karl Marx had supported women's liberation, deeming sexism a product of capitalism. He argued that capitalism endowed wage-earning husbands with economic power over their wives, who suffered from the burden of domestic tasks. Like his idol, Lenin considered women's domestic subjugation an obstacle to the establishment of communism, which required immense manpower for industrialization. Therefore, both ideological and practical matters drove Lenin to legally equalize women immediately following the Revolution of 1917. Practicality, however, soon took precedence over women's equality, when many of the progressive reforms backfired during the late 1920s and early 1930s, driving down the birthrate. Stalin saw industrial empowerment, which necessitated a future labor force and therefore current reproduction, as a more significant concern than women's liberation, and he soon instituted a cult of domesticity in an attempt to solve the birthrate problem. Moreover, the idealistic social programs proposed by Marx, and then Lenin, which in theory should have relieved women of domestic tasks and thus freed them to join the workforce, failed to meet the increasing demands of workingwomen. Burdened now with the double weight of manual labor and domestic responsibilities, women actually saw their freedom diminish under the early Soviet Union.
The regime had fervently denounced the way that capitalism supposedly had caused inequality of the sexes, as women became the private property of their husbands. Hypocritically, the communist regime fashioned women into the property of the state, using them as instruments for labor and reproduction. Conflict between women's emancipation and communism may have been inevitable because feminism focuses on the freedoms of the individual woman. Although classic capitalism is inherently patriarchal, it offers greater prospects for female individualism and autonomy—if a free government enacts policy that encourages women to ascend the social hierarchy. On the other hand, feminism cannot be reconciled with communism's obsession with the welfare of the state, which comes at a high price to personal liberties.

Notes
1 Gary Lee Bowen, "The Evolution of Soviet Family Policy: Female Liberation Versus Social Cohesion," Journal of Comparative Family Studies 14, no. 3 (1983): 299-313, http://www.jstor.org/stable/41585341, 301.
2 Alice Schuster, "Women's Role in the Soviet Union: Ideology and Reality," The Russian Review 30, no. 3 (1971): 260-67, doi:10.2307/128134, 301.
3 Ibid., 302.
4 Bowen, 301.
5 Vladimir Lenin, Lenin Collected Works (Moscow, 1962), https://archive.org/details/LeninCW.
6 Bowen, 302.
7 Lenin.
8 Mary Buckley, "Women in the Soviet Union," Feminist Review, no. 8 (1981): 79-106, doi:10.2307/1394929, 90.
9 Schuster, 261.
10 Barbara Evans Clements, Bolshevik Women (Cambridge, UK, 1997), 138.
11 Encyclopedia.com, "Family Code on Marriage, The Family, and Guardianship," Encyclopedia of Russian History, http://www.encyclopedia.com/history/encyclopedias-almanacs-transcripts-and-maps/family-code-marriage-family-and-guardianship (accessed December 30, 2017).
12 Marxist Internet Archive, "Abortion Laws in the Soviet Union: The Decree of October 18, 1920," Encyclopedia of Anti-Revisionism On-Line, https://www.marxists.org/history/erol/ca.firstwave/cpl-abortion/section5.htm (accessed December 30, 2017).
13 Encyclopedia.com, "Family Code of 1926," Encyclopedia of Russian History, http://www.encyclopedia.com/history/encyclopedias-almanacs-transcripts-and-maps/family-code-1926 (accessed December 29, 2017).
14 Schuster, 261-263.
15 Ibid., 263.
16 Buckley, 81.
17 Schuster, 264.
18 Bowen, 304.
19 Ibid., 303.
20 Encyclopedia.com, "Family Code of 1926."
21 Bowen, 303.
22 Buckley, 89-90.
23 Ibid., 94.
24 Bowen, 304.
25 Ibid., 303.
26 Ibid., 305.
27 Buckley, 94.
28 Schuster, 266.
29 Buckley, 89.
30 Schuster, 265-266.
31 Buckley, 98.
32 Ibid., 88.
Melody without Humanity: The Soviet Union's Fifty-Year Musical Repression

By Gordon Kamer '18

In the movie adaptation of Doctor Zhivago, the fictitious General Strelnikov declares, "The personal life is dead in Russia. History has killed it." Boris Pasternak's Doctor Zhivago, one of the most widely read novels set during the Russian Revolution, is about a doctor who writes beautiful love poetry in his spare time. Zhivago is the kind of person whose life was most ruined by the Bolsheviks' seizure of power, one whose personal life is everything—passions that are apolitical, emotional, and completely unnecessary to the state. Zhivago carries his childhood balalaika (a triangular-shaped stringed musical instrument) throughout his life as a companion to his writing. Music, as with poetry, is about feeling. It is not physical. It does not contribute to historical progress. The Soviet government's opinion of music—as with most things—was that if it did not serve the state, it did not have a place in Russia. The USSR saw music as a tool of statecraft and attempted to cajole artists into making music advance revolutionary proletarian (lower-class) interests as opposed to letting it voice any genuine, or more universal, emotion on the part of the composer. The Soviet government wanted only to get artists to produce its fake art, making music a part of its propaganda machine. Artists resisted as much as they could, however, and never completely or honestly carried out the state's wishes.

As every movie and piece of literature would attest, music was a part of the Russian Revolution even before October 1917. "The Internationale," the internationalist hymn of socialism, was sung in Russia by protesters in the 1905 Revolution as well as in party congresses and on many other occasions. After the October Revolution, "The Internationale" served as the new state's national anthem. The song begins,

Stand up all victims of oppression
For the tyrants fear your might
Don't cling so hard to your possessions
For you have nothing if you have no rights

The lyrics' meaning is self-evident and politically motivating. In addition, "The Worker's Marseillaise" was often sung, hearkening back to the
French Revolution with lyrics similar to those of "The Internationale" but set to the tune of "The Marseillaise." From the very beginning, music served as a tool of political inspiration, especially for the masses, given both songs' simple tunes.

The People's Commissariat for Education (often called "Narkompros") served as the official Soviet body concerning cultural education—and thus musical education—at the beginning of the Revolution, later becoming the Ministry of Education. Immediately following the October Revolution, the Ukrainian Marxist Anatoly Lunacharsky became the first commissar of this new governmental organ. On December 12, 1917, Lunacharsky wrote a letter to artists of state theaters describing his and the government's attitude toward art in a new Russia. He began his letter, "Dear Citizens, you know very well how important it is to regulate the relationship between the artists and workers of the state theater on the one hand and the state itself on the other." Lunacharsky then softened his initial authoritarian tone by writing, "It is perhaps unnecessary to say that the new power does not require workers . . . to adopt any particular political credo, and even less so in the sphere of art. You are free citizens, free artists, and no one is violating your freedom." That assurance meant little, however; if citizens were truly free, Lunacharsky would not have needed to continue: "But there is a new master in the country: the working people. The working people cannot support state theaters until they are sure that these do not exist for the entertainment of the rich, but for the satisfaction of the great cultural needs of the working population."1 Thus, from the beginning, the Soviet government forced art in Russia to be explicitly proletarian.

From the earliest days of the Revolution, the Soviet Union recognized its interest in using art and music to legitimize its new dictatorship of the proletariat. In general, 1917 was a year of struggle over the foundations of power. Leaders grappled with the questions of "What makes a state?" and "What gives people power?" The system of dual power from February to October (or, more accurately, from March to November) instilled in the Bolsheviks the need to signal their political legitimacy, which, once accomplished, helped give them real power. They turned music into another battleground for shoring up support. Just as the Bolsheviks installed themselves in the royal palaces formerly belonging to the Tsar and his ministers, they also began putting leaders of the Moscow Soviet in the royal boxes of the Bolshoi and Mariinsky Opera Houses. The sight of party leaders in the place of royals was intended to remove any doubt that the Bolsheviks were sovereign. But the institutions running the theaters rebelled, tried to kick out the Bolsheviks, and threw rocks at them in their boxes. The political
battle spilled over into the heart of Russian art. The Bolshoi troupe even passed a resolution expressing its dissatisfaction with the revolutionary government, stating, "The State Moscow Bolshoi Theater as an autonomous artistic institution does not recognize any right of interference in its internal and artistic life on the part of powers that have not arisen from within the theater and have not been elected by it." The Soviet government largely accepted the theater's terms at first, and as the government eased pressure on the theater, the troupe no longer allowed rocks to be thrown at Bolshevik party leaders.2 The Bolsheviks' willingness to relinquish some control over the state theater, in exchange for an expression of respect for party officials in the royal boxes, demonstrated the importance the Bolsheviks placed on the appearance of authority. They leveraged music and its connection to the old royalty to project a tsar-like authority even if they did not yet try to impose tsar-like policies.

With the Revolution still in its early, most idealistic stages, policy toward music was a convenient way for the Bolsheviks to show that they were keeping their promises in regard to establishing a socialist state. The new government took quick, concrete steps to effect socialist principles, carrying out smaller parts of the larger policy goals. First, Lunacharsky nationalized musical conservatories, took on their finances, and abolished their fees. He even abolished entrance exams, before later reinstating them.3 At the same time as Lunacharsky's actions, the Bolshevik secret police, the Cheka, noticed a disappearance of musical instruments. At once, the Cheka signed a decree nationalizing all musical instruments and, in the case of the elderly Count Zubov, confiscated his four priceless Stradivari violins. The resulting accumulation became one of the world's greatest state collections of instruments.4 Nationalization (in the case of conservatories and instruments) and democratization (as in eliminating fees and entrance exams) fulfilled significant goals for arts administration as a part of the larger scheme of governing.

Nationalization and democratization, however, did not succeed as quickly in other parts of government as in music. Bolshevik leader Vladimir Lenin's New Economic Policy, and the idea increasingly espoused by many in the Bolshevik regime that true socialism would have to wait, seemingly did not apply to music. The most intuitive explanation for this exception is that music was just not considered as politically important as bread or land. And, of course, it was not. However, artists who viewed music as a divine language—personally expressive—and as an end in and of itself would have been horrified at the prospect of such radical experimentation. Those artists' concerns were not unwarranted, either. How many Tchaikovskys or Rachmaninoffs were lost because
of untested policy? The question itself is horrifying to any lover of art, but the government did not care. The Soviet Union cared only that music served its main purpose: support of the regime.

As time went on, the Soviet regime further strengthened its control and influence over music.

[Image: Dmitri Shostakovich]

Soviet dictator Joseph Stalin exercised increased control over many aspects of Russian personal life, and the musical realm was no exception. He oversaw a quest to control the very emotions that a song made a listener feel. In 1928, governmental criticism of music was institutionalized in the form of the Russian Association of Proletarian Musicians (RAPM). The stated goal of this new bureaucratic organization was to ensure that music in Russia be proletarian in nature. The RAPM forced composers to write march-like massovaya pesnya, songs for the masses.5 Tunes were expected to be simple, powerful, and direct. It directed composers to avoid complexity and atonalism, the very hallmarks of experimental twentieth-century music. That same year, Dmitri Shostakovich wrote his first opera, The Nose. The Nose was precisely what the Soviet regime did not want: harmonies and melodies completely incomprehensible to the masses. It was too erudite—or, ironically, too revolutionary. The Russian Revolution dismissed this twentieth-century "revolutionary" music in favor of "Revolutionary" music, written to celebrate the worker state. Thus, Shostakovich met heavy criticism. A few years later, in 1930, The Worker and the Theater, a state-sponsored newspaper, addressed artistic dissenters such as Shostakovich: "the Supreme Court of the USSR should give no quarter to warmongers, wreckers or counter revolutionaries… we demand that wreckers should be shot." In other words, the state no longer took lightly any transgressions against its policy of proletarian music. In fact, its reaction to Shostakovich paled in comparison with its treatment of another contemporary composer, Alexander Mosolov, who was branded an "enemy of the people" and sent to a labor camp in 1937. The
reason Shostakovich managed to survive was probably, in part, his willingness to conform. He wrote a public apology in 1930: "I live in the USSR, work actively and count naturally on the worker and peasant spectator. If I am not comprehensible to them I should be deported."5 All told, the oppressive Soviet government under Stalin forced great geniuses to write music to appease simple-minded bureaucrats.

Shostakovich, however, did not always follow the Communist Party's rules, and the USSR's reaction further showed the government's resolve to crush musical expression. In some rare cases, Stalin himself took up the job of fighting dissent. At first, Shostakovich stuck to non-political compositions, and there was little need for fighting. Yet much of Shostakovich's music was not supportive of the Communist Party, even as state-appointed music critics liked to interpret it as pro-communist. As Shostakovich wrote in 1933, "When a critic… writes that in such and such a symphony Soviet civil servants are represented by the oboe and the clarinet and the Red Army men the brass section, you want to scream!"6 Shostakovich was just an artist, but he was being used conveniently by the state. In 1934, for example, he finished his opera Lady Macbeth of the Mtsensk District. In 1936, not long after its premiere, a scathing review appeared in the state-run Pravda titled "Muddle Instead of Music." It called the opera "formalist," "bourgeois," and "vulgar."7 These adjectives, representing the exact qualities the state wanted to suppress, were synonymous with treason in Soviet Russia. Shostakovich had been hired unwittingly as a tool of the state, and when he did not follow the rules—to which he had never agreed—his Soviet employer threatened termination. It was not enough that music refrain from being lyrically subversive or openly hostile to the state. The personal qualities in the music also had to bend to historical and political circumstances. The renowned Soviet pianist Vladimir Ashkenazy would later write that the review of Lady Macbeth was "probably dictated by Stalin."8 In the end, Shostakovich's opera was banned until years after Stalin's death.

Just as the Pravda article went to print, his 4th Symphony was due to make its premiere. Shostakovich's two prior symphonies were described as patriotic works, and both had large finales. However, the composer had no wish to continue on the same path. He told an interviewer in 1935, "I am not afraid of difficulties. It is perhaps easier, and certainly safer, to follow a beaten path, but it is also dull, uninteresting and futile."9 "Dull" and "uninteresting" reflected the state's designs. No other style of work would be acceptable. Thus, it was no surprise that after a few rehearsals, the new 4th Symphony was ordered withdrawn, its premiere to be delayed for another thirty years.
In the Soviet Union, the government rarely lost a battle with an artist. Chastened, Shostakovich now undertook works more palatable to authorities. According to McGill professor of music and humanities Dr. Cory McKay, Shostakovich's 5th Symphony exemplified "Soviet Realism," a glorification of the proletariat that the state desired, full of triumphant "joyousness."10 That symphony was a huge success, but Shostakovich later reflected that, had he had more freedom, he would have "displayed more brilliance, used more sarcasm… revealed [his] ideas openly instead of having to resort to camouflage, [and] would have written purer music." By the time of the Great Patriotic War (Russia's name for World War II), musical conformity in the form of war propaganda dominated. In one example, Shostakovich's "The Song of Peace," a light and widely liked tune, shored up his popularity as a great mass-song artist. After hearing "The Song of Peace," a Soviet critic said, "I want to congratulate everyone assembled in that we no longer, and I hope we will never again, call Shostakovich a representative of the formalistic direction." Years later, Shostakovich would go on to write his 11th Symphony, which was dedicated to the Revolution of 1905. Even then, Shostakovich's son whispered in his ear before the premiere, "Papa, what if they hang you for this?"11 The fear remained—however irrational by that time. "The Year 1905" earned Shostakovich an invitation to join the Communist Party in 1960. To dispel the impression that he had genuine admiration for the Soviet leadership, Shostakovich conspicuously missed his first party meeting, forcing Communist officials to make up a story that he was sick.12 All the same, one of the great composers of all time had spent most of his life as the Communist Party's unwilling court composer.

Historians argue over the extent to which composers cooperated with or even supported the Soviet Union and its actions toward the arts. In the mid-twentieth century, some western observers might even have been led to believe that Soviet citizens were just as supportive of their side in the Cold War as Americans were of theirs. In a 2003 op-ed, Ashkenazy recalled reading an article in The Economist which said that because Shostakovich did not try to explain his pieces in the concert programs that accompanied the performances, he did not in fact dissent artistically. Ashkenazy wrote,

The truth is, Shostakovich confided in only a small circle of trusted friends. To have said too much elsewhere—at rehearsals, for example—would have been career suicide, and perhaps worse. . . . As the outstanding Russian composer Rodion Shchedrin has
said, 'Nobody wanted to go to the gulag.' One must not forget that we had no independent judiciary. The Communist Party was the only jury and, between the mid-30's and his death in 1953, Stalin the only judge.13

The reality is that few if any musicians and composers enthusiastically approved of the USSR's policies in general. In the same op-ed, Ashkenazy reflected on a question he had been repeatedly asked since his defection in 1963: "Mr. Ashkenazy, when did you decide to leave the Soviet Union?" He explained that it was the wrong question to ask because Soviet citizens had no right to foreign travel. When the Soviet government inexplicably allowed him to bring his wife on a concert tour of Britain, Ashkenazy seized the chance to defect.

[Image: Sergei Prokofiev]

Composer Sergei Prokofiev also lamented his limitations under the Soviet Union. The government attempted to control the subject matter and melodies of Prokofiev's work. In 1940, the Soviet Union paid Prokofiev to write a ballet inspired by Shakespeare's Romeo and Juliet, but Soviet censors then told him that he had to remove all references to royalty. The final version of the ballet therefore had characters and backdrops designed specifically for the common folk of Russia. In addition, as the government did with other composers, it forced Prokofiev to write military-styled tunes. In response, Prokofiev wove sarcasm into the very elements of his pieces that the government forced upon him. In his 3rd Piano Concerto, for example, he wrote steady, dark, march-inspired lines but drizzled them with light trills and high-pitched breaks. It was a form of anti-government satire he could slip past the censors. In addition to his musical dissent, it was later revealed that Prokofiev had been writing letters critical of the Soviet Union. In fact, he wrote notes for a lecture he never gave, with points including: "1. Soviet Art, despite its enormous breadth, is declining in its quality." He
added, on the Soviet composer Ivan Dzerzhinsky, "The music of Dzerzhinsky is illiterate. His development of folksongs represents a decline in comparison to that which occurred seventy years ago. Lesser composers regard Dzerzhinsky's absence of talent as a marker of success." Finally, in his most candid criticism, Prokofiev planned to say, "The official directive concerning the struggle against Formalism has been carried out too zealously. The baby has been thrown out with the bathwater. . . . Music comprising of second-rate material cannot be first-rate."14 It is obvious why Prokofiev never gave this lecture. Dissent amongst Soviet artists—even the most celebrated—was widespread. Many of the people carrying out the government's policy in music never themselves embraced the cause.

Western ignorance of Soviet artists' misgivings about their life in the USSR was, of course, by government design. Arlene Portney, the first American woman pianist to win a major international competition, attended the 1967 George Enescu International Piano Competition in Bucharest, Romania. She recounted that she "was watched at all times." On the foreign contestants' hotel floor, a Soviet official handled all the room keys, monitoring the comings and goings of all the pianists. On one occasion, Portney was invited to the home of a Soviet competitor who wanted to share more details about his life. After walking halfway to his house, however, Portney and her fellow Americans noticed that they were being followed, so they went back to the hotel and never met with their Soviet counterpart. If the authorities had ever found out about the access the Soviet pianist was planning to give the Americans, his fate might have been the same as another person Portney encountered. After the competition, Portney planned to go on a trip to Russia, but after she had a long and honest conversation with the woman who was supposed to be her tour guide, the woman disappeared, and Portney was forced to cancel the trip. Soviet intelligence also set up traps for the foreign competitors: spies, for example, attempted to convince Portney to buy rubles on the black market at a favorable rate so that she would incriminate herself. Years later, in England, just to confirm what she already knew about the lamentations of her Soviet competitors, Portney would witness the defection of the Romanian Radu Lupu, who was the winner of that 1967 George Enescu competition. Upon Lupu's defection, authorities immediately went to his family home, kidnapped his sister, and separated his parents.

Music as a tool of state-building was resented by the artists ordered to carry out this role in the Revolution. As Leonid Sabaneyev, a Soviet
music critic, wrote in 1923, "Everything significant that was born during the Revolution passed it by—they are all messages in a bottle from some desert island, which the wind of the revolutionary storm left untouched."15 Sabaneyev's critique was a description of an uninspiring musical culture just beginning to take hold. To quote a popular joke, Russian history can be explained in five words: "And then it got worse." In just a few years, revolutionary Russia would see the rise of Stalin and the creation of a musically oppressive regime lasting for five decades. The despair Sabaneyev expressed early in the process became dissent and, at times, outright defiance of the power in control. No matter how hard the government might have genuinely tried to create a revolutionary kind of music made for the masses, the intellectually empty, simplistic tunes it produced ignored the movements in other parts of the world that made music more interesting. Generations of composers fell victim to a regime bent on conforming music to its version of history and exploiting it for political gain.

Notes
1 Marina Frolova-Walker and Jonathan Walker, Music and Soviet Power: 1917-1932 (Suffolk, UK, 2012), 3.
2 Ibid., 4.
3 Ibid., 6.
4 Ibid., 8.
5 Cory McKay, "Political Influences on the Music of Shostakovich," http://www.music.mcgill.ca/~cmckay/papers/musicology/ShostakovichPolitics.pdf (accessed January 21, 2018).
6 Ibid.
7 Brian Moynahan, Leningrad: Siege and Symphony (New York, 2015), 38.
8 Vladimir Ashkenazy, "Making Music in the Shadow of Stalin," The New York Times, February 16, 2003.
9 Richard Freed, "Shostakovich: Symphony No. 4."
10 McKay, "Political Influences."
11 Ashkenazy, "Making Music."
12 McKay, "Political Influences."
13 Ashkenazy, "Making Music."
14 Simon Morrison, The People's Artist: Prokofiev's Soviet Years (New York, 2009).
15 Frolova-Walker and Walker, Music and Soviet Power, 77.