How Pandemics Have Shaped History

Written by Ella Raphael.

In the wake of the coronavirus outbreak it is easy to feel an overwhelming sense of uncertainty and fear. Yet humans are unfortunately well acquainted with pandemics, from the Plague of Justinian in 541 AD to the Ebola outbreak of 2014. Disease outbreaks have changed politics, ended revolutions and, in some cases, caused wars. They have destroyed economies and changed the demography of entire continents. Past responses to pandemics have demonstrated the remarkable power of humanity when we work together; others, however, have revealed our capacity to commit great evils.

Pandemics throughout history have served as mirrors to society, revealing the immense racial, political and economic prejudices lurking beneath the surface. Frank M. Snowden, author of Epidemics and Society: From the Black Death to the Present, has even said that the existence of infectious diseases has gone hand in hand with political oppression. For example, he argues that part of the reason behind the barbaric slaughter of Parisians after the 1848 Revolution in France was that the working classes were seen as a medical threat as well as a political one. In these over-populated communities, there was the risk that they would spread diseases to the rest of society. He says this is the true reason behind the metaphor of ‘the dangerous classes’, and the real reason behind the brutality of the subsequent 1871 massacre. The cholera outbreak in Europe in the 1830s coincided with massive social upheaval. In Britain, popular opinion refused to believe that cholera was an unknown disease; instead, it was widely thought to be an attempt to reduce the working-class population by poisoning them. Whether this was the case or not, pandemics make people question the powerful institutions and social structures that are in place, and they can also encourage conspiracy theories.

In light of this, outsiders have often been blamed for disease outbreaks. On rare occasions, such as when UN peacekeeping troops brought cholera to Haiti in 2010, the accusers have been right. Yet more often than not, scapegoating is a coping mechanism for the fear and desperation that pandemics can cause. The Jewish population of Strasbourg became the target of inhumane persecution and suffering as the Black Death began to ravage Europe. Local officials declared that they were to blame for the pestilence, accusing them of poisoning the wells, and gave them an ultimatum: convert or die. Around half chose the former. The rest of the Jewish population was publicly burnt to death or expelled from the city, making this one of the worst pogroms of the pre-modern world. It is a poignant example of the dangers of hysteria and fear-mongering, which inevitably accompany epidemic outbreaks.

This hysteria and panic has been a fundamental part of the coronavirus pandemic. In the United Kingdom there has been a surge of xenophobia and racism towards people of Asian descent. In the United States, Donald Trump and his advisers insist on calling Covid-19 the “Chinese Virus”, perpetuating the cycle of finger-pointing and scapegoating. Claire Jean Kim, a professor of political science and Asian American studies at the University of California, warns that this language is dangerous because “we are being misled about what causes pandemics and how to possibly prevent them or reduce their severity in the future.” As seen above, there is an unsettling history of leaders depicting outside groups as threatening. Our reactions to pandemics can marginalise communities and legitimise hate crime.

Nevertheless, perhaps pandemics are an unnerving but necessary reminder that humans are all the same: everyone is vulnerable. The Plague of Justinian in 541, which spread across the Byzantine, Roman and Sasanian Empires, showed no class sympathies. It affected the powerful and powerless alike: even Justinian, the Eastern Roman emperor, contracted it. It must not be forgotten, however, that such outbreaks disproportionately affect disadvantaged communities, even as they remind us that we are all facing the same problem. They can also be an opportunity for international cooperation and compassion.

Despite the mercilessness of pandemics, they have the ability to yield miraculous changes. Snowden argues that one of the reasons the Haitian Revolution succeeded was yellow fever. The slave resistance led by Toussaint Louverture against Napoleon’s army was so successful partly because the enslaved people of African descent possessed a vital weapon the Europeans lacked: immunity. The fever ravaging Napoleon’s troops was a key reason behind his withdrawal from the island. Of course, this was not the only reason for the successful revolution; Louverture’s impressive strategy is thought to be the key factor. Yet it is an example of how pandemics have altered the course of political history and indirectly contributed to freedom and liberty. Pandemics can also bring out the best in people. The Ebola virus outbreak was met with many inspiring responses, and its mitigation was seen as a global effort. During the crisis, doctors from Médecins Sans Frontières put their self-interest aside to go to the front line with the sole goal of helping the most vulnerable in society.

In response to the coronavirus outbreak, some have argued that we should attempt to dissolve our global connections to prevent future outbreaks. The problem with this argument is that epidemics are not modern phenomena, and neither is globalisation. Yuval Noah Harari, author of Sapiens and Homo Deus, states that if we wanted to protect ourselves from pandemics by isolating ourselves, we would have to go all the way back to the Stone Age, the last time communities were truly separate. The true antidote to pandemics is not global segregation; it is information and collaboration.

Painting by Sarah Yuster – https://fineartamerica.com/featured/composition-in-blue-minor-sarah-yuster.html

Bibliography:

Aratani, Lauren. “‘Coughing while Asian’: living in fear as racism feeds off coronavirus panic,” The Guardian, 2020.

Chotiner, Isaac. “How Pandemics Changed History,” The New Yorker, 2020.

Erdelyi, Matyas. A history of the great influenza pandemics: death, panic and hysteria, 1830–1920, European Review of History: Revue européenne d’histoire, 22:3, 2006, pp. 508-509.

Evans, Richard J. “Epidemics and revolutions: cholera in nineteenth-century Europe.” Past & Present 120 (1988): 123-146.

Harari, Yuval N. Homo Deus: A Brief History of Tomorrow, 2016. 

Jordà, Òscar, Sanjay R. Singh, Alan M. Taylor. 2020. “Longer-Run Economic Consequences of Pandemics,” Federal Reserve Bank of San Francisco Working Paper 2020-09.

Kolbert, Elizabeth. “Pandemics and The Shape of Human History”, The New Yorker, 2020. 

Lew, Linda. “Homo Deus author Yuval Harari shares pandemic lessons from past and warnings for future,” South China Morning Post, 2020. 

McKibbin, Warwick J. and Alexandra Sidorenko. Global macroeconomic consequences of pandemic influenza. Sydney, Australia: Lowy Institute for International Policy, 2006.

Snowden, Frank M. Epidemics and Society: From the Black Death to the Present. New Haven; London: Yale University Press, 2019

Review: How We Disappeared by Jing-Jing Lee

Written by Tessa Rodrigues.

How We Disappeared is a profound tale told by Jing-Jing Lee, giving voice to a forgotten generation in Singapore after the Second World War.

The narrative is split into three different focalisations: Wang Di in the first person at the age of seventeen, as Singapore surrenders to the Japanese in 1942; Wang Di in the year 2000, observed by a semi-omniscient narrator; and lastly eleven-year-old Kevin, who in the same year discovers an earth-shattering confession from his ailing grandmother. Each story is interwoven with the others to showcase the deep scar left on Singapore by the Japanese Occupation, and the way in which so many were forgotten as the city developed into a bustling first-world metropolis.

Wang Di’s story begins with an idyllic picture of kampong life in Singapore; however, we are also presented with the underlying patriarchal values that stifle her self-esteem. During the Fall of Singapore in 1942, she is stolen from her home and brought to a brothel, where she lives for the remainder of the occupation. Her account of a Japanese ‘comfort house’ illustrates the atrocities that the captured women suffered at the hands of Japanese soldiers. Lee chooses to present this experience from a first-person point of view to truly reveal the pain, struggle, and subsequent humiliation thrust upon these women when they returned to a home that no longer welcomed them. Wang Di’s courage during her abuse is made clear to the reader, and we feel infuriated when her resilience is met with a silencing sense of shame from the family she yearned for. Her story resonates heavily with a modern-day audience, as #MeToo and other movements against sexual assault become more prevalent. Lee provides a thought-provoking criticism of the way in which post-war Asian cultural norms silenced victims with humiliation and marginalisation, instead of showing empathy towards the sexually abused.

This is made even clearer as we flash forward fifty-eight years. Wang Di is now seventy-five years old and mourning her late husband. She is illiterate, quiet and slowly fading into the background, with no family left to remember her. Her primary income comes from collecting old cardboard, a livelihood her new neighbours turn their noses up at. Singapore has grown rapidly around her, and her once vibrant personality has been stifled: her experiences during and after the war caused her to internalise her shame and humiliation. She is left with a lingering regret that she never let her late husband share his own struggles during the war, and a longing to understand his hidden past. In an aged Wang Di, we are shown the struggles of the ‘Silent Generation’ of the elderly within society today as they begin to disappear into the shadows. We are called to take a step back and examine the way in which we treat not just the older generations within our families, but those who are a part of our collective community and Singapore’s national identity.

Alternatively, Kevin’s narrative serves as a representation of a younger but equally unseen generation. He is only eleven years old, but he is extremely perceptive of his father’s struggles with depression and his mother’s attempts to maintain the illusion that there is nothing to worry about. He manages to evade his parents’ infrequent gaze and investigates his Ah Ma’s secretive past. While his parents aren’t negligent, there is still a premature independence that is thrust upon Kevin as an only child in a household with two working parents. He is the new questioning generation in Singapore who pushes against the perceived norms to find the truth.


All in all, Jing-Jing Lee provides a stunning historical narrative that incorporates the heart-breaking, forgotten story of those who were forced to disappear; however, she also reveals the way in which modern society, not only in Singapore, must strive to remember a past we cannot escape. How We Disappeared pays tribute to those left behind and those who were not able to speak up, and in doing so brings them back to life.

The International Monetary Fund during the Cold War: Charitable Body or Neo-colonial Power?

Written by Ella Raphael.

When we think of colonialism, we tend to think about war, invasion and the suffering certain nations have inflicted on others. We perhaps think less about the indirect, ideological or economic control countries exert over one another. One of the main ways they do this is by controlling systems of global governance and organisations such as the International Monetary Fund (IMF), which was established in 1945 in an attempt to regulate the global post-war economy. The IMF comprises 189 nations, but historically the United States has held a strong majority of the votes. It presents itself as an organisation designed to ‘foster global monetary cooperation, promote high employment and sustainable economic growth.’ However, its intervention in Asian and African nations in the late twentieth century suggests otherwise. Strengthening a capitalist, neoliberal dogma seemed of higher importance than helping the struggling nations it vowed to support. This is not to say that the IMF is a sinister, corrupt organisation; however, it has been branded out of touch and insensitive. Whether it intended to or not, the IMF has a history of damaging the countries it claims to help.

Neo-colonialism can be loosely defined as the indirect political and economic control or influence that a nation-state or a powerful institution extends over another nation. It is a predominantly modern phenomenon, a new form of influence since the end of formal imperialism and the Second World War. Tukumbi Lumumba-Kasongo claims imperialism can also take the form of ‘strong or destructive conditions of dependency of a nation-state [or institution] over another,’ which many scholars have argued matches the IMF’s relationship with many developing nations. In the process of receiving loans, struggling nations have been left indebted to the powerful organisation, and the only way to escape this debt is to meet the IMF’s conditions.

This approach has been vastly problematic in the past. Loans are only provided if countries agree to undergo Structural Adjustment Programs, whose aim is to end protectionism and switch to an agenda of privatisation. This has been a way for the IMF (which is dominated by the US) to export and enforce the political and economic ideology it deems desirable. John Hilary has argued that the conditions attached to the aid packages have undermined the passage of democracy in these nations. They prohibit governments from implementing development policies suitable to their national situations, as they have to focus on repaying the loans. This weakens the economy further, perpetuating the cycle of dependency on international monetary organisations.

The US dominance of the fund during the Cold War did not help its image as a charitable organisation. This was at a time when such international bodies were promoted as being ‘above’ the affairs of the Cold War, yet there was a clear ideological bias towards capitalism, neoliberalism and privatisation. Between the 1960s and 1980s, it seemed as though the IMF was merely an extension of the US’s power. Teresa Hayter’s book Aid as Imperialism (1971) argues that aid has frequently been used as a political weapon. Nixon himself said in 1968: ‘let us remember that the main purpose of aid is not to help other nations but to help ourselves’. Coming from the main contributor of aid within the IMF, this creates an unsavoury image. Loans had been disguised as aid, but they meant much more: they ultimately meant meddling with the self-determination and autonomy of the recipient nations.

This was the experience of many African nations during the 1980s and 1990s. Perhaps the most famous misstep was in Ghana, where the IMF forced parliament to override a critical governmental decision to raise import tariffs on poultry. Adherence to the fund’s conditions was suddenly of higher importance than addressing the country’s pressing economic issues. Even the IMF itself has since admitted that its aggressive intervention in the 1980s was inappropriate. The organisation’s reliance on neoliberal dogma – without assessing whether it would work in individual national frameworks – was an ‘insufficient basis for a constructive trade policy dialogue.’ Again, the IMF was not intentionally pursuing a self-sabotaging policy; however, its agenda was out of touch with the nations it was helping. Its priorities leaned towards maintaining the stability of the world economy, and ultimately the power of the richest nations.

The Philippines provides another example of the economic disarray caused by IMF intervention, in this case between 1960 and 1990. Walden Bello argues that the IMF and the World Bank were the sole architects of the turmoil the Philippines was left with for the latter part of the twentieth century. The Philippines is one of the few nations to have been the subject of multiple IMF structural adjustment programmes. The 1962 devaluation imposed by the IMF forced 1,500 Filipino entrepreneurs into bankruptcy. To make matters worse, during the 1970s the IMF and the World Bank hurled the Philippines into a plan of export-led growth. They were not helping to strengthen the domestic economy but were instead making it dependent on exports to capitalist economies, reinforcing the existing dependency paradigm. To many it was clear that the IMF was the engineer of this economic disaster, yet it insisted the solution was a continued policy of liberalisation. As is often the case with colonial powers, the organisation took advantage of its power over the Philippines and tried to morph it into an economy that would slot into the existing capitalist order.

Although aid was not provided with malicious intent, the Structural Adjustment Programmes the IMF implemented during the Cold War were out of touch with the national economies they targeted. As we have seen in Ghana and the Philippines, these programmes ended up sabotaging domestic growth in favour of maintaining the strength of the capitalist system. It ultimately boils down to a powerful entity forcing an external, unsuitable ideology onto another country under the guise of helping it. When this is the core objective of the fund, it is difficult to separate its ‘charitable’ projects from the quasi-colonial motives bubbling beneath the surface.

Bibliography:

Bello, Walden and Broad, Robin. “The International Monetary Fund in the Philippines.” In The Philippines Reader: A History of Colonialism, Neocolonialism, Dictatorship and Resistance, 1982.

Gudikunst, Nicole, Briggs, Kristie, Clark, Terry, and Deskins, John. The Social Impact of the International Monetary Fund: Structural Adjustment Programs in Latin America from 1980–2000, 2010, ProQuest Dissertations and Theses.

Hayter, Teresa. Aid as Imperialism. Pelican Books. Harmondsworth: Penguin, 1971.

Hilary, John. “Africa: Dead Aid and the Return of Neoliberalism.” Race & Class 52, no. 2 (2010): pp. 79-84.

Lumumba-Kasongo, Tukumbi. “China-Africa Relations: A Neo-Imperialism or a Neo-Colonialism? A Reflection.” African and Asian Studies 10, no. 2-3 (2011): pp. 234-66.

Weisbrot, Mark. “The IMF is hurting countries it claims to help”, The Guardian, 2019.

Review: ‘The Five’

Written by Mhairi Ferrier.

The Five, by social historian Hallie Rubenhold, tells the untold stories of Jack the Ripper’s victims – the Canonical Five. Painstakingly researched, The Five gives the reader a view into nineteenth-century society’s attitudes and norms. Traditionally the widely accepted narrative has been that Jack the Ripper, whoever he may have been, was a killer of prostitutes. Rubenhold shows that only two of the victims, Elizabeth and Mary Jane, appear to have engaged in sex work during their lives. Of Polly, Annie and Catherine, Rubenhold notes from her research that there is no evidence they ever undertook any form of sex work. As the book suggests, this narrative made such horrific murders more palatable for the public – the idea that they were ‘just prostitutes’ seemingly acted as a justification for the killings. What The Five reminds the reader is that these women had their own stories, which deserve to be told. These women had lives, families, anecdotes and adventures that deserve to be shared, rather than only the tales of their vicious murders.

“It is only by bringing these women back to life that we can silence the Ripper and what he represents.” 

Jack the Ripper has developed into an industry in its own right. The name is so entrenched in popular culture that few would be unfamiliar with it. This industry has people flocking to Whitechapel to observe the murder sites, buying Ripper-themed souvenirs, and so much more. This is done at the expense of the victims and, as Rubenhold points out, hints largely at the misogyny that exists in our society today. Barely a thought is given to the victims when the public engage in this rather unsettling, and quite inappropriate, Ripper culture. One just has to look at the negative reviews of this work to find people disappointed that the account of each woman stops before recounting the details of the murders themselves, or other reviewers who cannot accept that these women were simply “not just prostitutes,” disregarding the fact that this does not make such killings any less terrible or any more acceptable. Rather, it just highlights why a book of this type is still necessary in order to improve the accepted narratives and views of women.

What The Five does so well is that it truly lets the reader engage with the lives of these women. We get an insight into the highs and lows they faced before their untimely deaths. We gain an understanding of what led them to end up in Whitechapel in the first place. We understand the attitudes which led to them being branded prostitutes in newspaper report after newspaper report. What these women had to endure in their lives – the deaths of their children, family suicides, illness, poverty, addiction – makes for heart-breaking, and at times somewhat difficult, reading. But the book reinserts the human aspect back into their stories. These women were grieved, they were mourned, and they should not be forgotten while the figure who murdered them has become so immortalised. “He” will never be forgotten, so why should they? The Five is likely to leave you feeling a mix of anger and sadness – but it truly is a worthwhile read which will hopefully encourage you to reassess your beliefs about Jack the Ripper and his victims.

Rubenhold, H. The Five. London: Penguin, 2019.

Protection of the White Continent: The Antarctic Treaty System of 1959

Written by Jack Bennett.

In the depths of the Cold War, in 1959, the ice-covered continent of Antarctica became a focus of international diplomacy, with the three nuclear-weapon states of the USA, the USSR and Britain establishing a model to ensure its nuclear-free status, peaceful scientific cooperation and protection. This produced a new, globalised governance regime through the Antarctic Treaty System (ATS). With Antarctica threatened by commercial fishing and resource exploitation, as improved scientific knowledge and more advanced equipment made the continent ever more accessible, the global diplomatic community aimed to ensure its sustainable, demilitarised and collaborative governance. Space was not the only frontier in which an international race was sparked: science became a collaborative enterprise, with Antarctica the experimental station of Cold War scientific diplomacy.

By the 1940s, the claimant club had grown, as Argentina and Chile joined the European claimant states of France and Norway. The USA and the Soviet Union did not recognise the legitimacy of any of these territorial claims. After the Second World War there emerged a new era of rising territorial and resource competition: in particular, the strategic importance of certain minerals created conflict arising from contested sovereignty. The issue of sovereignty over Antarctica was resolved in December 1959, when 12 nations (Argentina, Australia, Belgium, Chile, France, Japan, New Zealand, Norway, South Africa, the United Kingdom, the USA and the USSR) signed the Antarctic Treaty. This formalised and guaranteed free access and research rights so that all countries could work together for the common cause of scientific research and the exchange of ideas. However, claimant states such as Australia and Argentina struggled to reconcile their own concerns about the future role of the superpowers. The treaty, which applies to the area south of 60 degrees south latitude, is surprisingly short but remarkably effective. The New York Herald Tribune reported that it was a cause for ‘enduring hope’. The treaty system restricts the use of the continent strictly to peaceful purposes and promotes international cooperation to ensure the protection and sustainability of Antarctica.

Permanent stations were established during the 1950s to support the first substantial multi-nation research programme, the International Geophysical Year (IGY) of 1957-58. Territorial positions were also asserted, though not agreed upon, creating a tension that threatened future scientific co-operation and risked conflict. The USSR showed off its polar prowess by setting up several bases, including one at the magnetic south pole and another at the most difficult spot to reach on the continent, the so-called ‘Pole of Inaccessibility’. With the IGY set to expire at the end of the year, Washington worried that Moscow would carve out a menacing military presence in Antarctica. As a consequence of these disputes over ownership, the Antarctic Treaty was signed by the nations that had been active on the continent during the IGY, in order to avoid disagreements and conflicts, resolve disputes over ownership and mining rights, and establish guidelines limiting development on the continent. Critically, the treaty came during a brief thaw in East-West tensions that had emerged with Soviet Premier Nikita Khrushchev’s visit to the USA in September 1959.

The treaty acknowledged that the demilitarisation and denuclearisation of Antarctica could forge a precedent for future international relations. Argentina and Chile played critical roles in the treaty, which also served as a precedent for agreements in other contested areas, such as the 1967 Outer Space Treaty and seven other zones free of atomic weapons. It is difficult to estimate how important the ban on nuclear weapons and the prohibition on military activity were: Cold War historiography emphasises the catalytic power of science and scientific co-operation, yet it was the arms control element of the Antarctic Treaty that really underpinned peaceful governance. In part, the USA hoped to use the treaty to score a Cold War propaganda victory, keep Soviet missiles out of Antarctica and block communist China from gaining an unregulated foothold on the continent. However, the agreement still allowed for the peaceful use of nuclear power, and from 1962 to 1972 the USA operated a defective nuclear reactor at McMurdo Station that contaminated over 12,000 tons of soil.

From the 1960s onwards the treaty system confronted resource-related questions. Conventions on sealing, fishing, environmental protection and, controversially, mining were developed, with broad acceptance from the international community. The proposal for a mining regulation was publicly rejected by Australia and France in the late 1980s in favour of a protocol on environmental protection: mining was banned and priority was placed on developing an effective regime of environmental conservation. Tension was high in the 1980s as new players such as China and India began to make their presence felt in Antarctica, and environmental groups such as Greenpeace demanded an end to whaling and a permanent ban on mining. This culminated in the 1991 Madrid Protocol, which prohibited mining and made all continental activities subject to environmental assessment.

Since coming into force on 23 June 1961, the treaty has been recognised as one of the most successful international agreements and a vehicle of Cold War-era détente. Problematic differences over territorial claims have been effectively set aside, and as a disarmament regime it has been outstandingly successful. The initial 12 signatories have since grown to over 50, and the continent has remained largely free of military activity and has endured as a nuclear-free zone. But the future of the ‘white continent’ remains contentious: continuing geopolitical tensions, disputes over ownership and pressures from climate change all threaten it.

Fatima Ahmed Ibrahim, and Women’s Power in Sudan

Written by Lewis Twiby.

In 2019, when protests sparked across Sudan, the world seemed perplexed by the level of women’s activism during the protests. Alaa Salah, dubbed the ‘Woman in White’, became world-renowned, and the iconic photograph of her has been likened to Guerrillero Heroico, the famous image of Che Guevara. However, this ignores literal millennia of women’s resistance in the area comprising contemporary Sudan – from Mandy Ajbna carrying the severed head of her father to unite Sudanese communities against the British in the 1800s, to the Kandaka of Meroe defeating the forces of Alexander the Great in 332 BCE. Just two years before the outbreak of the protests, one of Sudan’s most resilient and important feminists, Fatima Ahmed Ibrahim, passed away. Fatima’s life shows resistance to oppression regardless of the odds, and serves to inspire countless other women.

Born in Omdurman in 1933, Fatima grew up in a political family – her grandfather was dismissed from the judiciary for opposing British colonialism, her mother was the first Sudanese woman to learn English, and her father was denied a teaching post for resisting the imposition of English in schools. Consequently, she inherited a legacy of resistance and, at the age of fourteen, set up the Intellectual Women’s Association while at school to demand Sudanese liberation and to oppose Britain’s support for conservative forces. She even co-founded a school paper called Elra’edda, writing under the pseudonym ‘Daughter of Light’ and focusing on women’s rights, anti-colonialism, and democracy. At just eighteen she helped organise Sudan’s first women’s strike: Sylvia Clark, the head of the Omdurman High Secondary School, had dropped subjects like science, so Fatima and others organised a strike in response. The strike was a success, but misogyny dashed her hopes – her father barred her from attending university. Seeking a new path, she became a teacher, and then Sudan’s leading feminist.

The 1950s was a revolutionary time for Sudan. Across the colonised world liberation movements were throwing off colonialism, and this inspired a new young generation wanting a more equal society. Especially important for Sudan was the 1952 Egyptian coup, which deposed the British-aligned monarchy and brought to power a reformist group, soon to be led by Gamal Abdel Nasser. Naturally, this inspired the young Fatima, fresh from her protests at school. In 1952 she was one of several influential feminists who formed the Sudanese Women’s Union (SWU), which remains one of Sudan’s major feminist organisations. Quickly, Fatima became involved with radical politics. When her brother, Salah, joined the Communist Party in 1954, she did too, as it was the first party to allow women entry. Because of this, Fatima and the SWU offered a radical analysis of women’s place in Sudan, focusing especially on women’s rights in the workplace. This is further seen in the SWU’s publication Sawt al-Mara (Women’s Voice), founded in 1955, which, despite being repeatedly banned, allowed women to present themselves as the subjects, not the objects, of discussion. Among the topics discussed were the rights of working women, the rights of rural women, FGM, marriage, childcare, and feminism internationally. In 1956, the same year that Sudan became independent, she became the SWU’s president, but the honeymoon of women’s activism soon came to an end.

In 1958, General Ibrahim Abboud came to power via a military coup, toppling the multi-party council which, albeit poorly, had governed the country. Aligning Sudan with the United States against Nasserism and socialism, Abboud cracked down on workers’ and women’s rights. Sudan’s post-independence history saw periods of democratic rule followed by tyrannical military rule, which undertook genocide in Darfur and the south. Despite being banned by Abboud, the SWU continued campaigning and publishing Sawt al-Mara underground, and, just as in 2019, women like Fatima provided the backbone of the movement which resulted in the 1964 October Revolution. With Abboud toppled, progressives seized the new opportunity and women won the vote, which allowed Fatima to become the first woman MP in Sudan, the Middle East, and Africa. Her position in parliament allowed her to use the state to bring women’s rights to the forefront of society, demanding equal employment, equal pay, and equal access to higher education. However, there were limits to progressive politics: in order to be respected, Fatima was forced to present herself as family-oriented and traditional. Then, a second coup happened.

In 1969 Jafaar Nimeiri seized power and started reversing many of the progressive reforms enacted after the October Revolution. For publicly denouncing Nimeiri, Fatima was incarcerated for two years, and her husband, who was also a trade unionist, was murdered. Women were expelled from the administration, barred from travelling, and colourful clothing was banned. After her house arrest, Fatima continued advocating for human rights with the now clandestine SWU, and was repeatedly arrested for doing so. A revolt in 1985 deposed Nimeiri, but in 1989 Omar al-Bashir, who ruled until 2019, seized power with the National Islamic Front. As al-Bashir used Islam to justify his rule, proclaiming Sudan an ‘Islamic state’, he feared Fatima’s influence. She used Islam to undermine his rule, proclaiming that ‘In the Qu’ran, God tells the Prophet “You have no power over people…”. And if the Prophet has received no power over people, what other Muslim could claim the right?’. When she was arrested again, Amnesty International intervened, allowing her to form a new branch of the SWU in exile.

Abroad, Fatima became one of many exiled and proud women who fought for the rights of the oppressed. In 1991, she was elected president of the Women’s International Democratic Federation, speaking at conferences across the world, and her high standing allowed her to return to Sudan, becoming a deputy in parliament in 2005. Her legacy lives on. Just as in 1956, 1964, and 1985, women were at the forefront of the protests of 2019. Western media often depicts the Islamic world as leaving women utterly crushed, requiring foreign intervention to ‘rescue’ them from men. However, Fatima’s life shows the resilience and resistance of women worldwide.

Teach-Out Review: Indigenous Politics and Revolutionary Movements in Latin America

Written by Anna Nicol.

In solidarity with the UCU strikes, there have been a number of organised Teach-outs which aim to create new spaces for learning and to explore alternative subject matters. In doing so they deconstruct traditional formats of learning and show that learning can take place at any time, in any format. On Tuesday 3 March, Dr Emile Chabal, the Director of the Centre for the Study of Modern and Contemporary History, organised a Teach-out led by Dr Julie Gibbings (University of Edinburgh) and Dr Nathaniel Morris (University College London). Focusing on Mexico, Guatemala and Nicaragua, Dr Gibbings and Dr Morris aimed to provide a short overview of indigenous participation in these revolutions over the twentieth century, highlighting various similarities and differences across borders and dissecting indigenous identity and affiliation within each. 

Having decided to discuss the revolutions chronologically, Dr Morris began with the Mexican Revolution, which spanned from 1910 to 1920. Here, Dr Morris highlighted an important element of discussing indigenous history: historians come into contact with differing, and occasionally competing, definitions of “indigeneity”. While 80-90% of the population in Mexico had indigenous ancestry, only 40-50% continued engaging with indigenous social structures, histories and languages, and interrogating their position in the world; therefore, focusing on indigenous revolutionary participation already presents obstacles in how we engage with and define indigenous identity itself. He argued that indigenous groups initially supported the revolution, partly as a result of pressure from landowners and the desire to reclaim their lands, and partly with the aim of increasing power and respect for their communities. However, Dr Morris noted that the leaders of the revolution perpetuated similar ideas and values to the old state in that they did not factor indigenous people into the “new Mexico”; instead, they aimed to consolidate a population of mestizos (individuals with both Hispanic and indigenous heritage), which created fertile ground for indigenous uprisings against mestizo national versions of the revolution until 1940, when the revolution became less radical. Throughout the revolutionary transformation of Mexico, the concept of “indigeneity” closely followed the values of indigenismo, which prioritised maintaining the “traditional” and performative aspects of indigenous identity, such as native dress, while eradicating the cultural values and practices which defined their “otherness” within Mexican society.

Dr Gibbings continued on from Dr Morris by describing how there were frequent intellectual exchanges across the Guatemala-Mexico border; for example, Miguel A. Asturias noted Mexico’s process of mestizaje after his visit in the 1920s but did not believe it could be applied to Guatemalan society, instead encouraging European immigration to Europeanise Guatemala. She then explained that, after independence in the nineteenth century, the western part of Guatemala became the political and economic heart of the country because of the growth of the coffee economy in the highlands. The growing economy resulted in widespread migration into the indigenous highlands and mobilised indigenous communities as a labour force for coffee planting. Similarly to Mexico, the revolution of 1944 to 1954 was largely led by the middle class and urban students, who aimed to go to the countryside and “civilise” indigenous groups through education, indicating, Dr Gibbings argued, that it was a revolution from above. The contentious issue in Guatemala was the unequal distribution of land – such as the coffee plantations – which moderates believed could be tackled by redistribution amongst the campesinos. This process would be headed by the elites as top-down agrarian reform, but it also provoked revolution from below, as it encouraged indigenous labourers to petition for land. Dr Gibbings argued that these petitions became a vehicle for historic restitution, because completing the required sections of the petitions allowed indigenous groups to write about how the land had historically belonged to them before it was stolen and colonised. These petitions posed a threat to the landed elite and to companies like the United Fruit Company, leading to a CIA-supported military coup in 1954 which overthrew the revolution.

Dr Morris concluded the presentation by describing the Nicaraguan Revolution of the 1980s. As in Guatemala, the Somoza dictatorship was backed by the United States and oversaw a deeply unequal division of land, with a small elite owning 90% of it and frequently leasing it to American companies in industries such as mining and fishing. A guerrilla movement emerged during the 1970s and successfully overthrew the Somoza dynasty in 1979. The revolution was seen as a “beacon of hope” by many who hoped it would be an anti-imperialist, left-wing (but not authoritarian) revolution that would end socioeconomic and political disparities and institute social reform. In order to understand the reception of the revolution, Dr Morris took time to note the geographical divides within Nicaragua, outlining that the Caribbean coast was never fully conquered by the Spanish, and so the coastline became known as the Miskitu territories, where the Miskitu and Mayangna communities lived. While the Miskitu and Mayangna were not entirely opposed to the revolution when it initially reached the Caribbean coast, they soon came to believe that the dictatorship, although oppressive, had generally allowed their ethnic and cultural differences to continue undisturbed. Therefore, as the revolutionaries attempted to assimilate Miskitu groups into the “new nation” through education, similar to policies in Mexico at the beginning of the century, the Miskitu found their cultural autonomy challenged and attempted to resist. The disturbance led to rumours that the Miskitu were separatists who wanted to break away to form their own state. The tension between the revolutionaries and the indigenous population culminated in the former forcing the latter to leave their villages for camps in the jungle, further alienating these communities. As indigenous people escaped the camps, they often fled to Honduras, where the Contra army was organised and supported by the CIA, which was providing arms to counter-revolutionaries – the Sandinistas did not distinguish between different indigenous groups, and so everyone was treated as a pro-American counter-revolutionary subversive. The civil war continued through the 1980s into the early 1990s, when the Sandinistas were defeated at the ballot box by centre-right liberals.

After this brief yet comprehensive overview of the three revolutionary countries, the floor was opened to a discussion which cannot be justly reproduced here. The discussion allowed the speakers to develop earlier points further and other members of the Teach-out to ask questions. Themes covered included the failure of left-wing revolutionaries to successfully incorporate indigenous movements into their cause without themselves denying indigenous rights to autonomy, and the gendered dimension of the revolutions, which saw the inclusion of women but no substantial launch of a women’s liberation movement. However, for me the most interesting part of the discussion was circling back to the concept of “indigeneity.” Dr Chabal asked how the development of indigenous identity has challenged neoliberal ideas, such as multiculturalism. In response, Dr Gibbings referenced Charles Hale’s argument on the indio permitido, or “permissible Indian”, a term borrowed from the Bolivian sociologist Silvia Rivera Cusicanqui, who argued that society needs a way to discuss and challenge governments that use cultural rights to divide and domesticate indigenous movements. Hale concludes that indigenous communities are allowed to build rights and establish platforms of culture so long as they do not hinder or challenge government schemes. Indigenous communities thereby become “permissible” if they act within the economic framework that the government establishes, but are discredited if they disagree or attempt to act outside those state frameworks. He writes that “governance now takes place instead through distinction…between good ethnicity, which builds social capital, and dysfunctional ethnicity, which incited conflict.” Understanding “permissible” and “impermissible” notions of indigeneity can therefore help us to better understand indigenous participation within these revolutions: indigenous groups were accounted for within the “new nations” when they adapted to the values of the forming nation-state, be it conforming to the national education system, learning Spanish or allowing a top-down redistribution of land. If indigenous communities resisted or attempted to construct a communal identity outside these values, they were deemed counter-revolutionary or “subversive”. Dr Morris closed by connecting neoliberal ideas of indigeneity at the end of the twentieth century to the perception of indigeneity at the beginning of the century; he argued that neoliberal recognition of indigenous groups is not dissimilar to indigenismo, in that indigenous “traditional” practices, such as dress and dances, are seen as acceptable, but no space is made for linguistic difference or political representation.

Grappling with the notion of “indigeneity” and representation left me challenging my own perceptions of indigenous identity. Discussing indigenous narratives within history and competing perceptions of indigeneity urges us to interrogate our own approach to talking and writing about indigenous history, and to consider how we incorporate an indigenous perspective into the narrative of revolution. Perhaps this final thought is the most productive part of a Teach-out: to have individuals leave examining their own approach to research and education, with the hope that new spaces will continue to form to re-evaluate and develop multiple narratives and perspectives.

Teach-Out Review: How Slavery Changed a City: Edinburgh’s Slave History

Written by Lewis Twiby.

As part of the teach-outs currently happening in solidarity with the UCU strike, the History Society and the African and Caribbean Society hosted a very informative talk on Edinburgh’s connection to the slave trade. Chaired by two History undergraduates, Jamie Gemmell and Isobel Oliver, three experts – Sir Geoff Palmer, professor emeritus at Heriot-Watt, Lisa Williams, the director of the Edinburgh Caribbean Association, and Professor Diana Paton, our own specialist in Caribbean slavery in HCA – gave short speeches and then answered questions about Edinburgh’s slavery connections. In keeping with the ideals of the strike, of resistance and hope for the future, the speakers aimed to move away from traditional narratives of subjugation, focusing instead on rehumanising enslaved peoples, discussing resistance, and considering how we can educate others about slavery.

Sir Geoff Palmer was first to speak, beginning his talk with how he moved to London from Jamaica, and eventually up to Edinburgh in 1964. He described how the Caribbean Student Association was once located where Potterrow now stands, and how a talk like this would never have happened in 1964. Sir Palmer then went on to discuss the economic and ideological ties Edinburgh had to slavery. This included how David Hume used slavery as evidence for Africans being of lower intelligence, which, in turn, became a justification for the enslavement of Africans. He further highlighted how the literal structure of Edinburgh is partially built upon slavery. Scots owned 30% of Jamaican plantations, holding around 300,000 people in slavery, and the staggering wealth made through slavery helped build the city. 24 Fort Street, 13 Gilmore Street, York Place, and Rodney Street all had slave owners living there – Rodney Street is even named after an admiral who defended Jamaica from the French. The person who received the largest government compensation following the abolition of slavery in 1834, John Gladstone, lived in Leith and received £83 million in today’s money. Despite this dark history of exploitation, Sir Palmer had some hope. He emphasised that having these talks was a step towards a brighter future, and stated, ‘We can’t change the past, but we can change the consequences’.

Professor Diana Paton continued after Sir Palmer, and wanted to look at the everyday aspects of slavery and the rehumanisation of those enslaved. She explained that many of those in Edinburgh who held plantations actually inherited them – plantation owners fathered children by enslaved women in an exploitative system, and many of these children were barred from inheritance. As a result, inheritance subtly spread the influence of slavery in Edinburgh. For example, the Royal Infirmary in the 1740s received £500 from Jamaican slaveholders as a donation, and in 1749 was left a 128-acre plantation with 49 enslaved people in a will. Margareta McDonald married David Robertson, the son of HCA’s ‘founder’ William Robertson, and then inherited a plantation from her uncle, Donald McDonald. The callous attitudes they held towards people showed the dehumanisation of the enslaved, according to Professor Paton. The infirmary, a place of healing, rented out enslaved people, earning £20,000 a year in today’s money, and a letter from Margareta in the 1790s asked whether she would get money from selling her slaves. However, Professor Paton also wished to rehumanise those enslaved and try to piece parts of their lives back together. For example, using the McDonalds’ inventory, she found out about the life of Bella, who was born in Nigeria, was around 30 in 1795, and tragically passed away in 1832 – just two years before emancipation. Professor Paton stressed that by looking for people like Bella we can remind the public that those enslaved were not just nameless masses, but real, breathing people.

Lisa Williams then began her speech, stating that her own Grenadian heritage, and the work of figures like Sir Palmer, inspired her to create the Edinburgh Caribbean Association. Williams wanted to break with the exploitation of black historical trauma by creating the Black History Walks – specifically, not a walking tour of slavery, although slavery is covered, but one tracing the forgotten history of Edinburgh’s Caribbean and African population since the sixteenth century. In the 1740s, where the Writers’ Museum is today, a black boy worked as a servant and was baptised; Malvina Wells from Carriacou was buried in St John’s Kirkyard in 1887; and the mixed-race Shaw family even inherited slaves. Williams further emphasised the ideological impact of slavery, both in the past and today. Some white abolitionists, including William Wilberforce, espoused racist beliefs, so non-white abolitionists, like Robert Wedderburn, challenged both slavery and racial bigotry. Meanwhile, John Edmonstone from Guyana taught Darwin taxidermy and biology, which is now believed to have helped inspire the journey on which he began developing the theory of evolution. She then discussed how the legacy of slavery in Scotland affects education today: pride in the Scottish Enlightenment, a lack of teaching in the past, and racism in present society, itself a by-product of slavery, have meant that this history has been forgotten. However, she argued that shifts in public opinion over reparations, including Glasgow University’s recent announcement that it would start looking at reparations, open the door to new educational opportunities. She concluded by saying that the first look at African history should not be through the slave trade; instead, African civilisations and the events of the Haitian Revolution should be taught in schools.

The question section, split into two parts with the first set from the hosts and the second from the audience, cannot be adequately summarised here. It allowed the speakers to elaborate on ideas they had wanted to discuss earlier, and the intellectual and emotional impact cannot be accurately represented here either. Instead, two themes cropped up throughout the discussion: education and decolonisation. Even then, these two themes were interconnected and can best be described as education through decolonisation. Sir Palmer, for example, spoke of how more research was needed to trace the economic and intellectual connections institutions had to slavery: Old College was partially funded through plantation profits, and graduates from the medical school went to work on slave ships and plantations. This was echoed by Williams and Professor Paton – Williams cited how UncoverEd literally uncovers the forgotten history of the university, and argued that this needs to be done elsewhere, not just in universities. Professor Paton added that the study of the Scottish Enlightenment has to be radically challenged, as its thinkers’ views on race helped justify slavery and the emergence of racism as we know it today. This further raises the question of whether we should even be naming buildings after and raising statues of these people. The passion of the speakers is one thing to take away from this – Williams’ drive to challenge heritage sites in Scotland to acknowledge slavery and abolition, and Professor Paton’s description of education and public memory in Scotland about slavery as ‘insulting’, highlighted their desire for change. A direct quote from Sir Palmer remains with me, and shows why we need to study the past and decolonise: we have to ‘find out what is right, not do what is wrong’.

Casualisation, Contracts, and Crisis: The University in the early 21st Century

Interviews conducted and written by Jamie Gemmell.

From the University of Edinburgh’s various prospective student webpages, you would conclude that teaching lies at the heart of the institution. In their words, Edinburgh offers “world-class teaching” and is “always keen to develop innovative approaches to teaching.” Whilst the quality of Edinburgh’s teaching may not be in doubt, it is apparent, judging by the way the institution treats staff, that teaching is near the bottom of the university’s priorities. Over the past few months I have conducted interviews with Dr. Tereza Valny (Teaching Fellow in Modern European History), Dr. Megan Hunt (Teaching Fellow in American History), Dr. Kalathmika Natarajan (Teaching Fellow in Modern South Asian History), and Professor Diana Paton (William Robertson Professor of History). This piece aims to give voice to some of their experiences, putting a face to some of the more opaque problems raised by the ongoing industrial dispute between the UCU and Universities UK.

Three of my interviewees are “Teaching Fellows,” a position frequently defined by its contractual vagueness. On the surface, this short-term position is designed to provide opportunities for early career scholars, with an emphasis on teaching and other student-facing activities. Often, the role is financed when a permanent member of staff acquires a large research grant. Theoretically, it’s a win-win: a more senior scholar can dedicate more time to their research, whilst a more junior scholar can gain some of the skills and experience required for a permanent position. The reality is very different. In Dr. Valny’s words, the Teaching Fellowship is “extremely exploitative and really problematic.” In her experience, it meant being “plunged into an institution” to run modules and having to “figure it out as you go along.” Similarly, Dr. Natarajan referred to the contract as “precarious.” She finds the contractual obligations “so overwhelming, that I often … need a bit of a break,” leaving her unable to conduct research in her unpaid spare time.

One of the primary issues with the Teaching Fellowship is the workload. Whilst Dr. Hunt’s contract stipulates that she should be working around twenty-nine to thirty hours per week, in reality she works “easily double that.” If she doesn’t have “specific plans on a weekend” she “will work.” Even then, she remains in a “cycle where you never quite get on top of it.” Dr. Natarajan puts it a bit more diplomatically, suggesting that her hours “definitely stretch more than the average work week.” Under the department’s carefully calibrated workload framework, five hours of one-on-one time are allocated to each tutorial group for a whole semester, and forty minutes to a typical undergraduate essay – and that includes engaging with the work, writing up feedback, and discussing it with the student. Obviously, this is not sufficient. Dr. Hunt concludes that if she worked the hours laid out by the workload framework, her classes “would be turning up and saying let’s have a chat.” Even for a Professor, these issues do not fall away. Whilst working to contract as part of the UCU industrial action this term, Professor Paton has been able to spend much less time preparing for teaching than she normally would, only “scanning over primary sources” and “relying on long-term knowledge” when it comes to the secondary literature. By focusing on quantifying time so precisely, the institution has failed students completely, relying on the goodwill of the University’s employees. It hardly reflects a desire to introduce “innovative approaches” to teaching.

With workloads so high, it is common for early career scholars to become trapped in teaching positions. Advancement in the sector relies on putting together a strong research portfolio – that means articles in highly regarded journals and respected book publications. As one of the University’s primary sources of income is research funding, scholars with reputable research backgrounds are crucial. However, Teaching Fellowships, by their very nature, stipulate little to no time to research. When I asked Dr. Natarajan how many hours she dedicated to research she laughed and said, “absolutely none.” Despite developing many of her key ideas through her teaching, Dr. Valny has never had the “space to take those ideas” and transform them into a book proposal. This can lead to anxiety and stress. Dr. Natarajan’s PhD is “constantly at the back of my mind,” yet she rarely finds significant time to transform the piece into a monograph. Without the adequate time allocated to research, these scholars can never advance. Dr. Valny, rather depressingly, concludes that if she continues within a Teaching Fellowship she will become “unemployable” in any other position. With her contract expiring in August this year, it appears that this possibility could become a reality. Her situation reflects a broader problem where staff dedicated to their students and teaching are not rewarded for their work.

The emphasis on research has led to pernicious discourses that devalue teaching, further demoralising many early career scholars who find themselves ensnared in these roles. In contrast to her time in Prague, where she was rewarded for producing popular courses (although still employed only temporarily), Dr. Valny finds herself suffering from feelings of “imposter syndrome” and “guilt, or inadequacy” when confronted with suggestions that she need only apply for research grants to escape her role. For Dr. Hunt, being “respected for what I already do quite well” would be more appreciated. She claims that “institutionally it (teaching) doesn’t matter.” By being “a good teacher,” she has risked her career being “put on hold, if not completely stalled.” Similarly, Dr. Natarajan has found her teaching treated as “a side-line” or a “side-note” to research. Performative professionalism has often defined these scholars’ teaching, masking an institution that disregards the craft and actively encourages academics to move away from it. This is despite some Teaching Fellows, such as Dr. Valny, accepting that a permanent teaching position would be “actually fine.”

These issues around workloads and casualisation intersect with the brutal policies of the Home Office, frequently referred to as the “hostile environment.” Home Office regulations stipulate that only “highly-skilled migrants” can live and work in the UK, meaning those on short-term contracts face another layer of instability. For Dr. Natarajan, this has been a major source of precariousness. She can “only stay as long as I have a job or, rather only as long as I have a visa and the visa depends on my job.” If Dr. Natarajan or her husband fail to secure another job after their current contracts expire, they risk deportation. Within the sector more broadly, advertisements for short-term jobs often assert that only those with a pre-existing right to reside can apply. This issue throws cold water over criticism that stereotypes strikers as middle-class whites. Demonstrably, scholars of colour often, in the words of Dr. Natarajan, “have their own very different set of precarious circumstances.”

Many of these issues reflect deeper structural problems within the higher education sector.  Scholars frequently cited the removal of the student cap and increase in tuition fees, reforms from 2010, as exacerbating pre-existing issues and transforming education into a commodity. Dr. Natarajan has suggested that the university has become a “business venture,” whilst Professor Paton claims that there was an “almost instant” change in the way students and management conceptualised higher education after 2010. Over the years, under Professor Paton’s analysis, this “quantitative increase has become a qualitative change,” putting pressure on staff and students. Despite student numbers and tuition fees increasing, Dr. Hunt suggests that “the service that people are paying” for is not being provided. Rather, money flows into marketing and big projects that elevate the positions of senior management figures.

The university sector appears to have reached a tipping point. On a micro level, staff are under increasing pressure, with workloads rising and casualisation becoming more widespread. A two-tier system has developed, with early career scholars expected to teach more and research less. Goodwill and professionalism appear to be the only things preventing university teaching from coming to a standstill. On a macro level, the sector has become partially commercialised, with fees privatised and universities encouraged to compete for students. This has occurred without a concomitant provision of consumer rights, leaving students forced to accept higher levels of debt without safeguards in place to demand improvements or changes in the service provided. These institutions have been left in some middle ground between state-funded institution and privately-funded business venture, to the detriment of academics and students. The demands being made under the ongoing industrial dispute are hardly radical: many academics are simply requesting greater job security and more respect for the work they do. If universities aren’t designed to support students or academics properly, we are all left asking who on earth they are designed for.

ENDURE AND SURVIVE: THE LGBTQ+ HISTORY OF VIDEO GAMES

Written by Tristan Craig.

When an Italian plumber called Mario entered the world of computer entertainment in 1981, embarking on what would become a never-ending quest to rescue a damsel in distress, that trope became the driving narrative for the majority of video games in the early days of their development. Just as Perseus slew the beast that threatened his beloved Andromeda, it fell to the might of the male protagonist – who, in his first appearance, was a carpenter referred to as ‘Jumpman’ – to rescue his girlfriend: the somewhat less imaginatively named ‘Lady’. A simple plot device catering to a predominantly white-heterosexual-male market, Super Mario Bros. sold over 40 million copies upon its release in 1985, reviving the home console industry following the crash of 1983 and paving the way for the platform genre.

This was a format which sold in vast numbers, but which offered remarkably little in the way of representation beyond the white heterosexual male. As homophobia swelled in the wake of the AIDS epidemic of the same decade, LGBTQ+ inclusion was profoundly absent from the video game industry, and the characters who did feature did so in a pejorative or peripheral manner. 1986’s text adventure Moonmist is commonly cited as the first to include any allusion to a queer character. Vivien Pentreath, an artist struggling to cope in the aftermath of the suicide of her female lover, Deirdre, is thought to be the first lesbian character to feature in a video game, although at no point is her sexuality explicitly stated. The only reference to her sexual identity is a note in one of four possible endings stating that ‘Vivien was intensely attached to Deirdre’ and that she was jealous of the latter’s heterosexual marriage. It is also worth noting that in this story arc, Vivien emerges as the villain in an otherwise tertiary role.

Whilst the inclusion of non-heterosexual characters was particularly rare, transgender identities were almost non-existent. Following the massive success of Super Mario Bros., Nintendo continued to develop games starring their eponymous hero. The second release in the series, which arrived on the Nintendo Entertainment System in 1988, introduced a character named Birdo: a pink creature of indeterminate species and gender. Birdo arrived in the United States with a manual entry which read ‘He thinks he is a girl and he spits eggs from his mouth. He’d rather be called “birdetta”’. The game itself provided no further backstory nor allusions of any kind to Birdo’s gender identity, consigning it to a problematically worded blurb in a guide. Future iterations of the game removed any allusion to Birdo being anything other than a cisgender female – although a 2008 Japan-only release called Captain Rainbow would revisit her canon, in one country at least.

As the larger development companies continued to indulge the majority of their market, the release of HyperCard software for the Macintosh in 1987 allowed independent designers to produce their own software with ease. The first fully LGBTQ+ game, written in HyperCard, subsequently emerged in 1989: Caper in the Castro follows a lesbian private investigator called ‘Tracker McDyke’ as she attempts to find her kidnapped drag queen friend. Released as charityware, the game opens with a note from creator C.M. Ralph stating that she ‘wrote this game as a labor of [her] love for the Gay and Lesbian community’ and asking the player to make a donation to an AIDS charity of their choice. The game would later be picked up by Heizer Software, where it enjoyed success – albeit renamed and fully ‘straightwashed’ as Murder on Main Street.

The 1990s saw somewhat more progressive steps away from the standard format. 1996 brought the introduction of a female protagonist in the guise of archaeologist Lara Croft. The Tomb Raider series broke away from the male-dominated lead, yet Croft was lauded and criticised in equal measure for being both a highly intelligent and a hypersexualised lead. Her creation, although hugely impactful on the video game market, was once again aimed primarily at a male audience. But the late 1990s did provide a landmark for LGBTQ+ inclusion. Black Isle Studios’ Fallout 2, released in 1998, contained the first same-sex marriage in a video game – six years before the first US state would legalise such unions. Fast forward to the 2000s and the landscape is certainly more diverse. Advances in the technical capabilities of home computing and the subsequent rise of the role-playing game have allowed players to craft their own identity, free from being forced down a singular heterosexual mode of gameplay. And yet, it is hard to deny the imbalance, particularly in representing gender identities beyond the male-female binary.

So how colourful does the future of gaming look for the LGBTQ+ community? May 2020 is set to welcome the highly anticipated sequel to Naughty Dog’s The Last of Us, which first introduced the post-apocalyptic world of Joel and Ellie in 2013. This time, our attention will be turned to Ellie – an openly lesbian protagonist. Following on from the Left Behind DLC, in which fourteen-year-old Ellie shares a short but tender kiss with her best friend Riley, the developers have chosen to fully actualise the sexual identity of their protagonist. But development companies have a long way to go if they want to fully represent a large proportion of their demographic, as the 2014 documentary Gaming in Colour explored. As 2020 marks almost forty years of an Italian plumber and his relentless quest to save his princess, perhaps we ought to reflect not only on how far the video gaming industry has come but on how much further it could and ought to go.

BIBLIOGRAPHY

‘Caper in the Castro: Internet Archive’, https://archive.org/details/hypercard_caper-in-the-castro (accessed 15.02.20) (note: you can play an online emulation of the game at this link)

‘Censored or Celebrated (Flouting Margins: Part 2)’, https://www.scholarlygamers.com/feature/2018/04/25/lgbt-flouting-margins-part-two/ (accessed 14.02.20)

‘LGBTQ Video Game Archive’, https://lgbtqgamearchive.com/games/games-by-decade/1980s/ (accessed 14.02.20)

The Writing on the Wall: The Perilous Future of Historical Sites and Monuments

Written by: Tristan Craig.

In March 2014, officials in the Huairou District of Beijing announced their intention to designate part of the Great Wall of China a ‘graffiti zone’, formally allowing individuals to freely etch their names into the millennia-old fortification. Located within the Mutianyu section – one of the best-preserved areas of the wall and particularly popular with tourists – the controversial decision was made in an effort to abate increasing destruction by overzealous visitors. Authorities claimed this drastic step was a necessary one, arguing that patrolling officers and warning signs across the 5,500-mile-long structure had already proven ineffective, and that granting people permission to inscribe their messages on a small portion of the wall would lure them away from defacing it elsewhere.

The Great Wall of China, however, is not unique in its suffering. Just three months after the dedication of the graffiti zone in Beijing, part of the parapet on the historic Pont des Arts bridge in Paris (originally constructed during the reign of Napoleon Bonaparte in the nineteenth century before being rebuilt in the 1980s) collapsed under the weight of padlocks attached to its railings. In a tradition where amorous couples seek to immortalise their devotion by marking their initials on a ‘love lock’ before attaching it to a bridge or other railing, the literal weight of this act proved costly. Subsequently, some forty-five tonnes of padlocks were removed from the Pont des Arts and destroyed. Locals in Verona, Italy, are similarly bearing the burden of sentimental tradition. The courtyard of the Casa di Giulietta, believed to have belonged to William Shakespeare’s eponymous – and fictitious – heroine in Romeo and Juliet, has become awash with scribblings, engravings and paper notes tacked to the walls with chewing gum (despite the infamous balcony from which she delivers her soliloquy being a twentieth-century addition). The response from local authorities in this instance was to provide removable boards to satiate visitors’ desire to commemorate their love and to prevent further damage to neighbouring properties.

Human defacement is not the only factor at work in the destruction of outdoor landmarks, however. Erosion caused by weather conditions and natural degradation over time is a constant threat to their conservation. How best to preserve them whilst maintaining the integrity of the original structure has often proven a contentious issue. The Pictish standing stones of medieval Scotland exemplify this struggle, as their pictorial and ogham inscriptions are gradually worn by the harsh climate. In Forres, on the Moray coast, the monolithic ninth-century Sueno’s Stone now stands encased behind glass panels to prevent further damage, although it remains on the site where it was first erected. The Anglo-Saxon Ruthwell Cross, which once stood in the yard of the church after which it is named, is now housed inside the building in an apse specially constructed for it in 1887. Whilst sheltering the cross – which dates to the eighth century – from the elements has ensured its survival, it no longer serves the same purpose for which it was originally created. What once was an object to be observed by passing laity – an ecclesiastical monument to the omnipotence of God and a beacon for travelling pilgrims – is now an artefact of historical religious significance. Whilst the artistry of the cross can still be admired by those who enter Ruthwell Church today, its context in the annals of medieval Christianity ought not to be disregarded.

Preserving and restoring structures subject to elemental deterioration presents a plethora of issues to conservationists, something only exacerbated at sites which benefit greatly from the tourist trade. Drawing new swathes of visitors to an area can on occasion serve as the driving force behind restoring ancient monuments, but it becomes problematic when the work is done to an inadequate standard. Addressing cosmetic deterioration on a merely superficial level often fails to address the source of the decay, or does so in a manner unsympathetic to the original architecture. In 2016, the Great Wall of China once again hit headlines due to what individuals, including the chairman of the China Great Wall Society, viewed as an incredibly unsympathetic repair to a mile-long stretch of it; official plans were subsequently announced in 2019 to restore the wall and prevent further irreversible damage.

Needless to say, preserving a monument as grand in scale as the Great Wall of China is not quite as simple as installing plywood panels along its ramparts or placing it behind armoured glass. The dedication of the graffiti zone may be seen as the best possible course of action in what has become an increasingly dire situation. The most drastic course – preventing tourist access to the well-travelled sections of the wall – would prove gravely arduous and immensely costly given the economic boost provided by tourist commerce, and graffiti is but one problem threatening the future of the structure. An estimated 30% of the Great Wall built by the ruling Ming dynasty has already been lost to natural erosion and the theft of bricks by tourists and locals alike. Although patrols continue along the route, maintaining 5,500 miles of fortification visited by an estimated eleven million people annually in the most popular sections alone is a complex undertaking.

On occasion, however, the inscriptions left behind by figures eminent in their own right may actually serve to draw visitors to an area. A signature purported to belong to the Romantic poet Lord Byron on a column of the Temple of Poseidon at Cape Sounion in Attica draws as many literary admirers as it does classicists. It would be malapropos to suggest, however, that making an allowance for this particular piece of history should encourage the public debasement of ancient monuments. The desire of people to leave their mark on sites of great provenance is all too common an issue facing conservationists, and whilst individuals may act in ignorance of the consequences of their vandalism, it is a problem that needs addressing on a global scale before historical sites are irreversibly damaged – or lost entirely.

BIBLIOGRAPHY

Aitken-Burt, Laura. “Rewriting History.” History Today, January, 2020.

Chance, Stephen. “The Politics of Restoration.” The Architectural Review 196, no. 1172 (October 1994): 80-4.

Coldwell, Will. “Great Wall of China to Establish Graffiti Area for Tourists.” The Guardian, 4 March 2014. https://www.theguardian.com/travel/2014/mar/04/great-wall-of-china-graffiti-area.

Image: https://www.discoverchina.com/article/reach-great-wall-china-from-shanghai

Kinloch Castle, Isle of Rum.

Written by: Mhairi Ferrier.

Majestic, intriguing, remarkable, captivating…

These are just some of the words that come to mind when describing the Isle of Rum, located in the Scottish Highlands. The largest of Scotland’s Small Isles, accessed by ferry from Mallaig, Rum is these days maintained by a combination of Scottish Natural Heritage (SNH) and the Isle of Rum Community Trust. In 2009 and 2010 there was a transfer of land and assets in Kinloch Village and the surrounding area from SNH to the Community Trust. This was a landmark change for the island and marked a new phase in its history. Kinloch Castle, located in Kinloch, the island’s main village, is still under the control of SNH.

The Isle of Rum has a deeply rich history, spanning from the Ice Age to interactions with Vikings before falling victim to the Highland Clearances. A piece of this length could not begin to do justice to the comprehensive history of the island, although there are some points in this history which hold the key to the island’s economic future. At its height the island community numbered nearly 450 inhabitants; that was, however, before the Highland Clearances removed these people from their homes. Never again has the island hosted such a population – today’s community is made up of fewer than 30 people. Post-Clearances, the island passed through the hands of different owners before becoming the possession of the Bullough family.

Between 1897 and 1900, Kinloch Castle was constructed on the island, commissioned by George Bullough. Bullough had inherited a large sum of wealth, and the island itself, from his father John, who had made his fortune in Lancashire’s textile industry. Extravagantly built, the castle cost £250,000 (equivalent to millions of pounds today) and was decorated in the Victorian fashions of the age. The Bulloughs hosted guests on the island, offering a wealth of activities within the castle and across the rest of the island: everything from a spot of dancing in the ballroom to a game of golf or tennis, though perhaps guests of this station would be more interested in pursuing stalking on the island. Yet life on the island was a tale of two halves during the Bullough years: on one side the extravagant lifestyle of the Bulloughs and their guests, and on the other those working for the Bulloughs, for whom island life was most likely a struggle.

The First World War put an end to this extravagance, with George Bullough gaining a military position and workers enlisting in the army. After the war drew to a close, fewer than a handful of the workers returned from the conflict and the Bullough family frequented their island paradise less. There was little appetite for the lavish pastimes and dinner parties that had been the norm for the privileged during the Victorian period and the beginning of the Edwardian epoch. George Bullough’s death in 1939 led to the castle being frequented even less, and his widow, Lady Monica, sold the island (including the castle) in 1957 for a sum of £23,000. Sir John Betjeman was largely correct when he predicted that:

In time to come the Castle will be a place of pilgrimage for all those who want to see how people lived in good King Edward’s days.

Kinloch Castle quickly became a staple for any tourist visiting the island and for many years housed hostel accommodation and a pub. There were, and remain, regular tours through which visitors can gain an understanding of a lifestyle led in stark contrast to the pursuits of an ordinary island man or woman. As most know, the upkeep of any historic building is costly, and this struggle led to the closure of the castle’s hostel and pub in 2015. The campaign to keep the castle in its prime continues and is supported by the Kinloch Castle Friends Association (KCFA).

KCFA have great plans going forward, which include re-opening the hostel with brand new accommodation as well as restoring the museum rooms of the castle to their former glory. This would not only increase the amount of accommodation available for visitors on the island but also help boost the economy for locals. With new housing being built on the island, things would appear to be going from strength to strength. However, there is one snag in this plan: KCFA’s asset transfer bid was rejected by SNH, who did not believe the association’s plans to be financially viable. With each delay such as this, the castle deteriorates further, increasing the sum required to complete the restoration.

Well, what now?

KCFA are appealing for further financial support in order to make their plans a reality. Should further funding not materialise, demolition is a stark possibility that SNH are considering. While the demolition of historic buildings is not unusual, it would be a disaster in this case. Kinloch Castle is a unique part of Rum’s fascinating history: a building which tells the story of the decadence and wealth of the most privileged in the early 1900s, contrasting starkly with the lives of the working-class islanders. The castle is a true symbol of Highland history, demonstrating what happened on many of the region’s estates after the Clearances. As such, it must remain as a reminder for all. Furthermore, the castle would provide a vital boost to the local economy should it remain. With the repopulation of the vast Scottish Highlands now taking shape thanks to various initiatives, any available boost to local economies is vital to making it a success.

This new decade could be remembered as the one in which Kinloch Castle was demolished, or as the one in which a revitalised and restored castle makes a real impact on the community. The combination of new housing, new opportunities under community ownership and a restored castle no longer controlled by SNH would be a monumental development for Rum.

For now, we’ll have to wait and see.  

Image Source: http://www.isleofrum.com/isleofrumheritag.php