Modernisation Theory: Challenging British Exceptionalism and the Unilinear Model

Written by Ella Raphael. 

Modernisation Theory refers to a model of societal transition, originally describing the movement from a ‘traditional’ society to an ‘advanced’ one. Since the seventies it has been a topic of contentious debate. Revisionists have challenged traditional theorists, such as Walt Whitman Rostow and Marion Levy, criticising their narrow rubric of modernity, which was based on Britain’s economic success in the Industrial Revolution. Recent historiography has raised the following questions: what constitutes modernity, whose rubric are we following, and why? The new wave of debate has reiterated the benefits of extending the theory, incorporating the ideas of multiple modernities and economic efflorescences, for example. Despite this, moving forward, historians must navigate the risk of making the theory too broad, and must ensure it maintains its sense of structural cohesion. 

Rostow’s 1960 theory of modernisation was highly influential, yet widely criticised. He argues in The Stages of Economic Growth: A Non-Communist Manifesto that all societies go through five stages of development, starting in a pre-Newtonian state and ending in an age of mass consumption. He uses industrialising Britain as a recipe for economic success for “developing” countries. He does not hide his political agenda, explicitly subtitling his model “a non-communist manifesto” and stating that the Soviet Union can achieve modernisation once it abandons the Marxist model of development. Levy, another early theorist writing in the sixties, argued that as the level of modernisation increases, so does the structural uniformity among societies. Since the revival of modernisation theory in the nineties this unilinear model has been adapted. 

The two main criticisms of Rostow’s model of modernisation are that it is Eurocentric and teleological. Historians now recognise that there are multiple paths to “modernity”, which in this case means economic maturity. Jack Goldstone argues that within modernisation theory there is too much emphasis on the heroic narrative of the “Rise of the West”. He argues that many early modern and non-European societies experienced “efflorescences” of economic growth and steady increases in technological change. He also challenges British exceptionalism by arguing that Britain’s industrial success was a historical anomaly resulting from the lucky conjuncture of an economic efflorescence and a growing culture of engineering. His research on Qing China and Golden Age Holland weakens the unilinear vision, demonstrating that countries do not all follow the same economic trajectory. 

The Enlightenment ideology that every society inevitably develops a similar set of ideas, customs and institutions is a simplification. Adam Smith, an early influence on modernisation theory, used the framework of the ‘Four Stage Theory’ to suggest that, given enough time, every society would converge towards one homogenous form. Sanjay Subrahmanyam, David Porter and Joseph Fletcher address the issues with such comparative exercises: there is a tendency to use categories derived from a European experience, and these then shape the questions comparative historians ask. Western modernity has been given a privileged position, set as the benchmark against which all other societies are judged inferior. Condorcet, another Enlightenment philosopher, epitomises this Eurocentric idea; he believed that the rest of the world could look to Western European societies and see its own future. The rubric of modernity is rooted in the idea of European superiority. As Matthew Lauzon argues, this rigid outlook has provided a “theoretical justification for European cultural and imperial hegemony”. The teleology is problematic because it places modern European society as the pinnacle of civilisation and human development. 

Additionally, historians now recognise that as well as many paths to modernity, there are many different destinations too. Shmuel Eisenstadt’s concept of “multiple modernities” has helped discredit the notion that modernisation is synonymous with Westernisation. He argues that forms of modernisation across societies are not homogenous because of their varied cultural and historical backgrounds. He looks at fundamentalism and argues that this should be seen as an alternative branch of modernisation rather than as a traditionalist form of governance. He says ‘the distinct visions of fundamentalist movements have been formulated in terms common to the discourse of modernity; they have attempted to appropriate modernity on their own terms.’ Tu Wei-ming supports this through his idea of “Confucian” modernity, in Japan for example. He states that East Asian modernity focuses more on soft authoritarianism, paternalistic polity and government leadership in the market economy. Both Tu and Eisenstadt show that modernity is not uniform and is not derived solely from Western Europe. Although this adaptation succeeds in addressing the teleology and Eurocentrism of the original theory, there is the risk of it becoming too broad, losing its core meaning and thus being made redundant. Volker Schmidt urges theorists of multiple modernities to establish a core meaning of the term, so that their claims can be appropriately measured. Nevertheless, the idea provides a potential framework through which future historians can compare levels of development. 

The adaptations made to modernisation theory have helped redefine what ‘modernity’ means. They have brought into question whose rubric we choose to follow, and they have helped us understand alternative economic and political trajectories. Eisenstadt’s multiple modernities theory and Goldstone’s concept of economic efflorescences have challenged British exceptionalism and Rostow’s unilinear model. Nevertheless, the adaptations are not perfect and pose a new set of methodological issues. It is now the task of current historians to create a standardised, core meaning of modernisation, in order to fully assess whether societies have reached this stage. 


Eisenstadt, Shmuel N. & Schlechter, Wolfgang (eds.), Daedalus 127.3 (1998), special issue: ‘Early Modernities’. 

Fletcher, Joseph, ‘Integrative History: Parallels and Interconnections in the Early Modern Period, 1500-1800’, in Studies on Chinese and Islamic Inner Asia (Aldershot: Ashgate, 1995) pp.1-35

Goldstone, Jack A., ’Efflorescences and Economic Growth in World History: Rethinking the “Rise of the West” and the Industrial Revolution’, Journal of World History 13, no. 2 (2002): 323-389. 

Goldstone, Jack, ‘The Problem of the “Early Modern” World’, Journal of the Economic and Social History of the Orient, 41.3 (1998), 252. 

Hoff, Karla, ’Paths of Institutional Development: A View from Economic History’, World Bank Research Observer 18, no. 2 (2003): 205-226. 

Hout, Wil, ‘Classical Approaches to Development: Modernisation and Dependency’ (2016). 

Lauzon, Matthew, ‘Modernity’, in The Oxford Handbook of World History, ed. Jerry H. Bentley (Oxford, 2011), pp. 72-84. 

Levy, Marion J., Jr., Modernization and the Structure of Society (Princeton, NJ: Princeton University Press, 1966). 

Marsh, Robert M., ‘Modernisation Theory, Then and Now’, Comparative Sociology 13 (2014): 261-283. 

Porter, David (ed.), Comparative Early Modernities (New York, 2012) 2. 

Rostow, Walt Whitman, The Stages of Economic Growth: A Non-Communist Manifesto, 3rd ed. (Cambridge: Cambridge University Press, 1991), pp. 4-16 (Chapter 2: ‘The Five Stages of Growth – A Summary’). Ebook. 

Schmidt, Volker, ‘Modernity and Diversity’, Social Science Information 49 (2010): 511-538. 

Scott, Hamish ‘Introduction: Early Modern Europe and the Idea of Early Modernity’ in idem (ed.), The Oxford Handbook of Early Modern History, 1350-1740. (Oxford, 2015) pp. 1-34. 

Subrahmanyam, Sanjay, ‘Connected Histories: Notes towards a Reconfiguration of Early Modern Eurasia’, Modern Asian Studies, 31.3 (1997), pp. 735-762. 

Tu, Wei-ming, ‘Multiple Modernities’, in Chinese Ethics in a Global Context, ed. K. Pohl (Leiden, 2002), p. 63. 

Walker, Garthine, ‘Modernization’, in eadem (ed.), Writing Early Modern History (London, 2005), ch. 2.


Teach-Out Review: Indigenous Politics and Revolutionary Movements in Latin America

Written by Anna Nicol.

In solidarity with the UCU strikes, there have been a number of organised Teach-outs which aim to create new spaces for learning and to explore alternative subject matters. In doing so they deconstruct traditional formats of learning and show that learning can take place at any time, in any format. On Tuesday 3 March, Dr Emile Chabal, the Director of the Centre for the Study of Modern and Contemporary History, organised a Teach-out led by Dr Julie Gibbings (University of Edinburgh) and Dr Nathaniel Morris (University College London). Focusing on Mexico, Guatemala and Nicaragua, Dr Gibbings and Dr Morris aimed to provide a short overview of indigenous participation in these revolutions over the twentieth century, highlighting various similarities and differences across borders and dissecting indigenous identity and affiliation within each. 

Having decided to discuss the revolutions chronologically, Dr Morris began with the Mexican Revolution, which spanned from 1910 to 1920. Here, Dr Morris highlighted an important element of discussing indigenous history: historians come into contact with differing, and occasionally competing, definitions of “indigeneity”. While 80-90% of the population in Mexico had indigenous ancestry, only 40-50% continued engaging with indigenous social structures, histories and languages, and interrogating their position in the world; therefore, focussing on indigenous revolutionary participation already presents obstacles in how we engage with and define indigenous identity itself. He argued that indigenous groups initially supported the revolution, in part as a result of pressure from landowners and the desire to reclaim their lands, as well as the aim of increasing power and respect for their communities. However, Dr Morris noted that the leaders of the revolution perpetuated ideas and values similar to those of the old state, in that they did not factor indigenous people into the “new Mexico”; instead, they aimed to solidify a population of mestizos (individuals with both Hispanic and indigenous heritage), which created fertile ground for indigenous uprisings against mestizo national versions of the revolution until 1940, when the revolution became less radical. Throughout the revolutionary transformation of Mexico, the concept of “indigeneity” closely followed the values of indigenismo, which prioritised maintaining the “traditional” and performative aspects of indigenous identity, such as native dress, while eradicating cultural values and practices which defined their “otherness” within Mexican society.

Dr Gibbings continued on from Dr Morris by describing the frequent intellectual exchanges across the Guatemala-Mexico border; for example, Miguel A. Asturias noted Mexico’s process of mestizaje after his visit in the 1920s but did not believe it could be applied to Guatemalan society, instead encouraging European immigration in order to Europeanise Guatemalan society. She then explained that after independence in the nineteenth century, the western part of Guatemala became the political and economic heart of the country because of the growth of the coffee economy in the highlands. The growing economy resulted in widespread migration into the indigenous highlands and mobilised indigenous communities as a labour force for coffee planting. Similarly to Mexico, the 1944 to 1954 revolution was largely led by the middle class and urban students, who aimed to go to the countryside and “civilise” indigenous groups through education, indicating, Dr Gibbings argued, that it was a revolution from above. The contentious issue in Guatemala was the unequal distribution of land, such as the coffee plantations, which moderates believed could be tackled by redistribution amongst the campesinos. This process would be headed by the elites as top-down agrarian reform; the change provoked revolution from below, as it encouraged indigenous labourers to petition for land. Dr Gibbings argued that these petitions became a vehicle for historic restitution, because completing the required sections of the petitions allowed indigenous groups to write about how the land had historically belonged to them before it was stolen and colonised. The petitions posed a threat to the landed elite and to companies like the United Fruit Company, leading to a CIA-supported military coup in 1954 which overthrew the revolution. 

Dr Morris concluded the presentation by describing the Nicaraguan Revolution of the 1980s. As in Guatemala, the Somoza dictatorship was backed by the United States and oversaw an unequal division of land: a small elite owned 90% of it, frequently leasing it to American companies in industries such as mining and fishing. A guerrilla movement emerged during the 1970s and successfully overthrew the Somoza dynasty in 1979. The revolution was seen as a “beacon of hope” by many who hoped it would be an anti-imperialist, left-wing (but not authoritarian) revolution that would end socioeconomic and political disparities and institute social reform. In order to understand the reception of the revolution, Dr Morris took time to note the geographical divides within Nicaragua, outlining that the Caribbean coast was never fully conquered by the Spanish, and so the coastline became known as the Miskitu territories, where the Miskitu and Mayangna communities lived. While the Miskitu and Mayangna were not entirely opposed to the revolution when it initially reached the Caribbean coast, they soon came to believe that the dictatorship, although oppressive, had generally allowed their ethnic and cultural differences to continue undisturbed. Therefore, as the revolutionaries attempted to assimilate Miskitu groups into the “new nation” through education, similar to policies in Mexico at the beginning of the century, the Miskitu found their cultural autonomy challenged and attempted to resist. The disturbance led to rumours that the Miskitu were separatists who wanted to break away to form their own state. The tension between the revolutionaries and the indigenous population culminated in the former forcing the latter out of their villages and into camps in the jungle, further alienating the communities. 
As indigenous people escaped these camps, they often fled to Honduras, where the Contra army was organised and supported by the CIA, which was providing arms to counterrevolutionaries – the Sandinistas did not distinguish between different indigenous groups, and so all were treated as pro-American counterrevolutionary subversives. The civil war continued through the 1980s into the early 1990s, when the Sandinistas were defeated at the ballot box by centre-right liberals. 

Having provided a brief yet comprehensive overview of the three revolutionary countries, the speakers opened the floor to a discussion which cannot be justly reproduced here. The discussion allowed the speakers to further develop earlier points and other members of the Teach-out to ask questions. Themes covered included the failure of left-wing revolutionaries to successfully incorporate indigenous movements into their cause without themselves denying indigenous rights to autonomy, as well as the gendered dimension of the revolutions, which saw the inclusion of women but no substantial launch of a women’s liberation movement. However, for me the most interesting part of the discussion was circling back to the concept of “indigeneity.” Dr Chabal asked how the development of indigenous identity has challenged neoliberal ideas, such as multiculturalism. In response, Dr Gibbings referenced Charles Hale’s argument on the indio permitido, or “permissible Indian”. Indio permitido is a term borrowed from the Bolivian sociologist Silvia Rivera Cusicanqui, who argued that society needs a way to discuss and challenge governments that use cultural rights to divide and domesticate indigenous movements. Hale therefore concluded that indigenous communities are allowed to build rights and establish platforms of culture so long as they do not hinder or challenge government schemes. Indigenous communities thereby become “permissible” if they act within the economic framework that the government establishes, but are discredited if they disagree or attempt to act outside of those state frameworks. 
He writes: “governance now takes place instead through distinction…between good ethnicity, which builds social capital, and dysfunctional ethnicity, which incited conflict.” Understanding “permissible” and “impermissible” notions of indigeneity can therefore help us to better understand indigenous participation within these revolutions: indigenous groups were accounted for within the “new nations” when they adapted to the values of the forming nation-state, be it conforming to the national education system, learning Spanish or allowing a top-down redistribution of land. If indigenous communities resisted or attempted to construct a communal identity outside these values, they were deemed counterrevolutionary or “subversive”. Dr Morris closed by connecting neoliberal ideas of indigeneity at the end of the twentieth century to the perception of indigeneity at the beginning of the century; he argued that neoliberal recognition of indigenous groups is not dissimilar to indigenismo, in that indigenous “traditional” practices, such as dress and dances, are seen as acceptable, but no space is made for linguistic difference or political representation. 

Grappling with the notion of “indigeneity” and representation left me challenging my own perceptions of indigenous identity. Discussing indigenous narratives within history and competing perceptions of indigeneity urges us to interrogate our own approach to talking and writing about indigenous history, and understanding how we incorporate an indigenous perspective into the narrative of revolution. Perhaps this final thought is the most productive part of a Teach-out: to have individuals leave examining their own approach to research and education with the hope that new spaces will continue to form to re-evaluate and develop multiple narratives and perspectives.

Teach-Out Review: How Slavery Changed a City: Edinburgh’s Slave History

Written by Lewis Twiby.

As part of the teach-outs currently happening in solidarity with the UCU strike, the History Society and the African and Caribbean Society hosted a very informative talk on Edinburgh’s connection to the slave trade. Chaired by two History undergraduates, Jamie Gemmell and Isobel Oliver, three experts – Sir Geoff Palmer, professor emeritus at Heriot-Watt, Lisa Williams, the director of the Edinburgh Caribbean Association, and Professor Diana Paton, our own specialist in Caribbean slavery in HCA – gave short speeches and then answered questions about Edinburgh’s slavery connections. In keeping with the ideals of the strike, of resistance and hope for the future, the speakers aimed to move away from traditional narratives of subjugation, instead focusing on rehumanising enslaved peoples, discussing resistance, and showing how we can educate others on slavery.

     Sir Geoff Palmer was first to speak, beginning his talk with how he moved to London from Jamaica, and eventually up to Edinburgh in 1964. He discussed how the Caribbean Student Association was once located where Potterrow is now, and how this talk could never have happened in 1964. Sir Palmer then went on to discuss the economic and ideological ties Edinburgh had to slavery. This included how David Hume used slavery as evidence for Africans being of lower intelligence, which, in turn, became a justification for the enslavement of Africans. He further highlighted how the literal structure of Edinburgh is partially built upon slavery. Scots owned 30% of Jamaican plantations, holding around 300,000 enslaved people, and the staggering wealth which was made through slavery helped build the city. 24 Fort Street, 13 Gilmore Street, York Place, and Rodney Street all had slave owners living there – Rodney Street is even named after an admiral who defended Jamaica from the French. The person who received the largest government compensation following the abolition of slavery in 1834, John Gladstone, lived in Leith and received £83 million in today’s money. Despite this dark history of exploitation, Sir Palmer had some hope. He emphasised how having these talks was a step towards a brighter future, and stated ‘We can’t change the past, but we can change the consequences’.

     Professor Diana Paton continued after Sir Palmer, wanting to look at the everyday aspects of slavery and the rehumanisation of those enslaved. She explained that many of those in Edinburgh who held plantations actually inherited them – the horrors of slavery meant that plantation owners fathered biological children on enslaved women within an exploitative system, and many of these children were barred from inheritance. As a result, inheritance subtly spread the influence of slavery in Edinburgh. For example, the Royal Infirmary received £500 from Jamaican slaveholders as a donation in the 1740s, and in 1749 was left a 128-acre plantation with 49 enslaved people in a will. Margareta McDonald married David Robertson, the son of HCA’s ‘founder’ William Robertson, and then inherited a plantation from her uncle, Donald McDonald. The callous attitudes they held towards people showed the dehumanisation of the enslaved, according to Professor Paton. The infirmary, a place of healing, rented out enslaved people, earning £20,000 a year in today’s money, and a letter from Margareta in the 1790s asked whether she would get money from selling her slaves. However, Professor Paton also wished to rehumanise those enslaved and try to piece parts of their lives back together. For example, using the McDonalds’ inventory, she found out about the life of Bella, born in Nigeria, who was around 30 in 1795 and tragically passed away in 1832 – just two years before emancipation. Professor Paton stressed that by looking for people like Bella we can remind the public that those enslaved were not just nameless masses, but real, breathing people.

     Lisa Williams then began her speech, stating that her own Grenadian heritage, and the works of figures like Sir Palmer, inspired her to create the Edinburgh Caribbean Association. Williams wanted to break with the exploitation of black historical trauma by creating the Black History Walks – specifically, not a walking tour of slavery, although slavery is covered, but one tracing the forgotten history of Edinburgh’s Caribbean and African population since the sixteenth century. In the 1740s, where the Writers’ Museum is today, a black boy worked as a servant and was baptised; Malvina Wells from Carriacou was buried in St John’s Kirkyard in 1887; and the mixed-race Shaw family even inherited slaves. Williams further emphasised the ideological impact of slavery, both in the past and today. Some white abolitionists, including William Wilberforce, espoused racist beliefs, so non-white abolitionists, like Robert Wedderburn, challenged both slavery and racial bigotry. Meanwhile, John Edmonstone from Guyana taught Darwin taxidermy and biology, something now believed to have inspired him to go on the journey where he began developing the theory of evolution. She then discussed how the legacy of slavery impacts education in Scotland today. Pride in the Scottish Enlightenment, a lack of teaching in the past, and racism in present society, itself a by-product of slavery, have meant that this history has been forgotten. However, she further argued that shifts in public opinion over reparations, including Glasgow University’s recent announcement that it would start looking at reparations, open the door to new educational opportunities. She concluded by saying that the first look at African history and slavery should not be through the slave trade; instead it should begin with African civilisations being taught in schools and the events of the Haitian Revolution.

     The question section, split into two with the first set from the hosts and the second from the audience, cannot be adequately summarised here. This section of the teach-out allowed the speakers to elaborate on ideas they had wanted to discuss earlier, and its intellectual and emotional impact cannot be accurately represented. Instead, two themes cropped up throughout the discussion: education and decolonisation. Even then, these two themes were interconnected, and can best be described as education through decolonisation. Sir Palmer, for example, spoke of how more research was needed to trace the economic and intellectual connections institutions had to slavery: Old College was partially funded through plantation profits, and graduates from the medical school went on to work on slave ships and plantations. This was echoed by Williams and Professor Paton – Williams cited how UncoverEd literally uncovers the forgotten history of the university, and how this needed to be done elsewhere, not just in universities. Professor Paton added that the study of the Scottish Enlightenment had to be radically challenged, as its thinkers’ views on race helped justify slavery and the emergence of racism as we know it today. This further raises the question of whether we should even be naming buildings after, and raising statues of, these people. The passion of the speakers is one thing to take away from this – Williams’ drive to challenge heritage sites in Scotland to acknowledge slavery and abolition, and Professor Paton’s description of education and public memory in Scotland about slavery as ‘insulting’, highlighted their desire for change. A direct quote from Sir Palmer remains with me, and shows why we need to study the past and decolonise: we have to ‘find out what is right, not do what is wrong’.

Casualisation, Contracts, and Crisis: The University in the early 21st Century

Interviews conducted and written by Jamie Gemmell.

From the University of Edinburgh’s various prospective student webpages, you would conclude that teaching lies at the heart of the institution. In their words, Edinburgh offers “world-class teaching” and is “always keen to develop innovative approaches to teaching.” Whilst the quality of Edinburgh’s teaching may not be in doubt, it is apparent, judging by the way the institution treats staff, that teaching is near the bottom of the university’s priorities. Over the past few months I have conducted interviews with Dr. Tereza Valny (Teaching Fellow in Modern European History), Dr. Megan Hunt (Teaching Fellow in American History), Dr. Kalathmika Natarajan (Teaching Fellow in Modern South Asian History), and Professor Diana Paton (William Robertson Professor of History). This piece aims to give voice to some of their experiences, putting a face to some of the more opaque problems raised by the ongoing industrial dispute between the UCU and Universities UK.

Three of my interviewees are “Teaching Fellows,” a position frequently defined by its contractual vagueness. On the surface, this short-term position is designed to provide opportunities for early career scholars, with an emphasis on teaching and other student-facing activities. Often, the role is financed when a permanent member of staff acquires a large research grant. Theoretically, it’s a win-win: a more senior scholar can dedicate more time to their research, whilst a more junior scholar can gain some of the necessary skills and experience required for a permanent position. The reality is very different. In Dr. Valny’s words, the Teaching Fellowship is “extremely exploitative and really problematic.” In her experience, it meant being “plunged into an institution” to run modules and “figur[ing] it out as you go along.” Similarly, Dr. Natarajan referred to the contract as “precarious.” She finds the contractual obligations “so overwhelming, that I often … need a bit of a break,” leaving her unable to conduct research in her unpaid spare time. 

One of the primary issues around the Teaching Fellowship is the workload. Whilst Dr. Hunt’s contract stipulates that she should be working around twenty-nine to thirty hours per week, in reality she works “easily double that.” If she doesn’t have “specific plans on a weekend” she “will work.” Even then, she remains in a “cycle where you never quite get on top of it.” Dr. Natarajan puts it a bit more diplomatically, suggesting that her hours “definitely stretch more than the average work week.” Under the department’s carefully calibrated workload framework, five hours of one-on-one time are given to each tutorial group for a whole semester and forty minutes for a typical undergraduate essay – that includes engaging with work, writing up feedback, and discussing it with the student. Obviously, this is not sufficient. Dr. Hunt concludes that if she worked the hours laid out by the workload framework, her classes “would be turning up and saying let’s have a chat.” Even as a Professor, these issues do not fall away. Whilst working to contract as part of the UCU industrial action this term, Professor Paton has been able to spend much less time preparing for teaching than she normally would, only “scanning over primary sources” and “relying on long-term knowledge” when it comes to the secondary literature. By focusing on quantifying time so precisely, the institution has failed students completely, relying on the goodwill of the University’s employees. It hardly reflects a desire to introduce “innovative approaches” to teaching. 

With workloads so high, it is common for early career scholars to become trapped in teaching positions. Advancement in the sector relies on putting together a strong research portfolio – that means articles in highly regarded journals and respected book publications. As one of the University’s primary sources of income is research funding, scholars with reputable research backgrounds are crucial. However, Teaching Fellowships, by their very nature, stipulate little to no time to research. When I asked Dr. Natarajan how many hours she dedicated to research she laughed and said, “absolutely none.” Despite developing many of her key ideas through her teaching, Dr. Valny has never had the “space to take those ideas” and transform them into a book proposal. This can lead to anxiety and stress. Dr. Natarajan’s PhD is “constantly at the back of my mind,” yet she rarely finds significant time to transform the piece into a monograph. Without the adequate time allocated to research, these scholars can never advance. Dr. Valny, rather depressingly, concludes that if she continues within a Teaching Fellowship she will become “unemployable” in any other position. With her contract expiring in August this year, it appears that this possibility could become a reality. Her situation reflects a broader problem where staff dedicated to their students and teaching are not rewarded for their work.

The emphasis on research has led to pernicious discourses that have devalued teaching, further demoralising many early career scholars who find themselves ensnared in these roles. In contrast to her time in Prague, where she was rewarded for producing popular courses (although still employed only temporarily), Dr. Valny finds herself suffering from feelings of “imposter syndrome” and “guilt, or inadequacy” when confronted with suggestions that she need only apply for research grants to escape her role. For Dr. Hunt, being “respected for what I already do quite well,” would be more appreciated. She claims that “institutionally it (teaching) doesn’t matter.” By being “a good teacher,” she has risked her career being “put on hold, if not completely stalled.” Similarly, Dr. Natarajan has found her teaching being treated as “a side-line” or a “side-note” to research. Performative professionalism has often defined these scholars’ teaching approaches, hiding an institution that disregards teaching and actively encourages academics to move away from teaching. This is despite some Teaching Fellows, such as Dr. Valny, accepting that a permanent teaching position would be “actually fine.”

These issues around workloads and casualisation intersect with the brutal policies of the Home Office, frequently referred to as the “hostile environment.” Home Office regulations stipulate that only “highly-skilled migrants” can live and work in the UK, meaning those on short-term contracts face another layer of instability. For Dr. Natarajan, this has been a major source of precariousness: she can “only stay as long as I have a job or, rather only as long as I have a visa and the visa depends on my job.” If she or her husband fail to secure another job after their current contracts expire, they risk deportation. Within the sector more broadly, advertisements for short-term jobs often assert that only those with a pre-existing right to reside can apply. This issue throws cold water over criticism that stereotypes strikers as middle-class and white. Scholars of colour often, in the words of Dr. Natarajan, “have their own very different set of precarious circumstances.”

Many of these issues reflect deeper structural problems within the higher education sector. Scholars frequently cited the removal of the student cap and the increase in tuition fees – reforms from 2010 – as exacerbating pre-existing issues and transforming education into a commodity. Dr. Natarajan has suggested that the university has become a “business venture,” whilst Professor Paton claims that there was an “almost instant” change in the way students and management conceptualised higher education after 2010. Over the years, in Professor Paton’s analysis, this “quantitative increase has become a qualitative change,” putting pressure on staff and students. Despite student numbers and tuition fees increasing, Dr. Hunt suggests that “the service that people are paying” for is not being provided. Rather, money flows into marketing and big projects that elevate the positions of senior management figures.

The university sector appears to have reached a tipping point. On a micro level, staff are under increasing pressure, with workloads growing and casualisation becoming more widespread. A two-tier system has developed, with early career scholars expected to teach more and research less. Goodwill and professionalism appear to be the only things preventing university teaching from coming to a standstill. On a macro level, the sector has become partially commercialised, with fees privatised and universities encouraged to compete for students. This has occurred without a concomitant provision of consumer rights, leaving students forced to accept higher levels of debt without safeguards in place to demand improvements or changes in the service provided. These institutions have been left in a middle ground between state-funded institution and privately-funded business venture, to the detriment of academics and students. The demands being made under the ongoing industrial dispute are hardly radical: many academics are simply requesting greater job security and more respect for the work they do. If universities aren’t designed to support students or academics properly, we are all left asking who on earth they are designed for.

Beyond Pop: The Extremes of 1970s Britain

Written by Jack Bennett.

The music of the 1970s reflected the extreme divisions and polarisations within Britain, revealing the intersection of popular culture, politics and economics. What emerged during this decade was a cyclical process of adoption and outpacing in cultural trends. The idealised utopianism of 1960s youth receded with the appearance of hard-edged styles; the 1970s then saw the emergence of hyper-Mod working-class cool in the form of skinheads, building upon the earlier Teds and Mods. While glam rock introduced a resurgent androgyny to the streets of Britain, the challenge and usurpation of style and cultural pre-eminence became the defining feature of the decade. Nowhere is this better presented than in the punk movement. The music of the 1970s mirrored these cultural and stylistic fluctuations: in the Soul that took hold in Northern clubs from Wigan to Blackpool to Manchester; in the struggle between the concept albums of the art-house bands and the arrival of punkier noises from New York in the mid-seventies; and in the dance crazes that ebbed and flowed in popularity. Musical styles began to break up and head in many directions in this period, coexisting as rival subcultures across the country. These changes were fundamentally driven by the traversing of a tumultuous, uneven and complex socio-political landscape.

Currents of popular music transformed during this decade, through both revolutionary change and continuation. Notably, the rise of new styles such as reggae and ska did not result in the demise of rock ‘n’ roll or Motown: the Rolling Stones and Yes carried on, oblivious to the arrival of the Sex Pistols and the Clash. Within this melting pot of musical and stylistic chaos, it is important to emphasise that the life the decade lived and its soundtrack are not quite the same. For instance, between the early-fifties music characterised by Lonnie Donegan and the mid-seventies stylings of Led Zeppelin, real disposable income exactly doubled. Yet from 1974 until the end of 1978, living standards actually declined, marking an end to the long working-class boom and the dissolution of the previously upheld Post-War Consensus, which had committed consecutive Prime Ministers and leading parties to the maintenance of low unemployment and social welfare support. By the 1970s, amid economic instability and pressures such as the OPEC oil crisis of 1973 (which resulted in nationwide strikes and a three-day working week), the nation was plunged into darkness.

This darkness subverted the earlier optimism under which British pop was invented – between 1958 and 1968 – when the economy was undergoing rapid expansion. The changing mood entering the 1970s was driven by rising unemployment, as the total number of Britons out of work passed 1 million by April 1975. A blanket of bleakness seemed to have been cast over the nation, and escapism was sought as a remedy. This escapist phase involved the sci-fi ambiguities and glamour of Bowie, the gothic, mystical hokum of heavy rock bands like Black Sabbath and Led Zeppelin, and the druggy obscurities of Yes. The second half of the seventies were years of deep political disillusion, with strains that seemed to threaten to tear apart the unity of the UK: Irish terrorism on the mainland, a rise in racial tension, and widespread industrial mayhem. Most notable of these socially, politically and economically calamitous events was the Winter of Discontent of 1978-79, when widespread industrial unrest and strike action brought the nation to its knees; The Sun portrayed Prime Minister James Callaghan’s perceived intransigence towards the situation through the headline “Crisis? What Crisis?”. The optimism which had helped fuel popular culture suddenly began to run dry. What emerged was a darker, nightmarish inversion of the optimism and vibrancy that had embraced the music and culture of the 1960s.

This darker, nightmarish inversion was expressed most notably through punk. This creatively explosive, politically astute cultural and musical movement offered an anti-establishment, liberating assault on mainstream decencies, grounded in a philosophy of nihilism. One of the most iconic bands of the movement, the Sex Pistols, explicitly positioned themselves from their formation as the antagonists of The Beatles. Music became a source of power in the battle with authority and repression, expressing the self-loathing and pessimistic attitude of the decade. The punk aesthetic and attitude provoked a seeping moral panic within Britain, centred on its confrontational, violent and controversial actions – punk, and the Sex Pistols in particular, became a publicity engine attacking the established rock pantheon and encapsulating the emotion of the decade. The press and politicians only served to entrench these opinions, from concerts known for their wild and uncontrollable crowds to juvenile political attacks in songs such as ‘Anarchy in the UK’ and, in the year of the Silver Jubilee, ‘God Save the Queen’. Punk became a vehicle for expressing opposition to the social and political net which enmeshed the nation during the 1970s.

Yet punk was the first revival of fast, belligerent popular music to concern itself with the politics of the country – the first time since the brief ‘street fighting man’ posturing of the late sixties that mainstream society needed to notice rock. On the other side of the political divide was an eruption of racist skinhead rock and an interest in far-right politics. Among the rock stars who seemed to flirt with these ideas was Eric Clapton, who said in 1976 that ‘Powell is the only bloke who’s telling the truth, for the good of the country’ – referring to the Conservative MP Enoch Powell of the infamous 1968 Rivers of Blood speech – as well as David Bowie, who spoke of Hitler as being the first superstar, musing that perhaps he would make a good Hitler himself. These notions were a far cry from the 1960s’ utopian optimism about the future of Britain and its youth culture. Reacting to the surrounding mood, Rock Against Racism was formed in August 1976, helping create the wider Anti-Nazi League a year later. Punk bands were at the forefront of the RAR movement, above all The Clash and The Jam. ‘Black’ music such as reggae, ska and soul, with strong roots in Britain’s Caribbean immigrant populations as well as African American influences, became a major cultural force, crossing racial divisions and promoting a decisive turn against racist demagoguery in the music culture of Britain. Ska revival bands such as the Specials, and the reggae-influenced The Police and UB40, had a greater impact than typical ‘popular music’. The seventies produced, in the middle of visions of social breakdown, a musical revival which re-energised a ‘lost generation’.
This effectively marginalised the racist skinhead bands and the youth culture strongly tied to the National Front at this time, renowned for violent, racially motivated attacks across the country, pushing them out of the social and musical environment of Britain. As one cultural critic of the time put it, ‘A lifestyle – urban, mixed, music-loving, modern and creative – had survived, despite being under threat’. Despite the era-defining social, political and economic struggles of the 1970s, music became an expression of cultural values and movements. The radical generational transformation of the 1970s produced a new, increasingly splintered youth culture.

For Geoff Eley, the decade was the storm centre of a change in the narrative of post-war national identity – destabilised by the 1960s, rendered more aggressively patriotic by the New Right, and defined by an internal chronology of escalating problems. Lynne Segal counters this preconceived narrative, arguing that the 1970s saw major strides and flourishing in homosexual rights and in the anti-racist and feminist movements. For example, in 1975-76, while the country was embroiled in inflation of around 25%, legislation was enacted on equal pay, sexual discrimination, race relations, domestic violence, and consumer rights. This demonstrates the ambiguity and fracture of the decade, which for many meant liberation and empowerment rather than just crisis and decline. A decade of grit and glamour.


Image source: Patrick Sawer, ‘’We ran the NF out of town’: how Rock Against Racism made Britain better’, The Telegraph, 27 April 2018, accessed on 8 February 2020. 

Black, Lawrence. “An Enlightening Decade? New Histories of 1970s’ Britain.” International Labor and Working-Class History, no. 82 (2012): 174-86. 

Marr, Andrew. A History of Modern Britain. London: Pan Macmillan (reprint edition), 2009. 

Forster, Laurel, and Sue Harper, eds. British Culture and Society in the 1970s: The Lost Decade. Cambridge: Cambridge University Press, 2010. 

War & Peace: Art in Ducal Milan

Written by Joshua Al-Najar.

Art was a key tool for renaissance cities to disseminate ideas and fashion an identity in a pluralistic, competitive society. Scholarship has tended to focus on the programmes undertaken in republics such as Florence and Venice – less considered is how dynastic systems deployed the Renaissance’s lessons in the form of state art. One prominent example is Milan, a duchy where humanism, classical learning and heritage guided the patronage of art to strengthen the authority of the ruling duke. This was a response to the perceived vulnerabilities of dynastic rule. 

Authority and status were conveyed using classical learning in the art of ducal Milan, but with distinct motives. Where republican regimes used themes tied to civic humanism, the Dukes of Milan deployed the lessons of antiquity in the creation of ‘renaissance magnificence’. This concept was ultimately rooted in individualistic veneration, regarding the act of conspicuous spending on elaborate works as a display of virtue; as such, patronage of sumptuous artworks could be used to heighten the status of the individual patron, as well as being considered to ‘better’ the city generally. Jane Black identifies the root of this rationale in the neo-Platonic tradition, where outward beauty was thought to reflect inward virtue. This concept suited regimes such as the Duchy of Milan, where power was concentrated in an individual, dynastic ruler rather than a faceless office.

Louis Green diverges from the work of Black, suggesting that the emergence of renaissance magnificence was not linked to the typically accepted neo-Platonic tradition. Instead, he points to a political, Aristotelian-style explanation, as demonstrated by Azzone Visconti’s attempts to display authority in fourteenth-century Milan. Azzone, one of the last tyrant strongmen, had rapidly assembled a series of territories in northern Italy that lacked cultural cohesion; one method of binding them together was a programme of artistic works centred on Visconti’s unifying role as ruler and patron. The success of Visconti’s magnificence was memorialised by his theological adviser, Galvano Fiamma, who recorded in his Opusculum de rebus gestis ab Azone, Luchino et Johanne Vicecomitibus (1334-5) that:

Azzo Visconti, considering himself to have made peace with the church and to be freed from all his enemies, resolved in his heart to make his house glorious, for the Philosopher says in the fourth book of the Ethics, that it is a work of magnificence to construct a dignified house.

Fiamma clearly outlines the political advantages available to a ruler willing to invest in lavish surroundings. His reference to Aristotle’s Ethics, moreover, lends support to the explanation of magnificence advanced by Green.

Visconti put renaissance magnificence into practice, embarking upon an extensive programme of artistic patronage that celebrated the Duke on an individual basis. As part of this rejuvenation, the Chapel of the Blessed Virgin was renovated with gold and blue enamel detailing, and an enormous, elaborate tomb was built for the Duke himself (Fig. I). However, it was in the secular space of the Ducal Palace that Visconti sought to heighten his status in overt terms. In the main hall of the re-purposed Palazzo del Broletto Vecchio, Visconti commissioned a series of paintings – none of which survive – believed to have been the work of Giotto di Bondone. The works were thematically linked to concepts of war, strength and military success: ideal themes for a strong-arm ruler such as Visconti to emphasise. Visconti himself had numerous military successes, regaining many territories that his grandfather Matteo I Visconti had lost in the late mediaeval period; pictorial references to war would therefore have reminded beholders of Azzone’s successes. Visconti appears physically in the painting too, alongside historical nation-builders such as Charlemagne and Aeneas. By juxtaposing himself with the legendary Trojan, Visconti incorporated himself into the ranks of an ancient, heroic tradition while displaying the classical refinement of his court. 

This process continued under the patronage of Galeazzo Maria Sforza (1444-76), who embellished his own personal status in the renovation of the Castello di Pavia. Though it would later be destroyed by the French in the early sixteenth century, numerous literary records attest to the various paintings that adorned the castle. Stefano Breventano, a Milanese chronicler, recorded that the palace was ‘the loveliest building that could be seen in those days’. A series of frescoes designed for the galleries of the piano nobile shows conformity with typical princely activity: the Duke taking petitioners; the duke and duchess engaging in falconry; and lastly, the duke effortlessly killing a stag during a hunt. The last of these scenes demonstrates the Duke’s engagement with what Baldassare Castiglione’s The Book of the Courtier (1528) would later call sprezzatura. The duke’s effortless demeanour whilst showing great skill is an attempt to convince the beholder of his individual supremacy.

However, behind this veneer of princely status was an unpopular, tentative leader. Galeazzo Maria Sforza had shown little authority in the diplomatic and military spheres, and thus attempted to create a commanding figure through visual art. Since the Sforzas had, technically, conquered Milan in the 1450s, Galeazzo also sought to give his rule a veneer of legitimacy by providing visual links to the preceding Visconti line.

Unlike in Venice, where historical reference was made to the city’s achievements as a whole, Sforza continued the artistic legacy of the Visconti in an attempt at dynastic continuity. This is reflected in a letter from the ducal secretary, Cicco Simonetta, dated August 1469, detailing a number of restorative works to be undertaken by Bonifacio Bembo. Cicco commented on the ‘maintenance of the old paintings’, as Bembo was instructed to carefully conserve the decorative panels from the era of the Visconti (Fig. II). These included numerous tizzone devices – a flaming branch and bucket – that had served as an emblem for Filippo Maria Visconti, who happened to be Sforza’s maternal grandfather. Evelyn Welch has suggested that Sforza sought to extol his links to the previous regime by carefully conserving its symbols and iconography; the tizzone was incorporated into the decoration of the ducal apartments. Welch understates the significance of this move – in this period, nominally private rooms such as bedrooms essentially functioned as public spaces, receiving petitioners and housing illustrious guests. Providing pictorial reference to these links would therefore aid the transition of power to the Sforza regime and make up for deficiencies elsewhere. Sforza juxtaposed these images with those of his personal court, in an attempt to bond the two. Ultimately, his attempt to generate authority through artistic continuity failed: Breventano remarked that he was a “lustful, unpopular duke”, which may go some way towards explaining his assassination in 1476 by a group of Milanese officials.

Milan was a city where heritage, antiquity and mythmaking were crucial in artistic patronage. Ultimately, this was geared towards the specific anxieties that accompanied a dynastic regime, where power was concentrated in the individual.


Figure I : Reconstruction of the Tomb of Azzone Visconti by G. Giulini.

Figure II: Restored section of decorative panels (1468-9), Castello di Pavia, Pavia.

Source: Green, L., ʻGalvano Fiamma, Azzone Visconti and the Revival of the Classical Theory of Magnificenceʼ, Journal of the Warburg and Courtauld Institutes, 53 (1990), 10.

Source: Evelyn Samuels Welch, ʻGaleazzo Maria Sforza and the Castello di Pavia, 1469ʼ, Art Bulletin, 71 (1989), 361. 


Black, Jane. Absolutism in Renaissance Milan: Plenitude of Power under the Visconti and the Sforza, 1329-1535. Oxford; New York: Oxford University Press, 2009.

Dooley, Brendan. Review of Monica Azzolini, The Duke and the Stars: Astrology and Politics in Renaissance Milan. The American Historical Review 119, no. 3 (2014): 1004-1005.

Green, L. ʻGalvano Fiamma, Azzone Visconti and the Revival of the Classical Theory of Magnificenceʼ. Journal of the Warburg and Courtauld Institutes 53 (1990).

Huse, Norbert, and Wolfgang Wolters. The Art of Renaissance Venice: Architecture, Sculpture and Painting. 1990.

Richardson, Carol M., ed. Locating Renaissance Art. Renaissance Art Reconsidered, vol. 2. New Haven; London: Yale University Press in association with The Open University, 2007.

Ruggiero, Guido, ed. A Companion to the Worlds of the Renaissance. Blackwell Companions to History. Malden, MA; Oxford: Blackwell Publishers, 2007.

Welch, Evelyn Samuels. ʻGaleazzo Maria Sforza and the Castello di Pavia, 1469ʼ. Art Bulletin 71 (1989): 352-75. 

Welch, Evelyn S. Art and Authority in Renaissance Milan. New Haven; London: Yale University Press, 1995.


Written by Tristan Craig.

Ever since an Italian plumber called Mario entered the world of computer entertainment in 1981 on a never-ending quest to rescue a damsel in distress, that trope became the driving narrative for the majority of video games in the early days of their development. Just as Perseus slew the beast that threatened his beloved Andromeda, it fell to the might of the male protagonist – who, in his first appearance, was a carpenter referred to as ‘Jumpman’ – to rescue his girlfriend: the somewhat less imaginatively named ‘Lady’. A simple plot device catering to a predominantly white, heterosexual, male market, Super Mario Bros. sold over 40 million copies following its release in 1985, reviving the home console market after the crash of 1983 and paving the way for the platform genre.

This was a format which sold a very large number of games but offered remarkably little in the way of representation beyond the white, heterosexual, male demographic. As homophobia swelled in the wake of the AIDS epidemic of the same decade, LGBTQ+ inclusion was profoundly absent from the video game industry, and the queer characters who did feature did so in a pejorative or peripheral manner. The 1986 text adventure Moonmist is commonly cited as the first game to include any allusion to a queer character. Vivien Pentreath, an artist struggling to cope in the aftermath of the suicide of her female lover, Deirdre, is thought to be the first lesbian character to feature in a video game; however, at no point is her sexuality explicitly stated. The only reference to her sexual identity is a note in one of four possible endings stating that ‘Vivien was intensely attached to Deirdre’ and that she was jealous of the latter’s heterosexual marriage. It is also worth noting that in this story arc, Vivien emerges as the villain in an otherwise tertiary role.

Whilst the inclusion of non-heterosexual characters was particularly rare, transgender identities were almost non-existent. Following the massive success of Super Mario Bros., Nintendo continued to develop games starring their eponymous hero. The second release in the series, which arrived on the Nintendo Entertainment System in 1988, introduced a character named Birdo: a pink creature of indeterminate species and gender. Birdo arrived in the United States with a manual entry which read ‘He thinks he is a girl and he spits eggs from his mouth. He’d rather be called “birdetta”’. The game itself provided no further backstory nor allusions of any kind to Birdo’s gender identity, consigning it to a problematically worded blurb in a guide. Future iterations of the game removed any allusion to Birdo being anything other than a cisgender female – although a 2008 Japan-only release called Captain Rainbow would revisit her canon, in one country at least. 

As the larger development companies continued to indulge the majority of their market, the release of HyperCard software for the Macintosh in 1987 allowed independent designers to produce their own software with ease. The first fully LGBTQ+ game, written in HyperCard, subsequently emerged in 1989: Caper in the Castro follows a lesbian private investigator called ‘Tracker McDyke’ as she attempts to find her kidnapped drag queen friend. Released as charityware, the game opens with a note from creator C.M. Ralph stating that she ‘wrote this game as a labor of [her] love for the Gay and Lesbian community’ and asking the player to make a donation to an AIDS charity of their choice. The game would be picked up by Heizer Software, where it enjoyed success – albeit renamed and fully ‘straightwashed’ as Murder on Main Street.

The 1990s saw somewhat more progressive steps away from the standard format. 1996 brought the introduction of a female protagonist in the guise of archaeologist Lara Croft. The Tomb Raider series broke away from the male-dominated lead, yet Croft was lauded and criticised in equal measure as both a highly intelligent and a hypersexualised protagonist. Her inception, although hugely impactful on the video game market, was once again aimed primarily at a male audience. But the late 1990s did provide a landmark for LGBTQ+ inclusion: Black Isle Studios’ Fallout 2, released in 1998, contained the first same-sex marriage in a video game – six years before the first US state would legalise such marriages. Fast forward to the 2000s and the landscape is certainly more diverse. Advances in the technical capabilities of home computing and the subsequent rise of the role-playing game have allowed players to craft their own identity, free from being forced down a singular heterosexual mode of gameplay. And yet it is hard to deny the imbalance, particularly in representing gender identities beyond the male-female binary. 

So how colourful does the future of gaming look for the LGBTQ+ community? May 2020 is set to welcome the highly anticipated sequel to Naughty Dog’s The Last of Us, which first introduced the post-apocalyptic world of Joel and Ellie in 2013. This time, our attention turns to Ellie – an openly lesbian protagonist. Following on from the Left Behind DLC, which featured a fourteen-year-old Ellie sharing a short but tender kiss with her best friend Riley, the developers have chosen to fully actualise the sexual identity of their protagonist. But development companies have a long way to go if they want to fully represent a large proportion of their demographic, as the 2014 documentary Gaming in Colour explored. As 2020 celebrates the 40th anniversary of an Italian plumber and his relentless quest to save his princess, perhaps we ought to reflect not only on how far the video gaming industry has come but on how much further it could and ought to go.


‘Caper in the Castro: Internet Archive’, (accessed 15.02.20) (note: you can play an online emulation of the game at this link)

‘Censored or Celebrated (Flouting Margins: Part 2)’, (accessed 14.02.20)

‘LGBTQ Video Game Archive’, (accessed 14.02.20)

New York and the LGBTQ+ Community over a Century

Written by: Lewis Twiby.

The anonymity of big cities allows persecuted sub-cultures and identities to find room to exist. London, Berlin, and Paris are just three examples of cities with flourishing LGBTQ+ communities. In the United States, New York was one of the major sites for gay liberation. Throughout the twentieth century a flourishing and diverse LGBTQ+ community emerged there, where class, race, gender, and sexuality intersected, paving the way for the gay rights movement. This article offers a snapshot of this diverse movement over a century, from around 1890 to 1990, showing how LGBTQ+ culture emerged in New York.

George Chauncey argues that a principally homosexual subculture began emerging in New York in the 1890s, when Columbia Hall was reported as the ‘principal resort in New York for degenerates.’ An unfortunate trend in history is the marginalisation of those not included in the standard hegemonic order – whether by class, race, or any other reason. In the Euro-American mindset – one also forced on many cultures worldwide through colonialism – same-sex relations, non-binary genders, and non-conforming gender roles were treated as ‘degeneracy’ or a mental illness. In the 1870s a ‘map’ was printed warning Latin American businessmen visiting New York of the types of ‘degenerates’ they could encounter, including prostitutes, shoeshine boys, and a ‘fairy’. Beyond the standard demonisation of those excluded from the Gilded Age economic expansion, it shows the distrust of LGBTQ+ individuals. The term ‘fairy’ was widely used to further demean male homosexuals, especially by drawing on images of femininity. An investigator – homosexuality was classed as ‘indecent’ and consequently illegal – alleged that patrons of Columbia Hall ‘are called Princess this and Lady So and So’. Misogyny and homophobia went hand-in-hand.

The working-class slums of New York, such as the Bowery, offered young men and women the ability to socialise outside the more traditional bourgeois family units which emerged in the late nineteenth century. ‘Scandalous shows’ aimed at titillating consumers soon evolved into bars and clubs where people were free to experiment with same-sex relations, or with challenges to gender identity. As often occurs in marginalised communities, a new lexicon started emerging. Seeing increased use during the 1920s, ‘gay’ became a way for homosexual men to recognise one another – by calling themselves ‘gay’ they could secretly identify other homosexuals and those involved in the community. However, there was not one ‘gay community’ in New York. Gender and racial segregation harshly split the community, and among white men there were those who wanted to be distanced from ‘fairies’ – those who cross-dressed or were gender non-conforming.

During the 1920s and 1930s, encouraged by an air of secrecy fostered by Prohibition, New York developed two major gay enclaves: Greenwich Village and Harlem. Greenwich Village originated as a refuge for rich New Yorkers escaping the bustle of the city, but as the city expanded the rich moved out and impoverished migrants, mainly Italian, moved in. The ‘Village’ became known for its bohemian character, as its quiet location and cheap housing attracted New York’s artists and writers. This bohemian character fostered an atmosphere of single living and eccentricity that allowed the LGBTQ+ community to live openly. The Village was known as the place for ‘long-haired men’ and ‘short-haired women’, and even for radical challenges to society: the famous anarchist Emma Goldman would visit in the 1920s and make speeches demanding gay rights. However, there was a limit to this freedom. Racism excluded gay African Americans and Puerto Ricans from the Village until after the Second World War. Following the First World War, some six million African Americans would eventually move from the US South to escape economic poverty and intense racism. Due to Northern segregation they were forced to form their own communities, one of which was Harlem.

1920s Harlem is best known for the Harlem Renaissance – a period of cultural revival in which resident African Americans produced a wide variety of literature, poetry, art, and music; jazz and blues, for example, properly emerged during this period. Part of the Harlem Renaissance saw the emergence of a gay enclave. This was partly racialised – white artists declared that Harlem was ‘wide open…Oh, much more! Much More!’, in the words of artist Edouard Roditi, as they could enter these spaces openly. LGBTQ+ African Americans, who had to live in Harlem, did not have this luxury, but they made it their home regardless. The Hamilton Lodge ball attracted hundreds of drag queens, and their performances drew thousands of spectators – many of them black or Latino. From this the ‘ball culture’ emerged and subtly made an impact on white beauty standards: contouring was originally used by drag queens in Harlem to emphasise their cheekbones and look more stereotypically feminine. LGBTQ+ people further shaped the Harlem Renaissance: the ‘Empress of the Blues’ Bessie Smith was openly bisexual; Langston Hughes, one of the creators of jazz poetry, has been seen as possibly homosexual or asexual; and singer Ethel Waters had a relationship with another woman.

It is important not to understate the levels of discrimination and outright oppression New York’s LGBTQ+ community faced. Gay clubs were often given discriminatory names – the Hamilton Lodge was called the ‘faggot club’ – and LGBTQ+ people were regularly referred to as degenerates. In 1923, the play God of Vengeance by Sholem Asch opened on Broadway for the first time, and the theatre owner and the actors were charged with obscenity as it played with themes of lesbian identity. During Prohibition, speakeasies gave a new community the ability to experiment with their sexuality, while at the same time offering new excuses for the police to raid gay clubs; from 1940, New York police used a Prohibition-era law to continue raiding gay clubs until the 1960s. Post-war, matters grew even worse. Joseph McCarthy claimed that homosexuals were communist sympathisers, or could be exploited by communists, beginning the ‘Lavender Scare’ alongside the Red Scare – 420 government employees were fired between 1947 and 1950 for suspected homosexuality. The resurgence of conservative values – a view that society should be Christian, white, middle-class, and organised in heterosexual nuclear families – meant that any deviation from this was viewed as ‘un-American’. Gay bars across Harlem and the Village were raided, and the police at times sexually assaulted lesbians and trans individuals to ‘prove’ their gender.

Meanwhile, the 1960s brought great changes. As women and African Americans fought for their rights, LGBTQ+ communities started fighting for theirs. The first gay rights organisations had been formed in the 1950s, notably the Daughters of Bilitis and the Mattachine Society, and largely campaigned for rights in Washington. A slow rights movement built up, but its biggest achievement came in 1967, when ‘sip-ins’ forced New York bars to serve homosexuals. The ball scene, meanwhile, was still thriving and growing: RuPaul Charles and Lady Bunny would later move to New York and become famous for their presence in the ball scene, and Marsha P. Johnson viewed the Village as a ‘dream’. Johnson had moved to New York for the anonymity – as a poor, African-American, homosexual, and gender non-conforming individual she faced many layers of intersecting oppression. One of the key places for the gay community was the Stonewall Inn. Stonewall was owned by the mafia, who made it a gay club only because they knew the LGBTQ+ community would not report them to the police while homosexuality remained illegal in New York. An unexpected police raid would spark the key event in American LGBTQ+ rights.

On June 28, 1969, police raided the bar and began assaulting patrons who appeared gender non-conforming. When one was being arrested a riot broke out – in popular memory Marsha P. Johnson ‘threw the first brick at Stonewall’. Singing We Shall Overcome and chanting Gay Power, the patrons fought off the police, and by the time backup arrived a crowd of over a hundred people had gathered to support them. Sylvia Rivera, a Latina trans woman and close friend of Johnson, later remembered: ‘You’ve been treating us like shit all these years? Uh-uh. Now it’s our turn!… It was one of the greatest moments in my life.’ It is important to note that many of those involved were African American or Latino, and many were trans or gender non-conforming, as years of oppression based on race, gender, and class gave them the urge to say ‘no’. Elizabeth Armstrong and Suzanna Crage have argued that a major reason why Stonewall, and not one of the other clashes with police, became the spark of the gay revolution was the first Gay Pride event. A bisexual woman, Brenda Howard, saw the impact Stonewall had and used the first anniversary of the riot to host the first Gay Pride event, solidifying the legacy of Stonewall.

In the aftermath of Stonewall the gay rights movement began in earnest. For the first time gay rights moved away from Washington and into New York – many of those who took part in Stonewall would go on to create new rights movements. Deeply inspired by the Black Panthers, the Gay Liberation Front (GLF) was formed to directly fight homophobia in society. Like the Black Panthers, its members viewed capitalist society as reinforcing discrimination, and vowed to fight capitalism, the nuclear family, and traditional gender roles. To become more diverse, a lesbian chapter was formed, known as the Lavender Menace, and Marsha Johnson and Sylvia Rivera formed STAR (Street Transvestite Action Revolutionaries) for impoverished trans and gender non-conforming young people. These movements were an incredible break with the past as they forced gay rights directly into the open. Calling themselves ‘gay’, a term now firmly associated with homosexuality, was an open challenge to the taboo over homosexuality.

Discrimination continued throughout the 1970s and 1980s despite some monumental successes – namely, LGBTQ+ identity ceasing to be classified as a mental illness in 1973 and the lifting of the ban on homosexuality in New York in the early 1980s. Homophobia did not end here, and there were still immense challenges to overcome. A resurgence of conservatism under Richard Nixon was amplified by Ronald Reagan’s emphasis on ‘family values’, which continued the demonisation of LGBTQ+ identity. When the AIDS crisis broke out, as it largely affected poor and non-white LGBTQ+ communities, the government did nothing to help and even cut funding for research into a cure. The shadow of the AIDS crisis still hangs over the LGBTQ+ community – the continued popularity of the musical Rent, despite its problematic treatment of non-white and LGBTQ+ characters, highlights this by having a major trans character die of AIDS. Tragically, Marsha P. Johnson was found dead in 1992, and a mixture of transphobia, homophobia, and racism meant that the NYPD refused to investigate – her death remains unsolved.

During the dark years of the late 1970s and the 1980s the LGBTQ+ community continued to fight on. In 1985 the black feminist Audre Lorde released her pamphlet I Am Your Sister, calling for white feminists and male African American activists to understand the intersection of homophobia, racism, and misogyny, and proudly ending the text: ‘I am a Black Lesbian, and I am Your Sister’. The ball scene in black and Latino communities remained strong, and the documentary Paris is Burning brought it to wider attention. Highlighting drag queens overcoming poverty and discrimination, it gives an insight into the ball scene of the late 1980s; tragically, one of the trans women interviewed was murdered during filming. Although controversial – the interviewer, a white woman, never appears on camera, and the profits were never given to the community – it helped propel ball culture into the mainstream. Several phrases, especially thanks to their regular usage in the reality show RuPaul’s Drag Race, have since become part of the wider straight lexicon, including ‘voguing’, ‘reading’, and ‘shade’.

New York is one of the most diverse cities in the world, and the LGBTQ+ community is still a key part of this. Since 2013, the Republican party and some sections of the Democrats have been embracing homophobia, and since 2016 have openly advocated transphobic policies. These policies are naturally disheartening – decades of fighting appear to have been undone within just a few years. However, the century-long fight of New York’s LGBTQ+ community for its rights, despite intense oppression, gives hope for the future. No matter how dark the future gets, there will always be a Marsha P. Johnson to fight back.


Armstrong, E. and Crage, S., ‘Movements and Memory: The Making of the Stonewall Myth’, American Sociological Review, 71:5, (2006), 724-751

Chauncey, G., Gay New York: Gender, Urban Culture, and the Making of the Gay Male World, 1890-1940, (New York, NY: 1994)

Duberman, M., Stonewall, (New York, NY: 1994)

Eisenbach, D., Gay Power: An American Revolution, (New York, NY: 2006)

Livingstone, J., Paris is Burning, (1990)

Lorde, A., I Am Your Sister, (New York, NY: 1985)

Shibusawa, N., ‘The Lavender Scare and Empire: Rethinking Cold War Antigay Politics’, Diplomatic History, 36:4, (2012), 723-752

Stein, M., (ed.), The Stonewall Riots: A Documentary History, (New York, NY: 2019)


Mapping the Medieval World

Written by: Tristan Craig.

Engraved on a clay tablet and labelled with cuneiform script, the sixth-century BCE Babylonian Imago Mundi or ‘image of the world’ is one of the oldest examples of cartography known to exist. Its depiction of the Mesopotamian world, with land and ocean masses carefully rendered on its surface, illustrates both the interest in geographical enquiry and the limitations of global exploration for this ancient civilisation. This small tablet, having suffered a great amount of damage over the last three millennia, also exemplifies the complexities of attempting to understand an archaic worldview. By the Middle Ages, cartographic practices showed increasing scope and refinement due to expansions in world trade; however, there remained a large disparity between maps which attempted to depict an accurate world view and those which served a predominantly ecclesiastical purpose. 

Imago Mundi

Whilst the modern geographic map attempts to render topographical features as accurately as possible, cartography in the Middle Ages was less concerned with depicting physical space than with abstract beliefs. As Christianity gained traction in the West, so too did ways of glorifying the omnipresence of God, with mappae mundi (‘world maps’) created to centralise the church on a global plane. The T-O format of map production – so called because of the T-shaped bodies of water intersecting the landmasses of a disc-shaped earth – was popular in the medieval period. In a tradition which originated toward the end of the Roman Empire, these depict a tripartite Earth with the continents of Asia, Europe, and Africa divided accordingly. Little was known of the antipodean landscape, presumed habitable only by beasts and monstrous creatures. The only completely surviving example of this medieval mapping tradition is the late thirteenth-century Hereford Map, named after the cathedral in which it is housed. With Jerusalem placed prominently in the centre and the biblical land of Paradise in the northernmost reaches, the emphasis in this mappa mundi is on the placement of Christianity in the medieval world. Whilst the inclusion of landmarks and terrain suggests at least some knowledge of global geography, its primary function as a tool for teaching Christian doctrine is evidenced by its centring of the faith.

Hereford Map

The tripartite map was not, however, the sole format used in the Middle Ages; beyond the realm of Christianity, cartographic practice on occasion showed a far greater attention to geographical precision. Bearing a closer resemblance to the modern atlas, the Moroccan cartographer Muhammad al-Idrisi created a series of illustrations under the patronage of King Roger II of Sicily which, when combined, formed an entire world view. The resulting world map and accompanying text was entitled the Nuzhat al-mushtāq fī ikhtirāq al-āfāq or Tabula Rogeriana (‘Map of Roger’) and, despite being created over a century prior to the Hereford Map, is remarkable in its accuracy. Evidence shows that al-Idrisi, drawing upon his own journey from North Africa to the Mediterranean, was well versed in creating portolan charts – maps depicting sailing directions and naval trade routes – and his seafaring knowledge is clearly represented in the Tabula Rogeriana. Each important town or city is indicated by a brown circle, the majority of which are located along the Arabian Peninsula and southern Europe, in correlation with al-Idrisi’s own travels. 

Particularly striking is the south-up presentation of the map which was common in early Islamic cartography, before what is now considered ‘north’ became the accepted standard. The dichotomy between north and south, and their representation in early maps, is indicative of the role the map played in representing religious and political power struggles. Similarly, the prominence of the Italian peninsula in the Tabula Rogeriana, with its multitude of coastal trading hubs, may have been included to appease the commissioner of the work: the King of Sicily. To that end, the map also serves to tell us a great deal about the relationship between al-Idrisi and Roger II, a Muslim Arab scholar and a Christian monarch, in actualising this geographical work. Muhammad al-Idrisi himself hailed from Moroccan nobility with his aristocratic lineage having the potential to benefit the expansive aspirations of the Sicilian king. Incidentally, the map includes little about inland territories, further suggesting its purpose as a type of portolan chart. That it was created over a period of fifteen years, with frequent discussion held over its accuracy, denotes the attention to detail paid by al-Idrisi – in sharp contrast to the far more symbolic Hereford Map.

Tabula Rogeriana

Developing a nuanced understanding of the work rests largely on interpretation of the extant copies; here al-Idrisi’s map falls foul of a problem commonly facing medievalists. As with the majority of manuscripts from the High Middle Ages, the original book containing al-Idrisi’s maps, along with an engraved silver disc commissioned by Roger II, has since been lost. Two extant manuscript copies of the Tabula Rogeriana are currently housed in the Bodleian Library, one of which dates from 1553 CE and contains his complete work. More recently, a facsimile was created by Konrad Miller in 1929 to depict how al-Idrisi’s original illustrations would have looked when combined into an entire world view, as intended. Even when objects – such as the sixth-century BCE Imago Mundi – have survived millennia, their condition can present a plethora of problems, from analysing the map to ensuring its subsequent preservation. The intricate mappae mundi of the High Medieval Period existed as unique works of art. Fortunately, the existence of manuscript copies of al-Idrisi’s map allows the modern historian to view it as it would have appeared, in the absence of the original.

As navigational tools, medieval maps are largely limited in their accuracy. Although global knowledge expanded as trade routes grew throughout the period, the influence of both ecclesiastical and secular politics infiltrated the development of map making. From the power of religious veneration to the prominence of mercantile activity in particular territories, maps tell us as much about the personal beliefs of the individuals who created them as they do geographical information. Perhaps the most striking revelation of Muhammad al-Idrisi’s Tabula Rogeriana is not just his accurate depiction of physical space but the extent of global travel and cultural exchange in the twelfth century. It is vital that the modern historian observe the medieval map not just as an artefact judged on what it renders but as a product of an individual and the society in which they lived.


Ahmad, S. Maqbul. “Cartography of al-Sharīf al-Idrīsī,” in The History of Cartography Vol. 2 Book 1: Cartography in the Traditional Islamic and South Asian Societies, ed. J.B. Harley and David Woodward (Chicago, 1992).

Black, Jeremy. Maps and Politics (London, 2000).

Edson, Evelyn. The World Map, 1300-1492: The Persistence of Tradition and Transformation (Baltimore, 2011).

Hartnell, Jack. Medieval Bodies: Life, Death and Art in the Middle Ages (London, 2018).

Vernet-Ginés, J. “The Maghreb Chart in the Biblioteca Ambrosiana,” Imago Mundi 16 (1962): 1-16.

Maroon State: Slave community and resistance in Palmares, Brazil

Written by: Jack Bennett.

The emergence of Palmares, a quilombo – or community of self-liberated slaves – as a political and social reality in the Brazilian heartland between 1605 and 1695 posed a threat to the colonial order in the region through overt, subversive resistance. This alternative African state faced numerous military campaigns and remained unrecognised by Portuguese and Dutch colonial authorities throughout the seventeenth century. This was due to the colonial perspective that Palmares posed a corrupting threat to order and stability, with multiple alliances forming between the Crowns of Portugal, the Netherlands, and slave traders in an attempt to suppress and destroy the community. Led by Ganga Zumba, the quilombo developed into a proto-metropole with a sophisticated legal and bureaucratic structure and a population of some ten thousand formerly enslaved people. This removal of colonial bondage allowed for an element of autonomous freedom in an Atlantic World of colonial domination, control, and slave trading. 

Crucially, what the emergence of Palmares reveals is the importance of Afro-Brazilian solidarity and resistance in identity formation, fights for independence and liberty, and the development of an anti-racist and anti-colonial dogma. With fifteen thousand slaves arriving in Brazil from Angola between 1620 and 1623, Thornton (1998) argues that African culture and kinship was not a fixed system but one of multiple possibilities, continuing, accommodating, and adapting to rapid change under colonial rule in Latin America. This was achieved through the transferral of a variety of African social and political forms across the Atlantic, which allowed for the assimilation of different African ethnicities and groups into socio-economic communities of predominantly formerly enslaved Africans – communities that would endure under colonial pressure for almost a century. Moreover, Mintz and Price (1992) argue that when this heterogeneous population became a homogenous enslaved African population, differences in status overlapped with differences in culture, bridged by the shared condition of enslavement, facilitating the creation of new institutions: a culture defined by constant internal and external dynamism. This highlights the importance of change, rather than simple, passive, or static processes of cultural retention between Africa and the Americas. 

These rustic black republics reveal the dream of a social order founded on fraternal equality, and for this reason they are incorporated into the revolutionary tradition of the Brazilian people. Parallels can be drawn between defensive, insular African communities resisting the actions of slave traders and formerly enslaved quilombo communities resisting colonial power in Brazil. For instance, Palmares utilised the pitfalls and caltrops found at Buraco de Tatu, just as the peoples of Angola used palisades. During the seventeenth century, the territory the Portuguese called Angola was disrupted by factors that included the pressure of the Portuguese slave trade and occupation of the coast; the collapse of states such as the Kingdom of the Kongo to the north; and invasions, principally from the northeast. The people of central Angola responded by coalescing under the name Imbangala. Interestingly, the nascent Imbangala states gathered together diverse groups of people in a community without lineage. Since these communities existed in conditions of military conflict and political upheaval, they found in the institution of the Kilombo a unifying structure suitable for a people under constant military alert – and these entrenched Angolan wars fed the Brazilian slave trade. This produced a distinctly African polity in Brazil, defined by confederation, tributary relations, and cross-lineage relations. The flexibility of the Kilombo as a mechanism for integrating a community without institutionalised lineage, engaged in warfare and self-defence, as Palmares was, explains why some adaptation of the Imbangala institution would thrive in Brazil, even if only a minority of Palmares’s inhabitants were actually of Imbangala origin.

Military threats, challenges, and incursions shaped the very existence of Palmares during this period. Following a large influx of enslaved people during the 1630s as a consequence of the Dutch invasion of north-eastern Brazil, campaigns were led by slave traders and royally commissioned mercenaries to quash the proto-state of formerly enslaved people. Under Dutch dominion, and even after the Portuguese reconquest of Pernambuco by 1654, Palmares experienced a series of unsuccessful incursions and colonial attempts at dissolution, from the first large-scale expeditionary force led by Captain Joao Blaer in 1645 to over twenty assaults between 1654 and 1678 – all of which failed due to the vitality and defensive capabilities of the community of formerly enslaved people. The final campaigns against Palmares, however, including those of Domingos Jorge Velho between 1692 and 1694, brought about its destruction. In the intervals of peace, Palmarinos traded with their Portuguese neighbours, exchanging foodstuffs and crafts for arms, munitions, and salt. The trade with Palmares was such that many colonials opposed war with the Palmarinos, and in the 1670s there was widespread opinion that establishing peace was the best way to achieve stability in the colony. The threat posed to the stability of plantation slave labour, however, resulted in Carrilho’s campaign of 1676-1677 and great devastation. In 1678, Ganga Zumba, tired of war, accepted peace terms from the Governor of Pernambuco, which affirmed his sovereignty over his people on the condition that he return any fugitive slaves and move his people from Palmares to the Cucau Valley. Then, in 1680, the military leader of Palmares, Zumbi, led a coup and proceeded to rule the quilombo with dictatorial authority until the destruction of Palmares in 1694. 
This perpetual state of instability and warfare defined the lifestyles of formerly enslaved people and the formation of Palmares as a persistent source of resistance in the Atlantic sphere of imperial dominance, an early indicator of future upheavals to come. 

Ultimately, the Central African solution of the Kilombo was a remodelled and transplanted socio-political construct, created through the forced transportation of enslaved Africans during the seventeenth century, and re-imposed within the Brazilian colonial order, in order to serve maroon communities. The Palmares of Brazil developed into a Creole society. Critically, this process of hybridisation facilitated the emergence of communities and new identities in colonial Latin America, which remained intrinsically connected to the enslavement of Africans and their forced transportation within the Atlantic triangulation of human and product commodification. Through the process, comrades (or malungos) from diverse ethnic backgrounds were united in the common cause of self-determination and independence from brutal repression and labour. Fundamentally, this was not achieved on the basis of lineage, but for the purposes of commodity production, raiding, and self-defence. Therefore, the persistence and adaptation of African cultural elements such as the Kilombo to the Brazilian context, in fact, demonstrates the continuity of African and African Diasporic cultures in the process of New World transculturation and the development of resistance and revolution. 


Image source: “Zumbi dos Palmares: An African warrior in Brazil – The legend of the nation’s greatest black leader continues to be a topic of debate and inspiration.” Black Women of Brazil. August 18, 2014. Accessed April 10, 2017.

Anderson, R. “The Quilombo of Palmares: A New Overview of a Maroon State in Seventeenth-Century Brazil,” Journal of Latin American Studies 3 (1996): 545-566.

Anonymous, “The War against Palmares,” in The Brazil Reader: History, Culture, Politics, eds. John J. Crocitti and Robert M. Levine (Durham, NC: Duke University Press, 1999).  

Kent, R. “Palmares: An African State in Brazil,” The Journal of African History, 2 (1965): 161-175.

Mintz, Sidney and Price, Richard, The Birth of African American Culture: An Anthropological Perspective. (Boston: Beacon Press, 1992).

Thornton, John, Africa and Africans in the Making of the Atlantic World, 1400-1800 (Cambridge: Cambridge University Press, 1998).

Conflict, Chaos and the Florentine Inferno

Written by: Joshua Al-Najar.

On a preliminary reading, Dante Alighieri’s Inferno seems entirely unconcerned with political realities. Its setting is a fantastical reimagining of hell, imbued with mythological creatures and terrifying landscapes: an illusory space for Dante to contend with sin’s dramatic consequences. Behind this veneer, however, is a deeply incisive reflection on reality, as Dante seamlessly blends his own political convictions with the Inferno’s plot. What emerges is Dante’s intense distaste for factionalism and disdain for corrupt authorities. These views did not arise in a vacuum but were strongly shaped by Dante’s own political career and eventual banishment from his native Florence. 

Much of the political subtext centres on the Florentine Republic, which forms a model of the consequences that unbridled political infighting can bring. Dante relied on his own experience of the political institutions of Florence, having held numerous offices there prior to writing the Divine Comedy. The first explicit reference is provided by Ciacco, a Florentine, in Canto VI. Ciacco’s name derives from the Tuscan for ‘pig’ and fits accordingly with his punishment in the circle reserved for gluttons – a thinly veiled critique of the city’s corrupt, gluttonous nature. In alluding to Florence’s gluttony, Dante sought to criticise the highly mercantile, money-driven mentality which he felt characterised Florentine politics. 

Though Dante had been involved in these political institutions, his disapproval of the city’s character is an ongoing thread throughout the Inferno. Dante’s journey through the Divine Comedy elevates his intellectual and spiritual standing, though notably whilst he is absent from Florence. For him, the civil strife and decayed moral standards of the city prevented the development of his character – as such, separation from such a place allowed him to make this process “his own”. Achieving intellectual truth was key to attaining spiritual salvation, and Dante achieved this truth away from Florence and its chaos.

Ciacco later compounds this vision of Florence by claiming ‘pride, envy and avarice are the three sparks that inflamed the hearts of Florentines’ (Inf. VI, 12-14). From the outset, he conjures an image of a city that is inherently bound to jealousy and infighting, referring to it as ‘your city, which is so full of envy’ (Inf. VI, 45-46). By constructing this version of Florence, Dante uses his encounter with Ciacco as a means to criticise the series of civil conflicts which ravaged the city in the preceding century. He concentrates on two periods in particular: the drawn-out struggle between the Ghibelline and Guelf factions of the 13th century, and the later inter-partisan conflict between the Black and White Guelfs. Dante pushes Ciacco by asking whether some of the famous Ghibelline and Guelf leaders from the city’s history were damned, or not:

Tegghaio, Farinata – men of rank – Mosca, Arrigo, Rusticucci, too,

 and others with their minds on noble deeds,

tell me, so I may know them, where are they?

For I am gripped by the great desire, to tell, 

if heaven holds them sweet – or poisonous hell. (Inf. VI, 79-84).

In asking such a question, Dante radically challenges Florence’s collective memory of these individuals. He prompts a re-evaluation of Florence’s past heroes; for much of the city’s populace, the dramatic expulsion of the Ghibellines in 1267 was highly formative for Florence’s identity. To many, it represented the triumph over tyranny, and its leaders were reimagined as liberators. Ciacco defies this conclusion with his response:

These dwell among the blackest souls, loaded down deep by sins of differing types. If you sink far enough, you’ll see them all. (Inf. VI, 85-87).

Here, the Inferno depicts the Ghibelline and Guelf figures as eternally damned. For Dante, their damnation had been earned by the violence and political instability that these forces had unleashed upon Florence throughout the 13th century. Their heroic status had gone unchallenged by the body politic, as the turmoil had been legitimised by a mask of patriotic fervour. Yet Dante confronts this position, and in doing so, undoes generations of societal education.

Dante’s recurrent critique of factionalism is deeply informed by his exile from Florence. Ciacco prophesies the resurgence of civil disorder in 1300-2, in which the violence between the warring Black and White Guelfs reached a bloody pinnacle: the return of the banished Black Guelfs with papal assistance, and the mass expatriation of White Guelfs. Florence’s political institutions had been deeply fractured, and Dante became a direct victim of the sheer instability this system incurred. He was exiled in 1302, alongside numerous other politically active Florentines. The Inferno – begun shortly after this – is filled with contempt for the instability that Dante saw as inherent to such a politically divided city. Dante’s decision to place the Guelf and Ghibelline leaders in hell is a clear statement that, despite any quasi-heroic status, the civil discord these individuals had spread had harmed Florence’s citizens, such as himself.

Further into the Inferno’s narrative, Dante relates his disapproval of the elite classes to his native Florence. In Canto X, Dante encounters Farinata degli Uberti, an aristocratic Florentine who had championed the Ghibelline cause in the mid-thirteenth century. Initially, Farinata’s portrayal could be considered relatively positive. Upon recognising Dante as a fellow Tuscan, Farinata engages in sombre reflection on Florence’s violence:

You must be a native of that noble fatherland,
to which I perhaps did too much harm. (Inf. X, 12-13).

Farinata’s ruminations are profound. His reference to a ‘noble fatherland’ demonstrates his patriotism and enduring loyalty to the nobil patria. He remorsefully re-examines his own role in the Ghibelline conflicts by pondering whether the violence unleashed at Montaperti had been detrimental. For a moment, Dante allows the damned Farinata a measure of solemn nobility; his careful, considered reproach of his own actions is a momentary pause in Dante’s criticism. However, it is short-lived. 

Moments later, the situation is reversed, as Farinata becomes a mouthpiece for aristocratic factionalism. He begins by asking Dante about his heritage, and upon discovering the poet’s Guelf ancestry his entire demeanour shifts: 

As soon as I was at the foot of his tomb
somewhat he eyed me, and, as if disdainful,
then asked of me, “Who were thine ancestors?”

I, who desirous of obeying was,
concealed it not, but all revealed to him;
whereat he raised his brows a little upward.

Then he said: “fiercely adverse have they been
to me, and to my fathers, and to my party;
so that two several times I scattered them. (Inf. X, 40-49).

At Dante’s revelation, the cordial relations between the two come to an end. Suddenly Farinata appears factional and divisive: gone are his saddened affections for the city of Florence. For Dante, he represents the crux of aristocratic folly; he is used to expose the fatal blindness of the Florentine elite in its excessive devotion to party and family at the expense of broader loyalties to the city and patria.

Upon discovering Dante’s allegiances, Farinata is cutting and self-congratulatory: he boasts of his own military successes during the Guelf and Ghibelline conflicts, alluding to battles in 1248 and 1260. Suddenly, the very events that had caused Farinata remorse now swell his pride. Dante chastises this violence, lamenting its ‘staining of the flowing Arbia red’ (Inf. X, 85–86).

Dante informs Farinata that his family was exiled from Florence following his death, in an effort to wound the general. As part of his punishment, Farinata is unable to see the present, perhaps a jibe at the aristocracy’s blindness to the consequences of its actions. Farinata ponders whether eternal damnation is a lesser punishment than witnessing his family’s exile from their homeland. Yet again the questionable judgement of the Florentine aristocracy emerges, as Farinata places his family’s status within Florence above all other considerations. That he considers ‘this bed of pain’ (Inf. X, 77–78) a lesser torment than mere political exclusion is emblematic of the irrationality of the upper classes. He questions the legitimacy of the city’s popular government and the validity of its decision to banish his family: a clear display of the elite’s disdain for popular movements.

However, Farinata also enacts a measure of revenge against Dante, by informing him of his eventual exile:

And yet no more than fifty times that face,
(the moon’s, who is our sovereign here) will shine
till you shall learn how heavy that art weighs (Inf. X, 79-81).

In a reciprocal act of animosity, Farinata reveals that Dante will soon experience the ‘art’ of exile himself. In staging this series of insults and verbal injuries, Dante replicates the cycle of conflicts between the Guelfs and the Ghibellines in the thirteenth century. The pettiness and futility of the exchange mirror what Dante saw as the pointlessness of such conflict.

To him, the civil disorder so inherent to Florentine politics had stifled social progress and hastened moral decay. This conclusion was greatly informed by his own victimhood at the hands of a deeply factionalised city, one which earned its immortalisation in the depths of Dante’s imagined hellscape.


Barański, Z. G. and Gilson, S. (eds), The Cambridge Companion to Dante’s Commedia (Cambridge, 2019).

Bernstein, A. E., ‘Heaven, Hell and Purgatory: 1100–1500’, in M. Rubin and W. Simons (eds), The Cambridge History of Christianity, vol. 4: Christianity in Western Europe c.1100–c.1500 (Cambridge, 2009).

Honess, C. E., From Florence to the Heavenly City: The Poetry of Citizenship in Dante, Italian Perspectives 13 (London, 2006).

Lansing, R. (ed.), The Dante Encyclopedia (London, 2010).

Murphy, R., ‘Dante and Politics’, History Today 20, no. 7 (1970), p. 481.

Najemy, J., ‘Dante and Florence’, in R. Jacoff (ed.), The Cambridge Companion to Dante (Cambridge, 2007), pp. 236–256.

Woodhouse, J., Dante and Governance (London, 2011).


Dionysos the Weird: Reading Bacchae through the lens of Lovecraftian horror

Written by: Justin Biggi.

Euripides’ Bacchae features some of the strangest imagery the playwright employed across his works. Focusing on Dionysos’ return to his homeland of Thebes, the play sees Dionysos’ cousin, Pentheus, meet a grisly end at the hands of, among others, his own mother, who has been driven mad, along with the other Theban women, by Dionysos himself. Pentheus’ death becomes a reminder for the audience of what happens when one attempts to defy a god’s will, especially since it is blatant punishment for Pentheus’ outlawing of the cult of Dionysos.

Much can be said about how this play approaches subjects such as ritual madness, hubris and gender roles (Pentheus is disguised as a woman by Dionysos as he tries to spy secretly on his mother and her companions). Dionysos has long been interpreted as a god who blurs boundaries. He is first and foremost a god of wine, and therefore of drunkenness: this implies a loss of control that ancient society (like our modern one) may not have been comfortable confronting on a day-to-day basis. He is a god of excess, characterised by an entourage of priestesses, the Bakkhai or Maenads, and, in iconography, accompanied by male, sexually charged satyrs. He is also a god of magic and of rituals that were not open to the public, or at least not open to the predominant male citizen class: as with Orpheus, it has been argued that his cults attracted mostly women and slaves.

The text of the Bacchae is not spared such strangeness or liminality. While there are a number of isolated instances where something strange or unnatural happens (such as Dionysos tearing down the walls of the Theban palace, or Pentheus believing the god has the head of a bull), there are two cases where we are given a longer, in-depth description of strange, even terrifying, acts. In two so-called “messenger speeches,” in which characters come on stage to relate events that have happened elsewhere but are nonetheless central to the plot, the audience is privy to acts of intense violence meant not only to cause discomfort, but also, through the nature of the acts themselves, to push the boundaries of what is or is not natural. In the first messenger speech, we find a graphic description of the Theban women, driven to Bacchic frenzy by Dionysos, tearing apart a herd of cows with their bare hands. In the second, the messenger describes Pentheus’ journey to spy on the Bacchae and his discovery, which leads, at Dionysos’ urging, to his death by dismemberment, above all at the hands of his own mother.

Much of what happens in these two episodes can be described as “uncanny.” The term is generally applied to Lovecraftian fiction or “weird” horror, and implies the intentional subversion of the “laws of nature” in favour of unsettling, often frightening imagery. Bennett and Royle define the uncanny as follows: “the thoughts and feelings that may arise on those occasions when the homely becomes the unhomely, when the familiar becomes uncomfortably strange” (40). This definition echoes the one put forward by H. P. Lovecraft: “[t]he weird tale has … [a] certain atmosphere of breathless and unexplainable dread of outer, unknown forces …. a malign and particular suspension or defeat of those fixed laws of Nature” (28).

As explained above, Dionysos is a god happy to push boundaries, and the Bacchae is no different. In the first messenger speech, a cattle herder, after narrowly escaping the Bacchae’s frenzy, rushes to Thebes to tell Pentheus what he has seen: the women, through the power of the god, were able to make milk, honey and wine spring from the ground (lines 869–877). While this is certainly strange, it does not appear to trouble the herdsman; in fact, he urges the disbelieving Pentheus to accept Dionysos as a god, something the king adamantly refuses to do throughout the play. Things become unsettling, uncomfortable, when the men try to seize Agave, Pentheus’ mother, to bring her back to court. Following this, the women turn to violence.

The herdsman proceeds to give a vivid description of the women tearing cattle apart with their hands. Euripides does not spare the audience the gory details: “You should have seen one ripping a fat, young, lowing calf apart— others tearing cows in pieces with their hands” (909 – 910) while, once the carnage has been completed, “[y]ou could have seen ribs and cloven hooves tossed everywhere—some hung up in branches dripping blood and gore” (911 – 913.)

Through the graphic depiction of violence, the strangeness of the episode becomes uncomfortable. As I have discussed in my piece on Seneca’s use of violence, the violent act is uncomfortable precisely because it is dehumanising: a living, animated creature, human or animal, becomes mere meat, and this is made worse still when the body is literally torn apart. Ancient Greek had a name for ritual dismemberment: sparagmos. The sparagmos of the first messenger speech foreshadows a second, more terrifying one in the second.

Towards the end of the play, Dionysos successfully lures Pentheus up the mountain under the pretence that there he will be able to witness the Bacchae’s ritual directly. It is, of course, a trap, and Pentheus is soon attacked by the women he was hoping to spy on. Urged on by the god, the women tear Pentheus apart, and his mother, Agave, rips his head from his shoulders, believing him to be a mountain lion. As in the episode described above, the violence is preceded by a strange, but not necessarily unsettling, event: Dionysos “[makes a] tree bend down, forcing the mountain pine to earth by hand, something no mortal man could ever do” (1330–1332), in order to give Pentheus a better view of the women. After this, Dionysos disappears, and only his voice can be heard, urging the women to violence. Once more, Euripides does not shy away from the gorier details: “[s]he seized his left arm, below the elbow, pushed her foot against the poor man’s ribs, then tore his shoulder out … ripping off chunks of Pentheus’s flesh” (1391–1406).

In both instances, the direct actions of a god, by definition inhuman, are the catalyst for the dreadful events that take place. In the first messenger speech the women are “in Bacchic ecstasy” (897), and in the second it is Dionysos himself who urges them to violence (1345–1349). As Lovecraft defined it, the weird is dreadful because of “outer, unknown forces” manifesting in a way the protagonist perceives as menacing (28). One of the play’s underlying threads is whether or not the other characters recognise Dionysos as a god. Those who recognise his divine nature, like Tiresias, the herdsman or the second messenger (one of Pentheus’ attendants), are spared his wrath. Characters like Pentheus, however, or his mother, Agave (driven to Bacchic frenzy as punishment for rejecting Dionysos’ mother, her own sister), are forced to participate in dreadful, terrifying events so that they will fully recognise Dionysos’ divine nature.

It would certainly be anachronistic to argue that the Bacchae is a piece of Lovecraftian horror literature, but reading the play through the lens of the genre adds a further layer of complexity to the text: approached as cosmic horror, patterns emerge, specifically in the two messenger speeches, that would otherwise be lost. In this way, the interpretation of Dionysos, both the character and the god, is made more complete. As Lovecraftian horror dabbles in the unspeakable and the terrible, so does Dionysos in the play: reading them alongside one another shows how similar tropes have been used across the centuries, echoing and playing off each other.


Euripides, Bacchae, translated by Ian Johnston (Vancouver: Vancouver Island University, 2003).

Jameson, M., ‘The Asexuality of Dionysus’, in Thomas H. Carpenter and Christopher A. Faraone (eds), The Masks of Dionysus (Ithaca: Cornell University Press, 1993).

Lovecraft, H. P., Supernatural Horror in Literature and Other Essays (Mineola: Dover Publications, 1973).

Otto, W. F., Dionysus: Myth and Cult (Bloomington: Indiana University Press, 1965).

Bennett, A. and Royle, N., An Introduction to Literature, Criticism and Theory (New York: Routledge, 2016).