Prejudices and Priorities in a Previous Global Pandemic – Spanish Flu in the West Midlands

Maggie Andrews

Scientific research and understanding often follow pandemics; the influenza virus was not identified until 1933, more than ten years after what is now commonly referred to as the Spanish Flu. In the face of uncertainty and partial knowledge, government decisions and popular perceptions just over a hundred years ago were shaped by cultural and political preoccupations, prejudices and existing priorities. Scientists estimate that between 1917 and 1920 a quarter of the world’s population were infected by the Spanish Flu; between 50 and 100 million of them died.

In the West Midlands, the first cases appeared in April 1918. Britain was war-weary from nearly four years of conflict; there was little hope of an end to the stalemate on the Western Front and the Germans were poised to retake Passchendaele. Those on the home front were struggling to cope with the death and disruption caused by men fighting in foreign fields, alongside Zeppelins and aeroplanes bombing Britain. Housewives battled to feed their families as food shortages led to price rises, food queues and finally the introduction of rationing in 1918. At a national and local level winning the war was prioritised, and concern about influenza focused on how it compromised war production or the health of soldiers required for battle. Many in the Army Medical Corps considered that they had more pressing medical issues than flu, which was often seen in derisory terms as a mild complaint compared to sepsis, gangrene, lice and enteric fevers. During this initial wave of the pandemic, which abated by late July, newspapers, doctors, propaganda and politicians conveyed the message that ordinary people should either ignore it, smile through the threat of illness, or take care of themselves. The Birmingham Mail reassured its readers that ‘fortunately the disease is not of a virulent type, carrying with it any actual danger’. Mortality figures suggest otherwise.

By early autumn 1918, a more lethal strain of the flu had arrived, lasting until the end of December. In November, as the guns of battle fell silent, Dr Masbyn Read, the Medical Officer of Health in Worcester, reassured the local population that, although there had been 41 deaths from influenza in the city the previous week, the ‘disease is past its worst’. The local press, however, noted that the disease was still widespread in Evesham and Stourport. Over 1,500 cases were reported in Kidderminster, which was considered to be particularly badly hit, and many workers were absent from the town’s carpet and spinning mills. Worcester even saw a shortage of comics, bought to keep sick children amused while confined to bed. Despite demands for theatres and cinemas to be closed or more tightly regulated, they were used to give the public information about preventing the spread of the disease, including the infamous film Dr Wise and a Foolish Patient. Schools were closed in some areas for two or three weeks, occasionally longer, despite anxiety that children running around the streets unsupervised would spread the disease.

Despite the Armistice, the war was not officially over until the Versailles peace treaty was signed on 28 June 1919, with the result that Germany continued to be blockaded and the British government was slow to release the doctors and nurses in the armed forces who were needed to combat the disease and care for the sick. Many doctors and medical students had volunteered for the forces, and in early 1918 the government had raised the call-up age for doctors to 55. Although elderly doctors were encouraged to return to practice, the number of doctors able to minister to patients in the West Midlands was heavily depleted. Doctors were run off their feet, as were undertakers and vicars. Long queues formed at chemists’ shops, and some people consumed lethal doses of aspirin. Charlatans made spurious claims and significant profits from patent products such as Dr Williams’ Pink Pills for Pale People. Consumers were requested not to hoard Bovril, which was thought to prevent the flu or at least aid recovery from it. The failure to release undertakers and gravediggers from the forces in late November 1918, or to make stocks of whisky (considered a cure) freely available, was widely criticised. A government with one priority – in this case winning the war – could not quickly change course and marshal resources to deal with another. Nevertheless, by late November, as the general election campaign got underway, soldiers in Britain were deployed to help dig graves and whisky could be obtained with a letter of recommendation from a doctor.

In European history and memory, the devastating consequences of the pandemic have often been either eclipsed by or entwined with the death and injury caused by the First World War. On every continent the flu was responsible for many more deaths than the war, and it was particularly devastating in India. In Britain approximately 850,000 were killed in armed combat whilst 240,000 died from the virus, although diagnosis was not always reliable. A unique feature of this pandemic was that it disproportionately affected younger adults; the peak age of death was 28. Thus, for many ordinary people like Mr and Mrs Slade Nash of Martley, Worcestershire, influenza was yet another element of the misery and sorrow that warfare inflicted upon their lives. They had already lost two sons in the war when their daughter Margaret died from influenza in October 1918.

Whilst the First World War did not cause the pandemic, the mobility of people in wartime (shifting between countries or regions) helped to spread the influenza. The Spanish Flu is now widely believed to have originated in Kansas in the USA, and to have been brought to Europe by soldiers sailing across the Atlantic to fight on the Western Front. As a neutral country, Spain was not subject to the censorship other countries had experienced and freely reported on the progress of the disease, particularly when King Alfonso XIII and members of his government became unwell; people therefore came to associate the disease with Spain. The Times noted on 18 December 1918 that once the disease had reached London it radiated out through Birmingham, Nottingham and other major centres via the rail networks. Public activities, including schools, factories, churches, cinemas, theatres, public transport and the Armistice Day celebrations, all helped to spread the virus. Soldiers on the battlefronts, on leave, in training or prisoner of war camps, in hospitals, or when demobilised, were efficient carriers of influenza.

In the absence of any genuine scientific understanding of viruses and how the influenza actually spread, multiple explanations and theories emerged, shaped by prevailing cultural preoccupations, xenophobia and class prejudice. The Birmingham Daily Post pointed out in June 1918 that ‘the man in the street … is sometimes inclined to believe it is really a form of pro-German influence’. The ‘unseen hand’ of the enemy was supposed to be spreading the illness. A quarter of a million Germans died from the disease; nevertheless, for some of the hyper-patriotic the link between germs and Germans was very close, and a number of nicknames emerged for the virus that embodied this prejudice, including ‘Flanders Grippe’, ‘Hun flu’, and the ‘Turco-Germanic bacterial criminal enterprise’. Soldiers writing from the front, or visiting Britain on leave or to convalesce, also conveyed rumours that the heart of this ‘German plague’ lay in the unburied corpses on the battlefields or the Germans’ use of poison gas.

The sense that the conditions of war, exhaustion, war weariness and food shortages precipitated the spread of the disease had more credibility, but was again laced with a touch of anti-German feeling. Thus, the Birmingham Daily Post argued on 25 June, ‘There can be no doubt whatever that it has been recurring in a very severe form in Germany, Austria, and the territories occupied by the Central Powers during the last two years.’ Those who were not convinced that the flu was the intentional work of the enemy powers often perceived it as an unintentional consequence of war and the arrival of groups of workers or troops from abroad. The French blamed the Spanish, and in particular the Spanish workers who replaced French men who had joined the armed forces, whilst the Spanish singled out the Portuguese for culpability. The Germans apparently suggested that the flu had been imported by the 100,000 men of the Chinese Labour Corps who worked behind the lines, assisting the Allies in France.

Other explanations for the influenza included the wartime propensity to dig up the land to grow more food, poor housing, and dirt. The Medical Officer of Health in Worcester long maintained that combatting the disease was ‘purely a personal matter’ and that its spread was ‘due to the carelessness of individuals’. Newspapers warned against overcrowding, indiscriminate spitting and alcoholism, and encouraged thorough cleaning of houses and whitewashing of walls. Cultural assumptions and prejudices towards densely populated working-class areas of industrial cities were not, however, supported by evidence; influenza was a remarkably democratic illness. The Evesham Journal pointed out on 2 November that ‘All classes of the community are affected, and it appears to take an even more virulent form with the apparently hale and hearty.’ In the village of Badsey, where locals and 200 German prisoners of war worked on the land, the virus was rife.

This swirling mishmash of explanations, rumours, accusations and assumptions about the flu did have two positive longer-term influences. It added weight to arguments for the setting up of the Ministry of Health in 1919, and for slum clearances and the building of working-class housing in the inter-war years across the region from Birmingham to Tewkesbury.

 

The material in this blog is drawn from Maggie Andrews and Emma Edwards, Bovril, Whisky and a Shortage of Gravediggers: The Spanish Flu comes to the West Midlands (History West Midlands, 2019).

Professor Kay Mohanna’s Australian Collaboration

Griffith University, in Queensland, Australia, recently welcomed Kay Mohanna, Professor of Values-based Healthcare Education at the University of Worcester.

Professor Mohanna visited the Griffith Institute for Educational Research on 24 February 2020 to begin consolidating plans for a collaborative, inter-disciplinary project with Dr Jeanne Allen, Associate Professor of Teacher Education, and a team of Griffith researchers. Kay and Jeanne also met with Dr Leonie Rowan, acting Director of the Griffith Institute and Associate Professor of Education, to discuss the project, which will be framed within the politics of belonging and will investigate the development of a sense of belonging and professional identity among the increasingly mobile student population in higher education.

Dr Allen is a co-editor of the Asia-Pacific Journal of Teacher Education, which publishes research that advances knowledge in teacher education across early childhood, primary, secondary, vocational education and training, and higher education. Her work with Kay builds on their previous research on students’ expectations and experiences in internationalised and globalised higher education (Kay is on the editorial board for Education for Primary Care, which aims to support the implementation of best-evidence medical education relevant to primary care). Their previous work includes a 2013 article in Educational Theory and Practice.

Through this link with the Griffith Institute for Educational Research, Professor Mohanna is working with one of the largest concentrations of university-based education researchers in Australia. The Institute comprises a vibrant community of scholars whose applied research with industry partners, education systems, schools, and community and not-for-profit organisations is carried out with the aim of translating it into practice. The Institute has identified key research foci that include, but are not limited to, leadership, living with autism, and professional and practice-based learning. Its Research Programs, identified on the basis of current, successful research, are led by five esteemed senior researchers who are also mindful of new intellectual projects that can drive their endeavours to the next level of productivity and reputation.

Initially this collaboration will carry out a scoping review of the literature. This will look at the implications of the globalised movement of people, ideas and capital, including how international students navigate multiple types of borders in ways that can exacerbate their experiences of marginalisation and hinder their development of a sense of belonging.

British Press Perceptions of the 1857–58 ‘Indian Mutiny’

Arran Jenkins

In 1963 the skull of an Indian man was discovered in a small pub in Kent. Folded and pushed inside one of its eye sockets was a scribbled note that revealed he was called Alum Bheg and that he had lived over a century ago in India. He had met his end, like many others, when he was blown from a cannon for his alleged involvement in an attack on a British resident family. Bheg’s skull was a war trophy taken during a mutiny of sepoys, or native soldiers, that began in the ancient city of Meerut on 10 May 1857 and soon erupted into a wider conflict that engulfed an estimated one sixth of the East India Company’s (EIC) territory in India.

The revolt lasted fourteen months and its implications sent shockwaves through the martial and civil spheres in India. It also reverberated throughout British society. To see evidence of this we need look no further than the rivers of ink spilled in newspaper coverage. The numerous editorials, opinion articles and letters from the public that appeared in national and regional publications, such as The Times and Berrow’s Worcester Journal, expose a multifaceted debate revolving around the 1857 Uprising that touched upon such disparate topics as the shortcomings of the EIC, the character of the Indian people and the loyalty of Catholic soldiers. Capital punishment was another recurring topic and it is the focus of this piece.

Contemporary newspapers viewed the ‘mutiny’ as an event of great importance and it attracted widespread attention in Britain. Such interest is likely to have been based more on familial than fiscal ties. The 30,000 British soldiers—and the family members that they brought with them—were drawn from a variety of backgrounds, and they maintained connections to their homeland. In July 1857, for example, over 20,000 letters were sent home by settlers describing the startling events then unfolding across India, and these were frequently published in British newspapers. In this way the mutiny reached many people in the United Kingdom, and this is one of the means by which the events scarred the national consciousness.

The mutiny resounded in British literature for decades, with related themes appearing in novels such as A Tale of Two Cities by Charles Dickens and E.M. Forster’s A Passage to India. The infiltration of ideas such as racial superiority and inherent ‘otherness’ into fictional works reveals the subtle ways in which the crisis affected the public psyche. The earliest histories produced narratives that were concerned with military battles and heroic acts. These largely served the colonial project rather than undermining it: some were even published before the rebellion was comprehensively crushed. Historicising an event so swiftly is at least one indication of imperial arrogance. Nonetheless, reactions to the uprising largely defy generalisation.

What historians might term an ‘imperial’ narrative of events existed, but this did not go undisputed. In particular, the notion that this was merely a tale of religious prejudices inflamed by the introduction of greased cartridges was not accepted by all contemporary commentators. Many were able to trace the grievances of the rebels to a multitude of misjudged policies instigated years previously by the EIC, a once powerful organisation now in its death throes. The press did not tenderly ease its passing. Numerous past procedures of the EIC came in for criticism, reflecting long-held anxieties within Britain about merchants acting in the role of a state. The EIC was thus the primary target in the hunt for an adequate cause. This hunt was initiated by concern for British enterprise in India but lingered in public discourse because of the violent actions committed by the rebels against British residents. These sparked a fierce reaction from the home press. From early in the crisis, The Times, Glasgow Herald and Berrow’s Journal were demanding the blood of anyone considered culpable in the murder of British women and children. This sullying of the purest symbols of Victorian domesticity was presented by The Times as an effort to ‘dishonour England itself’.

Wendy Webster demonstrates how this concept of domesticity being under siege also resonated in accounts of post-1945 colonial conflicts. Just as ideas of ‘white’ domesticity were shaken by fears of ‘black’ disloyalty during the Mau Mau rebellion in Kenya, the sepoy mutiny tainted a perceived paternal relationship on both a micro level—between a British officer and the sepoys—and a macro level between Britain and India. So much so that the popular illustrated newspaper Punch mocked the notion of a filial relationship put forward by earlier thinkers like James Mill, with a cartoon that depicted the ‘Clement’ Governor-General Canning patting the head of a child-sized sepoy clasping a bloodied sabre.

The lampooning by much of the press of those considered too merciful is set against a context of progressive debate about the use of capital punishment in Victorian society. Abolitionists had made great strides with several government acts in the 1830s, reducing the number of capital crimes by 66 per cent, and by 1841 execution was carried out only for treason and homicide. By 1857, however, there was a clear contradiction in contemporary discourse between the civilised, forward-looking Englishman and the blood-thirsty, sensationalist Englishman addicted to retribution. The Times suggested that after an ‘immense amount of humanity’, with the reform of the penal code, there appeared to be a ‘new phase of the British character to find a general consent in favour of strong measures in the East.’ For some, the Indian uprising marked the resurgence in popular culture of a repressed retributive spirit connected to the teachings of the Old Testament, throwing into question the constructed identity of British civility.

Ultimately, the revolt failed to knock loose the jewel from the imperial crown. Yet it is a testament to the sheer depth of the impact it had on the people of the United Kingdom and India—political, social, cultural and intellectual—that it is still recognised as a landmark event in both these countries’ histories.

 

Arran Jenkins is a History graduate of the University of Worcester. His final year Independent Study, examining the British press and the Indian Uprising, won the Stanford History Group Prize for Best Independent Study in History.

Witch-Hunts Past and Present

Darren Oldridge

President Donald Trump is perhaps the most famous and certainly the most outspoken recent victim of an alleged ‘witch-hunt.’ But he is by no means alone. In the last few years, other supposed victims of the phenomenon have included Hillary Clinton and several prominent members of the British Labour Party. Since at least the 1950s the language of ‘witch-hunting’ has played a role in public life, and its usefulness to politicians of all stripes means that it continues to thrive.

Like all modern victims of a ‘witch-hunt’, Trump uses the term to impugn the motives and methods of his opponents. Witch-hunts are invariably wicked and unfair. To be accused is to be innocent; to pursue a witch is to act maliciously, hysterically or unjustly, and often all three at once.

As a historian of witchcraft I am wary of these connotations. While they illustrate modern-day perceptions of the past, they are a poor guide to the world in which real people suspected of harmful magic and compacts with evil spirits were once put on trial, and sometimes executed.  Indeed, our own language of witch-hunting is an impediment to understanding that world.

So how does today’s idea of a ‘witch-hunt’ compare to the historical record? The criminal prosecution of witches took place from the later 1400s until the early eighteenth century, and was responsible for around 50,000 deaths. The modern image of a witch-hunt corresponds most closely to the mass trials that occurred sporadically in this period, and especially in German-speaking lands. These could consume whole communities with dreadful speed, as happened in Trier in the 1580s and Würzburg in 1629.

Such events were mercifully rare. It was far more common for individuals or small groups of people to be accused, and in many cases they were probably not brought to trial. When they were, their treatment varied considerably from region to region, and was not always severe. In England, for example, only around a quarter of those formally accused of the crime were executed.

It was not the case, then, that to accuse a person of witchcraft was automatically to condemn them to death. In this respect the modern understanding of ‘witch-hunts’ misrepresents the past. Indeed, the majority of witchcraft cases cannot be viewed as hunts at all.

Nor were the people who dealt with witchcraft typically characterised by ulterior motives or disrespect for justice. Indeed, the occult nature of the offence meant that careful discrimination was required in its prosecution. Many experts on the subject, like the English physician and demonologist John Cotta and the Massachusetts pastor Increase Mather, argued for judicial caution in the face of a crime that was undeniably serious but difficult to prove.

Behind these observations is a deeper issue about belief. This is the fundamental difference between the fear of witchcraft in the past and the modern use of the term. Two core assumptions drove the witch trials in the sixteenth and seventeenth centuries: first and most commonly, witches were believed to harm others through the use of magic; and secondly, and of less importance to ordinary people, they were held to serve the Devil. In the most extreme version of this latter idea, witches were believed to fly to nocturnal assemblies where they committed atrocities and worshipped the Prince of Darkness.

Today very few people in the west believe in magic, and the concept is most familiar as either a metaphor or a kind of entertainment.  (And stage magic, of course, is expected to deceive.)  As a result, the most widely feared form of witchcraft in the past – the practice of destructive sorcery – is simply incredible.

What of the Devil? Here there are interesting international variations. A major survey in 1982 found that only 21% of the UK population believed that he existed; this had dropped to 10% by 2016. In contrast, a poll in 2005 found that 60% of Americans believed in the ancient enemy. This pattern reflects a more general difference in the prevalence of religious belief between western Europe and the USA. Interestingly, belief in God tends to be higher than belief in his adversary wherever the question is asked.

It may be hasty to declare the death of Satan. But it is true, nonetheless, that he is not an integral and uncontested part of the intellectual landscape of western communities, as he was in the age of witch trials. Nor is he connected to the practice of evil magic that is believed to cause real harm.

The modern use of the phrase ‘witch-hunt’ reflects this situation. As we no longer share the beliefs that once underpinned the crime of witchcraft, we find it hard to accept the crime at face value. We struggle to imagine the witch of the pre-modern world, whom people perceived as a figure of real menace. Conversely, we find it easy to assume that witch trials were impelled by ulterior motives. Typically, these include vindictiveness or greed, combined with a willing disregard for justice.

The current language of ‘witch-hunting’, then, indicates our own separation from the historical world of witch trials rather than the revival of pre-modern practices. (In some other, non-political contexts the parallels between past and present may be stronger.  Contemporary allegations of ‘ritual satanic abuse,’ for example, echo the early modern idea of the witches’ Sabbath. But that is another story.)

Ultimately, we have projected our own explanations onto the experience of those men and women who feared witchcraft in the past, recasting them as malicious or ‘hysterical’, and invariably unjust.  It is this version of the past – born of our profound separation from it – that underpins today’s talk of ‘witch-hunts’.

 

Darren Oldridge is Professor of Early Modern History at the University of Worcester. His most recent book is The Supernatural in Tudor and Stuart England (London & New York: Routledge, 2016), and he is currently writing a study of English demonology.

Management Practices in Developing Countries: The Case of Wasta Networks in the Arab Middle East

Sa’ad Ali

A majority of current management theories, developed since the start of the 20th century, have been created by researchers in the ‘developed’ countries of North America (the USA and Canada), Western Europe, and Japan. These theories have been constructed from data and observations of management practices in these countries and are viewed by their researchers and practitioners as the ‘right way to manage’. Companies from these countries that have expanded into developing economies have tried to transfer these practices with them, often neglecting the different cultural, institutional and socio-economic contexts. However, management researchers have recently started paying attention to these differences. This has resulted in a new stream of management research coming from developing countries that examines the business environment by taking into consideration the wider context of a specific country or countries.

In line with this stream of research, my research over the past seven years has focused on wasta in the Jordanian banking industry as a specific social and organisational phenomenon in Jordan and the countries of the Arab Middle East. The literal translation of wasta is mediation or intermediary, and in daily usage wasta refers to the use of connections in order to get something done. Someone who ’has wasta’ either has a degree of influence or has access to those who do, which can help to ‘get things done’. An individual in Arab countries might seek out someone with influence, who is then also called ‘a wasta’ in spoken dialect Arabic, or ‘waseet’ in classical Arabic, in order to find a job, secure a place at a university, or navigate the bureaucratic red tape that is so common in these countries. Wasta thus refers both to the action and to those who facilitate it. It has been suggested that wasta impacts nearly every facet of organisational life, and it is also a common source of discussion, and complaint, among Jordanians. Yet how wasta is practised in this context, and how it is experienced, remains largely unexamined.

The etymology of wasta, as both an action and a person, is generally associated with the notion of occupying a middle place in a network. When one looks further into the linguistic roots of the word, one can simply understand it as the ability ‘to get things done through the use of social connections’. As a cultural phenomenon, it is not remote from other practices and ideas of reciprocal social relations, such as guanxi in China, blat in Russia or pulling strings in the United Kingdom, or from the general idea of a relational give and take in the business world. The degree to which each of these phenomena prevails is a reflection of how networked the society is and how members of these societies prefer to socialise and do business.

In Jordan, as in other Arab countries, wasta impacts upon different business issues ranging from applying for trade licences and government services to securing governmental bids. In particular, it influences recruitment and selection: job seekers use wasta as a medium to secure employment, and organisations use it to secure qualified employees. From a Western-centric perspective, wasta tends to be perceived as favouritism or nepotism, which contrasts with the idea of a ‘modern’ (implicitly ‘Western’) workplace. Within Jordan, too, there is widespread debate on whether wasta should be abolished for the sake of Western-style ‘modernisation’ and in order to ensure equal employment opportunities for all.

On the other hand, my research, like that of others who study wasta and similar network-based phenomena, highlights the benefits of using it: it can have positive outcomes at the micro level for individuals, when mediation between parties helps a qualified individual secure a job. It is also beneficial to the organisation, which can secure a qualified and loyal employee in a country where certain skills and qualifications are scarce, for example due to the brain drain of Jordanian employees to the Arab Gulf countries. However, it can have some severe negative outcomes at the macro level, as it reduces organisational diversity and reinforces pockets of power within particular groups. Another negative impact of the use of wasta is that it weakens formal institutions, as it reduces trust in political and legal processes.

While my research highlights some positive aspects of wasta at the micro level, such as enabling qualified individuals to attain a chance of employment and securing trustworthy and qualified employees for the organisation, it is also important to consider the negative impact of using this practice. There are, for instance, diversity and exclusion issues, and this suggests that an emic approach to culture-specific concepts can never be free of power implications. As such, wasta itself is not purely ‘good’ or ‘bad’; it is a way of doing things that can have positive or negative outcomes depending on how it is practised and on what level it is viewed (micro or macro). Organisations seeking to do business in Arab countries need to acknowledge the widespread practice of this phenomenon and try to accommodate it without breaking their own ethical principles.

Management practitioners are invited to consider how diversity and merit are not culture-free concepts; in fact, their recent prominence owes much to the post-modern Western need to come to terms with a new understanding that the global world does not map securely onto the largely Western-dominated management literature. This research invites managers to understand that what is considered good practice in different cultures should be balanced by the different cultural perspectives on what is considered ‘good’ and ‘the right way to manage’.

 

Dr Sa’ad Ali is Lecturer in Human Resource Management, University of Worcester.

A World Increasingly United by Growing Divides – Time for New Partnership Responses?

Gareth Dart

Earlier this semester I attended a conference exploring issues of disability in southern Africa, held at the University of Botswana in the capital, Gaborone. Having spent nearly a decade working in the country but not having revisited for the last eight years, I was intrigued to see what might have changed. An issue that was reinforced through my visit is that it is becoming more and more difficult to talk in general terms about differences in wealth and development between countries (and let’s leave aside for now exactly what we mean by that…); rather, we need to be far more aware of nuances both between and within them.

The new conference centre (complete with soon-to-be-opened hotel) at the University of Botswana makes our conference facilities appear charmingly retro: a beautiful main auditorium, lots of well-furnished breakout rooms, excellent catering facilities and so on. Friends and former colleagues all seemed to be driving cars twice the size and much newer than I could aspire to, sitting in air-conditioned comfort in traffic jams twice as long as I recall them being, even though there appear to be twice as many roads as I remember. Presumably all that extra fuel burning up while people get nowhere slowly is doing wonders for the country’s GDP.

Visits to two large villages that I know well provided a counter-story. In one, a friend who is a tailor continues to live a day-to-day existence in competition with the flood of cheap Chinese imports, sold from small shops run by Chinese migrants. In the other, a College of Education, my former workplace, already in decline by the time I left, has further decayed in terms of its fabric and role, a stark contrast to the gleaming new buildings popping up all over the University some 50km away. I popped my head into my old office, still the base for the Special Education team, and found an old handout of mine lying on one of the desks. “We have new material too!” the current occupant was keen to point out.

The other evening there was an item on the news about Hartlepool and the impact of the introduction of Universal Credit. What struck me was the paucity of cars in the streets. It looked like the photographs one sees from the 1950s and is presumably (though any geographer reading this might want to put me right) an indication of poverty rather than an urge to live a greener, more sustainable lifestyle. The richer elements of Botswana are looking very much like our richer elements, and the poorer parts of Britain are starting to look more like the poorer parts of Botswana. Though, given the respective climates, I think I know where I might prefer to lead a life of poverty if I ever have to.

A question for us as academics and practitioners interested in working in partnership in such contexts is what this emerging, more finely nuanced reality might mean for us. I wonder if we need to think about more equal partnerships where we work on mutually compatible problems: let’s explore what poverty means and how we might ameliorate its impact in peri-urban Botswana and Hartlepool; what does a rural Botswanan school’s attitude to including children from the whole village have to say to us in rural Hereford? We are now used to the need to demonstrate ‘impact’ and work ethically. Perhaps both those notions are due for a shift or broadening of focus.

 

Gareth Dart is Senior Lecturer in Education, University of Worcester.

Research Agendas: Dr Gyozo Molnar

Gyozo Molnar

Whose knowledge counts in Adapted Physical Activity? Adapted physical activity (APA) is a cross-disciplinary body of practical and theoretical knowledge directed towards impairments, activity limitations, and limited participation in physical activity (see the full definition at http://ifapa.net/definition/). Despite its cross-disciplinary nature, APA still appears to lack diversity in terms of its approaches to research and to engaging with and representing the voices of participants. Informed mainly by traditionalist paradigms, the field of APA has been critiqued for marginalising the experiences of its users and paying little attention to the power imbalances inherent in traditionalist research approaches. ‘Who is the expert and whose knowledge counts?’ is therefore a critical question in APA. The purpose of my ongoing research project is to respond to it through a systematic content review of the prevailing research methodologies and discourses in APA and to propose alternative perspectives.

My research is ultimately inspired and guided by epistemic and ethical responsibility. In other words, it interrogates whom the field is intended to serve and support, and with whom and for whom its research is conducted. Specifically, there is concern with the dominant epistemic authority in APA – that is, with legitimate knowledge formation and with who has the authority to speak about it. Preliminary investigation indicates that this authority is strongly associated with the medical model of disability and with traditional, objective and researcher-centred approaches. Furthermore, an ongoing critique of the field is the lack of presence and voice of people experiencing disability in APA research and practice.

Consequently, my research will address the limited scope of APA research in terms of the field’s engagement with specific research methods, and will demonstrate the need for more diverse perspectives in order to address some of the overarching, complex and key issues within the field (e.g., inclusion). While APA is considered a multidisciplinary and cross-disciplinary field guided by diverse epistemological and methodological approaches, preliminary inquiries suggest there is in fact limited breadth, with emphasis on specific types of inquiry emanating from prevailing disciplines and the pervasiveness of the medical model of disability. Not only is this narrowing from a knowledge-generation standpoint, but it also stifles the opportunity to imagine and engage in work that has the potential to transform both research and practice. Complex problems are unlikely to be fully recognised and addressed within any one particular discipline or approach. Furthermore, there has been strong critique of the medical model because it does not account for the role of society in disabling people, or for other alternative understandings of disability.

As recently as 2015, a documentary analysis was conducted of research trends published in the Adapted Physical Activity Quarterly (APAQ) over a 10-year period, with the identification of research methods as one category of analysis. That study, however, reported its findings primarily along qualitative and quantitative divides, with accompanying descriptions of research design; my research differs significantly in its level of analysis with regard to research methods. It seeks to go beyond this type of review by attending critically to the level of epistemology. By revealing the assumptions underlying researchers’ methodological choices, the possibilities for research integration and active communication across disciplines become more plausible.

Beyond reporting current trends and possible future directions, my research will also explore alternative approaches and the possibility of methodological integration across disciplines. This offers a unique entry point for researchers to consider how they might collaboratively move across disciplines to address complex, key issues in the field. This is also timely, given recent calls for more interdisciplinary work. The focus, however, is not simply to showcase alternative research approaches but to suggest their active application and the development of cross-disciplinary dialogues. My research will highlight and reinforce the epistemic responsibility of APA researchers, as builders and gatekeepers of the current APA scientific establishment, to engage actively with alternative approaches and with participants by keeping the following question under continuous examination: who is the expert, and whose knowledge counts?

 

Dr Gyozo Molnar is Principal Lecturer in Sports Studies, University of Worcester, and co-editor of Women, Sport and Exercise in the Asia-Pacific Region: Oppression, Resistance, Accommodation (Oxford: Routledge, 2018).

War Requiem

Howard Cox

The ending of the “Great War for Civilization” on 11 November 1918 has left an indelible mark on the national memory of the United Kingdom. Now that 100 years have passed since this momentous event took place, our recollections are no longer of the event itself but of its legacy in terms of the memorials, symbols, texts and artefacts that were created in its wake. I write these words whilst listening to Ralph Vaughan Williams’ poignant Third Symphony, an evocation of the battlefields of northern France where the composer’s idea for the music first took shape. The symphony is one of many examples of the way in which it is possible to stimulate a recollection in the present day of an event for which no lived experience endures.

As an aspect of our social memory of the First World War, Vaughan Williams’ Third Symphony is now merely an historical testament. As the historian Jeffrey Olick has noted, ‘History is the remembered past to which we no longer have an “organic” relationship’. At the time of the symphony’s publication and first performance in 1921, however, the music would have fed into and formed part of a collective memory of the war, resonating widely with the direct, lived experience of those who had lived through the conflict. For Olick, therefore, ‘collective memory is the active past that forms our identities’. Thus the same musical artefact that today provides us with an historical social memory of the war fed, at the time of its writing, into a collective consciousness of the event that spurred its creation.

The collective memory of an event such as a world war can be generated at different levels of a society. As a piece of music listened to quite widely, Vaughan Williams’ symphony would have helped to form one element of a national collective memory of the war during the 1920s. Similar forms of collective memory can exist at more disaggregated levels of the social structure. Organisations such as businesses can generate symbols, texts and artefacts that stimulate sufficiently common forms of recollection to provide a collective corporate memory. In a recent paper, published as part of a Special Issue of the Journal of Management & Organizational History to commemorate the centenary of the end of the war, I have identified one case in point.

In 1915, shortly after the outbreak of the First World War, the British-American Tobacco Company (BAT) moved its London headquarters into a newly-built, prestigious building opposite the Houses of Parliament which it named Westminster House. Many of its employees had already volunteered to serve in the armed forces and space was set aside in the new building to exhibit various forms of memorabilia which the returning staff had garnered from the battlefields (including some live ammunition!). Staff in the company also began to publish a weekly newsletter, called the BAT Bulletin, in which correspondence from those serving abroad (suitably censored) was published and disseminated to work colleagues – including those at the front. In this way the company helped to generate a collective memory of the war that was specific to the firm’s own employees.

Interestingly, this process of creating a collective corporate memory of the First World War was not replicated during the conflict of 1939-1945. Although the impact on the company of the Second World War was every bit as profound as its earlier counterpart, the circumstances were such that no contemporaneous artefacts and texts were generated. Moreover, whereas the lived experience of many BAT staff during the conflict of 1914-18 took a similar form – due particularly to the static nature of much of the fighting – the Second World War saw the company’s staff experiencing the conflict from a far more diverse set of circumstances. Indeed, for many of the company’s employees, the war with Japan led to staff in China and the Far East being incarcerated for many years in prisoner of war camps, without ever being recruited to the armed forces.

In many ways this disparity in the corporate memory of BAT mirrors a more general pattern in the national collective memory of the First and Second World Wars. The most evocative symbol of our social remembrance – the poppy – was spawned directly by the battlefields over which Vaughan Williams fixed his gaze sometime around 1916. No corresponding symbol of remembrance serves the same purpose in relation to the Second World War. My own musical imagery of this later conflict finds its counterpart to Vaughan Williams’ Third in Dmitri Shostakovich’s Eighth Symphony which, the composer later wrote, ‘was an attempt to express the emotional experience of the people, to reflect the terrible tragedy of the war’.

 

Howard Cox is Emeritus Professor of International Business History, University of Worcester, and the co-author of Revolutions from Grub Street: A History of Magazine Publishing in Britain (Oxford: Oxford University Press, 2014).

Research Agendas: Dr David Storey

David Storey

I am currently working on two books reflecting my long-standing research interests in issues of territory and identity. In Transferring Allegiance: Football, Place and National Identity, I explore the intricate connections between football, place and politics. The focus is on the phenomenon of footballers who switch national allegiance from the country where they were born to the country where they live or to which they have family connections. The declaration of a sporting nationality that may differ from an ‘official’ one casts light on ideas of cultural hybridity and highlights the need to see identities as fluid and flexible. Responses to this phenomenon from supporters, media and those involved in sport range from an essentialist and exclusionary view of national identity through to more progressive, inclusionary, flexible and pragmatic perspectives. Drawing on a range of examples from a variety of geographic contexts, the book illuminates the complexities of ethnic and national identity and the ways in which sport becomes a medium through which allegiances are (re)produced and expressed.

A second publication, A Research Agenda for Territory, is an edited volume which interrogates how ideas of territory and territorial practices are intimately bound up with issues of power and control. Drawing together contributors from various countries, it aims to provide a critical assessment of key areas of scholarship on territory with a view to mapping out a future research agenda. Territories are socially produced and reflect specific ways of thinking about geographic space, while territorial strategies convey messages of political power which are communicated through various means, including the creation and securing of borders. Territories, and the ways in which they are imagined, play an important role in the formation of peoples’ self-identity and contribute to feelings of belonging or exclusion. People identify with territories, most obviously through ideas of the nation, and territories can be seen to exist (with various degrees of control, contestation and bordering practices) across a range of spatial scales and in a wide variety of contexts. The chapters in the volume draw together discussions on the conceptualisation of territory and on the ways in which territorial practices operate across these different scales and contexts.

 

Dr David Storey is Principal Lecturer in Geography, University of Worcester. A Research Agenda for Territory will be published by Edward Elgar. Transferring Allegiance: Football, Place and National Identity, is to be published by Rowman & Littlefield.

Image: Dr David Storey on the Dutch-Belgian border, 2018.

Political Extremes: Too Hot to Handle?

Neil Fleming

Politicians and commentators regularly talk about ‘political extremism’ and ‘extremist politics’, often in response to violent outrages, but also to justify new laws and policies of surveillance. Yet, despite their regularity, public accusations of political extremism tend to take the form of general statements and condemnations rather than precise definitions. Legal proscriptions on extremism are targeted at specific groups, language and behaviour. But extremist beliefs are considerably harder to define and police, and any attempt to do so risks undermining a state’s liberal democratic credentials. What’s more, there is a reluctance among Western politicians to acknowledge that definitions of extremism are contested, and that they have shifted considerably over time.

What has remained constant, however, is the idea that ‘extremes’ are bad and the ‘middle’ good. Plato and Aristotle are seldom cited in today’s media coverage of extremism, yet they established some of our most basic assumptions. Both equated the middle with virtue and the extremes with degeneration and barbarism. Plato believed that the ‘middle’ could only be achieved through a mixed constitution of monarchy and democracy. Aristotle advanced the idea that different types of constitution existed on a continuum, and that a powerful middling class was the best guardian of the political centre. These ideas were neglected until the thirteenth century, when Thomas Aquinas applied them to commend virtuous rulers and warn against tyrants.

When translations of these works first entered England in the 1590s, they were regarded as a direct threat to the Tudor state. Within half a century, however, royalists and parliamentarians competed with one another to associate their respective causes with mixed government. In the wake of the Glorious Revolution of 1688, the virtues of constitutional monarchy became widely celebrated, and supplied a model for dissidents in Europe’s absolutist monarchies. The founding fathers of the United States developed this further to advocate a mixed constitution which ensured moderation through the division of power and the balance of opposing social forces.

This might have settled the matter but for the French Revolution. It established the now familiar left-right political division and cast both ends of the political spectrum as extremes. In France and elsewhere, liberals came to regard revolution and reaction as extremes, with some advancing the innovative idea that both extremes were alike. They applied the label ‘ultra’ to politicians who adopted an exaggerated political position, and the suffix ‘-icide’ to mark a destructive, murderous tendency. Reactionaries initially cast democracy as extreme, but over time they instead harnessed it for their own ends. For Marxists, the clash of extremes served as the motor of history and the means of achieving dictatorship of the proletariat. This led liberals to worry that extremist parties called forth one another in a never-ending struggle, and that the political middle could be shifted as a result. The danger seemed to be demonstrated by the US Civil War, which many held to be the result of extremism on both sides of the slavery debate. Not for the last time, liberals considered illiberal legislation to suppress extremism.

In the first half of the twentieth century, the extreme left and extreme right were readily identified with Communism and Fascism. The Italian politician and priest, Luigi Sturzo, argued that both systems were similar, in that they denied legitimacy to other parties and sought to determine public and private lives. Others took a more partisan approach, with conservatives viewing Fascism as a bulwark against Communism, and socialist governments adopting constructive relations with the Soviet Union.

Claims and counter-claims about extremism continued after the Second World War. Divided between East and West, the two German states denounced each other as extremist. The East’s criticism of survivals from the Hitler era was later taken up by student radicals in West Germany. The Bonn authorities at first reacted harshly, but over time came to accept that many critics were not necessarily anti-constitutional or even extremist. This led some to express concern that the partial accommodation of erstwhile extremists might lead to relativism and weaken the basis of liberal democracy. This type of anxiety was not confined to Germany, and continues to exercise commentators today on a range of controversial issues.

The reaction to violent outrages such as the Charlie Hebdo attacks in 2015, the protests in 2017 about Confederate imagery in the United States, and the alarm expressed about the recent rise of populist parties on the right and left in Europe and beyond, all demonstrate that debate about the acceptable bounds of political behaviour is normal in healthy democracies. It can of course lead to illiberal responses, and knee-jerk reactions risk depicting people with reasonable grievances as extremists. But some kinds of extremism have been progressive forces in politics. As long as debate about extremism is encouraged, and set in a historical context, we might better avoid or move on from simply unreflective and unconstructive condemnation.

An earlier version of this post was published originally by History Matters, run by the Department of History, University of Sheffield.

 

Dr Neil Fleming is Principal Lecturer in Modern History, University of Worcester, and the author of Britannia’s Zealots, Volume I: Tradition, Empire and the Forging of the Conservative Right (London: Bloomsbury Academic, 2019).