International Conferences on Slavery and Race

Professor Suzanne Schwarz presented a paper and joined a discussion panel at the Maghrib Conference on Race, Gender and Migration in Fes, Morocco, from 15 to 17 December 2019. The conference, held in honour of Catherine Coquery-Vidrovitch, Professor Emerita at the University of Paris-Diderot, was organised by the Center for Maghrib Studies at Arizona State University. It was hosted by Morocco’s International Institute for Languages and Cultures (INLAC), and sponsored by the Centre International de Recherches sur les Esclavages et Post-Esclavages and L’Unité de Recherche Migrations et Société.

Professor Schwarz was part of an international discussion panel on the ARTE film Les Routes de L’Esclavage (Slavery Routes) with Catherine Coquery-Vidrovitch, Klara Boyer-Rossol, Salah Trabelsi (Université Lumière Lyon 2), and Chouki El Hamel (Arizona State University). Slavery Routes is a four-part documentary, co-directed by Daniel Cattier, Juan Gélas and Fanny Glissant, with Catherine Coquery-Vidrovitch as historical advisor. Suzanne Schwarz was interviewed for the series, which has been screened widely, including on Al Jazeera in 2018. She also presented a paper entitled ‘Reinterpreting Cultural Encounters: European mariners in Morocco in the Late Eighteenth Century’, which analysed the contemporary accounts of a number of European mariners who experienced captivity in the Maghrib.

In March 2020, Professor Schwarz participated in an international online conference. The two-day conference, entitled ‘Documenting Africans in Trans-Atlantic Slavery (DATAS)’, was due to be held at the University of Essex but was moved online. It brought together 25 participants from Canada, the United States, Brazil, Costa Rica, France and other European countries. This was the first in a series of planned conferences for the international research team engaged in the DATAS project, which is led by Paul E. Lovejoy, Distinguished Research Professor and Canada Research Chair in African Diaspora Studies at York University, Toronto. After submitting a successful bid to the ‘Trans-Atlantic Platform Social Innovation Call’, the team received funding from the Trans-Atlantic Platform: Social Sciences and Humanities. The DATAS project ‘develops an innovative method to explore African ethnonyms from the era of trans-Atlantic slavery, circa 1500-1867. Ethnonyms index African identities, places and historical events to reconstruct African culture that is linked to a history of slavery, colonialism and racism. The project centres on the need to understand the origins and trajectories of people of African descent who populated the trans-Atlantic world in the modern era. The development of a method for analysing demographic change and confronting social inequalities arising from racism constitutes a social innovation’. Tracing the availability of relevant archival sources about African men, women and children is central to the project, and Professor Schwarz presented a paper entitled ‘Research on Sierra Leone: British Library Endangered Archives Project, Sierra Leone Public Archives’.

The Second World War, VJ Day and Imperial Amnesia

Neil Fleming

Even with the cancellation of public commemorations marking the 75th anniversary of both VE Day and VJ Day, the former remains larger in the national consciousness. Indeed, this year, during the global health crisis resulting from the coronavirus, VE Day was held by some to possess a special resonance with its message of triumph over adversity.

Why not VJ Day? There could be a number of explanations. The war in Europe was physically much closer to home, as was the threat posed by Great Britain’s main adversary, Nazi Germany. The bulk of Britain’s defence forces were concentrated in the European theatre. The war in the Pacific was dominated by the United States. And it was the Americans who brought it to a conclusion by introducing a new and highly destructive weapon that confirmed its economic, scientific and technological superiority over the old powers, including the United Kingdom.

Whatever the reason, the tendency to focus on VE Day, which did not represent the end of the Second World War, rather than VJ Day, which did, is partly bound up with ‘imperial amnesia’. In focussing largely on the conflict (and its conclusion) in Europe, it becomes possible to forget about the vital assistance given to Britain during the war by millions of people in its far-flung colonies. It also means ignoring the difficult relationships and tensions between the British and their colonial subjects. It must be said that contemporaries understood that it was the British empire that was at war with the Japanese empire. It was an idea drummed into them by the wartime government, not least the prime minister, Winston Churchill.

The explanations for imperial amnesia are several. The fact that the empire was largely decolonised by the late 1960s is an obvious explanation. The reluctance of schools and museums to tackle the subject is another. There is wariness too of doing anything that has the potential to glorify or excuse imperialism; this has certainly served as a restraint on public commemoration in many former colonies.

However well-meaning, ‘imperial amnesia’ has had unintended consequences. In the 1960s, opponents of non-white immigration to the UK claimed that the Welfare State was meant to reward the nation’s wartime sacrifice, and that it was therefore unfair that South Asian, Caribbean, and African immigrants also benefitted. As historians demonstrate, the significant expansion of welfare after 1945 was not a ‘reward’ but a means of addressing the urgent need for post-war reconstruction. It was also a direct response to the intractable economic and social problems that had scarred communities across the UK in the 1920s and 1930s.

Far from being latecomers, non-white communities had been present in Britain for several centuries, sustained by imperial networks of employment and trade. They were often subjected to racism, as during the infamous 1919 ‘race riots’, but their established presence in the country meant that they were an integral part of the UK’s war effort in 1939-45. Moreover, almost as many non-white men as white men volunteered to serve in the ‘British’ armed services during the war. By far the largest contingent came from British India (present-day India, Pakistan and Bangladesh), though Africans and West Indians also enlisted in significant numbers.

Indeed, in the wake of the recent ‘Windrush scandal’, it is worth recalling that many of those on board the Empire Windrush in 1948 were Jamaican ex-servicemen. As such, they had already experienced living in wartime Britain and they were not alone. Lesser known examples include the hundreds of Caribbean workers in British munitions factories, and the almost thousand-strong workforce of Honduran foresters stationed across Scotland.

It is also important to recall the wartime role of Muslims, especially as Britain’s Muslim community is today the target of extreme right-wing groups that have appropriated iconic imagery associated with the Second World War. As Muslim leaders have asked, why is it not better known that over a million Muslims volunteered to serve in the armed forces of the British empire, just as an equally large number had served in the First World War?

Wartime mobilisation involved far more than recruiting soldiers, sailors and airmen. Civilian work was vitally important, in agriculture, industry and transport, and women and men were mobilised across the empire to keep the wheels of production turning. Canada’s role in the Atlantic convoys is still remembered in Britain, but almost completely forgotten is the presence on board many British-registered ships of large numbers of South Asian and African sailors.

In relaying all of this it is important not to replace one myth with another. It is true that the British empire was more cohesive in its response to the war than some had previously predicted. But there were still tensions between Britain and the self-governing ‘dominions’—Canada, South Africa, Australia and New Zealand—over wartime strategy and organisation. More significantly, the image of a united front was undermined by nationalist opposition to participation in the conflict. The Indian National Congress chose to boycott the war effort despite being sympathetic to its aims. It was frustrated that Indian politicians had not been consulted by the British before putting their country’s vast resources on a war footing, despite years of reassurances that India was moving towards self-government. Ireland (excepting Northern Ireland, which remained part of the UK), had become a dominion in 1922, and during the 1939-45 ‘Emergency’ it determinedly demonstrated its sovereignty by remaining neutral.

These examples of nationalist opposition were not completely effective. Some 43,000 Irish citizens joined the British forces, including deserters from the Irish army, and many Irish women and men moved to Britain to perform civilian war work. In the case of India, a staggering 2¼ million men served in the armed forces, playing a crucial strategic role in South Asia, the Middle East, and the Far East.

The motives of volunteer recruits and war workers varied, from escaping poverty to seeking a new life. But not everyone was a volunteer. In Nigeria and Tanganyika (part of present-day Tanzania), the colonial authorities were so determined to extract certain resources that they resorted to forced labour.

As this suggests, racism remained a marked feature of the British empire even as it waged war against the Nazis. There was nevertheless some progress. The ‘colour bar’ preventing non-white men from serving as commissioned officers was removed in October 1939. Non-white participation in legislatures was advanced in Nigeria, the Gold Coast (present-day Ghana) and British Guiana (present-day Guyana). In 1944 Jamaica was granted a House of Representatives elected on a universal franchise. Still, these developments should not disguise the fact that the British were determined to remain in control of the pace of constitutional reform, and that they were willing, during the war and for almost two decades after it, to use coercion against those actively challenging their authority.

That determination reminds us that the Second World War did not herald the end of the British empire, at least not immediately. The independence granted to India and Pakistan in 1947 was part of a process of constitutional reform dating back to 1919. The displacement of Britain during the war by the United States, especially in the Caribbean and Pacific, reflected the shifting balance of global power, but it did not eliminate completely Britain’s influence in either region.

‘Imperial amnesia’ is understandable given that racism, violence, and oppression were features of imperialism. Equally, a determination to avoid repeating Europe’s destructive wars has encouraged precious habits of reconciliation and cooperation among former foes. Yet, it has meant that Victory in Europe Day looms larger in the British imagination than Victory over Japan Day, despite most of those fighting in the service of George VI (as King, or for his Indian soldiers, as Emperor) having to carry on until the Japanese surrender in August 1945. If this preference for VE Day over VJ Day is unlikely to change, let us hope that Britain’s multicultural society can at least remember that Asian, African, and Caribbean men and women contributed to the victory marked this year.


Neil Fleming is Principal Lecturer in Modern History, University of Worcester. He is the author of Britannia’s Zealots, Volume I: Tradition, Empire and the Forging of the Conservative Right (Bloomsbury Academic, 2019), and co-editor of Histories, Memories and Representations of being Young in the First World War (Palgrave Macmillan, 2020).

Prejudices and Priorities in a Previous Global Pandemic – Spanish Flu in the West Midlands

Maggie Andrews

Scientific research and understanding often follow pandemics; viruses were identified in 1933, more than 10 years after what is now commonly referred to as the Spanish Flu. In the face of uncertainty and partial knowledge, government decisions and popular perceptions just over a hundred years ago were shaped by cultural and political preoccupations, prejudices and already existing priorities. Scientists estimate that between 1917 and 1920 a quarter of the world’s population were infected by the Spanish Flu; between 50 and 100 million of them died.

In the West Midlands, the first cases appeared in April 1918. Britain was war-weary after nearly four years of conflict; there was little hope of an end to the stalemate on the Western Front and the Germans were poised to retake Passchendaele. Those on the home front were struggling to cope with the death and disruption caused by men fighting in foreign fields, alongside Zeppelins and aeroplanes bombing Britain. Housewives battled to feed their families as food shortages led to price rises, food queues and finally the introduction of rationing in 1918. At a national and local level, winning the war was prioritised, and concern about the influenza focused on how it compromised war production or the health of soldiers required for battle. Many in the Army Medical Corps considered that they had more pressing medical issues than flu, which was often dismissed as a mild complaint compared to sepsis, gangrene, lice and enteric fevers. During this initial wave of the pandemic, which abated by late July, newspapers, doctors, propaganda and politicians conveyed the message that ordinary people should either ignore it, smile through the threat of illness, or take care of themselves. The Birmingham Mail reassured its readers that ‘fortunately the disease is not of a virulent type, carrying with it any actual danger’. Mortality figures suggest otherwise.

By early autumn 1918, a more lethal strain of the flu had arrived, lasting until the end of December. In November, as the guns of battle fell silent, Dr Masbyn Read, the Medical Officer of Health in Worcester, reassured the local population that although there had been 41 deaths from influenza in the city the previous week, the ‘disease is past its worst’. The local press, however, noted that the disease was still widespread in Evesham and Stourport. Over 1,500 cases were reported in Kidderminster, which was considered to be particularly badly hit, and many workers were absent from the town’s carpet and spinning mills. In Worcester there was a shortage of comics, purchased to keep sick children amused when confined to bed. Despite demands for theatres and cinemas to be closed or more tightly regulated, they were used to give the public information about preventing the spread of the disease, including the infamous film Dr Wise and a Foolish Patient. Schools were closed in some areas for two or three weeks, occasionally longer, despite anxiety that children running around the streets unsupervised would spread the disease.

Despite the Armistice, the war was not officially over until the Versailles peace treaty was signed on 28 June 1919, with the result that Germany continued to be blockaded and the British government was slow to release the doctors and nurses in the armed forces who were needed to combat the disease and care for the sick. Many doctors and medical students had volunteered for the forces, and in early 1918 the government had raised the call-up age for doctors to 55. Although elderly doctors were encouraged to return to practice, the number of doctors able to minister to patients in the West Midlands was heavily depleted. Doctors were run off their feet, as were undertakers and vicars. Long queues formed at chemists’ shops, and some people consumed lethal doses of aspirin. Charlatans made spurious claims and significant profits for patent products such as Dr Williams’ Pink Pills for Pale People. Consumers were requested not to hoard Bovril, thought to prevent or at least aid recovery from the flu. The failure to release undertakers and gravediggers from the forces in late November 1918, or to make stocks of whisky (considered a cure) freely available, was widely criticised. A government with one priority – in this case winning the war – could not quickly change course and marshal resources to deal with another. Nevertheless, by late November, as the general election campaign got underway, soldiers in Britain were deployed to help dig graves and whisky could be obtained with a letter of recommendation from a doctor.

In European history and memory, the devastating consequences of the pandemic have often been either eclipsed by or entwined with the death and injury caused by the First World War. On every continent the flu was responsible for many more deaths than the war, and it was particularly devastating in India. In Britain approximately 850,000 were killed in armed combat whilst 240,000 died from the virus, although diagnosis was not always reliable. A unique feature of this pandemic was that it disproportionately affected younger adults; the peak age of death was 28. Thus, for many ordinary people like Mr and Mrs Slade Nash of Martley, Worcestershire, influenza was yet another element of the misery and sorrow that warfare inflicted upon their lives. They had already lost two sons in the war when their daughter Margaret died from influenza in October 1918.

Whilst the First World War did not cause the pandemic, the mobility of people in wartime (shifting between countries or regions) helped to spread the influenza. The Spanish Flu is now understood to have originated in Kansas in the USA, and was brought to Europe by soldiers sailing across the Atlantic to fight on the Western Front. As a neutral country, Spain was not subject to the censorship other countries had experienced, and its press freely reported on the progress of the disease, particularly when King Alfonso XIII and members of his government became unwell; people thus came to associate the disease with Spain. The Times noted on 18 December 1918 that once the disease had reached London it radiated out through Birmingham, Nottingham and other major centres via the rail networks. Public activities, schools, factories, churches, cinemas, theatres, public transport and Armistice Day celebrations all helped to spread the virus. Soldiers on the battlefronts, on leave, in training or prisoner-of-war camps, in hospitals, or when demobilised, were efficient carriers of influenza.

In the absence of any genuine scientific understanding of viruses and how the influenza actually spread, multiple explanations and theories emerged, shaped by prevailing cultural prejudices, xenophobia and class prejudice. The Birmingham Daily Post pointed out in June 1918 that ‘the man in the street … is sometimes inclined to believe it is really a form of pro-German influence’. The ‘unseen hand’ of the enemy was supposed to be spreading the illness. A quarter of a million Germans died from the disease; nevertheless, for some hyper-patriots the link between germs and Germans was very close, and a number of nicknames for the virus emerged that embodied this prejudice, including ‘Flanders Grippe’, ‘Hun flu’, and the ‘Turco-Germanic bacterial criminal enterprise’. Soldiers writing from the front or visiting Britain on leave or to convalesce also conveyed rumours that the heart of this ‘German plague’ lay in the unburied corpses on the battlefields or the Germans’ use of poison gas.

The sense that the conditions of war – exhaustion, war weariness, and food shortages – precipitated the spread of the disease had more credibility, but was again laced with a touch of anti-German feeling. Thus, the Birmingham Daily Post argued on 25 June, ‘There can be no doubt whatever that it has been recurring in a very severe form in Germany, Austria, and the territories occupied by the Central Powers during the last two years.’ Those who were not convinced that the flu was the intentional work of the enemy powers often perceived it as an unintentional consequence of war and the arrival of groups of workers or troops from abroad. The French blamed the Spanish, and in particular the Spanish workers who replaced French men who had joined the armed forces, whilst the Spanish singled out the Portuguese for culpability. The Germans apparently suggested that the flu had been imported by the 100,000 men of the Chinese Labour Corps who worked behind the lines, assisting the Allies in France.

Other explanations for the influenza included the wartime propensity to dig up the land to grow more food, poor housing, and dirt. The Medical Officer of Health in Worcester long maintained that combatting this disease was ‘purely a personal matter’ and that its spread was ‘due to the carelessness of individuals’. Newspapers warned against overcrowding, indiscriminate spitting and alcoholism, and encouraged thorough cleaning of houses and whitewashing of walls. Cultural assumptions and prejudices towards densely populated working-class areas of industrial cities were not, however, supported by evidence; influenza was a remarkably democratic illness. The Evesham Journal pointed out on 2 November that ‘All classes of the community are affected, and it appears to take an even more virulent form with the apparently hale and hearty.’ In the village of Badsey, where locals and 200 German prisoners of war worked on the land, the virus was rife.

This swirling mishmash of explanations, rumours, accusations and assumptions about the flu did have two positive longer-term influences. It added weight to arguments for the establishment of the Ministry of Health in 1919, and for slum clearances and the building of working-class housing in the inter-war years across the region from Birmingham to Tewkesbury.


The material in this blog is drawn from Maggie Andrews and Emma Edwards, Bovril, Whisky and a Shortage of Gravediggers: The Spanish Flu comes to the West Midlands (History West Midlands, 2019).

Professor Kay Mohanna’s Australian Collaboration

Griffith University, in Queensland, Australia, recently welcomed Kay Mohanna, Professor of Values based Healthcare Education at the University of Worcester.

Professor Mohanna visited the Griffith Institute for Educational Research on 24 February 2020 to begin consolidating plans for a collaborative, inter-disciplinary project with Dr Jeanne Allen, Associate Professor of Teacher Education, and a team of Griffith researchers. Kay and Jeanne also met with Dr Leonie Rowan, acting Director of the Griffith Institute and Associate Professor of Education, to discuss the project, which will be framed within the politics of belonging and investigate the development of a sense of belonging and professional identity among the increasingly mobile student population in higher education.

Dr Allen is a co-editor of the Asia-Pacific Journal of Teacher Education, which publishes research that advances knowledge in teacher education across early childhood, primary, secondary, vocational education and training, and higher education. Her work with Kay builds on their previous research on students’ expectations and experiences in internationalised and globalised higher education (Kay is on the editorial board for Education for Primary Care, which aims to support the implementation of best-evidence medical education relevant to primary care). Their previous work includes a 2013 article in Educational Theory and Practice.

Through this link with the Griffith Institute for Educational Research, Professor Mohanna is working with one of the largest concentrations of university-based education researchers in Australia. The Institute comprises a vibrant community of scholars whose applied research with industry partners, education systems, schools, community and not-for-profit organisations aims to translate findings into practice. It has identified key research foci that include, but are not limited to, leadership, living with autism, and professional and practice-based learning, and its research programmes, led by five esteemed senior researchers, build on evidence of current, successful research while remaining alert to new intellectual projects that can raise their productivity and reputation further.

Initially, the collaboration will carry out a scoping review of the literature. This will examine the implications of the globalised movement of people, ideas and capital, including how international students navigate multiple types of borders in ways that can exacerbate their experiences of marginalisation and hinder their development of a sense of belonging.

British Press Perceptions of the 1857–58 ‘Indian Mutiny’

Arran Jenkins

In 1963 the skull of an Indian man was discovered in a small pub in Kent. Folded and pushed inside one of its eye sockets was a scribbled note that revealed he was called Alum Bheg and that he had lived over a century ago in India. He had met his end, like many others, when he was blown from a cannon for his alleged involvement in an attack on a British resident family. Bheg’s skull was a war trophy taken during a mutiny of sepoys, or native soldiers, that began in the ancient city of Meerut on 10 May 1857 and soon erupted into a wider conflict that engulfed an estimated one sixth of the East India Company’s (EIC) territory in India.

The revolt lasted fourteen months and its implications sent shockwaves through the martial and civil spheres in India. It also reverberated throughout British society. To see evidence of this we need look no further than the rivers of ink spilled in newspaper coverage. The numerous editorials, opinion articles and letters from the public that appeared in national and regional publications, such as The Times and Berrow’s Worcester Journal, expose a multifaceted debate revolving around the 1857 Uprising that touched upon such disparate topics as the shortcomings of the EIC, the character of the Indian people and the loyalty of Catholic soldiers. Capital punishment was another recurring topic and it is the focus of this piece.

Contemporary newspapers viewed the ‘mutiny’ as an event of great importance, and it attracted widespread attention in Britain. Such interest is likely to have been based more on familial than fiscal ties. The 30,000 British soldiers – and the family members that they brought with them – came from a variety of backgrounds, and they maintained connections to their homeland. In July 1857, for example, settlers sent home over 20,000 letters describing the startling events then unfolding across India, and these were frequently published in British newspapers. In this way, the mutiny reached many people in the United Kingdom, and this is one of the means by which the events scarred the national consciousness.

This resounded in British literature for decades, with related themes appearing in novels such as A Tale of Two Cities by Charles Dickens and E.M. Forster’s A Passage to India. The infiltration of ideas such as racial superiority and inherent ‘otherness’ into fictional works reveals the subtle ways in which the crisis affected the public psyche. The earliest histories produced narratives concerned with military battles and heroic acts. These largely cooperated with the colonial project rather than undermined it: some were even published before the rebellion was comprehensively crushed. Historicizing an event so swiftly is at least one indication of imperial arrogance. Nonetheless, reactions to the uprising largely defy generalisation.

What historians might term an ‘imperial’ narrative of events existed, but it did not go undisputed. In particular, not all contemporary commentators believed that this was merely a tale of religious prejudices inflamed by the introduction of greased cartridges. Many traced the grievances of the rebels to a multitude of misjudged policies instigated years previously by the EIC, a once powerful organisation now in its death throes. The press did not tenderly ease its passing. Numerous past procedures of the EIC came in for criticism, reflecting long-held anxieties within Britain about merchants acting in the role of a state. The EIC was thus the primary target in the hunt for an adequate cause. This hunt was initiated by concern for British enterprise in India but lingered in the public discourse because of the rebels’ violent actions against British residents. A fierce reaction was sparked in the home press. From early on in the crisis, The Times, Glasgow Herald and Berrow’s Journal were demanding the blood of anyone considered culpable in the murder of British women and children. This sullying of the purest symbols of Victorian domesticity was described by The Times as an effort to ‘dishonour England itself’.

Wendy Webster demonstrates how this concept of domesticity being under siege also resonated in accounts of post-1945 colonial conflicts. Just as ideas of ‘white’ domesticity were shaken by fears of ‘black’ disloyalty during the Mau Mau rebellion in Kenya, the sepoy mutiny tainted a perceived paternal relationship on both a micro level – between a British officer and the sepoys – and a macro level between Britain and India. So much so that the popular satirical magazine Punch mocked the notion of a filial relationship advanced by earlier thinkers like James Mill, with a cartoon depicting the ‘Clement’ Governor-General Canning patting the head of a child-sized sepoy clasping a bloodied sabre.

The lampooning of those considered too merciful by many in the press is set against a context of progressive debate about the use of capital punishment in Victorian society. Abolitionists had made great strides with several government acts in the 1830s, reducing the number of capital crimes by 66 per cent, and by 1841 execution was carried out only for treason and homicide. By 1857, however, there was a clear contradiction in contemporary discourse between the civilised, forward-looking Englishman and the blood-thirsty sensationalist, addicted to retribution as a means of revenge. The Times suggested that after an ‘immense amount of humanity’, with the reform of the penal code, there appeared to be a ‘new phase of the British character to find a general consent in favour of strong measures in the East.’ Understood by some as the resurgence of a repressed retributive spirit connected to the teachings of the Old Testament, the reaction to the Indian uprising threw into question the constructed identity of British civility.

Ultimately, the revolt failed to knock the jewel loose from the imperial crown. Yet it is a testament to the sheer depth of its impact on the people of the United Kingdom and India – political, social, cultural and intellectual – that it is still recognised as a landmark event in both countries’ histories.


Arran Jenkins is a History graduate of the University of Worcester. His final year Independent Study, examining the British press and the Indian Uprising, won the Stanford History Group Prize for Best Independent Study in History.

Witch-Hunts Past and Present

Darren Oldridge

President Donald Trump is perhaps the most famous and certainly the most outspoken recent victim of an alleged ‘witch-hunt.’ But he is by no means alone. In the last few years, other supposed victims of the phenomenon have included Hillary Clinton and several prominent members of the British Labour Party. Since at least the 1950s the language of ‘witch-hunting’ has played a role in public life, and its usefulness to politicians of all stripes means that it continues to thrive.

Like all modern victims of a ‘witch-hunt’, Trump uses the term to impugn the motives and methods of his opponents. Witch-hunts are invariably wicked and unfair. To be accused is to be innocent; to pursue a witch is to act maliciously, hysterically or unjustly, and often all three of these at once.

As a historian of witchcraft I am wary of these connotations. While they illustrate modern-day perceptions of the past, they are a poor guide to the world in which real people suspected of harmful magic and compacts with evil spirits were once put on trial, and sometimes executed. Indeed, our own language of witch-hunting is an impediment to understanding that world.

So how does today’s idea of a ‘witch-hunt’ compare to the historical record? The criminal prosecution of witches took place from the later 1400s until the early eighteenth century, and was responsible for around 50,000 deaths. The modern image of a witch-hunt corresponds most closely to the mass trials that occurred sporadically in this period, and especially in German-speaking lands. These could consume whole communities with dreadful speed, as happened in Trier in the 1580s and Würzburg in 1629.

Such events were mercifully rare. It was far more common for individuals or small groups of people to be accused, and in many cases they were probably not brought to trial. When they were, their treatment varied considerably from region to region, and was not always severe. In England, for example, only around a quarter of those formally accused of the crime were executed.

It was not the case, then, that to accuse a person of witchcraft was automatically to condemn them to death. In this respect the modern understanding of ‘witch-hunts’ misrepresents the past. Indeed, the majority of witchcraft cases cannot be viewed as hunts at all.

Nor were the people who dealt with witchcraft typically characterised by ulterior motives or disrespect for justice. Indeed, the occult nature of the offence meant that careful discrimination was required in its prosecution. Many experts on the subject, like the English physician and demonologist John Cotta and the Massachusetts pastor Increase Mather, argued for judicial caution in the face of a crime that was undeniably serious but difficult to prove.

Behind these observations is a deeper issue about belief. This is the fundamental difference between the fear of witchcraft in the past and the modern use of the term. Two core assumptions drove the witch trials in the sixteenth and seventeenth centuries: first and most commonly, witches were believed to harm others through the use of magic; and secondly, and of less importance to ordinary people, they were held to serve the Devil. In the most extreme version of this latter idea, witches were believed to fly to nocturnal assemblies where they committed atrocities and worshipped the Prince of Darkness.

Today very few people in the west believe in magic, and the concept is most familiar as either a metaphor or a kind of entertainment.  (And stage magic, of course, is expected to deceive.)  As a result, the most widely feared form of witchcraft in the past – the practice of destructive sorcery – is simply incredible.

What of the Devil? Here there are interesting international variations. A major survey in 1982 found that only 21% of the UK population believed that he existed; this had dropped to 10% by 2016. In contrast, a poll in 2005 found that 60% of Americans believed in the ancient enemy. This pattern reflects a more general difference in the prevalence of religious belief between western Europe and the USA. Interestingly, belief in God tends to be higher than belief in his adversary wherever the question is asked.

It may be hasty to declare the death of Satan. But it is true, nonetheless, that he is not an integral and uncontested part of the intellectual landscape of western communities, as he was in the age of witch trials. Nor is he connected to the practice of evil magic that is believed to cause real harm.

The modern use of the phrase ‘witch-hunt’ reflects this situation. As we no longer share the beliefs that once underpinned the crime of witchcraft, we find it hard to accept the crime at face value. We struggle to imagine the witch of the pre-modern world, whom people perceived as a figure of real menace. Conversely, we find it easy to assume that witch trials were impelled by ulterior motives. Typically, these include vindictiveness or greed, combined with a willing disregard for justice.

The current language of ‘witch-hunting’, then, indicates our own separation from the historical world of witch trials rather than the revival of pre-modern practices. (In some other, non-political contexts the parallels between past and present may be stronger.  Contemporary allegations of ‘ritual satanic abuse,’ for example, echo the early modern idea of the witches’ Sabbath. But that is another story.)

Ultimately, we have projected our own explanations onto the experience of those men and women who feared witchcraft in the past, recasting them as malicious or ‘hysterical’, and invariably unjust.  It is this version of the past – born of our profound separation from it – that underpins today’s talk of ‘witch-hunts’.


Darren Oldridge is Professor of Early Modern History at the University of Worcester. His most recent book is The Supernatural in Tudor and Stuart England (London & New York: Routledge, 2016), and he is currently writing a study of English demonology.

Management Practices in Developing Countries: The Case of Wasta Networks in the Arab Middle East

Sa’ad Ali

Most current management theories, developed since the start of the 20th century, have been created by researchers in the ‘developed’ countries of the USA and Canada in North America, Western Europe, and Japan in Asia. These theories were constructed from data and observations of management practices in those countries, and are viewed by their researchers and practitioners as the ‘right way to manage’. Companies from these countries that have expanded into developing economies have tried to transfer these practices with them, often neglecting the different cultural, institutional and socio-economic context. Recently, however, management researchers have started paying attention to these differences. This has resulted in a new stream of management research coming from developing countries that examines the business environment by taking into consideration the wider context of a specific country or countries.

In line with this stream of research, my work over the past seven years has focused on wasta in the Jordanian banking industry as a specific social and organisational phenomenon in Jordan and the countries of the Arab Middle East. The literal translation of wasta is mediation, or intermediary, and in daily usage wasta refers to the use of connections to get something done. Someone who ‘has wasta’ either has a degree of influence or has access to those who do, which can help to ‘get things done’. An individual in Arab countries might seek out someone with influence (also called ‘a wasta’ in spoken dialect Arabic, or ‘waseet’ in classical Arabic) in order to find a job, secure a place at a university, or navigate the bureaucratic red tape that is so common in these countries. Wasta thus refers both to the action and to those facilitating it. It has been suggested that wasta impacts nearly every facet of organisational life, and it is a common source of discussion, and complaint, among Jordanians. Yet how wasta is practiced in this context, and how it is experienced, remains largely unknown.

The etymology of wasta, as both an action and a person, is generally associated with the notion of occupying a middle place in a network. Looking further into the linguistic roots of the word, one can understand it simply as the ability ‘to get things done through the use of social connections’. As a cultural phenomenon, it is not remote from other practices and ideas of reciprocal social relations, such as guanxi in China, blat in Russia, ‘pulling strings’ in the United Kingdom, or the general idea of relational give and take in the business world. The degree to which each of these phenomena prevails reflects how networked the society is and how members of these societies prefer to socialise and do business.

In Jordan, as in other Arab countries, wasta impacts upon different business issues, ranging from applying for trade licenses and government services to securing governmental bids. In particular, it influences recruitment and selection: job seekers use wasta as a medium to secure employment, and organisations use it to secure qualified employees. From a Western-centric perspective, wasta tends to be perceived as favouritism or nepotism, which contrasts with the idea of a ‘modern’ (implicitly ‘Western’) workplace. Within Jordan, too, there is widespread debate on whether wasta should be abolished for the sake of Western-style ‘modernization’ and in order to ensure equal employment opportunities for all.

On the other hand, my research, like that of others who study wasta and similar network-based phenomena, highlights the benefits of using it. It can have positive outcomes at the micro level, as when mediation between parties helps a qualified individual secure a job. It also benefits the organisation, which can secure a qualified and loyal employee in a country where certain skills and qualifications are scarce, for example due to the brain drain of Jordanian employees to the Arab Gulf countries. However, it can have severe negative outcomes at the macro level, as it reduces organisational diversity and reinforces pockets of power among particular groups. Another negative impact of this use of wasta is that it weakens formal institutions by reducing trust in political and legal processes.

While my research highlights some positive aspects of wasta at the micro level, such as enabling qualified individuals to gain a chance of employment and securing trustworthy and qualified employees for the organisation, it is also important to consider the negative impact of the practice. There are, for instance, diversity and exclusion issues, and this also suggests that an emic approach to culture-specific concepts can never be free of power implications. As such, wasta itself is not purely ‘good’ or ‘bad’; it is a way of doing things that can have positive or negative outcomes depending on how it is practiced and on what level it is viewed (micro or macro). Organisations seeking to do business in Arab countries need to acknowledge the widespread practice of this phenomenon and try to accommodate it without breaking their own ethical principles.

Management practitioners are invited to consider how diversity and merit are not culture-free concepts; in fact, their recent prominence owes much to the post-modern Western need to come to terms with a new understanding that the global world does not map securely onto the largely Western-dominated management literature. This research invites managers to understand that what is considered good practice in different cultures should be balanced by the different cultural perspectives on what is considered ‘good’ and ‘the right way to manage’.


Dr Sa’ad Ali is Lecturer in Human Resource Management, University of Worcester.

A World Increasingly United by Growing Divides – Time for New Partnership Responses?

Gareth Dart

Earlier this semester I attended a conference exploring issues of disability in southern Africa, held at the University of Botswana in the capital, Gaborone. Having spent nearly a decade working in the country, but not having revisited for the last eight years, I was intrigued to see what might have changed. One issue that was reinforced through my visit is that it is becoming more and more difficult to talk in general terms about differences in wealth and development (and let’s leave aside for now exactly what we mean by that…) between countries; rather, we need to be far more aware of nuances both between and within them.

The new conference centre (complete with soon-to-be-opened hotel) at the University of Botswana makes our conference facilities appear charmingly retro: a beautiful main auditorium, lots of well-furnished breakout rooms, excellent catering facilities and so on. Friends and former colleagues all seemed to be driving cars twice the size of, and much newer than, anything I could aspire to, sitting in air-conditioned comfort in traffic jams twice as long as I recall them being, even though there appear to be twice as many roads as I remember. Presumably all that extra fuel burning up while people get nowhere slowly is doing wonders for the country’s GDP.

Visits to two large villages that I know well provided a counter-story. In one, a friend who is a tailor continues to live a day-to-day existence in competition with the flood of cheap Chinese imports, sold from small shops run by Chinese migrants. In the other, a College of Education, my former workplace, which was already in decline by the time I left, has further decayed in terms of its fabric and role, a stark contrast to the gleaming new buildings popping up all over the University some 50km away. I popped my head into my old office, still the base for the Special Education team, and found an old handout of mine lying on one of the desks. “We have new material too!” the current occupant was keen to point out.

The other evening there was an item on the news about Hartlepool and the impact of the introduction of Universal Credit. What struck me was the paucity of cars in the streets. It looked like the photographs one sees from the 1950s, and is presumably (though any geographer reading this might want to put me right) an indication of poverty rather than of an urge to live a greener, more sustainable lifestyle. The richer elements of Botswana are looking very much like our richer elements, and the poorer parts of Britain are starting to look more like the poorer parts of Botswana. Though given the respective climates, I think I know where I might prefer to lead a life of poverty if I ever have to.

A question for us as academics and practitioners interested in working in partnership in such contexts is what this emerging, more finely nuanced reality might mean for us. I wonder if we need to think about more equal partnerships where we work on mutually compatible problems: let’s explore what poverty means and how we might ameliorate its impact in peri-urban Botswana and in Hartlepool; what does a rural Botswanan school’s attitude to including children from the whole village have to say to us in rural Hereford? We are now used to the need to demonstrate ‘impact’ and work ethically. Perhaps both those notions are due for a shift or broadening of focus.


Gareth Dart is Senior Lecturer in Education, University of Worcester.

Research Agendas: Dr Gyozo Molnar

Gyozo Molnar

Whose knowledge counts in Adapted Physical Activity? Adapted physical activity (APA) is a cross-disciplinary body of practical and theoretical knowledge directed towards impairments, activity limitations, and limited participation in physical activity. Despite its cross-disciplinary nature, APA still appears to lack diversity in its approaches to research and in engaging with and representing the voices of participants. Informed mainly by traditionalist paradigms, the field of APA has been critiqued for marginalising the experiences of its users and paying little attention to the power imbalances inherent in traditionalist research approaches. The question ‘Who is the expert and whose knowledge counts?’ is therefore critical in APA. The purpose of my ongoing research project is to respond to these questions through a systematic content review of the prevailing research methodologies and discourses in APA, and to propose alternative perspectives.

My research is ultimately inspired and guided by epistemic and ethical responsibility. In other words, it interrogates whom, with whom and for whom the field is intended to serve and support. Specifically, there is concern with the dominant epistemic authority in APA: legitimate knowledge formation, and the people with the authority to speak about it. Preliminary investigation indicates that this authority is strongly associated with the medical model of disability and with traditional, objective and researcher-centred approaches. Furthermore, an ongoing critique of the field is the lack of presence and voice of people experiencing disability in APA research and practice.

Consequently, my research will address the limited scope of APA research in terms of the field’s engagement with specific research methods, and demonstrate the need for more diverse perspectives in order to address some of the overarching, complex and key issues within the field (e.g., inclusion). While APA is considered a multidisciplinary and cross-disciplinary field guided by diverse epistemological and methodological approaches, preliminary inquiries suggest there is in fact limited breadth, with emphasis on specific types of inquiry emanating from prevailing disciplines and on the pervasive medical model of disability. Not only is this narrowing from a knowledge-generation standpoint; it also stifles the opportunity to imagine and engage in work that has the potential to transform both research and practice. Complex problems are unlikely to be fully recognised and addressed within any one discipline or approach. Furthermore, the medical model has been strongly critiqued because it does not account for the role of society in disabling people, or for alternative understandings of disability.

As recently as 2015, a documentary analysis examined research trends published in the Adapted Physical Activity Quarterly (APAQ) over a 10-year period, with the identification of research methods as one category of analysis. It reported its findings primarily along qualitative and quantitative divides, with accompanying descriptions of research design. My research differs significantly in its level of analysis with regard to research methods: it seeks to go beyond this type of review by attending critically to the level of epistemology. By revealing the assumptions underlying researchers’ methodological choices, the possibilities for research integration and active communication across disciplines become more plausible.

Beyond reporting current trends and possible future directions, my research will also explore alternative approaches and the possibility of methodological integration across disciplines. This offers a unique entry point for researchers to consider how they might collaboratively move across disciplines to address complex, key issues in the field. This is also timely, given recent calls for more interdisciplinary work. That is, the focus is not simply to showcase alternative research approaches, but to suggest their active application and the development of cross-disciplinary dialogues. My research will highlight and reinforce the epistemic responsibility of APA researchers, as builders and gatekeepers of the current APA scientific establishment, to engage actively with alternative approaches and participants by keeping the following questions under continuous examination: who is the expert, and whose knowledge counts?


Dr Gyozo Molnar is Principal Lecturer in Sports Studies, University of Worcester, and co-editor of Women, Sport and Exercise in the Asia-Pacific Region: Oppression, Resistance, Accommodation (Oxford: Routledge, 2018).

War Requiem

Howard Cox

The ending of the “Great War for Civilization” on 11 November 1918 has left an indelible mark on the national memory of the United Kingdom. Now that 100 years have passed since this momentous event took place, our recollections are no longer of the event itself but of its legacy in terms of the memorials, symbols, texts and artefacts that were created in its wake. I write these words whilst listening to Ralph Vaughan Williams’ poignant Third Symphony, an evocation of the battlefields of northern France where the composer’s idea for the music first took shape. The symphony is one of many examples of the way in which it is possible to stimulate a recollection in the present day of an event for which no lived experience endures.

As an aspect of our social memory of the First World War, Vaughan Williams’ Third Symphony is now merely an historical testament. As the historian Jeffrey Olick has noted, ‘History is the remembered past to which we no longer have an “organic” relationship’. At the time of the symphony’s publication and first performance in 1921, however, the music would have fed into and formed part of a collective memory of the war, one that resonated widely with the direct, lived experience of those who had been through the conflict. For Olick, therefore, ‘collective memory is the active past that forms our identities’. Thus the same musical artefact which today provides us with an historical social memory of the war at the time of its writing fed into a collective consciousness of the event that spurred its creation.

The collective memory of an event such as a world war can be generated at different levels of a society. As a piece of music listened to quite widely, Vaughan Williams’ symphony would have helped to form one element of a national collective memory of the war during the 1920s. Similar forms of collective memory can exist at a more disaggregated level of the social structure. Organizations such as businesses can generate symbols, texts and artefacts that stimulate sufficiently common forms of recollection such as to provide a collective corporate memory. In a recent paper, published as part of a Special Issue of the Journal of Management & Organizational History to commemorate the centenary of the end of the war, I have identified one case in point.

In 1915, shortly after the outbreak of the First World War, the British-American Tobacco Company (BAT) moved its London headquarters into a newly-built, prestigious building opposite the Houses of Parliament which it named Westminster House. Many of its employees had already volunteered to serve in the armed forces and space was set aside in the new building to exhibit various forms of memorabilia which the returning staff had garnered from the battlefields (including some live ammunition!). Staff in the company also began to publish a weekly newsletter, called the BAT Bulletin, in which correspondence from those serving abroad (suitably censored) was published and disseminated to work colleagues – including those at the front. In this way the company helped to generate a collective memory of the war that was specific to the firm’s own employees.

Interestingly, this process of creating a collective corporate memory of the First World War was not replicated during the conflict of 1939-1945. Although the impact on the company of the Second World War was every bit as profound as its earlier counterpart, the circumstances were such that no contemporaneous artefacts and texts were generated. Moreover, whereas the lived experience of many BAT staff during the conflict of 1914-18 took a similar form – due particularly to the static nature of much of the fighting – the Second World War saw the company’s staff experiencing the conflict from a far more diverse set of circumstances. Indeed, for many of the company’s employees, the war with Japan led to staff in China and the Far East being incarcerated for many years in prisoner of war camps, without ever being recruited to the armed forces.

In many ways this disparity in the corporate memory of BAT mirrors a more general pattern in the national collective memory of the First and Second World Wars. The most evocative symbol of our social remembrance – the poppy – was spawned directly by the battlefields over which Vaughan Williams fixed his gaze sometime around 1916. No corresponding symbol of remembrance serves the same purpose in relation to World War II. My own musical imagery of this later conflict finds its counterpart to Vaughan Williams’ Third in Dmitri Shostakovich’s Eighth Symphony which, the composer later wrote, ‘was an attempt to express the emotional experience of the people, to reflect the terrible tragedy of the war’.


Howard Cox is Emeritus Professor of International Business History, University of Worcester, and the co-author of Revolutions from Grub Street: A History of Magazine Publishing in Britain (Oxford: Oxford University Press, 2014).