History Today 09-2016



September 2016 Vol 66 Issue 9

CULTURAL REVOLUTION The madness and malice of Mao Zedong


50 years of independence

Cecil Beaton

Style as statement

Publisher Andy Patterson Editor Paul Lay Senior Editor Kate Wiles Assistant Editor Rhys Griffiths Digital Manager Dean Nicholas Picture Research Mel Haselden Reviews Editor Philippa Joseph Art Director Gary Cook Subscriptions Manager Cheryl Deflorimonte Subscriptions Assistant Ava Bushell Accounts Sharon Harris

The worst? The Triumph of Death by Pieter Bruegel the Elder, c.1562.

Board of Directors Simon Biltcliffe (Chairman), Tim Preston CONTACTS History Today is published monthly by History Today Ltd, 2nd Floor, 9/10 Staple Inn London WC1V 7QH. Tel: 020 3219 7810 [email protected] SUBSCRIPTIONS Tel: 020 3219 7813/4 [email protected] ADVERTISING Lisa Martin, Portman Media Tel: 020 3859 7093 [email protected] Print managed by Webmart Ltd. 01869 321321. Printed at W. Gibbons & Sons Ltd, Willenhall, UK. Distributed by MarketForce 020 3787 9001 (UK & RoW) and Disticor 905 619 6565 (North America). History Today (ISSN No: 0018-2753, USPS No: 246-580) is published monthly by History Today Ltd, GBR and distributed in the USA by Asendia USA, 17B S Middlesex Ave, Monroe NJ 08831. Periodicals postage paid New Brunswick, NJ and additional mailing offices. Postmaster: send address changes to History Today, 701C Ashland Avenue, Folcroft PA 19032. Subscription records are maintained at History Today Ltd, 2nd Floor, 9/10 Staple Inn, London WC1V 7QH, UK.

EDITORIAL ADVISORY BOARD Dr Simon Adams University of Strathclyde Dr John Adamson Peterhouse, Cambridge Professor Richard Bessel University of York Professor Jeremy Black University of Exeter Professor Paul Dukes University of Aberdeen Professor Martin Evans University of Sussex Juliet Gardiner Historian and author Tom Holland Historian and author Gordon Marsden MP for Blackpool South Dr Roger Mettam Queen Mary, University of London Professor Geoffrey Parker Ohio State University Professor Paul Preston London School of Economics Professor M.C. Ricklefs The Australian National University Professor Ulinka Rublack St John’s College, Cambridge Professor Nigel Saul Royal Holloway, University of London Dr David Starkey Fitzwilliam College, Cambridge Professor T.P. Wiseman University of Exeter Professor Chris Wrigley University of Nottingham All written material, unless otherwise stated, is the copyright of History Today

Total Average Net Circulation 18,161 Jan-Dec 2015


FROM THE EDITOR

SOCIAL MEDIA IS ABUZZ with the question: is 2016 the worst year in history? The year so far has been a challenging one: a huge refugee crisis, born of the suffering in Syria and Iraq; increasing terrorism in Europe, a continent bewildered by Brexit; a failed coup and its sinister aftermath in the strategically critical state of Turkey; the Zika outbreak; growing racial tension in the United States; famine in northern Nigeria. And that is before mentioning some people’s primary concern, a spate of celebrity deaths. The prospect of Donald Trump and Vladimir Putin as leaders of the US and Russia suggests things could get worse before they get better. Those hopeful days when Ronald Reagan and Mikhail Gorbachev reached across the Iron Curtain seem a long way away.

The study of history, however, offers some perspective. When the Black Death reached Europe in 1347 – a strong contender for the title ‘worst year in history’ – it went on to kill around a third of the continent’s population in little more than three years, having already ravaged much of Asia. Closer to our time, during 1942 and 1943, the Holocaust reached its height as the all-conquering Wehrmacht pushed ever further east into the Soviet Union before being halted at Stalingrad. Anyone who has read Timothy Snyder’s Bloodlands or seen Claude Lanzmann’s Shoah will know that, despite the efforts of ISIS, we are a long way from such industrial-scale slaughter. One’s perspective also depends, of course, on who and where one is: few Native Americans celebrate the year 1492, while 1845, the first year of the Famine, remains fixed in the collective memory of Ireland and its diaspora. A Syrian might claim 2016 as a nadir, with ample justification.

If I were forced to name the worst year, it would probably be 1914. In July of that year, a European order that had brought peace, prosperity and extraordinary artistic and scientific progress began to unravel.
The vast conflict that followed led directly to the Russian Revolution, Nazi Germany, the Holocaust, the atomic bomb, the Cold War and the mess that is the modern Middle East. Only in 1989, with the fall of the Berlin Wall, did we enter a relatively stable period – the ‘End of History’ – before it came crashing down on September 11th, 2001. As the world seeks solutions to a seemingly endless series of crises, it is small comfort to recall the words of Edgar, who, in Shakespeare’s King Lear, observed: ‘The worst is not/ So long as we can say “This is the worst”’.

Paul Lay

HistoryMatters Westralia • Sovereignty • Godly Churchill • Bangladesh

When the Western Australian electorate went to the polls, they voted 68 per cent in favour of secession

A Separate Australia

Western Australia’s desire to secede as ‘Westralia’ in 1933 was undermined by a change in Britain’s attitude towards its Empire.

Jack Peacock

THIS YEAR’S REFERENDUM on the UK’s membership of the European Union and Scotland’s 2014 independence vote were just the latest in a long line of similar events. While Scotland joined Quebec (1995) in voting for the status quo, others such as Norway (1905) and Montenegro (2006) voted in favour of separation. One theme that seems common to all referendums is that ultimately the voters get what they vote for. A majority for separation means separation. Yet there are exceptions to this rule. On April 8th, 1933, Western Australia voted in favour of seceding from the Australian Commonwealth, yet it remains part of it to this day. What allowed the democratically expressed will of the people to be ignored? And what did it mean for Australia and its relationship with the British Empire?

‘Westralia Shall Be Free’: Dominion League of Western Australia Secession Map, 1930s.

Western Australia’s independent spirit appeared the moment it gained the right of self-government. This was in 1890, a year after talk of federalisation began. Not wishing to give up its newly acquired sovereignty, Western Australia did not attend the 1891 constitutional convention (although New Zealand did) and only sporadically and half-heartedly attended later conventions. The secessionist movement would always claim that Western Australia was cajoled into federating and in some sense this is true. It was a gold rush that tipped the balance. Settlers flocked in from the east, bringing with them pro-federal opinions. When they heard that

the Western Australian government was against federation, they started their own separatist movements. Thus Western Australia had a choice: refuse to federate and potentially see its gold-rich lands break away, or federate and maintain its territorial integrity. It opted for federation. But it did not take long before Western Australians began to regret their decision. Before the end of 1902, the Australian parliament heard the first calls for secession. By 1919, the Sunday Times (one of Western Australia’s leading newspapers) had taken an openly secessionist stance and public demonstrations were held. The movement inspired rousing political rhetoric, poems and songs. It even received support from the governments of Tasmania and South Australia, which threatened their own referendums. And when the Western Australian electorate went to the polls, they voted 68 per cent in favour of secession. And yet secession never came. In just a few years the secessionists’ faith in the British Empire was shattered and their movement had crumbled. At the same time as the referendum, Western Australia held state elections. Despite overwhelmingly backing secession, the electorate simultaneously voted to oust the pro-independence Liberal government and elect the pro-union Labour Party, which promptly tried to stall the secession process. But the new government could not stop things completely and, after a year of dithering, finally pressed ahead with a plan to

attain independence. The method they chose to achieve this was a near 500-page petition filled with maps, arguments and the democratically expressed will of the people. The idea was to deliver this to the British Parliament which, they supposed, would pass a bill granting them independence. A delegation led by Keith Watson, chairman of the secessionist Dominion League, left Perth for London with much fanfare and everyone expected things would proceed smoothly.

It was Western Australia’s loyalty to Britain and its Empire that derailed its move towards independence

The petition was presented to both Houses of Parliament in December 1934 and a joint committee was formed to examine it. But the committee’s task was not to judge the merits of the case for secession; its task was to determine whether or not the British Parliament had any right to receive the petition. This is where the secessionists misjudged Britain’s attitude to its Empire. The 1926 Imperial Conference had resulted in the Balfour Declaration (which led to the 1931 Statute of Westminster). The declaration carried one important passage; it declared Britain and its Dominions:

Autonomous communities within the British Empire, equal in status, in no way subordinate one to another in any aspect of their domestic or external affairs, though united by a common allegiance to the Crown, and freely associated as members of the British Commonwealth of Nations.

Britain had effectively given up any control over the Dominions. They were on their own and Britain would no longer interfere. The joint committee therefore rejected the Western Australian petition on the grounds that it had no authority to receive it. Western Australia would have to negotiate with the parliament in Canberra, which was not inclined to listen. ‘History will record this as the greatest and most despicable abdication of all time’, was Keith Watson’s response to the joint committee’s report. Even

the anti-secessionist state premier Philip Collier claimed it was not the end of the matter and predicted that if major constitutional change did not come, the Australian Commonwealth would not last ten years. The Dominion League did not immediately accept the joint committee’s report. It continued to lobby and pushed for a debate in Parliament. Questions were even put to Prime Minister Ramsay MacDonald, who was noncommittal in response. The British authorities stalled and nothing happened. A dispirited Watson and his delegation returned to Australia and vowed to continue the struggle, but the mood in Western Australia had shifted. An economic recovery had begun and popular opinion blamed the incompetence of Watson’s delegation for the failure to deliver independence. Thus, just as life in Western Australia began to look brighter, the reputation of the secessionists was dented. In 1935 the Dominion League introduced a bill into the Western Australian parliament calling for unilateral separation, but interest was waning. The same year the Sunday Times saw a change of ownership, editor and opinion. Without this mouthpiece the secession movement dwindled to nothing. It was Western Australia’s loyalty to Britain and the Empire that derailed its move towards independence. Had the Dominion League taken a stronger stance, perhaps issuing a unilateral declaration of independence in 1933, the outcome might well have been different.

Jack Peacock is a researcher with a particular interest in environmental history.

Alternative Histories by Rob Murray

Power and the People

The relationship between sovereignty and the law is relatively straightforward. When it comes to politics, however, things are much more complicated.

Richard Bourke

THE IDEA OF taking back ‘control’ has come to dominate political debate in Britain. Much of the discussion has centred on the relationship between the United Kingdom and the European Union (EU). Indeed, the aim of achieving control substantially shaped the referendum on Britain’s membership of the EU. For the victorious Leave campaign, the promise of this kind of power resides in the restoration of sovereignty. Yet the analysis is based on a misunderstanding. While the future of Britain outside the EU is obviously hard to determine, one thing is certain: the possession of sovereignty does not guarantee the exercise of control.

The modern debate about sovereignty began with the French thinker Jean Bodin (1530-96). Having joined the Carmelite brotherhood as a monk in his early manhood, Bodin was released from his vows in 1549 and then opted to study law at the University of Toulouse. Much of his education involved attention to Roman law and included the humanistic study of classical texts in political and legal philosophy. It was out of these materials that Bodin developed his conception of supreme power. In his most famous work, the Six Books of the Commonwealth, which originally appeared in French in 1576, Bodin presented a definition of sovereignty. He claimed that it was ‘the absolute and perpetual power of a commonwealth, which the Latins call maiestas [majesty]’. Later in his text, Bodin made clear that the Romans had yet other terms for sovereignty, summum imperium (ultimate authority) being conspicuous among them.

Under control: Jean Bodin, unknown artist, 1580.

Bodin claimed that sovereignty was ‘the absolute and perpetual power of a commonwealth, which the Latins call maiestas [majesty]’

Yet, while the Romans, like the Greeks and the Hebrews, had a conception of supreme authority, Bodin believed that they had not fully understood its implications. Above all, he insisted, they had failed to grasp that the highest power of command was indivisible. It could not be shared among competing powers in the commonwealth. This meant in effect that, while a state might possess a mixed system of government, it could not be based on a system of ‘shared’ sovereignty. This insight has proved confusing to posterity, above all to admirers of the American constitution: since the United States can be seen as a mixed regime, surely its sovereignty is divided among the different organs of state? This thought was later used to characterise the European Union, too, which is similarly taken to exemplify the ‘pooling’ of sovereignty. The idea that sovereignty could be shared was not only denied by Bodin; it was also refuted by subsequent theorists of the state. The English political philosopher Thomas Hobbes (1588-1679) presented one of the most powerful refutations of the idea that sovereignty could be held by more than one power. Hobbes thought of authority in terms of a right of ultimate legal determination. A final decision had to be precisely that:

a judgement that could not be contradicted by a rival authority. Contradiction entailed conflict, which imperilled the stability of the state, opening up the prospect of a collision between powers. The idea of sovereignty had been invented to forestall this eventuality. After completing his legal studies at Toulouse in the 1550s, Bodin worked as an advocate in the Parlement of Paris. For the bulk of his tenure as a public official, Bodin operated against the background of the Wars of Religion, which afflicted France between 1562 and 1598. Aristocratic houses competed for power in the name of religion, undermining, as Bodin saw it, the majesty of the monarchy. Sovereignty was a recipe for overcoming this descent into factionalism, by subjecting divergent powers in the commonwealth to a single jurisdiction. Confronting the emergence of competition between the English Parliament and Crown in the late 1630s, Hobbes similarly resorted to sovereignty as a bulwark against faction. All sovereignty, he believed, had to be based on the people’s will, yet it did not have to reside directly in their hands. Supreme authority, in fact, might legitimately be the property of a monarchy, an aristocracy

or a democracy. It was this conclusion that was challenged most cogently by the Swiss philosopher Jean-Jacques Rousseau (1712-78), who limited legitimate sovereignty to the democratic form of state. He thought that the social compact underlying any valid political association gave rise to a collective body composed of the totality of the citizen body. This amounted to arguing that the general will alone – and not the will of some part of the community – should determine the common good of all. Rousseau’s model of direct popular sovereignty, according to which the people themselves should act as the source of the fundamental laws of the community, has had a complex impact on political debate since the publication of his work The Social Contract in 1762. On the one hand, the idea of direct popular ratification has increased its appeal since the middle of the 18th century. On the other hand, the fate of popular participation has been a mixed affair, sometimes resulting in the abuse of power. This suggests that popular sovereignty should be sharply distinguished from popular control. An ultimate right of ratification or final plebiscitary authority is very remote from substantial political power. In many ways this outcome tells us something about the nature of sovereignty itself. The idea of a supreme juridical will is a very effective tool for understanding a legal bureaucracy, but it is altogether more questionable as a means of unravelling daily politics. In a court system, based on a hierarchy of judgements, the highest jurisdiction has the final say. However, in political life ultimate authority depends on popular compliance. Sovereignty, in this case, does not mean control; it cannot bridle the forces of opinion or determine the course of events. Bodin introduced an essential concept into our political vocabulary, one which clarifies much about the legal basis of public life, yet it is hardly adequate as a theory of political power in general.

Richard Bourke is Professor in the History of Political Thought at Queen Mary University of London and is the co-editor, with Quentin Skinner, of Popular Sovereignty in Historical Perspective (Cambridge, 2016).


Churchill, God and the Bomb

Did the idea of nuclear war make Britain’s wartime leader more God-fearing?

Kevin Ruane

ADDRESSING PARLIAMENT on August 16th, 1945, Winston Churchill insisted that the decision to attack Hiroshima on August 6th, 1945 and Nagasaki on August 9th had been a joint one between the US and the UK. Over the next decade his public position was consistent and devoid of moral qualms: in war, he maintained, weapons get used. The A-Bomb was a weapon, the Allies were at war with Japan and, consequently, the A-Bomb was a legitimate military option. ‘The historic fact remains’, he wrote in 1953, ‘that the decision whether or not to use the atomic bomb … was never even an issue.’ In private, Churchill was more conflicted. For the clue to his true feelings, we need to turn, somewhat unexpectedly, to his relationship with God. Although brought up in the Anglican tradition, by his early twenties Churchill was expressing views which, if not atheistic, were in conflict with the doctrinal tenets of Christianity. As a subaltern in India in 1897-98, he wrote to his mother of his hopes for a future in which ‘science and reason’ triumphed over religious superstition. If he fell in battle, he advised her to seek out ‘the consolations of philosophy’ and added, apparently conclusively, that ‘I do not accept the Christian or any other form of religious belief’. If, in his subsequent public career, Churchill struck people as conventionally devout, this was due to his prominence at St Paul’s or Westminster Abbey on state occasions and to a prodigious memory, which allowed for perfect (and regular) quoting from Anglican hymns and the recitation of lengthy passages of the King James Bible. During the Second World War, his speeches often referenced the Almighty, notably in beseeching God’s deliverance from Axis evil, but his piety was strategic, an oratorical device rather than an expression of deeply held religiosity.

The same could be said for his postwar speeches. In August 1945, he

Guilty before God?: Churchill and President Truman, Washington DC, January 10th, 1952.

avowed that it was only by ‘God’s mercy’ that the Allies had beaten Hitler in the race for the A-Bomb. Ten years on, as he bowed out of frontline politics, he spoke in the Commons of his horror of the hydrogen bomb – a weapon 1,000 times more powerful than those used against Japan – and wondered what would happen ‘if God wearied of mankind’. In between these chronological poles there are many other examples of God and the bomb in oratorical juxtaposition. However, though Churchill claimed not to believe in an afterlife (‘I expect annihilation at death’ or else ‘black velvet’), he appears to have spent a good deal of time pondering how his responsibility for the atomic bombing would be weighed on the scales if he was wrong. In May 1946, he confided to a close associate that he expected to have to ‘account to God as he had to his own conscience for the decision made which involved killing women and children and in such numbers’. A little later, he admitted that ‘the decision to release the Atom Bomb was perhaps the only thing which history would have serious questions to ask about ... I may even be asked by my Maker why I used it but I shall defend myself vigorously and shall say – Why did

you release this knowledge to us when mankind was raging in furious battles?’

In January 1953 Churchill attended a dinner in honour of Truman, who was about to leave office. At one point in the evening he suddenly turned round and said: ‘Mr. President, I hope you have your answer ready for that hour when you and I stand before Saint Peter and he says, “I understand you two are responsible for putting off those atomic bombs. What have you got to say for yourselves?”’ A role-play game ensued in which the other guests – among them the Chairman of the US Joint Chiefs of Staff and the US Secretary of State – formed a jury composed of historic figures (Alexander the Great, Julius Caesar, Socrates). ‘The case was tried’, Truman’s daughter Margaret recalled, ‘and the Prime Minister acquitted’ of any atomic wrongdoing. For all his jocularity, Churchill’s behaviour hints at a mind not entirely at ease with what occurred in 1945. It was apparent again in May 1954, albeit in a different context. At a moment of widespread international alarm over the release of death clouds and toxic rain from US atomic testing in the Pacific, Churchill, prime minister once more, invited Billy Graham to Downing Street. Graham was fresh from his two-month ‘Greater London Crusade’, attendances at which had topped 1.7 million, thanks in part to public panic over what the preacher called ‘this Hell bomb’. Graham was allocated ten minutes with the prime minister but got 45. Appalled at the prospect of nuclear war, Churchill confessed that he was ‘an old man ... without any hope for the world ... unless it is the hope you are talking about, young man ... We must have a return to God’. Graham felt like he had met ‘Mr History’, while Churchill wrote that Graham had made ‘a very good impression’. It is sometimes suggested that the two prayed together, a fanciful notion, perhaps, in view of Churchill’s attitudes to the divine. Then again, if anything could bring Churchill to his knees, it was the bomb.

Kevin Ruane is Professor of Modern History at Canterbury Christ Church University and the author of Churchill and the Bomb in War and Cold War (Bloomsbury, 2016).


Bangladesh and its Search for Identity

Born of civil war in 1971, the former East Pakistan has wrangled with issues of religion, secularism and democracy ever since.

Salil Tripathi

WHEN HASINA WAZED led the Awami League to power in 2009, she promised Bangladesh’s electorate that she would set up a tribunal to prosecute war crimes committed during the Liberation War of 1971. Bangladesh was East Pakistan at that time and its Bengali-speaking people had been seeking autonomy within Pakistan’s federal structure. Successive Pakistani governments had refused to accede to those demands. East and West Pakistan were divided by hundreds of miles of Indian territory. When British rule ended in the Indian subcontinent in 1947, two nations emerged. India had a Hindu majority but chose to be a secular republic, while Pakistan was to be home to the subcontinent’s Muslims. Pakistan was constituted of two parts, but the west dominated politics, the military and business. When the Pakistani leadership refused to grant the Bengali language equal status with Urdu, discontent grew. Hundreds of thousands reportedly died in a cyclone in the east in 1970 and a poor response from the west convinced many that autonomy was required. In the elections of 1970, the east voted overwhelmingly for the Awami

Never forget: Children re-enact the death of intellectuals during the Liberation War, Dhaka, December 14th, 2013.

League, established as a Bengali alternative to the Karachi-based Muslim League in 1949. This won the party a majority in the national assembly, but instead of inviting its leader, Mujibur Rahman, to form a government, the incumbent Pakistani leader, General Yahya Khan, attempted to negotiate. Unknown to Bengalis, he had also sent thousands of troops to East Pakistan. When negotiations stalled, the government declared martial law on March 25th, 1971. Official records put civilian casualties at 26,000; Bangladeshis insist the number is nearer three million. Nearly ten million refugees left for India, where the government set up camps and provided assistance to Mukti Bahini (Freedom Fighters). In December 1971, after Pakistan attacked Indian airfields, India joined the war. Its troops overran what would become Bangladesh and Pakistan surrendered on December 16th. Rahman, arrested in March that year, returned to Dhaka in triumph. The Awami League won elections, but drought and corruption led to disenchantment. In 1975, junior officers staged a coup in Dhaka, killing Rahman and most of his family. His daughter, Hasina, now prime minister, was abroad

and remained in exile until 1981. Successive governments began to tinker with the constitution. Rahman’s assassins were pardoned; some got diplomatic postings and one ran for president. The ban on Jamaat-e-Islami, the pro-Islamic party that had opposed Bangladesh’s independence, was lifted and the country declared itself an Islamic Republic. While the Awami League was in power for one term in the mid-1990s, it did not have the majority to revive the tribunals, which would not resume until 2009. The tribunals have stirred up old controversies and questions: is Bangladesh a Muslim or a Bengali nation? Is it secular or religious? Most of those charged at the tribunal are from Jamaat-e-Islami: in 2013, when one of its leaders was given life imprisonment, thousands congregated in Dhaka demanding the death penalty. (The government appealed the verdict; the court sentenced him to death.) In response, thousands of followers of Hefazat-e-Islam, a fundamentalist group, protested and violence followed. Since then, extremists have targeted bloggers and publishers who criticise religion. The opposition, the Bangladesh Nationalist Party, boycotted the 2014 elections, meaning that, though the Awami League won, over half the seats were uncontested. On July 1st, 2016, a terrorist attack at the Holey Café in Dhaka killed more than 20 people, including 18 foreigners. Though the attack bears the hallmarks of ISIS, it has its roots in the unresolved debate over Bangladesh’s identity and domestic politics. The question as to whether the country is Muslim or Bengali is complicated by the fact that the country is also home to Hindus, Buddhists, Christians and atheists, as well as those who speak Sylheti, Khasi, Garo and other languages. At the Martyrs’ Memorial at Rayer Bazar in Dhaka an engraved poem asks: Tomader ja bolarchhilo, bolchhe ki ta Bangladesh?, ‘Does Bangladesh speak what you wanted it to say?’.
Unless Bangladesh resolves this question, it will continue to be haunted by violence.

Salil Tripathi is the author of The Colonel Who Would Not Repent: The Bangladesh War and its Unquiet Legacy (Yale, 2016).


The Cultural Revolution: A People’s History

Mao Zedong’s brutal campaign to purify Communist China, which began in the early 1960s, resulted in a decade of chaos that has left an indelible stain on the nation’s politics, says FRANK DIKÖTTER.

IN AUGUST 1963, Chairman Mao received a group of African guerrilla fighters. One of the visitors, a square-shouldered man from Southern Rhodesia, asked a question. He believed that the red star shining over the Kremlin had slipped away. The Soviets, who used to help the revolutionaries, now sold weapons to their enemies. ‘What I worry about is this’, he said. ‘Will the red star over Tiananmen Square go out in China, too? Will you abandon us and sell arms to our oppressors as well?’ Mao became pensive, puffing on his cigarette. ‘I understand your question’, he observed. ‘It is that the USSR has turned revisionist and has betrayed the revolution. Can I guarantee you that China won’t betray the revolution? Right now I can’t give you that guarantee. We are searching very hard to find the way to keep China from becoming corrupt, bureaucratic and revisionist.’

Three years later, on June 1st, 1966, an incendiary editorial in the People’s Daily exhorted readers to ‘Sweep Away all Monsters and Demons’. It was the opening shot of the Cultural Revolution, urging the people to denounce those representatives of the bourgeoisie who were out to lead the country down the road to capitalism. As if this were not enough, it soon came to light that four of the top leaders in the party had been placed under arrest, accused of plotting against Mao. The mayor of Beijing was among them. He had tried, under the nose of the Chairman, to turn the capital into a citadel of revisionism. Counter-revolutionaries



had sneaked into the party, the government and the army. Now was the beginning of a new revolution in China, as the people were encouraged to stand up and flush out all those trying to transform the dictatorship of the proletariat into a dictatorship of the bourgeoisie. Who, precisely, these counter-revolutionaries were and how they had managed to worm their way into the party was unclear, but the leading representative of modern revisionism was the Soviet leader and party secretary, Nikita Khrushchev. In a secret speech in 1956, which shook the socialist camp to the core, Khrushchev had demolished the reputation of his erstwhile master Joseph Stalin, detailing the horrors of his rule and attacking the cult of personality. Two years later, Khrushchev proposed ‘peaceful coexistence’ with the West, a concept that true believers around the world, including the young guerrilla fighter from Southern Rhodesia, viewed as a betrayal of the principles of revolutionary communism.


MAO, WHO HAD MODELLED HIMSELF on Stalin, felt personally threatened by de-Stalinisation. He must have wondered how one man, Nikita Khrushchev, could have singlehandedly engineered such a complete reversal of policy in the mighty Soviet Union, the first socialist country in the world. He arrived at the answer that too little had been done about culture. The capitalists were gone, their property confiscated, but capitalist culture still held sway, making it possible for a few people at the top to erode and finally subvert the entire system. In short, a new revolution was required to stamp out once and for all the remnants of old culture, from private thoughts to private markets. Just as the transition from capitalism to socialism required a revolution, the transition from socialism to communism demanded a revolution, too: Mao called it the Cultural Revolution. It was a bold project, one that aimed to eradicate all traces of the past. But behind all the theoretical justifications lay an ageing dictator’s determination to shore up his own standing in world history. Mao was sure of his own greatness, of which he spoke constantly, and saw himself as the leading light of communism. It was not all hubris. Mao had led a quarter of humanity to liberation and had then succeeded in fighting the imperialist camp to a standstill during the Korean War.

Previous page: ‘Greet the 1970s with the new victories of revolution and production’, poster, 1970; Red Guards rally, Beijing, 1966. Top right: Mao Zedong, c.1958. Right: Cultural Revolution propaganda poster, late 1960s. Far right: Mao’s Little Red Book on sale in Shanghai.

Mao combined grandiose ideas of historical destiny with an extraordinary capacity for malice. Insensitive to human loss, he nonchalantly handed down killing quotas


MAO’S first attempt to steal the Soviet Union’s thunder was the Great Leap Forward in 1958, when people in the countryside were herded into giant collectives called people’s communes. By turning every man and woman in the countryside into a foot soldier in one giant army, to be deployed day and night to transform the economy, he thought that he could catapult his country past its competitors. Mao was convinced that he had found the golden bridge to communism, making him the messiah leading humanity to a world of plenty for all. But the Great Leap Forward was a disastrous experiment, as tens of millions of people were worked, beaten and starved to death. The Cultural Revolution was Mao’s second attempt to become the historical pivot around which the socialist universe revolved. Lenin had carried out the Great October Socialist Revolution, setting a precedent for the proletariat of the whole world. But modern revisionists such as Khrushchev had usurped the leadership of the party, leading the Soviet Union back onto the road to capitalist restoration. The Great Proletarian Cultural Revolution was the second stage in the history of the international communist movement, safeguarding the dictatorship of the proletariat against revisionism. The foundation piles of the communist future were being driven in China, as Mao guided the oppressed and downtrodden people of the world towards freedom. Mao was the one who inherited, defended and developed Marxism-Leninism into a new stage, that of Marxism-Leninism-Mao Zedong Thought.

Like many dictators, Mao combined grandiose ideas about his own historical destiny with an extraordinary capacity for malice. Insensitive to human loss, he nonchalantly handed down killing quotas in the many campaigns that were designed to cow the population. As he became older, he increasingly turned on his colleagues and subordinates, some of them long-standing comrades-in-arms, subjecting them to public humiliation, imprisonment and torture. The Cultural Revolution, then, was also about an old man settling personal scores at the end of his life. These two aspects of the Cultural Revolution – the vision of a socialist world free of revisionism, and the sordid, vengeful plotting against real and imaginary enemies – were not mutually exclusive. Mao saw no distinction between himself and the revolution. Mao was the revolution. There were many challenges to his position. In 1956, some of Mao’s closest allies, including Liu Shaoqi and Deng Xiaoping, had used Khrushchev’s secret speech to delete all references to Mao Zedong Thought from the party constitution and criticise the cult of personality. Mao was seething, though he had little choice but to acquiesce. The biggest setback came in the wake of the Great Leap Forward, a catastrophe on an unprecedented scale directly caused by his own obstinate policies. At a conference held in 1962, as some 7,000 leading cadres from all over the country gathered to talk about the failure of the Great Leap Forward, Mao’s star was at its lowest. Rumours were circulating, accusing him of being deluded, innumerate and dangerous. Some of his colleagues may have wanted him to step down, holding him responsible for the mass starvation of ordinary people. His entire legacy was in jeopardy. Mao feared that he would meet the same fate as Stalin, who was denounced after his death. Who would become China’s Khrushchev? The Cultural Revolution, then, was also a long and sustained effort by Mao to prevent any party leader from turning against him.

The Early Years (1962-66)

FOUR YEARS BEFORE the formal start of the Cultural Revolution, Mao went on the attack. In the summer of 1962 he launched a Socialist Education Campaign to raise revolutionary vigilance and clamp down on economic activities that took place outside the planned economy. During the last year or so of the Great Leap Forward, control over the economy had relaxed and parts of the countryside had started to de-collectivise in an effort to ward off starvation, as some of the land was handed back to individual farmers. These practices now came under attack, as ‘Never Forget Class Struggle’ became the slogan of the day. Liu Shaoqi, who had supported a measure of economic leniency to help the country get out of the famine in 1961, threw his weight behind the Socialist Education Campaign. As second-in-command, he soon veered more to the left than Mao. According to Liu, a third of the power in the country was no longer in the party’s hands: the talk was all about ‘taking power back from class enemies’. Liu presided over one of the most vicious purges


Mao at a Communist rally in Tiananmen Square, Beijing, c.1965. Below: Red Guards and high school and university students parade through Beijing at the beginning of the Cultural Revolution, June 1966.

in party history, punishing over five million party members. Whole provinces were accused of taking the ‘capitalist road’. But repression alone would not suffice to counteract the pervasive effects of the counter-revolutionary ideology that had taken hold in the wake of the Great Leap Forward. Mao was particularly concerned with educating the young, seen as the heirs to the revolution. Students at all levels were educated in class hatred and made to study the works of Mao Zedong. Under Lin Biao, the army fostered a more martial atmosphere, in tune with the Socialist Education Campaign. In primary schools, children were taught how to use airguns by shooting at portraits of nationalist leader Chiang Kai-shek and US imperialists. Military ‘summer camps’ for students and workers were organised in the countryside. Before the Cultural Revolution began, young people were ready to take on imaginary class enemies.

The Red Years (1966-68)

ON JUNE 1ST, 1966 the People’s Daily published an editorial entitled ‘Sweep Away all Monsters and Demons’. That same day, celebrated as Children’s Day, a poster that had appeared a week earlier on the campus of Peking University was also widely publicised. It alleged that the university leaders were Khrushchev-type revisionists. Students had undergone years of indoctrination during the Socialist Education Campaign and were itching to lash out. They started scrutinising the backgrounds of their teachers, accusing some of being ‘bourgeois elements’ or even ‘counter-revolutionaries’. But some went too far, taking to task leading party members. They

were punished for their activities by work teams sent by Deng Xiaoping and Liu Shaoqi, who had been put in charge of the Cultural Revolution in the Chairman’s absence from Beijing. In mid-July, Mao returned to the capital. Instead of supporting his two colleagues, he accused them of suppressing the students and running a dictatorship. ‘To Rebel is Justified’ became his battle cry, and rebel the students did. Red Guards appeared in August, donning improvised military uniforms and carrying the Little Red Book. They vowed to defend the Chairman and carry out the Cultural Revolution. They declared war on the old world and went on the rampage, burning books, overturning tombstones in

cemeteries, tearing down temples, vandalising churches and attacking all signs of the past, including street names and shop signs. This was Red August. The Red Guards also carried out house raids. In Shanghai alone, a quarter of a million homes were visited and all remnants of the past seized, whether ordinary books, antique bronzes or rare scrolls. ‘Mao’s Little Generals’ also attacked those suspected of being enemies of the revolution, forcing some of them to swallow nails and excrement as jeering crowds looked on. One teacher killed himself after being set upon by students who forced him to drink ink. Another was doused in petrol and set alight. Others were electrocuted or even buried alive. By late September, more than 1,700 had been killed in Beijing alone.


MAO WISHED TO PURGE the higher echelons of power and had turned to young, radical students instead, some of them no older than 14, giving them licence to denounce all authority and ‘Bombard the headquarters’. But party officials had honed their survival skills during decades of political infighting and few were about to be outflanked by a group of screaming, self-righteous Red Guards. Many deflected the violence away from themselves by encouraging the youngsters to persecute ordinary people suspected of being class enemies. Some cadres even managed to organise their own Red Guards, all in the name of Mao Zedong Thought and the Cultural Revolution. In the parlance of the time, they ‘raised the red flag in order to fight the red flag’. The Red Guards started fighting each other, divided over who the true ‘capitalist roaders’ and revisionists inside the party were. In some places, Red Guards besieged the local party committee. In others, party activists and factory workers rallied in support of their leaders, leading to a stalemate. In response, the Chairman urged the population at large to join the revolution. Just as Mao had incited students to rebel against their teachers months earlier, he unleashed ordinary people against their party leaders in the autumn of 1966. The result was a social explosion

Above, from left: Mao, Liu Shaoqi, Peng Dehuai and Zhou Enlai idealised in a propaganda poster, 1966. Left: Mao with Lin Biao, c.1965.

on an unprecedented scale, as every pent-up frustration caused by years of communist rule was released. There was no lack of people who harboured grievances against party officials. There were all those who had been reduced to starvation during the Great Leap Forward in the countryside. In the cities there were workers living in abject conditions, some barely able to feed their families. And, before too long, the victims of earlier campaigns also started clamouring for justice, including those punished during the Socialist Education Campaign. But the ‘revolutionary masses’, instead of neatly sweeping away all followers of the ‘bourgeois reactionary line’, also became divided, as

different factions jostled for power and started fighting each other. By January 1967 the chaos was such that the army intervened, asked to push through the revolution and bring the situation under control by supporting the ‘true proletarian left’. As different military leaders supported different factions, all of them equally certain that they represented the true voice of Mao Zedong, the country slid into civil war. Soon people were fighting each other with machine guns and anti-aircraft artillery in the streets. Still, the Chairman prevailed. He was cold and calculating, but also erratic, whimsical and fitful, thriving on willed chaos. He improvised, bending and breaking millions along the way. He may not have been in control, but was always in charge, relishing a game in which he could constantly rewrite the rules. Periodically he stepped in to rescue a loyal follower or to throw a close colleague to the wolves. A mere utterance of his decided the fates of countless people, as he declared one or another faction to be ‘counter-revolutionary’. His verdict could change overnight, feeding a seemingly endless cycle of violence in which people scrambled to prove their loyalty to the Chairman.

Placards of Mao paraded through Beijing, c.1970.

The Black Years (1968-1971)

THE FIRST PHASE of the Cultural Revolution came to an end in the summer of 1968 as new, so-called ‘revolutionary party committees’ took over the party and the state. They were heavily dominated by military officers, concentrating real power in the hands of the army. They represented a simplified chain of command that Mao relished, one in which his orders could be carried out instantly and without question. Over the next three years they turned the country into a garrison state, with soldiers overseeing schools, factories and government units. At first, millions of undesirable elements, including students and others who had taken the Chairman at his word, were banished to the countryside to be ‘re-educated by the peasants’. Many had no fixed abode. In some provinces, this was the case for roughly half of all exiled students, as they were forced to live in caves, abandoned temples, pigsties or sheds. Most went hungry. Sexual abuse was rife: thousands were raped by local bullies in the province of Hubei alone, including girls as young as 14. Besides students, entire families, in particular the most destitute and vulnerable ones, seen

as a burden on the state, were removed to the countryside and left to their own devices. Then followed a series of brutal purges, used by the revolutionary party committees to eradicate all those who had spoken out at the height of the Cultural Revolution. The talk was no longer of ‘capitalist roaders’, but of ‘traitors’, ‘renegades’ and ‘spies’, as special committees were set up to examine alleged enemy links among ordinary people and party members alike. Anyone with a foreign link in their past became suspect. In Shanghai alone, close to 170,000 people were harassed in one way or another. More than 5,400 committed suicide, were beaten to death or executed. In Guangdong province as a whole, one estimate puts the body count at 40,000. In Inner Mongolia, close to 800,000 people were incarcerated, interrogated and denounced in mass meetings. Torture chambers appeared across the province. Tongues were ripped out, teeth extracted with pliers, eyes gouged from their sockets, flesh branded with hot irons. Although less than 10 per cent of the population in Inner Mongolia were Mongols, they constituted more than 75 per cent of the victims. After a nationwide witch-hunt came a sweeping campaign against

Below: Red Guards and students brandish Mao’s Little Red Book as they parade through Beijing, 1966; a kindergarten child thrusts a spear into an effigy labelled ‘US bad’; children aim toy pistols at a caricature of US President Lyndon Johnson.

corruption, further cowing the population into submission, as almost every act and every utterance – inadvertently poking a hole in a poster of Mao, questioning the planned economy – became potentially criminal. In some provinces up to one in 50 people were implicated in one purge or another. These years were also the high point of a huge industrial project called the Third Front. It aimed at nothing less than the building of a complete industrial infrastructure in the country’s interior. Paranoid about a possible enemy attack from either the Soviet Union or the United States, the one-party state carried out a colossal programme to move about 1,800 factories to the most remote and inhospitable areas of the hinterland, far away from the populated plains in the north of the country and the provinces along the coastline. Since about two thirds of the state’s industrial investment went to the project between 1964 and 1971, it constituted the main economic policy of the Cultural Revolution. It is probably the biggest example of wasteful capital allocation made by a one-party state in the 20th century. In terms of economic development, it was a disaster second only to the Great Leap Forward.


SELF-RELIANCE also became the guiding principle in the countryside, as everybody had to emulate Dazhai, a people’s commune located on a sterile plateau of loess in north China. Dazhai, in effect, was a return to the spirit of the Great Leap Forward, as everything in the village was collectivised once again. The Dazhai model was imposed by the army, as soldiers whipped up the workforce, using the villagers as foot soldiers to increase output. In a province like Zhejiang, a quarter of all production teams reverted to the radical collectivisation of the Great Leap Forward: pigs were slaughtered, private plots confiscated, every tree deemed collective property. Under the threat of war with either the Soviet Union or the United States, the emphasis was on grain and terraced fields appeared everywhere in imitation of Dazhai. Neither climate nor topography mattered, as lakes were filled, forests cleared and deserts reclaimed in desperate attempts, from the Mongolian steppes to the swamps of Manchuria, to emulate Dazhai. Dogmatic uniformity was imposed across the country. Mao was wary of the military, in particular Lin Biao, who had taken over the ministry of defence in the summer of 1959 and pioneered the study of Mao Zedong Thought in the army. Mao had used Lin Biao to launch and sustain the Cultural Revolution, but the marshal in turn exploited the turmoil to expand his own power base, placing followers in key positions throughout the army. He died in a mysterious plane crash in September 1971, bringing to an end the grip of the military on civilian life. The army was in turn purged, falling victim to the Cultural Revolution.

The Grey Years (1971-1976)

By now, the revolutionary frenzy had exhausted almost everyone. Even at the height of the Cultural Revolution, many ordinary people, wary of the one-party state, had offered no more than outward compliance, keeping their innermost thoughts and personal feelings to themselves. Now many realised that the party had been badly damaged by the Cultural Revolution. In the countryside, in particular, if the Great Leap Forward had destroyed the credibility of the party, the Cultural Revolution undermined its organisation. In a silent revolution, millions upon millions of villagers surreptitiously reconnected with traditional practices as they opened black markets, shared out collective assets, divided the land and opened underground factories. Take, for instance, Yan’an. Set amid dusty, sandstone-coloured hills in northern Shaanxi, it was one of the most hallowed places in communist propaganda, where Mao and his guerrilla fighters had


Clockwise from right: agricultural workers in Dazhai, 1971; the aftermath of the Tiananmen Square massacre, 1989; Chinese leader Deng Xiaoping (left) with Kim Il Sung of North Korea, 1978.

established their temporary capital during the Second World War. When a propaganda team arrived in Yan’an in December 1974, it found a thriving and sophisticated black market. One village had abandoned any attempt to wrench food from the arid and parched soil, specialising in selling pork instead. In order to fulfil their quota of grain deliveries to the state, they used the profit from their meat business to buy back corn from the black market. Local cadres supervised the entire operation. Elsewhere in the province, entire people’s communes had divided up collective assets and handed responsibility for production back to individual families. In many cases, local cadres took the lead, distributing the land to farmers. Sometimes a deal was struck between representatives of the state and those who tilled the land, as the fiction of collective ownership was preserved by turning over a percentage of the crop to party officials. Across the country, from north to south, people raised ducks, kept bees, grew fish, baked bricks and cut timber, always in the name of the collective. In parts of Zhejiang, by late 1971 some two thirds of all villagers were independent – or ‘go-it-aloners’ as they were known at the time. Much of this was done with the tacit consent of the local authorities, who rented the land to individual households in exchange for a portion of the crop.


MANY DID SO out of sheer necessity, in order to stave off the starvation caused by the planned economy. But in less deprived regions, too, the market thrived. In the county of Puning in Guangdong, around 30 markets covered the needs of more than a million people. They attracted local farmers, artisans and traders, each with goods in their hands, on their back or in a cart. Pedlars offered colourful illustrations from traditional operas, books from the imperial and republican eras and collections of traditional poetry that had escaped the clutches of the Red Guards. There were itinerant doctors offering their services. Storytellers used wooden clappers to mark the most dramatic moments of their stories. Blind people sang traditional folk songs for a few alms. Touts stood outside restaurants selling ration coupons. In some markets, organised gangs travelled up and down the coast, going all the way to Shanghai to trade in prohibited goods. A few went as far as Jiangxi to procure tractors, acting on demand from local villages keen to mechanise. Some wealthier villages not only planted profitable crops for the market, but also began establishing local factories. There were also underground factories, dispensing altogether with the pretence of collective ownership. In Chuansha, just outside Shanghai, where


villagers were mandated by the state to grow cotton, the industrial portion of total production reached 74 per cent by 1975, a rate of growth far superior to the years of ‘economic reform’ after 1978. Even before Mao died in September 1976, large parts of the countryside had already abandoned the planned economy. It was to be one of the most enduring legacies of a decade of chaos and entrenched fear. No communist party would have tolerated organised confrontation, but cadres in the countryside were defenceless against a myriad of daily acts of quiet defiance and endless subterfuge, as people tried to sap the economic dominance of the state and replace it with their own initiative and ingenuity. Deng Xiaoping, assuming the reins of power a few years after the death of Mao, briefly tried to resurrect the planned economy. In April 1979 he even demanded that villagers who had left the collectives rejoin the people’s communes. But soon he realised that he had little choice but to go with the flow. By 1980, tens of thousands of local decisions had placed 40 per cent of Anhui production teams, 50 per cent of Guizhou teams and 60 per cent of Gansu teams under household contracts. The people’s communes, backbone of the collectivised economy, were dissolved in 1982. Not only did the vast majority of people in the countryside push for greater economic opportunities, but they also escaped from the ideological shackles imposed by decades of Maoism. Endless

campaigns of thought reform during the Mao era produced widespread scepticism even among party members themselves. The very ideology of the party was gone and its legitimacy lay in tatters. But political freedoms were not to follow. The leaders now lived in fear of their own people, terrified of allowing them to speak again, determined to suppress their political aspirations. In June 1989, Deng personally ordered a military crackdown on pro-democracy demonstrators in Beijing, as tanks rolled into Tiananmen Square. The massacre that followed was a display of brutal force and steely resolve, designed to send a signal that still pulsates to this very day: do not query the monopoly of the communist party of China. Frank Dikötter is chair professor of humanities at the University of Hong Kong and the author of The Cultural Revolution: A People’s History 1962-1976 (Bloomsbury, 2016).

FURTHER READING Nien Cheng, Life and Death in Shanghai (Flamingo, 1995). Li Zhisui, The Private Life of Chairman Mao: The Memoirs of Mao’s Personal Physician (Arrow, 1996). Frank Dikötter, Mao’s Great Famine: The History of China’s Most Devastating Catastrophe (Bloomsbury, 2010).


The Map: The Great Fire of London

ONLY A FEW DAYS after the fire’s end (see Months Past, page 8), several plans for rebuilding the gutted City, with imaginative street layouts, were submitted to Charles II by figures including the architect Christopher Wren, the natural philosopher Robert Hooke and the surveyor Peter Mills. None was used, as they all proved impractical for various reasons, but in order for rebuilding to happen, accurate plans were needed. The king ordered a survey and the results were drawn up on six plates by John Leake in March 1667. Wenceslaus Hollar produced the engraving and added to it contrasting views of the City from Southwark, on the south bank of the Thames, before and after the fire.

No copy of that original map survives, but a reduced version, dedicated to Sir William Turner, then Lord Mayor of London, was issued in 1669. This reduced survey contains a key at the top right listing the buildings destroyed in the fire; lost livery halls are indicated by their coats of arms and wards are marked by broken lines and identified by capital letters, which are also listed in the key. Although titled an ‘exact surveigh’, the street plan is simplified, perhaps to be expected given its swift production.

The Rebuilding Act of 1667 specified that new houses must be constructed of brick and stone, with tile roofs. A second rebuilding act in 1670 led to the construction of 51 churches, as well as St Paul’s Cathedral (the site can be seen on the map, left of centre). Wren and Hooke were commissioned for this rebuilding and by 1696 all the new churches were in use. St Paul’s was not finally completed until 1710 but was in use from 1697. Kate Wiles




50 Years of Botswana This month marks half a century of an independent Botswana. The intervening years have not been without turmoil, but the country has emerged, writes Stephen Chan, as a model African state.

Seretse Khama and his British-born wife, Ruth, overlooking Bechuanaland, 1950.

BOTSWANA ACHIEVED INDEPENDENCE from Britain on September 30th, 1966. In the 50 years since, it has become one of Africa’s success stories, though that success has also involved a half century of contradictions and difficulties. Its history certainly did not begin with independence. From AD 200 to 500, Bantu migrations from what is now Katanga in the Democratic Republic of Congo and northern Zambia swept southwards and, in Botswana, established the Toutswe state, built on cattle herding and control of the trade in gold, which found its outlets on the Indian Ocean coastline. There was a currency based on coastal shells. This made the

area attractive to what became, from the 11th century, the Great Zimbabwean state, with its long line of stone cities that acted as way stations for the gold trade, as well as that in salt and hunting dogs. Ancient Botswana was part of a complex, competitive and well-organised trading system, which dealt with the outside world. Great violence came in the 19th century with the conflicts between Ndebele and Shona peoples, which wracked Zimbabwe well into the 20th century, and with the white Boer settlers expanding from the Transvaal. With this expansion, it was inevitable that the white doctrine of racial superiority should begin to affect the region.

After an appeal by the Tswana kings for protection from the Boers, the British made ‘Bechuanaland’ a protectorate in 1885, although many Tswana people found themselves part of South Africa in the colonial map-drawing of the day. Under the British, the racism of the colonial era was still making itself felt, even as it became clear that independence should be granted. The first leader of the new state of Botswana, as Bechuanaland was renamed, would have to overcome many years of high-level obstruction.


SERETSE KHAMA, born in 1921, occupied the throne of the Bamangwato people, a hegemonic group within the majority Tswana people of today’s Botswana. He succeeded to the kingship at the age of four, with his uncle as regent. Like other leaders of what would become independent African states, such as Nelson Mandela and Robert Mugabe, he attended Fort Hare University in South Africa’s Eastern Cape. It was then a private institution with a multi-racial policy: a rare beacon in the region. After graduation he went on to Oxford University and the Inner Temple. In London he fell in love with Ruth Williams, she with him, and they married in 1948. She was white. Controversy erupted.

This furore involved South Africa and its racial policies, which attached a particular stigma to mixed-race marriages. The Prohibition of Mixed Marriages Act had been passed in 1949, one year before the Population Registration Act, which formally divided South Africans into racial categories. It also involved the outraged elders of the Bamangwato, including the regent, who demanded that Khama return home and the marriage be annulled. Ruth travelled with her new husband and her courtesy and charisma so won over the Bamangwato that the regent resigned.

Britain, anxious not to offend South Africa, despite its increasing racism, launched a parliamentary enquiry into Khama’s fitness for the kingship. Exhausted and economically depleted by the Second World War, Britain needed cheap South African gold and uranium. It was also in no fit state to resist a South African military incursion into Bechuanaland, nor South African economic sanctions against the colonial administration there. The enquiry found that Khama was eminently fit to rule ‘but for his unfortunate marriage’. Its report was suppressed but, all the same, the British government exiled Khama from his own land in 1951.

THERE WAS PUBLIC OUTRAGE in Britain over this obvious complicity with South African racism.
Left: Southern Africa in 2016. Below: Seretse Khama’s grandfather, Khama III, and his father, Sekgoma, c.1920.

There were calls for the resignation of the minister responsible, Lord Salisbury. In Bechuanaland, the Bamangwato simply refused to select a new king. Even so, Khama and his wife were only allowed to return to Bechuanaland in 1956 as private citizens. He renounced the throne and became a businessman. In 1961, however, he founded the Bechuanaland Democratic Party. His persecution by the British had hugely enhanced his appeal to the people and he swept to victory in the 1965 elections, becoming prime minister. The British granted independence to the new state of Botswana on September 30th, 1966 and he became president. Almost as an apology, Queen Elizabeth knighted him. At that time, Botswana was the world’s third poorest country. All around, there was turmoil. Zambia, formerly Northern Rhodesia, had achieved independence in 1964; but the white settlers in Southern Rhodesia (now Zimbabwe) refused to accept the prospect of majority rule and, in 1965, made a Unilateral Declaration of Independence amid racist rhetoric and a promise that ‘not in a thousand years’ would there be black rule in ‘their’ territory. In 1966 the United Nations terminated the postwar South African trusteeship over what is now Namibia, but the South Africans refused to leave and instituted apartheid policies and laws as if the territory were merely an extension of South Africa itself. The surrounding countries might attain independence, but South Africa was determined to treat

them as dependent territories. Resisting such determination was a hard task. Zambia, a country with 100 university graduates, became independent. But Botswana had only 22 – or 21, excluding the new president – and only 100 secondary school graduates. In this vast country there were just 12 kilometres of tarmacked road. It was landlocked, with no access to the sea of its own. Stronger political powers bordered it on all sides. All the roads and the railway from the time of Cecil Rhodes ran southwards. At any time, South Africa could apply an economic squeeze and Khama’s new country would be strangled. A history of independent Botswana, 50 years old this month, begins with a history of delicate balancing. It was an act of balancing that was possible only with huge internal support for President Khama. Although Botswana had a multi-ethnic polity, Khama’s ethnic group was by far the largest. The Tswana people, 80 per cent of the population, were further divided into eight related groups, of which the Bamangwato was historically the most powerful. Khama’s royal Bamangwato blood legitimised him, as did the symbolism of his marriage, which was in itself a defiance of South African apartheid. Symbolism and rhetoric aside, however, the delicate balancing could often tilt very much in South Africa’s favour. Even when vast mineral deposits were discovered and Botswana had some economic leverage of its own, the most practical way of crash-developing a mining industry was to host South African corporations such as De Beers, which became a key actor when diamonds were discovered in 1967.


Top: Seretse Khama and Ruth Williams, May 1960. Above: Seretse Khama (standing) and his legal advisor, Percy Fraenkel, address a tribal council, March 1950.

AS IT WAS, well before independence, South Africa had moved to tie the neighbouring states to its economic apron strings. In 1910, the same year South Africa attained Dominion status (effectively its independence from British rule), it established a Customs Union between itself, Bechuanaland, Basutoland (now Lesotho) and Swaziland. In the 1960s, when independence came to the latter three, the arrangement continued and in 1969 was updated to the Southern African Customs Union (SACU). It was not all one-way traffic. SACU allowed Botswana to negotiate a greater share of mining revenue for itself. Again, in 1975, further renegotiation allowed 50 per cent of the revenues to accrue to Botswana. The budget surplus that resulted allowed huge investment in infrastructure, cattle industry subsidies and, in particular, health care. This was to be of major importance in the 1980s, when the AIDS pandemic began. Botswana fielded the most comprehensive effort in the region – with Uganda, the most comprehensive in Africa – in an attempt to deal with the disease. This would not have been possible without mining, which allowed Botswana to achieve the fastest rate of economic growth in the world between 1966 and 1980. Khama did not rely solely on mining and South Africa. He was instrumental in securing favourable trading terms with the European Economic Community for Botswanan beef. The economy was secure enough for him to introduce Botswana’s first national currency, the pula, in 1976. Khama invested little in the army. Nor did he allow liberation groups seeking majority rule in the white minority regimes of Angola, Rhodesia and South-West Africa (now Namibia) to operate from Botswanan territory. This discretion and caution were not to save his
SEPTEMBER 2016 HISTORY TODAY 25


Above: Seretse Khama and family fly into exile in England, 1950. Left: Seretse Khama and his family, c.1956. Below: Ruth Williams and her children – from left, Anthony, Ian and Jacqueline – watch Seretse Khama being made a Knight Commander of the Order of the British Empire, September 1966.


country from military reprisals after his death in 1980. Almost as a precursor of what was to come, shortly before his death, just after Rhodesia had become independent as Zimbabwe, Botswana joined the new Southern African Development Coordination Conference (SADCC) of the nine majority-ruled independent states in the region. SADCC was meant to lessen dependence on South Africa, so Botswana then had to apply its balancing act to SACU and SADCC. But SADCC also became the consortium known as the Frontline States, those at the forefront of the struggle against apartheid South Africa. By this time, Angola and Mozambique had secured independence from Portugal. Only South-West Africa remained effectively an outpost of South Africa. The response of South Africa was militarised: tank columns swept into Angola; destabilisation by proxy rebel groups was sponsored in Mozambique. Khama died in July 1980 and was succeeded by Quett Masire. The first reported case of AIDS occurred in 1985 and South African commandos struck Botswana that same year. The regional war for freedom had now embraced the once peaceful country.


THE TENURE OF Masire continued the legacy of Khama. The economy continued to grow but economic links with South Africa also continued. The contradictions came early in his presidency when, in 1981, he was elected president of SADCC. Botswana was still a member of SACU and it was in 1981 that South Africa embraced the doctrine of ‘Total Strategy’, adapted from the French effort to suppress the Algerian uprising, by applying both economic and military pressure to the surrounding region. In 1982 it blew up much of the Zimbabwean air force and sought to sponsor ethnically based rebellion in the west of Zimbabwe. By then, Masire had acquired Soviet arms for his defence forces, to South Africa’s disquiet, but they were probably as much a deterrent to ANC guerrillas seeking to operate from Botswanan soil as to South African forces. Even so, four years of tense relationships ensued. In May 1984 Masire complained that he was under huge pressure to sign a non-aggression pact with South Africa, of the kind forced upon Mozambique – the Nkomati Accord – just two months earlier. Skirmishes broke out between the forces of the two countries in 1985 and, following the commando raid which destroyed buildings and took lives in Gaborone, the capital city, Masire was forced to declare the expulsion of ANC personnel. Gaborone was nevertheless bombed from the air in 1986. It was military defeat by Cuban forces in Angola in 1988 that forced South Africa to abandon Total Strategy. US mediation led to the withdrawal of both Cubans and South Africans from Angola, to the independence of Namibia, to a change of president in South Africa in 1989, to the new President F.W. de Klerk’s negotiations with Zambia’s Kenneth Kaunda in 1989 and to the release of Nelson Mandela from prison in 1990. For Botswana, it was a huge relief. Masire could say, however, he had taken

a stand of sorts against apartheid and had suffered for it; but he also had to say that De Beers held equal shares with his own government in the Botswanan diamond mining industry. Masire stood down from the presidency in 1998, in favour of Festus Mogae.

President Quett Masire of Botswana and his wife, Gladys, with British Prime Minister Margaret Thatcher, October 1980.


IN ELECTIONS IN 1999, Mogae’s Botswana Democratic Party (BDP) won 30 of the 40 seats. It was regarded as a clean democratic exercise but it also reflected the affiliation of the dominant ethnic group with the ruling party. As long as that affiliation remains, the party cannot easily lose. Despite apparent democracy, liberal ‘freedoms’ are not always protected: the attempted removal of ‘Bushmen’ from the Kalahari occurred in Mogae’s time, as did the beginning of the effort to remove the San people from the vast Okavango Delta. In 2005 the 72-year-old Australian-born academic Ken Good was expelled from Botswana for having given a seminar paper at the university criticising presidential successions in the country as not truly democratic. Even a fully democratic country would have had trouble rolling back the enveloping AIDS crisis. Despite recognition of the problems and comprehensive plans and programmes to combat the disease, life expectancy had fallen from 61 to 47 by the year of Mogae’s accession to power. Even so, in the 2004 elections, his party captured 44 of the 57 seats. In a country the size of Texas, with a population of just two million, the sparsity and spread of population makes the organisation of opposition difficult, even without majority ethnic affiliation to one party. It should also render the government sufficiently unafraid for its future to sanction a political pluralism that was robustly oppositional. The government, however, thanks largely to the mining industry, has sufficient resources to ensure development, to take meaningful steps in terms of ecological and wildlife conservation and even to achieve a degree of international fame from Alexander McCall Smith’s book, The No. 1 Ladies’ Detective Agency, set in Botswana; the first film ever made in the country, based on the book, was shot in 2007.
In 2008, De Beers and the government formed a joint venture, the Diamond Trading Company, which continued the embedding of the South African corporation in the Botswanan economy. Even though apartheid had gone in South Africa, the old ties, such as SACU, remained, now with the addition of Namibia. In neighbouring Zimbabwe, Robert Mugabe has clung on to power despite a hugely disputed election in 2008. In that year, Ian Khama, the son of the first president and his white wife, and inheritor of the throne his father had occupied, came to power as the president of Botswana. For a half-white president to step into the shoes of a father who had defied convention and prejudice on both sides by marrying a white woman meant prejudice of a different sort. The Mugabe regime’s expulsion of white

Despite apparent democracy, liberal ‘freedoms’ are not always protected in Botswana


farmers from their lands, beginning in 2000, had a racism of its own, a reversal of what had occurred before. In addition, Botswana, as a beacon of democracy, notwithstanding the constraints outlined above, stood in stark contrast to the gerrymandering that now marred what was meant to have been a democratic election in Zimbabwe. In the fraught months of negotiation under the South African President Thabo Mbeki, which led ultimately to a compromise government in 2009, Ian Khama and Robert Mugabe were at loggerheads. By contrast, in the 2009 Botswanan elections, Khama won a landslide that spoke not only to majority ethnic affiliation but to a personal popularity and to a genuine sense of nationhood, of heritage and self-assurance, despite difficulties and contradictions, in a changing world. And the world was coming to Botswana.


A FORMER COMMANDER of the army, the austere Ian Khama was rumoured to have had difficulties in accepting that the civil service was not a command institution, but one with processes and consultations. His efforts to fire striking workers led to prolonged industrial unrest. But he did understand the need to internationalise the country’s economy. It could not be all De Beers. In 2009 a western donor conference pledged $1 billion to improve infrastructure in the region. Although that was to benefit several countries, the Japanese initiative in the same year to develop platinum mines in South Africa and Botswana and, in concert with Australian and Canadian groups, to develop both platinum and nickel mines in Botswana alone, meant a greater number of players in the mining sector. Sweden is also a presence in mining. Not all internationalism was welcome to Khama’s government, though. The sustained campaign by Survival International on behalf of the San and their rights to live in their traditional habitat has been a thorn in the side of both the government and De Beers, which seeks to mine in their homelands. By contrast, since 2012, women can now inherit a family home, notwithstanding customary law, in a perceived step towards greater gender equality. But Khama’s establishment of the Directorate of Intelligence and Security (DISS) with police powers, an internal agency in a country with few security problems, has been seen as consolidating a readiness to return to the intolerance of the Ken Good episode. It is a mixed record for Khama, but what it means is that the contradictions he must manage are more numerous and more complex than those faced by his father and his immediate successors and that he brings to them a style of management not yet divorced from that of his army command. He again criticised the Zimbabwean election victory of Robert Mugabe in 2013, having himself won re-election easily in 2014, without the need for coercion or rigging.
The country is stable and, although it attracts criticism, is seen as a model state in an Africa still emerging from the problems of colonialism and its aftermath. The perceived model nature of Botswana has meant


President Ian Khama attends a summit in South Africa, January 2009.

Festus Mogae’s membership of Mo Ibrahim’s good-governance project for Africa – representing in his person, and by his apparent record, the nature of emerging good governance on the continent. But the government of Botswana has never witnessed the defeat of the ruling party: its democratic credentials have never been tested by a surrender of power. The genealogy of the Khama family, as well as of the Bamangwato kingship, may be jeopardised if Ian Khama leaves no heir, although his brother is also a member of the government and occupies a leading role in party politics. It is in party politics that change may be possible. The ruling BDP appears to have developed factions. Khama wished his brother to become vice president in 2014, but the BDP chose instead Mokgweetsi Masisi, which suggests that there may be an inheritor of power in 2019 who is not of the Khama line. In addition, the main opposition party, although failing to attract a significant national following on its own, is nevertheless increasing its urban power. The Umbrella for Democratic Change won four of Gaborone’s five seats in 2014. Together with the Botswana Congress Party, it reduced the BDP’s popular vote for the first time to less than 50 per cent. There are problems that will dominate the 2019 hustings. The global economic slowdown, including that of the Chinese economy and its purchasing of minerals and beef from Botswana, has meant a 15 per cent fall in diamond production in the first half of 2015. Electric power generation, in line with almost all the region’s states, has not kept up with industrial and domestic expansion, though tourism is increasing and a coal industry is up and running, which may help with electricity production as well as diversifying the economy. But the nature of Botswana’s second 50 years may be more complicated and difficult than its first, during which it accomplished a great deal: it is certainly no longer the third poorest country in the world.
Botswana has the second highest per capita income in Africa, at $17,595, eclipsing South Africa’s $11,750. Despite unequal distribution of such wealth and despite the likelihood of problems to come, it is at least a legacy to bequeath to the second 50 years. Stephen Chan is Professor of International Relations at the School of Oriental and African Studies and the author of Southern Africa: Old Treacheries and New Deceits (Yale University Press, 2012).

FURTHER READING Jack Parson, Michael Crowder, Neil Parsons, Succession to High Office in Botswana: Three Case Studies (Ohio University Center for International Studies, 1989). Dan Henk, The Botswana Defence Force in the Struggle for an African Environment (Palgrave Macmillan, 2010). Kenneth Good, Diamonds, Dispossession and Democracy in Botswana (James Currey, 2008).

Out of the As the search for lost medieval kings continues, interest in them seems stronger than ever. But a warning from the past speaks of their – and our – ruin, writes Eleanor Parker.

THERE WAS much excitement recently about the news that – hot on the heels of the finding of Richard III’s body in Leicester a few years ago – an investigation has begun to explore the site of Reading Abbey, which may involve locating the remains of Henry I. Work to find out more about this important monastic site is very welcome. But it is rather a shame (though unsurprising) that media attention focused chiefly on the possibility of finding the king’s body, rather than on what we might learn from these investigations about the larger story of Reading Abbey. Searching for royal relics seems to be in vogue; there are also explorations going on in Winchester to find the remains of Alfred the Great and to study the bones of Cnut, Harthacnut and other 11th-century kings and queens who have for centuries been unceremoniously jumbled up in mortuary caskets in the cathedral. Fortunately, Henry I does not arouse such strong passions as Richard III. Even if he is found, there is unlikely to be a repeat of the controversies that surrounded Richard’s reburial last year. Henry is certainly an important part of Reading’s history: he was the founder of the abbey, where he was buried – while it was still incomplete – after his death in 1135. But the monastic history of the town precedes him by more than a century: a religious house for women was supposedly founded in Reading in the tenth century by Queen Ælfthryth, mother of Æthelred the Unready, in memory of her young murdered stepson, Edward the Martyr. Furthermore, the history of Reading Abbey, of course, continued for 400 years after Henry’s time. It was at Reading that the first polyphonic song surviving in English, ‘Summer is icumen in’, was written down in the 13th century – one particular highlight in a long and distinguished institutional history, which concluded violently when the last Abbot of Reading was hanged, drawn and quartered at his own abbey gate in 1539.

The above-ground ruins of Reading Abbey are currently closed to the public, but the abbey precincts are now occupied by a park and a collection of modern office buildings. When I first explored this part of Reading on a sunny autumn weekend, I was struck by how ghostly and lifeless those tall buildings were, in their glittering glass emptiness, towering over the fragments of stone remaining from the abbey. They were more eerie than any medieval ruin could be and just as expressive (a medieval historian might think) of the transitory nature of earthly wealth and power. Reading’s newest skyscraper was built in 2009, at the southern edge of the abbey precincts; will it last centuries, decades, or just a few years?

Finding kings under car parks has become something of a popular joke – but a medieval historian might instead have taken it as a poignant reminder of how easily, over the centuries, sacred places become waste.

The historian Henry of Huntingdon, writing in the year of Henry I’s death, provides a memorable take on these questions in the epilogue to his Historia Anglorum. From his perspective in 1135, he looks back to the year 135, and forward to 2135, to situate himself and the powerful people of his time within a considerably longer perspective.

‘This is the year which holds the writer: the thirty-fifth year of the reign of the glorious and invincible Henry, king of the English’, he begins. But then he surveys the great men of 135, the emperors, kings, bishops and archdeacons like Henry himself. What survived of them, he asks, after a thousand years? ‘If any of them strove to win fame,’ he says, ‘and no record of him now survives, any more than of his horse or his ass, why did the wretch torment his spirit in vain?’

Then he looks forward and speaks to those living in the third millennium, in 2135. ‘Consider us, who at this moment seem to be renowned, because we, miserable creatures, think highly of ourselves ... Tell me, what gain has it been to us to have been great or famous? We had no fame at all, except in God.’

Henry’s meditation on mortality strikes a timely note as we inhabitants of the third millennium search for the earthly remains of his ‘glorious and invincible’ king. He might have pointed out that the bodies of the poor and forgotten buried in England’s country churchyards have lain more peacefully, undisturbed for generations, than Henry I in his splendid abbey church.

Transitory power: Henry I in ‘Sir Thomas Holme’s Book of Arms’, English, 15th century.

Eleanor Parker is a medievalist and writes a blog at aclerkofoxford.blogspot.co.uk.

‘[If] no record of him now survives ... why did the wretch torment his spirit in vain?’



VERDUN The Killing Field The epic German offensive to take the strategically crucial fortress in north-east France reached its bloody end 100 years ago this month. Robert Foley looks at how and why Erich von Falkenhayn, the Chief of the German General Staff, sought to break the deadlock on the Western Front.


IN THE EARLY HOURS OF FEBRUARY 21ST, 1916, the defenders of the hitherto quiet sector around the French fortress of Verdun found themselves facing the greatest concentration of German guns and mortars that anyone on the Western Front had faced to date. The German Fifth Army, commanded by Crown Prince Wilhelm with his chief of staff, Konstantin Schmidt von Knobelsdorf, had assembled more than 1,400 artillery pieces and mortars, along with several million rounds to fire in the offensive’s first few days. Their plan was to use this overwhelming artillery to stun the French defenders and allow a relatively easy capture of the heights on the eastern bank of the River Meuse. From these dominating heights, the German Chief of the General Staff, Erich von Falkenhayn, expected the Fifth Army to be able to inflict huge casualties on any French counter-offensives. Falkenhayn hoped that the tactical success gained by the Fifth Army would have a debilitating effect on the French army and people.


Men of the German Fifth Army march in the direction of Verdun.

A German howitzer gunner at Verdun. Opposite: a dead French soldier in a shell hole.

A German ammunition depot in Romania with shells intended for the Verdun offensive.



German gunners fire from a 77mm field gun in Champagne, 1916. Below: French gunners man a 105mm field artillery Schneider in the Verdun region.


The Battle of Verdun was an enormous expenditure of resources – both lives and matériel – in a tiny section of the Western Front. Between February and September 1916, German soldiers fought and died to take the heights overlooking the French citadel, only to lose almost all their gains by Christmas to French counter-offensives. Along this 20-kilometre section, the German Fifth Army and the French Second Army are estimated to have fired more than 30 million artillery shells. To make matters worse, the German Landser upon whom fell the grim task of continual attacks and counterattacks had no idea that the strategic goal of Falkenhayn was to ‘bleed white’ the French people via the attrition of its youthful army.


SINCE ITS INCEPTION, both soldiers and historians have struggled to understand Falkenhayn’s strategic goal of attrition. Yet, far from ridiculous, Falkenhayn’s plans in 1916 were a sophisticated, if grisly, response to the stalemate of the trenches. His plans drew on his experiences of the war to date and the failures of both the Central Powers and the Entente to achieve victory through traditional means. Verdun represents the war’s first deliberate battle of attrition. Indeed, one observer, the future Chief of the General Staff and German War Minister, Wilhelm Groener, later wrote that he could find ‘no analogue’ in military history with which to compare Falkenhayn’s approach at Verdun. Falkenhayn was an unlikely choice to succeed Helmuth von Moltke as Chief of the General Staff in the wake of the German defeat in the Battle of the Marne in September 1914. Falkenhayn had spent much of his career as an advisor and instructor to the Chinese army, far from the sometimes feverish centre of the army, the General Staff in Berlin. Many put Falkenhayn’s rise to prominence, first as Prussian Minister of War in 1913 and then as Chief of the General Staff, down to his political ties and acumen rather than his abilities as a strategist or military thinker. This image of a ‘political general’ would dog him throughout the war. Yet Falkenhayn was far from a mindless sycophant: his pre-war career allowed him to see more clearly than most that the war had challenged the assumptions on which strategy had been based, and allowed him to develop a unique response, which would be used at Verdun. IN THE WAKE OF THE FAILURE of his offensive in Flanders in 1914, Falkenhayn pressed the German Chancellor, Theobald von Bethmann Hollweg, to use diplomacy to detach one of Germany’s enemies from the coalition: ‘So long as Russia, France, and England stay together ... 
we run the risk of slowly exhausting ourselves.’ Falkenhayn used the offensive on the Eastern Front in 1915 to attempt to detach Russia from its allies, hoping to use the conquest of Russian Poland as a bargaining chip with the Tsarist government. That this failed, despite the vast amount of territory conquered by Austro-Hungarian and German forces in 1915, did not shake Falkenhayn from his belief that there was no simple military solution to Germany’s strategic dilemma. Rather, the experience of 1915 convinced Falkenhayn that the German Foreign Office could not be relied upon to take sufficient advantage of the opportunities presented by battlefield success. He concluded he would have to find another way of detaching one of Germany’s enemies from the Entente. In 1916, Falkenhayn returned to the concept of striking an enemy’s political weakness in order to have strategic effect. In this case, the German strategist focused on manpower. Although France was undoubtedly still a ‘Great Power’ in 1916, her long-term future as such was not assured, in large part because of her ageing and ultimately declining population. Thanks to a low birth rate, by 1914 France had to conscript nearly 85 per cent of her eligible manpower to maintain the size of her army at parity with that of Germany. By contrast, the population of Imperial Germany was growing by leaps and bounds, meaning that it conscripted less than 50 per cent of its eligible manpower. In 1913, the introduction of a three-year service law designed to increase the

Falkenhayn’s plans in 1916 were a sophisticated, if grisly, response to the stalemate of the trenches

Top: Erich von Falkenhayn, when Prussian War Minister, 1913. Above: Marshal Philippe Pétain, ‘Hero of Verdun’, in the interwar years.


trained manpower available to the French army at any one time was met with howls of protest from the French population, worried about the impact of extended service on its youth. In the wake of this, the French army was forced to improve living conditions in barracks and make other concessions. These factors did not pass unnoticed by German intelligence. They had identified manpower shortages as a critical vulnerability of France and concluded that it could not sustain prolonged losses without political consequences. By 1916, the French army, and its population, had suffered enormous losses, further weakening the army and its resolve in the eyes of Falkenhayn and his intelligence officers. BY 1916, Falkenhayn was convinced that the French army was in long-term decline, thanks to the high casualties it had suffered in 1914 and 1915. In August 1915, German intelligence completed a report on France: France’s victims in this war are so many that the government can bear the responsibility for them neither before the people of France nor someday before history. Soon [the French government] will be faced with the question of whether, despite all outside help, the ending of resistance is a more fitting path for the future of the nation than the continuation of this hopeless war.

Fort Douaumont before and after the Battle of Verdun.

The plunging fire and the heavy shells thrown by German howitzers made them ideal weapons for trench warfare


The experience of the German offensives in Flanders in 1914 and of the Franco-British offensives in 1915 also convinced Falkenhayn of the futility of attempting a large-scale breakthrough of the modern defensive positions that dominated the Western Front. Indeed, the success of the German Army of the Western Front (Westheer) in resisting the superior Franco-British forces in September and October 1915 demonstrated the power of defence when backed by sufficient artillery. The war to date had shown clearly, and not just to Falkenhayn, the importance of firepower in both offensive and defensive operations. Here, the German army was better off than its opponents. It had invested in mobile heavy artillery, particularly howitzers, while the French and Russian armies had focused on field artillery. The plunging fire and the heavy shells thrown by howitzers made them ideal weapons for trench warfare. Their high angle of fire meant they could hit targets hidden by obstacles and could penetrate earthworks and other cover that the flat-trajectory field guns could not hit. Several offensives had demonstrated to the German army how this superiority could best be put to use. First, in late 1914 and early 1915, units of the German First Army carried out two successful limited offensives near Soissons. Later in 1915, Austro-Hungarian and German forces launched a series of large offensives against Russia, which all but destroyed the Russian army with the loss of all of Russian Poland. One of the factors in the success of these offensives was the use of artillery. Possessing large numbers of howitzers, the Germans could be more targeted in their use of artillery: in repeated attacks through the spring and summer of 1915, heavy shells fired at high trajectory laid waste to Russian defensive positions. Lacking howitzers, by contrast, the allies still had to rely on an overwhelming mass of artillery in their offensives and were unable to destroy their targets. 
Moreover, these German offensives demonstrated the importance of the psychological effect of artillery, in addition to its destructive effects. German artillery by early 1916 aimed not only to destroy key components of the enemy defence, but also to stun the defenders through short, sharp bombardments, which would enable German infantry to manoeuvre. At first glance, Verdun seems like an odd choice of location for a major German offensive. It had been a quiet sector of the front since 1914. So quiet, in fact, that the French High Command had stripped the fortress of much of its heavy artillery for use in the field army. Yet the fortress itself remained inherently strong. Its distributed forts were far enough from each other that they could not all be taken in one go, but were close enough to provide mutual fire support. Moreover, most of the forts were well protected from all but the heaviest of artillery fire.


DESPITE ALL THIS, Verdun offered the Germans some advantages. First, the fortress was at the centre of a large salient within the German lines. This meant that German artillery could fire into the French position from three sides, rather than simply from the front. Second, the Germans possessed good lines of communications – heavy and light railways as well as roads – that would enable the rapid supply of men and munitions. This contrasted sharply with the situation for the French: only one road and one rail line ran into the salient, both vulnerable to long-range German artillery fire. Finally, the geography of the battlefield offered significant advantages to the Germans. The fortress of Verdun was bisected by the River Meuse, making coherent defence more challenging. Additionally, the land that was on the right bank, where the German offensive would first strike, was significantly higher than that of the left. If this could be taken, the Germans would be in a position to dominate the lower terrain with well-directed artillery fire. These factors came together in Falkenhayn’s plan for the offensive at Verdun and were the basis of his hope for strategic success in 1916. The offensive was designed by Falkenhayn to hit French manpower, which was judged to be France’s main vulnerability. Heavy losses on top of those suffered elsewhere would compel the French public to put pressure on its politicians to bring the war to a conclusion. This, in turn, would meet Falkenhayn’s long-held strategic goal of breaking the Entente. The experience of the war to date also offered Falkenhayn ideas of how this attrition was to be carried out at a low enough cost for German attackers. The experiences of 1914 and 1915 demonstrated that offensives could achieve limited terrain goals, if properly supported by artillery and if properly executed. The German First Army had done this

near Soissons in January and the French and British had done this in Champagne and around Loos in September and October. The terrain around Verdun was central to Falkenhayn’s plans. He believed that the units of the Fifth Army would be able to seize the dominating heights around Verdun in one rapid jump. Once in possession of these heights, they would be in solid defensive positions, would be in a position to overlook any preparations for counterattack and would be able to withstand them with comparatively light losses. From the start, he recognised that the French and their British allies might attack elsewhere on the Western Front. Indeed, he hoped the British Expeditionary Force would be compelled to attack to assist their French allies. Given the relative ease with which a much weaker Westheer had been able to withstand the Franco-British offensives of 1915, Falkenhayn believed a Westheer, reinforced by units from the Eastern Front, would be able to hold its ground in 1916. Given all this, Falkenhayn sought to limit some of the Fifth Army’s more ambitious plans for the Verdun offensive. Although the Fifth Army was clear about the goal of ‘bleeding white’ the French army, it was less certain about how this was to be accomplished. The early plans focused on the rapid capture of the fortress and an offensive on both banks of the Meuse. Wanting to limit potential German casualties, as well as wanting to maintain a reserve to meet potential counteroffensives elsewhere on the Western Front, Falkenhayn restricted his army’s offensive to the right bank. He assumed that once the heights on this bank were captured, then heavy German artillery would be able to suppress any French artillery fire coming from the lower heights of the left bank. However, these conflicting views of the offensive would ultimately cause problems in the detailed planning for the operation and its conduct.

VERDUN The Ossuary of Douaumont, the national necropolis at Verdun, which contains the skeletal remains of 130,000 German and French soldiers.


THE GERMAN OFFENSIVE at Verdun began with intensive fire from more than 1,400 guns, howitzers and mortars. As Falkenhayn had directed, the Fifth Army's plan relied on the effect of artillery to support the advance of its six attack divisions along a 14-kilometre front. Unlike the French and the British in 1915, or later at the Somme in 1916, the Fifth Army's preparatory artillery bombardment was designed to target key enemy defensive positions and lines of approach precisely and destroy them rapidly. The Fifth Army trusted that the psychological effects of an intense bombardment would paralyse the French defenders long enough for the German infantry to close upon their lines. Thus, the initial bombardment lasted just 10 hours, but fired close to a million shells. A further 1.5 million rounds were stockpiled to support the advance over the next couple of days.

Though the element of surprise had been lost to a delay caused by poor weather, the Fifth Army's attack met its initial goals. On the first day, all three attacking corps had taken the French first lines and some units had penetrated the second lines. By February 24th, the Fifth Army was poised to break the French resistance at Verdun and capture the heights overlooking the Meuse. The entire French second defensive position was in German hands and the French defending units were all but annihilated. However, the German attackers had suffered heavily, too, and, thanks to Falkenhayn's desire to limit the offensive, the Fifth Army did not have fresh units with which to maintain the attack's momentum and break French resistance once and for all. The Fifth Army's commander, Crown Prince Wilhelm, later wrote:

On the evening of 24 February, the resistance of the enemy was actually broken; the path to Verdun was open! ... We were so close to a complete victory! However, I lacked the reserves for an immediate and ruthless

exploitation of the success we had achieved. The troops, who had been engaged in unbroken, heavy combat for four days, were no longer in the condition to do so. Thus, the psychological moment passed unused.

To make matters worse for the Fifth Army, the French government and army had finally decided that Verdun was to be held at all costs. On February 25th, Philippe Pétain, whose motto had long been 'le feu tue' ('firepower kills'), was given command of the defence of the fortress. The new French commander rationalised the French line and created a coherent defensive fire plan. He massed artillery on the west bank of the Meuse, where it could fire into the flank of the attacking German troops on the east bank. He also stiffened the morale of the defenders. His General Order Number One read: 'The mission of the Second Army is to stop at any price the enemy effort on the Verdun front. Every time the enemy wrests a parcel of terrain from us, an immediate counterattack will take place.'

Pétain also set in place a unit rotation policy, which allowed the French army to endure the massive casualties of the battle without breaking. He insisted that units be replaced before they were completely worn out, which allowed each unit to preserve a cadre of trained and experienced manpower around which a new unit could form. This ensured that most of the French army experienced the 'hell of Verdun' at least once. It also meant that the French contribution to the planned Franco-British offensive on the Somme had to be continually reduced.

Having failed to seize the heights along the east bank of the Meuse in a rapid advance, the Fifth Army now had to fight its way through tangled terrain against a motivated and well dug-in defender. It continued to pound away at the French defences on the east bank with limited gains. In early March, the Fifth Army finally convinced Falkenhayn that an offensive also needed to be launched to seize the

high ground on the west bank, from which the French defenders were pouring artillery fire into the flank of those German units attacking on the east bank. This, too, failed, with heavy casualties.

Stung by the increasing losses, at the end of March Falkenhayn questioned whether the offensive should be ended. The Fifth Army argued that it should be continued, as it was achieving the goal of wearing down the French army. Indeed, from the German perspective, it appeared as if their attritional goal was being achieved, though poor intelligence meant that they had a far from accurate picture of the damage being inflicted upon Verdun's defenders. Falkenhayn believed that the French suffered five casualties for every two Germans. In early March, he assumed that the French had suffered more than 100,000 casualties. The reality was different. By March 15th, the French had taken nearly 70,000 casualties defending Verdun, but this had cost the German attackers more than 52,000 of their own.

WITH THE FIFTH ARMY COMMITTED to continuing the battle, Falkenhayn found it difficult to bring it to an end. It did not help that Kaiser Wilhelm II stated on April 1st: 'The decision of the war of 1870 took place in Paris. This war will end at Verdun.' Indeed, too many important reputations were linked to the offensive. It was clear that a loss at Verdun would pave the way for Falkenhayn's rivals, Paul von Hindenburg or Erich Ludendorff, to take his place as Chief of the General Staff. But the prestige of the Fifth Army was also heavily invested in the offensive. Part of the rationale for attacking at Verdun was for the German Crown Prince to be seen to be leading the German army to victory against the French. Failure at Verdun would taint the reputation of the future German emperor.
While not the only factor in continuing the attritional battle, the weight of these reputations was clearly significant in the decision to press on, despite rising costs and limited gains. In the end, the failure to produce the collapse of the French will to fight, combined with the pressures of a largely British relief offensive on the

Somme, the spectacular successes of an enemy thought destroyed on the Eastern Front and the entry of Romania into the war on the side of the Entente brought Falkenhayn's time as Chief of the General Staff to a close. With this came the end of the German offensive at Verdun.

However, this was not the end of the battle. French counteroffensives retook most of the terrain seized by the Fifth Army by the end of 1916. The battle had, of course, led to the anticipated attrition of the French army, but it did not have the expected result of destroying the French will to fight. Thanks to Pétain's system of unit rotation, 74 of the 98 divisions available to the French army in 1916 passed through the 'hell' of Verdun. Nearly 380,000 Frenchmen were casualties of the battle, including 62,000 dead. Despite its 'victory' at Verdun, the French army never really recovered. By September 1916, when the German offensive ended, the strength of the French infantry was 150,000 below its establishment and the lack of manpower meant that the French army would never again reach the size it had been at the start of 1916.

The German army also suffered heavily, though only 39 divisions took part in the battle, the equivalent of 30 per cent of the German army's strength. These units suffered about 337,000 casualties between February 21st and the end of 1916. The German army, though, was able to make good its losses, and even those suffered during the other defensive battles of the year, on the Somme and on the Eastern Front. Thanks to a deeper pool of manpower, the German army did not reach the peak of its strength until June 1917.

The broader impact of the battle is, though, difficult to quantify. The failure of the offensive resulted in Falkenhayn's dismissal and, with his departure, Germany lost its most imaginative strategist. His replacements, Paul von Hindenburg and Erich Ludendorff, failed to recognise that the rise of mass armies and trench warfare necessitated new approaches.
In 1918 they attempted to follow the same old strategic script that had failed in 1914, with predictable results.

The French 'victory' also lost some of its lustre. Pétain's unit rotation system may have enabled the French army to endure the battle in the short term, but it also ensured that three quarters of the army experienced the 'hell of Verdun'. This contributed to longer-term problems of morale that surfaced in the mutinies of 1917.

The large-scale participation in the battle, its relentless attritional nature and the lack of any clear conclusion also helped establish Verdun as the symbolic battle of the war for both the French and German peoples. While the Battle of the Somme has come to symbolise the First World War in the eyes of Britain and its former Empire, Verdun holds this place for Germany and France today. Indeed, over time, the shared experience of the Battle of Verdun came to play an important role in Franco-German relations. It was at the great Douaumont Ossuary, in which the bones of 130,000 unknown French and German dead are co-mingled, that the French President François Mitterrand and the West German Chancellor Helmut Kohl joined hands in a poignant gesture of reconciliation between the two former belligerents, during a 1984 memorial service to the fallen of the Battle of Verdun.

Robert Foley is head of the Defence Studies Department, King's College London and the author of The German Army in the First World War (Cambridge, 2017).

FURTHER READING Alistair Horne, The Price of Glory: Verdun 1916 (Penguin, 1993). William F. Buckingham, Verdun 1916: The Deadliest Battle of the First World War (Amberley, 2016). Malcolm Brown, Verdun 1916 (History Press, 2003). President Mitterrand (left) and Chancellor Kohl join hands at Douaumont Ossuary, 1984.

Robert T. Foley, German Strategy and the Path to Verdun (Cambridge, 2015).



South-East Asia’s ‘Golden Triangle’ dominated the world’s opium production during the 1980s. David Hutt reveals how a young soldier from north Burma took on the United States government to become the region’s most notorious drug lord.

The Opium King Poppy Culture: portrait of Khun Sa by an unknown artist, c.1990.

IN 1977, KHUN SA made the United States government an offer they could – and did – refuse. The Burmese drug lord put it to the Americans that, if they really wanted to stop heroin entering the country, they should buy his entire opium supply. They could do with it as they wished and he would have the money to support his people. Instead, the United States government indicted him for drug trafficking and slapped a $2 million bounty on his head, making him one of the world's most wanted men.

The United States' Drug Enforcement Administration estimated that, in 1990, 45 per cent of the entire global supply of heroin originated from the Golden Triangle, an area of South-East Asia encompassing eastern Myanmar (then known as Burma), northern Laos and Thailand. At the height of his power during the late 1980s, Khun Sa is believed to have controlled as much as 70 per cent of the heroin leaving the Golden Triangle. In a 1977 interview with the now-defunct magazine Bangkok World, the rarely modest Khun Sa dubbed himself the 'King of the Golden Triangle'.

The 'King' he might have become, but Khun Sa was from a modest background. Born on February 17th, 1934 in a small hamlet in Burma's northern Shan state, close to the Chinese border, his mother was from the indigenous Shan ethnic group and his father was a Chinese soldier who fought for the nationalist Kuomintang. In 1947, as British colonial rule came to an end in Burma, the Shan were promised autonomy. However, following the military coup by General Ne Win in 1962, which brought an end to Burma's flirtation with republican democracy, the new hardline nationalist government ended the Shan's hope of self-rule. Khun Sa followed in his father's footsteps and fought for the Kuomintang from an early age, before throwing his weight behind Burma's new nationalist hardliners. In 1963, he established his own militia loyal to Ne Win, an outsourced army that fought against Shan separatists.
Instead of being paid in money or provisions, Khun Sa received a concession to use state roads and facilities for

drug trafficking. It was at this time that he began combining military aggression with drug trafficking, something he would continue to do for the rest of his life. With the backing of the Burmese government, it was not long before his opium trading had expanded to such an extent that he came into conflict with his former allies, the Kuomintang, who had held a near monopoly on the illicit flow. This erupted in a brief drug war between the two groups as Kuomintang forces attempted to disrupt shipments into Laos by Khun Sa's militia.

However, the trade was far from a one-sided affair at this time. As the historian Alfred W. McCoy documented in his influential book, The Politics of Heroin in Southeast Asia (1972), where the movement of drugs was concerned, the demarcations between illegality and international conspiracy were far from clear. McCoy's most notable suggestion was that, rather than being the leading crusader in any proto-War on Drugs, the United States actually engaged with and promoted the trade, primarily as a means of funding the anti-communist Kuomintang. Indeed, reporting on the heroin produced in the Golden Triangle during the mid-20th century, McCoy wrote:

Opinions are divided as to whether his reputation was justified. He was a great cultivator of his image, granting bellicose interviews to the press, where he was described as a professional drug smuggler and the only Shan warlord capable of transporting large quantities of opium. He was, in the verdict of McCoy, 'the first of the Golden Triangle warlords to be worthy of his media crown as "kingpin"'.

Not everyone agreed with this version of Khun Sa. Bertil Lintner, an expert on Myanmar, interviewed him many times and described him as an illiterate peasant, a frontman for an ethnic-Chinese dominated organisation. 'He was basically a country bumpkin … He was a peasant and never the brains behind the organisation', Lintner claimed.

Like many drug lords, Khun Sa's reputation was ambiguous. For some, he was a hero in the mould of Robin Hood, nobly struggling for his own people against a tyrannical government. Others considered him an opportunistic criminal, whose claim of social justice masked the ruthless, bandit nature of his militia, which commandeered children from villages and forced peasants to cultivate poppies. Khun Sa always claimed that the Shan people were his first priority and that he only sold opium to fund their autonomy movement. However, as the Economist wrote on his death in 2007, 'the Shan themselves thought him just another drug warlord, and half-Chinese anyway'.

For some, Khun Sa was a hero in the mould of Robin Hood ... others considered him an opportunistic criminal

It is transported in the planes, vehicles, and other conveyances supplied by the United States. The profit from the trade has been going into the pockets of some of our best friends in Southeast Asia. The charge concludes with the statement that the traffic is being carried on with the indifference if not the closed-eye compliance of some American officials, and there is no likelihood of its being shut down in the foreseeable future.

What's more, it was widely believed that the CIA's covert activities in Burma actively fuelled the operations of Khun Sa and his forces.

Prince Prosperous
By 1967, however, the Kuomintang's conflict with Khun Sa had come to an end with his defeat. Following this, his relations with the Burmese government soured, leading to his imprisonment from 1969 to 1974 in a jail in the country's then capital Yangon (formerly Rangoon). Five years in prison, however, did nothing to deter him. Within two years of being released, Khun Sa had established a base in northern Thailand, where he founded the Shan United Army, a militia group that agitated for Shan independence – something he had previously fought against – as well as carving out, often brutally, its own place in the region's heroin and opium smuggling trade. It was around this time that he also adopted his popular nom de guerre: Khun Sa translates as 'Prince Prosperous' in the Shan language. He had been born Chang Chi-fu.

Despite clashes with the Thai army, by 1975 the Shan United Army boasted more than 10,000 soldiers and controlled much of the Thai-Burma border. It would later spread across most of Burma's Shan State, a region that borders northern Thailand, western Laos and the southern part of China's Yunnan province. This was to be the peak of Khun Sa's power. It was at this moment that he offered the United States government his entire opium crop and, in doing so, became an internationally infamous drug lord.

The drugs don’t work On January 7th, 1996, Khun Sa was to surrender to the Burmese government for the last time. In his stronghold of Ho Mong, a town of 6,000 people in the Shan State, the Burmese army arrived at his villa, nicknamed the ‘White House’, though his departure from the stage was not one conducted in a blaze of glory. By 1995, the United States’ Drug Enforcement Administration had infiltrated his connections with trafficking brokers, seriously denting his revenue. With money drying up, his own soldiers began to mutiny. Surrendering seemed the best option. And it worked for Khun Sa. Instead of imprisonment, he was provided with a mansion in Yangon, where he lived with his four Shan wives. The Burmese government even denied the United States’ request for extradition. The last ten years of his life were of relatively peaceful retirement. He told the press he had become ‘a commercial real-estate agent with a foot in the construction industry’. He also invested in the country’s booming ruby and jade mining industry. On October 26th, 2007, at the age of 73, Khun Sa died in a Yangon hospital. In something that could be seen as a bitter tribute, the death of Burma’s most notorious drug lord occurred at the same time as the country’s share of the heroin trade plummeted. Afghanistan was to take over its mantle as the poppy-growing centre of the world. A decade on, however, with the ghost of Khun Sa still lingering, the fortunes of Myanmar’s opium producers are on the rise again. Cultivation peaked in 2013 at 143,000 acres, providing a new generation of warlords a chance to compete with Khun Sa’s infamy. David Hutt is a journalist based in Phnom Penh, Cambodia.



BOOKS

'Once they have burned books they will end up burning human beings'
Heinrich Heine, 1821

As the holders of both our cultural and personal memories, books seem sacred and their destruction, no matter the cause, is always shocking, writes KENNETH BAKER.

THE EARLIEST RECORDED BOOK BURNING took place in 213 BC, when the Chinese emperor Qin Shi Huang, having conquered the other warring states, set about shaping a new nation by creating a common currency, a common language and a common set of weights and measures. To protect this new state he ordered the building of a Great Wall and, to protect himself after death, he had built a great army of terracotta warriors. He also ordered that all the books that carried the collective memory, all the history and traditions of his new subjects, should be burnt: for good measure he killed 460 scholars, probably by burying them alive. In the Cultural Revolution of


Previous page: SA officers carry books from a bookshop to be burnt, Hamburg, 1933. This page: the execution and burning of Confucianist intellectuals and their books under Emperor Qin Shi Huang. Silk painting, 17th century.

‘of all man’s instruments the most wondrous is, without any doubt, the book ... it is the extension of memory and imagination’


The burning of Jews and heretics. Woodcut from the Nuremberg Chronicle, Germany, 1493.

1966-76, Mao Zedong ordered that 'old ideas, old customs, old habits, and old books should be burned'. After the fires and the killing of 46,000 scholars, Mao went on to boast: 'We have surpassed Qin Shi Huang by a hundred fold.'

Books are the essential cornerstone of every civilisation and so we see their destruction as a desecration. In the Abbasid Caliphate of the ninth century the great Arabic scholar Al Jahiz wrote:

The composing of books is more effective than building … for there is no doubt that construction eventually perishes and its traces eventually disappear while books handed down from one generation to another and from nation to nation remain ever renewed.

John Milton, in his great pamphlet on free speech, Areopagitica, wrote that 'a good book is the precious life-blood of a master's spirit, embalmed and treasured up on purpose to a life beyond life'. In the 20th century, Jorge Luis Borges noted that 'of all man's instruments the most wondrous is, without any doubt, the book ... it is the extension of memory and imagination'.

The key word here is 'memory'. Books form the collective memory that any conqueror, dictator or fanatic seeks to destroy. George Orwell's totalitarian government in his novel 1984 had a department whose duty it was to collect the books that formed the written record of the past, to be burnt in secret furnaces.

The power of religious or political regimes derives from their leaders' absolute conviction that what they are doing is right. This moral certainty has driven many religions and regimes to impose belief, to compel obedience, to censor and burn books, to spy on dissidents, to imprison, to torture and to kill. Such certainty lay behind the Inquisition in Spain, Calvin in Geneva, Stalin in Russia, Hitler in the Third Reich, Mao in China, the Stasi in East Germany, the generals in Brazil and Argentina and, today, the Taliban, al-Qaeda and ISIS.
Just as books can represent the unwanted shared memory of a culture, so, too, can its people, and the silencing of their voices has often taken place alongside the destruction of books. Not a single person was burnt

at the stake in Europe for over 600 years following the fall of the Roman Empire, until Robert II of France, known as the Pious, ordered 16 heretics to be burnt at Orleans in 1022. This established burning as the punishment for heretics and witches until the last suspected witch was burnt in northern France in 1835.


IN 1480 FERDINAND AND ISABELLA established the Spanish Inquisition. It started by burning copies of the Talmud and Torah and then burnt 700 Jews between 1481 and 1488. When the Conquistadors came into contact with the Aztecs in 1519, they realised that they had stumbled upon a sophisticated civilisation with imposing buildings and substantial wealth in gold and books. It had to be destroyed, for it was pagan, and so Franciscan monks organised bonfires where books, written records, pictures, idols and traditional ceremonial clothes were all consumed by flames.

Such religious motivations for burning are not uncommon. William Tyndale had fled to the Netherlands, where he thought he would be safe to publish his English translation of the Bible, but he was betrayed and burnt at the stake in 1536. His last words were: 'Open the King of England's eyes.' Mary Tudor was determined to reverse the Reformation and, during her short reign, 286 Protestants were burnt, including 56 women. When Wolsey and Pygot were burnt at Ely they were clutching English Bibles in their hands. On the other side of the religious divide, in 1553 Calvin justified the burning in Geneva of Servetus, a biblical scholar who challenged the whole concept of predestination. He was burnt with what was thought to be the last copy of his book chained to his leg, although in fact three copies have survived. Calvin also condemned to the flames anyone who supported Servetus.

In recent years Islam has excised unwelcome voices. The Satanic Verses, Salman Rushdie's 1988 novel, was burnt in Bradford and London and he was subjected to an Iranian fatwa, which encouraged devout Muslims to kill him. Only this year the price on his head has been


Above, from left: 'The Martyrdom of W. Wolsey & R. Pygot at Ely', illustration from Foxe's Book of Martyrs, c.1703; a girl stands in front of a poster of Ayatollah Khomeini in Beirut, February 1989. Opposite: The Burning of the Books or St Dominic de Guzman and the Albigensians, by Pedro Berruguete, 15th century. In 1207 St Dominic burnt a Bible alongside a document from the heretical Cathar religion. A page from the Bible flew up to the rafters, while the Cathar article was destroyed by the flames. It signalled the beginning of the Albigensian Crusade.

increased by $600,000 by a group of Iranian zealots, a shocking move as Rushdie had committed no crime. The jihadi terrorists who invaded the offices of Charlie Hebdo magazine in Paris in 2015 shouted 'Allahu Akbar' as they killed 12 people, among them five of the magazine's cartoonists.

The most infamous burning of books in the 20th century was the bonfire organised by Joseph Goebbels in the Opernplatz, Berlin, on May 10th, 1933. Bands were laid on to accompany student songs, torches were provided for the marchers, news cameras had been summoned and Goebbels was billed to make a speech at midnight:

It is a strong, great and symbolic performance, a performance which should document for all the world: here the spiritual foundations of the November Weimar Republic sink to the ground. But out of these ruins there will arise the phoenix of the new spirit ... the past lies in the flames ... today under this sky and with these flames we take a new oath: the Reich and the Nation and our Leader, Adolf Hitler, Heil! Heil! Heil!

He was using fire not just to cleanse and purify German culture but also to destroy Marxism, socialism and trade unionism. To obliterate, in Hitler's own words, an 'un-German spirit'. One of the most shocking aspects of this particular book burning was that its ringleaders were not looters or thugs, but students, egged on by their professors. The general reaction of the world's press was of amazement rather than anger. It was dismissed as the childish action of witless youths, an act of stupidity rather than an exercise in evil, and it was left to the cartoonists in Europe and America to express the true horror of what had happened.


NOT ALL BOOKS ARE BURNT intentionally or with malice and there have been numerous unfortunate losses. The maid of J.S. Mill, for example, used the first hand-written volume of The French Revolution by Thomas Carlyle to light a fire. Then there are the libraries burnt as a result of war: the Alexandrian library was burnt three times; the university library of Louvain was destroyed in each world war; and the Serbian leader, Slobodan Milošević, ordered the destruction of the National Library of Sarajevo in 1992.

Perhaps overlooked when we consider book burnings are the writers, usually novelists or poets, who burnt their own works. Thomas Hardy was one of the most assiduous burners and his gardener was kept busy at Max Gate in Dorset, destroying a vast amount of material, including the early drafts of his poems and the notebooks he always carried with him to note down interesting events, characters and poetic phrases – just one of which survives. Many authors destroy material they think is simply not good enough: Jorge Luis Borges remarked of his early books: 'If the price wasn't too high I would buy and burn them.' Graham Greene came to dislike his first book, Babbling April, a collection of poems, and destroyed copies. Others, like Hardy, did it because they did not want some future scholar digging into the privacy of their lives. In characteristic style, he commented: 'If all hearts were open and all desires known – as they would be if people showed their souls – how many gaspings, sighings, clenched fists, knotted brows, broad grins and red eyes should we see in the market place.'



Joseph Goebbels gives his speech at the Opernplatz book burning, Berlin, May 10th, 1933.

To protect her husband Richard's posthumous reputation, Lady Isabel Burton burnt his latest translation of The Scented Garden, with its infamous 'Terminal Essay' on pederasts. She sent a letter to the Morning Telegraph explaining that she did it out of fear that her husband might be considered a homosexual: 'Sorrowfully, reverently and in fear and trembling I burnt sheet after sheet until the whole volume was consumed.' Philip Larkin, just three days before he died, asked his secretary and lover, Betty Mackereth, to destroy his diaries. She had a peek at some of them: 'They were very unhappy – desperate really.'

Amid this history of burning there are some lucky escapes. Virgil's request that the Aeneid should not be published after his death, since he had not completed it, was countermanded by the Emperor Augustus. Franz Kafka died at 40, his only notable published work being Metamorphosis, and he gave explicit written instructions to his friend Max Brod to burn everything left behind, which included his unfinished novels, The Trial and The Castle. Brod ignored the instruction and published everything Kafka had left. Vladimir Nabokov also left an incomplete and fragmentary novel, The Original of Laura, and gave clear instructions for it to be burnt after his death, but this was ignored by his son, who eventually published it in 2009. The book did nothing for Nabokov's reputation but it did occasion a debate about whether such final directions from the grave should be followed: Tom Stoppard said 'burn it'; John Banville said 'save it'.

BEFORE THE 15TH CENTURY it was just possible to destroy every copy of a book in circulation – the limitations of copying by hand meant numbers were comparatively low – but the invention of the printing press put an end to that. It has been estimated that in the whole of the 14th century clerical scribes across Europe produced a little over 2.5 million books. In the 1550s, the printing presses of Europe produced that number in just one year. This led Pope Paul IV in 1559 to publish the Index of Forbidden Books, which inevitably was a failure.

In the electronic age it is impossible to destroy information. The 'delete' button does not delete; the material will turn up somewhere else. Censorship is possible, as China has shown, but it is expensive, elaborate and almost certainly not comprehensive. The burning of books has become technically futile but it survives as a symbol to impress the naive, warn the dissenter or rally the faithful.

A previously unpublished poem by Ted Hughes perfectly encapsulates the power of burning books. Here is an extract:

Where any nation starts awake
Books are the memory. And it's plain
Decay of libraries is like
Alzheimer's in the nation's brain.

And in my own day in my own land
I have heard the fiery whisper: 'We are here
To destroy the Book
To destroy the rooted stock of the Book and
The Book's perennial vintage, destroy it
Not with a hammer not with a sickle
And not exactly according to Mao who also
Drained the skull of adult and adolescent
To build a shining new society
With the empties ...

Ted Hughes, July 1997

Kenneth Baker is a former Cabinet minister, an expert on political cartoons and the author of On the Burning of Books (Unicorn Press, 2016).

FURTHER READING Haig Bosmajian, Burning Books (McFarland, 2012). Lawrence Hill, Dear Sir, I Intend to Burn Your Books: An Anatomy of a Book Burning (University of Alberta Press, 2013). Rebecca Knuth, Burning Books and Levelling Libraries: Extremist Violence and Cultural Destruction (Praeger, 2006).

CECIL BEATON Right: Cecil Beaton, self-portrait, c.1928. Below: Cecil Beaton, self-portrait, early 1920s.


‘What does Cecil Beaton look like? Well, I think he’s very distinguished looking with aristocratic features, an aquiline nose, lots of cartilage and good bone. Rather hideous mouth. Very nice hands. Tall; very good length of leg, which is something that’s most important, especially as one puts on a good deal of weight – Michelin tyres – round the middle. I don’t think good looking at all. I mean, it’s not the sort of looks that I would like to have at all; I’d like to have a much more wart face, a mushroom face, an embryo face. I don’t want to be distinguished. I hate that.’ This self-description, at once caustic and congratulatory, reveals much about the character of Cecil Beaton, which could be simultaneously arrogant and anxious, and the emphasis he placed on appearance. As an international photographer, costume designer, author, diarist, would-be playwright and serial socialiser, Beaton used his appearance, his clothing in particular, to proclaim his professional achievements and to protect himself from the biting comments of critics, who seemed ever-present in the world of arrivistes and aesthetes in which he moved. But Beaton’s attitude to his appearance was not unique. Born in 1904, too young to have participated in the Great War, he, like many of his contemporaries, was keen to leave behind the thinking of earlier generations, which appeared old-fashioned and, with the failure of the League of Nations and

The photographer, designer and aesthete Cecil Beaton brought a distinctly historical awareness to the realm of fashion, as Benjamin Wild explains.

Through a glass, lightly


Neville Chamberlain’s policy of appeasement during the 1930s, errant. The wearing of avant-garde styles was one of the ways in which young people could demonstrate their desire for change during the interwar years. As the French poet and flâneur Charles Baudelaire observed of the dandies who had sashayed onto London’s streets in the 19th century, new styles of dress were more likely to appear at times of social transition and unease. The history of dress provides many examples of people changing their garments during moments of acute strife in response to a reassessment of their personal and professional roles: Christian Dior’s ‘New Look’ for women, unveiled in 1947, is a frequently cited example. While it can be misleading to connect sartorial innovation too directly with social and economic developments, Patricia Mears and G. Bruce Boyer reflect in their study of interwar fashions, Elegance in the Age of Crisis (2014), that ‘in times of such crisis, various aspects of culture come to assume hyper-importance’, dress not least.


Between the First and Second World Wars, the most distinctive dressers in Britain were the Bright Young Things, whom Beaton worked hard to befriend. The sons and daughters of aristocrats and entrepreneurial pioneers, these pleasure-seeking youths shocked London with their late-night carousing, providing an endless supply of hilarious and hard-to-believe stories for newspapers and magazines in the years prior to the Wall Street Crash and the Great Depression; think Great Gatsby-style parties on a smaller, cheaper, less choreographed scale. Beaton parted with the ‘It’ crowd in the late 1920s, when his photographic work took him to the US, but throughout his life he continued to dress in a way that would be lauded and lambasted in almost equal measure. In this regard, Beaton was quite different from his contemporaries, for when the responsibilities of adulthood and the realities of work came into conflict with leisured youthful existence, many of the Bright Young Things – those not ravaged by debt and self-induced illness – lived quiet, unremarkable lives. By contrast, Beaton continued to behave and dress in a way that shocked.

The actor and playwright Noël Coward thought Beaton’s clothing ‘conspicuously exaggerated’ because he tended to match the colour of his socks, tie and pocket square. Coward was concerned that this provocative clothing would fuel suspicions about Beaton’s sexuality. Homosexuality was a crime and, more damningly, a social stigma that many people in the creative industries of this period were fated to endure, not least Coward himself, whose trademark

Far left: Beaton, dressed for a Cambridge Footlights production of All the Vogue, 1925. Left: David Bowie, 1974.


turtle-necks were considered equally provocative. The writer Beverley Nichols advised Beaton to stop wearing cosmetics, a popular trend among Britain’s Bright Young Things, but alien to many in the US, where Beaton was trying to establish himself. Criticism of what Beaton wore was tempered by admiration. University friend and literary critic Cyril Connolly described Beaton as ‘Rip Van With It’. Connolly was no style aficionado – anything but – though he had a gift for bons mots. His appellation encapsulates how Beaton pushed at society’s sartorial seams to create a look that enabled him to achieve personal and professional distinction. The greatest acclamation of Beaton’s style came in 1970 when he was named on the International Best-Dressed List, along with some of the most revered dressers of the day, including the couturiers Pierre Cardin and Hubert de Givenchy, the president of Fiat, Giovanni ‘Gianni’ Agnelli, and fellow photographer Norman Parkinson. In part, Beaton’s challenging appearance and apparel stemmed from his desire to escape, if not erase, what he termed his ‘collar and tie’ upbringing, which was conservative and a little too middle class considering his aspiration to become a feted photographer and society stalwart; his father was a timber merchant.

Beaton wearing a seersucker suit by Savile Row tailors Anderson & Sheppard, c.1936.

Beaton also used his clothing to project an image of confidence and success to facilitate the process of becoming professionally and personally successful. To this end, he kept abreast of new styles in menswear and often incorporated them into his wardrobe. The story of Beaton’s evolving style is therefore interesting and important because his concern to stay ‘on trend’ reflected the often dramatic changes in the silhouette of men’s dress before and after the Second World War. When Beaton was born in 1904, the memory of the dandies who had strolled along London’s streets, largely in deference to the past, was probably still vivid. In 1980, when Beaton died at the age of 76, Mods, Punks and New Romantics caroused in coffee houses and clubs and looked to the future. As a student at Cambridge, Beaton dressed to be noticed. On his first day at St John’s College he wore an evening jacket, red shoes, black-and-white trousers and a large cravat. In addition to the cosmetics he wore on his face,



he put an ointment in his hair to keep it shiny and smooth. He often wore gloves and liked co-respondent shoes, low-heeled brogues or oxfords, made in two contrasting colours of leather. These clothing choices were atypical and daring. The co-respondent shoes would have been particularly striking because of the negative connotations they aroused: the name ‘co-respondent’ was taken from the law courts, where it referred to a man who was conducting an affair with a married woman. By wearing co-respondent shoes in his early twenties, Beaton was being deliberately provocative. He also had a penchant for traditional Austrian clothing, characterised by short jackets with enlarged lapels and knee-length shorts, all heavily embroidered. Austria had become a popular destination for well-heeled Europeans during the interwar years and Alpine-style garments were much in vogue before the Second World War (less so during!). If dramatic clothing styles helped Beaton to get noticed and to get a foot in the door, they proved problematic when his photographic career began to take off in the early 1930s, after he secured a lucrative contract with the owner

Beaton on the set of Donald Cammell and Nic Roeg’s film, Performance, with its stars, James Fox and Mick Jagger, 1968.

of Vogue magazine, Condé Montrose Nast. In America, Beaton’s flamboyant style was frequently criticised. In response, he began to confine the more romantic and daring clothing that had characterised his Cambridge years to Ashcombe, the Wiltshire home that he rented between 1930 and 1945. In public, Beaton’s look became more disciplined, although never demure.


To bring order and maturity to his wardrobe, Beaton turned to London’s Savile Row. Beaton had accounts with several of the Row’s tailors, including Anderson & Sheppard, Huntsman and Sullivan & Woolley (now part of Henry Poole & Co). He first visited the Row in October 1934 on the recommendation of Johnnie McMullin, a writer for British Vogue. Before this, Beaton’s diaries reveal that he regularly received items of clothing – including suits – and dress accessories from family members on the occasion of his birthday and at Christmas. The relationship that developed between Beaton and his tailors from the early 1930s was important in the expression of his style, for despite becoming a

successful costume designer – he won Academy Awards for his costume designs for the Hollywood films Gigi (1958) and My Fair Lady (1964) – he never learned how to make clothes. Cooperation did not preclude conflict, however. As Beaton changed his wardrobe to suit the times, to demonstrate his continuing relevance and to detract from the signs of old age, he felt aggrieved that his London tailors did not. In 1965, at the age of 61, he lambasted Savile Row for what he perceived to be its sartorial lethargy:

It is ridiculous that they go on turning out clothes that make men look like characters from P.G. Wodehouse. I’m terribly bored with their styling – so behind the times. They really should pay attention to the mods ... the barriers are down and everything goes. Savile Row has got to reorganise itself and, to coin a banal phrase, get with it.

This sartorial spat, like so many in Beaton’s life, was short-lived and he returned to the Row after only a few months, having chastised himself for being ‘foolish enough’ to buy

Beaton in a Salzburg jacket before the ‘hands’ of guests at Ashcombe House, Wiltshire, 1934.

suits from the house of Pierre Cardin in Paris that cost twice as much as those from Savile Row. Four months later, on October 8th, he bought his first suit, a green worsted three-piece, from Huntsman. It is interesting that Beaton signalled his return to the Row by visiting one of its more expensive tailors. It is possible that he was lured by the shop’s cutters, particularly the head cutter Colin Hammick and Robert Andain-Holt, who were subtly, but no less surely, bringing fresh ideas to the Row’s tailoring traditions; while Hammick may have been responsible for popularising the (now ubiquitous) wearing of blue shirts with suiting, Andain-Holt designed Beaton a sufficiently distinctive outfit with high-waisted trousers, asymmetrical cummerbund and a lapel-less jacket. Far from demonstrating that Beaton merely kept pace with contemporary menswear trends, this suit would have done much to establish the sexagenarian as a sartorial pace setter. But looks could be deceptive. The Huntsman suit, which is not known to survive, indicates how Beaton’s requirements from his wardrobe changed as he aged. The desire to shock and awe was still present, but the high-waisted


trousers were almost certainly a concession to comfort; around this time, he was also ordering waistcoats with longer back panels, presumably to provide greater warmth or greater discretion as he bent down to take photographs. Beaton’s contrasting sartorial decisions of this period stemmed from personal insecurities. In January 1971, at the age of 67, he wrote in his diary:

I still try to battle against all physical odds, and to try to wear clothes that are sufficiently attractive and unusual to take people’s eyes off the horror they camouflage. And someone told me a day ago that I had been counted as one of the best-dressed in a ‘list’ compiled in the USA. But what’s the point of my ... ordering a new suit if it has to be worn with a cherry on the tip of my nose?

As a result of the oestrogen supplements that he was taking following a prostate operation, Beaton’s upper body and face broke out in freckled spots in the late sixties, hence the reference to a ‘cherry on the tip of my nose’. This must

Above: Beaton dressed for the ‘Ascot’ scene in the musical My Fair Lady, whose costumes he designed, New York, 1956. Right: with Andy Warhol at the artist’s Factory, New York, 1969.

have been especially galling for a man so sensitive about his appearance. The adoption of modern styles convinced contemporaries of his continuing relevance and Beaton appeared to move easily among a new generation of creative talent, which included Mick Jagger, Andy Warhol, David Hockney and the photographer David Bailey, who believed that Beaton could ‘fit into any time’ because he was adaptable, like a chameleon. In the early 1970s Beaton commissioned trousers that were belled, or flared, to be two inches wider at the hem, clearly in line with the latest clothing trends. His diaries, however, published in a highly edited form from 1961, reveal that new menswear styles agitated him. Beaton may have criticised the Row for its traditionalism, but in his private writing he lamented the passing of Edwardian glamour, which was being challenged by a new generation of ‘beatnik teenagers’, ‘peasants and roughnecks’, clad in ‘sandals and blue jeans’. Beaton’s critical reaction to the very latest styles may explain why he adopted a more

bohemian style of dress in his late sixties, characterised by open-necked shirts, worn with scarves and nylon hats. Headwear helped Beaton to hide his thinning hair, which he had always cherished, but there is a more general sense that he was repudiating styles of clothing that he felt were being adopted without appreciation. Beaton’s response to postwar fashion probably explains why he singled out few people whom he admired for their dress and why his own approach to dressing is known to fewer, perhaps more ardent, style aficionados. But things are changing. For champions of dandyism and vintage clothing, Cecil Beaton is increasingly invoked as somebody who successfully combined clothing from different countries, periods and styles. In light of his enduring enthusiasm for historic styles, it is appropriate that interest in his wardrobe has been stimulated by a similar appreciation of clothing styles from the past. The renewed appreciation of vintage vogues has, in large part, been engendered by recent economic upheavals.


Since the economic crisis of 2008, social and sartorial commentators have observed that many men have changed how they dress, either by reverting to traditional garments that convey authority and professional accomplishment – demonstrated by a renewed interest in classic tailoring – or by eschewing the suit and adopting a softer and more relaxed silhouette, with bolder colours and contrasting textures. The two styles are not strictly dichotomous and, in not a few cases, they have been combined, producing unusual and exciting contrasts, for example, the use of richly textured and brightly decorated fabrics in formal tailoring. What is clear, though, is that more men are looking to the past for sartorial inspiration. They appear to believe that by reviving the fashions of their fathers and grandfathers, they will obtain, or at least project, the confidence and certainty of the men who wore them. In this, their attitude towards dress is

reminiscent of the Bright Young Things and the 19th-century London dandies who caught the eye of Baudelaire. While social changes have created circumstances in which people might look more appreciatively and enviously on Cecil Beaton’s wardrobe, the appeal of his clothing has long been acknowledged by couturiers and designers. The British fashion designer Giles Deacon describes Beaton as a ‘phenomenal creative force’, who combined ‘lovely British wit and a sense of craftsmanship’. Savile Row tailor Richard James’ Spring/Summer 1990 collection was directly inspired by Beaton’s wardrobe. One of the highlights, which paid homage to a fancy dress costume he wore in 1937, was a Nehru-style jacket in pink raw silk. The four-button jacket was decorated with yellow silk organza appliqué roses and green silk embroidery. The survival of Beaton’s wardrobe in museums on both sides of the Atlantic is a consequence of a deliberate, if ill-documented, attempt he made to preserve his style. It is possible he was inspired by those friends who had (sometimes inadvertently) given their clothes to the Victoria and Albert Museum, London, following his ‘Fashion: An Anthology’ exhibition in 1971. The decision to part with his clothing, particularly, in 1974, his Austrian clothes, may have been galling, but the stroke he suffered in the same year presumably gave his plans greater urgency. In 1976, Beaton agreed to auction his photographic archive, a decision Alistair O’Neill suggests was conceived to ‘secure an income and [cover] the cost of care for Beaton’. Beaton’s belief that his wardrobe was worthy of becoming part of permanent museum collections in London and New York hints at arrogance, but it also demonstrates his awareness of how dramatically menswear had changed during his lifetime and an acknowledgement that, however fast fashion’s wheel revolved, new styles would always pay homage to the past.
As he observed in his 1954 book, The Glass of Fashion:

There is nothing new under the sun, and in art as in evolution, each new manifestation is merely the last link in a chain that stretches back to the beginnings of time.

Beaton’s assessment was as true then as it is today. In bequeathing his wardrobe to future generations, his condescension was therefore tempered by a concern that we recognise the cultural and historical meaning within our clothing, as he did in his, and be aware that, however distinctive the dresser, what we wear is always a product of the society in which our clothes are conceived, created and consumed.

Benjamin Wild is a consulting lecturer at Sotheby’s Institute of Art, London, and the author of A Life In Fashion: The Wardrobe of Cecil Beaton (Thames & Hudson, 2016).



Hugo Vickers, C. Beaton, The Glass of Fashion: A Personal History of Fifty Years of Changing Tastes and the People Who Have Inspired Them (Rizzoli, 2014). Hugo Vickers, C. Beaton, Portraits and Profiles (Frances Lincoln, 2014). D.J. Taylor, Bright Young People: The Rise and Fall of a Generation: 1918-1940 (Vintage, 2008).

Portrait of the Author as a Historian The Booker Prize-winning writer eschewed autobiographical novels for historical fiction in a bid to resolve the porous distinction between objective and subjective history, writes Alexander Lee.

Faulty facts: Penelope Fitzgerald in 1999.

No.3 Penelope Fitzgerald Born: December 17th, 1916, Lincoln. Died: April 28th, 2000, London.


In 1875, Dante Gabriel Rossetti began a painting of Mnemosyne, the Greek personification of memory. It was an intriguing piece. Looking distractedly towards the viewer, Mnemosyne’s expression hovers between resignation and regret. In her right hand she holds a shining golden lamp, while in her left she grips the ‘winged chalice of the soul’ from which she has filled it. For Rossetti, the painting articulated the ambiguities of memory. As he asked: ‘Is remembrance the greater sorrow, or is it a pleasant room in a bitter hell?’ But the painting also had wider implications. Since Mnemosyne was the mother of the Muses and counted Clio (history), Thalia (comedy) and Melpomene (tragedy) among her daughters, it seems to suggest that the solace or suffering engendered by memory is the source of the poetic imagination on which fiction and history depend. As such, Rossetti’s Mnemosyne could be read as an illustration of Penelope Fitzgerald’s attitude towards writing the past in her historical novels. Coming to reject the positivism of the early 20th century, she embraced developments in contemporary scholarship to produce a blend of memory, history and fiction. It was not something to which she was instinctively drawn. When she was a student in the 1930s, history was dominated by those who regarded it as a science. By examining documentary evidence with ‘scientific’ detachment, they believed that they could attain a purely objective understanding of the past as it had actually been. Memory and literature – being ‘subjective’ – were derided as antithetical to all that history sought to achieve. Seduced by the promise of certainty, Fitzgerald found such ideas hard to resist. When she embarked on her literary career, a century after Rossetti had begun Mnemosyne, she was still in thrall to them. Her first works were biographies of Edward Burne-Jones (1975) and of her father, E. V. Knox, and his brothers (1977).
Coldly analytical, they were firmly in the ‘scientific’ tradition, never venturing beyond documentary sources. Even when writing about her own family, she refused to mention herself by name and avoided dwelling on her memories. Turning to fiction, she continued to view history as ‘different’. She was, admittedly, happy to use historical details and her own memories to add depth to her novels. In Human Voices (1980) she drew on her experiences as a staffer at Broadcasting House to lend credibility

to her description of wartime life at the BBC. But this did nothing to challenge history’s status as a ‘science’, distinct from fiction and remembrance.

Evidence of change
By the mid-1980s Fitzgerald wanted a change. Tired of novels and biographies, she felt the need to ‘journey outside [her] self’. Historical fiction offered exactly what she was looking for. Though entering her seventies, she wrote four books over the next decade. Fascinated by the lives of ordinary people confronting great changes, she threw herself into research with characteristic enthusiasm. But as she did so, she queried the assumptions on which ‘scientific’ history was based. Like many recent scholars, she was troubled by the nature of evidence. No matter how much material survived from a given period, it would never be more than a fragmentary and incomplete record. It was not simply that a great deal had been lost or destroyed over time; rather, the majority of human experience – especially the ordinary, the everyday and the humdrum – had never been recorded in the first place. But where did this leave history? Although Fitzgerald had believed that history’s purpose was to apprehend the truth about the past, she was no longer sure. If it was really ‘scientific’, surely it would be better to define its purpose as comprehending only the tiny portion of the past visible in surviving documents? As Fitzgerald noted, a parallel could be found in debates about atomic theory in the 1910s. In The Gate of Angels (1990), a young physics don, Fred Fairly, discusses whether sub-atomic particles were ‘real’ and, if not, whether they were a proper subject for scientific study. Fred is a ‘realist’. Even if particles were unobservable, their presence could still be inferred. And since science’s purpose was to provide a true description of the universe, they deserved to be studied. But his colleague, Prof. Flowerdew, is an ‘anti-realist’.
If sub-atomic particles are unobservable, they are merely a plausible idea. Believing that science’s purpose was to describe only the observable universe, he dismisses them as unworthy of study. The parallel is instructive. Just as Flowerdew’s anti-realism could not explain some phenomena, documentary history could not explain why certain events happened, much less say anything about unrecorded experiences. If history was to understand the past as it was, something besides evidence was needed.

One possibility was literature. In the mid-1970s, Hayden White pointed out that documentary shortcomings undermined the distinction between objective history and subjective fiction. Although documentary sources might contain some evidence about the past, they did not provide a framework that would allow sense to be made out of the fragments. The historian needed to craft a narrative out of the limited materials at their disposal; and this was as much a matter of imagination as anything else. Fitzgerald went one step further. All that the historian was doing was using imagination to connect the dots between bits of evidence; why shouldn’t she use hers to fill in the gaps where everyday life had been? After all, as Friedrich von Hardenberg (1772-1801) – better known as Novalis – had observed, ‘novels arise out of the shortcomings of history’.

Fittingly, it was Novalis who provided her with the opportunity to put this into practice. Captivated by his Romantic philosophy, but aware that her German was not good enough to attempt a biography, she set about writing The Blue Flower (1995) as an imaginative retelling of his love for Sophie von Kühn. Festooned with quotations from Novalis’ works, it was not a fictionalised account per se, but an attempt to rescue those ordinary moments omitted from the historical record and to get inside Novalis’ head more completely than any biography.

History from below
Another possibility was to be found in memory. Alarmed by the invisibility of ordinary people in documentary evidence, some scholars had come to believe that they should focus on how the past was constructed in the memories of living people. By examining the interplay between experience and recollection in oral testimonies or lieux de mémoire (sites of memory), historians like Jan Vansina, Pierre Nora and Luisa Passerini were able to see how ordinary people rationalised their lives and negotiated a sense of selfhood over time. On this basis, they endeavoured to write a history of memory ‘from below’. Fitzgerald recognised that she could use the theme of memory as a powerful means of exploring history’s hidden human dimension. Although framed as a story of love and disappointment, Innocence (1986) offered a powerful portrait of the social upheavals that took place in Italy in the mid-1950s. As Fitzgerald intimated, the economic miracle was about to begin; domestic migration and social mobility were increasing. Politics, too, was changing. The Italian Communist Party spurned the Soviet Union and the Christian Democrats gained control of the South. But it was happening too quickly. Shaped by life under Fascism and moulded by tradition, Italians struggled to reconcile past and present. This is explored through the relationship between Chiara Ridolfi, the daughter of an ancient, but impoverished, Florentine family, and Salvatore Rossi, a ferociously independent neurologist of southern peasant stock. Coming from such different backgrounds, their romance shows how much society had already changed and bodes well for the future. Yet they seem unable to make each other happy. They are imprisoned by their memories. Rossi is haunted by the memory of when his father – an ardent communist – took him to visit Antonio Gramsci in a prison hospital at the age of ten. It was a pitiable episode. Wracked by tuberculosis, Gramsci was not the working-class hero he had expected. Unsure how to behave, his father embarrasses himself. Repelled, Salvatore turns his back on his family, rejects communism and resolves to study medicine. Yet remembering that day, he feels guilty for abandoning his roots and for seeing Gramsci’s illness rather than his heart. As a result, he mistreats Chiara – the symbol of the privilege his father had always hated – even though he loves her. For Fitzgerald, it revealed how right Rossetti had been.
Quoting his couplet, she, too, queried whether memory was a blessing or a curse. It could make or mar our present and future, making our history a comedy or a tragedy. Either way, it fell to literature to remind us that, in the end, ‘we have each other’, something we need to remember now more than ever. Alexander Lee is a fellow in the Centre for the Study of the Renaissance at the University of Warwick. His book The Ugly Renaissance is published by Arrow.

Key works Offshore (1979), The Bookshop (1978), The Blue Flower (1995) Born Penelope Mary Knox, Fitzgerald’s literary career might be described as late-blooming. She started writing at the age of 60 and won the Booker Prize with her short novel Offshore in 1979, by which time she was a widow with three adult children. In doing so, she beat the overwhelming favourite, V.S. Naipaul’s A Bend in the River, in a result that shocked both the reading public and, according to one of those on the panel, Hilary Spurling, the judges themselves. While Fitzgerald’s early works largely drew on autobiographical material, she turned to historical fiction later in her career, beginning with Innocence in 1986. She produced nine novels before her death in 2000.

Chris Wrigley on Octavia Hill Emma Griffin on the rise of England’s middling sort Geoffrey Robertson on the origins of genocide and crimes against humanity


A medieval thinker who confounds modern stereotypes Robin Lane Fox’s wide-ranging and compelling study is a vivid evocation of Augustine’s travails, while Rowan Williams gives us a more contemplative offering.

I HAVE written self-indulgently, as I myself like to read about the past. I do not like the proper names of nonentities, numbered dates of unknown years or refutations of other men’s views … I am bored by institutions and I do not believe in structures. Others may disagree. So wrote Robin Lane Fox in his 1973 book Alexander the Great. I suspect he would not modify his words greatly now. His subsequent works, including Pagans and Christians (1986) and The Classical World: An Epic History (2005) have established him as a historian whose work cuts through the usual boundaries:

Greek versus Roman history, classical versus Christian, academic versus popular. His latest book, Augustine, sells itself on a similarly idiosyncratic prospectus: ‘There are many fine short books on Augustine … I saw no reason to write another, so I opted for a long book.’ There are many fine long books on Augustine, too, so what is the constant pulling power of the man? What does Lane Fox bring to the party? Augustine’s life (AD 354-430) spanned a formative historical period, coinciding with events that we see as separating the classical world from the medieval: the rise of Christianity and of

the uneasy relationship between secular and church powers in western Europe; the spread of monasticism; the increasing separation of the Greek Eastern Roman Empire from the Latin West; the encroachments of barbarians into the Empire and, in 410, into Rome itself. When Augustine died, in Hippo Regius in North Africa, a Vandal army was camped outside the city gates. Augustine is also a thinker who confounds modern stereotypes. That he embraced Catholicism because he found it fundamentally compatible with the findings of science, and not despite them; that neither he nor many

early Christian theologians saw any need to interpret the Book of Genesis literally; that sexual abstinence was not a peculiarly Christian fetish but an idea shared by other serious-minded people in antiquity – these are not new facts, but they still have the capacity to surprise. Lane Fox explicitly takes Augustine’s Confessions as the backbone to his own study. He sets out his work as a triptych, flanking Augustine with his contemporaries Libanius, the pagan orator and intellectual of Antioch, and Synesius, the Christian bishop of Ptolemais in Libya, and weaving events and

themes from their lives in with those of Augustine. The result is a book that is wide-ranging and sure-footed, from its vivid evocation of the travails of the late-antique schoolboy to its account of how Neoplatonism informs Augustine’s vertiginous speculations on time and memory, which conclude the Confessions. I particularly enjoyed Lane Fox’s summary of Manichaeism, the clearest guide I have read to this notoriously baroque religious system. He has a good nose for a racy anecdote too – but no spoilers here. There is also an excellent mini picture gallery with commentary. There are quibbles. Some are minor: Augustine would have winced at the suggestion that Christians ‘worshipped’ martyrs, though perhaps not all his flock would have done likewise. Some are bigger: I would question Lane Fox’s claim that Augustine

Lane Fox succeeds magnificently ... reminiscent of Peter Brown’s great works ... the same easy intimacy with the sources ... the same rich and lucid prose narrowed the scope of the term libido from the classical Latin ‘instinctive desire’ to ‘sexual desire’ and suggest that Augustine saw sex as the example par excellence of a desire that latches onto inappropriate objects to an inordinate degree. But overall this book succeeds, magnificently. I began as a sceptic. We had, I felt, excellent short introductions to the Confessions from Gillian Clark and Catherine Conybeare, among others; who would read a long one? Long before the halfway point I was convinced this is a compelling book, reminiscent of Peter Brown’s great works, especially his Augustine of Hippo (1967) and World of Late Antiquity (1972). There is the same easy

intimacy with the sources, the same rich and lucid prose. There is also the same unspoken assumption that we are all gentlemen and scholars, at home in a shared world of wider literary and historical allusion. Rowan Williams, too, has a serious pedigree as a historian of fourth-century Christianity, notably through his Arius: Heresy and Tradition (1987), a sympathetic study of the Alexandrian theologian, whose thoughts on the relations of the persons of the Trinity were to prove so very stimulating. His latest volume of essays, On Augustine, markets itself on a very different premise from Lane Fox’s. Williams writes as a believer; Lane Fox as a courteous outsider. Lane Fox writes as a historian; Williams states that ‘none of [his chapters] is primarily concerned with strictly historical or chronological issues’. And, unlike Lane Fox, Williams writes in a tradition which eschews the flashing phrase; his Augustine is someone who ‘reflects carefully on a central tension in the human condition between the fact that we have to begin all our thinking and praying in full awareness of our limited, embodied condition and the fact that we are summoned by our creator to go beyond limited and specific desire, reaching out to an endless abundance of life’ (in this connection, Lane Fox presents the Confessions as the result of an agonising bout of anal fissures, probably brought on by Augustine’s ascetic lifestyle). Paradoxically, then, Williams’ dehistoricised Augustine is not only less of his time, but also less of ours, at least as most of us experience it. Williams’ Augustine is a more contemplative, bookish type. He is much less the busy bishop and much more the Augustine of the postgraduate seminar in philosophical theology. Those with interests in that direction may find much in this book. Philip Burton Augustine: Conversions and Confessions by Robin Lane Fox Allen Lane 672pp £30 On Augustine by Rowan Williams Bloomsbury Continuum 224pp £25

This Orient Isle

Elizabethan England and the Islamic World
by Jerry Brotton
Allen Lane 384pp £20

THE Mediterranean world loomed large in English culture in the 16th century. What is made strikingly clear in Jerry Brotton’s new book, This Orient Isle, is the extent to which the Muslim inhabitants of North Africa and the Middle East played their part in this English enthusiasm for things Mediterranean during the reign of Elizabeth I. Close links between Muslim rulers and Elizabeth’s court, forged by English travellers – diplomats, merchants, captives and spies – to the courts of Marrakech, Istanbul and Isfahan, built the foundations for this enthusiasm. Many left exciting accounts of their experiences and these, now familiar from other studies of early modern English engagement with the Islamic world, form the basis of Brotton’s readable and engaging narrative. We read of Anthony Jenkinson’s audience with the Safavid Shah Tahmasp in Qazvin and of William Harborne’s appointment as Elizabeth’s ambassador at the court of the Ottoman Sultan Murad III. We follow the adventurer Anthony Sherley to Shah Abbas’ new capital at Isfahan and the Warrington-born Thomas Dallam as he played his clockwork musical organ before the Ottoman Sultan Mehmed III.

In This Orient Isle, Brotton offers a two-pronged approach, combining diplomatic and cultural history, as he links his narrative of these travel accounts convincingly with the increasing appearance of Muslim-inspired characters in plays of the period. For example, we learn that between 1576 and 1603 at least 60 plays were staged in London with Turks, North Africans and Persians among their characters. As Brotton shows, this may not have inspired much cultural understanding between the English and the Moroccans, Turks and Persians to whom they were introduced through the London stage, but clearly Shakespeare and Marlowe were not alone in drawing on their dramatic potential to highlight aspects of contemporary society.

Brotton suggests it was Protestant England’s relative isolation from its Catholic European neighbours that forced its rulers to seek closer relations with Muslim powers. However, what is absent is proper discussion of the Middle Eastern contexts for the emerging relationships. What did Ottoman sultans and viziers, let alone Safavid shahs, think that England, a seemingly powerless monarchy over a thousand miles away, had to offer them? Only very brief answers are given here, although Brotton provides a clearer picture of Moroccan interest in an English alliance. Although little of the travel and diplomatic history that Brotton discusses is new, at a time when it might seem that Britain’s horizons are narrowing somewhat, This Orient Isle is a timely reminder that England has rarely been free from some form of dependence on maintaining relationships with other peoples and powers around the world.


REVIEWS

Broken Idols of the English Reformation
by Margaret Aston
Cambridge University Press 1136pp £120

IN 1988, Oxford University Press published Margaret Aston’s England’s Iconoclasts. Vol.1: Laws Against Images, a book which developed Aston’s long engagement with the Lollards and their allergy to images and projected it forward through the first century of the Reformation. This latest work, not quite Vol. 2, Broken Idols of the English Reformation, completed just before Aston’s death, brings together her gifts as a historian, the depth and breadth of her scholarship and clarity of exposition. Any book this long, especially one that presupposes knowledge of a predecessor that looked in a chronological fashion at the medieval and continental-reformation background, is going to be hard work and readers may want to dip in for people, places or themes of particular interest to them.

The book is in three parts. The first examines the mental world of iconoclasts and is also very good not only on ‘official’ attitudes (sermons, etc) but also on ‘popular’ participation and initiative. Thomas Cromwell’s love of burning images rather than smashing them (cauterising festering blasphemous wounds in the Body of Christ) is just one point very well brought out; but the zeal that maimed and disfigured images (while leaving headless torsos as reminders) is shown to be a powerful trigger for Laudian ‘beauty of holiness’ and a renewed preoccupation with architectural evocations of Solomon’s Temple.

The second part offers case studies of iconoclasm and its limits. There is a chapter on saints and the very different fates of the cult of good St George and bad St Thomas Becket, every image of whom was to be eradicated, every reference in service books expunged and the story of his martyrdom airbrushed out of official histories. The tomb itself – ‘a treasure trove of gold and jewels’ – was sacrificed on the altar of Henry VIII’s greed. There is a chapter on bells and organs, silenced and melted down, and another on the destruction of images and representations of the Trinity.

In the final part, the first chapter focuses on the destruction of stained-glass windows (destruction, that is, of many but by no means all of them) and the second examines many debates about the Cross of Jesus (and more especially the crucifix) in their various forms (including signing with the cross). It culminates with a gripping account of the desecration and restoration of Cheapside Cross in London during Elizabeth’s reign and its final nemesis in the first months of the Civil War of the 1640s. The book ends with a chapter that examines new and refined ways of deploying words to replace images as a way of capturing the hearts and minds of those on the margins of literacy.

This is a book of deep learning, if a little shaky on the finer distinctions within Protestant theology, which chronicles brilliantly and explains well why the Reformation involved such an orgy of violence against objects of special devotion, the disfigured remnants of which even now reprove those who ordered it.

John Morrill



The rise of the middling sort

Alexandra Shepard provides an authoritative account of how the social hierarchy of early modern England developed, one fundamentally different from anything that has gone before.

OUR UNDERSTANDING of early modern Britain has always been shaped by big, academic monographs and this latest contribution by Alexandra Shepard is as big as they come. Accounting for Oneself returns to a source that is well known to historians: testimonies by witnesses to the church courts. Yet Shepard deploys the material in a manner that is fundamentally different from anything which has gone before. Where the rich material from the church courts has traditionally been used for small-scale local studies, Shepard has undertaken an extensive trawl through thousands of testimonies – over 13,500 in all – from ten different English regions. In the process, she has created a national database which can be analysed in various ways, permitting her to intervene in a number of debates about early modern England with an authority that few have ever achieved.

Foremost must be the demonstration that social polarisation was both more pervasive and more extensive than hitherto realised. Social historians have long argued that the 16th and 17th centuries witnessed the concentration of wealth and power in the hands of the ‘middling sort’, but the local nature of the studies upon which this thesis is grounded has made it difficult to gauge the true extent of this process. Shepard’s many thousands of depositions provide a definitive way forward. In the middle of the 16th century, yeomen estimated their average wealth at £9.88. A hundred years later, this had risen to a phenomenal £143.06. This was a considerable increase, as can be seen by contrasting the gains made by labourers: their wealth increased from £2.03 to just £4.75 over the same period. The picture is complicated by the high inflation that occurred over this period. Factoring in inflation reduces the yeomen’s gains to a (still very considerable) ten-fold increase, while the doubling of the labourers’ stated worth actually masks a deterioration in their real worth. The middling sort were the beneficiaries of agricultural improvement and an economic growth that they helped to create and were part of a more invidious process by which the nation’s wealth was transferred increasingly from the hands of the many to the hands of the few.

Shepard’s huge dataset permits her to demonstrate the broad trends of widening inequality, as well as to shade in considerable detail. Of course, English society did not comprise just yeomen and labourers. Shepard also tracks the fortunes of gentlemen, husbandmen and craftsmen over the period, demonstrating that they made much more modest gains than the yeomen. There was little between yeomen, husbandmen and craftsmen in 1650. But, in the following century, yeomen forged ahead to achieve wealth at similar levels to the gentlemen, while the husbandmen and craftsmen remained much closer to the labourers at the bottom of the scale. Where once most people’s wealth had clustered around the mean, large differences had emerged between the haves and the have-nots. This provides an important social context for the emergence of the disruptive religious voices of the period.

Underpinning this elongation of the social hierarchy were fundamental changes in the meaning and purpose of wealth. After all, why did every deposition start with questions about the value of the individual’s goods once all his or her debts had been paid? This was because moveable goods provided a public marker of an individual’s wealth and (by the same token) of their reliability in business – an important matter in a society where very little coin flowed, so most transactions involved credit rather than the actual exchange of money. If your debtor’s money – or goods or services to the value of that money – never materialised, you could ask the courts to permit you to take their table, or their clothes, or their spade instead. As yeomen’s wealth increased, they began to appreciate their possessions not simply as a marker of their ability to pay their debts, but also as objects to own and enjoy. This is the origin of our own modern sensibilities towards material goods and their emotional, as well as purely functional, value.

Accounting for Oneself is an immense book and even in this relatively extended review, it is possible only to scratch the surface of some of the many themes explored in great detail: social inequality, personal identity, gender, consumption and women’s work, to name a few. It is wide-ranging, incredibly detailed and rich in historical example and complex analysis. It is a monumental achievement and certain to become a standard point of reference for many years to come.

Emma Griffin

Accounting for Oneself: Worth, Status and the Social Order in Early Modern England
by Alexandra Shepard
Oxford University Press 384pp £65



Self-help and saving open spaces

Strong-willed, dictatorial and a mid-Victorian in an Edwardian age, Octavia Hill was also a tireless housing reformer, successful social activist and a founder of the National Trust.

OCTAVIA HILL (1838-1912) was an eminent Victorian, revered in her prime as a tireless housing reformer but often seen later in her life as a relic of an earlier era of self-help and resistance to state and municipal intervention. Yet, in regard to her work to defend urban and rural open spaces, she has been acclaimed as being ahead of her time. This excellent, well-illustrated book arose from a conference to mark the centenary of her death and it provides a much-needed, wide-ranging reassessment of Hill.

Hill was driven by her Christian beliefs and her sympathy for the poor. She drove herself too hard. Her health broke down periodically, most seriously in 1877. She was dominant, even domineering, in her activities and, before her health problems, reluctant to delegate. She was an effective housing reformer who took on blocks of housing that needed improving and provided more space for families and better facilities. Her initial work was funded by John Ruskin, who required a five per cent return on his investment. She expected tenants to pay their rent on time and, if they failed to do so and there were no convincing extenuating circumstances, then they would be evicted. The families were helped with their personal problems by Hill and her team of socially concerned weekly rent collectors. Several of these women went on to be notable social reformers away from Hill and the stern Charity Organisation Society, including Henrietta Barnett, Beatrice Webb and her sister Catherine Courtney.

In a fascinating essay, William Whyte surveys housing improvements by Hill, or inspired by her, in Southwark, as an example of the impact of her work. The essay shows the benefits and limitations of the improved housing, which was beyond the means of those in most need. In one building there was a disproportionate number of policemen due to the regularity and level of their earnings.

By the early 20th century many of Hill’s beliefs seemed outmoded, as Whyte notes in his essay, subtitled ‘A Mid-Victorian in an Edwardian World’. Hill remained vigorously opposed to state or municipal intervention, advocating instead the benefits of charity. She opposed non-contributory old age pensions, free medicine, trade unions, working-class guardians (as liable to provide money to their relatives) and votes for women. Yet, in one major area of her activities, it is argued in this book, she was ahead of her times and was radical: her campaigning to save open spaces for the public, very ably reviewed in essays by Elizabeth Baigent and Paul Readman. The cause of saving open spaces in the London area had been taken up by the Commons Preservation Society in 1865 and Hill joined its committee. For Hill, the desirability of open spaces trumped her powerful concerns about undermining working-class self-reliance. In her 1875 essay, ‘Space for the People’, Hill wrote that space ‘may be given by the city, the state, the millionaire, without danger of destroying the individual’s power and habit of energetic self-help’. In arguing for the defence of footpaths, her rhetoric could have come from some of the radical land reformers. At a meeting in the Lake District in 1888, Hill warned that footpaths were ‘vanishing … closed by Quarter Sessions, the poor witnesses hardly daring to speak, the richer dividing the spoil’. Hill believed that people needed to enjoy the beauty of the countryside and thereby ‘commune with God’.

The experiences of safeguarding open spaces led Hill and other members of the Commons Preservation Society (notably Robert Hunter, Canon Rawnsley and the Duke of Westminster) to set up in 1894 the National Trust as a land-holding body. Hill’s role in the National Trust is reviewed carefully in essays by Melanie Hall and Ben Cowell. Hill herself had suggested it be called the ‘Commons and Gardens Trust’. As with her housing work, one of Hill’s major contributions was to secure financial support from aristocrats and other wealthy people. She secured such support not only through her dynamic work and her great integrity but also through her self-help and Christian moral philosophy, which did not frighten the established order. Queen Victoria, Lord Salisbury and many eminent Liberals all approved of her endeavours.

After her death, early memorial accounts tried to portray her as sweet and gentle. But she was feisty and intense. Henrietta Barnett, who knew her well, commented that she was ‘strong-willed … often dictatorial in manner … and she dealt out disapprobation and often scorn to those who fell below her standards for them’. With this book, Octavia Hill gets the critical attention that her outstanding, complex career deserves.

Chris Wrigley

‘Nobler imaginings and mightier struggles’: Octavia Hill, social activism and the remaking of British society
Eds. Elizabeth Baigent and Ben Cowell
Institute of Historical Research 360pp £40



Reykjavik dockyard, early 20th century

Remembering the forgotten

In this round-up, voice is given to Thomas Hardy’s housemaid, an Icelandic boy witnessing the 1918 Spanish flu epidemic and a young Englishwoman caught up in revolutionary Paris.

WE MOVE from Iceland to Paris, from Thomas Hardy to Georges Danton. Revolutions come and Icelandic volcanoes go and the historical novel genre greedily chases them all. Our broad theme is the marginal – figures and places that for one reason or another have been ignored, silenced or forgotten. The historical novel specialises in reclaiming and recollecting in this way, focusing our attention on those sometimes ignored by the mainstream of historical narration. So here we have a near-silent cineaste living on the ‘edge’ of European modernity, a jilted mistress watching the violent birth of the 19th century and a housemaid reporting on the final days of one of Britain’s greatest writers.

There has been something of a vogue recently for the servants’-eye view, from Downton Abbey and Upstairs, Downstairs to Longbourn, Jo Baker’s version of Pride and Prejudice (2013). Perhaps prompted by Alison Light’s magnificent Mrs Woolf & The Servants (2007) and with the motif of the unseen and the unsaid from Ishiguro’s The Remains of the Day (1987) always in the background, these novels scrutinise telling moments and important figures. Such a narrative is a way of reconciling social history with literary history, of reinvigorating the ‘country house’ genre by paying attention to those who, in the words of Damien Wilkins’ narrator, are ‘invisible’ yet ‘somehow part of it all, another stone in the path, in the damp wall’. Max Gate (Aardvark Bureau) tells the last days of Thomas Hardy through the blunt voice of housemaid Nellie Titterington. The struggle for Hardy’s memory is compelling stuff: his notebooks and manuscripts were burnt by his executors, his ashes buried in one place (Westminster Abbey) and his heart in another (his home parish of Stinsford) and The Early Life of Thomas Hardy published by his second wife in the year of his death. It is a story of intrigue and strangeness, as the great writer’s attempt at curating his legacy and constructing his image for posterity meets public commemoration and politics.

The great novel of Hardy’s death and the struggle for his memory is W. Somerset Maugham’s Cakes and Ale (1930), which, if you have not read it recently, you should return to soon. Like Maugham, Wilkins approaches Hardy tangentially, as if our understanding of such writers cannot be augmented if we look straight at them. For both, this method provides a way of thinking about memory and the value of writing. Wilkins’ novel renders a very engaging and interesting story about the value of reputation and the very material struggles over legacy. It is sad and elegiac, a slight but pleasing read.

In similar melancholic mood, the Icelandic author Sjón’s strange and valedictory Moonstone: The Boy Who Never Was (trans. Victoria Cribb, Sceptre) takes us to Reykjavik in 1918 and the coming of Spanish influenza. The central character, Máni Steinn, wanders the streets of the increasingly empty city, consuming as much cinema as he can. He acts as a strange kind of filter for the historical experience of the epidemic, participant and observer. The novel balances discussion of nationhood, sexuality and aesthetics. Sjón gently meditates upon the development of Iceland, of Europe and of a modern world understood through popular culture, in this case the cinema. It is a mournful book, in many ways as much about exile (internal and external) as about film. Sjón has said that he is drawn to the ‘silences in the past’ and this delicate novel carefully exposes these absences without seeking to disturb or control them. It is a lovely, spare, lyrical volume that – despite, or perhaps because of, its brevity – contains much beauty.

By contrast, Hallie Rubenhold’s The French Lesson (Doubleday) is an engaging romp around late 18th-century Paris, blending elements of sensation, the novel of sensibility and the novel of revolution. Key writers lurk in the background – de Laclos, Fielding, Lennox, Dickens and Burney – and at times the novel is delightfully arch and delicate. It is a light read, fun to skip through, without much to pause over (this sounds like a criticism, but is not). The plot is a confection recounting many key events of the early part of the French Revolution and there is much to admire. Rubenhold sketches revolutionary Paris marvellously and there is an admirable lightness of touch.

Jerome de Groot


Churchill and Ireland

by Paul Bew
Oxford University Press 240pp £16.99

WINSTON CHURCHILL had a long association with Ireland, from his infancy in Dublin in the 1870s to his second premiership in the 1950s. During his life he adopted various stances on Ireland’s political relationship with Britain. As a young Conservative he was a strong Unionist and opponent of Irish home rule. As a young Liberal minister, he supported home rule and appeared to favour coercing Protestant Ulster into submission to a Dublin government. After the Irish War of Independence, he switched from a policy of suppressing Sinn Féin and the IRA to closely cooperating with Michael Collins and the Free State government in defence of the Anglo-Irish Treaty. While he supported the right of the majority of the people of Northern Ireland to determine their own fate, he always hoped that Ireland would be reunited with the United Kingdom.

Churchill’s shifting position prompts the charge that his Irish policy was motivated by self-serving opportunism. Indeed, in Ireland, Churchill has been denigrated as an anti-Irish imperialist. But, as Paul Bew points out in this informed, balanced study, Churchill liked the Irish – nationalist as well as unionist – and had many personal contacts with them. As a distinguished Irish historian, Bew brings much knowledge of the Irish background. However, he pays less attention to the ministerial aspect of Churchill’s policies, which were partly fashioned by his changing departmental responsibilities. Bew arguably attaches too much weight to some of Churchill’s outspoken private comments, which were less moderate than his actions as a minister.

Churchill’s Irish policies were initially influenced by those of his father, Lord Randolph, who favoured reform in Ireland but within the union with Britain. Bew acknowledges that debt but underestimates the extent to which Winston’s speech at Belfast in 1911 was compatible with Randolph’s speech there in 1886. Both men appealed to Protestants and Catholics to avoid sectarian hatred. Churchill’s response to the IRA campaign, after the First World War, was to win a military victory and then negotiate a generous peace settlement. In that respect his Irish policy echoed his stance on other conflicts, such as the Boer War and the two world wars. More particularly, as Bew points out, Churchill’s antagonism to the IRA between 1918 and 1921 was partly conditioned by his hostility to Bolshevik terrorism in Russia. It would also have been revealing if Churchill’s response to insurrection in Ireland had been compared with his views on disorders elsewhere in the Empire.

Bew concludes by observing that, although Churchill wished to see Ireland reunited, the generous financial settlement that he secured, as Chancellor, for the Stormont government in Northern Ireland helped to perpetuate Partition. Yet the division between north and south in Ireland was much deeper than a merely fiscal one and, in any case, the Ulster Unionists were a powerful element in the Conservative Party, which Churchill could not ignore.

Roland Quinault


EXHIBITION

Army Film and Photographic Unit cameraman, Sgt. George Laws, June 1945.

What is real war? The most famous scene in the official film of the Battle of the Somme was faked, yet had immense impact. A new exhibition on the history of war films seeks to separate artifice from reality. Real to Reel A Century of War Movies The Imperial War Museum Jul 1st, 2016 – Jan 8th, 2017 WHAT IS IT LIKE to fight in a war? How true is film to the reality of conflict? These are some of the questions posed in the new exhibition at the Imperial War Museum (IWM), Real to Reel: A Century of War Movies. A giant central screen shows how Hollywood mimics reality. One of the most powerful war sequences in cinema history is the opening of Steven Spielberg’s Saving Private Ryan (1998), showing US troops landing on Omaha beach on D-Day. Many of the shots of troops in landing craft are copied directly from record film shot by cameramen of the Army Film and Photographic Unit (AFPU), seen in another clip of British troops landing on Sword beach. A further inspiration was Robert Capa’s photographs of men sheltering among the beach defences. A case of art following reality? Or has Saving Private Ryan, with its explosions, wounded soldiers, body parts and gallons of stage blood, created a spectacle of what we imagine must be real? The documentary

film looks far more calm, if more confused in places, than the dramatised version, even allowing for the differences between Sword and Omaha beach that morning. What is real when it comes to film of war? This also arises with the Battle of the Somme film, the starting point for the exhibition. This propaganda film, released in August 1916 while the battle still raged, was seen by nearly half of the British population. Ninety-five per cent of the film

More than anything, Real to Reel is a vital reminder that every sort of artifice has been used in making war films is authentic. However, the key moment, when the officers led their men out of the trenches and over the top, could at the time only be filmed as distant shots that are difficult to read. The army’s standard 50mm lens was not good enough to capture detail of the action, the cameras too heavy and cumbersome. Also, to expose oneself to machine gun and artillery fire would have been supremely dangerous. So Geoffrey

Malins, an ex-feature film cameraman and one of two men who filmed the Big Push, faked an over the top sequence in a training school miles behind the front, complete with smoke, barbed wire and men falling as they went forward. The sequence had immense impact at the time and the scenes are among the most famous images of the First World War today, repeated in books and television documentaries and shown endlessly throughout the Somme centenary. The most celebrated piece of film of the Great War was staged like a feature film. Real to Reel asks why war films are made, how they are made and how and why we watch them. Of course, many war films explore reactions to extreme pressure. They probe human qualities of courage, loyalty and self-sacrifice. These are displayed just as much in films about lovers caught up in war, such as Casablanca (1942) and Captain Corelli’s Mandolin (2001), as in action movies, such as The Cruel Sea (1953) or The Dam Busters (1955). The exhibition also includes propaganda films and and the abundant vein of anti-war films, like those of Stanley Kubrick. It ranges from the Somme to Eye in the Sky (2016) and displays a wonderful array of objects, from David Niven’s RAF uniform from A Matter of Life and Death (1946) to Ken Adam’s original sketches for the War Room scene in Dr Strangelove (1964) and the storyboard for the helicopter attack in Apocalypse Now (1979). There are Peter O’Toole’s flowing robes from Lawrence of Arabia (1962), a model from Das Boot (1981) and Elmer Bernstein’s music from The Great Escape (1963); a Moy and Bastie hand-cranked camera used on the Somme and the late Tim Hetherington’s tiny digital camera used for Restrepo (2010) in Afghanistan. The IWM is the ideal place to reflect on all this as the Museum boasts one of the world’s oldest and most respected film archives. The exhibition is a celebration of the imagination and a marvellous evocation of the sounds and images of a century of war movies. 
More than anything, Real to Reel is a vital reminder that directors, writers, set designers, actors and armourers have used every sort of artifice in making war films. In the end, they are just that: artificial. Let us enjoy war movies for the dramas they explore but leave the reality of war to AFPU cameramen and journalists. It might not look as grand or as spectacular but at least it is real.

Taylor Downing

SEPTEMBER 2016 HISTORY TODAY 63



Human rights and historical coincidences
In his quest to find out more about the early life of his late grandfather, Philippe Sands gives us a powerful insight into the lives of the two great jurists who defined crimes against humanity and genocide.

LEGAL HISTORY can be the dullest of subjects. Stories of the development of court rules and practices necessarily involve explanations of complex intellectual juggling, plodding attention to precedent and the mental calisthenics of lawyers who, on the whole, live unadventurous lives. In describing the birth of international criminal law, Philippe Sands ingeniously avoids tedium by telling how two great jurists went to work while their families were being transported and gassed, interwoven with the similar fate of his own grandparents at the hands of Hans Frank, the Nazi Governor-General of occupied Poland. Hersch Lauterpacht developed the idea that state sovereignty could not be impregnable if the state ran amok and murdered its own citizens. Rafael Lemkin invented the word and the concept of ‘genocide’, the worst crime of all: mass murder carried out with the intention of destroying a racial or religious group.

Sands often treats their contributions and the men themselves as competitive, although they were in alignment by the time of the Nuremberg trials. Thereafter, Lauterpacht’s demonstration that international justice had the power to punish crimes against humanity needed elaboration (for example, by establishing that the crime could be committed other than at the time of a declared war), while Lemkin’s true achievement also came at a time outside the focus of the book, when his Genocide Convention was adopted by the UN. Nonetheless, Sands gives a compelling account of how their ideas resounded in the minds of statesmen and lawyers as they strove to find a legal way of punishing the perpetrators of the Holocaust, men whose depraved actions had been authorised by their national law.

The story gathers its force – and its horror – from telling what happened to these men’s families and to the author’s family, all connected through early times in or near East West Street in the city of Lviv (formerly Lemberg), then in southern Poland and now in Ukraine. Their joys, depicted through album photographs, gradually turn to ashes after the emergence of Hitler. The Führer’s one-time lawyer, Hans Frank, was given plenipotentiary authority in Poland and soon Treblinka and Auschwitz were going full blast, exterminating Jewish subjects. The poignant family theme in the book continues through interviews with Frank’s son, a child at the time, who now believes in his father’s inexcusable guilt.

East West Street is a book about the remembrance of atrocity: those who were children when it happened cling to the last memories of their parents, recalled running from the Gestapo or offering them false and frightened reassurance. Those who got away grieve privately ever after, sleeping with mementos of their lost loved ones under their pillows, refusing to speak even to their own children about the unbearable inhumanity they have witnessed and suffered. I do not think, however, that Lauterpacht and Lemkin were driven by their fears for their families; indeed, Lemkin had convinced himself of the need for a law against genocide after his study of the 1915 Armenian massacres.

The evidence for some of the moving stories in this book has had to be dug out and put together, as if from the remains of a mass grave. That process, depending on conjecture and inference from circumstances, can be fallible, but no such problem affected the judgement at Nuremberg. As the prosecutor Robert Jackson pointed out, there could be no reasonable doubt about Nazi guilt: it was there in the court documents (many of them gathered by Lemkin) as a result of ‘the Teutonic habit of writing everything down’. That habit has not been followed by other genocidaires and the current work of the International Criminal Court is bedevilled by the difficulties, without any police force or enforceable discovery procedure, of obtaining evidence against its suspects. The UN Security Council will not permit the indictment of Assad, nor act against those states which welcome the Syrian leader. Even so, the Nuremberg legal legacy to which Lauterpacht and Lemkin contributed established that a state cannot immunise its leaders against international prosecution for the worst of crimes against its own citizens. We must applaud these lawyers, while acknowledging that Hans Frank, through his own confessions in diaries, provided them with the compelling evidence to which their theories could be applied.

The denizens of East West Street are long departed, but their ghosts hover through the pages of this fine book to remind us that genocide is different from other crimes: its perpetrators can neither be forgotten, nor forgiven.

Geoffrey Robertson

East West Street: On the Origins of Genocide and Crimes Against Humanity by Philippe Sands, Weidenfeld and Nicolson, 464pp, £20



Searching for ‘the truth’ of Vietnam
John A. Wood’s meticulous study of Vietnam veterans’ memoirs unpacks the myth that they offer untroubling access to the reality of the war.

THE VIETNAM WAR has long been represented through the ‘authenticity’ of the GI experience. Those who ‘were there’ and related their experiences of a chaotic, brutally violent war have served as a cultural conduit for the conflict in ways which have shaped our collective memory of it. The GI, as a victim of a ruthless enemy and a wrong-headed American military machine, has come to stand as the injured party in America’s fateful excursion in South-East Asia. John A. Wood’s study of veteran memoirs is a serious and largely successful exercise in unpacking the myth that GI memoirs offer us untroubling access to the reality of the war. This book is meticulous in its methodological approach to 58 memoirs and oral testimonies published between 1967 and 2005. Wood insists that the canon of Vietnam memoirs gives us limited understanding of GI experience and even less of the war in its totality. The memoirs are written mostly by educated middle-class officers with an average age of 27, while the average age of the overwhelmingly working-class infantry soldier, the ‘draft bait’, was 19.

Wood also engages with race on two levels: the vitriolic racism against the Vietnamese people, which structures most of the memoirs, and the racism within the US army. African Americans and other minorities were disproportionately represented among the US infantry in Vietnam. The wartime experiences of non-white veterans have been marginalised to the point of exclusion in the American story of Vietnam and Wood’s engagement with this body of ‘counter-memories’ of the war is especially welcome. He argues that readers of white memoirs come away with the ‘false notion that race played almost no role in the history of the war at all’. Latinos and Asian Americans were regularly viewed as the enemy by other GIs, while the anti-Vietnamese racism so prevalent among white veterans is less common among non-white soldiers. Wood places non-white narratives in relation to the white canon of veteran memoirs and the race politics of the Black Power era, enabling a complex, nuanced look at the racialised structures of US society.

To counter the overwhelmingly gendered field of Vietnam memoirs, Wood includes narratives by American and Vietnamese women. In traditional memoirs, the latter are objectified as unnamed racialised and sexualised others, while American women are represented through a less complicated, if similarly totalising, male gaze. The inclusion of women’s voices here is an attempt to examine the toxic forms of masculinity which structure the GI war story and again underlines the problem with treating memoirs as providing ‘authentic’ access to the war.

One reason that the Vietnam veteran has become the moral vector of the war is the perception that they were often ignored, abused, hated and marginalised by the US establishment and anti-war activists. Wood places this within a longer narrative of US homecomings and, while recognising the damaging legacies of the war, questions the apparent uniqueness of the difficulties that Vietnam veterans faced returning to civilian life. In the latter chapters of the book, Vietnam is placed in the context of the wars that came before and after and the impact of cinema on the shape of war memoirs. A strange cultural circuit emerges in which fictional representations take their authority from recollections that are themselves informed by cinematic culture.

This important study is not a disinterested reflection on how the most prominent memoirs are expressions of raced, classed and gendered subjects rather than ‘the truth’ of Vietnam. Wood does not suggest that these narratives have nothing to tell us about war; but that story is darker than even the bleakest memoirs.

Cathy Bergin

CONTRIBUTORS

Philip Burton is Reader in Latin and Early Christian Studies at the University of Birmingham. His books include Language in the Confessions of Augustine (Oxford University Press, 2007).

Cathy Bergin is a literary and cultural historian at the University of Brighton and author of Key Texts in Anti-Colonial Thought: African American Anti-Colonial Texts 1917-1937 (Edinburgh University Press, 2016).

Taylor Downing writes about film and history. His latest book is Breakdown: The Crisis of Shell Shock on the Somme (Little, Brown, 2016).

Emma Griffin is Professor of History at the University of East Anglia and the author of Liberty’s Dawn: A People’s History of the Industrial Revolution (Yale University Press, 2011).

Jerome de Groot is Senior Lecturer in Arts, Languages and Cultures at the University of Manchester.

John Morrill is Emeritus Professor of British and Irish History at the University of Cambridge.

Harry Munt is Lecturer in Medieval History at the University of York.

Roland Quinault is a Senior Research Fellow at the Institute of Historical Research, University of London, and the author of British Prime Ministers and Democracy: From Disraeli to Blair (Bloomsbury, 2011).

Geoffrey Robertson QC is the founder and joint head of Doughty Street Chambers, London, and the author of several books, including Crimes Against Humanity: The Fight for Global Justice (Penguin).

Chris Wrigley is Emeritus Professor of History at Nottingham University.

Veteran Narratives and the Collective Memory of the Vietnam War by John A. Wood, Ohio University Press, 200pp, £21.99


From the Archive
The ‘middle Medici’ – two popes, two dukes, two bastards and a future queen of France – are too often left out of the dynasty’s history. Catherine Fletcher addresses that gap.

Florentine Family Feuds

THE HISTORY OF the Medici rulers of Florence is usually written in two halves. Christopher Hibbert’s 1974 article for History Today explored the first: the rise and fall of the Medici bank, ending in 1492 with the death of Lorenzo ‘the Magnificent’, great patron of arts and letters. The second part features the Medici grand dukes from 1532 in a tale of marvellous patronage, if no longer such political or financial power. What, though, of the 40 years in between? Historians of this period have often been more interested in the men surrounding the Medici – Machiavelli, Guicciardini and Michelangelo, for example – than in the family itself.

Exiled from Florence in 1494, the Medici held their own, thanks to Cardinal Giovanni de’ Medici, Lorenzo’s second son, and his power base in Rome. With the help of Spanish troops, he secured the restoration of Medici power in Florence in 1512; the following year he was elected Pope Leo X, the first of two Medici popes. Their reputation was long clouded by their failure to rise to the challenge of Protestantism and for being preoccupied instead with securing Florence for their family. That task was made all the harder by the untimely deaths of Leo’s brother Giuliano and nephew Lorenzo, which left the Medici in 1519 with no legitimate male heir. The family’s future lay with three young children: Lorenzo’s infant daughter Catherine (future queen of France), her acknowledged but illegitimate cousin Ippolito, Giuliano’s son, and her half-brother Alessandro, also illegitimate, dark-skinned and rumoured to be the son of a slave. Ippolito was initially promoted as a potential ruler for Florence but, amid the crisis of 1529, was made cardinal instead, which left the lower-born Alessandro as the only available candidate for a dynastic marriage and the rulership of Florence. Ultimately, he became the city’s duke.

The archives of both Ippolito and Alessandro are largely lost and so reconstructing their lives – and vicious rivalry – is a challenging business. It is in the material and visual culture of politics that recent studies, including my own biography of Alessandro, have found intriguing sources for the history of the Medici family, their allies and enemies. The Medici dukes were showmen, but their show was packed with meaning. Not for nothing did Ippolito – keen to leave the cardinalate and return to the life of a layman – make appearances in secular dress rather than the robes of a cardinal. Alessandro, meanwhile, had himself portrayed by Vasari in armour. The first armorial portrait of any of the Medici, this made a sharp statement of his triumph over his opponents. He was also painted in sober black dress, respectful of Florentine civic norms, though his wardrobe records show a preference for luxurious fabrics.

The tales about Alessandro’s ethnicity (he is variously described as the son of a ‘half-Negro’ or Moorish woman, of a slave and of a peasant) have been given new context by the extensive recent research on European attitudes towards ethnic ‘others’. Besides Alessandro’s own case, the archives reveal many more fascinating stories on the theme of race: Cardinal Ippolito’s household of ‘Numidian’ horsemen, ‘Tartar’ archers and ‘Ethiopian’ wrestlers; cases of runaway slaves; and (not least) Alessandro’s own predilection for dressing up his court as Turks, Moors, gypsies and peasants. There are many possible interpretations of these masquerading outfits, but in a world where Italians saw the Turkish sultans as illegitimate rulers, gypsies as thieves and Moors as military rivals, it is hard to avoid wondering if they were a deliberate riposte to critics of his ‘tyrannical’ rule or low birth.

It is far from easy to piece together the manoeuvring and strategising of the Medici in these middle 40 years. But new approaches to historical research have prompted different and fruitful lines of inquiry. These may not be the ‘great days’ of the 15th-century Medici, nor yet the stunning decades of grand-ducal court patronage, but they are, without any doubt, years that have much more still to reveal.

Catherine Fletcher is the author of The Black Prince of Florence: The Spectacular Life and Treacherous World of Alessandro de’ Medici (Bodley Head, 2016).

VOLUME 24 ISSUE 8 AUG 1974 Read the original piece at historytoday.com/fta


