Monday, 17 December 2012

Can't read, won't read

One morning last week I went to the orthopaedic clinic in Worcester. By the door, in large letters, was a sign. It told patients not to go straight to the desk, but to collect a number from a box below the sign, and sit in the waiting room for their number to be called. This message was repeated on another large sign above the reception desk, which also told patients not to stand near the desk in order to maintain the privacy of individual patients speaking to the receptionist. I collected my number (21) and waited. After five minutes, my number was called. Another man (number 22) hovered round the desk and tried to barge in front of me, but I held my ground. I was registered by the receptionist and waited to see the physiotherapist and consultant.

While I did so, I noticed that half the people entering the clinic failed to pick up a number from the box and had to be told to do so by the receptionist. Some could not even find the box, and had to be guided by other patients. Why this manifest failure to follow a simple instruction? The reason did not seem to be age or infirmity. The great majority of patients in the clinic were young or middle-aged, and this group included almost all those who failed to read the signs. I suspect the main reason was simply functional illiteracy.

By ‘functional illiteracy’, I mean not just the inability to read, but also the unwillingness to do so. Many people are able to read, but find that it requires an unwelcome effort of concentration. When faced with a sign with words (even large ones like those in the clinic), they look the other way. There are many other examples of this. When I go to Foregate Street Railway Station in Worcester, I note the screens that indicate for each train the time, destination and platform (there are only two at this station). Around the window where you buy tickets, there are lists of the next trains to the main destinations. Yet several customers do not read any of this written information and instead ask at the ticket office.

Many people therefore live in a world where they depend entirely on the spoken word, whether face-to-face, on a mobile phone, or on television. There may be few if any books in their homes, and parents do not try to help their children learn to read. People can lead a satisfactory life with functional illiteracy until they encounter the world of formal bureaucracy, whether public or private. In such cases, there is a wall of mutual incomprehension between an organisation dominated by written rules, job descriptions and standardised forms, and an individual whose understanding of the world is based on the spoken word. People with functional illiteracy faced with an official-looking letter must either show it to someone with greater expertise, or just ignore it. The latter course of action can be catastrophic. When I was a social worker in West Lothian in the 1970s, we were routinely told by the Housing Department of impending evictions (usually for non-payment of rent). This information was usually received on the day of the eviction. We would visit the tenant to see if there were implications for childcare, or if a disabled person was involved. I visited the S Family, and found they were utterly unaware of what was due to happen at 11am that day. I found a series of letters from the Housing Department, unopened and unread, behind the clock on the mantelpiece. Later, Mrs S told me in distress of her life with an abusive husband, a son in prison and an estranged daughter. “Mr Cumella”, she said with feeling, “I feel so thingy”. So functional illiteracy was coupled in her case with a lack of vocabulary to express emotions.

Much of the debate in education about how to improve literacy seems to be about teaching techniques. English spelling does make learning to read a bit more difficult than would be the case in a language like Spanish, in which spelling is phonetic. But literacy levels are high in Japan, where the written language uses three different orthographies. I think the main problem is not teaching technique or the quality of primary school teachers, but the lack of commitment among some parents to education itself and the low expectations they have of their children.

It is not easy to see how we can deal with this problem, but in the meantime we need to help people with functional illiteracy cope better with formal organisations. This means using symbols and colour schemes to supplement written signs. At the Orthopaedic Clinic, this would mean a sign with a large arrow (in a colour contrasting with the wall) pointing to the box, and a picture of a hand picking up a card. The use of symbols of this kind is widely accepted in services for people with a learning (intellectual) disability: we just need to recognise that there are very many more people who have problems in reading and learning than we have previously estimated.

Saturday, 8 December 2012

A short walk in France: best memories



When you are feeling down or just bored, review your best memories - the ones that make you smile with recollection. If you could plot these on a personal timeline, you would find they form clusters - a few days of your life which produced many great memories. One cluster in my life was a week’s walk in South West France along a footpath called the GR6. This took place in - I think - September 1980, when I travelled by train and ferry to Sainte-Foy-la-Grande in the Gironde, and then walked to Les Eyzies-de-Tayac in Périgord. I had a small tent and stayed in campsites and once in a cheap hotel.

This was my first trip to France, and I discovered not just that I could understand French, but also that the French I met in the small towns and villages on the path were friendly, kind and hospitable. The path was easy to follow, and led across a rolling landscape of vineyards, woodlands and riverbanks. The weather was dry and hot.

My best memories of this trip included:

▸    Walking in the crisp early morning through the centre of Paris from the Gare du Nord across the Seine and the Île de la Cité to the Gare d’Orléans. The streets were being watered, the boulangeries were opening and there was a smell of fresh bread and French tobacco. Like so many before me, I fell in love with Paris.

▸    I walked over the open hills near Bergerac to the splendid Chateau of Monbazillac, surrounded by vineyards. I decided to go on a guided tour. For the first time in my life, I tasted the original and greatest of the sweet white wines of France. I sat stunned by the experience. I shall drink Monbazillac again this Christmas.

▸    One evening, I walked into the small village of Lanquais. I went to the café to ask if there was anywhere I could pitch my tent. After some discussion, the locals decided I could camp on the touchline of the rugby pitch and use the club showers and toilet. I pitched my tent, left my backpack in it, and went back to the bar. I bought a meal and a drink and chatted to an old man called Josephe. I bought him a whisky, and he invited me back to his home, which was a room under the town clock. There we shared a meal of pig’s trotters. The next day, I returned to the café for breakfast and set off on the day’s walk.

▸    One day, I walked through a quiet wood of tall trees. In the middle of the day, I sat down to rest in a clearing. I heard a shrill bird call, and, looking up, saw a snake eagle circling high above me. After my rest, I walked downhill until I came to a chateau, abandoned and shuttered, being reclaimed by the woods.

▸    Another day (or perhaps the same day), I walked into a hamlet of old stone houses, all in the honey colour of this part of Périgord. There was a small café open. I was the only customer. I ordered a glass of red wine, which came with a glass of water. The wine was better than many you find in expensive restaurants in England. I left feeling benign but reinvigorated.

▸    At the end of the walk, I came to the prehistoric caves at Les Eyzies. A very old man showed me round, and was careful to point out the genitalia on each of the stick men drawn on the walls of the cave 16,000 years ago. Some of the local inhabitants still live in caves, in houses built into the rock, facing the wide and beautiful River Vezere. See the photograph above.

In 1984, recently married, my wife and I went back to Les Eyzies as part of a holiday in the Dordogne. But that is a different cluster of best memories.

Wednesday, 28 November 2012

French exiles, royal and presidential in Worcestershire

Look up a list of pretenders to the thrones of Europe and you will find that France has three, from the Bourbon, Orleanist and Bonapartist lines of succession. Each of these royal houses ruled France at different times in the 19th Century. When not in power, the senior members of each house would go into exile, often in England and, in a few cases, in Worcestershire.

The first royal exile to reach our county was Lucien Bonaparte, Prince Français. He was a younger brother of Napoleon and played a key part in the latter’s rise to power. But he lost out in palace intrigue, and was exiled to Italy. In 1809, he decided to flee to the USA, but his ship was intercepted by the Royal Navy. Once in England, Lucien seems to have been treated with great civility, and was allowed to buy a house in Worcestershire, although he was kept under close surveillance and his mail was intercepted. The house, called ‘Thorngrove’, is in a secluded part of Grimley Parish not far from the River Severn, and was built in the 18th Century. It has large landscaped gardens and is Grade II listed. Lucien eventually left Thorngrove in 1814, although not before one of his sons, Louis Lucien Bonaparte, was born in the house. Thorngrove House is not open to the public, and so, although it is only a few miles from my home, I have never visited it.

After the fall of the Second Empire in 1870, France became a republic almost by default. Although monarchists won the elections for the National Assembly, the three competing royal houses could not agree on an acceptable candidate for the throne.

The next royal exile in Worcestershire was the Duc d’Aumale, one of the sons of King Louis-Philippe, who ruled France between the two revolutions of 1830 and 1848. The Duc bought Wood Norton Hall near Evesham in 1872 as a hunting lodge. On his death in 1897, the property passed to Prince Philippe, Duc d’Orléans, who was the official pretender to the throne on behalf of both the Bourbon and Orleanist royal houses. Philippe rebuilt Wood Norton Hall as a splendid stately home in a wooded setting overlooking the River Avon. The house became an important social centre, with a royal wedding for the Bourbon family in 1907 (the bride was the grandmother of the current King of Spain). Philippe had an active social life too, being cited as a co-respondent in a divorce case and having an affaire with Dame Nellie Melba.

Philippe died in 1926 and the house eventually passed to the BBC for use as an emergency broadcasting centre in the Second World War and in the Cold War (with a nuclear bunker added), a staff training centre, and the location for some early Doctor Who episodes. A few years ago, I had a superb meal at Wood Norton Hall, followed by coffee and petits fours in a dark lounge lit only by an oak fire. The building is now a luxury hotel and worth visiting for its food, its setting and its history.

The next important French exile in Worcestershire was a future president rather than a would-be king or emperor. After the defeat of France in 1940, General de Gaulle created the Free French Army. In 1942, he set up a training school for officer cadets at Ribbesford House near the beautiful riverside town of Bewdley. Although not based there, he visited the House and the town before the School closed in 1944 and its newly-trained officers joined in the liberation of France. Half of them were dead before the end of the War.

Ribbesford House probably dates from the 16th Century but has been much enlarged, partially demolished and restored since then. It is not open to the public, but can be seen from the nearby parish church. It is perhaps a little reminiscent of a French country chateau, standing in front of a wooded hill, facing fields leading to a long winding river. The river flows past villages and towns to join the great ocean where all rivers meet. 

Wednesday, 21 November 2012

Fire my light

Joseph Chamberlain became mayor of Birmingham in 1873, and took urgent action to deal with the City’s shambolic gas and water supply. Two rival gas companies dug up the City’s streets seemingly at random, while the water supply was intermittent, polluted and spread fatal diseases. Under his leadership, the City bought out the private companies and radically improved the service. The profits from the new municipal gas enterprise helped fund schools, libraries, parks, swimming pools, and what would now be called ‘social housing’. This policy was sometimes called ‘municipal socialism’, but Chamberlain was a successful industrialist and certainly did not see his actions as part of a war on capitalism: rather, they were practical steps to ensure that the City received an efficient set of public services, and that the profits from these enterprises would benefit all its citizens.

In the last thirty years, Chamberlain’s policies have gone into reverse. Public services have been sold off to large international corporations, the whole exercise managed through a process of ‘contracting’ that channels millions of pounds of public funding to lawyers, accountancy firms, and management consultants. Utility companies have to be bribed with further tax receipts to provide essential public services: local authorities are being encouraged by government to pay BT to install high-speed broadband in rural areas; while central government will soon commit £100 billion capital expenditure to update the water and electricity networks. Why do the public have to pay? Because long-term investment in infrastructure does not generate the short-term profits needed to maintain corporations’ share prices.

Private contractors for public services are regulated on behalf of the public by a set of quangos. These can be active. In the last year, EDF was fined £4.5 million for mis-selling electricity and gas to vulnerable customers, Npower was fined £2 million for breaching regulations on handling customers’ complaints, while British Gas was fined £1 million for lying about the amount of electricity it generated from renewable sources. Still pending is action over the cartel used by the major gas suppliers to jack up the prices paid by consumers.

The supposed benefit of privatisation is of course that competition generates efficiency and hence low prices. But the British people now pay well above the median pre-tax prices in Europe for gas and electricity, and our rail fares are the highest in Europe. What competition does generate successfully is a multitude of tariffs and fares. Go to a price-comparison website and enter details of your electricity ‘supplier’. Then tick the box for your tariff. You will see dozens of choices, some time-limited, some pre-paid, all confusing. This is bewildering because each supplier is selling an identical commodity. Electricity does not vary between EDF, Npower or Eon - no ‘supplier’ has some magic ingredient in their electricity. It all comes through the same supply lines and does exactly the same thing whoever charges you for it. The latest government initiative is to force ‘suppliers’ to place their customers on the cheapest tariff. But this intervention only proves that the whole paraphernalia of market competition, contracts and regulation fails to work for the benefit of the public.

What would Joseph Chamberlain do? He would realise that high power bills cause hardship to many families and impede business success. His initial step would therefore be to set up a corporation to buy electricity from the various generating companies on behalf of the public. This would charge a few simply-understood tariffs, but would use its monopoly purchasing-power to push down prices. In the case of the railways, he would probably take over the companies running main-line services and scrap the whole crazy franchising system under which one set of companies runs the trains, another owns the rolling-stock and a third owns the track.

These are not particularly radical steps: they are what happens in most of Europe. One of the strange features of the operation of privatised public services in the UK is that most of the private suppliers of public services are foreign-owned, in some cases owned by foreign governments. The power ‘supplier’ EDF is mainly owned by the French government. The German government is the controlling shareholder of DB, which owns Arriva Trains, Chiltern Railways, Cross-Country, Grand Central Railway, Tyne and Wear Metro, and the main UK railfreight company DB Schenker. The prices EDF, DB and similar foreign-owned companies are allowed to charge in their own countries are significantly lower than in the UK, which means that the profits they make in our country cross-subsidise our European neighbours. Next time you catch a train owned by DB or buy fuel from EDF, take pleasure in the fact that part of the inflated prices you pay is helping keep taxes down in Germany and France.

Thanks to The Observer Business Leader for 18th November 2012, page 44. 

See also http://stuartcumella.blogspot.co.uk/2011/01/meet-new-boss-same-as-old-boss.html

Friday, 16 November 2012

Private morality and public morality

A long time ago, on a train to Edinburgh, I was approached by two Mormon missionaries. As usual, these were Americans in business suits, speaking in the friendly outgoing way that is such an attractive part of their culture. They told me that becoming a Mormon would confer many advantages on my career. Mormons, they said, had such a reputation for honesty that Howard Hughes recruited only Mormons to run his business interests in Las Vegas. I pointed out that Howard Hughes was running a vast gambling enterprise in that city, which was also a major centre for (legal) prostitution. The missionaries were used to rejection, but found my point puzzling. They saw no problem in combining morality in their private life with the promotion of mass corruption in business.

I was reminded of this encounter by the recent presidential election in the USA, in which another Mormon (Mitt Romney) managed to combine a strict personal morality with a political campaign that involved a startling sequence of dishonest statements, bewildering changes of opinion on the most fundamental issues, and a set of policies that would drive millions of the poorest people in his country to even greater misery. Like the Mormon missionaries on the train to Edinburgh, he drew a clear distinction between two quite different moral standards: private morality (how we behave with our family and friends) and public morality (how we behave with everyone else).

This distinction is not of course found among Mormons alone, and is much older than the Mormon religion. The idea that we should be compassionate with our families and cruel to others probably dates from the earliest human history: we co-operate with and trust members of our own tribe, but arm ourselves against other tribes. In warfare, honest and peaceful men come together to kill strangers. The justification for distinguishing our private and public moralities is therefore that members of other tribes cannot be trusted, and that our relations with them are a matter of actual or potential warfare. It is a simple matter for some people to extend this principle to the operation of politics and economics: whatever moral standards we apply to our family lives, it is assumed that survival in political and business affairs necessitates fraud and double-dealing.

The trouble with living with this distinction is that it confirms the evils of the world. Advances in human life have come about when people have chosen to apply the ethics of their private morality to public affairs. In his great book Bury the Chains, Adam Hochschild gives the example of the English sea captain John Newton. In 1748, he experienced a spiritual conversion (we would say he was ‘born again’), which led him eventually to become ordained in the Church of England and to write several hymns. The most famous of these, Amazing Grace, is often mistakenly categorised as a ‘negro spiritual’. This is particularly ironic since Newton made his living by captaining ships bringing slaves from Africa to the North American colonies. Newton was apparently able to reconcile his conversion with an active role in the most brutal and cruel of all trades. Even after leaving the sea in 1754, he continued to invest in the slave trade and said nothing against it for another 34 years.

But during that time, the campaign to abolish the slave trade gathered strength. This was at first led by Quakers, who were joined later by evangelical Christians and political radicals. The campaign at some point must have produced great soul-searching in Newton, and eventually in 1788, by now a prominent Anglican clergyman, he published a forceful pamphlet denouncing the slave trade and describing the horrors he had seen as captain of a slave ship. He later became a star witness before Parliament on behalf of the Committee for the Abolition of the Slave Trade, and lived to see the success of the campaign with the passing of the Slave Trade Act in 1807.

So if Newton could eventually apply the principles of his private morality to his public morality and have such a positive influence on mankind, there must surely be hope even for Mitt Romney.

Monday, 29 October 2012

Sailing to Switzerland



On the 23rd August this year, my son Andrew and I sailed to Switzerland. We had not intended to do so. Our original plan for our holiday was to fly to Milan Malpensa, hire a car and drive to Stresa on Lake Maggiore. The next day, we had intended to drive to Locarno or points North, and then spend the next few days in Austria. But on the day before we were due to leave, I realised that I had lost my driving licence. Andrew has not yet passed his driving test, and so car hire became impossible. The flights were already booked, and so we decided to backpack. Two trains from Malpensa got us to Stresa, where we stayed two nights. Then we took a steamer up the Lake to arrive at Locarno quayside in the early afternoon. Three nights in Locarno included a trip to the extraordinary castles of Bellinzona, and a train journey along the valley West of the city to the hilltop village of Intragna. Another train journey took us back to Italy and the town of Como, where we spent one night before returning home.

Travelling by boat and train meant that I had more time to look at the scenery, and note the sharp change crossing the border. Stresa is a beautifully-preserved resort town, with great hotels lining the lake front. The town centre is full of narrow pedestrian streets leading to a town square full of restaurants. Italian restaurants seem incapable of cooking bad meals, and we ate better in Stresa and later in Como than anywhere in Locarno. The enjoyment of food in Switzerland is of course lessened by the price you have to pay for it. When we arrived in Intragna, we found it almost deserted - the only place open was the small Hotel Stazione. We went through to the back and found we were the only customers in a splendid restaurant with a view down the valley to Locarno. Andrew ate Gnocchi and I ate Risotto ai Funghi. Both were the finest we had ever eaten, and we were not put off our food by the constant sound of gunfire. The waiter explained that this was from the firing range used by the Swiss Army on its summer training.

On reflection, it is not so unusual to arrive in Switzerland by steamer. Many years earlier, our family had travelled across Lake Geneva by ferry from Evian-les-Bains in France to Lausanne. That was a happy time too.

Friday, 5 October 2012

Student life long ago: the LSE in the 1960s

I became a university student for the first time in September 1965, when I registered for a B.Sc(Econ) at the London School of Economics. I was, as far as I know, the first member of my family to go to university, at a time when very few school-leavers did so. I was not, however, a school-leaver. My secondary school career had ended in failure when I got only two A levels (in chemistry and maths). I switched to social sciences at Handsworth Technical College on the other side of Birmingham. Freed of the obligation to attend sport, school assemblies, cadet training and other ‘character-building’ nonsense, I was able to get three good A levels in a year. Handsworth Technical College also prepared me for the way in which universities organised their teaching, with lectures and seminars spread through the week, with long gaps for self-directed study.

Universities in those days were small institutions, and undergraduates got the chance to hear lectures by, and discuss their ideas with, the most elevated of intellectuals. I remember lectures by Michael Oakeshott, Ralph Miliband (father of the current leader of the Labour Party), William Robson, Geoffrey Sterne and Peter Self. My tutors included Alexander Irvine (later Lord Chancellor), Edward Mishan and William Letwin (father of Oliver). There were many essays to write, but exams only at the end of the first and third years. There was none of the endless grind of modular continuous assessment which students experience these days. The LSE was indeed a wonderful place to learn, and not just about economics and politics.

The LSE was within easy walking distance of everywhere in central London, near an incredible range of cinemas, theatres and galleries. To me, the act of becoming a student in London involved joining the world, and learning about art and culture. I went to the theatre when I could afford it, and the cinema almost every week. I bought a student membership of the British Film Institute, and spent any free afternoons watching classic old films. Saturday evenings were less cultural. These were spent at a dance in one of the colleges of the University in a determined (but usually unsuccessful) attempt to find a woman.

This rich life was possible despite a severe shortage of cash. My grant was £110/term, from which I met all living and travel costs. I visited home every half-term and vacation, and saved money by hitch-hiking. I phoned my parents from a call box every Sunday morning at 11am. My main recreation was walking, and I travelled over large areas of London each Sunday.

Most students in the 1960s lived in halls of residence (which in those days were more like barracks than flats) or in ‘digs’ with families who registered with the university to provide bed, breakfast and evening meal for a small group of students. I spent the first year living with two other male students with a family in Streatham. This involved daily commuting by suburban train from Streatham to Blackfriars, and then a walk either along the Embankment or through the Temple to Fleet Street. For someone raised in the dreary suburbs of Birmingham, this was all immensely exciting.

In the second year of my degree, I moved to shared rooms in a house in Lordship Road in Stoke Newington, and then in the third year to rooms in Finsbury Park Road. I had to learn to cook for myself, which I did most evenings and at weekends. However, there were alternatives. LSE students avoided the appalling canteen in the School, and gatecrashed canteens in local businesses. My specialty was the Indian High Commission - a large building just across the Aldwych from the LSE. I would enter by an unmarked back door, which was always opened just after noon. Past the lift shaft and along a corridor, I would join a queue of Indians shuffling forward. Food was served by friendly canteen staff from huge metal pots. I would pay and find a table - often the only person in the entire canteen who didn’t look Indian.

Finsbury Park Road was a tough area at the time I lived there. Three doors up from my house was a brothel, inhabited by haggard women and filthy, neglected children. A car was abandoned outside, and the brothel children reduced it to a wreck in days. Requests to the local council had no effect, and in the end one of the exasperated local residents fire-bombed the car. It was cleared the next day. This was the start of Finsbury Park’s long climb to respectability. Now my son lives in a quiet and pleasant street in another part of Finsbury Park. There are no abandoned cars there any more: indeed, residents and visitors need to pay a fee to the local council to park outside their own houses.

There was no fire-bombing at the LSE, but much talk of revolution. By 1968, my quiet world of study was disrupted by strikes and protests. I personally saw little wrong with the LSE, but the various groups of student ‘agitators’ in the School saw protests about trivial issues like security gates as a kind of small-scale rehearsal for the imminent British revolution, in which they would form the officer class. This desire for power coupled with a strange admiration for violence in other countries was cloaked in an intense and intolerant form of Marxist mumbo-jumbo.

Marxism has now all but disappeared, but the LSE lives on. When I can, I meet with other ageing alumni in the West Midlands. We talk about politics like we did when we were young, and I remember the time when I walked out of the working-class Birmingham suburbs into the world of learning.

See also: Surviving school


Friday, 31 August 2012

Rank bad

The London Olympics were a festival for statisticians. As an example, let’s look at the men’s 100 metres. Usain Bolt won this race with a time of 9.63 seconds (a new Olympic record). But all the entrants had very similar times apart from Asafa Powell, who pulled up with a groin injury and completed the race in 11.99 seconds. Leaving him out of the calculations, the difference between Usain Bolt and the last-but-one runner was only 0.35 seconds. This tiny sliver of time is only 3.6% of Bolt’s winning time. So a statistical summary of the race (using nonparametric statistics because of the skewed distribution) would say that the median race time was 9.84 seconds, with an interquartile range of 9.76-9.97 seconds. Or, in everyday language, they all (but one) ran very fast and there was little variation in the times they took.
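For the statistically minded, here is a minimal sketch of that calculation in Python. Only Bolt’s 9.63 and Powell’s 11.99 appear in the text above, so treat the other six times as my own addition from the published results; note too that the standard library’s statistics.quantiles, with its default ‘exclusive’ method, happens to reproduce the quartiles quoted here, while other quantile conventions give slightly different values.

```python
import statistics

# Finishing times (seconds) in the men's 100 metres final, London 2012.
# The 11.99 is Asafa Powell, who pulled up injured.
times = [9.63, 9.75, 9.79, 9.80, 9.88, 9.94, 9.98, 11.99]

median = statistics.median(times)             # robust to the outlier
q1, _, q3 = statistics.quantiles(times, n=4)  # default 'exclusive' method

print(f"median = {median:.2f}s, interquartile range = {q1:.2f}-{q3:.2f}s")
# median = 9.84s, interquartile range = 9.76-9.97s
```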

Of course, this misses the point. The 100 metres final is a race, and what matters in a race is the ranking of the runners - who comes first (and to a lesser degree second and third). But just as it makes no sense to ignore ranking in races, it is equally pointless to analyse all areas of human activity as if they were races. This does not stop people doing it, and rankings are now common not just for things that are simple to measure (such as the time taken to run a race), but also for institutions and the complex range of activities they perform. This means that such rankings have to be derived from multiple ‘scores’ based on a range of unreliable data about something that can never be reliably measured.

As an example, let’s take university research and teaching. Recent years have seen a growing collection of international ‘league tables’ of universities. These use a wide range of data, including statistics about numbers of staff and students, numbers of publications in academic journals, income from research grants, and rankings by panels of academics. This data is then scored, weighted and combined to produce a single score which can then be ranked. Individual universities can then congratulate themselves that they have ascended from the 79th best university in the world to the 78th best, while those that have descended a place or two can fret, threaten their academic staff and sack their vice chancellor.
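To make the mechanics concrete, here is a minimal sketch of how such a league table is assembled. Every university name, score and weight below is invented for illustration; real tables use many more indicators, but the procedure is the same: score, weight, sum, sort.

```python
# Hypothetical league table: per-metric scores are weighted and summed,
# and the combined scores are then ranked. All figures are invented.
universities = {
    "Alpha": {"research": 82, "teaching": 74, "income": 60},
    "Beta":  {"research": 78, "teaching": 80, "income": 65},
    "Gamma": {"research": 90, "teaching": 66, "income": 70},
}
weights = {"research": 0.5, "teaching": 0.3, "income": 0.2}  # arbitrary choice

combined = {
    name: sum(weights[metric] * score for metric, score in metrics.items())
    for name, metrics in universities.items()
}

for rank, (name, score) in enumerate(
        sorted(combined.items(), key=lambda item: -item[1]), start=1):
    print(f"{rank}. {name}: {score:.1f}")
```

Nudge the weights slightly and the order changes: that is rather the point.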

Yet higher education systems differ greatly between countries, and universities themselves are usually a diverse ragbag of big and small research groups, teaching teams, departments, and schools. This makes a single league table a dubious affair, even if it is based on reliable data. But that of course is not the case. The university in which I worked (by most standards a well-managed institution) struggled to find out what its academic staff were doing with their time, or the quality of their achievements. In other universities, the data on which international league tables are based may be little more than a work of mystery and imagination.

But the problem lies not so much in the dubious quality of the data, but in the very act of ranking. Even when the results are derived from a single survey in one country, they are often analysed in a misleading way. As an example, look at the National Student Survey in the UK, which is taken very seriously by the UK higher education sector. In its most recent form, this comprises 22 questions about different aspects of the student’s university and course, all rated on a five-point Likert scale from ‘definitely agree’ (scored 5) to ‘definitely disagree’ (scored 1). A conventional way of comparing universities and courses would therefore be to take the mean score on each scale, and this is how they are analysed in papers like the Guardian. So, by institution, the results for the overall satisfaction question vary from a maximum of 4.5 (the Open University) to 3.5 (the University of the Arts, London).

This is all seen as being a bit too technical for prospective students, and so the Unistats website reports only the responses to a single statement in the Survey: “Overall, I am satisfied with the quality of the course”. It then adds together the number of students with scores of 5 (‘definitely agree’) and 4 (‘mostly agree’) to produce the percentage of ‘satisfied’ students. This is quite a common survey procedure (I admit to having done it myself), but it is flawed. Someone who ‘mostly agrees’ may still have important reservations about their course.
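Here is a minimal sketch of that collapsing step, with invented response counts. Everyone scoring 4 or 5 counts as ‘satisfied’, so the reservations of the ‘mostly agree’ group vanish into the headline figure.

```python
# Hypothetical response counts for "Overall, I am satisfied with the
# quality of the course", from 'definitely disagree' (1) up to
# 'definitely agree' (5). The counts are invented for illustration.
responses = {1: 12, 2: 18, 3: 40, 4: 150, 5: 80}

total = sum(responses.values())
satisfied = responses[4] + responses[5]   # 'mostly' and 'definitely' agree

print(f"{100 * satisfied / total:.0f}% satisfied")  # -> 77% satisfied
```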

If you actually analyse satisfaction scores, you find that the great majority of universities fall within a narrow range, with some outliers. As an example, ‘satisfaction’ among students with degrees in medicine in English universities ranges from 99% in Oxford to 69% in Manchester. The median satisfaction is 89%, with the interquartile range between 84% and 94%. So half of all medical schools fall within only ten percentage points around the middle of the range. This makes ranking pointless, because a small change in percentage satisfaction from one year to the next could send an individual medical school several places up the rankings, but would amount to little more than the usual fluctuations common to surveys of this kind.

A far more useful step is to look at the outliers. What is so special about Oxford (99%) and Leeds (97%)? Alternatively, are there problems at Manchester (69%) and Liverpool (70%)? Before we get too excited about the latter two universities, note that they have levels of satisfaction that most politicians and people in the media could only dream of. However, to see if there are particular problems, we need to look in more depth at the full range of results. We could also see if there is anything distinctive in the way they teach. Actually, we do know that both universities have been very committed to problem-based learning (PBL). This is a way of teaching medicine that replaces conventional lecture-based teaching with a system in which small groups of students are set a series of written case descriptions. Students then work as a group to investigate the scientific basis for the presenting problem and the evidence for the most effective treatment.

Research on PBL in medicine is (in common with a lot of research in education) inconclusive. But medical students are very bright and highly-motivated, and would probably triumph if their education amounted to little more than setting them a weekly question to answer and presenting them with a pile of textbooks to read. Come to think of it, this more or less describes how PBL operates.

Tuesday, 17 July 2012

The impact of research into intellectual disability

Universities in Britain are currently preoccupied with preparing for the next round of assessment for the allocation of government research funding (called the ‘REF’). A particular feature of this round is that universities are required to submit information about the ‘impact’ of their research, which will then be scored and included in the formula for allocating funds. As I noted in a previous posting, the idea that the whole of research can be scored on a simple scale is a prime example of a crackpot idea. However, institutions and governments cling to crackpot ideas, and respond to their inadequacies by making them more complex and hence even more crackpot. So it is with the REF.

Nevertheless, it is still important to consider which research has had the greatest impact, particularly where this concerns the lives of people who are disadvantaged, ill or disabled. If you look at the impact of research in my own field of public policy for people with intellectual disability, you find startling results. The two pieces of research with the greatest impact in the last five years have without doubt been the secret filming by the BBC in 2011 of staff abusing the residents of Winterbourne View, and the 2007 Mencap report Death by Indifference on the death by neglect of six people with an intellectual disability in general hospitals. Both studies received wide publicity and led to government reports, debates in Parliament, and legislation. Yet neither study was carried out by academics, and neither was published in an academic journal. No academic study in this field over the last five years has had a remotely comparable impact. Why is this?

One reason is the way in which most policy research in intellectual disability is funded by central government. By the time it published the white paper Valuing People in 2001, the Department of Health had decided that the healthcare of people with an intellectual disability was no longer a major policy concern. Little effort was therefore expended on commissioning research into general hospital care for this group of people, and there is still hardly any published academic research on this subject. Department of Health policy also favoured the decanting of people with an intellectual disability into small homes managed by the private sector and a reduction in specialist health services for those with mental health and/or behavioural disorders. However, small homes are usually unable to manage people with severe behavioural problems, and appropriate specialist care is costly. The private sector moved in to fill the gap by providing what were essentially long-term mental hospitals stripped of the level of trained staff that would have been provided if the NHS had managed the institution. The Department of Health did not see this as a problem before the Winterbourne View scandal, and so research was not commissioned.

But even if there had been a desire by the Department to commission this kind of research, it could not have been carried out by academics. Academic research involving human participants needs the approval of an ethical committee and (in the case of healthcare) of the NHS research governance system. Neither of these would have approved of secretly filming staff abusing residents or collecting data from families about how hospital staff caused death by neglect. Health and social services have a streak of self-interest which can be used to obstruct research into the poor quality of the care they can provide. But ethical committees block research for different reasons. Ethical committees apply rules developed in medical research, in which trials of new medications or surgical procedures carry a risk of harm to subjects. Medical trials are required to ensure that subjects give their informed consent to participation, and there are comprehensive guidelines for what counts as ‘informed’ and ‘consent’. This of course has the advantage of transferring liability to the subject. People who have problems giving informed consent therefore present particular difficulties for ethical committees, and it is difficult to gain approval for any research involving children, people with dementia, or people with intellectual disabilities. This is even the case when the research is (like most social research) descriptive and therefore involves no risk of harm from medication or surgery. In fact, the emphasis among ethical committees on the issue of consent makes it almost impossible for an academic to get approval to carry out any descriptive research involving people with an intellectual disability.

This leads to a bizarre paradox: research is deemed ‘unethical’ even where it aims to expose the grossly unethical treatment of vulnerable people (and hence protect them from further abuse). We therefore know remarkably little about the real world of residential care as experienced by this group of people or the real extent of the neglect and abuse they may suffer. This leads to a further paradox. Academics carry out little research in this important field of public policy: we rely for what we know on journalists and investigators in charities.

See also:
http://stuartcumella.blogspot.co.uk/2010/08/research-without-fear.html
http://stuartcumella.blogspot.co.uk/2009/10/great-crackpot-ideas-of-past.html

Tuesday, 10 July 2012

Living the cliché: I remember the sixties

Yes, I remember the 1960s, even though I was there. I had of course spent the late 1950s as a teenage soldier fighting with Fidel in the Sierra Madre. But now I was back in London smoking dope with Mick and Keith, going to mass protests in Grosvenor Square, and raving at the Isle of Wight Festival. I rode the hippie trail to Katmandu, and went to the summer of love in San Francisco, but most of the time I lived in a squat in Chelsea. I naturally had long hair and a beard and wore a flowery shirt and kaftan (bought in Carnaby Street). I had free love with so many girls, who wore (successively) miniskirts, hotpants, and maxi-skirts. I remember that they all had long straight blond hair and lots of black eye shadow (although it was hard to tell because of all the Pink Floyd strobe lights in the discos we went to).

What happy days they were - for someone else. Actually, I was only 13 at the start of the sixties. After a miserable time at King Edward’s School in Birmingham, I spent a year in Handsworth Technical College doing a new set of social science A-levels (which were not taught in King Edward’s School), and went to the London School of Economics. I left in 1968 and started a master’s degree at the University of Strathclyde. I lived in Streatham, Finsbury Park and Glasgow, and never in a squat. Nor did I ever take drugs, though I drank a fair quantity of beer. I did go out with one blonde, but she had shortish hair and was careful with the eye makeup. I am ashamed to admit that I had a flowery shirt and for a time did have a beard.

My experiences of living in the 1960s were probably quite similar to those of many other people of my age. But when I think back, my personal history has become infested with another set of memories: the tired collection of recycled clips and clichés that appear on television programmes about that decade. This raises an important question: at what point do people reject their own memories and start inserting fictional ones in their place? The only corrective to this tendency may be some kind of home-made reminiscence therapy: watch contemporary films made in Britain, which show what the streets looked like, how people dressed, and how they spoke to each other. I recommend you start with A Kind of Loving, This Sporting Life, The Ipcress File, and Alfie.

Tuesday, 29 May 2012

The lost insight

1976 did not begin well for me. I was living alone in a town where I knew no-one, still legally married and waiting for divorce, working unhappily as a social worker in West Lothian. I could have regarded my new single status as an opportunity and travelled. But I was wounded and needed to return home. So, quite by chance, I learnt of a job with the new Employment Rehabilitation Research Centre (ERRC) in Birmingham. I was interviewed in London by the Head of the Centre, Dr Paul Cornes, and became a ‘Higher Social Work Researcher’ with the Manpower Services Commission. The ERRC was an unusual creation. Part of a civil service agency, it was intended to help solve the problem of employment rehabilitation centres. These provided courses a few weeks long for disabled people, with the aim of helping them return to ‘open’ employment (ie jobs in the ordinary labour market, rather than sheltered work). Being part of the civil service was a problem - publications were seen as breaches of the Official Secrets Act. The senior civil servants who had seen the Centre as a solution were soon replaced by newcomers who saw it as a problem. This was not helped by the almost total ignorance of disability and employment rehabilitation among these same senior civil servants. Paul Cornes navigated his role with great skill and increasing frustration, and proved one of the best bosses I ever had.

I joined the ERRC a few months after it began work, and replaced a kindly, committed but rather intense social worker called John Hannigan. The two of us overlapped in post for a week, to enable me to learn from him. We visited one or more (I forget how many) employment rehabilitation centres, and spent a lot of time talking. On the Friday, just after lunch, John gave me his considered verdict about myself, which went something like this:
“You’re a strange contradiction Stuart. On the face of it, you’re a brash loudmouth, but...”
At that point, someone came in the office and I never learnt the rest of the sentence. John’s great insight about me was lost forever. So I carried on being a brash loudmouth, albeit with a newly-acquired doubt that I might be a different sort of person underneath.

I stayed at the ERRC for some years, and, as usual, failed to take advantage of my employment. I did not publish, did not make the kind of contacts that would have developed a career in disability research, and failed to get a transfer to the permanent civil service. However, the research I carried out at the ERRC did form the basis for my PhD, and I spotted a rather special woman on the staff of the employment rehabilitation centre next door. The ERRC has long gone, but our marriage has remained the most important thing in my life.

Wednesday, 23 May 2012

The land of make-believe

You are watching the television programme Who do you think you are?, in which a famous person traces their family background. They start by visiting an aged relative. You see the famous person in their car talking about what they expect to learn from this visit and why this is important to them. You next see the car pulling up to the house of the aged relative and the famous person getting out and going to the front door. Inside the house, you see the aged relative go to the front door, open it, and say “What a surprise”.

It is not a surprise, of course. There is a cameraman in the house, probably supported by a producer, a lighting engineer and a sound engineer. The scene will have been carefully set up. The scenes outside the house and in the car (perhaps filmed on a different day) will also have been rehearsed and then edited. The whole scene, in other words, is a skilful fraud. But it is a fraud we are used to in factual programmes just as much as in film and television drama. This is the ‘naturalistic’ model, or the pretence that we watch people behaving spontaneously in the complete absence of all the cameras and the teams of people that make the pictures and sound happen. We are so used to this fraud that we no longer see it as such. When we see a lone presenter walking across the moors or up a mountain, we believe she is alone on the hills. Yet we see pictures of her taken from a helicopter, or pictures of her struggling up to some peak taken by a cameraman waiting for her on that peak. We are led to believe the presenter has walked the whole distance by themselves and found their own accommodation or pitched their own tent. The true surprise occurs when a presenter on television does mention the cameraman or production crew.

The frauds can go further. Wildlife programmes intersperse shots of animals in the wild (usually taken by a highly-skilled camera team after weeks of careful work) with pictures of similar animals in zoos. This is all to give the impression that when we see an animal such as a polar bear go into its lair in the wild, we are then seeing the same polar bear in its lair feeding its cubs even though these later shots are actually of a different polar bear filmed in a zoo. ‘Reality’ television programmes involve set piece arguments provoked by a presenter, all to give the impression that the performers before us present a shocking insight into the raw side of life. More shamefully, contests on children’s television programmes have been fixed to produce the most entertaining outcome.

This does not occur because the people who produce television are any more dishonest than the rest of us. Rather, they would probably justify these deceptions on the grounds that they help viewers comprehend an underlying truth (what family history tells us of the past, what a particular long walk is like, how polar bears feed their young, and so on). But they also do so to conform with the public’s expectations of television, derived in turn from films and other drama which, with very few exceptions, obey the ‘naturalistic’ model. This is so well-established that a whole genre of comic television programmes exists in which a presenter points out examples where the mechanics of film-making have accidentally been made evident (a visible sound boom or a shot of a camera crew in a mirror). These programmes are supposed to be funny because they show a breach of a sacrosanct social convention.

This is very different from drama on stage, in which it is more evident that we are watching actors portraying characters and following a script written for them. A good actor can make the audience forget this, provided the audience are engaged. Television, by contrast, requires less engagement by its audience and less suspension of disbelief. Indeed, the naturalistic model and the lack of apparent difference between factual and fictional programmes is a cause of morbid confusion for some television viewers, who send flowers when a character (not an actor) in a soap opera dies. Others have interpreted science fiction films and television programmes as the truth and truly believe that a race of intelligent reptiles have taken on the appearance of world leaders, or that the moon landings were an elaborate fake. For them, television (whatever they are watching) is the true reality, while their daily lives are but flickering images on the walls of the cave.

Thursday, 17 May 2012

Going local


The Conservative Party has recently discovered to its dismay that it is unpopular. This is despite having a leader who seems fluent and intelligent, and who actively promotes ideas about society and politics that deserve some respect. These ideas (summarised as ‘localism’ and the ‘big society’) echo some fundamental themes in British conservatism, dating from Edmund Burke or before. In particular, conservatives propose that society is not a structure that can be demolished and rebuilt at will, but an organic whole, in which each citizen has a wide range of personal affiliations which involve mutual interdependence, guidance, and support. The term ‘big society’ has presumably been coined by David Cameron’s advisors to contrast with ‘big government’, which, according to conservatives, involves the replacement of these relationships by the impersonal imposition of centrally-determined rules, and the creation of subordination to the state in place of the mutual interdependence of free people. Localism is the logical consequence of this central idea. Conservatives see an efficient government as an essential requirement for a civilised society, but wherever possible it should be organised as part of that society, being delivered locally by people who know each other’s needs, rather than imposed centrally.  

Conservatives do not of course believe in equality as a virtue. Indeed, many conservative theorists celebrate inequality - a feature that makes conservatism particularly attractive for those with great wealth or high social status. Yet conservatives are usually reluctant to recognise that the extreme inequalities in this country undermine their very aspirations for both the big society and localism. Wealthy corporations and individuals make decisions with little regard for the day-to-day lives of their workers or the people who live down the road from their offices and factories. They prefer to build houses where they like and erect large standardised supermarkets to replace the distinctive local shops in our traditional high streets. Local control of planning would inevitably obstruct such developments.

These conflicts can be seen in the government’s Localism Act and its proposals for neighbourhood planning. One element of the Act is the ‘community right to challenge’. This involves parish and town councils taking over the running of services previously provided by city, county or district councils. But once a case is made by the parish or town council, then the service is open to tender. This means that the likes of Serco and Capita, which excel at winning contracts but not at providing decent services, will move in and expand their empires.

Another conflict, particularly relevant to my own village, is over new housing. The Localism Act has introduced ‘neighbourhood planning’, which is intended to give individual parishes, towns and neighbourhoods of cities the right to develop their own plan for the development of their locality. ‘Town planning’, however, no longer involves the sort of vision that gave the world Bath or even Welwyn Garden City. Instead, it now means little more than the allocation of land for housing and commercial development, and ensuring that the necessary services are in place to support them. Nevertheless, devolving the right to determine land use is potentially important. The English countryside with its scattered communities is precious and vulnerable. But developers prefer greenfield sites (which are cheaper) in country villages (which are more attractive for prosperous commuters). Local control over development sites would enable towns and villages to block the speculative housebuilding which has covered so much of the countryside with breeze block and stud walls.

However, localism and neighbourhood planning contradict the general operating principles of national and local government in England, which are based on centralisation, secretiveness, and subservience to the powerful rather than respect for the views of the ordinary citizenry. In my part of England, three district councils have come together to develop the South Worcestershire Development Plan (SWDP). If it had followed the principles of localism, it would have identified future needs for employment, housing, transport and other public services in the South of the County, and then set some general targets to be discussed and implemented locally. Instead, the SWDP not only specifies individual development sites in each town and village, but also the precise number of houses to be built on each site. The total number of houses for South Worcestershire has been based on national targets (no doubt set after a cosy discussion between senior civil servants and the building industry), and the allocation of housing numbers to individual towns and villages has been based on sites already identified by developers. A ‘consultation’ by the District Council’s planning department consisted of a planner visiting the village hall and telling us what was going to happen.

And so Malvern Hills District Council, which only six years ago fought against an appeal by a developer to despoil a local view in my village, now supports the SWDP which schedules the very same site for 62 houses. At the same time, it is organising workshops to explain its commitment to localism. Perhaps they should try listening to people as a first step.

Sunday, 22 April 2012

The man in the iron knee

The photograph above was taken four years ago, on a trip with my son along the road on the South East side of Loch Ness. Note the bent right knee. This appears in many of the pictures taken at this time - in others I am sitting down, sometimes also with a grimace of pain. The cause of this distress was psoriatic arthritis, which affected my right knee and developed from an irritating pain into crippling agony over a period of two years. Standing with my leg straight became impossible. Walking became an ordeal. I changed from a man who ran up stairs to one who took the lift whenever he could. Even the shortest journeys required careful planning and the use of a car whenever possible. Even so, a trip of a quarter of a mile would leave me in severe pain and sweating with exhaustion.

Pain also crippled the soul. I was less able than I had been to stand up to the usual bullies and egoists that are found in most organisations (and certainly in universities). My sense of personal effectiveness declined, and the avoidance of pain occupied an increasing part of my waking thoughts. Psychologically, I became a painful knee with a man attached.

My return to health required surgery, which replaced my right knee with one made from titanium. This only took place after the NHS tried and failed with the various non-surgical alternatives, from steroid injections to physiotherapy and orthotics. Eventually, my wife and children urged me to request a knee replacement. I went to my GP, and a referral to an orthopaedic surgeon followed almost immediately. The surgeon explained in great detail the operation, the risks involved, and the probable outcome. There was no waiting time for the operation in the local NHS unit, where I spent four nights in a private room with en-suite facilities. After discharge, I received aftercare from physiotherapists and various home adaptations from occupational therapists. None of this of course required any negotiation with an insurance company or personal payments for treatment.

I spent an elated three months living without knee pain, until the back pain began. It seems that adapting to prolonged pain when walking had twisted my back. This was followed by further physiotherapy advice and some adaptations at my workplace. But by then, I had become tired of the long journey to work and took early retirement. Now I have lost weight, walk good distances, and live without pain. I still have the yellow car though.

Monday, 2 April 2012

How to teach skills

There is a crisis at the heart of British education - not university fees, academy schools, or the declining standards of A levels, but the inability of our education system to produce enough people with the kinds of skills employers need. How could this happen after two decades in which there has been a massive increase in university admissions? The answer is that universities are good at teaching knowledge and the skills needed to generate and analyse knowledge, but not very good at teaching the skills needed in other workplaces. Expanding the number of people who go to university therefore diminishes the proportion of new entrants to the labour force who have appropriate skills when they begin work.

What is the best way to teach skills? There is a useful example from history. In the middle of the Battle of Britain in the summer of 1940, the RAF upgraded 1,050 of its fighter aircraft by installing new constant-speed propellers. These added some 7,000 feet to each plane’s operational ceiling and significantly reduced wear and tear on aircraft engines. The way this was done is a model for how to promote innovation and how to teach skills. The new propellers were developed by de Havilland, which formed a team of expert fitters to tour each airfield. Each squadron was told to select its best fitters, who were trained by the expert team. Training involved three stages: the expert team would first demonstrate one installation; then supervise the squadron’s fitters as they carried out another installation; and finally assess the squadron’s team when they carried out an installation without supervision. If all went well, the expert team would move on to the next squadron.

This three-stage model of demonstrate-supervise-assess is the basis for all successful programmes for teaching skills. Of course, the de Havilland team was teaching one skill to people who already had many skills in this field. Most occupations require hundreds, if not thousands, of individual skills, and this means that training people to become skilled doctors, plumbers, or lawyers takes years rather than days. Most skills training also involves starting with newcomers who, unlike the fitters in RAF squadrons, initially lack relevant background skills. Some skills can be assessed quickly, but others (particularly those involving an ability to respond to emergencies or to deal with difficult or distressed people) require observation over a long period of time.

The example from the Battle of Britain also shows another essential feature of skills training: it should be done by the most skilled practitioners. These are the best people to judge what counts as skilled practice, and therefore also the best people to supervise trainees and to assess whether they are skilled enough to count as competent practitioners. This model of training, supervision and assessment by the most skilled practitioners has been a key principle in all trades and professions for as long as they have existed. Of course, skilled practitioners also need to be knowledgeable practitioners, and so training in the professions and trades has traditionally included more formal classroom teaching, often on a day-release basis.

And that was how almost all professions and trades were trained until recently. Most lawyers, nurses, librarians and members of the professions allied to medicine did not go to university, but gained their professional qualifications while working in a junior capacity in their chosen occupation. Most craftsmen became apprentices, and attended colleges of further education on day-release. The exceptions to this model, in Britain at least, were divinity and medicine. These were taught at university, although a university degree was not a prerequisite for a medical career until well into the 19th Century. Teaching medicine at university had the advantage of making it easier to standardise the curriculum and the way it was taught. There remained an emphasis on acquiring skills, because students spent much of their time in practice settings under the supervision of medical professors, who were chosen as the most eminent practitioners in their specialty.

All this changed during the last 30 or so years. Virtually all professions are now taught either as a university degree or following a university degree. This has enhanced the status of these professions, but it has come at a cost. Universities have become vast institutions competing for funds from the state and research funding bodies. Competition for funds has meant a greater emphasis by the universities on their research output, often at the expense of the quality of teaching. This affects the recruitment of senior academic staff, including those who are responsible for teaching skills. As an example, medical professors are now appointed not primarily for their skills as practitioners, but for their research record, even where this exclusively involves experiments with mice and rats in laboratories.

As universities have sucked up educational funds and ever more students, the teaching of skills for the rest of the population has been allowed to atrophy. In Germany, over half of all young people complete an apprenticeship, taught in companies and vocational colleges. Indeed, as many as 40% of all employees in Germany have completed an apprenticeship. The proportion is similar in Australia. In England, it is 11%. Apprenticeships in most countries take three or more years to complete. In England, the equivalent figure is between one and two years.

It is hardly surprising, therefore, that Britain has needed to import trained people from the rest of the EU at the same time that it has record numbers of young people out of work. The failure to train all our young people in vocational skills not only harms the British economy, it also demonstrates a cruel neglect of the young people themselves.

Saturday, 3 March 2012

The discovered country

The best part of a touring holiday is arriving at a place not in the guide books, and finding it to be beautiful and unique. I have had many such experiences. I remember a holiday in Brittany with my wife when my son Andrew was young enough to be in a pushchair. We passed through the small market town of Quintin on the way back from somewhere else, and stopped. We found streets of fine old houses and a chateau. At the chateau café, looking out on the gardens and the river, a well-dressed lady took a liking to Andrew, and fed him madeleine cakes.

Many years later, now with an adult Andrew and our adult daughter Rosemarie, my wife and I travelled Eastwards from Vancouver. This was a quiet winding route with scenery of staggering beauty. Travel meant crossing a sequence of wooded ridges, each running North-South, with long narrow lakes in the valley floors. We passed through EC Manning Provincial Park, Penticton, Kelowna (where we ate at an Indian restaurant whose owner was nostalgic for Birmingham), Upper Arrow Lake, and then, on the third day of travel, arrived at Nakusp. This was the perfect British Columbia village, arranged on a small grid plan, with a high street of buildings fronted with the kind of wooden facades seen in Wild West films.

Nakusp had once been an industrial community, but the lake had been raised and the lakeside industry flooded. There were now gardens and a lakeside footpath. The old Leland Hotel survived, although rather closer to the lake than it once was. Built in 1892, shortly after the village was established, it is ancient by BC standards. We stayed there, and had a view from our bedroom window that people in Switzerland would pay thousands to see.

I wonder if the European Union has any funds for village twinning, so that people from Quintin and my own village of Martley could get to know each other. Nakusp is rather far from Martley, but I intend to visit again as a one-man twinning association.

Monday, 20 February 2012

Graduating joy

University students invest several years of their lives and thousands of pounds of their money in their education. When their course of study is complete, they feel a great sense of achievement accompanied by a desire to celebrate. How well do universities help them in this task? From my experience in England and Scotland, hardly at all. I have attended three graduations as a student, half a dozen as a member of academic staff, and three as a proud parent. I found that these graduations fell into three types:

1.    Stuffy and self-congratulatory. This was my dominant experience of graduation ceremonies: when I received my master’s degree from the University of Strathclyde, my social work qualification from the University of Stirling, and my PhD from the University of London (I did not attend the ceremony for my first degree). Also stuffy and self-congratulatory were the degree award ceremonies I attended as a member of staff at the University of Birmingham. All of these events shared a familiar British obsession with meaningless procedures (usually ‘traditions’ of recent invention), the wearing of funny hats, and a complete lack of fun. In each case, large numbers of students were processed at high speed, with no recognition of their individual talents, the promise they held for the future, or the fact that all the professors, vice-chancellors, administrators and so on depended for their income on the money these students brought into the university. The most memorable of these events was at the University of London. It took place in the splendid setting of the Albert Hall, and was chaired by Princess Anne as Chancellor of the University. Graduands were supposed to bow to her when they paraded across the stage to receive their degrees. Some churlishly refused, and in each case an amused smile appeared on her face. The dullest graduation ceremonies of all were at the University of Birmingham, although the Medical School did make up for this afterwards by holding a slightly more informal reception for graduands and their parents. One irritating feature of Birmingham’s ceremonies was the refusal to honour students who graduated with diplomas and certificates rather than degrees. The University of Stirling managed to do this, but Birmingham preferred to exclude this group of students (while taking their money, of course).

2.    Utterly commercial. All of these graduations, however, were infinitely preferable to the ceremony at Nottingham Trent University (NTU), where my daughter graduated with a BA in Textile Design. The University imposed charges above her fees for all the materials she used in her degree, and then charged parents for attending the graduation ceremony. In other respects, the ceremony was exactly like the others, with a large hall, boring speeches, and an endless parade of graduands. There was a meagre reception afterwards. I suggest NTU’s administration have a chat with the academic staff teaching its degree in management and marketing, to learn the concept of customer care.

3.    Small and respectful. The best ceremony I attended was at Leeds University, where my son graduated with a BA in Politics. This was a small affair, limited to students in the same subject, and followed by a very friendly reception where staff, graduands and parents could meet over a buffet meal. I was pleased to find that the academic staff in the Department of Politics knew their undergraduate students well. The academic speeches were witty, and the students cheered each other as they received their degrees. My son later went on to the University of Leiden in the Netherlands, from where he graduated with an MSc in International Public Administration in 2011. I did not attend this ceremony, but I did see a video. It took place in the oldest building in the University, and, after the ceremony, graduands signed their names on its walls, next to those of centuries of previous graduates and honorary graduates. It must be nice to write your signature next to that of Albert Einstein. But what was particularly good about this ceremony was that each student’s academic supervisor made a speech outlining the subject of their dissertation and its contribution to knowledge.

I have seen pictures of graduations in other countries, which include parades of students through the local town, concerts, gigs, dances and general fun. This could happen here too, but it would need universities to recognise that their primary purpose is the dissemination of knowledge, that they collaborate with students to achieve this, and that success in this task is grounds for real celebration. Of course, it would also require the British middle class to lose some of its dreary stuffiness and learn how to have fun.

Sunday, 5 February 2012

A few words about snow

The Eskimos, it is reported in urban myth, have lots of words for snow. But English is sparse indeed when it comes to describing the white stuff. The media usually only distinguish between a ‘blanket’ of snow and a ‘light dusting’. Both of these terms are ‘dead metaphors’, which must at one time have required some imagination to devise. A similar dead metaphor is used to describe ice, which is always referred to as ‘treacherous’. To the literal-minded, this would suggest that journalists believe that ice is sentient, that it aims to falsely reassure us of its safety, and then without warning and with devious malevolence becomes cold and slippery just as we choose to walk or drive on it.

The main consequence of snow, according to the media, is ‘chaos’. This term, originally meaning a chasm in Greek, was later adapted by philosophers to designate the formless void that they believed preceded the act of creation (now renamed the ‘big bang’ by scientists). When used in the media, however, ‘chaos’ simply means any disruption, large or small, to transport timetables. Cars and trains are then said to ‘grind to a halt’. Of course, lines of cars standing motionless on a motorway are the very opposite of a formless void, but they still represent ‘chaos’ to journalists and the general public.

In Britain, transport disruption at the first heavy snowfall of the winter is regarded as a uniquely British phenomenon, not found in more organised countries. It thus becomes an opportunity for an intensive episode of national self-denigration. Yet, at the time of writing, cold weather and snow in much of Europe have disrupted transport and caused many deaths from accidents and exposure. This is true even in Germany (the most efficient country in the world as far as the British media are concerned). Still, it would be a pity to inject evidence and fresh thinking into the accustomed narratives and dead metaphors. Otherwise, the pages of our newspapers would be empty and newsreaders on television would stand mute before us.

Thursday, 2 February 2012

Driving and the English brain

Rudyard Kipling did not quite say “Who knows England who only England knows?”, but the meaning is clear: no-one can hope to understand their own country until they have become familiar with other places. Only then, can they understand what makes their own land different and special. However, there is an alternative to prolonged travel abroad, and that is to learn from foreigners who have become familiar with our own land. There are of course reports and books by foreign journalists based in London, but the most interesting source of information comes from the blogs written mainly by expats about their daily lives in many different parts of England. These are conveniently gathered in the website Blog England, Expat England. Most are written by women, and include family photographs, recipes, and reports on family holidays and outings. But they also compare England with their country of origin and reflect on the differences.

The bloggers have chosen to live in this country, and are passionate about the beauty of its countryside, and the way in which ancient sites and history are packed into a small area. Those in London are fascinated by the mix of nationalities in the city, and the experience of living in what is probably the biggest collection of theatres in the world. American bloggers are puzzled by the different names for food in England, the small size of domestic appliances, and the problems navigating in a land where street names change at each junction and where city grids are almost entirely absent. But the particular, and almost insoluble, problem for many Americans is having to drive in a car with a steering wheel on the right and with manual gears.

Why should this prove so difficult? After all, a third of the world’s population live in countries which drive on the left, and most of us in England adapt easily to driving in the other two-thirds. I think the reason for the particular problems experienced by Americans lies in how our brains adapt to daily mental exercise. There has been little research on how driving affects the brain, but a study over ten years ago found that London taxi drivers have a larger hippocampus (the part of the brain associated with navigation) than other people. Indeed, part of the hippocampus grew larger as the taxi drivers spent more time in the job. It is possible that the layout of the car we drive also affects how our brains operate.

Driving in the USA and Canada is a right-handed affair. The gear lever is hardly used in the automatic cars that are almost universal in these countries, and it is, in any case, to the right of the driver. There is no clutch for the left foot to use. North America is overwhelmingly right-handed when it comes to eating as well. Diners hold the food to the plate with the fork, and then cut a single bite-sized piece with the knife. The knife is then placed on the plate, the fork transferred from the left hand to the right hand, and used to bring the food to the mouth. The fork is then transferred back to the left hand and the knife is picked up with the right. This elaborate procedure seems to exist to ensure the main tasks in eating are performed only with the right hand, and to avoid the co-ordinated movement of knife and fork required by the usual European style of eating.

Even more left-right co-ordination is required when driving in Britain, Ireland and other places which drive on the left and use cars with manual gears. The left hand is frequently used to change gear and the left foot to press down on the clutch, while at the same time the right hand is used for steering and the right foot for the accelerator and brake. How does this affect our brains? Our brains are divided into two distinct hemispheres by a longitudinal fissure, and the limbs on each side of our body are controlled by the opposite hemisphere. Driving (and eating) in countries which drive on the left therefore requires a constant exchange of information and adjustment between the two hemispheres. It has been found that the part of the brain connecting the two hemispheres (the corpus callosum) differs in shape between men and women, and is larger among occupational groups such as musicians. My hypothesis is that it will also differ between typical drivers in England and those in the USA. If so, it would explain why people from North America find it so hard to adjust to driving in England.

This could of course all be tested by neuroscience researchers, aided by large sums of research council funding. At least it would get them out of their labs and on to the road.

Expat blog England 

Saturday, 21 January 2012

Flooring the Beast

In May 2011, President Obama’s car was successfully attacked by an Irish speed bump. The car (known as ‘the Beast’) is a triumph of American engineering. When it was unveiled to the press in 2009, a spokesman said that “Although many of the vehicle’s security enhancements cannot be discussed, it is safe to say that this car’s security and coded communications systems make it the most technologically advanced protection vehicle in the world.” The BBC noted that the car probably included “bullet proof glass, an armoured body, a separate oxygen supply, and a completely sealed interior to protect against a chemical attack... Some joke the car is so tough it could withstand a rocket-propelled grenade. Its tyres are said to work flat, so the vehicle will keep going even if shot at.”

The Beast’s vulnerability to a very low-tech attack by a speed bump is a metaphor for the fate of many military interventions in the last generation. The massive US effort in Vietnam was held back by an army of peasants equipped with AK-47 rifles and supplied by thousands of men pushing bicycles along trails through the jungle. In Afghanistan, both Soviet and NATO armies have been defeated by tribesmen with rifles and home-made bombs. Somali pirates make vast areas of the sea unsafe despite all the navies, nuclear-powered aircraft carriers and submarines of the world. Even the Israeli Army was driven out of Southern Lebanon by the Hezbollah militia, armed with obsolete Russian rockets and anti-tank missiles.

Why have armies and navies experienced such problems? One reason is that military forces need to be prepared for multiple threats, including the (hopefully rare) possibility of wars against each other. In such a conflict, armour, mobility and firepower would be crucial. They therefore compete to accumulate the best and most modern equipment, and this shapes the way they are trained and organised. The US Army, easily the best equipped in the world, took only a matter of days to gain victory in both its wars against the army of Saddam Hussein in Iraq. However, the same high-technology weapons are less effective against warriors who do not wear uniform and merge back into the local population when their spree of killing is over. High-technology armies could of course retaliate by bringing destruction on a vast scale to the civilian populations from which the guerrillas come, but governments are now much less willing to wage wars of extermination than their predecessors in the last century. Instead, there have been some limited punitive or reprisal raids, such as those against Fallujah or Gaza. In the absence of action against civilian populations, the main role of soldiers in high-tech armies is to be moving targets for snipers and roadside bombs.

There is a further problem with high-technology forces - their very complexity. All depend on long and sometimes vulnerable supply chains, particularly for diesel and aviation fuel. The most vulnerable supply chain of all is money, and this depends on the willingness of governments to go on spending it. The experience of the Second Iraq War has shown that it is possible to mobilise public support for a limited period if the enemy can be presented as a threat to the homeland. The Libyan intervention shows that the public will support a short war of bombing. But less than a quarter of British people now believe our troops should remain in Afghanistan, and no political party in this country bothers to make a case for remaining. It is likely that future NATO policy towards that country will fall back on the old British Empire policy of ‘butcher and bolt’ - punitive raids, carried out (nowadays by drones rather than by the Khyber Rifles) in retaliation for some terrorist outrage deemed to originate in that country.