Wednesday 22 December 2010

Mind and Society

One of the great unresolved questions in philosophy and neuroscience is the nature of mind. Many textbooks in these subjects begin their discussions with Descartes, who proposed a separation between matter (which occupies space) and mind (which involves thought and does not occupy space). Most discussions focus on how these two substances could interact, since a person’s mind can clearly lead to physical action. One increasingly dominant view (held by ‘materialists’) is that the mind is located in a series of chemical processes in the brain.

Less attention seems to have been given by philosophers and neuroscientists to the slightly different problem of the location of society. This might at first sight seem a puzzling question because the institutions of society and government seem solid and permanent - so permanent, in fact, that we give them the names of the buildings that are associated with their headquarters. We speak of the ‘White House’ or ‘10 Downing Street’ to designate their current political inhabitants and their staff. Or, extending the metaphor, we talk of the ‘structure’ of society. Yet this solidity is an illusion because apparently ‘solid’ governments and societies can disappear almost overnight. The German Democratic Republic had large armed forces, a secret police with records on the majority of the population and a network of informers (one for every 6.5 citizens), no overt opposition, and was defended against its enemies (and those of its inhabitants who wished to live elsewhere) by a wall, fences, minefields and machine-guns. Despite surviving with little change for 40 or so years, this ‘structure’ blew away in the wind in a few weeks in 1989.

This indicates that, however solid they may appear to be, societies are generated in the minds of individual people. For years, they wake up each morning and go about their daily routines, thereby maintaining patterns of behaviour that sustain those in authority over them. They may, from time to time, reflect on their lives and decide individually or in small groups to look for other work, seek a new partner, or move to another part of the country. These decisions will take account of the rewards and sanctions for alternative courses of action. On a few occasions, they may act in accord with many others to make a sudden and major change in their daily habits. This will result in attempted political uprisings, outbreaks of mass violence, adherence to a new religious order, support for a new form of music, and other such revolutions.

If society is in people’s minds, then where is it located? The materialist approach is unlikely to be helpful. Even if it became possible to interpret the chemical processes of the brain with greater sophistication than with the current fMRI scanners, all we would find in each individual would be sets of hopes, fears, habits of thought and expectations of routine behaviour. This would give us some understanding of how that individual functioned in society, but not much else. Instead, we need to think of societies and other human institutions as sets of recurring patterns of human behaviour that are akin to Descartes’ understanding of the mind. They are sustained by thought but are not in themselves a material substance. Perhaps then we could stop talking about organisations of people as if they were blocks of stone and concrete.

See also: Working in the Machine

Friday 17 December 2010

The Quality of Gulag Life

In his novel One Day in the Life of Ivan Denisovich, Alexander Solzhenitsyn drew on his own experience to describe life in a Soviet prison camp in the Arctic. This involved almost unimaginable suffering: imprisoned without any prospect of release for what would not be a crime in any reasonably free society; forced to work long hours in subzero temperatures; appalling food and accommodation; and a probable early death from illness, exhaustion, or murder by the guards. Yet the hero of the book looks back on his day with some satisfaction:
    “Shukhov went to sleep fully content. He’d had many strokes of luck that day: they hadn’t put him in the cells; they hadn’t sent the team to the settlement; he’d pinched a bowl of kasha at dinner; the team-leader had fixed the rates well; he’d built a wall and enjoyed doing it; he’d smuggled that bit of hacksaw blade through; he’d earned something from Tsezar in the evening; he’d bought that tobacco. And he hadn’t fallen ill. He’d got over it. A day without a dark cloud. Almost a happy day” (pp. 142-143).

This extract tells us something about how people assess the quality of their own lives. Almost everybody would rate the life of Ivan Denisovich Shukhov in the most negative terms, even compared with that of ordinary citizens of the Soviet Union at that time. Yet he had adapted to his way of life, and was able to win sufficient minor triumphs over adversity to regard it as ‘almost a happy day’. There is therefore sometimes a difference between a person’s quality of life rated by an observer, and that person’s own sense of well-being.

Researchers call the first of these an ‘objective measure’ of quality of life, compared with a person’s self-rating, which is termed ‘subjective’. But these phrases are misleading because ‘objective’ measures are simply the subjective assessments of others about what constitutes the good life. Researchers may poll lots of people in a particular society about what they wish from life and hence obtain some numerical score for quality of life, but in the end they are just measuring the extent to which a particular individual conforms to other people’s subjective assessments of what makes them happy. Needless to say, some people willingly choose a way of life that would not appeal to all of us - as monks spending a spartan life of prayer, as soldiers living in barracks facing daily danger, and so on.

Of course, Ivan Denisovich Shukhov did not choose to be in a gulag, but his life there had narrowed his awareness of the alternatives. This may have been essential for psychological survival, but it would also have made adaptation after release particularly difficult. Like Brooks, the elderly released prisoner in The Shawshank Redemption, he might well have come to see suicide as a reasonable option. This raises questions about whether we should give preference to people’s own assessments of their preferred way of life over ‘objective’ opinion. This already happens routinely when the person in question is deemed unable to make a rational decision because of childhood, mental illness or severe learning disability. But none of us ever makes a rational decision about what we wish from life, in the sense of considering and weighing all the possible alternatives. Instead, most of us choose between the limited range of alternatives we know about or just copy what other people do, while a few reckless souls leap into the unknown.

Does this mean that most of us are living in a gulag of our own making? For some people, this is true. It is possible to meet people who live narrow and restricted lives, without poetry, the excitement of sport, or the stimulation of friendship and good conversation. This can happen even when there are the financial means to live otherwise. My advice for escaping from such a gulag is to travel to new places when you can, try out new foods, learn new languages, make an effort to meet different kinds of people, read more and watch less television.

Read: Solzhenitsyn A (1963) One Day in the Life of Ivan Denisovich. Penguin, Harmondsworth.

See also: Looking down on others' needs

Tuesday 7 December 2010

I-Spy van Gogh (and a Rodin)

When I was young, I collected train numbers, bus numbers, and also filled in I-Spy books. Each I-Spy book covered a particular topic such as vehicles, aeroplanes, birds, insects and so on. Each would include information about a particular car, animal, etc., and space to record where you first saw it. The book awarded marks for each first sighting, graded according to rarity. I-Spy books still exist, glossier than they were in the past, but still small enough to fit into a boy’s pocket.

I-Spy books are for collectors, but collectors of memories rather than objects. The dominant drive is the same in both cases: the desire to accumulate for its own sake. However, the I-Spy motive is the most innocent form of accumulation. The accumulation of memories does not usually destroy what is being accumulated, or deny others its pleasures. Collectors of memories do not steal or hoard great works of art, but view them in galleries and tick them off their mental list.

I suspect that the I-Spy motive drives a substantial proportion of those who visit art galleries and museums. There are of course other reasons for seeing great works of art - to be inspired, to look with awe at a work completed with great skill, or to understand the mind of the artist. However, these more aesthetic objectives are hard to achieve when visiting the major art galleries, because crowds gather round the most familiar paintings - not for their qualities, but because they are famous, and because visitors wish to tick them off in their mental I-Spy book of great paintings.

I found this to be true last month in the van Gogh Museum in Amsterdam. The museum was packed, with people gathering around the most well-known pictures. Van Gogh was a particularly obliging artist for collectors because he often painted multiple pictures of the same subject - nine different pictures of sunflowers in a vase, now scattered around the world in different collections. How many people would appreciate these paintings if they were not in all the art books, and sold for vast sums to museums? Some of the early paintings by van Gogh certainly struck me as dismal efforts.

The Potato Eaters is a grotesque cartoon of peasant life. The peasants were painted as ugly, semi-animal figures with outsize hands and noses. This was supposedly a work in honour of manual labour, but it is degrading. I suspect that the peasants who were models for the picture would have preferred to be seen in their Sunday best, to have their dreams and hopes respected.

A pair of pictures elsewhere in the museum also illustrates the limitations of van Gogh’s early work. His view of Paris seemed to me an uninspired technical exercise, flat and conveying nothing of the city and its life.

Next to it in the gallery was Bonnard’s view of Montmartre which, like van Gogh’s picture, was painted from the window of a flat. Bonnard’s picture sparkles with city lights in the rain, evoking the excitement of a great city at night. But Bonnard is not as famous as van Gogh, so the I-Spy crowds did not gather around this wonderful picture.

Van Gogh’s later work is of course far more attractive and commercially successful (although sadly only after the artist’s early death). His series of paintings of iris flowers is utterly beautiful and, like many people, I have a reproduction of one of these on my living room wall. But I was affected much more by the sole sculpture in the Museum, by Rodin, of one of the six Burghers of Calais. This showed the anguish of the senior men of the City, who had agreed to surrender their lives to the besieging English army in exchange for its agreement not to massacre the inhabitants when the City surrendered. The English king ordered that they walk out of the City gates dressed in rags, wearing nooses around their necks, and carrying the keys to the city and its castle. Rodin’s sculpture portrays the anguish of the starving magistrate, his teeth gritted in determination and sacrifice, as he stumbles towards surrender and death. These emotions burn our hearts when we think of the lives of our forefathers in Europe who lived through the first half of the 20th Century - the century of genocide and mass murder, the century from hell.

Much more comforting to turn aside and look at sunflowers.

Tuesday 30 November 2010

Empire Windrush Day, Harvest Home and others

Everyone in England knows that our public holidays are a mess. Half of them fall within about six weeks in Spring, there are large gaps in the calendar with no public holidays at all, and none specifically celebrate England and its people. I therefore propose a new set of public holidays, spread a little more evenly through the year, and all designed to celebrate more than just a day off for the banks. First, the holidays we should keep:

▸    Christmas Day and New Year’s Day. These bracket what is now the most important holiday of the year - the one week we all celebrate together. It is impossible to imagine life without this long break in the middle of the cold, dark and wet English winter.

▸    Good Friday and Easter Monday. Let’s keep these as well. They are a bit of a problem because Easter wanders about in a way that is incomprehensible even to the devoutest believers. In this century, Good Friday has varied between the 25th of March and the 18th of April. Still, this is a Christian country with an established church, and we should celebrate its most important holidays.

We should abolish the early Spring break, late Spring break and late August bank holidays, and replace them with these new holidays:

▸    Empire Windrush Day (22 June). This is named after the passenger liner which brought 492 Jamaicans to England in 1948. England has been a destination for immigrants for centuries, and the English are a mix of Gaels, Germans, Scandinavians, French, West Indians, South Asians and a lot more besides. We could choose any day to celebrate the mixing of people that has created the English, but the Empire Windrush is special because many West Indians see it as their Mayflower. So we should respect that.

▸    Harvest Home (Monday after second Sunday in September). This is a traditional country celebration of the harvest, which has been adopted by the Church of England as the Harvest Festival and by North Americans as their Thanksgiving holidays. The usual time for Harvest Home is near the Autumn equinox, but I have chosen a slightly earlier date to space out the public holidays better. Harvest Home should celebrate the productiveness of this land and the people who work on it.

▸    Remembrance Day (11 November). This day is a public holiday in many countries around the world, but not in Britain. This is an insult to the memories of the 994,000 British people who died in the First World War, the 450,000 who died in the Second World War, and the many thousands who have died in other conflicts. We should celebrate their lives quietly, and think also of those who serve in the armed forces.

▸    Shakespeare Day and St George’s Day (23 April). This falls in Spring and in some years even coincides with Easter, but there is nothing I can do about that. This day should celebrate England as a nation which has made a massive contribution to civilisation, science, the arts, and good government. Shakespeare was our greatest poet and playwright: a man who used our language like no other. We should therefore celebrate him every year, as well as the creativity we share.

Thursday 25 November 2010

Two new ways of making money

My search for financial security continues. So here are two new entrepreneurial ideas to be submitted to the Dragons’ Den of the Internet. Cash incentives to be sent to me in a plain brown envelope please.

No. 1. Smokers Airlines. My idea for this came when I heard some years ago that some friends of my parents were no longer taking their usual annual holiday in Tenerife because the airlines had banned smoking. Their need for nicotine was so great that they were only able to bear the shortest of flights. Since then of course, smokers have been restricted even more, being driven out of restaurants and bars, public buildings and workplaces. This is driven by more than just concerns about health and ‘passive smoking’: all non-smokers can now take part in the joy of persecution, allied to the joy of feeling superior to those who apparently cannot control their chemical desires. What better idea, therefore, than to enable this persecuted minority to gather together when they travel? Smokers Airlines will be staffed entirely by chain smokers and will only accept bookings from people who can produce a doctor’s certificate proving they are a smoker. Free cigarettes will be handed out when people board, and nicotine patches when they leave the plane and have to endure the smokeless terminal building. All our planes will of course have voluminous ashtrays and plentiful supplies of fire extinguishers.

No. 2. Crash helmets for pedestrians. This idea came to me when I went last weekend to Holland to visit my son. The streets, pavements and squares were full of cyclists. All railway stations and public buildings had acres of cycle racks. Every piece of railing or fence had several bikes padlocked to it. Among all the thousands of cyclists, not one wore a crash helmet. This was not because the Dutch are neglectful of safety, but because cyclists are the dominant traffic life form in the Netherlands. There is no need, as there is in England, to signify vulnerability by wearing bright clothing or a shiny crash helmet. Indeed, pedestrians in the Netherlands are the most endangered traffic life form - my wife was run over twice by cyclists while innocently walking through Amsterdam and Leiden.

Selling the idea of crash helmets for pedestrians will of course require a substantial marketing budget. This will involve media scares about attractive young people (preferably female) who have suffered brain injury as a result of falling over while walking. The danger of tripping up and falling down would be emphasised. Papers would note that this is most likely to happen to young people and vulnerable elderly people, and suggest that crash helmets should be made compulsory for pedestrians in these categories (as a first step). An MP (supported generously by the crash helmet trade association) would introduce a private member’s bill before Parliament. The crash helmets will of course all be made in China.

If these ideas don’t make money, I can always go back to my original idea of complimentary therapy.

Compliments not complements

Thursday 18 November 2010

Antoni Cumella: the greatest Cumella of all

I visited Barcelona for the first time in the mid-1970s, just after Franco died. This seemed to be a city on the edge of revolution. The streets were patrolled by menacing squads of the Policia Armada, dressed in well-cut gray suits and lemon yellow cravats. Each night, some would station themselves on the Ramblas, while the surrounding population jeered at them by voice and car horn. They could do so without fear because the old murderous order established by Franco was disappearing with astonishing speed, and the Policia Armada consequently no longer inspired fear. Street signs in Castilian Spanish were being replaced by Catalan. Calles were becoming Carrers. Around every corner in the Gothic Quarter there were headquarters of various short-lived Marxist and anarchist parties, each with a large red flag over the doorway. There were vast left-wing marches down the Passeig de Gràcia. The cinemas were full of pornographic films, mainly variations on Emmanuelle. The whole city exuded excitement - Catalonia and Spain had awoken from their long coma, and were joining the world.

I was aware that my family had come from this city, but it never occurred to me to seek out a relative. And so I missed meeting the greatest person to bear my family name: the ceramicist Antoni Cumella. He was born in 1913 in the town of Granollers near Barcelona, and died in 1985. Antoni’s stepfather was a potter, and so he followed him into the family business. But he was greatly influenced by Mies van der Rohe’s German Pavilion at the 1929 Barcelona Exposition, and began making ceramics as art. He was a medical officer with the Republican Army in the Civil War, and was imprisoned at its end. After his release in 1940, he exhibited his work internationally, achieving great success in Germany, where he had a joint exhibition with Marc Chagall. Antoni Cumella is famous in his home town, where the local high school (IES) is named after him.

The last time I went to Barcelona, I made up somewhat for my earlier ignorance. I visited the Royal Palace Museum, which houses a collection of ceramic art in which half a room is dedicated to the work of Antoni Cumella. His work shows a wonderful command of shape and colour, ranging from objects of recognisable form to abstract ceramic sculpture. The pictures show some examples from the wide range of distinctive artforms that he created.

This raises the question for me of why sculpture is regarded by art critics as a kind of high art, while ceramics is seen as a craft. Both require great skill, and the ability to work in shapes and forms, but ceramic art adds colour. 

After I returned from Barcelona, I learnt (from Google) that Antoni Cumella’s work is continued by his sons, still in Granollers, at Ceràmica Cumella. So that’s the destination for my next trip.

More details are at this website:

www.antonicumella.org

Sunday 14 November 2010

The rise of X Factor politics

Once we watched television to see the best singers, actors and dancers. Now we watch incompetent amateurs get a little more competent week-by-week at singing, dancing, or other skills. These are amusingly called ‘reality shows’, but they are as staged and as fake as almost all television. Reality shows come in three distinct formats. The first and least popular are the talent shows. These involve amateur or semi-professional performers, cooks, and dancers who have a genuine talent and are assessed in a friendly but critical manner by a panel of experts in that field. In programmes like Britain’s Best Dish, or So You Think You Can Dance?, contestants clearly have a lifetime commitment to their skill, and usually state a wish to become professionals.

The second format is the celebrity learner show. The subjects here are people from entertainment, sport or some other field in which fame can accrue, who lack previous experience in dancing, cooking or whatever, but who learn week-by-week under the guidance of an expert. In Strictly Come Dancing (called Dancing with the Stars in many countries), the winners usually demonstrate a real advance in skill, but never seek a career in their newly-acquired expertise. Their fame may, however, be enhanced.

The third format is the most popular: the Cinderella show. These superficially resemble the talent show, but have a quite different appeal. A wide range of people from humble backgrounds are recruited, most with minimal talent. These are weeded out in some amusing but rather cruel episodes, and a select group go forward to become temporary celebrities. In The X Factor, this involves performing before a vast audience, surrounded by dancers, lasers and so on. The appeal lies not in the special talent of the performers, but in their ordinariness. They are Cinderellas, transformed suddenly from the sculleries of life to become princes and princesses. The audience can empathise and imagine themselves as stars for a day. The ‘judges’ in these programmes understand the rules of the Cinderella shows very well. They are usually not critical of the (lack of) talent of the performers - instead they judge them according to their personal qualities and their ability to withstand the pressure of fame. The mass media too report in intimate detail the personal lives of these new celebrities-for-a-day, but have little to say about how well they sing one pop song or another.

Reality programmes have become so dominant in popular culture that they can affect how the public sees the wider world, including politics. Politics, like other occupations, has its career paths. In most countries, this involves either a demonstrated skill in winning election through a sequence of ever more important elective offices, or some form of apprenticeship in national policy-making. Political skills are hard-won, and success usually requires great perseverance. But reality programmes promote the amateur: since complete amateurs can apparently become singers, cooks, and dancers on television shows, they can surely also become national politicians. Indeed, their very ignorance and lack of experience can be promoted as a sign of their integrity and their ability to represent the ordinary citizen. In the USA, the Tea Party movement extols this kind of X Factor politician, and Sarah Palin is the archetypal Cinderella figure. Her opponents fail to understand that criticising her for her aggressive ignorance and lack of experience increases her apparent ordinariness and hence her appeal to voters. Critics, like judges in reality programmes, are booed and barracked by the audience if they dare to make a less than complimentary remark about the Cinderella performer in front of them.

In the UK, there is dissatisfaction but as yet no X Factor politics. Instead, we have gone in the opposite direction, turning to the old elites. The Prime Minister is an Old Etonian, and the cabinet is dominated by Oxbridge graduates who have been privately-educated. The Labour Party, which once provided a route of advancement for people from more humble backgrounds, is now dominated by a group of academic and media families from North London. These elites may be drawn from a narrow range of society, but at least they transmit an impression of confidence and competence. But if the economy and public services deteriorate, voters might decide that it is time for Cinderella to replace the prince.

Friday 12 November 2010

What old men wear

Each generation lives in its own world of customs and tastes, carrying on into old age the habits and styles of dress they learnt in their youth. On the few occasions I go to the supermarket in the morning, it is full of the very elderly - people in their 80s and 90s trundling around behind their trolleys. What I notice most are the strange clothes worn by the old men: shapeless fawn jackets, often worn with fawn trousers and shiny black shoes. Some wear hats, flat caps being the most popular. This style of dress is not worn by any other age cohort, and certainly not by those in their 60s, who are usually seen in jeans and trainers and who rarely wear hats. 

This shows the peril of talking about ‘the elderly’ as if they were a single undifferentiated group. The truth is that there is a major cultural divide between those in their 60s and those in their 80s, resulting from the great postwar changes in British society. A man aged 85 would have been born in 1925, would have served in the armed forces in the Second World War (or at least in some war-related occupation), would have been raised at a time when what was once called ‘leisure wear’ was unknown, listened to swing music and crooners when an adolescent (teenagers had not yet been invented), and thought about sex before contraceptive pills were invented. Someone who is 65, by contrast, was born after the War ended, listened to rock music, wore jeans and casual clothes, and met women whose sexual desire was no longer constrained by the fear that it would lead to an unwanted pregnancy. A man in his 80s came of age fearing death in an actual war. A man in his 60s was raised in peacetime but feared nuclear war.

This is not to suggest that a man in his 60s resembles younger age cohorts. I was born in the suburbs of Birmingham in 1946. As a child, I lived in a world in which virtually no-one owned a car. The roads were therefore empty, and we played in them whenever we were not at school until it got dark. We walked to primary school by ourselves at quite a young age, and came home during the dinner break for a substantial meal cooked by our mothers, who of course did not go out to work. Children who stayed indoors were a matter of concern to us all. We knew of hardly any children from single-parent families, no-one with skin darker than our own, and no-one whose first language was not English. The only celebrities were film stars, and television was broadcast for a few hours a day. There were no computers of course, but we read books and comics.

People younger than my generation live in a world with new fears. There is the unspecific threat of some sort of ecodoom. But local fears are more important, and constrain the lives of children. Their parents are rightly scared of traffic, but also have fantasy fears that the streets are full of murderous paedophiles. Children still play in the street, particularly in quiet towns and villages, but spend much of their time in front of computers and television screens. The clothes they wear seem to designate membership of one or other youth tribe, each associated with a particular kind of music. This seems a more elaborate world than the mods and rockers I remember. Still, none of them wear shapeless fawn jackets.

Wednesday 3 November 2010

The curse of the course

Beware of your metaphors, for they shall make you their slaves. We use metaphors so often in our everyday speech that we fail to recognise how they smuggle implications into our thinking. One example (discussed in a previous blog) is ‘stress’. Another, much used in education, is ‘course’. This word, presumably taken from horse-racing, has multiple smuggled implications. On a racecourse, all the horses start at the same line at the same time. They all jump over a pre-determined sequence of fences in the same order. They all complete the course at the same finishing line, being ranked according to the order in which they finish.

Applied to education in universities, the metaphor implies that groups of students on a course all start and finish their studies at the same time, progressing through their experience of learning in the same order and jumping over the same set of assessments. Students are ranked at the end of the course, with a mark taking the place of speed of completion. However, students who do not complete the course in the same time as others are regarded as non-completers and fail. Let’s challenge each of these smuggled implications.

In the first place, there is no need for students to start and finish a programme of study at the same time. Many people wish to study part-time, and combine university education with work. This usually means that their studies take at least twice as long as a full-time student’s. Many part-time students are mature and have families. They are therefore more likely to need a break in their studies because of childbirth, change of employment, and so on. This is administratively inconvenient for universities, but part-time study may soon be the only way in which many people are able to pay for their studies. Part-time study has another advantage: academic education can be dovetailed with vocational training, enabling students to apply their increased skills to the workplace and increase their productivity. Universities began as, and largely continue to be, places where people are educated into the knowledge, skill and values required for particular occupations. But the domination of full-time study in British universities has split vocational education from vocations. As a result, employers complain that graduates lack the basic skills needed to perform their work, while many graduates fail to find employment appropriate to subjects they have studied for three or more years.

Secondly, not all students need to follow the same sequence of learning materials. In many areas of knowledge, some subjects do indeed have to be understood in sequence (the ‘building block’ approach). But this is not always true. In many of the programmes and modules I have taught, much of the material could be studied in any order. Other approaches to learning require students first to attain an overall (if rather simplified) perspective of the subject before studying a series of individual areas in more detail, leading to a greater understanding of its complexity. In such cases, students need a shared introduction to the subject and in some cases a shared conclusion, but can explore the remainder of the curriculum in any order at their own pace.

Thirdly, ability should not be confused with speed of completion. Some people just take longer to learn, but are as capable at the end of their studies as the fast finishers. Why should they be penalised or categorised as failures because they take six months longer to acquire the necessary skills, knowledge and values than the average student? One reason is that universities do not have a defined threshold of skills, knowledge and values which students should attain. Instead, they rank students at completion of their course by ‘first-class honours’, ‘upper second’ and so on. Yet this ranking system has become increasingly meaningless because of grade inflation. In the last 10 years in England, the proportion of students awarded first-class honours has doubled, while another 60% now receive an upper second. This has happened at a time when the proportion of school-leavers entering universities has increased substantially and teaching hours per university student have decreased.

Why is the metaphor of the course still so dominant in British higher education? One reason is that it is convenient for universities and for the government agencies that fund them. Full-time students can all be processed efficiently on a three-year conveyor belt, exams can be set at the same time for everyone, and universities can be freed from the difficult business of co-ordinating academic and vocational education with employers. Governments can fund universities using a simple block grant based on the assumption that the great majority of students are full-time. Indeed, the current funding system in England disadvantages part-time students.

Yet there are types of university education which do not correspond to the course metaphor. Professional training degrees (such as in medicine, nursing, social work and the professions allied to medicine) require students to complete part of their training in hospitals and other workplaces, where they acquire skills under the supervision of senior professionals in health and social services who have been co-opted into university education. Degrees of this kind also aim to produce students who meet a defined standard of professional competence, rather than rank them by the marks they achieve on their assignments.

This model could be expanded to other degree programmes. Then, instead of students being given a general education with limited vocational training followed by employment in a lowly administrative post, they could get a job and study for a vocational qualification part-time. This would probably require more distance learning, but we have excellent institutions in this country that can provide it. It would also free up large areas of our cities which have been given over to student rentals and are occupied for only 30 weeks each year. This would have the very useful side-effect of doing something practical to reduce homelessness.

See: My life as a steam engine

Friday 22 October 2010

The ‘real me’ and the ‘actual me’

We sometimes explain a desire to break with the routines of life, to do new things or even to go on holiday as a desire ‘to find ourselves’ or ‘find the real me’. Since, by definition, we have yet to find the real me, we are usually rather vague about what it looks like, which means in turn that we can never be sure if we have found it or not. But we talk as if the real me is our authentic identity, more creative and sensitive than the way in which we are usually experienced by others and by ourselves. For some of us, the real me may be a more decisive and courageous version of ourselves. But the common element is that we use the term ‘the real me’ to designate the type of person we wish to be rather than our actual identity. Few if any of us seek a real me that is a stupider, more insensitive, or more boring version of ourselves.

It is not clear whether finding the real me is ever successful. When we are on holiday, we are often required in a short space of time to drastically change our way of life, to encounter new foods, new places, and new ways of relaxation. But we soon re-assert routines - after a few days of disorganisation, we end up going to the same harbour café to drink a glass of San Miguel at the same time each day, have a meal at the same time (though perhaps at a different time than at home), and look at the same beautiful view each evening. There is a sense that the holiday is settling down as we experience the comfort of the familiar.

Our sense of the real me might therefore shape our behaviour and aspirations, but is ultimately kept close to our actual me by our need for routine. The reactions of others are also an important check on fantasy. Claiming we are a great singer when we are not will usually invite ridicule. We can only persist with this illusion if we categorise others as exceptionally misguided or hostile. Like Don Quixote, we thereby invite humiliation, which we can see when people are recruited for programmes like The X Factor or The Apprentice.

As we get older, however, a subtle change occurs in the relationship between our sense of the real me and the person we actually are. We look in the mirror and see a younger version of ourselves. Less biased sources of information, such as photographs, come as a shock and are rejected. We still believe we are capable, as we once were, of striding over hills and running down streets, when now we can only stumble. When people ask us if we need help, we refuse because we believe our ‘real me’ is capable and independent. We hold dearly to our discrepant identity so that, like the rejected contestants in The X Factor, we become angry when faced by the evidence of our incompetence. So many old people reject the help they need, and decline into isolation as a means of keeping alive their real me.

Friday 15 October 2010

Enlightenment and Authority

In 1784, the philosopher Immanuel Kant completed an essay on the nature of ‘enlightenment’ and its implications for society. He defined the term to mean the personal transformation of people’s way of thinking, not just by the accumulation of learning, but by individual people’s willingness to derive their own conclusions about life based on their reason, intellect and wisdom. He equated ‘enlightenment’ with intellectual maturity, which he contrasted with depending on others (such as religious authorities) for one’s beliefs and opinions.

Kant was aware of the problems generated by enlightenment. Most people (even those in authority) were not yet enlightened by his definition. But if enlightenment became common, then there would be problems in maintaining order - people might choose to disobey their rulers, women their husbands, and children their parents. The answer was for enlightenment to be coupled with obedience. People could be enlightened in their private life, but should adhere to convention and consequently express accepted views in their work and public duties. The exemplary society in this respect was the Prussia in which he lived, and Kant designated his time as ‘the Age of Frederick’, named after the authoritarian king of that land.

Kant’s conclusion that enlightenment is compatible with and may even require authoritarian government has been a common stance among public intellectuals since his time. Intellectuals may not support such governments with the same enthusiasm as Kant, but they have limited power in wider society, and therefore depend on that of their rulers to ensure the application of their ideas. This has been particularly the case when intellectuals have sought to achieve major changes in the lives of the rest of us. Given the ‘unenlightened’ character of most people, and hence their probable resistance to such schemes to improve their lives, it has been particularly tempting for intellectuals to support the use of authoritarian methods to create a new kind of person. Historically, this has often involved the mass slaughter of many of the older kinds of person. Popular resistance could be rationalised as ‘lack of enlightenment’, ‘superstition’, ‘false consciousness’ and so on.

Even in more democratic societies, intellectuals may find authoritarianism tempting as a means for achieving a better life for the rest of us. Jeremy Bentham proposed an extraordinarily dehumanising regime for prisoners called the ‘panopticon’. Later intellectual reformers proposed that mental illness and intellectual disability could be best managed in vast authoritarian institutions. In the 20th century, public intellectuals argued for massive slum clearance projects, fragmenting communities and re-housing people in poorly-maintained blocks of flats built miles away from their families, employment, entertainment, shops and so on. The Red Road flats in Glasgow were one of the extreme demonstrations of this type of social engineering.

What these exercises have in common is the belief that the good life can be discerned by reason, and that it can be applied in an undifferentiated way to whole categories of people such as ‘the peasantry’ or ‘the working class’. This simplified view of the complexities and diversity of people may be a product of seeing the world through books and political debates rather than through observation and experience. In other words, many intellectuals are themselves unenlightened by Kant’s definition.

Wednesday 11 August 2010

England’s great divide walk

Several years ago, I read Stephen Pern’s excellent book, The Great Divide. This described his walk along the watershed between the Pacific and Atlantic Oceans in the USA. He began in the Mexican desert, and followed the crest (or as close to it as he could manage) of the Rocky Mountains to the Canadian border. If he had continued North, passing West of Banff, he would have come to the Columbia Icefield. This glacier is the origin of rivers that flow into three oceans - not just to the West and the East, but also to the North. The main Northbound river is the Athabasca, and you can follow it along a road called the Icefields Parkway, as it descends from a mountain torrent to a wide sweeping river between meadows near the town of Jasper. After Jasper, the river travels hundreds of kilometres North to become part of the River Mackenzie, flowing through tundra into the Arctic Ocean.

The courses of rivers and their watersheds are easy to follow on the maps of a vast empty country like Canada, but much harder in a crowded one like England. So what would be the route of a great divide walk in England? Most people would guess that the Northern section would be close to the Pennine Way. This is generally true, although the watershed actually crosses the Scottish border several miles West of the Pennine Way, near Kielder. From there, it heads South along the boundary between Cumbria and Northumberland, towards Once Brewed near Hadrian’s Wall. The great divide walk would then follow the Pennine Way South to Edale, close to the crest line of the Pennines, across Saddleworth Moor and the High Peak. But what happens after you reach Edale?

Following the map, you can trace a strange circuitous route around the heads of lowland river systems. First, you would walk South around the Western edge of the Peak District, near the Roaches, and then across Staffordshire to the West of Stoke and Stafford. From then on, my imaginary long distance path is difficult to trace, as it passes through the Black Country over Frankley Beeches to the Lickey Hills South of Birmingham. Beyond that, it would swing East and then South in a long arc around the catchment of the Warwickshire Avon, eventually reaching the Cotswold escarpment and the Cotswold Way. You would follow this lovely scenic long distance path for most of its way until a few miles North of Bath. The great divide path would then need to head East around the watershed between the Gloucestershire Avon and the Thames until you reach an area called the North Down near Devizes. This is England’s equivalent to the Columbia Icefield, albeit a low-lying hill without ice. From it, rivers head West to the Atlantic, East to the North Sea, and South to the English Channel.

As human beings moved back into Britain after the end of the last Ice Age, they would have travelled along the dry watersheds to avoid the marshy and tree-clogged valleys. The North Down and the nearby Salisbury Plain would then have been the great junction of this Stone Age transport system. Early inhabitants marked this busy place with rings of standing stones, white horses carved into chalk hills, barrows and mounds. It was a kind of commercial, political and religious city, dispersed over several hillsides and occupied seasonally. It would be a superb end to my Great Divide Walk. While I walk it in my imagination, others may do so on foot.

Friday 6 August 2010

Research without fear

A week ago, my computer connection to the Internet ceased. This was puzzling because my son was still able to log on from his laptop, via the home wireless network based on my computer. Eventually, I traced the problem to the Norton Security software I had installed. With remarkable success, this had prevented any viruses infecting my computer by blocking access to all websites. Computers of course mimic the humans that create them. The Norton approach is found among experts on ‘security’ who argue that the only way to protect our liberties is to lock up without trial people who might possibly be terrorists, give the police free rein to assault and kill peaceful demonstrators, and subject all citizens to permanent CCTV surveillance. A similar destructive enthusiasm is found in the system used in the National Health Service for assessing the ethics of proposals for research.

Just as there are real computer viruses and real terrorists, so there is a true history of unethical research on patients. The most notorious case, described in every book on medical ethics, was the Tuskegee Experiment carried out by the US Public Health Service, in which 400 poor black men infected with syphilis were monitored from 1932 onwards. Although penicillin was identified as an effective treatment by the mid-1940s, none of the subjects of the research were informed or treated, leading to the infection of their spouses and other sexual partners, and their children. The whole ghastly racist experiment only came to an end when a whistleblower informed the New York Times.

This case is a warning that medical scientists are no more ethical than other people, particularly when the subjects of their research are poor and from racial minorities. Bad ethical practice in research still exists, although usually in a less extreme manner than the Tuskegee Experiment. There have been cases of people included in clinical trials without their knowledge, people being denied effective treatments, pointless research inflicted on people, and so on. To prevent these kinds of problems, the National Health Service has set up a complex network of ethical committees to thoroughly assess all applications for research. This is backed by a parallel system of ‘research governance’, which ensures that the recommendations of ethical committees are followed by researchers, that research is insured by its ‘sponsor’, and that the cost implications of the research for the NHS are taken into account.

As governments have attempted to extend the protection of human rights (the 2005 Mental Capacity Act, the 2004 Human Tissue Act, and others), so the work of the ethical committees has become more demanding. Nevertheless, the system has improved greatly in efficiency in the last few years, and the various local ethical committees strenuously seek to protect the public from unethical research. However, there is a problem - the same sort of problem encountered by all who seek to avoid risk and create a world free of fear.

All autonomous or creative human action, whether individual or collective, involves risk and therefore danger. The risk may be very small, but the resulting fear may lead to disproportionate and even harmful precautions. In my village, both the primary and secondary schools are within easy walking distance for most children. But many parents drive their children to school because they perceive the quiet roads of a peaceful village to be dangerous and the pavements crowded with paedophiles. As a result, the roads through the village at the times the schools open and close become crowded with large 4x4 vehicles. This has the effect of making travel more dangerous at these times, even for those children and parents who do walk to school. Disproportionate precautions of this kind are particularly common among those with a professional responsibility to protect the rest of us. After all, it seems common sense to believe that one cannot be too safe or too ethical.

In the case of ethical committees, this can result in extreme precautions being invoked for the simplest and least offensive of research projects. One of my master’s students (a qualified and very experienced child mental health nurse) proposed to interview a small number of experienced and qualified paediatric nurses about their experiences of managing children who are admitted to accident and emergency following overdoses. The ethical committee expressed concern at the impact of these interviews on the state of mind of the nurses, and eventually insisted that an independent counsellor be made available to alleviate distress. Another student wished to send a questionnaire to fellow therapists about the impact on their career of changes in NHS employment practices. This required six months of applications, and permission from dozens of separate NHS trusts. I know of many similar cases.

Why do ethical committees need to be involved when members of staff pose a few questions to other members of staff? After all, if we applied the same procedures to any other walk of life, all collective human activity would cease. One reason may be that all these research projects concerned the care of either children or people with an intellectual disability. Both are included in the ever-expanding group deemed ‘vulnerable’, a term that flashes warning lights to some ethical committees.

I raised these problems with a team sent to carry out a routine quality assurance review of our master’s programme. I was advised that the solution is to discourage students from carrying out research of the kind that requires ethical committee approval. This is of course the Norton Security solution and the ultimate triumph of risk avoidance - the danger of unethical research in healthcare will be completely eliminated by preventing any research from taking place. The cost of doing this is that we continue to treat children and people with an intellectual disability with medications for which there is limited evidence of effectiveness, and that we fail to investigate the reality of the care they receive behind the bland brochures and policy statements. As a result, their real vulnerability to poor quality health and social care is increased.

Wednesday 28 July 2010

The wisdom of economics

Economics rarely gets a good press. This is probably because of the regrettable tendency among economists to pose as modern soothsayers: predicting (usually wrongly) the long-term direction of the stock markets, which countries will go bankrupt, and which economies will prosper. But there is great wisdom in economics, which should be known more widely. Here are three concepts from economics, which, if applied, would make British universities far more effective.

The first concept is marginal cost. This means the additional cost incurred in producing one extra item on a product line. A related concept is marginal income (what economists usually call marginal revenue), or the additional income received from selling this one extra item. As a general rule, therefore, firms should expand production as long as the marginal income they receive for doing so exceeds the marginal cost. This seems obvious, but it is not how many organisations (let alone universities) operate. I have had very frustrating discussions in which decisions on whether to expand student numbers on a course have been based on average cost per student rather than the (usually very low) marginal cost of adding one extra student to a course that is already in operation. To make matters worse, the government penalises financially those universities which do expand student numbers beyond the ‘planned’ targets that have been set centrally.
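To make the arithmetic concrete, here is a minimal sketch in Python (all figures are invented for the illustration):

    # Toy figures for a course that is already running.
    fixed_cost = 100_000    # staff and rooms: incurred whether or not one more student joins
    variable_cost = 300     # extra marking and materials per additional student
    fee_income = 2_000      # income received for one extra student
    students = 50

    average_cost = (fixed_cost + variable_cost * students) / students   # 2300
    marginal_cost = variable_cost                                       # 300

    # A rule based on average cost would refuse the student (2000 < 2300);
    # the marginal rule correctly accepts (2000 > 300).
    if fee_income > marginal_cost:
        print('Admit the extra student: marginal income exceeds marginal cost.')

The average-cost rule turns away income that would more than cover the true cost of the extra place.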

The second concept is comparative advantage. This was originally developed to explain how trade can produce additional wealth if each trading partner specialises in producing those goods which it can produce at the lowest cost compared with the cost of producing the same type of goods elsewhere. Comparative advantage can be adapted to analyse the ‘trading’ of activities within an organisation. Suppose an academic department has two members of staff: Cain is an excellent researcher and a mediocre teacher; Abel is a mediocre researcher and a mediocre teacher. What usually happens is that both are required to do their share of research and teaching. As a result, all the teaching in the department is mediocre, while half the research is excellent. If, however, the university applies the wisdom of economics, then it will specialise its staff. In that case, all the teaching will remain mediocre, but all the research will now be excellent. Of course, it would be even better if Abel had been an excellent teacher. But who knows - the more he specialises, the better he might get.
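The Cain and Abel arithmetic can be sketched in a few lines of Python (the quality scores are invented: 2 for excellent, 1 for mediocre):

    RESEARCH = {'Cain': 2, 'Abel': 1}
    TEACHING = {'Cain': 1, 'Abel': 1}

    def output(time_on_research):
        # time_on_research maps each person to the fraction of time spent on research
        research = sum(RESEARCH[p] * f for p, f in time_on_research.items())
        teaching = sum(TEACHING[p] * (1 - f) for p, f in time_on_research.items())
        return research, teaching

    print(output({'Cain': 0.5, 'Abel': 0.5}))   # shared duties: (1.5, 1.0)
    print(output({'Cain': 1.0, 'Abel': 0.0}))   # specialised:   (2.0, 1.0)

Total teaching quality is unchanged, while total research output strictly improves: that is the gain from trade.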

The third economic concept is satisficing. This involves seeking a solution to a problem that is adequate and costs the least. Satisficing is therefore an alternative to seeking a ‘rational’ solution (which would involve comparing all possible alternatives whatever the cost), and to plumping for the ‘excellent’ solution (which involves selecting the best possible outcome irrespective of cost). Satisficing is what most people probably do most of the time when they come to make decisions about where to live, who to marry, which university to attend and so on. But universities are temples to rationality and excellence: the greatest contributions to knowledge have often come about because academics have taken infinite pains to collect data and have considered radically new ways in which the world and the cosmos can be explained. Such commitment, essential as it may be in scholarship, can have a damaging effect on how academic committees function. Every committee member thinks up ways in which the outcome of a decision could be even more excellent, and deliberates whether there might be options no-one has yet thought of. It is hardly surprising that university leadership becomes exasperated and takes decisions without discussion or just hands things over to the administrators.
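Satisficing is easy to state as a rule. A minimal sketch, with invented house-hunting scores:

    GOOD_ENOUGH = 7
    viewings = [('first house', 6), ('second house', 7),
                ('third house', 9), ('fourth house', 8)]

    def satisfice(options, threshold):
        # Stop at the first option that clears the threshold.
        for name, score in options:
            if score >= threshold:
                return name

    def optimise(options):
        # Examine every option, whatever the search costs.
        return max(options, key=lambda option: option[1])[0]

    print(satisfice(viewings, GOOD_ENOUGH))   # 'second house': adequate, found cheaply
    print(optimise(viewings))                 # 'third house': best, after the full search

The satisficer accepts an adequate house after two viewings; the optimiser must sit through all four to be sure of the best.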

Monday 28 June 2010

The strangest ethnic group

The strangest ethnic group in Britain is the upper class. Most of us in this country spend our lives without meeting an aristocrat, let alone attending an expensive private boarding school, the royal enclosure at Ascot, or any of the various other places that the upper class gather. But every so often our paths cross at a distance, and we have a chance to observe them, and note their outlandish forms of behaviour, dress and speech.

One example from my own life shows how this can happen. In the late 1960s, I was a very poor student in London. Without money, my only entertainment at weekends was to walk. I walked hundreds of miles across London, and at the end of the summer term on a Saturday afternoon one year (I forget which one), I arrived in Knightsbridge. In those days, posh shops closed on Saturday afternoon, and the streets were quiet. I entered a narrow street with a church at one end. Suddenly, the doors of the church opened and a wedding party emerged. The groom and most of the male guests were in the military uniform of officers in the Guards, and they walked, almost marched, arm-in-arm with their wives along the street to a hall at the other end where, I assume, the reception was to take place. Far from the noisy family weddings I was used to, this march took place in complete silence. The only discrepant sight among the military and Georgian buildings was a single scruffy student - myself. This experience, with its discipline, conformism and utter lack of spontaneity, reminds me of the scene at Royal Ascot in the film My Fair Lady.

I was raised in a left-wing working-class family, and so I inherited a suspicion of the upper class, and even a hostility to them. But, looking back, I think these ideas were wrong. There was no reason to believe that the people in the wedding party lacked kindness, consideration or charity towards others. Judging by the performance of the British Army, the men I saw did not lack courage. We should instead regard the upper class as one of many different cultures that may be incomprehensible to each other, but can live together in harmony. Rather than judging people by accent or appearance, we should assess their personal virtue as shown by their deeds to others and their skills, and allow them the occasional opportunity to bewilder the rest of us.

Wednesday 23 June 2010

The status wars in universities

Henry Kissinger (quoting several earlier writers) once remarked that disputes in universities are bitter because the stakes are so small. This is true if, perhaps like Kissinger, you regard money and power as the only stakes worth playing for. But academics are not usually greedy or power-hungry - what they seek is status. More specifically, they seek to have status attributed to them by those who already have status in their area of study. There are three signals of academic status: research publications, research grants, and membership of esteemed academic societies. Status in all three is graduated in an extended hierarchy of infinitely small steps. In each academic field, there is an array of learned journals, ranked mathematically according to how many times the papers they publish are cited by other papers in similar learned journals. This makes it possible to rank the status of individual academics (and the academic departments they work in) according to which journals they publish in, and the ‘citation score’ for their papers. Research grants are of course ranked by size, but also by source. One million pounds from a prestigious peer-reviewed research fund (such as the Medical Research Council or the Wellcome Trust) counts for more in terms of academic status than the same sum from central government or the Lottery. Academic societies are also deliverers of status, in a hierarchy of esteem all the way up to the Royal Society, each with its own rankings of fellows, prizes and other awards.
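
The arithmetic behind these journal rankings is simple enough to sketch. The best-known measure is the two-year ‘impact factor’; the fragment below (a toy Python illustration, with invented figures - the essay itself names no particular formula) shows the calculation.

    def impact_factor(citations, papers):
        # Two-year impact factor: citations received this year to the papers
        # a journal published in the previous two years, divided by the
        # number of papers it published in those two years.
        return citations / papers

    # Invented example: 940 citations in 2010 to a journal's
    # 2008-09 output of 210 papers.
    print(round(impact_factor(940, 210), 2))   # 4.48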

At the very bottom of this academic hierarchy is teaching. This confers no status on any of these three measures, and, because it involves committing time and energy to meet the needs of other people, also fails to satisfy the essential egoism of many academics. In the past, when universities operated on a more collegiate basis, teaching was seen as a burden to be shared among academic staff in each department. This may still be the case in universities which have maintained a sense of collegiate self-government. But most universities have adopted a corporate model, in which individual academics are set targets for research grants and publications, thereby driving ambitious academics to avoid teaching at all costs. This is not, however, possible for all staff. Competition for research funds is severe (some funders support only 10% of grant applications) and many academics reach a point in their career when the money no longer arrives. In this case, they sink down the hierarchy of academic status and take on more and more teaching and managerial tasks. Some rationalise their diminished status by saying they have so much teaching that they have no time for research.

The funding system for higher education in the UK exactly matches and thus confirms this hierarchy. Research grants are allocated to designated ‘principal investigators’, who can take their grants with them if they transfer to another university. Since their grants usually fund a team of researchers and research students, these too usually transfer, like medieval peasants following their liege lord. This places the most successful principal investigators in a strong position with their university. If they object to policy changes or resent being asked to accept part of the ‘burden’ of teaching, they can move to another institution and take staff, money and prestige with them.

The various publicly-funded research councils are a major source of research grants. But a second stream of public funding is allocated to universities rather than individuals, in accordance with the type and quality of research they undertake. ‘Quality’ is assessed in an infrequent series of assessments previously called the Research Assessment Exercise (‘RAE’) but now renamed the Research Excellence Framework (‘REF’). This process involves a series of panels of senior researchers from each academic discipline, who allocate a score on a five-point rating scale for the research carried out in their discipline by each university over the period since the preceding assessment. The scores are awarded, needless to say, on a combination of citation scores for research papers, research grants awarded, and measures of esteem received by staff from their learned societies. The whole process requires a great many panels (67 in the last RAE), and is a massive diversion of funds and precious staff time from research itself. But it is eagerly supported by most research academics because, in a society obsessed by status, the results of the research assessments play the same role that Burke’s Peerage or the Almanach de Gotha played in aristocratic societies.

Funding for teaching, by contrast, is awarded neither to named individuals nor for specific courses. Instead, each university is given a block grant for a designated number of students, with this sum topped up by student fees. The block grant is higher if the student attends a clinical or a laboratory-based course (because these are more expensive to deliver), but the same sum arrives whether the course is excellent or poor, and whether it is taught intensively or neglectfully. As a result, no status or corporate power attaches to staff who specialise in teaching, and few academics could name the leading teachers in their field of study.

There is, however, one real problem with this hierarchy. Just as status-obsessed aristocrats could bankrupt their countries, so research can undermine the finances of the universities that host it. This is because much research does not cover its costs. Funding from research councils meets only 80% of a project’s full economic cost (a project costed at £1 million thus leaves its university to find £200,000), while many charitable bodies cover little more than the direct costs of the research projects they support. Income from the RAE covers some of this deficit, but many universities subsidise research from funds allocated for teaching. The size of this subsidy is likely to increase, because funding from charities and payments for research from central government are likely to decline in response to the economic recession and the consequent poor state of public finances in the UK. Faced by this problem, universities have proposed that student fees be raised while the cost of teaching is reduced. The latter can be achieved by accelerating the existing trends of reducing the number of staff specialising in education, transferring teaching tasks to research students, and cutting back teaching hours.

If this strategy is successful, it will have catastrophic effects for this country. Virtually all professional training now takes place at universities. A poorer quality of higher education would therefore mean less-skilled lawyers, doctors, engineers, economists, librarians and so on. Whether or not students are in professional training, they would be less likely to have their ideas challenged, to learn to assess and analyse evidence, or to develop skills in the laboratory. The strategy of subsidising research at the expense of teaching would thus have the paradoxical outcome of depriving us of the researchers of the future.

See also: http://stuartcumella.blogspot.com/2009/10/great-crackpot-ideas-of-past.html

Wednesday 16 June 2010

The New Era of Good Feelings

Britain has now entered a new era of good feelings. Two of our political parties are in coalition, and outwardly agree on the fundamentals of policy. The third party is in opposition and electing a leader from a range of candidates with a shared lack of any policies at all. There may be disagreements for the sake of form, but all party leaders agree that there should be budgetary cuts (a bit more or a bit less), that Britain needs to mitigate climate change through ‘green’ policies, that schools should have greater autonomy (usually involving the private sector), and that the UK should remain engaged in the endless war in Afghanistan.

The first ‘Era of Good Feelings’ was a period of about eight years in the USA after the end of the War of 1812. One of the two main political parties had collapsed, and the other soon ceased to function. Many of the outstanding issues of the day (particularly the geographical expansion of slavery and the creation of a national bank) which had previously divided politicians were, for a time, resolved. Most leading politicians were either slave owners or tolerated slavery, and shared a commitment to territorial expansion and aggressive dominance of the USA in the Americas. Lack of party competition resulted in falling turnout at elections, and in 1820 President Monroe was re-elected (by the electoral college) with only one dissenting vote.

This lack of organised political conflict did not of course mean that no issues divided Americans, merely that their leading politicians chose not to express them. Once the presidency became vacant in 1824, none of the four main candidates won a majority in the electoral college. The election was then decided in the House of Representatives, and John Quincy Adams was elected as a result of a backstairs deal. Andrew Jackson, who had won the largest number of votes, bitterly attacked this decision as corrupt, and began organising a political party to promote his candidature for the next presidential election. His rivals formed another party, and politics returned to an era of ill feelings as each party sought areas of discontent to exploit for votes.

The new era of good feelings in Britain does not follow a war or the collapse of one of the political parties, but it does correspond with the intellectual collapse of the traditional political parties. There is simply no intellectual content remaining in socialism, liberalism or conservatism. All parties proclaim they are ‘green’, and portray themselves as the more effective managers of the national consumer society. The party leaders (and the would-be party leaders in the Labour Party) are also remarkably similar. It is true that one of these candidates, Diane Abbott, is distinguished from the others on physiological grounds, but she shares with them (and the leaders of the Conservative and Liberal Democrat parties) an education at either Oxford or Cambridge University, and a lifetime career in the media, public relations, and the junior ranks of politics. It is hard to think what any of these potential leaders would do differently in office from the two party leaders who currently control the Government.

It is difficult to predict how long this new era of good feelings will last. It is possible that one of the party leaders will spot an opportunity to speak on behalf of rising discontent. However, insurgent politics is probably more likely. This would arise if a politician from outside the three main parties were able to articulate hostility to all of them effectively. This has happened in the USA with the ‘tea party’ movement and Sarah Palin, whose apparent lack of sophistication, combined with her startling ignorance of the wider world, is taken as a sign of authenticity by her followers. We await her British equivalent with trepidation.

Wednesday 12 May 2010

Murder in the village

The worst-informed people in any society are those who watch the most television. It is of course possible to learn a certain amount about the world by watching the news. But most news programmes only skim the outward appearances, usually without any explanation of why there are riots in one place or starvation in another. In any case, people who watch the most television do not sit gazing at 24-hour news channels - they watch daytime television. Their knowledge of the doings of humankind is thereby based on the personal confessions of dysfunctional families, celebrity gossip, house redecorating and repeats of murder mysteries.

The murder mysteries, at least those shown on British television, are the most misleading of all. These show that the most dangerous places in this Kingdom are Oxford University, country houses, and villages in beautiful countryside. Fortunately, the victims and perpetrators of all these murders are restricted to a small number of members of the local elite, living in splendour, speaking upper class English, and surrounded by horses, Land Rovers and mixed herbaceous borders. The ordinary rural population of England do appear in murder mysteries from time to time, distinguished by their comic ways and all-purpose rural accents.

I have lived in a small country village for over a quarter of a century, arriving almost by accident because there were few houses on the market and one my wife and I could afford became available. There have been no murders and hardly any crime at all. But there has been a great deal of friendship, shared community activity, and effective local leadership. There is also the immense beauty of the West Worcestershire countryside, of hills, fields, and footpaths.

One late summer evening just after we moved into our bungalow, I lay on the small front lawn, looking through the woods opposite to the ancient country church, listening as the village bell-ringers rang the oldest complete set of bells in England. Later I watched the pipistrelles circle above me at dusk. I decided to stay. My wife and I raised two children in our small bungalow. They walked each day to the village playschool, then the primary school and finally to the local high school (also, fortunately, located in the village). They could play in fields and the quiet local streets.

Living in a village had its costs for me: each job was further away, and meant a longer commute. Traffic got heavier and the trains more crowded. But now I work from home, and look from my office window on the same trees and the same ancient village church, and see the seasons come and go.

Friday 30 April 2010

Bad news at breakfast

When you travel across the world, you seem to spend a lot of time lounging around in-between places - airport lounges, hotels, railway stations. When you are there, you watch people, read, and look at a lot of television. It doesn’t matter that you don’t always speak the language - most of the programmes are remarkably understandable, following formats that vary little from country to country. One of these is breakfast television. This almost always includes a middle-aged man and a somewhat younger woman. The man has the reassuring but bland look of the chair of the members’ committee in the local golf club. The woman is always attractive and wears different clothes each day (no-one notices whether the man wears a different suit from day to day). Much of the programme consists of friendly chit-chat between these two, interspersed with reading the news from the Teleprompter. When they report some disaster, war, or human interest story with associated suffering, they both assume serious or even troubled expressions. They often read alternate sentences. When the man reads, the woman will either look at him or make a range of appropriate expressions for the camera. Generally, however, the mood is jolly. Any troubling news is rapidly followed by celebrity gossip, entertainment chit-chat, or some amusing story. What you don’t get much of is news.

When you do get the news, there is almost never any explanation. Why are people in Bangkok wearing red shirts and rioting? Why do the people in Gaza seem so angry? How come so many Southern European countries are in financial difficulties? Do not look to breakfast television for any understanding. What news exists is dominated by pictures. A disaster which kills two people in the USA (with film) is far more ‘newsworthy’ than one which kills a thousand (without film) in Bangladesh. Terminology is slack. Alabama is apparently in ‘Southern America’, while Japan is one of the ‘Western’ nations. The vast diverse continent of Africa is spoken of as an undifferentiated entity. Breakfast television exists in a kind of collective Korsakoff’s Syndrome, with no memory, no awareness of context, and complete lack of insight. The ambition of its presenters is not to overcome these deficiencies by becoming real journalists, but to star in one of the many televised talent shows (Dancing with the Stars, Dancing on Ice etc).

Every so often, real journalists do break into breakfast television, in cases where the talents of the presenters prove insufficient. Then we hear the political correspondents, the economic experts, and so on. They know what they are talking about, seek out stories, and write interesting books in their spare time. However, they lack the good looks of the breakfast television presenters - expertise, it seems, does not always coincide with a pretty face.

Friday 16 April 2010

The Dark Heart of Suburbia

There are places we are supposed to find menacing: neglected churchyards down a lonely country lane on a cold night with the wind howling round a ruined church tower; or derelict urban streets covered in gang slogans and inhabited by diverse menacing locals. But these are all clichés. My view is that true horror is found in the mundane, particularly in the English suburban streets of detached, semi-detached and terraced houses. Horror in Fred West’s terraced house, with bodies stuffed in the cellar, the garden and the wall cavities. Or horror in Dennis Nilsen’s house in London, where the dismembered remains of his many victims were found by Dyno-Rod, called in to unblock the drains.

I confess that I was born and raised in a standard English suburban location (‘community’ seems an inappropriate word). When I was a child, Shirley was a string of shops along the main road from Birmingham to Stratford, plus several streets of semi-detached and detached houses. It had once been an attractive village, and there was still an old blacksmith’s forge and some timber-framed buildings. I was able to walk to the countryside from my home, and the streets were full of children playing. But the old buildings were demolished, and Shirley stretched ever further over what had been the beautiful green fields, woods and hedgerows of Warwickshire. The stream where I used to fish for sticklebacks became a ditch in the middle of a housing estate. I used to walk through a wood and come out the other side in a cornfield. Now the wood remains, but is surrounded by houses. Shirley stretches for miles along the main road, lined with the dreary shed cities of supermarkets, DIY stores, household furnishers, and electrical goods stores. The roads are full of cars (parked or moving), and children sit indoors watching television and playing computer games.

Shirley has not had its mass murderer, but it did have the only political party in England to collapse because its leader set fire to his wife. In July 1997, the Shirley Ratepayers’ Association had three councillors on Solihull Borough Council: Trevor Eames, his wife Ursula, and Brenda Otton. Ursula had been having an affair with a council official, David Parfitt, and photographic evidence of this had been collected for Trevor by Brenda. According to the press, David and his wife met Trevor and Ursula to discuss things over dinner. This does not seem to have resulted in reconciliation, because Trevor subsequently attacked David with a hammer. Ursula eventually decided to leave Trevor, and he responded by throwing a glass of petrol over her and setting it alight. He was sentenced to seven years in prison after causing what the papers uniformly described as ‘horrific injuries’ to his wife. This seems to have been the end of the Shirley Ratepayers’ Association as a political force, but not of Trevor.

He emerged from prison after four and a half years, and resumed his interest in local politics. Soon afterwards he stood (unsuccessfully) for the council, and is now a very active secretary of the Solihull and Meriden Residents Association (SAMRA). In the picture below he is standing next to Nikki Sinclaire, who at that time was a representative of the UK Independence Party (UKIP) in the European Parliament. Ms Sinclaire has now been expelled from UKIP, after refusing to join meetings of the group to which it is affiliated in the European Parliament. Apparently, she discovered that UKIP and its affiliates are ultra right-wing and intolerant of her lesbian sexuality. The latest development is that SAMRA candidates (including Trevor) are contesting every seat in the Solihull Borough elections, while Nikki is standing for the UK Parliament as a SAMRA candidate. The horror continues.

Saturday 10 April 2010

South Pacific politics

My encounter with New Zealand politics came in Paihia. This is a pleasant town in the far North of New Zealand. It has a subtropical climate (Tangiers lies at a comparable latitude in the Northern Hemisphere). There is a small grid of wide streets, surrounded by densely-wooded hills, facing an enclosed bay. Ferries cross frequently over the bay to the fine historic town of Russell. On the third side of the bay are the Waitangi Treaty Grounds, where the treaty between the Maori and the British was signed. The date is now commemorated as New Zealand’s national day (Waitangi Day), which in London becomes an occasion for exiled Kiwis to take part in a Circle Line pub crawl.

My family stayed three nights in Paihia in early March this year, in a penthouse apartment in a splendid motel called (rather strangely) the Swiss Chalet Lodge. One day at lunchtime, I walked the few yards down to the waterfront, and then along the sea front to the shops. I stopped at a small booth selling kebabs and ordered one. I chatted to a genial Kiwi and his mate. I learnt he had family connections with my part of England and had visited it. I asked his name and discovered he was Phil Goff, the Leader of the Opposition in the New Zealand Parliament, former foreign minister, former trade minister etc etc. Phil was touring by bus to campaign against an increase in the local version of VAT. ‘Campaigning’ seemed to be a matter of stopping and chatting to people.

This all seemed appropriate for a profoundly egalitarian country like New Zealand. Phil told me that when he was a minister he visited London and was met at Heathrow by an official chauffeur-driven car. As usual, he sat in the front. The driver was horrified and told him that ministers always sat in the back. That says a lot about life in England: some politicians think they are too superior to their drivers to sit alongside them, and the drivers regard their inferior status as right and proper. If you come across a British politician in our general election, look where he sits in the car.

I remember another observation Phil made. He was surprised that people in different parts of England all spoke with different accents. New Zealand has an accent of its own but few if any local variations. I don’t believe this situation will last. The country has a bigger area than the UK but only four million inhabitants. About a quarter live in Auckland, but the rest are scattered over small cities, towns and villages separated by hills and mountains (and linked by narrow winding roads). There are strong local loyalties: several people in the South Island told me how much more beautiful it is than the North Island, while people I met in Coromandel were keen to prevent people from Auckland building holiday homes in the town. Local loyalties mean people try to distinguish themselves from outsiders, and speech is one way to do it. Over the next century, Kiwi ingenuity will create a range of new accents, possibly with even more unexpected vowels than the ones they use at present.

Monday 5 April 2010

Working in the machine

The best parody of industrial work is by Charlie Chaplin in Modern Times. He shows men working to a pace set by machines, and how the most ordinary human activities (scratching your armpits, swatting a fly which lands on your face) disrupt production. The factory has a time card for going to the toilet, and a big cinema screen inside the toilet so that the manager can threaten workers having a quiet smoke. The workers in the film are, as far as the management is concerned, a regrettable necessity - they are only valuable in as much as they themselves become machines. Humanity is an obstacle to efficiency.

This is a familiar picture of factory life, raised to a peak of supposed efficiency in firms which applied the principles of ‘scientific management’ developed by Frederick Taylor in the early part of the last century. Scientific management involved the careful measurement and evaluation of each human activity in the production process, which could then be defined precisely, so that workers could be set production targets and rewarded according to their degree of attainment (called ‘piecework’ in the UK). Scientific management essentially views an organisation as a machine with human components. It is hardly surprising that it became the standard way of organising repetitive industrial processes (most typically automobile manufacture) and was enthusiastically supported by Lenin and the Communist Party of the Soviet Union.

The shortcomings of scientific management have been recognised for many years. Workers rarely look with enthusiasm on any organisation which treats them as interchangeable and disposable components. Unless very closely supervised, they will take their revenge on the organisation by strikes or sabotage, or cut corners to meet targets at the expense of the quality of output. Even if workers do thoroughly internalise their job description, problems occur. The job description becomes the limit of their responsibility: workers no longer cover for each other or collaborate in solving problems, and they follow the rules of the organisation even to its detriment. When workers in the 1960s wished to bring an organisation to its knees without going on strike, they ‘worked to rule’.

The alternative to seeing an organisation as a machine is to see it as a society of varied individuals with diverse skills, who interact with each other to complete tasks and solve problems. They can be motivated to perform well and creatively not just by money but also by their loyalty to their colleagues, their sympathy with the aims of the organisation, their commitment to their customers, and their personal standards and self-respect. This view of the organisation was a key element in the success of the Toyota Production System, described by James Womack and colleagues in their book The Machine that Changed the World.

You know that you work in an organisation that is a society when its workers celebrate each other’s birthdays, marriages, promotions and departures. There will also be a strong organisational culture, with a collective wisdom about what actions work and don’t work. At their best, organisations as societies can provide a strong sense of identity and meaning for people’s lives. There can, however, be problems. Organisations with strong social bonds can become inward-looking and exclusive: the organisation comes to exist only for its workforce rather than to meet the needs of its customers, students, clients or patients.

Expressions of concern about this displacement of goals have been used to justify the ‘modernisation’ of public services over the last 20 years. It has been proposed that governments need to re-assert strategic direction and the maintenance of quality by setting up a series of public and private agencies to take over many of the activities formerly provided by employees of national and local government departments. Output and quality targets are set for each agency by central government (or one of its nominated QUANGOs), while quality of services is monitored by a further set of agencies. It is proposed thereby (in the words of Osborne and Gaebler in their book Reinventing Government) to separate ‘rowing’ from ‘steering’: leaders of individual agencies can use their skills to motivate their staff within a set of strategic objectives set by government.

This model has been extended throughout public services, even to organisations like universities which, in the UK at least, were never previously managed by national or local government. The effect has been catastrophic. Setting enforceable targets for agencies drives their managers to cascade these targets throughout the organisation, even to the extent of setting targets for individual members of staff. As the organisation as a whole becomes ‘mechanised’, co-operation between staff diminishes, which leads managers to promote elaborate job descriptions and procedures manuals. These become quasi-legal documents within the organisation, and thus eliminate opportunities for creativity. Staff become demoralised, and the loss of quality familiar in traditional industrial work spreads to the professions and to public services.

Of course, the mechanisation of public services can be seen as part of a wider pattern in which the whole of society becomes regulated and monitored. We may not have television screens in our workplace toilets yet, but we have CCTV cameras almost everywhere else.

Sunday 28 March 2010

The road of ageing

An earlier posting noted that people who lose in ‘reality’ talent competitions on television usually describe their experience as a ‘journey’. This is one of many clichés. Some people claim to be ‘born again’, others to have reached a ‘turning point in their life’. At times we have a sense of opportunities not taken and difficult tasks preferred, as in Robert Frost’s wonderful poem:

Two roads diverged in a yellow wood,
And sorry I could not travel both
And be one traveller, long I stood
And looked down one as far as I could
To where it bent in the undergrowth;

Then took the other, as just as fair,
And having perhaps the better claim,
Because it was grassy and wanted wear;
Though as for that the passing there
Had worn them really about the same,

And both that morning equally lay
In leaves no step had trodden black.
Oh, I kept the first for another day!
Yet knowing how way leads on to way,
I doubted if I should ever come back.

I shall be telling this with a sigh
Somewhere ages and ages hence:
Two roads diverged in a wood, and I-
I took the one less travelled by,
And that has made all the difference.

Most of the time, however, we do not experience choice of this kind: we just see signs on the road that tell us that for some time our lives have been heading in an unexpected direction. This is true of the experience of ageing. My first such sign came in 2001 on the Great Barrier Reef near Port Douglas in Far North Queensland. The family had split up for the day - my wife went on a trip in a glass-bottomed cruiser and I took my two children (then aged 15 and 12) on an escorted snorkelling expedition. I made the usual assumption of fathers that my job was to protect my children while encouraging them to explore. But they swept ahead after the group leader, plunging down to follow a large turtle along the Reef. I was unable to keep up, floundered, and surfaced. I saw the boat a few yards away, and wondered if I would reach it. My children, I realised, needed to look after me.

Now I have passed another sign of ageing - retirement. On Friday, I went to a party at the University organised and paid for by my most generous colleague Dr Qulsom Fazil. I am also grateful to all the people who came and wished me well, and for a generous gift. Anyway, this is voluntary early retirement. I intend to carry on teaching intellectual disability, preparing distance learning texts, and writing about social policy. But I will also have more time to write to prisoners of conscience, to improve my language skills, and to walk through the hills and fields of Worcestershire.

See also: http://stuartcumella.blogspot.com/2009/04/cliche-rears-its-ugly-head.html

Sunday 21 March 2010

Memories of the British Empire

The main road running East-West through central Vancouver is called ‘Broadway’. When you drive along it, the names of the streets which cut across it are hung above the road. Driving East after Alma Street, you cross over Collingwood Street and Waterloo Street, Blenheim Street and Balaclava Street, and then Trafalgar Street. Names of the battles and heroes of the British Empire in the 19th Century are found all over Canada, Australia and New Zealand, together of course with streets, parks, gardens, cities and towns and one Australian state named after Queen Victoria herself. They are part of the cultural heritage of Britain, as much as the English language, the rule of law, representative government, fish and chips, and the kilt and the bagpipes.

As British people settled in different lands, they adapted this heritage to take account of the native people and settlers from other countries, the strange new landscape they found themselves in, and their very distance from the mother country. They devised new ways of building, farming, and governing themselves, and also new ways of speaking English. The most extraordinary adaptation to the lives of the native peoples took place in Canada, where British and French settlers learnt from the First Nations how to survive, travel and trap fur in the vast boreal North of the country. Adaptation to the new landscape came more slowly. For several generations, British settlers tried to make their new land resemble the old - importing plants, animals and pests, and creating gardens of rose bushes and herbaceous borders. Eventually, their descendants came to love and feel at home in the bush, and to venerate indigenous trees and animals as national symbols. Their farmers adapted to the opportunities of a warmer climate, and in New Zealand and Australia developed some of the best wines in the world.

To a visitor like myself, the most striking signs of innovation in Canada, Australia and New Zealand are the towns and the buildings. In all three countries, the smallest communities have very wide streets, often set out in a grid. Many have wooden buildings with elaborate facades, while, particularly in New Zealand, shops have ‘verandas’ (canopies over the street to protect pedestrians from rain and sun). These superb vernacular building styles are shown below, from Trafalgar Street in Nelson, New Zealand.

For comparison, there is a photograph, below, from the beautiful town of Nelson, British Columbia.

All of these countries settled from the British Isles have subsequently received people from many different lands. But the continuity with Britain remains strong. In 2003, my family visited Eastern Canada and one day came to Fredericton, the capital of the Province of New Brunswick. Fredericton was named after one of the sons of King George III, and was first settled by loyalists moving North after the American Rebellion. It is a small city, noted for its educational institutions and its support of the arts. There is a fine cathedral in English Gothic style facing a provincial parliament building, which, like most parliaments in the former British Empire, is Gothic rather than Grecian or Roman in appearance. This is an outward display of the political theory that freedom is based on the continuity of the law rather than abstract reason. I asked to visit the provincial parliament building, and was surprised to find the Provincial Assembly in session. Speeches were bilingual, with each member switching easily between English and French. As I left, an august lady swept from her car to enter the building. She was the lieutenant governor, arriving to sign into law the bills that had just been voted by the assembly. I felt pleased and proud that such a civilised way of life, with its origins in the best British traditions, was maintained so well amidst a vast and beautiful wilderness of forests and mountains.

Wednesday 17 March 2010

Modest proposals for reducing poverty in the UK

Now that an election is near in Britain, our political parties have once again discovered that many of our people live in poverty. Poor people may lack the resources to bankroll political parties and are unlikely to be found at the dinner tables of senior politicians, but there are an awful lot of them, and they have votes. About a fifth of the population of the UK has an annual income of less than 60% of the median. For a single person, ‘poverty’ is therefore defined as a weekly income of less than £115 after tax and housing costs are met - that is, 60% of a median of roughly £192 a week (see http://www.poverty.org.uk/summary/key%20facts.shtml). Reducing poverty would mean increasing wage rates for those on or near the minimum wage, and raising welfare benefits (particularly for families with young children). This is expensive, and would require actions unwelcome to governments, such as increasing tax on the very wealthy or diverting the billions of pounds of public money spent on the privatisation-management consultancy-IT complex.

So here are my own ideas for what government can do:
1. Declare all low wage earners to be non-domiciled. Most people living in poverty who are of working age live in households in which one or more people are in work. Yet they still pay income tax and national insurance on their miserably low earnings. Very wealthy people such as Lord Ashcroft and Zac Goldsmith have avoided this inconvenience by declaring themselves ‘non-domiciled’ in the UK, even though the UK is where most of their income is derived. I would extend this privilege to all people on low incomes, who can be given notional residence in the Cayman Islands, Jersey, or other locations for tax refugees which happen conveniently to come under the British Crown.

2. Declare all poor people who live in houses with gardens to be farmers. Farm subsidies are always presented as a means of supporting low-income farmers. In fact, most of the cash goes to big land-owning corporations and the wealthiest farmers (the Duke of Westminster, the richest man in England, receives £300,000/year). Landowners also get a subsidy not to grow anything at all (called ‘set-aside’). My plan would be to extend these EU subsidies to all low-income people with gardens. They would receive guaranteed minimum prices for the potatoes, runner beans etc they grow in their back gardens and allotments, or set-aside payments if they grow grass, flowers or concrete. This would all come from EU funds, and so would have limited impact on British government expenditure.

3. Set up local versions of the House of Lords in each area of deprivation. Members of the House of Lords currently get £80/day attendance allowance on a SISO basis (‘SISO’ means sign in - sod off). There is also an overnight allowance of £160 and generous allowances for travel and other costs supposedly associated with having a title. In my plan, membership of each local house of lords would be open to people on low incomes who are not eligible for the first two payments I have proposed (i.e. those who are not receiving a wage and do not have a garden). Getting poor people to advise on poverty would also be a pleasant change from paying large sums to management consultants and academics to undertake this task.

So we can see that Britain has one of the best welfare states in the world for wealthy people. All we need to do now is to extend it to the poor.