Tuesday 29 December 2009

Have yourself a merry/melancholy/some other emotion Christmas

Christmas looms so large in our minds that it distorts our sense of time. Rather than a single day of feasting and entertaining, Christmas appears as a prolonged season. Women (who in most households are responsible for provisioning, entertaining etc) stock up as if preparing for the siege of Malta. Siege-panic is also driven by the many examples in the media of well-stocked Christmas tables, surrounded by contented families enjoying the most expensive gifts. However, the urge to eat well and buy gifts is driven by more than imitative consumerism. Parents give gifts to bring happiness to their children, and, by seeing the pleasure these gifts evoke, to themselves. The desire for your children to be happy is a recognition that their lives will not always be so. But seeing their happiness pushes from our minds the pain we feel when we think of those we have loved (still love) who are no longer round the table, telling their usual jokes, laughing in the way we remember. There is another source of melancholy, derived from the passage of time: we remember previous Christmases, when we were younger and had the hopes and expectations of life we now see in our children.

This mixture of emotions explains the persistence of Christmas songs that express regret and disillusion, like The Pogues’ Fairytale of New York, and Greg Lake’s I Believe in Father Christmas. Another popular song, Have Yourself a Merry Little Christmas from the film Meet Me in St Louis, differs in being melancholy because the singer looks ahead pessimistically to future Christmases. This made the lyrics unacceptable for those who dislike emotional complexity, and new, more upbeat, ones were devised for Frank Sinatra to sing. This re-writing follows another pattern of Christmas - the hope that over-eating, alcohol and jolly songs will swamp any emotions we find painful. Spending becomes a way in which we can avoid understanding ourselves.

Thursday 17 December 2009

The Privatisation-IT-Consultancy Complex

In a speech at the time he left office, President Eisenhower warned the USA about the power of the military-industrial complex. By this, he meant that US foreign and defence policy was increasingly driven by the needs of an informal coalition between the military and defence contractors. These groups came together because both benefited from an expansion of military expenditure, and were able to generate political support by exaggerating the external threats facing the USA. Since in Eisenhower’s time the main challengers to US dominance were communist states, this meant depicting any foreign leaders not aligned to the USA and any uprisings against foreign regimes as part of a single world ‘communist’ conspiracy. US foreign policy became driven by a kind of collective paranoia, with the USA involved in military interventions and murderous wars in an attempt to fight the incoming tide of this supposed conspiracy. Communism has declined, but the paranoia remains. Imaginary wars on drugs and ‘terror’ have involved real wars against the people of Iraq, Afghanistan and the frontier regions of Pakistan.

New technology is promoted with particular enthusiasm by the military-industrial complex, ostensibly because it offers the possibility of gaining an advantage over prospective enemies. But new technology is usually less reliable and more expensive than the equipment it replaces. The more unreliable the technology, the longer the contracts, which offers the prospect of greater profits, since contracts for military equipment are awarded on a cost-plus basis. And equipment failure and cost over-runs can always be explained to the public as an inevitable risk required for the effective defence of the state.
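The cost-plus incentive is easily illustrated with a toy calculation. Here is a minimal sketch in Python (the figures are invented for illustration; nothing here comes from real contract data):

```python
# A toy illustration (invented figures) of the cost-plus incentive:
# the contractor's profit is a fixed percentage of whatever the
# project finally costs, so overruns increase the absolute profit.

def cost_plus_profit(final_cost_m: float, margin: float = 0.10) -> float:
    """Contractor profit in £m under a cost-plus contract with a 10% margin."""
    return final_cost_m * margin

for final_cost_m in (100, 150, 300):  # on budget, overrun, badly overrun
    profit = cost_plus_profit(final_cost_m)
    print(f"final cost £{final_cost_m}m -> contractor profit £{profit:.0f}m")
```

A fixed-price contract would invert this incentive, which is one reason contractors prefer cost-plus terms for unproven technology.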

In British central government, domestic expenditure is far more important than spending on the military. Our own equivalent of the military-industrial complex is therefore built around the vast sums spent on health and social care, public order, and transport. Public funding for these areas can be seen as a treacle pot from which members of the complex can feed, and there are three ways in which they can do so.

The first is through the privatisation of what were previously regarded as government services. This has expanded vastly since a neo-Thatcherite government came to power in 1997, and has been rationalised as introducing competition and the efficiencies of the private sector into public services. A key component of privatisation is the Private Finance Initiative (PFI), which involves public authorities paying contractors high fees to rent from them newly-built hospitals, schools and other public buildings (usually with additional payments for servicing the buildings). Besides PFI, there are contracts for transport, prisons, nursing and residential care, cleaning services in public buildings, employment services and so on. Needless to say, the finance raised to pay for PFI and related schemes has involved large transfers of public funds to the banking sector.

The second way is through consultancy fees. These have expanded with the introduction of contractual relationships throughout government, usually following privatisation. One or more of the big four professional services firms (PwC, KPMG, Ernst & Young, and Deloitte Touche Tohmatsu) appear in every such transaction, and are even brought in by government to audit each other’s work, and report on each other’s failures. They have managed to sell a spurious technology of management expertise to politicians and the public, such that no major governmental decision can be made without their engagement.

The third way is through major IT projects. These have included the world’s most expensive database (the NHS Programme for IT), which will probably have a final cost of £20 billion, plus such projects as the database for identity cards (£12 to 18 billion), ContactPoint (the database recording all children, costing £1 billion), the Libra court management system (£341 million), and the C-NOMIS offender management system (£279 million). All of these systems have had spectacular cost overruns, and have failed to deliver the benefits they initially promised. Very large parts of the NHS Programme for IT are hardly even used.

The complex can therefore be referred to as the ‘Privatisation-IT-Consultancy Complex’, or ‘PIC Complex’ for short. As with the military-industrial complex in the USA, the PIC Complex in Britain shapes government policy in its own favour. There is no public demand for identity cards, and security experts see them as having a negligible impact on public safety. The phenomenal cost of the scheme is seen as an argument against it. But the very cost of the scheme is the reason it exists - it will produce an endless stream of public funds from the treacle pot for IT companies, professional services firms, and the financial sector. Few if any experts in child care asked for a child database, but we have paid for one which will almost certainly not work, and will divert resources from services which do protect children. The diversion of resources from services which deal with people into IT schemes which enrich the PIC Complex has already happened with C-NOMIS.

How can this be allowed to happen? One reason is that the concentration of power in English government gives great leverage to the PIC Complex. Its members only need to convince a small number of senior politicians and civil servants to secure multi-million pound contracts. This saves all the bother of selling products to lots of local councils, hospitals, schools etc, let alone having to convince the public who ultimately pay for them. The relationship is symbiotic - privatisation and large centralised databases weaken rival local centres of power and hence maintain this leverage. Convincing small numbers of senior politicians and civil servants has proved easy because some are for sale. This is not traditional corruption, in which a contractor pays money and then gets what they want, but a new time-displaced version in which the politician favours a contractor and is subsequently rewarded with a seat on the board, consultancies etc. I would not of course allege such a thing about Patricia Hewitt MP, the former Secretary of State for Health, who is currently earning £12,500/month from BT Group (a major contractor for the NHS Programme for IT), and £4,600/month from Alliance Boots (a major pharmaceutical distributor to the NHS and a contractor for privatised primary care services). October was a good month, when she also earned £15,000 from Cinven, which owns a chain of private hospitals that contract with the NHS.

Academics are also subsidiary players in the PIC Complex, providing essential legitimation services, using suitably high-minded and abstract terms like ‘contestability’ and ‘choice’. But more of that in a later posting.

Thursday 3 December 2009

Learning our incompetences

All men at an early age believe they are talented at football. Some (like myself) realise their lack of talent in the first years of primary school. Others persist in years of hope until the final disappointing realisation that theirs will not after all be a life of Premiership fame, Ferraris, and beautiful but expensive girlfriends.

Learning our incompetences helps us avoid wasting time on activities where there are more talented rivals, and enables us to concentrate on what we can do effectively. However, there are some areas of life in which this learning usually fails to take place. Most people, against all the evidence, continue to believe they are ‘above average’ drivers and lovers. Many people also believe, despite lack of training or any previous evidence of competence, that they can run a pub or a restaurant.

The results of this delusion can be found in every town. Here is an example. My wife is a member of a group of women (which she calls the ‘ladies’) who have worked together in the past, and who celebrate each other’s birthdays by meeting for dinner. The restaurant is usually one chosen by the lady with the birthday. The most recent meal was at a new Indian restaurant in Worcestershire. The ladies met there at 8pm. The decor was pleasant, and there were poppadums and chutneys awaiting them at the table. But no waiter came to ask for their orders for drinks, and there was a long delay before any waiter came at all. When one did eventually appear, the ladies decided to skip a starter and ordered main courses. Nothing happened for a very long time. The group at the next table concluded that they too had waited far too long, and decided to leave. The waiter pursued them into the street and told them their food was ready. The ladies’ food then came. It was lukewarm, indicating that it had been cooked some time ago and then forgotten.

No waiter came to ask them if they wanted to order a dessert, so the ladies decided to move directly to coffee. A coffee pot arrived, but no cups. The ladies found some in the restaurant and served themselves. They then seized a waiter, who produced three bills. These included one for £27 worth of lager, which they certainly had not drunk. There followed a process of negotiation to reduce the bills. The toilets were in the basement, but on reaching them, the ladies found they were closed, and were redirected to the disabled toilet at the top of the building (!) By the time they were due to leave it was 11.30pm, and the front door of the restaurant was locked.

The sad thing is that the ladies could have gone to the Spice Cuisine in St John’s in Worcester and paid rather less for an excellent meal, cooked rapidly and served efficiently. But then, the people who run and work in the Spice Cuisine have discovered an area of life in which they are supremely competent. The staff at the other restaurant could try being footballers.

Wednesday 25 November 2009

The Hadamar Clinic and humanity

In the summer of 1941, the staff of the Hadamar Psychiatric Clinic in Germany held a party with beer and wine to celebrate the 10,000th patient they had murdered. Initially, the doctors and nurses of the Clinic killed their patients by lethal injections. However, this proved far too slow a process, and so they devised the more cost-effective system of using carbon monoxide in fake shower rooms to kill large numbers at once. The people they murdered were a diverse group of disabled and mentally-ill people, but most had an intellectual disability (called ‘learning disability’ in the UK) and were thus deemed a threat to the efficient survival of the German ‘race’.

The policy of mass murder of people with an intellectual disability by the Nazi regime was an extension of a widely-approved public policy of eugenics. In the first part of the 20th Century, eugenics attracted support from left and right, and from leading intellectuals. Indeed, it was the more socially-progressive societies like Sweden and Canada which at that time most ardently promoted the eugenic policies of the compulsory sterilisation of people with an intellectual disability, and their incarceration in mental handicap hospitals. Eugenics became discredited by its association with the Nazis, and public policy in almost all countries today favours the integration of people with an intellectual disability into the day-to-day life of society. The old hospitals have closed, the rights of disabled people are increasingly protected, and strenuous efforts are made to improve their employment opportunities.

It is tempting to see these changes as marks of ‘progress’, in the sense of an incremental improvement of civilised values from barbarism to humanitarianism. But this would be historically inaccurate. The harsh policies associated with the eugenics movement replaced those influenced by the more humane ideas of the 19th Century, which had emphasised the potential of all people with an intellectual disability for learning and social improvement. Rather than a march of progress, our recent history has been a struggle between two views of humans and their worth. Eugenics represented the idea, popularised in the Enlightenment, that human beings are distinguished from other species by their rationality. Rationality then becomes the measure by which people can be ranked, but also a means of determining the general arrangements of society. This legitimises ‘social engineering’, or the application by those in power of measures to shape the lives of those deemed less rational than themselves. A fear in the early 20th Century that the less rational sections of the population were increasing in numbers compared to the more rational created support for eugenic ideas, and led to the ultimate Nazi social engineering project: the Hadamar Clinic and the mass industrial-scale killings that followed.

There is another and older view of humankind: that each of us possesses an essence or soul that gives us our human character and by which we can be judged. The quality of our souls is unrelated to our physical strength, our intelligence, or our rank in society: a person with an intellectual disability can have a soul more worthy than that of a scientist who sneers at him. This is of course a fundamentally religious outlook, which should make us very troubled by attempts to drive out religion in the name of science. What, therefore, should be the duty of the scientist and the intellectual according to this older view of mankind? It should be to find out the truth and tell it to others, to maintain knowledge in society from one generation to another, to help people reflect deeply on their values and choices, and to do all this with humility.

Tuesday 17 November 2009

The curse of the generic

I became an (unqualified) social worker in the early 1970s, just after the creation of social services departments in England. These merged three former organisations, each of which comprised professionals skilled and experienced in work with distinct groups of people: children’s departments employed children’s officers who dealt with child protection and adoption and fostering; welfare departments employed welfare officers who maintained long-term contact with disabled people; and mental health sections of local health departments had their mental welfare officers who supported people with a mental illness living outside hospital.

The decision to merge these departments followed the ‘Seebohm Report’, which correctly noted that some families were involved with two or three of these agencies, and incorrectly proposed that it would be more efficient to have a single generic ‘family’ service. The resulting merged social services departments were large and had management hierarchies rather than being led by a senior professional. The commitment to ‘generic’ social work, in which each member of staff dealt with the full range of clients, became departmental orthodoxy. Both these trends led to a rapid exit of the most skilled and senior staff. They were replaced during my first year as a social worker by people like me: well-meaning, untrained and incompetent.

The results across the country were a radical decline in the quality of child protection, and of support for disabled and mentally-ill people. The first indicator of this was the avoidable death of the child Maria Colwell. The subsequent official enquiry identified that a major cause of institutional failure was confusion among social workers about whether their primary responsibility was to the child or to the ‘family’ (ie her parents). This was the first of many such enquiries, which led to a succession of management ‘solutions’: inter-agency committees, registers of children at risk, centrally-imposed targets, inspections, child databases, and repeated re-organisations. No-one in power paid much attention to enhancing the professional skills of social workers involved with children, enabling them to develop specialist skills, or setting up the kind of small specialised and professionally-run departments that had been a success in the past. When specialism did arrive, it was implemented as part of a bizarre governmental reform which merged local authority child protection services with local education departments.

Why this resistance to specialism? I think it is a product of the managerial control that arises with the creation of large public organisations. In small organisations, staff are known as individuals, and there is an awareness of their different strengths. Staff can be assigned to different work informally, and their supervisors can generally assess their performance by observation and informal meetings. Large organisations see staff as a block of people to be matched with some quantitative indicator of workload. It is easier to move people around if they are supposed to have generic responsibilities rather than diverse specialist skills.

The drive to generic work and consequent de-professionalisation arises in many large public organisations. This can occur even in organisations in which specialist professional skills are regarded by almost everyone as being essential. The National Health Service has attempted to grade all its diverse professions on a single ‘knowledge and skills framework’ - a spectacular example of the kind of ‘blue skies’ (crackpot) thinking that occurs in very large organisations. At the same time, the government has attempted to reduce the time spent in specialist medical training. There are similar trends in universities. These value their most highly skilled staff, at least as long as they attract large research grants, but post-doctoral researchers and academic staff who specialise in teaching are sometimes treated as classes of helots, interchangeable and disposable.

The curse of the generic partly explains an apparent paradox: the larger public organisations get and the more managers they employ, the less competent they become at delivering effective public services. There are other explanations for this paradox: the conversion of previously-autonomous professionals into highly-regulated functionaries produces the alienation familiar in industrial process work. Also, long management hierarchies move decision-making further from the organisations’ customers, who usually encounter junior members of staff with limited authority to adapt procedures to meet individual needs.

We therefore need revolutionary change - towards small-scale public services, with a re-assertion of professional specialism and autonomy. We need to down-size schools with thousands of pupils, so-called ‘local’ authorities which cover wide areas of the country and multiple formerly self-governing towns, and large welfare departments which fail to protect children at risk or adequately support the disabled. Of course, some public agencies will always need to be large: big cities need governments, and the large numbers of students in higher education will probably require large universities. However, authority can be devolved within cities to community councils (as in Scotland and Wales), while universities can operate more on the Oxbridge model, with academic staff working in semi-autonomous colleges. After all, Oxford and Cambridge Universities have hardly been failures, despite lacking the benefits of centralised management.

Thursday 5 November 2009

Disneyfying the body

People have probably always regarded animals as being versions of themselves, albeit devoid of everyday speech and with some enhanced senses (smell, vision) or physical abilities (strength, speed). People develop deeper relationships with animals, relying on them as workmates, regarding them as personal friends: their grief at the loss of a favoured pet matches what they would experience at the loss of a child or sibling.

This tendency seems to be an extension of our innate ability to empathise with each other - an ability said to be lacking among people with autism. It is one step from understanding animals as if they are human to telling stories of them as humans, speaking and wearing clothes. I call this process ‘Disneyfication’ after Walt Disney, who set up theme parks full of humans pretending to be animals, dressed to resemble cartoon characters of animals resembling humans.

Disneyfication does not stop with animals. Genes can be Disneyfied too: Richard Dawkins has sold a lot of books called ‘The Selfish Gene’. I went to a presentation this week on techniques for regenerating cells in the Central Nervous System. The researcher spoke unselfconsciously about neural cells ‘choosing’ between options, and ‘preferring’ one binding site to another. I am sure a Disney or Pixar cartoon of neural cells, dressed as people, falling in love, and plotting with each other to rebuild a brain will arrive soon at local cinemas.

Tuesday 3 November 2009

Surviving school

My parents were very ill-advised about my secondary education. I was raised in Shirley, a suburb just outside the Birmingham City boundary. At the time I was in primary school, Shirley fell within Warwickshire Education Authority, regarded as one of the worst in the country. Children from Shirley who passed the ‘11 plus’ exam had the option of going to grammar schools in Birmingham, and many took advantage of this. However, by the time I was eleven, Shirley came within the new Solihull Education Authority, which set up its own grammar schools. The whole of my ‘A stream’ class in primary school went to the new Tudor Grange Grammar School, while my parents sent me, alone, to King Edward’s Camp Hill Grammar School in Birmingham. At 13, I passed to King Edward’s School, which at that time was a direct grant school, with a combination of the fee-paying children of well-off parents and scholarship boys from ordinary families. So all my social connections and friendships were disrupted twice in two years, and I moved to a school where I knew no-one and with an ethos I found incomprehensible. To make matters worse, I needed to take three buses each way to get there, with a commute of over an hour.

The ethos of the school was incomprehensible because it involved a strange nostalgia for an imagined mediaeval England. The headmaster renamed his post ‘chief master’ to revive some supposed tradition, and the school was organised into ‘houses’ to mimic private schools. The school’s sports teams only ever played private schools such as Bromsgrove School, Malvern College and the like, and never the oiks of such schools as King Edward’s Camp Hill. Around the school were boards with the names of old boys who had gone to various Oxbridge colleges. London University was given a board on its own, and there was one for ‘other universities’. Classics was the most esteemed academic subject, and there was a sneering approach to anything practical. I completed a science project on computers (then in their very early days), and the teacher dismissed it for being too technological. There were endless petty distinctions and grades among the students, all to create a world of dull conforming hierarchy. So a school in a metal-bashing city famous for its inventiveness and outspokenness produced students equipped to thrive in a synthetic mediaeval nostalgia.

I left the school with relief at the age of 18 and went to Handsworth Technical College, where I completed three social science A levels in a year. My life opened out: I enjoyed the chance to learn as an adult, study things I enjoyed, and meet people from different cultures. I then went to the London School of Economics, and studied at a time when English universities were at their peak: full of new ideas about education and the world, taking students from a wider social background than before, but not yet swamped by the vast numbers who come to university now. In those days, undergraduates were tutored by professors and the leading academics of the day: mine included Alexander Irvine (later Lord Chancellor); Edward Mishan (the first economist to challenge the worship of growth); and an American lawyer, William Letwin (father of the Conservative politician).

Now my life has come full circle. I work in a university about a quarter of a mile from King Edward’s School, and face a commute of an hour to reach work from home. The University is thankfully more aware than King Edward’s School of the need to be at the forefront of knowledge and to engage with its City. But I miss the intellectual challenge I encountered at the London School of Economics, and the hope for a better future which inspired me when I was a younger man. I was unhappy with the synthetic mediaeval nostalgia of King Edward's School, but I can now see that, for its teachers, it was an attempt to maintain the continuity of the human spirit after the terrible years of war they had experienced. Their beliefs, however old-fashioned, were in any case superior to the morality of money and power that eventually triumphed in England.

Monday 26 October 2009

The rule of anarchy

The most stupendous revolutionary poem ever written in English is Shelley’s Mask of Anarchy (www.artofeurope.com/shelley/she5.htm). Shelley wrote it after hearing of the Peterloo Massacre. The poem describes a vision of an evil masque in which a parade of ghouls (murder, fraud, hypocrisy and anarchy) appear wearing masks resembling the political leaders of the day. The most terrible ghoul is anarchy, who resembles King George IV.

Last came Anarchy: he rode
On a white horse, splashed with blood;
He was pale even to the lips,
Like Death in the Apocalypse.

And he wore a kingly crown;
And in his grasp a sceptre shone;
On his brow this mark I saw -
'I AM GOD, AND KING, AND LAW!'

With a pace stately and fast,
Over English land he passed,
Trampling to a mire of blood
The adoring multitude.

Shelley’s association of anarchy with the supreme authority and power of the state differs from most popular uses of the word, in which ‘anarchy’ is used to designate a disorganised powerless multitude in which private appetites are fulfilled by force. Shelley’s view (and mine) is that most people, left to themselves, will organise their lives peacefully. The main causes of violence and hatred are the abuse of power and the love of power respectively. These are more likely to emerge from those who have already accumulated power, seek more, and fear its loss.

The control of the powerful has therefore become the major political and legal enterprise in any society. This takes the form of ensuring that decision-making is shared with representatives of those affected by the decisions, that proper deliberative procedures are followed, and that decisions are implemented on the basis of clear rules rather than arbitrary favours to cronies and courtiers. Needless to say, societies which operate in this way tend to make better decisions which command greater consent. They also tend to be more prosperous: people will work to accumulate money if they can be assured it will not be seized from them by some arbitrary decision.

These principles are well-understood when people think of societies and governments, but not when they think of big organisations. Corporations, banks and universities can all experience the rule of anarchy. Rules may exist in the organisation, but they are by-passed by the powerful. Off-the-cuff decisions replace due process and the thorough discussion of options. Arbitrary favours are handed out to cronies and lovers. The central aim of the organisation becomes the accumulation of prestige, wealth and power for the small ruling junta. Such organisations, like some banks recently and many commercial organisations in the past, make increasingly grandiose and inept decisions, and crash to disaster. Commentators are then amazed that such apparently powerful and dominating figures - like Fred Goodwin (formerly of the Royal Bank of Scotland) or Kenneth Lay (formerly of Enron) - should fall so suddenly. To appreciate their achievements, we need to return to Shelley:

I met a traveller from an antique land
Who said: "Two vast and trunkless legs of stone
Stand in the desert. Near them on the sand,
Half sunk, a shattered visage lies, whose frown
And wrinkled lip and sneer of cold command
Tell that its sculptor well those passions read
Which yet survive, stamped on these lifeless things,
The hand that mocked them and the heart that fed.
And on the pedestal these words appear:
'My name is Ozymandias, King of Kings:
Look on my works, ye mighty, and despair!'
Nothing beside remains. Round the decay
Of that colossal wreck, boundless and bare,
The lone and level sands stretch far away."

Tuesday 20 October 2009

Great crackpot ideas of the past

Some people collect train numbers, some collect beer mats. I collect ideas. The most prized items in my collection are the great crackpot ideas that have inspired men to gleefully slaughter each other over the past decades. We could order these into a top ten, ranking them by millions of deaths through war and starvation, but this would take more research than I have time for at present. So here’s one great crackpot idea to be going on with: The peoples of the world can be arranged into separate states, each of which should comprise a distinct nation.

The ‘nation-state’ is a great crackpot idea because the peoples of Europe and the rest of the world have not distributed themselves into neat geographical clusters. Of course, rulers have tried for centuries to make their subjects more homogeneous, usually by attempting to eliminate inconvenient groups that persisted in adhering to minority religions, languages or customs. The ‘nation-states’ which dominated Europe after 1918 sought even more vigorously to make reality fit the crackpot idea, with mass transfers of population, suppression of minority languages, and exterminations. The creation of the European Union is an admission by all but a few recalcitrant nationalists in Europe that the nation-state has been a catastrophe for the continent, and that a muddled unity is preferable.

There are lesser crackpot ideas too. These cause havoc and dismay, but do not usually involve major loss of life. An example is the Research Assessment Exercise (RAE) for allocating research funds to British universities. This allocates about £2 billion/year according to a grading of ‘research excellence’ across all disciplines. These grades are assigned by a series of expert panels, based on their assessments of published research papers submitted by universities and on various intangible fudge factors like ‘research impact’. RAE assessments are completed every few years. The total cost of this exercise has been estimated at £47 million, mainly in the time spent by university staff in preparing submissions for the expert panels.

What makes this a crackpot idea is not just the high transaction cost of the exercise, but the impossibility of grading all human knowledge from genetics to philosophy on a simple rating scale. Most academics could probably agree about the leading centres in their discipline, but there would probably be little reliability in any ratings beyond that. To complicate matters, there is some excellent research in some mediocre university departments and vice-versa. Academics would also struggle to reliably rate research which crosses traditional disciplinary boundaries or which challenges dominant paradigms. Watson and Crick were lucky to have done their research before the RAE existed.

To make matters worse, the RAE grades past research, not current performance. Because of the long intervals between assessments and the delays in academic publishing, some of the research assessed may be based on laboratory or fieldwork almost a decade old. Panel ratings are therefore exercises in the history of knowledge. There is an interesting contrast when academics bid for funds to carry out new research projects. In this case, the research councils and the various charitable research funding agencies assess the academics’ recent performance and the prospect of a useful outcome from their proposed research project. Unfortunately, research funds are limited, and academic staff spend much of their time preparing research bids with limited chance of success.

The obvious solution to all this would be to close down the RAE and transfer the funds to the research councils. This would also save the RAE’s high transaction costs. Why won’t this happen? The answer is that however crackpot an idea, it generates winners and losers. The universities which gain most funds from the RAE worry they might lose under an alternative system. Besides funds, the RAE assigns esteem, and esteem counts for a lot in the competitive and hierarchical world of universities. Central government is also a winner: it can enforce regular changes in the formulas used to calculate the allocations of funds following the RAE, and thereby remind those working in higher education of their subordination to the state. There are important lessons of power here: use complicated rules to disguise the exercise of your power and to divide those you rule into squabbling factions. Get clever people to spend all their time in silly competitions, and they will pay less attention to the serious business of challenging your power.

Thursday 8 October 2009

Bullying as a career

Some years ago, the Government tried to improve the recruitment of teachers with the slogan ‘You never forget a good teacher’. That may be true, but you never forget really bad ones either. Most of all, you never forget the school bully. Bullying at school causes absenteeism, illness, and even suicide. Even after leaving school, the pain can live on. Friends of mine have had chance encounters with school bullies years later, and felt the same daggers of pain, anxiety and humiliation.

School bullies are usually portrayed in fiction as thick and cowardly, inflicting cruelty because of psychological abnormalities. In fiction, they are defeated in the end. But this is wishful thinking. An alternative fictional school bully is shown in Michael Palin’s Tomkinson’s Schooldays. Here, the school bully enjoys the exercise of cruelty, but uses it to gain exceptional favours, including access to cigars, whisky and the pleasure of attractive young Filipina women. In Tomkinson’s Schooldays, being a school bully is an important career, leading directly to the Cabinet.

Bullying has been a profitable career for many others. Those who enjoy personal power over others and exercise it cruelly will not only succeed in life, but will usually accumulate an adoring circle of cronies. This is because bullying serves many functions. Apart from the obvious gains of encouraging compliance, it can generate solidarity. In mediaeval Japan, the shoguns created a caste called ‘burakumin’, who were assigned the most inferior status in society. All others could share the joyful common task of bullying and humiliating them. The British Conservative Party has a successful history of building support by identifying groups of victims who cannot fight back, from immigrants, to single mothers, to the chronic sick on benefits.

In some cases, bullying is a response to specific impediments to management. An example would be in universities, where many staff are on ‘open contracts’ and cannot be dismissed except for gross misbehaviour. University leaders wish to enhance the prestige of their institution (and hence themselves) by expanding research. Research is deemed to be an exceptional intellectual skill, not available to people who are committed to lesser forms of scholarship such as teaching. University leaders therefore aspire to recruit researchers in place of teachers, and offload teaching to even lowlier staff. But if teachers are on open contracts with a full workload, they cannot be made redundant. Bullying is therefore employed. This can take the form of denigrating their work and closing the courses they run, excluding them from senior positions, allocating them to inferior work spaces, and threatening disciplinary action for minor (or no) infractions.

Heaven forbid that you might think this true of my own dear university, led as it is by saintly figures thinking only of the welfare of the staff and students in their charge.

Tuesday 29 September 2009

The generation of heroes and the generation of would-be heroes

I recently saw the film The Ipcress File on television. This was first released in 1965, the year I began living in London as a university student. The Ipcress File was a kind of antidote to the James Bond spy films, and emphasised the routines, the crowded working conditions and the everyday incompetences of working life. But the film had a special significance for me because of the profound influence its hero (played by Michael Caine) had on my life. When I was a teenager, no hero ever wore glasses. To wear glasses (which became essential for me from about the age of 13) was therefore to be consigned to a secondary role in life: I would be behind the lines, in the backrooms, but never the hero winning the medals and the beautiful women. Michael Caine in The Ipcress File was the first ever hero on film to wear glasses. His character was from a working-class background, had a disrespect for authority, knew his classical music and was an excellent cook. He also succeeded with attractive women. He thus (I believed at the time) provided a template for my own life. I began to listen to classical music, and learnt how to cook. The attractive women came, although not quite in the numbers I had hoped for.

Seeing The Ipcress File also made me reflect on why I wanted to be a hero in the first place. Born in 1946, I was raised in the shadow of war. My parents and all the members of their generation had stories about the War. One of my uncles had been in the Eighth Army in Africa and Italy, and another in the Airborne forces from D-Day onwards. My father had been unable to serve on health grounds, but all civilians were on the front line when the air raids began. There were gaps in local streets where bombs had fallen, and the centre of town (Birmingham) had its ‘bomb sites’. A local wood was full of rubbish left behind by American troops who had been stationed there. Comics, films and newspapers were dominated by stories of the War. Heroes were everywhere. ‘Heroism’ was more than bravery in the face of the enemy: it meant endurance and stoicism, a sense of common purpose, and a willingness to sacrifice for the sake of others. Many members of my generation absorbed these values, even though there was no war left to fight. The substitute for some of us was politics: an heroic struggle against oppression and injustice which involved marching but no gunfire. We followed the campaigns in the USA against segregation as if they were our own battle. We marched against nuclear weapons, and when we came of age, we marched to support Dubcek and oppose intervention in Vietnam.

Of course, our generation of would-be heroes was pushed aside by younger people who were not raised in the shadow of the War and who responded to the peacetime values of self-enhancement, both financial and psychological. These new values became important in the later 1960s. Sexual hedonism and drugs were followed by searches for mystical enlightenment. Eventually, most people realised that financial and psychological enhancement could best be attained by an endless sequence of purchases. These values came of age politically in this country in 1979, and still dominate the political class. As a result, politicians still talk like members of rival firms seeking to persuade reluctant consumers to buy their products. When faced with real crises, like recession or global warming, they lack the skills or the understanding to mobilise mass support. There is once again a need for politics to be an heroic enterprise, but we have no heroes any more.

Wednesday 26 August 2009

The Chaos in our High Streets

English high streets have become places of nomenclatural chaos. Visitors to this pleasant but damp island will have noticed that although the high streets of our cities and towns bear the marks of a varied history, they all contain much the same shops. The only exceptions are the small market towns not yet captured by the armies of Marks & Spencer, WH Smith, River Island, Greggs etc, and the specialist areas of large cities such as the Jewellery Quarter in Birmingham. Small local shops are usually explicit about what they sell (with the exception of hairdressers, who usually use punning names like ‘Headmasters’, ‘Hair Today’, ‘Headstrong’ and so on). The real problem lies with the multiples. Here are some examples of misleading names:

▸ Currys. Despite my persistent demands at our local branch for a chicken tikka masala, they insist that they only supply electrical goods.

▸ Boots. A series of mysterious stores which sell make-up, medicines, domestic goods and lots of other things, but no footwear.

▸ Thomas Cook. The name suggests that these are either restaurants (staffed by people called Thomas), or shops selling kitchenware. All they seem to sell are holidays and foreign currency.

▸ Office. A recent arrival in my local high street in Worcester. Demands for stationery were not welcome, and all they stocked were women’s shoes.

▸ Bank. The most confusing of all. No savings facilities, loans or credit cards, just piles of clothes on sale.

This trend to name shops after things they do not sell must come to an end. Our government must act in the name of health and safety, security, or any of the other reasons they usually summon up to order people around.

Monday 10 August 2009

The Laws of Information No. 3

The third law of information is:

3. Data that is collected to measure performance loses validity.

First, a confession. In the late 1980s, I worked as Director of Planning and Information in a mental health service in the NHS. One of my tasks was to organise the statistical returns on clinical activity for despatch to the Department of Health. At that time, these were based on a set of standard definitions called the ‘Korner system’ (after Edith Körner, who chaired the committee which recommended them). Our service included a brilliant and very hard-working consultant psychiatrist for the elderly. She believed that assessments of new patients should initially take place in the patient’s own home (a ‘domiciliary visit’ or ‘DV’). Since she was an orthodox Jew, this meant a lot of walking when her duty days coincided with the Sabbath. Unfortunately, the Korner system required information about scheduled outpatient clinics but not domiciliary visits. Following the Korner rules would have meant that our most active consultant would appear as our least active. This was obviously unjust, so I modified the returns for her clinical activity to record each DV as an attendance at a (non-existent) outpatient clinic.

Paradoxically, my data-adjustment produced statistical returns which were a more accurate reflection of clinical activity than would have been the case without such adjustment. Nonetheless, they became an inaccurate record of outpatient clinics in the service in which I worked. I suspect that data-adjustment in the desired direction was and is rife in the NHS. Although this is dishonest, it can cause far less damage than changing reality to generate honest statistics. A well-known example of changing reality in the NHS is to make patients wait in ambulances outside A&E departments. This reduces the time the patient spends in A&E for the purposes of official statistics, and hence enables the hospital to meet a government target. There are many, many more examples in the NHS of how meeting centrally-imposed targets can damage patient care.

This is not a recent phenomenon. The whole technology of corporate strategic planning and management by targets owes its origins to Gosplan, the state planning commission in the USSR. Studies of the Soviet economy from the 1960s onwards were full of examples of how rational responses by individual enterprises to centrally-determined targets could produce absurd results. These included the shoe factory that met its target number of shoes by producing shoes all in one size, and the steel factory that met its target for weight of steel by producing a few huge ingots.

Friday 31 July 2009

The rise of the celebrity wedding

Last week, I did not attend a family wedding. To be more accurate, I drove my mother (who is almost 90) to the venue and picked her up afterwards, but was not invited as a guest. The wedding took place in a splendid castle rather than a church, lasted most of the day, and (according to my mother) was like a long enjoyable party. Many of the men wore kilts, while the bride had no need to blush, since she had lived with the groom for several years. My mother had a wonderful time.

This kind of wedding is increasingly popular, and is modelled on the celebrity example. As social relationships within communities weaken because of commuting (for both men and women) and the domination of television, people take their guidance on how to live their lives from the examples of celebrities. Tradition, religion and morality become less important than an outward display of mimicry. The greatest celebrity events are weddings, held in private so that exclusive rights for photography can be sold to Hello! or OK! magazine. The event is therefore staged with the care that would be expected of a film or a play in the West End.

The celebrity example means that many people now regard a wedding as an opportunity to star in their own theatrical event. This has inflated the cost of weddings, which have become the largest single item of expenditure for many couples apart from buying their house. People cannot usually afford such an event when they first live together, and so weddings are postponed for many years. This has a profound effect on the meaning of the event. A wedding of this kind is no longer a commitment by a man and a woman to live together and support each other through life, but is instead a party to celebrate several years of sustained cohabitation. It is no longer a union of two families celebrated in their presence, or for that matter in the presence of the public.

So church bells will ring less often across the fields of my village on a Saturday, and weddings will become private fancy dress parties for ageing couples.

Monday 27 July 2009

The Laws of Information No. 2

Staff in offices, universities, schools and almost everywhere else are communication victims. The management in my own university is excellent at communicating to its staff. There are attractive magazines full of good news, regular staff meetings in which college heads present their challenges and achievements, all backed up by daily emails from an array of administrators to guide staff about their business. Yet a recent survey of staff has found dissatisfaction with ‘communication’. What could be the solution? More attractive magazines? More meetings? One answer that has not been considered is less (but more useful) information. As I noted in my posting on the First Law of Information, information is costly. Staff believe that all information emerging from senior management must be important, and therefore it must be read and understood. They do not have the time to do this in addition to all the other emails they receive daily, so messages accumulate in inboxes unread.

The cost of information is felt most acutely by staff when it is required from them. There are routine statistics to be completed, forms to be filled in on staff and student progress (including one for every single meeting with a research student!), surveys of staff satisfaction, and one-off requests for information which have descended the management line (usually with shorter and shorter deadlines at each stage of transmission). Staff usually see these requests as a chore to be completed quickly, and do not therefore strive tirelessly for accuracy in collecting and recording the required data. This leads to the second law of information:

2. Data is always less reliable than you think.
Scientific texts emphasise the potential pitfalls in gathering data, and careful scientists have standard routines for checking its validity and reliability. Gathering research data from people is particularly troublesome because of their capacity to fabricate, to rationalise, to forget, and even to avoid telling the truth as an act of politeness. Even in a world where people did none of these things, there would still be a lag between events occurring and data being collected, inconsistent application of rules for categorising data, and missing data. Yet these limitations are usually ignored when organisations collect and process information from their staff or from the public. Instead of using wide confidence intervals when reporting the information they have collected, organisations glibly report data to an exact percentage point. There are earnest debates about small changes in statistics from one reporting period to another, even though these probably fall within the confidence intervals.
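The confidence-interval point can be made concrete. Here is a minimal sketch in Python (the survey figures are invented for illustration, not taken from any real survey) of how wide the uncertainty around a glibly exact percentage really is:

```python
# A minimal sketch (invented figures) of the second law: a survey
# percentage reported to an exact point in fact carries sampling
# uncertainty. Uses the standard normal approximation for the 95%
# confidence interval of a sample proportion.
import math

def proportion_ci(p_hat: float, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% confidence interval for a sample proportion (normal approximation)."""
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n)
    return (p_hat - half_width, p_hat + half_width)

# Suppose 72% of 150 staff respondents report dissatisfaction:
low, high = proportion_ci(0.72, 150)
print(f"reported: 72.0%  ->  95% CI: {low:.1%} to {high:.1%}")
# prints roughly 64.8% to 79.2% - so a two or three point 'change'
# between reporting periods is indistinguishable from noise
```

On samples of this size, in other words, the earnest debates about small period-to-period changes are debates about sampling noise.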

Interpreting data would be difficult enough if it were simply a matter of general unreliability, but there is the far bigger problem of biased unreliability. This is the third law of information:

3. Data that is collected to measure performance loses validity.
I will deal with this in the next posting.

Tuesday 14 July 2009

The Laws of Information No. 1

After finishing my first degree, I worked for a summer in a typewriter factory. Typewriters are now so obsolete that it is usually necessary to explain to younger people what they were for. But this experience taught me a lot about information and how it is used in organisations. This was because I worked on what was then called ‘O&M’ (organisation and methods), reporting to a rather odd but very clever Welshman. The first law of information that I learnt was:

1. Information is costly.
Back in 1968, there were no photocopiers, office computers, or emails. If you wanted a copy of a letter, the typist had to insert carbons and additional sheets of paper when she typed. There was a limit of about three or four copies that could be made this way. If you needed more, then a different process was required. The typist would type the report on specially-waxed ‘skins’, which were attached to the drum of a machine we called a ‘Gestetner’. The drum contained thick black ink, which you always got on your hands. Both methods of copying were costly and time-consuming, and a major O&M task was therefore to reduce the amount of unnecessary information circulating round the factory. We did this by creating a flow diagram of all the routine reports generated by staff, and asking their recipients whether they found them useful. We found that many reports had begun as one-off requests by management to meet a specific need, but had then become routinised. Some reports went straight from the envelope to the waste paper bin.

This seems a lost world now because photocopiers, word-processing and emailing have successively made the production of multiple copies much easier. But this has had the effect of shifting the cost of information to the reader. People in offices now spend hours a week sifting through emails, most of which come from their seniors but are irrelevant to their work. Emails accumulate in inboxes, and the ones which require rapid attention are missed. Because the idea has taken root that information is cheap to reproduce, staff are required, often at short notice, to produce data and statistics for senior management. As in the past, these requests can become routinised even when the original need for the information has passed.

This indicates that organisations should revert to the O&M principle of reducing the flow of unnecessary information, to release staff time and speed up their response to the information that really matters. Without this, problems develop with the data we do have, which I will look at in a later posting.

Thursday 9 July 2009

Life as a palimpsest

Before printing on paper was invented, texts were written on parchment made from animal hides. Parchment was durable but expensive, and so it wasn’t wasted. If you had something to write, you took an existing parchment with writing on it, rubbed out a line of old text, and wrote in the space you had made. After this had happened several times, the piece of parchment contained multiple erasures and bits of text which, if read in sequence, made no sense at all. This type of parchment is called a 'palimpsest'.

The word ‘palimpsest’ has been used metaphorically to describe cities. Bits are knocked down and replaced by many different architects and builders, all with different aims in mind. This is particularly true of the sort of European cities in which the streets are not laid out in grids and where no king or emperor has been able to impose an overall plan. Palimpsest cities may make no sense (particularly to a visitor), but can be pleasant to discover: alleys and streets wind in mysterious directions; streets suddenly open out into hidden squares; churches and other imposing buildings occupy sites next to houses and office blocks.

‘Palimpsest’ can also be used to describe organisations. An example is the National Health Service in England, which has had numerous organisational changes and endless new initiatives, each with a new set of organisations to implement them. The resulting organisational structure makes the sort of sense familiar to readers of palimpsests. But the NHS keeps on functioning because the real work is done by doctors, nurses, paramedics and other people who know what they are doing. Chaos only intrudes when politicians, the Department of Health or some part of the senior management interferes. Living in an organisational palimpsest, they naturally speak a higher form of management gibberish (‘targeting the deliverables’ etc). In fact, the decay of language into this kind of gibberish is probably an indication that those who speak it live in a world of meaningless procedures and incomprehensible systems for evading responsibility.

Human life itself could be seen as a palimpsest. As you get older, your memory gets over-written by random experiences, different skills and knowledge. You make off-the-cuff decisions which have major implications for the rest of your life, and make sudden and unexpected changes to what you had intended to be an orderly and planned life. Of course, you don’t see it that way when you look back. Human beings have a marvellous ability to rationalise their actions and to see stories (and even conspiracies) where there are only random events.

Wednesday 1 July 2009

New gods for old

Michael Jackson is the latest of our gods to die. Mortal gods are nothing new - many of the Roman emperors declared themselves to be gods. But they seem feeble compared with the immortal gods (favoured by Christians, Muslims and others) who can be imagined as infinitely powerful, wise and just. Those who spend their lives worshipping immortal gods therefore have an ideal set of behaviours to which they can, however imperfectly, aspire. But immortal gods lack personality and presence, particularly if their religions discourage their representation in pictures or statues. Living gods overcome these problems. They become gods because they are ever-present, not in the theoretical sense, but by being on television screens every night. Being on television places them in the special realm of true reality, of which our own lives are but a grey, flickering and imperfect reflection.

Living gods are created by their worshippers, but do not usually require any other forms of special behaviour. Their worshippers are not required to forgo particular types of food or wear a special set of clothes or be more moral than the rest of us. The latter is particularly convenient because the people chosen as gods rarely lead exemplary lives. However, a true worshipper will deny any reports of their god’s wrongdoing (even pederasty and drug abuse), or regard it as a sign of the trials inflicted on the god to attain his or her status. Worship therefore involves idealising as well as idolising: the worshipper passionately follows the life of the god and creates shrines of icons of him or her at home. When the god dies, death is denied, symbolically and sometimes literally. The god dies in both senses when their last worshipper has passed away. Many years ago, I met an old woman who still worshipped Rudolph Valentino and another who worshipped Mario Lanza. Surely none are left alive who worship these mortal gods now, and Michael Jackson’s worshippers too will pass away over the next decades. People might still go on singing his songs, though.

Friday 26 June 2009

Valuing intellectually abnormal people

After many years of neglect, the Government in England has got round to making policy for people with an intellectual disability (usually called ‘learning disability’ in the UK). The main policy statements have the usual vacuous PR names - Valuing People and Valuing People Now - and include high-minded statements of principle combined with a commitment to ‘choice’. This is to be realised by subjecting people with an intellectual disability to lots of assessments, while at the same time relieving them of the burden of having anything to choose between. This is because services such as day centres are being closed (sorry, ‘modernised’), while funding for any kind of communal living is being withheld. After several scandals, the Government has finally conceded that some action may be needed to improve the health of people with an intellectual disability, but is still committed to winding down specialist mental health services for this group.

This is all justified as being ‘inclusive’ - the latest term in a sequence which began with ‘normalisation’, via ‘social role valorisation’ and ‘ordinary living’. However, these principles have to date been imposed only on people at one end of the IQ scale - those who are two standard deviations or more below the mean, equivalent to an IQ score lower than 70. To avoid discrimination, we should of course apply the same thinking to the other group of intellectually abnormal people: those two standard deviations above the mean, with an IQ of 130 and above.
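The arithmetic behind those cut-offs is worth a note: IQ tests such as the Wechsler scales are normed to a mean of 100 and a standard deviation of 15, so two standard deviations either way gives 100 ± 30, i.e. cut-offs of 70 and 130, each capturing roughly 2.3 per cent of the population on a normal distribution. A minimal sketch in Python, purely to show the calculation (the variable names are my own):

from statistics import NormalDist

# IQ tests such as the Wechsler scales are normed to mean 100, SD 15.
iq = NormalDist(mu=100, sigma=15)

lower = 100 - 2 * 15  # 70: two standard deviations below the mean
upper = 100 + 2 * 15  # 130: two standard deviations above the mean

# Share of the population in each tail (roughly 2.3% per tail).
print(f"IQ below {lower}: {iq.cdf(lower):.1%}")
print(f"IQ above {upper}: {1 - iq.cdf(upper):.1%}")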

When we look at the life-styles of this latter group in England today, we can see that these too offend the principle of inclusion. This is particularly true of arrangements for education and daycare. For a long time in the past, the intellectually-abnormal attended segregated special schools (the so-called ‘grammar schools’). The expansion of comprehensive education has of course reduced this segregation, but wealthy people still pay for their intellectually-abnormal children to attend residential special schools at places like Eton and Harrow. It is, however, after secondary school that non-inclusive policies dominate. Most intellectually-abnormal people at present go to segregated institutions (the so-called ‘universities’). Fortunately, changes in government policy have meant that these institutions have expanded to include a much wider range of students who are not intellectually abnormal. However, universities still have a major role in providing sheltered daycare for many people with intellectual abnormalities for the whole of their adult lives. This is often presented as being ‘work’, but you will often find low levels of activity, most of which comprises pointless and unfulfilling activities like teaching and marking.

The situation is hardly better in the residential circumstances of intellectually-abnormal people. Many seek their friends exclusively among other members of this group, and even marry each other. They often live together in communal settings (particularly when they are ‘students’), and tend to congregate in particular areas of towns. The position is far worse in the USA, where there are rumoured to be whole college towns of people with intellectual abnormalities.

What should be done about this? Following the principles of Valuing People etc, we should enable people with intellectual abnormalities to choose the lifestyles of the general population. There is no reason why they cannot work in open employment such as shopwork, clerical activities and manual assembly-work. Indeed, thanks to recent Government economic policies, more and more intellectually-abnormal people are moving into such jobs after they leave university. We should emphasise the importance of intellectually-abnormal people taking part in ordinary community activities (like going to bingo, the pub and the dog track), which will help them develop friendships with ordinary people. In that way, we will move to a society where everyone will increasingly be the same.

Wednesday 17 June 2009

How to drive the Worcestershire way

The most common places in which we experience rage are in queues and in cars. Being in a queue of cars behind a slow driver is worst of all. But this is a common experience when driving between the towns and villages of Worcestershire. Some drive slowly because they are elderly and know their reactions have slowed - this seems fair to me and I try to be tolerant. But others drive slowly as part of a set of strange driving practices that they believe make them very safe drivers. To help understand why they do this, I include a guide to how to drive the Worcestershire way.

Speed limits. Always drive 10 miles/hour less than the local speed limit. This will save you from being caught speeding if your speedometer is inaccurate.

Brakes. The brake pedal is the most useful part of the car. Push the brakes on when you are coming to a corner (any corner), when you see a car coming towards you on the other side of the road, when you are about to go up a hill, when you are going down a hill, and when you can see some traffic lights in the distance ahead of you.

Traffic lights. Wait a bit after the lights have changed to green. You never know if they might suddenly change back to red again.

Roundabouts. Never go on a roundabout if you can see another car approaching it. When you go on a large roundabout, always stick to the left-hand lane whether you are going left, straight ahead or right.

Country lanes. These do not have white lines down the middle, which means you can safely drive in the middle of the road. If you are lost, just stop the car and look at a map.

Motorways and other roads with three lanes each way. Very careful drivers avoid motorways altogether. But if you are on one, always drive in the middle lane. This gives you a clear view of the road, particularly important when other drivers try and pass you on the inside!

Filter lanes. Do not filter into traffic because you might then need to change lanes while moving. Instead, wait until the road is clear and drive straight over to the lane you want to be in. While you are waiting to do this, you should ignore the drivers behind you beeping their horns - they have obviously not learnt to appreciate how to drive the Worcestershire way.

Monday 15 June 2009

Where have all the Marxists gone?

Most of the time, you walk through life with your head down. Every so often, you stop, look around, and realise that the scenery has changed. Part of this scenery is the everyday chatter from the mass media, family and friends. The content of this chatter changes all the time, following personal experiences, news events, and the deeds of celebrities. But political and economic chatter can also be remarkably transient. Nobody now talks about monetarism, although this was one of the dominant political ideas of the 1980s. Neo-Liberalism and Neo-Conservatism also seem destined to wither. One set of ideas that has almost completely disappeared is Marxism.

This is strange because capitalism is now truly in crisis, and there was a time in the recent past when Marxism seemed a powerful set of intellectual ideas. At the time I was active in the British Labour Party (in the 1970s and early 1980s), political debate among activists in my constituency party was dominated by different versions of Marxism. Various Trotskyist factions existed within the Party, of which the strongest was the ‘Militant Tendency’. This was essentially a self-contained and disciplined political party which pretended to be an informal group of voluntary newspaper sellers. There was a minestrone of other Marxist groups outside the Party, most of which had names with various combinations of the words ‘Revolutionary’, ‘Communist’, ‘Socialist’ or ‘Workers’. None of these factions had much mass support: they were stage armies of would-be leaders of a revolutionary struggle which they (and many others on the Left and Right at that time) believed was imminent.

The lack of mass support did not seem important because British politics is dominated by secretive committees and caucuses. This gives great political leverage to small and well-organised groups. Control of constituency Labour parties can be won by getting a few activists nominated from poorly-attended or inactive local branches; these activists can then select candidates for Parliament, who are elected on a Party ticket even though voters usually have little idea of their personal opinions (or personal morality). Many of the most ambitious local politicians in the Labour Party in the early 1980s rose either as members of one of the Marxist factions, or by doing deals with them. Either way, this required sharing a political language of class struggle against capitalism. In reality, ‘class struggle’ amounted to organising local and Parliamentary election campaigns, supporting strikes, and passing long and angry resolutions to be despatched to the Party’s National Executive Committee.

This all made Marxist ‘entryism’ a successful strategy, up to a point. Members of various factions could become councillors, and a few became Members of Parliament. But once Marxists had made it onto the national stage, they showed an extraordinary ability to lose votes, particularly when they tried to use local councils to introduce ‘socialism in one borough’. The Party leadership realised at that point that they needed to act against the factions. But a more important cause of the decline of Marxism was the rapid disappearance of communism in Eastern Europe, and, more locally, the continued success of the Conservative Party in British elections. In other words, the struggle against capitalism seemed to have been lost, and ambitious politicians decided they must find another route to power. This other route involved following Tony Blair in portraying Labour as competent and non-threatening, but also being relaxed about major and growing inequalities in wealth, subservience to the USA in foreign policy, and the parcelling out of public services to the great profit of an inter-connected web of management consultancy, IT and PFI companies.

This is a world away politically from the Marxism of the early 1980s, but the ex-Marxists who now dominate the Labour Party have retained many of their old habits of mind. There is still the bitter factionalism, the preference for secretive committee-room conspiracies, and the contempt for the public and for democratic politics. Gordon Brown became Prime Minister as a result of the first two of these, while his failure to hold a general election shortly afterwards illustrates the third. What we miss from Marxism, and what has been abandoned by these politicians, is the political energy, the internationalism, and the dream of a society organised on a different and fairer basis.

Thursday 4 June 2009

The old wandering track

England is not a land of straight lines. Our streets curve, wander, change names at every intersection, and rarely intersect at right-angles. Our country lanes follow the tracks of paths between fields and the curves made by ancient plough-teams. Our footpaths follow old routes along ridges and between fords, and the paths taken to carry coffins from outlying hamlets to the parish church. So it is not surprising that experts were sceptical about Alfred Watkins’ proposal in the 1920s that there was an ancient network of straight tracks across the landscape, from one hill fort and stone circle to another. Watkins called these ‘ley lines’ because they often went through places with the suffix ‘ley’. I live in one such community; they are thick on the ground in this part of the Midlands, probably because this land was once covered by trees, and a ‘ley’ means a meadow cleared from woodland (and is pronounced ‘lee’ not ‘lay’).

Watkins’ ley-lines were adopted by ‘new age’ believers, the most rapidly-growing faith group in this country. Whereas Watkins proposed that ley-lines were just a means of finding your way across the wooded and boggy prehistoric landscape, new-agers speak of ley-lines leading along ‘lines of magnetic power’ across the ‘living rock’, and other such wonderful codswallop. I have a better explanation for lines in the landscape - the need to reach a good pint of ale. I have noticed that the pubs close to my home form a line, from the Fox at Lulsley, via the Talbot at Knightwick and the Admiral Rodney at Berrow Green, to the Crown at Martley. Extend the line northwards, and you get to the Red Lion at Holt Heath. Ale is an ancient drink, regarded through much of English history as much safer to drink than water. My ‘ale lines’ provide a much better motive for travelling than lines of magnetic power. Of course, if you stop at a few pubs on the way, you will no longer follow a straight path, which helps explain why the English landscape looks the way it does.

Wednesday 3 June 2009

Dining in Yuppieland

A map of Birmingham looks like a spider’s web. Like most inland European cities, Birmingham is encircled by a series of concentric ring roads, while main roads radiate from the centre, usually named after their destinations. If you head outwards from the city down the Stratford Road, you pass through the long straight streets of terrace houses of the late 19th and early 20th Century, into the semi-detached estates built a generation later. You then pass the sprawling ‘shed cities’ of supermarkets and the new ‘executive homes’ to emerge in the fields and villages that are the homes of the young upwardly-mobile professionals of Yuppieland. To an outsider, Yuppieland looks rural, but instead of farms there are golf courses, riding stables and craft centres. The cottages of the peasantry have become the mansions of the wealthy, while the village pubs have become expensive restaurants.

My experience is that dining in these converted pubs is dispiriting. Earlier this year, I went with my mother to Henley-in-Arden, a beautiful small town along the Stratford Road in the heart of Yuppieland. The restaurant had mediaeval features, superb decor and friendly staff. The food was a work of art, presented on elegant plates, and looked wonderful. It was also expensive and poorly-cooked: it somehow managed to be both tasteful and tasteless. This has been my experience on several dining expeditions to this part of the world. People in Yuppieland are prepared to spend a lot on their meals and like them to be an impressive visual experience in a fawning environment, but cannot discriminate between good food and bad.

Not all of England is like this. To encounter good food made from fresh local ingredients, it is better to move further from the big cities. In my own area of rural Worcestershire, there are two places I visit which always delight. The first is the Talbot at Knightwick. This is a small hotel with its own brewery and farm. The food is imaginative, is sourced from the Teme Valley and is full of taste. The main draught beers are called ‘This’ and ‘That’, and they are the best beers I have ever tasted anywhere. The plates and beer glasses are not the products of designer studios, but who cares. The second place to eat is the Venture In in Ombersley. This is a small restaurant (with Michelin rosettes) owned by its cook, who lives with his family above the shop. I have a family connection with the Venture In because my son Andrew worked there as a kitchen porter one evening a week before going to university. The atmosphere in the kitchen was far from the kind of shouting match shown in television programmes about restaurants. Instead, there was an effective and friendly team of people dedicated to cooking in the best way from the best ingredients. The result is food that is as close to heaven as can be attained on earth.

Thursday 28 May 2009

Where is the love?

As you get older, you spend more of your time in hospital - not just for your own ailments, but for those of your parents. This accelerates with increasing age. There are visits with your parents to a widening range of specialists followed by ever-longer hospital admissions, and then terminal care. You remember the grim meeting with medical staff leading to consent for ‘Do not resuscitate’, followed by the wait as the defences of a weakening body finally give way. These memories stay with me, but I also remember sitting by bedsides watching life in a hospital ward. What I usually observed was active neglect. Nurses would talk to each other in their ward station while incontinence bags overflowed. Every 10 or 15 minutes a nurse would walk round the ward looking briefly at each patient but not talking to them. No-one in the hospital saw it as their responsibility to comfort the sick and the dying.

How could this happen? One possible reason is staff burnout. Many people find talking to ill people emotionally taxing. This can wear down the kindest of people, who escape the pressure by reducing the emotional content of their interactions with patients or clients, and convert their work into a set of technical procedures. But this raises the question of why organisations do not take steps to avoid burnout, or why they continue to tolerate it among their staff. I think the main reason is that the emotional content of professional and personal support work is seen as problematic: training for this group of staff has therefore emphasised the acquisition of technical skills instead. Ethics is still taught as a subject in professional training, but has been reduced to a set of guidelines to follow in obtaining consent for research or for the application of medical, nursing or other procedures.

This loss of the emotional content of health and social care is part of a wider trend towards the excision of passion and feeling from organisational life. It is assumed that no-one works because they have skill and personal commitment, and a passion to apply it to their work. One consequence is the belief-in-practice (written into numerous guidelines and codes of practice and quality assurance manuals) that no-one can be trusted and no-one can perform well unless regulated and inspected or, if they are senior managers, given generous financial bonuses to do their work. This belief becomes self-fulfilling: staff become de-motivated and truculent under the weight of inspection, while senior managers become oriented solely to their bonuses and neglect wider responsibilities.

Of course it is often argued that such mechanisms of control are necessary to manage large organisations and complex centralised states. But this fails to ask whether we need such large organisations or whether our society should be so centralised. It is possible that this trend to centralisation has come about not because it is better at producing goods and services, but because it has generated a new and dehumanised ethos of organisational rationalism. This incorporates the distrust of human emotion and commitment, and hence the negative view of human nature. It proposes instead that humans are properly motivated only by a combination of financial rewards and penalties. This looks rational in the sense that economists speak of rational behaviour, but at its core are the darker emotions of greed, fear, and love of power.

Thursday 14 May 2009

You are not included

Are you socially-excluded? If so, what exactly have you been excluded from? Is it from the basic things of life like a decently-paid job, an active community life, and having a spouse and children? Or is it from life’s pleasures, like eating Marks and Spencers’ ready meals, going on foreign holidays, or even membership of an exclusive London club?

‘Social exclusion’ is a wonderfully vague phrase promoted by governments because it avoids the need to talk about poverty and inequality. It is never made clear precisely which sorts of people are socially-excluded, or what precisely they are excluded from. Nor is it evident why they are excluded in the first place. The term ‘socially-excluded’ has at different times been used to include people on low incomes, disabled people, people from ethnic and racial minorities, and even the elderly and the young. When the phrase first became part of political discourse in the 1990s, it was agreed that the problem lay with the excluders - with the lack of opportunities for well-paid work, with poor schools and with discrimination. Now governments increasingly suggest that the problem lies with the excluded, and that ‘social inclusion’ really means coming off welfare benefits. So the unemployed, single parents, the disabled and the chronically sick are to be hectored into taking poorly-paid work, irrespective of the fact that work of any kind is getting hard to find. Of course, if it all gets too much for the socially-excluded, cognitive behaviour therapy will be made available to cheer them up.

But the hectoring received by the ‘socially-excluded’ is only a rather more extreme version of what the rest of us have to put up with. This Government have sent leaflets to our homes telling us that we must all eat more healthily, that we must wash our hands, and use a tissue when we sneeze. Our children are assessed and examined more frequently than in any other country in the world, so that teachers can be harassed into improving their school’s position in national league tables. To supposedly prevent crime, we are watched, scanned and regulated by the largest CCTV network in the world. To supposedly prevent terrorism, basic legal protections are stripped away, with old men arrested as ‘terrorists’ if they heckle the prime minister at a Labour Party conference.

Yet there are small groups of people who have been exempt from all this monitoring, assessment and harassment - our financial sector and our members of Parliament. Financial leaders have squandered cash on foolhardy investments and driven their institutions to bankruptcy. Once found out, they have retired on very generous pensions while their banks have been funded at immense expense to the taxpayer. Many members of Parliament have used an elastic expenses system to refurbish one or more houses, make large capital gains, clean their moats, employ their spouses on generous salaries, and buy porn videos. When they are defeated at the next general election, they too will retire on a generous superannuation. Both bankers and members of Parliament explain away their behaviour by saying they acted within the rules, and that their activities were subject to audit. But this audit and inspection has been a sham - designed to give the impression of regulation while actually allowing them a free hand.

A sham of this kind exists because the powerful believe in their hearts that those who make the rules need not live by them. This network of beneficiaries constitutes the ‘politically-included’, who live from the savings and taxes of the rest of us (the politically-excluded), who must pay for our own mortgages, rents, patio heaters, dogfood, moat-cleaning and so on. Nevertheless, this experience may be character-building, and we have the chance in future elections to improve the characters of our current politicians by enabling them too to experience the delights of political exclusion.

Wednesday 6 May 2009

How to be Impatient - Part One

You are standing in a queue at a railway station waiting to buy a ticket. The train will leave soon. The person buying a ticket is fussing with their money, then asks for a timetable, and then asks the staff in the ticket-window which platform the train leaves from. When told ‘Platform 2', they ask ‘Where is Platform 2?’ If you feel mounting irritation at this point, you are, like me, one of the impatient.

Few people seem able to praise impatience; cultural superiority lies with the stolid, the patient, and the inert. Never mind that most innovations have occurred because some impatient person got sick of waiting. This all means that impatient people, like me, are an oppressed cultural minority: we need to use our wits to survive. This obviously requires a lengthy book of hints, of the kind that get sold in airport bookshops. Regard this text as Part One.

The main area in which impatience is experienced, at least in England, is on the roads. I drive a small sports car, and enjoy zipping along the open road. However, a more usual traffic experience is to be at the end of a queue of slower vehicles, tailing along behind a Nissan Micra (the underpowered car for underpowered people). This shows the terrible paradox of driving - that the slower the car, the more likely it is to be at the front of the traffic with an unimpeded view of the road ahead.

So there is not much advice I can give impatient drivers. However, I do have a handy hint regarding another area of impatience: which queue to join in the supermarket check-out. My advice, based on sustained observation and testing, is to choose the queue with the fewest women. There are two main reasons:

1. Women are nicer than men. Women are more prepared than men to regard the staff on the checkout as people, worthy of conversation. This takes time. They will also attempt to help the checkout staff by giving the exact change for their purchases. This involves carefully sorting through their money and counting it out. This also takes time, and contrasts with the male approach of dropping a banknote.

2. Women are more careful with their money than men. Most men keep their cash in their trouser or jacket pockets. Women, on the other hand, keep their cash in a series of containers, each inside the other. The purse (designed to be small and difficult to get fingers into) is kept inside a handbag, which is often inside a shopping bag. Any purchase is therefore preceded by a sequence of money-discovery, and followed by a reverse sequence of money-concealment.

To come later: hints on how to break the buffet queue.

Monday 4 May 2009

The conspiracy continued...

On the very next day, I read the following letter in my local newspaper, the Worcester News of 3 May 2009:

SIR – It is highly worrying that on Tuesday morning at Birmingham airport, 400 passengers were allowed to leave a flight from Brazil and go their separate ways – without any checks, records or advice.

Why did nobody go on the plane before embarkation, check the passengers and give advice?

It is now quite obvious that the authorities are not serious about containing the spread of this virus – and for very good reasons. The benefits of a worldwide killer pandemic are huge!

A large global loss of life through ostensibly “natural causes” would lift a huge burden from politicians, by reducing world over-population at a stroke without the enormous cost of any war. It also instantly reduces the demand on our stretched food stocks.

I would go further and say that it might have been surreptitiously and deliberately introduced and Mexico would be the ideal propagation nation. The pieces of the puzzle fit rather too neatly for my liking.

So, always use a hanky and wash your hands frequently.

STANLEY D PARR,
Pershore

As I suspected, some people will find a conspiracy everywhere.

Sunday 3 May 2009

The Great Mexican Swine Flu Conspiracy - Gate

The 20th century saw the creation of a wide range of new art forms: the cinema, television, arranging seats in football stadiums to make words and pictures, and the Internet conspiracy theory. Of course, conspiracy theories are not new. Through much of history European countries have blamed any experience of collective adversity on the Devil or the Jews. But the 20th Century saw many new scapegoats, including international capitalism and international communism, the oil industry, Muslims, and (in the USA) the Federal Government and something called ‘the New World Order’. The Internet has allowed conspiracy theories to multiply and spread in days rather than months or years. The isolated and paranoid can meet their fellows on-line and confirm and reinforce each others’ beliefs.

What makes a good conspiracy theory? There should be a major disaster involving loss of life. Obvious natural events, like earthquakes, floods and volcanoes, must not be involved (although this will not stop enthusiastic religious believers from attributing these events to the desire of a just and loving God to indiscriminately wipe out large numbers of guilty and innocent people as a sign of man’s fall from grace). There must be some real or imagined discrepancy in the official explanation for the disaster, which can be ascribed to a desire by those responsible for the conspiracy to hide their involvement. Further attempts by governments to clarify the causes of the disaster, hold commissions of enquiry etc, will be portrayed as a ‘cover-up’, usually involving the suffix ‘gate’. The fact that an effective cover-up would require complicity by the entire government, press and civil service becomes ‘evidence’ that it is a truly immense conspiracy.

What are the conspiracy theories of the future? I propose that Mexican Swine Fever could do the job. This began with predictions of pandemics and millions of fatalities. Then, all of a sudden, we heard of a few non-fatal infections. A conspiracy theorist would argue that the millions of deaths have taken place and are still taking place, but that governments are covering this up by taking over isolated warehouses and incinerating bodies in secret. Lots of people disappear every year, and this explains what is happening to them. Other bodies are being loaded on to ships and sent to the Gulf of Aden, where Somali pirates, who are part of the conspiracy, capture and sink the ships. We could imagine an even more comprehensive conspiracy theory, in which governments are responsible for the Swine Fever in the first place. This could be a result of using pigs for germ warfare experiments, to be tried out first on the Mexican population. Perhaps governments are in league with the Green Movement, who believe that only a massive reduction in the world’s population can - er - save the world’s population. Of course, the conspiracy theorists could themselves be part of an even wider conspiracy...

Thursday 30 April 2009

Looking down on others' needs

Most of us have pretty clear ideas about what we want at any point in time (food, drink, sex, peace and quiet etc), but are less good at planning our lives in the long term. Sometimes we get it right and marry a warm and sympathetic partner, get a job that pays enough and keeps us interested, and have holidays and recreations we enjoy. We then congratulate ourselves on our wisdom rather than our luck, and consider what should be done about people who are lonely or in unhappy relationships, are unemployed or in poorly-paid and tedious jobs, and who otherwise lead dreary and unhappy lives. For some people, this concern for the needs of others is part of their job. They include politicians and other policy-makers, but also a whole range of professions that make day-to-day decisions about people with learning difficulties, people with mental health problems, the elderly and infirm, and others judged incapable of making decisions about their own lives.

But this task raises a real problem: on what basis can you assess the needs of other people? One traditional solution is to assume that all people have the same needs and should live their lives in the same way (usually involving the performance by the entire community of the same religious or patriotic rituals). But in more individualistic societies, another way of assessing needs is required. Fortunately, psychologists and social researchers have the answer: detailed lists of human needs and ‘objective’ measures of the quality of life. The oldest of these categorisations of needs, and the one most people have heard of, is Maslow’s Hierarchy.

Maslow proposed that human beings seek to satisfy needs in a set progression, starting from the most basic physiological needs. Once these needs are satisfied, human beings can aim to satisfy successively higher needs, leading ultimately to self-actualisation and self-transcendence. What do these last two phrases mean? ‘Self-actualisation’ for Maslow meant reaching one’s full potential, which he believed was marked by creativity, an internalised morality, spontaneity, and closeness to others. ‘Self-transcendence’ referred to peak experiences in which a person experiences a sense of spiritual fulfilment.
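Read literally, Maslow's progression is a little algorithm: attend to the lowest unmet need before anything higher. A minimal sketch of that literal reading, in Python - the level names follow Maslow's usual pyramid, while the scores and the threshold are invented purely for illustration:

# Maslow's levels, ordered from most basic to highest.
LEVELS = [
    "physiological",
    "safety",
    "love and belonging",
    "esteem",
    "self-actualisation",
    "self-transcendence",
]

def current_focus(satisfaction, threshold=0.8):
    """Return the lowest level not yet adequately satisfied.

    `satisfaction` maps level names to scores between 0 and 1;
    `threshold` is the (invented) point at which a need counts as met.
    """
    for level in LEVELS:
        if satisfaction.get(level, 0.0) < threshold:
            return level
    return LEVELS[-1]  # everything met: the top of the pyramid

# A hypothetical person: fed and safe, but short on companionship.
person = {"physiological": 0.9, "safety": 0.85, "love and belonging": 0.4}
print(current_focus(person))  # -> love and belonging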

There are some obvious problems with Maslow’s approach. People are too diverse to fulfil their needs in the orderly way he suggested. We can all think of examples of people who follow a different hierarchy: people who starve themselves for beauty or for religious belief, and people who seek out risk and danger. People also differ in what they regard as ‘self-transcendence’. People like Aldous Huxley and Timothy Leary proposed it could be achieved through drugs (so for that matter did Ian Dury in his song Sex and Drugs and Rock and Roll). To most of us, however, drug-taking looks more like self-destructive hedonism. I suspect that Maslow esteemed ‘self-transcendence’ because he was a humanist psychologist. If models wrote hierarchies of needs, they would probably place beauty and poise at the top, while footballers would place agility and teamwork. In other words, setting up a supposed hierarchy of needs becomes an implicit way of ranking other people or looking down on their needs. This would not matter too much, except that hierarchies of needs get imposed on others.

The most imposed hierarchy of all is that of rationality. Many philosophers (who as a rule are rather good at reasoning) have decided that rationality is not only at the top of the hierarchy but also the true marker of humanity. People can then be ranked according to their reasoning powers, with questions raised about the extent to which people with severely-impaired reasoning are really human. This has led to the ‘thought experiment’ by the philosopher Peter Singer and others which compares the relative worth of a severely-disabled person and one of the more intelligent animals. Singer’s views are subtle and have been misrepresented, but anxiety about this kind of thought experiment is understandable. It is based on the fear that somewhere in a garret is a lonely soul planning a rise to political power in which thought experiments will be implemented. Then the killings will begin again.

Monday 20 April 2009

My life as a steam engine

Each old person is a living museum. They remember old ways of dressing, cooking and eating, old words and old ways of thinking. Some try and continue to live in the old way, but others adapt while being aware of what has changed. Technological and social change is now so fast that people in early middle age can feel outdated and hence ‘old’. They look on, bewildered, at things that are day-to-day and commonplace for younger people, like Facebook, text messaging and music downloads. Even stranger are fundamental changes in patterns of thought, like the transition from being a steam engine to being a car.

When I was young, absolutely no-one reported that they were experiencing ‘stress’. The common metaphor of the time was the steam engine, which still dominated rail transport. Like a steam engine, people said they were ‘under pressure’ when life became difficult, the answer to which was ‘to let off steam’. This involved some physical activity or other form of release such as drinking and socialising. By the 1970s, however, steam engines and rail transport had largely been replaced in people’s minds by motor transport, and people came to regard themselves as being a type of car. Cars and other fast-moving machines suffer from metal fatigue and stress, and this became the dominant metaphor for the emotional state experienced when people face adversity. The conventional answer to being ‘under stress’ or ‘stressed out’ is inactivity, or possibly handing over responsibility for their condition to a therapist. There is no shortage of these, all promoting their services as helping people avoid the sad fact that life includes a fair share of loss, pain and grief.

As for me, I still think of myself as more of a steam engine than a car. When life gets difficult, I prefer to become more active. Of course, it is still helpful to unburden myself on my wife and others, even if it does make them feel stressed.

Sunday 19 April 2009

Synthetic nostalgia

Go to any country in the world, and you will find folk museums. These aim to show how people lived their daily lives in the reasonably recent past, at least within the memory of the oldest inhabitant in the land. Folk museums differ from historical reconstructions, like the Plymouth Plantation and Williamsburg in the USA, which show a much earlier time, close to the origins of their societies. In most countries, folk museums show rural life, but not in England. In the world’s first industrialised and urbanised country, there is no popular memory of rural life: instead, our folk museums contain facsimiles of coal mines, factories, canals, and terrace houses. These evoke strong feelings of nostalgia and personal identification. My wife, who was born in the Black Country, can visit the Black Country Museum and recognise the type of house her grandmother lived in when she visited her as a child. I too experience nostalgia when I visit this Museum. Yet neither my parents nor my grandparents, as far as I know, lived in back-to-back houses in the Black Country or elsewhere. My nostalgia is therefore synthetic. Where does it come from?

The origin lies with my parents, particularly my father. He was born in 1914 in what seems to have been a reasonably prosperous family which lived in villages and suburbs on the outskirts of Birmingham. He told me that, when he was a child, the family had a domestic servant. Yet his parents were defrauded of their business, and he came home from school one day to find the family’s furniture and other possessions on the street following eviction. He passed the exams for the local grammar school, but failed the interview. In the absence of a secondary education, he did a variety of poorly-paid jobs, including truck driver (at the age of 14!), chauffeur, workhouse clerk, and even coalman. Eventually, he became a welder on the Land Rover track in Solihull and then in Birmingham. I remember him cycling to work in the rain, and the burn marks down his chest from welding sparks. I am not sure how these experiences shaped him, but my father always had a strong sense of social justice and a commitment to trade unions and the Labour Party.

I spent the early years of my life following in his footsteps and inheriting his ambitions. I was active in Labour Party politics and imagined myself as a future MP. I felt part of a working-class movement that would make society more just and treat all people with respect. So when I first visited the Black Country Museum, I identified with the past lives of the working-class, and adopted this synthetic nostalgia. I do not believe my experience is unusual: the difference between us all is the nostalgia we adopt. I have a genial colleague at work who has adopted the mannerisms of an Oxbridge scholar even though he comes from near Bolton. I know of people of marginal religious faith who have adopted the lifestyle and clothing of the committed believer to assert their membership of a lost identity.

My father died in 2000. He spent the last 15 years of his life unable to speak following an operation for cancer of the throat. As a man who enjoyed talk and argument, this must have been a great loss, but he managed his life with day-to-day stoic courage. I do not pass a day without thinking of him. From both parents, I learnt the importance of education as a means of advancing from dreary low-income work, and have spent my working life in fulfilling occupations with few money worries. I still retain my father’s political convictions, though, unlike him, I feel unrepresented. The political party that he and I worked for year after year has all but disappeared. Its successor, which calls itself the ‘New Labour Party’, has betrayed almost all the hopes of those who voted it into office. After the Iraq War began, I tore up my party card. I am no longer nostalgic for an imagined working-class past, but my anger and sense of betrayal remain strong.