Showing posts with label Organisations. Show all posts

Sunday, 21 February 2016

Trapped in the university

Much of popular anger with politicians comes from a belief that they fail to successfully tackle the major challenges facing our society. Since many members of the public regard political leaders as all-powerful beings, they can only ascribe this failure to weakness or mendacity. This leads to a demand for politicians who exhibit strength of purpose and/or personal morality. Unscrupulous candidates for office (as now in the USA) respond to these expectations with blustering statements of their ‘toughness’ and tedious personal expressions of piety.

Yet when we read the autobiographies of the most senior politicians, it becomes clear that they struggle hard to achieve anything. Organisations, even in authoritarian societies, are now so complex that there are multiple points of resistance to any amendment in policy or day-to-day practice. Instead, complex organisations carry on doing what they have always done, in the way they have always done it. As a result, our leaders find themselves trapped, often reduced to the role of spokesmen for decisions that seem to have been made by no one person, but which have somehow become inevitable.

I experienced this sense of being trapped in a small way when I was head of the Division of Neuroscience in Birmingham Medical School. I attributed my lack of effectiveness to personal incompetence until one day, together with other senior managers in the School, I attended a management development session. One of the tasks set by the facilitator was for each of us to prepare a montage showing our view of our role in the University. We were provided with a board, various newspaper colour supplements, scissors and glue. We set to work. When we had finished, I looked round at what we had produced. Every board without exception included prominent images of railings, iron bars and other signifiers of imprisonment. In other words, all of us felt trapped in the university.

The management training session taught me that my colleagues also experienced a sense of powerlessness. Perhaps their appearance of competence showed not so much that they were better managers than me but better actors. I tried thereafter to act the role of a competent manager, but I do not enjoy acting and after a while I ceased to bother. After five difficult years as Head of Division, I retreated to thankful obscurity.

Wednesday, 12 August 2015

The perils of being nice

When I was at school, we were warned against using the word ‘nice’ - a word, we were told, which was milk-and-water, signifying the bland and the inoffensive. Being told to ‘Have a nice day’ was therefore hardly a blessing, which may explain why most English people wish you a ‘good day’ instead. Nevertheless, niceness is a real and important phenomenon which shapes day-to-day behaviour, particularly in large organisations. The most important part of niceness is an extreme reluctance to say anything which might possibly cause offence to another person, and a corresponding fear of being the subject of complaint by another. Niceness consequently means remaining silent when people do something wrong, refusing to challenge another person’s opinions however wrongheaded they may be, and avoiding any action that might possibly be attributed to you personally.

Niceness is often confused with good manners, but they are different. The core of good manners is consideration and respect for others. This means that you take the opinions of others seriously, disagree where appropriate, but do so in a way that does not humiliate or intimidate. The difference between niceness and manners can be shown in this example. Some time in the 1990s, I was asked to give a presentation at an NHS conference in Birmingham on ageing in society. I was due to speak in the morning session after several other speakers. There were the usual rules in such matters - 15 minutes for each presentation, followed by five minutes for questions. All the speakers kept to these rules until the one before me - a woman who had recently completed a PhD. Her presentation was a description of her research, set out at length, with one tedious detail after another - all spoken in a dull flat voice. Before the talk, she had placed a sheaf of paper copies of her overheads on each seat, and the audience realised after half an hour that she was still less than half way through her intended talk. Despite this, the chairman failed to act until a member of the audience (the local political activist Dave Spilsbury), asked “Mr Chairman - when is this talk going to end? Some of us would like to hear the next speaker”. The chairman, with obvious reluctance, asked the speaker to draw her talk to a close. She droned on with no sign of concluding for another five minutes, until he finally told her to finish. There were of course no questions. It was then my turn. I spoke for ten minutes in as punchy a manner as I could manage. After that, the audience inevitably applauded with great enthusiasm.

As I left the room for lunch, I heard one woman say to another: “That man was so rude”. She meant Dave Spilsbury, not the nice and ineffective chairman, who had failed to exercise the very simple task of keeping a speaker to the allotted time and had therefore shown a lack of respect to the audience and to the next speaker. His behaviour was therefore an example of bad manners combined with niceness.

I had even worse experiences at two other NHS conferences, when the chairmen allowed the speaker before my presentation to drone on for twice their allotted time and then asked me whether I could possibly shorten my talk “because we seem to be running over”. The three ineffective chairmen at these conferences were all senior managers in the NHS, and their niceness may have been a factor in their career success. Niceness was indeed the dominant culture in NHS management and the other public sector organisations in which I have worked, and those who conform to the dominant culture tend to be the most successful.

When I worked in the NHS, one general manager (who later rose to great heights) would look concerned whenever disagreement broke out in a board meeting, and then immediately suggest that the issue should be considered by a subcommittee. This ensured that a nice atmosphere could be preserved at the meeting and that all disagreement (or difficult decisions) could be avoided. One consequence of this tendency is a preference for reacting to events rather than anticipating them. In this way, conflict can be avoided and decisions presented as a fait accompli. When I was a member of the same management board, the monthly accounts at the start of the financial year showed a substantial operating deficit. I pointed this out and suggested we start planning how to re-organise services to reduce costs. But this view was dismissed, and the deficit accumulated until, at the very end of the financial year, the general manager announced to staff that the board had reluctantly been ‘forced’ to close a ward.

A second consequence of niceness in organisations is a futile obsession with secrecy. Since criticism is to be avoided at almost all costs, all decisions are inspected for any possible embarrassment they may cause, and a major effort is put into keeping them secret. Keeping things secret is thereby given greater priority than challenging incompetence and dealing with abuse. Staff who abuse patients or clients are therefore quietly re-located instead of being dismissed. The culture of niceness among staff means a lack of respect for those in their care.

See also: The rudeness of strangers

Monday, 13 July 2015

The dangerous mythology of computers

Last year, my aged laptop ceased to work and so I went to the local PC World to buy a new one. I discovered that it had the new operating system called ‘Windows 8’, in which the main screen resembles that on a mobile phone. I managed to find out how to make this work by trial and error, and eventually modified it so that it resembled the earlier version of Windows with which I was familiar. I do not use the programme Microsoft Word and prefer Corel WordPerfect, which I re-installed on the new computer. However, I do use PowerPoint, and so I decided to buy the Microsoft Office suite, which also includes Word and the Excel spreadsheet programme. Instead of giving me an installation disk, PC World took my money and gave me a product key on a card, so that I could download the Office suite from the Microsoft website. All went well until this week, when I finally needed to use Microsoft Word. As soon as I loaded the programme, an error message informed me that it had ceased working and it closed down. The same thing happened, I discovered, with Excel.

So I phoned Microsoft technical support for help, and was told that they could solve the problem if I took out a support contract. In other words, Microsoft had sold me a product that did not work and wanted more money to make it work. My next step was therefore to contact PC World, who as the retailer have a legal obligation to sell goods that are ‘merchantable’ (a legal term meaning that goods should be reasonably fit for the ordinary purposes for which such products are manufactured and sold). Their technician told me that such principles could not apply to computers because of their complexity, and the possibility that different software applications can conflict with each other. However, they could sort it out for me if I paid them a £50 repair fee. Not being a complete mug, I went home and checked the Internet. It became clear that my problem was a common one, and occurred because Word and Excel are incompatible with a programme called ‘Finereader’. I have never used Finereader, and do not remember installing it, so it was probably supplied with the computer. Anyway, I deleted Finereader, solved the problem, and saved £50 plus whatever a Microsoft support contract costs.

All this illustrates an important part of the mythology of computers and their software: that they are complicated, that they are expected to be fallible, and that their failure is somehow not the responsibility of those who made them. Complexity and fallibility were once expected of cars. Up to the 1980s, brand-new cars would often break down, and old ones did so frequently. Car-owners would expect to spend a lot of their time attempting to repair them or waiting for a roadside recovery service. All this ended when Japanese manufacturers began to make cars that were highly reliable. A similar transition has yet to occur with computer systems, and so there is still a common belief that computer systems are inherently unreliable. This suits the corporations that make and market them. They can continue to produce poorly-designed products like Windows 8 and expect supplementary payments from customers when their products prove unworkable. The repair fees and software support contracts from millions of customers must amount to a tidy sum.

Even better profits are generated by the failure of large contracts for public computer systems. This month, the National Audit Office has reported that the General Practice Extraction Service (GPES) has utterly failed to operate. This is not failure in the sense that the system is unreliable or slow; rather, it is failure in the sense that it has never managed to generate a single piece of data for general practitioners. The GPES failed to operate when it had cost £19 million, which resulted in a further large input of public cash, so that it now fails to operate having cost £40 million. But that is chickenfeed compared with some other public procurement systems, like the NHS patient records system abandoned after costing at least £10 billion. In any rational economy, the companies responsible for wasting public money on this scale would be out of business, and the civil servants responsible for procurement and management would be in prison or exile. I suspect, however, that they have retired with a good pension, a lucrative ‘consultancy’ post, and the award of a medal from the honours list.


Sunday, 9 June 2013

Not staying focussed

Every person who is successful in whatever walk of life now claims that the secret of their success is their capacity to ‘stay focussed’. ‘Staying focussed’ has become a sort of magical mental state, recommended by advisors to the great majority of us who only lead lives of middling achievement. Like all clichés, ‘staying focussed’ is popular because it conveys certainty without any precise meaning. At its most mundane, it could mean that a person should concentrate on the task in hand or on some immediate ambition. In a broader sense, it could mean that a person is advised to pursue self-advancement irrespective of effects on their health, their personal morality, their responsibility for others, the happiness of their family life, or the mental health of their children.

Of course, it is essential that for any great task to be completed a person or team of people must concentrate on understanding the problems they must overcome and work together to achieve success. But there are two problems with advising people to ‘stay focussed’. The first is that their greatest ambition in life may be one that cannot, or should not, be achieved. It is amusing to watch programmes like The X-Factor and see contestants who utterly lack both talent and insight. Their rejection by the panel evokes bewilderment, anger and a renewed determination to succeed at becoming stars - even though ‘success’ will probably in their case mean little more than a wasted lifetime of singing out-of-tune to diminishing audiences. All of us, but especially the most focussed, need to learn what we are not good at. That is not to say that we should avoid activities in which we do not excel. We may of course gain great pleasure from singing, dancing, stamp-collecting or whatever: we should persist in such activities even if we recognise that we will never be world-class. If something is worth doing, it is worth doing even if we do it badly.

The second problem with being ‘focussed’ is that people who concentrate on the task in hand lose sight of the broader picture: they do not see the system. We have all met junior members of staff who rigidly apply the rules of their job even where this undermines the purpose of their employer. But this becomes truly damaging in senior management and politics. Alistair Darling’s memoir of his time as Chancellor of the Exchequer (Back from the Brink) looks at the events leading up to the collapse of the banking system. He notes that the Bank of England and the Financial Services Authority (FSA) were used to assessing the financial stability of individual banks: they did not take into account the massive extent to which banks borrowed from each other, such that the collapse of one bank would topple all the others. The staff at the Bank of England and the FSA were so focussed that they could not see the whole system. Nor indeed were there any senior civil servants in the Treasury with an understanding of the whole system.

A similar failure has occurred recently with the crisis in accident and emergency (A&E) departments in hospitals. The Government has made substantial cuts in funding to local authorities. These have responded by reducing the support they provide for the elderly, the disabled and the mentally-ill. As a result, vulnerable people are discharged from hospital, are unable to care for themselves and are rapidly re-admitted. Governments have persistently failed to see that health and social care services are essentially a single system: cutting expenditure on social services results in expensive hospital beds becoming blocked by people who could remain in their own home (or in a care home) at a better quality of life and less public expense.

Why is system-thinking so rare? The main reason is that it is difficult and becoming more so. System-thinking requires a breadth of knowledge of how many different sorts of institutions operate and the ability to analyse their inter-connectedness. But as society becomes more complicated, people must work ever harder to understand their own small part of it. It is often said that academics advance by knowing more and more about less and less. But the same is true of many other occupations. As a result, people specialise and become experts in their own narrow field or organisation and see the rest of the world as their ‘environment’, either predictable or the origin of unexpected and incomprehensible demands.

Perhaps we need a new set of clichés. Instead of encouraging people to ‘stay focussed’, we should advise them to ‘always see the broader picture’ or ‘look at how it all works together’, or even ‘try not to be too focussed on one small piece of the jigsaw’.

Wednesday, 10 April 2013

Laws of Information 4 and 5

A long time ago, I proposed some ‘laws of information’, looking particularly at the kind of information available to manage large public organisations. These are as below:
1. Information is costly.
2. Data is always less reliable than you think.
3. Data that is collected to measure performance loses reliability.
You can click these weblinks to access the relevant text for each law.

Here is another law:
4. There is always more information available than you first thought.
Natural science has advanced through ever-improved measurement techniques, of which the Large Hadron Collider is the most recent and by far the most expensive. Each area of science has its own preferred measurement technique, and great effort is expended in improving its accuracy and reliability. Social science works on very different principles: no single measure remotely approaches the levels of accuracy taken for granted in the natural sciences, and so social scientists base their conclusions on multiple sources of data. This principle is also a good one for managing large public organisations like universities and hospitals, where there is also a mass of different information, sometimes of dubious reliability (for reasons why it is dubious, see Laws of Information 1-3).

Sadly, this principle is not always applied. Managers and politicians often focus on one (unreliable) measurement and ignore the others. In part, this is a consequence of a commitment to the written word. Nothing is truly believed to exist unless it has been written down, preferably on a form. Once written down, it is believed superior to all other forms of information such as observation, informal discussions with staff, or patients’ letters of complaint. The most holy of all written data is quantitative data, especially that emanating from a computer. This tendency is reinforced by the use by governments of simple quantitative targets to measure the complex activities of complex institutions.

As an example, look at the Stafford Hospital case. Analyses of routine data showed that the hospital was an outlier in mortality statistics for some surgical procedures (ie people were much more likely to die). This is all explained in an excellent article in the London Review of Books, available here:
Rigging the death rate
If the hospital management had followed the Fourth Law of Information, they would have seen this analysis of mortality statistics as a sign that they should gather data from other sources. They could then have visited the wards and observed daily care, spoken to patients and staff, reviewed casenotes, checked how their staffing levels compared with those of other similar hospitals, or brought in some outside experts to do these things and advise. They don’t seem to have done any of this: instead, they decided to discredit the mortality statistics. Management consultants were brought in to change the diagnostic codes of patients who died in hospital. Researchers at Birmingham University were funded to discredit the use of statistics to assess hospital performance. Their report reached the correct conclusion that statistics can be misleading and that one set of them should not be used exclusively to assess performance. But that of course misses the point. Being an outlier should be regarded as a warning sign rather than definitive proof. It should have indicated a need to collect other data. In other words, the truth is found not in one set of data, however tidily it is presented and however quantitative, but in a wide range of information, from which an informed person can make a judgement. This leads to the fifth law of information:
5. Interpreting information requires judgement.

The word ‘judgement’ of course will sound a warning bell to some. How much better to pretend that decisions follow automatically from the data without human intervention or the exercise of personal responsibility. Then all that is needed when things go wrong is for the relevant procedures to be blamed and amended. This defence (“I was only carrying out procedures”) is an effective life strategy in any large organisation, and may be a more reliable path to promotion than anticipating problems and taking the initiative in solving them. Look around, and you will see the consequences.
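The Stafford argument can be reduced to a toy calculation. The sketch below is illustrative only - the hospitals and their mortality ratios are invented - but it shows the sense in which an outlier is a warning rather than a verdict:

```python
# Hypothetical standardised mortality ratios (SMRs) for ten hospitals;
# 100 means observed deaths match the expected number.
smrs = {"A": 98, "B": 103, "C": 95, "D": 101, "E": 99,
        "F": 104, "G": 97, "H": 102, "I": 100, "J": 127}

values = list(smrs.values())
mean = sum(values) / len(values)
sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5

# Flag anything more than two standard deviations from the mean.
# The flag says 'go and gather other kinds of data', not 'guilty'.
warnings = [name for name, v in smrs.items() if abs(v - mean) > 2 * sd]
print(warnings)  # ['J']
```

Hospital J is flagged, but on these figures alone nothing more can be said: the next step is the ward visits, casenote reviews and conversations with staff described above.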

Wednesday, 22 December 2010

Mind and Society

One of the great unresolved questions in philosophy and neuroscience is the nature of mind. Many textbooks in these subjects begin their discussions with Descartes, who proposed a separation between matter (which occupies space) and mind (which involves thought and does not occupy space). Most discussions focus on how these two substances could interact, since a person’s mind can clearly lead to physical action. One increasingly dominant view (held by ‘materialists’) is that the mind is located in a series of chemical processes in the brain.

Less attention seems to have been given by philosophers and neuroscientists to the slightly different problem of the location of society. This might at first sight seem a puzzling question because the institutions of society and government seem solid and permanent - so permanent in fact, that we give them the names of the buildings that are associated with their headquarters. We speak of the ‘White House’ or ‘10 Downing Street’ to designate their current political inhabitants and their staff. Or, extending the metaphor, we talk of the ‘structure’ of society. Yet this solidity is an illusion because apparently ‘solid’ governments and societies can disappear almost overnight. The German Democratic Republic had large armed forces, a secret police with records on the majority of the population and a network of informers (one for every 6.5 citizens), no overt opposition, and was defended against its enemies (and those of its inhabitants who wished to live elsewhere) by a wall, fences, minefields and machine-guns. Despite surviving with little change for 40 or so years, this ‘structure’ blew away in the wind in a few weeks in 1989.

This indicates that, however solid they may appear to be, societies are generated in the minds of individual people. For years, they wake up each morning and go about their daily routines, thereby maintaining patterns of behaviour that sustain those in authority over them. They may, from time to time, reflect on their lives and decide individually or in small groups to look for other work, seek a new partner, or move to another part of the country. These decisions will take account of the rewards and sanctions for alternative courses of action. On a few occasions, they may act in accord with many others to make a sudden and major change in their daily habits. This will result in attempted political uprisings, outbreaks of mass violence, adherence to a new religious order, support for a new form of music, and other such revolutions.

If society is in people’s minds, then where is it located? The materialist approach is unlikely to be helpful. Even if it became possible to interpret the chemical processes of the brain with greater sophistication than with the current fMRI scanners, all we would find in each individual would be sets of hopes, fears, expectations, habits of thought and expectations of routine behaviours. This would give us some understanding of how that individual functioned in society, but not much else. Instead, we need to think of societies and other human institutions as sets of recurring patterns of human behaviour that are akin to Descartes’ understanding of the mind. They are sustained by thought but are not in themselves a material substance. Perhaps then we could stop talking about organisations of people as if they were blocks of stone and concrete.

See also: Working in the Machine

Monday, 5 April 2010

Working in the machine

The best parody of industrial work is by Charlie Chaplin in Modern Times. He shows men working to a pace set by machines, and how the most ordinary human activities (scratching your armpits, swatting a fly which lands on your face) disrupt production. The factory has a time card for going to the toilet, and a big cinema screen inside the toilet so that the manager can threaten workers having a quiet smoke. The workers in the film are, as far as the management is concerned, a regrettable necessity - they are only valuable in as much as they themselves become machines. Humanity is an obstacle to efficiency.

This is a familiar picture of factory life, raised to a peak of supposed efficiency in firms which applied the principles of ‘scientific management’ developed by Frederick Taylor in the early part of the last century. Scientific management involved the careful measurement and evaluation of each human activity in the production process, which could then be defined precisely, so that workers could be set production targets and rewarded according to their degree of attainment (called ‘piecework’ in the UK). Scientific management essentially views an organisation as a machine with human components. It is hardly surprising that it became the standard way of organising repetitive industrial processes (most typically automobile manufacture) and was enthusiastically supported by Lenin and the Communist Party of the Soviet Union.

The shortcomings of scientific management have been recognised for many years. Workers rarely look with enthusiasm on any organisation which treats them as interchangeable and disposable components. Unless very closely supervised, they will take their revenge on the organisation by strikes or sabotage, or cut corners to meet targets at the expense of the quality of output. Even if workers do thoroughly internalise their job description, problems occur. The job description becomes the limit of their responsibility: workers no longer cover for each other or collaborate in solving problems, while they pursue the rules of the organisation to its detriment. When workers in the 1960s wished to bring an organisation to its knees without going on strike, they ‘worked to rule’.

The alternative to seeing an organisation as a machine is to see it as a society of varied individuals with diverse skills, who interact with each other to complete tasks and solve problems. They can be motivated to perform well and creatively not just by money but also by their loyalty to their colleagues, their sympathy with the aims of the organisation, their commitment to their customers, and their personal standards and self-respect. This view of the organisation was a key element in the success of the Toyota Management System, described by James Womack and colleagues in their book The Machine that Changed the World.

You know that you work in an organisation that is a society when its workers celebrate each other’s birthdays, marriages, promotions and departures. There will also be a strong organisational culture, with a collective wisdom about what actions work and don’t work. At their best, organisations as societies can provide a strong sense of identity and meaning for people’s lives. There can, however, be problems. Organisations with strong social bonds can become inward-looking and exclusive: the organisation comes to exist only for its workforce rather than to meet the needs of its customers, students, clients or patients.

Expressions of concern about this displacement of goals have been used to justify the ‘modernisation’ of public services over the last 20 years. It has been proposed that governments need to re-assert strategic direction and the maintenance of quality by setting up a series of public and private agencies to take over many of the activities formerly provided by employees of national and local government departments. Output and quality targets are set for each agency by central government (or one of its nominated QUANGOs), while quality of services is monitored by a further set of agencies. It is proposed thereby (in the words of Osborne and Gaebler in their book Re-Inventing Government) to separate ‘rowing’ from ‘steering’: leaders of individual agencies can use their skills to motivate their staff within a set of strategic objectives set by government.

This model has been extended throughout public services, even to organisations like universities which, in the UK at least, were never previously managed by national or local government. The effect has been catastrophic. Setting enforceable targets for agencies drives their managers to cascade these targets throughout the organisation, even to the extent of setting targets for individual members of staff. As the organisation as a whole becomes ‘mechanised’, co-operation between staff diminishes, which leads managers to promote elaborate job descriptions and procedures manuals. These become quasi-legal documents within the organisation, and thus eliminate opportunities for creativity. Staff become demoralised, and the loss of quality familiar in traditional industrial work spreads to the professions and to public services.

Of course, the mechanisation of public services can be seen as part of a wider pattern in which the whole of society becomes regulated and monitored. We may not have television screens in our workplace toilets yet, but we have CCTV cameras almost everywhere else.

Tuesday, 9 March 2010

A survey has shown that...

If you want to get publicity for some idea, promote a product, or just get in the news, then you should report the results of a meaningless survey. Search on Google using the phrase "a survey has shown that...", and you will see what I mean. You will learn that one in six therapists have tried to cure homosexuals, that more than 70% of people would exchange their computer password for a bar of chocolate, that Americans who attend church are more likely to favour torture than those that do not, and so on. You don't need to bother with getting a good response rate, a representative sample, or even a valid and reliable questionnaire. Just circulate some questions to a few people, and send the most eye-catching result to the press.

There are also plenty of meaningless surveys which never get to the press, but are circulated within companies, government departments and universities. These are often promoted as 'quality assurance', and are even taken seriously by some people. Management boards ponder reasons for a fall in satisfaction ratings by 5% on a survey with a response rate of 20%, without admitting that the whole exercise does not mean very much. Truth to tell, survey results might not mean much even if the response rate was 100%. Many meaningless surveys use ambiguous questions coupled with dubious Likert scales (the kind which assign numerical scores to a range of five or so responses from 'very satisfied' to 'very dissatisfied'). These have the apparent advantage of producing a numerical score and hence allowing statistical analysis. Usually, however, people only look at mean scores, and these can be misleading. A survey in which 50% of respondents were 'very satisfied' and 50% 'very dissatisfied' would produce the same mean score as one in which 100% said they were 'neither satisfied nor dissatisfied'.
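The arithmetic behind that last sentence takes only a few lines of Python (the response counts are invented for illustration):

```python
# Likert responses scored from 1 ('very dissatisfied') to 5 ('very satisfied').
polarised = [5] * 50 + [1] * 50    # 50% 'very satisfied', 50% 'very dissatisfied'
indifferent = [3] * 100            # 100% 'neither satisfied nor dissatisfied'

def mean(scores):
    return sum(scores) / len(scores)

print(mean(polarised))    # 3.0
print(mean(indifferent))  # 3.0 - identical means, opposite findings
```

Any report that presents only the mean hides the difference between a divided population and an indifferent one.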

What's the alternative? It is essential for organisations to assess the quality of what they do, and their customers/citizens/students are in a good position to assess this. Rather than assessing mean scores on Likert scales, organisations should concentrate their attention on the causes of satisfaction and dissatisfaction, and on ideas for improvement. The best way of doing this is probably to use open-ended interviews or focus groups. Of course, this would require quality assurance staff to be skilful in survey techniques, to be creative, and to be prepared to co-operate with front-line staff rather than stand in judgement over them.

Friday, 12 February 2010

How to stifle innovation

For big commercial organisations, innovation is both a requirement and a threat. Innovation is essential because it provides a means of developing products that are more attractive to consumers than those of rival firms, or at least can be produced at lower cost. A reputation for innovation can also be useful in marketing: Apple can announce some re-design of an existing device, cunningly presented to resemble our idea of the modern (sleek, smooth, and shiny). But innovation is also destructive: old product lines are closed and their workers displaced. Whole companies disappear, and the towns that depended on them decline and empty. Few universities have closed, but departments, courses and research teams have. Scientific reputations have been lost, and well-established theories ridiculed and discredited.

There will therefore always be resistance to innovation, and this will be most successful when rival organisations can either be eradicated or (for the time being) ignored. There are several ways to stifle innovation. One is to reward conformity, and promote people regarded as being ‘a safe pair of hands’. By contrast, innovative people should be identified and excluded from promotion or, in more authoritarian societies, from life itself. But heretics are persistent, and other means of stifling their ideas are needed. One that is particularly successful is to regulate every aspect of organisational life in detailed procedures manuals, quality assurance rituals, and the kind of job descriptions that comprise a series of bullet-point lists. This method essentially outlaws the kind of local innovation that generates change in organisations and societies. Instead, things can only change when everyone changes. The organisation thereby innovates at the pace of the slowest or, more likely, not at all. The final means of stifling innovation is to generate a culture of smug superiority: this usually involves asserting that potential rivals are inferior without bothering to find out if this is the case. After all, why go through all the fuss and disturbance of innovation when you are already the best there is?

These techniques succeeded in resisting innovation for centuries in the great empires of the past. Egypt, Rome and the later Chinese Empire were technically stagnant, believing themselves to be protected by legions, deserts, seas and great walls from the threat of rivals. These rivals were weaker and more disorderly, and hence less willing to resist innovation. The Roman Empire was eventually destroyed by tribes which had learnt how to use stirrups and hence could ride heavy cavalry horses. The small European states, locked in perpetual warfare with each other, refined the gunpowder technology invented by the Chinese to bring about the destruction of the vast and populous Chinese Empire. 

Not just empires lie in ruins. Every day I travel to work, my train passes a large cleared space at Longbridge in Birmingham, where there was once the largest car factory in Europe. On the other side of the City, there once stood the largest motor bike factory in the world. Further North are the remains of yards which a century ago built half the world’s shipping. These empty spaces and ruins are all the products of organisations which successfully resisted innovation. How I wish they had failed to do so.

Thursday, 8 October 2009

Bullying as a career

Some years ago, the Government tried to improve the recruitment of teachers with the slogan ‘You never forget a good teacher’. That may be true, but you never forget really bad ones either. Most of all, you never forget the school bully. Bullying at school causes absenteeism, illness, and even suicide. Even after leaving school, the pain can live on. I know of friends who have had chance encounters with school bullies years later, and felt the same daggers of pain, anxiety and humiliation.

School bullies are usually portrayed in fiction as thick and cowardly, inflicting cruelty because of psychological abnormalities, and they are defeated in the end. But this is wishful thinking. An alternative fictional school bully appears in Michael Palin’s Tomkinson’s Schooldays. Here, the school bully enjoys the exercise of cruelty, but uses it to gain exceptional favours including access to cigars, whisky and the pleasure of attractive young Filipina women. In Tomkinson’s Schooldays, being a school bully is an important career, leading directly to the Cabinet.

Bullying has been a profitable career for many others. Those who enjoy personal power over others and exercise it cruelly will not only succeed in life, but will usually accumulate an adoring circle of cronies. This is because bullying serves many functions. Apart from the obvious gains of encouraging compliance, it can generate solidarity. In mediaeval Japan, the shoguns created a caste called ‘burakumin’, who were assigned the most inferior status in society. All others could share the joyful common task of bullying and humiliating them. The British Conservative Party has a successful history of building support by identifying groups of victims who cannot fight back, from immigrants, to single mothers, to the chronically sick on benefits.

In some cases, bullying is a response to specific impediments to management. An example would be in universities, where many staff are on ‘open contracts’ and cannot be dismissed except for gross misbehaviour. University leaders wish to enhance the prestige of their institution (and hence themselves) by expanding research. Research is deemed to be an exceptional intellectual skill, not available to people who are committed to lesser forms of scholarship such as teaching. University leaders therefore aspire to recruit researchers in place of teachers, and offload teaching to even lowlier staff. But if teachers are on open contracts with a full workload, they cannot be made redundant. Bullying is therefore employed. This can take the form of denigrating their work and closing the courses they run, excluding them from senior positions, allocating them to inferior work spaces, and threatening disciplinary action for minor (or no) infractions.

Heaven forbid that you might think this true of my own dear university, led as it is by saintly figures thinking only of the welfare of the staff and students in their charge.

Monday, 10 August 2009

The Laws of Information No. 3

The third law of information is:

3. Data that is collected to measure performance loses validity.

First a confession. In the late 1980s, I worked as Director of Planning and Information in a mental health service in the NHS. One of my tasks was to organise the statistical returns on clinical activity for despatch to the Department of Health. At that time, these were based on a set of standard definitions called the ‘Korner system’ (after a woman who chaired a committee which recommended them). Our service included a brilliant and very hard-working consultant psychiatrist for the elderly. She believed that assessments of new patients should initially be in the patient’s own home (a ‘domiciliary visit’ or ‘DV’). Since she was an orthodox Jew, this meant a lot of walking when her duty days coincided with the Sabbath. Unfortunately, the Korner system required information about scheduled outpatient clinics but not domiciliary visits. Following the Korner rules would have meant that our most active consultant would appear as our least active. This was obviously unjust, so I modified the returns for her clinical activity to record each DV as an attendance at a (non-existent) outpatient clinic.

Paradoxically, my data-adjustment produced statistical returns which were a more accurate reflection of clinical activity than would have been the case without such adjustment. Nonetheless, they became an inaccurate record of outpatient clinics in the service in which I worked. I suspect that data-adjustment in the desired direction was and is rife in the NHS. Although this is dishonest, it can cause far less damage than changing reality to generate honest statistics. A well-known example of changing reality in the NHS is to make patients wait in ambulances outside A&E departments. This reduces the time the patient spends in A&E for the purposes of official statistics, and hence enables the hospital to meet a government target. There are many, many more examples in the NHS of how meeting centrally-imposed targets can damage patient care.

This is not a recent phenomenon. The whole technology of corporate strategic planning and management by targets owes its origins to Gosplan, the state planning commission in the USSR. Studies of the Soviet economy in the 1960s and onwards were full of examples of how rational responses by individual enterprises to centrally-determined targets could produce absurd results. These included the shoe factory that met its target number of shoes by producing shoes all in one size, and the steel factory that met its target for weight of steel by producing a few huge ingots.

Monday, 27 July 2009

The Laws of Information No. 2

Staff in offices, universities, schools and almost everywhere else are communication victims. The management in my own university is excellent at communicating to its staff. There are attractive magazines full of good news, regular staff meetings in which college heads present their challenges and achievements, all backed up by daily emails from an array of administrators to guide staff about their business. Yet a recent survey of staff has found dissatisfaction with ‘communication’. What could be the solution? More attractive magazines? More meetings? One answer that has not been considered is less (but more useful) information. As I noted in my posting on the First Law of Information, information is costly. Staff believe that all information emerging from senior management must be important, and therefore it must be read and understood. They do not have the time to do this in addition to all the other emails they receive daily, so messages accumulate in inboxes unread.

The cost of information is felt most acutely by staff when it is required from them. There are routine statistics to be completed, forms to be filled in on staff and student progress (including one for every single meeting with a research student!), surveys of staff satisfaction, and one-off requests for information which have descended the management line (usually with shorter and shorter deadlines at each stage of transmission). Staff usually see these requests as a chore to be completed quickly, and do not therefore strive tirelessly for accuracy in collecting and recording the required data. This leads to the second law of information:

2. Data is always less reliable than you think.
Scientific texts emphasise the potential pitfalls in gathering data, and careful scientists have standard routines for checking its validity and reliability. Gathering research data from people is particularly troublesome because of their capacity to fabricate, to rationalise, to forget, and even to avoid telling the truth as an act of politeness. Even in a world where people did none of these things, there would still be a lag between events occurring and data being collected, inconsistent application of rules for categorising data, and missing data. Yet these limitations are usually ignored when organisations collect and process information from their staff or from the public. Instead of using wide confidence intervals when reporting the information they have collected, organisations glibly report data to an exact percentage point. There are earnest debates about small changes in statistics from one reporting period to another, even though these are probably within confidence intervals.
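To see how wide those intervals can be, here is a rough sketch in Python using the standard normal approximation for a survey proportion; the figures (60% satisfied from 100 returned questionnaires) are invented for illustration:

```python
import math

def proportion_ci(p_hat, n, z=1.96):
    """Approximate 95% confidence interval for a survey proportion,
    using the normal approximation (reasonable when n*p and n*(1-p)
    are both fairly large)."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)  # standard error of the proportion
    return (p_hat - z * se, p_hat + z * se)

# Suppose 60% of 100 returned questionnaires say 'satisfied'.
low, high = proportion_ci(0.60, 100)
print(f"{low:.1%} to {high:.1%}")  # roughly 50.4% to 69.6%
```

On those numbers, a five-point fall between one reporting period and the next sits comfortably inside the interval, which is exactly why the earnest debates are misplaced.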

Interpreting data would be difficult enough if it was simply a matter of general unreliability, but there is the far bigger problem of biased unreliability. This is the third law of information:

3. Data that is collected to measure performance loses validity.
I will deal with this in the next posting.

Tuesday, 14 July 2009

The Laws of Information No. 1

After finishing my first degree, I worked for a summer in a typewriter factory. Typewriters are now so obsolete that it is usually necessary to explain to younger people what they were for. But this experience taught me a lot about information and how it is used in organisations. This was because I worked on what was then called ‘O&M’ (organisation and methods), reporting to a rather odd but very clever Welshman. The first law of information that I learnt was:

1. Information is costly.
Back in 1968, there were no photocopiers, office computers, or emails. If you wanted a copy of a letter, the typist had to insert carbons and additional sheets of paper when she typed. There was a limit of about three or four copies that could be made this way. If you needed more, then a different process was required. The typist would type the report on specially-waxed ‘skins’, which were attached to the drum of a machine we called a ‘Gestetner’. The drum would contain thick black ink, which you always got on your hands. Both methods of copying were costly and time-consuming, and a major O&M task was therefore to reduce the amount of unnecessary information circulating round the factory. We did this by creating a flow diagram for all the routine reports generated by staff, and asking their recipients whether they found them useful. We found that many reports had begun as one-off requests by management to meet a specific need, but had then become routinised. Some reports went straight from the envelope to the waste paper bin.

This seems a lost world now because photocopiers, word-processing and emailing have successively made the production of multiple copies much easier. But this has had the effect of shifting the cost of information to the reader. People in offices now spend hours a week sifting through emails, most of which come from their seniors but are irrelevant to their work. Emails accumulate in inboxes, and the ones which require rapid attention are missed. Because the idea has taken root that information is cheap to reproduce, staff are required, often at short notice, to produce data and statistics for senior management. As in the past, these requests can become routinised even when the original need for the information has passed.

This indicates that organisations should revert to the O&M principle of reducing the flow of unnecessary information, to release staff time and speed up their response to the information that really matters. Without this, problems develop with the data we do have, which I will look at in a later posting.

Thursday, 9 July 2009

Life as a palimpsest

Before printing on paper was invented, texts were written on parchment made from animal hides. Parchment was durable but expensive, and so it wasn’t wasted. If you had something to write, you took an existing parchment with writing on it, rubbed out a line of old text, and wrote in the space you had made. After this had happened several times, the piece of parchment contained multiple erasures and bits of text which, if read in sequence, made no sense at all. This type of parchment is called a 'palimpsest'.

The word ‘palimpsest’ has been used metaphorically to describe cities. Bits are knocked down and replaced by many different architects and builders, all with different aims in mind. This is particularly true of the sort of European cities in which the streets are not laid out in grids and where no king or emperor has been able to impose an overall plan. Palimpsest cities may make no sense (particularly to a visitor), but can be pleasant to discover: alleys and streets wind in mysterious directions; streets suddenly open out into hidden squares; churches and other imposing buildings occupy sites next to houses and office blocks.

‘Palimpsest’ can also be used to describe organisations. An example is the National Health Service in England, which has had numerous organisational changes and endless new initiatives, each with a new set of organisations to implement them. The resulting organisational structure makes the sort of sense familiar to readers of palimpsests. But the NHS keeps on functioning because the real work is done by doctors, nurses, paramedics and other people who know what they are doing. Chaos only intrudes when politicians, the Department of Health or some part of the senior management interfere. Living in an organisational palimpsest, they naturally speak a higher form of management gibberish (‘targeting the deliverables’ etc). In fact, the decay of language into this kind of gibberish is probably an indication that those who speak it live in a world of meaningless procedures and incomprehensible systems for evading responsibility.

Human life itself could be seen as a palimpsest. As you get older, your memory gets over-written by random experiences, different skills and knowledge. You make off-the-cuff decisions which have major implications for the rest of your life, and make sudden and unexpected changes to what you had intended to be an orderly and planned life. Of course, you don’t see it that way when you look back. Human beings have a marvellous ability to rationalise their actions and to see stories (and even conspiracies) where there are only random events.

Wednesday, 8 April 2009

Another road to extermination

One benefit of living into your 60s is having seen the end of the world many times. I have seen numerous films in which the world (or at least New York) is destroyed by aliens, asteroids, nuclear war, plague, and even falling into the sun. Most of these films were set in the distant future which, at the time they were made, was the 1990s or the early years of this century. Of course, I have also lived through manned interplanetary and interstellar space flight, all of which should have taken place by now. And there was that manned flight to Jupiter eight years ago which ended in a mysterious series of art-house cinematic effects.

Interest in the end of the human species has been revived recently by the 200th anniversary of Charles Darwin’s birth, and the grim realisation that almost all species that have lived on the planet have evolved out of existence. It is possible that humans could be like sharks, crocodiles etc, which have stayed more or less the same for millions of years. However, that would not please all the predictors of our demise who have made a comfortable living by spreading alarm about extermination as a result of ecodoom, nanodoom, plague (again), and (popular in the USA) the idea that God will finally decide to wipe us all out while saving only a few Godly white Americans.

Attracted by the thought of a more comfortable life, I would like to join the doomsayers by proposing a new road to extermination: death by quality assurance. Quality assurance (QA) began as an heroic enterprise. In the 1950s, Japanese manufacturers became alarmed at the poor quality and reputation of their products. They realised that customers were increasingly demanding reliable products with a higher specification. Mass-produced standardised goods were no longer satisfactory: cars and other products had to be differentiated according to the wishes of the customer. This all required a rethink of the production process: instead of mass production lines of workers carrying out repetitive tasks over which they had no control, workers were encouraged to redesign their work to improve quality and efficiency. This all worked: Japanese cars and other products gained a reputation for reliability and good design.

However, something terrible happened to QA once it passed from private industry to public services. Instead of being concerned with responding to customers and innovating production, the term became associated with the exact opposite: imposing centrally-directed targets, standardising procedures, removing control from the people who actually do the work, and inspection. It is time-consuming to inspect the day-to-day work of schools, hospitals and local authority services, so ‘QA’ soon became a matter of checking paperwork. This in turn provoked public services to generate standardised operating procedures, with paper reporting and IT systems to record compliance. In addition, designated QA personnel were appointed to check that all paperwork is produced according to central demands. Needless to say, this all demoralises staff, who respond by working to contract (ie doing the bare minimum) and covering this up by manipulating their paperwork. This in turn results in government ‘strengthening’ inspection and making the problem worse.

The result is that, over time, the number of people setting targets, devising standardised procedures, completing paperwork, and inspecting each other has risen, while the people who actually do the work (growing food, making things, running power stations, teaching children, ministering to the sick etc) are getting squeezed out. The current recession has speeded this up: manufacturing jobs are being cut, while banks need more (and probably equally ineffective) ‘supervision’. If we extrapolate this trend, we can see a time later in this century when people who know how to grow and make things will have almost completely disappeared. The vast numbers of QA staff will then sit in their darkening offices, starving to death, tapping out lists of targets and procedures on computer systems that no longer work.