Health and safety – Dangerous things, conkers?



Whilst walking this weekend in the pretty town of Wansford, Northants, I managed to slip on a couple of conkers. It reminded me of the story that, due to health and safety fears, the game of conkers is banned in many schools.

Except that the story isn’t true – and nor is the idea that the draconian Health and Safety Executive (HSE) told schools to ensure kids wore protective glasses. In fact, the HSE has a sense of humour about it. On their site they write,

‘this is one of the oldest chestnuts around, a truly classic myth… realistically the risk from playing conkers is incredibly low and just not worth bothering about. If kids deliberately hit each other over the head with conkers, that’s a discipline issue, not health and safety’

 

Conkering the world


Let’s hope that they had no such discipline issues in the cauldron of battle at Sunday’s 2016 World Conker Championships, which took place in nearby Southwick (nr Oundle). Unfortunately, it was almost all over by the time I’d spotted the road signs advertising the event. But then, there’s always next year, whether as a spectator or a competitor.

For those of you craving victory in one of the more esoteric world disciplines, you can enter the 2017 competition here. If your name’s William, there’s an obvious title waiting for you.

 

Good and bad error culture


The HSE might not be worried about conkers, but their website has several papers relating to a type of procedural risk called ‘error culture’.

And the same topic is covered in more readable form in Gerd Gigerenzer’s book, ‘Risk Savvy’. In it he compares how errors are treated in different types of organisation and how this affects their management of risk and safety.

The spectrum of error culture runs from negative to positive. A negative culture is one where people fear making and reporting errors, whatever the consequences. They therefore want to hide them and will be defensive against anyone trying to shine a light on bad practice.

Conversely, a good error culture is one where everyone is encouraged to highlight mistakes and learn from them. The clear aim is to improve the overall level of safety and to adopt and review good practice.

As examples of good and bad error cultures, he looks at aviation and medicine.

In aviation he highlights the extraordinary safety record of an extremely complex industry. The engineering challenges alone are considerable. Aircraft engines, for example, need to function with blades rotating at 12,500 rpm. But they also have to withstand bird strikes – or, as a friend who works for Rolls-Royce told me, the frozen turkeys they use in tests.

This tendency in aviation towards a positive error culture means that risks are assessed and communicated throughout the industry. And there’s a realisation that there’s always room for improvement. All changes are intended to lower risk, but they may lead to other unintended problems. The automation of processes such as fly-by-wire has made flying much safer. But is it making pilots dumber and less able to react in the rare emergencies when they are called upon to take over?

So in the US there is a program named System Think that reviews all the areas that have to combine for a plane to take off at A and land safely at B. This level of cooperation also reflects how aligned the incentives in the industry are. The best outcomes for passengers also happen to be the best outcomes for the pilots and crew and, given the costs involved, for airlines and plane manufacturers. Bad outcomes are also far more newsworthy, given the loss of life involved.

In contrast, medicine tends towards a negative error culture. And that seems to be true even with different combinations of private and public care. Gigerenzer argues that risk management and safety are skewed by rigid hierarchies and the threat of litigation rather than by the best outcomes for patients. Drug companies and the interpretation of drug trials only add to the complexity. And bad outcomes may take months or years to materialise, even though the numbers involved may be significant.

Taken together, he argues, these are the result of misaligned incentives and poor understanding and communication.

His headline statistic comes from the Institute of Medicine in the US, which estimates that 44,000 to 98,000 patients are killed each year by preventable medical errors. In the UK, the severe failures at the Mid Staffordshire NHS Foundation Trust have been well documented. And only last year Sir Robert Francis QC, who led two inquiries into those failures at Mid Staffs, released a damning report on whistleblowing called the ‘Freedom to Speak Up Review’.

 

Is the comparison fair?


In defence of medicine, far more of its decisions involve acute and variable risk, whatever the error culture. And for the patient, the outcome always eventually ends in failure.

Medicine is also a victim of its own success. Keeping people alive for longer means that half of us will end up developing cancer. And the treatment of cancer in the elderly has its own risks and complications.

Some risks in medicine present themselves in a way that has no parallel in aviation.

This was true of my father, who survived a triple heart bypass but went on to develop an unrelated cancer. The cancer required expertise from both a neurosurgeon and a haematologist. But neurosurgeons and haematologists can have different outlooks on risk.

Neurosurgeons are used to operating daily on patients with severely reduced expectations of survival. Haematologists are used to managing conditions such as lymphoma over much longer periods of time. The neurosurgeon will offer the long shot of a successful operation; the haematologist will suggest managing what they might see as an inevitable decline.

In such an environment, can error culture help? How does a concerned relative really make an informed decision? There’s no bad practice in this case, just a difficult choice.

As another example, how would you choose between two surgeons if one of them had a worse safety record on paper, but was known to treat patients who were more acutely ill?

However, Gigerenzer specifically refers to preventable medical errors. And as one example of the difference in approach between aviation and medicine, he refers to the use of checklists. He asks, ‘why do pilots use checklists but doctors don’t?’

 

Checklists are only useful if you er… check them.


He gives an example of a hospital where infection rates were 11%. It had checklists, but senior doctors were ignoring them in a third of cases. Nurses were then authorised to stop doctors during procedures if they witnessed them skipping an agreed step in the disinfection process. This challenged the existing hierarchy, but infection rates fell to almost zero.

The NHS has clearly made great strides in this same area since the concerns over superbug infections in the last decade. In hospitals you’ll see antibacterial gel dispensers everywhere to give everyone a gentle nudge.

But recent experience with another elderly relative suggests that, even in basic communication, problems of error culture still exist at our local foundation trust. Just getting on a waiting list can prove difficult because an anaesthetist hasn’t sent their report – for a month. And it doesn’t seem to be anyone’s responsibility to ensure that it’s done. The result is a lot of wasted staff and patient time.

So safety checklists are only useful if they are enforced. At the bottom of the page I’ve highlighted an example of this working in practice, one I used to witness every day on the train to work in Japan.

 

Checklists in finance.


If you work in medicine you might argue that the world of finance is a more deserving example of bad error culture. And you’d be right. A lot of the blogs I write highlight examples where that is the case, without stating it explicitly.

Gigerenzer does cover finance in his book, but only in relation to how to choose risky investments, i.e. buy the index of leading shares because most experts won’t outperform it. He doesn’t look so much at the culture.

In the next post I’ll specifically look at how the current culture in finance relates to issues such as Brexit and the recent fall in the pound. And it involves the need for the Bank of England to draw up a list of foreigners, which I know is a popular theme with the government at the moment.

 

An entertaining commute and why HS2 should take the Bullet.


Every time I travelled from Yokohama to central Tokyo, I’d get on the front carriage with a full view of the driver. As the train left the station, the driver would make some strange gestures and a loud exclamation.

I never considered, in the mindless fog of the morning commute, what they were doing. I just thought it was a traditional gesture amongst Japanese train drivers. And being a father eager to embarrass his children, I occasionally used to mimic the whole process as I pulled away in our car. Actually, I still do – rarely, when no one’s looking.

What I was witnessing is known in Japan as the ‘pointing and calling’ safety standard. It’s part of a positive error culture that helps give Japan’s labyrinth of railway lines the best safety record in the world. And it’s used throughout Japanese industry.

It’s all about enforcing a safety checklist. Japan is a reserved culture, and by training drivers to make these elaborate movements and calls in full view of passengers, the railways ensure the checklist is completed.

For the trainspotting anoraks amongst you, the whole process is demonstrated in the video below. And as for why we should look to Japan for help with HS2, foreign exchange permitting, here is an article in Japan Today from last year. Japanese companies also have to be able to demonstrate good behaviour. This weekend’s FT has an article explaining that the HS2 management committee has a budget of £900,000 to spend on a team of behavioural psychologists to make sure all the consortia bidding for contracts can work together.

That’s something quite separate from error culture and a topic for another blog. Makes you wonder how the pyramids were ever built.

 

Please remember:

  • past performance is no guide or guarantee of future returns;
  • the value of stock market investments can rise and fall over time, so it is quite possible to get back less than what you put in, depending upon timing;
  • this blog does not constitute financial advice and is provided for general information purposes only.