What can you see in the NHS crystal ball?

Cast your mind back 10 years to January 2009. How accurately would you have been able to predict the world as it is today?

Who would have thought that Michael Jackson would leave the planet within six months? Or that we would then lose David Bowie, Prince and George Michael during a single year in 2016? That this annus horribilis of celebrity deaths would come just a few months after Queen Elizabeth became the longest-reigning monarch in UK history – and still going strong? Could you have entertained the idea of Dr Who being a woman? In January 2009 Gareth Southgate was struggling as manager of Middlesbrough Football Club, heading for relegation and the sack. Would you have predicted that by 2019 he would be the most successful England manager of recent times? How many people would have laughed at the thought of Donald Trump as President of the USA – even though this was foreseen by Bart Simpson? Could you have imagined your worried patient making an appointment to discuss their smartwatch data? And what else has changed in your life over the past 10 years?

We can’t predict the future. Today, political events related to Brexit mean we find it hard to predict what our situation will be this time next week, never mind in January 2029.

So, why bother with a 10-year plan?

The new long term plan for the NHS in England has now been published. Yet it hardly seems any time since the turmoil of the major restructure driven by the Health and Social Care Act began. The act passed through parliament in March 2012; implementation began in April the following year, with the final changes coming as recently as 2015. One of its cornerstones was that every NHS Trust would achieve Foundation status by 2014. That never happened.

And haven’t we just started working with the Five Year Forward View? Well, that was published in 2014. Today, we find ourselves in a situation where the intended integration between health and social care looks very different from one part of the country to another. Just another journey started but never completed?

Think closer to home, to your own hospital, department or team. Think of all those change-projects that have taken place. What proportion of them actually achieved anything worthwhile? How many have made a lasting impression? Multiply this across the country and consider how much these failed local initiatives have cost in total. And for what?

With all these projects and changes of structure and direction, are things today any better than they were in 2009? If we can’t even tell what the world will be like six months from now, then what’s the point in a 10-year plan? You might well be hearing some colleagues say, “Just let doctors get on with their jobs!”

So what makes a good plan?

The best plans do make a difference. Before they even start there is clarity of overall purpose. They are focused on the future and keep this purpose in mind. Though informed by the past, they go beyond simply identifying what’s gone wrong; they have identified why it’s gone wrong. Simultaneously, they pay equal attention to what is going right and why. This helps set a long-term direction with clear but general ambitions. From this, projects are defined which are aligned to this direction. These projects are managed by people who have taken the time and effort to develop the skills required. We are then all in a good place to work with actions at a monthly, weekly and daily level which are aligned to the direction and purpose.

To quote Nelson Mandela:

Vision without action is just a dream. Action without vision just passes the time. And vision with action can change the world.

Though we can’t always tell what the change will look like in the end, paying attention to progress and events means that direction is maintained. The greater the number of people who are working with awareness, who are informed of the plans, know the system and paying attention to progress being made, the greater the chance of achieving the purpose.

What steps are you taking to improve your ability to participate in the change?

Stephen McGuire – Director of Development

Should we work on insignificant improvements?

Doctors hate their computers. Well, that’s according to Atul Gawande. The influential doctor and writer recently dedicated almost 9,000 words to his thoughts on the matter in The New Yorker’s Annals of Medicine. So many initiatives intended to make things faster and better actually result in the opposite. The main focus of his frustration is computerised patient record systems. They slow down consultations and increase workload. The vast majority of ‘improvements’ made seem to result in demand for more hours working away from patients. He goes on to suggest a direct link between software and bureaucracy: as the prevalence and complexity of the former grows, so too does the latter. In addition:

“I began to see the insidious ways that the software changed how people work together. They’d become more disconnected; less likely to see and help one another, and often less able to.”

Impact on patients

Atul Gawande is not alone in throwing the spotlight onto computer systems which are intended to help. To quote Helen Salisbury, writing in The BMJ:

“..if I’m not careful, technology can take over the consultation so that (it takes) more time than the listening and the talking. Worse still is when I try to do both at once—listening a bit but not enough, hands already typing, eyes on the screen. There’s nothing like not listening to encourage not talking.”

It’s so easy to see how, in hindsight, all the little things add up to something much bigger. Mountains are made up of countless grains of sand – some fused together as solid, impenetrable rock. All the extra little tasks and delays add up to hours of our time. But do we think about all the little things that could help in the same way?

Do we think about improvements in the same way?

If you are reading this, then you are reading the first blog entry on our newly redeveloped website. It’s much, much faster than the old version. Didn’t you notice? No? But then, why should you? We don’t usually notice how quickly or efficiently something works. We just expect it. If we do notice an improvement, it doesn’t take long for us to get used to it and forget the way things were. But we do notice when things are slow and get in the way.

To quote Tom Peters:

“There’s no such thing as an insignificant improvement.”

Such thinking feeds into the concept known as the ‘aggregation of marginal gains’. This principle was applied by Sir Dave Brailsford as he led the British Cycling team and Team Sky to unparalleled success. Yes, all the little inefficiencies add up to major problems. Equally, multiple little improvements add up to something well worth the effort.

Making a worthwhile difference

Our research, published in BMJ Leader, revealed major shortfalls in doctors discussing things that matter. Of over 200 doctors, some 40% said they were not discussing progress toward goals with their colleagues enough, or at all. 37% said they were not involved in discussions about improving processes. Learning how to have such discussions with team members would be one small improvement. Actually having the discussions would create the possibility of many more. Learning how to channel ideas into worthwhile change, and improving that ability, helps make these little changes real. All too often, we wait for the big idea or the big initiative. Worse still is waiting for someone else to do something. Meanwhile, the problematic little things continue to tighten their grip.

What small yet significant improvements will you make next?

Stephen McGuire – Head of Development

Does patient-doctor communication need to evolve?

Financial limitations create many challenges for the NHS. Across the UK, the four home nations are each embarking on programmes designed to ensure high-quality, sustainable systems for the future. Let’s start with a look at NHS Wales’ Prudent Healthcare approach, which is based on a set of four guiding principles:

  • Achieving health and well-being with the public, patients and professionals as equal partners through co-production.
  • Care for those with the greatest health needs first, making effective use of all skills and resources.
  • Do only what is needed, no more, no less; and do no harm.
  • Reduce inappropriate variation using evidence-based practices consistently and transparently.

Similar ideas are included in the strategic plans of England, Scotland and Northern Ireland. A few examples of the activities resulting from Prudent Healthcare include:

  • A drive to reallocate finances from high-cost/low-effect activities toward high-effect strategies, such as shifting the spend for the treatment of COPD from providing steroid inhalers toward support for smoking cessation.
  • Reducing unnecessary procedures and use of medications, in particular anti-microbials, anti-psychotics and opioids.
  • Transparency over the costs of prescriptions.

The ‘Transparency’ challenge

Transparency of costs requires clinicians to strike a delicate balance in communication with patients. In Scotland, an elderly relative of mine had a discussion with their consultant about care options and was informed of the cost of treatment. Their response: “I can’t expect the NHS to pay that for me.” A conclusion reached without any awareness of the cost of any other treatment they, or anyone else, currently receives.

The ‘crackdown’

On the day of writing this piece, the BMJ reports, ‘NHS England has launched a fresh crackdown on GPs’ prescribing of “low priority” items in a bid to save £70m a year to reinvest in other areas.‘ This will undoubtedly lead to some difficult conversations with patients and family members who have previously received such prescriptions. News media reporting of NHS England’s consultation activity risks alarming patients. There are suggestions that supplying needles and blood glucose testing strips for diabetics will be halted as part of the same initiative that stops provision of silk garments to people with skin conditions. GPs are no doubt bracing themselves for an avalanche of queries.

Further changes

These changes are taking place at the same time as the drive for social prescribing and the shift away from old-style paternalistic medical practice towards concordance and informed consent. In combination, these changes create significant challenges for healthcare professionals – not least in relation to patient communication. There can only ever be true informed consent when doctor and patient are speaking the same language. Yet patients speak a number of different languages – and I don’t mean based on their geographic heritage. I mean in relation to their attitude to their care.

Different languages

Groopman and Hartzband proposed thinking of patients in terms of:

  • Minimalists resist and want to avoid interventions, treatment or contact with healthcare professionals
  • Maximalists want and even demand attention and action for every ailment – real or imagined
  • Naturalists trust mother earth or spirituality to provide the best solution
  • Technologists believe that the very latest man-made inventions, drugs and techniques must be the answer.

Evolution of language

This approach to categorisation was based on a framework of two distinct dichotomies.  Yet all languages continually evolve. The changes towards prudence, consistency and transparency introduce new dimensions to complicate the model. These include:

  • the patient’s attitude toward fairness, in terms of personal rights and entitlement versus collective social responsibility
  • the patient’s perception of what constitutes good value for money versus what is expensive
  • the patient’s desire for personal control versus paternalistic care

Change is essential if we are to ensure a sustainable healthcare system to meet the challenges of the UK in the 21st century. The skills required for doctors to truly speak to patients in their own language have never been greater. If patient languages are evolving, then so too must doctors.

What steps are you taking to ensure you develop and maintain your skills as a multi-linguist in terms of patient communication?

Stephen McGuire – Head of Development

Not all superheroes wear capes

Sad news this week about the passing of Stan Lee, creator of the Marvel Universe.  His unique imagination gave us a world of superheroes with special powers, many of them dedicated to helping ordinary people deal with situations beyond their control.

The best superhero stories go well beyond the simple “POW”, “CRASH”, “BANG” and “WALLOP” of the early fun-time Batman movies. Many of the greatest tales are parables for everyday life. In these comics and movies we often witness “ordinary” people doing extraordinary things. They take a stand and make a real difference. When we see such acts in real life we often refer to these people as superheroes-without-capes.

Many gifted superheroes go to great lengths to conceal their actions, challenges and achievements, even from those closest to them. Sometimes it’s because it all seems too big or too much. Sometimes they want to protect their loved ones from worry. This often backfires spectacularly. In one popular movie, Will Smith played Hancock: a superhero who had become an empty shell of himself, bitter and twisted, feeling unappreciated for his talents and efforts.

To the general public, doctors do extraordinary things. They are often placed in the superhero-without-capes category. But back in the real world, it’s anti-bullying week (11-16th November). Sadly, it seems that every other week there is another notable case of bullying related to doctors and the NHS.

Does it need a superhero to take a stand?

Here’s a link to a video from our new NHS & UK Medical Regulation Video Tutorials.


How effectively are you supporting the superheroes-without-capes who work around you in your team?

Stephen McGuire – Head of Development

Is openness in healthcare under threat?

Lately, the subject of Care Quality Commission inspections has been prevalent in the healthcare news. It’s been claimed the ‘CQC cannot be relied upon to enforce the duty of candour‘. Their ‘tick box mentality‘ has also been criticised as ineffectual. A government-funded study has concluded ‘evidence is elusive’ to support the idea that ‘the regulator’s regime of intensive inspection has been beneficial’. Any real evidence that the general public pays attention to the CQC’s ratings is also lacking. A potentially greater problem is that some new online GP services are reported to be actively evading inspection.

So are inspections necessary?

Well, let’s consider the new online GP services as an example. There are undoubtedly some great benefits for all concerned if new systems can be developed which get things right. Unfortunately, there are also great risks: misdiagnosis, fragmentation of care, over-prescribing and even the reinforcement of health inequalities. Add in the need to ensure appropriate financial management as new organisations disrupt the status quo, and the case for proactive regulation is obvious. So why should it be any different for established providers? Consider events as wide-ranging as the Stafford Hospital scandal of the last decade through to the recent problems stemming from ‘toxic bickering‘ at St George’s Hospital, London. Things can and do go very wrong.

Waiting for things to go wrong?

The principles of clinical governance are designed to provide a systematic approach for maintaining and improving quality of care by measuring performance against a recognisable standard and promoting accountability.  Inspections by regulators should form an important element of this.  It would be unacceptable for them to simply wait for things to go wrong, relying on others bringing issues to their attention.  But clinical governance cannot stop with the regulators if its full benefits are to be realised.  It’s an essential discipline for all levels of healthcare practice – national, regional and local; within organisations, departments and teams.  Clinical governance should also be central to the personal standards of practice of each and every individual doctor.

Auditing and openness

Audit is one of the seven pillars of clinical governance. Organisations, departments and teams must participate in auditing themselves – and each other. They must be aware of standards, pay attention to performance, compare it to the expected standard and take action where necessary. This can happen formally, or informally by simply sharing feedback. Again, individual doctors must do the same: people can and should audit themselves, as well as each other.

Openness is another of the seven pillars.  Good governance requires honesty and candour over what is being observed.  It also requires honesty and candour over personal performance.  But true openness goes beyond transmission of information.  It also includes receiving information from others: the good, the bad and the ugly.  But here lies a problem.

Our research has revealed significant shortfalls in the willingness of doctors to give and receive feedback.  This conclusion results from information gathered through self-audit of over 200 participants.  We also identified a notable reluctance to ask for help when required.  Reasons for this lack of openness vary from arrogance to fear to simply falling in line with the prevalent culture.

Moving forward

Without doubt, the CQC and other regulators must find ways to ensure their inspection processes and reports lead to real differences.  Likewise, all doctors must ensure that they and their colleagues develop the practical skill and discipline required for meaningful performance management.  They must also develop the ability to communicate effectively within teams.  Audit and feedback must be raised above the level of simplistic tick-box exercises.  Otherwise true openness, honesty and candour really does come under serious threat.

What are you doing to improve audit and openness?

Stephen McGuire – Head of Development

How will robots communicate with patients?

Artificial intelligence is developing at pace. Our mobile phones are filled with an ever-increasing plethora of apps. We no longer need to turn dials, push buttons or remember numbers to get in touch with friends or associates we’ve not spoken to for years. Cortana, Alexa and Siri are fast becoming part of our everyday lives. They bring things to our attention, listen to our needs and take action accordingly. Driverless cars don’t appear to be far away. But even before we get to that stage, the technology in our rapidly developing vehicles can give us directions, control our speed, begin emergency braking faster than we can react, or even alert us that we may be falling asleep at the wheel. We are continually learning how to help technology learn so that it can serve us well.

Countless clinical applications already exist and progress is accelerating rapidly. For some time, apps and instruments have collected data, taken scans and created records. In the majority of disciplines we now have technology which makes diagnostic recommendations by comparing individual results to big data. There’s even evidence that, in some fields, the quantity and quality of information are outstripping our human ability to interpret it. The ‘machines’ are starting to do it better! See, for example, the case of technology outperforming retinal specialists in ophthalmology. As this trajectory continues, clinicians will need to answer a fundamental question:

What are the benefits of being human?

You may or may not be familiar with the Luddites: a band of 19th-century English workers who destroyed new machinery in the belief that it enabled factory owners to circumvent accepted manufacturing methods and would lead directly to job losses. Their actions were driven by fear and self-protection rather than improving efficiency and quality. Nowadays, their name is used to label anyone resistant to the spread of technology. Their behaviour also led to the idea of the Luddite Fallacy: technology doesn’t actually lead to job losses – it simply changes the nature of work and the mix of job roles in the economy. Rather than constantly aiming to prove that humans are better than the machines, it helps to be honest. The machines will be better at some things. Humans will be better at others. Let’s ask the question again in a different way:

What can a human do that a machine can’t?

And the answer to that one has to be: less and less as each day passes!

So let’s switch things around and consider healthcare from our patient’s perspective rather than the clinician’s.

What do our patients want?

Access to information?  Yes.  Easy access to services?  Yes.  Accurate, speedy diagnosis?  Yes.  Effective treatment?  Yes.  And again, all these points and more will be increasingly well served through the development of technology.  So, is the patient’s need for a robot fast replacing the need for a doctor?  The Luddite Fallacy informs us that, rather than replacing people, technology changes roles.  Adaptation is essential and this may mean changing focus.

A large proportion of patients are experiencing stress of one type or another. This stress may be physical, emotional, practical or any combination of the three. Stress is a complex human reaction. When added to our patients’ experiences and opinions, it opens the door to a plethora of potential behaviours and choices. Who is best placed to deal with that? Who should be better at listening, understanding, encouraging disclosure, breaking bad news and helping the person overcome the difficulties they face? Who should be better at helping people to face their challenges with empathy and compassion? Computer or human?

The human should be better placed than the computer to adjust to the individual and their unique circumstances. They should be better at dealing with patients on a human level. Sadly, not all doctors live up to this. Some practise in a functional, automated, dissociated manner. Some are responding to their own personal stress and deal with patients without the benefit of human connection. In the long run, it’s quite possible that the machines will be better than them.

The machines are increasingly studying a very important topic:

What does it mean to be human and how can I communicate more effectively?

How much effort are you putting into studying this topic?

Stephen McGuire – Head of Development

Handouts for learners: yes or no?

People who deliver presentations or lectures tend to fall into one of two camps: those who give handouts and those who don’t. Those who regularly offer handouts may argue that it is important to ensure learners take away the correct information. After all, note-taking errors and erroneous interpretations can lead to major problems. Others point to the importance of learners taking personal responsibility. Giving handouts, they argue, encourages them to be passive rather than active participants.

As a doctor, the expectation is that you practise evidence-based medicine. But where is the evidence to indicate whether it is better to give learners handouts or to encourage them to make their own notes? The style chosen is often based on the presenter’s personal approach to learning. So, who’s right and who’s wrong?

What do learners want?

When we talk to learners we find an even broader range of preferences. Some have no interest in handouts and will leave them behind after the session. Others write endless notes during a lecture or tutorial that they will never, ever read. Some record short, sharp bullet points in a prized notebook. Others want printed notes to read in tandem with the session. And some would like all of the notes beforehand so that they can devour and reflect on them in advance. So, again, where’s the evidence to support the best approach?

Well, unfortunately, a recent research round-up on note-taking suggests this is an area where the depth of empirical findings is low in comparison to the breadth of theory proposed. Yet it still raises some useful points that have kick-started a number of discussion threads.

Back to basics

Let’s consider two very different purposes for a learner’s note-taking. The first is to act as ‘external storage’. In this case, the challenge is to record as much of the detail seen and heard as possible. The notes are made so that true learning can happen at a later date. The second is where the learner makes notes to help convert information into comprehension here and now. The former approach is about collecting facts, word for word, step by step; there’s no interpretation at this stage. The latter is all about interpretation, paraphrasing and recording personal meaning. Both extremes require mental effort. As we have a finite capacity, this effort can either help or hinder genuine learning, depending on how the presentation is delivered.

How can we use this information?

In a presentation scenario, a good teacher will realise that their group of learners will most likely include both extremes of these note-taking approaches plus all variations in-between.  In addition, there will also be those who have no interest in taking notes who need constant stimulation to maintain concentration.  The good teacher will then use this knowledge to ensure their presentation is prepared and delivered with consideration for this entire broad spectrum of preferences and approaches.

Here are a few suggestions:

  • Structuring and signposting makes it easier for everyone.  Clear direction is like the scaffolding of your presentation.  “I’m going to introduce you to the four main causes of problem X.”  “Cause number 1…”  “Cause number 2…”  “Cause number 3…”  “Cause number 4…” How often have you missed a key point simply because the presenter hasn’t made it clear that they have moved from one sub-topic to another?
  • Images and diagrams have a positive impact on both attention and retention, so long as they are well-chosen and relevant, rather than distracting.  They keep the attention of the observer group and boost comprehension for all beyond mere words.
  • Pace and quantity must be considered. How often has a presenter moved on to the next slide while you were still copying that diagram or scribbling down point four of five? The point or detail is often lost. As a presenter, you must consider your audience’s ability to record what you are sharing. This should inform how long you spend on any sub-topic or slide. It can also give you clues about the quantity of information you plan to share.
  • Note-taking and handouts are both relevant.  Taking everything into account, encouraging and facilitating quality note-taking is a positive action.  At the same time handouts are particularly helpful when there are larger quantities of factual information or complex diagrams, when correct detail is essential or when you want learners’ focus to be on processing new ideas.

The best guidance may be to bear in mind that the point of your session is to enable your audience to learn – rather than for you to present.  Which brings us to another question:

To lecture or not to lecture?

Stephen McGuire – Head of Development

Sunny-side up: naive or positive?

Another week – another report highlighting NHS performance issues. This time it’s a report on the high number of serious operations cancelled on the day they are scheduled. A significant proportion of patients have had their operation cancelled more than once. Stress for patients, frustration for doctors, plus wasted time and resources. You may already have been involved in discussions within your team about how to sort this out. Reasons must be identified. Solutions must be developed. Actions must be put in place.

But this blog isn’t about cancelled operations. It’s about the mindset that typically prevails when we are trying to improve something, and its unintended negative impacts.

The negative downward spiral

We generally have a tendency to focus on what’s going wrong and what’s not working. “This is failing”. “We don’t have resources”. “That’s not being done well enough”. “They/we/you didn’t do what they/we/you were supposed to do”. We then try to figure out how to fix the problem or solve the puzzle. The fixation on failure and under-performance breeds blame and negativity. The more we look at the problems, the more they multiply – in both severity and number. The more problems we have and the bigger they are, the more stressed and negative we feel. The more people who are discussing the problems, the more people feel the same. Morale heads in a downward spiral. If this goes unchecked it feels like we’re circling the drain with no way back.

No wonder some simply dissociate or bury their head in the sand.  But we can’t ignore the problems.  We have to face up to them and deal with them.

Could there be another way to look at this?

If a negative approach leads to negativity then it would be logical that there must be a positive alternative.  But our instincts tell us we must ask the tough questions and face facts if we are to improve.  Surely anything else is simply naive?

The well-established approach of Appreciative Inquiry is driven by a number of psychological principles, including the idea that a positively asked question will lead to positive change.  The model is summarised very well in this short video.


I’ve painted a rather bleak picture of focusing on the problem.  You may find there are benefits to taking different approaches at different times.

Take a few moments to consider:

On balance, is my approach more ‘problem interrogation’ or ‘appreciative inquiry’?  What is the impact of that?

Stephen McGuire – Head of Development

Practical steps for burnout prevention

The emphasis on provision of healthcare by a healthy workforce has been ramped up in recent years. Goal #1 on the current Commissioning for Quality and Innovation (CQUIN) list: ‘Improving staff health and wellbeing’. There are benefits all round. Reducing absence rates while avoiding presenteeism means staff are less stretched and under less pressure. That in turn reduces costs while increasing the quality of patient care and experience. In addition, members of the workforce are less likely to become patients themselves. Win-win-win. Popular methods provided by organisations for tackling stress and preventing burnout include mindfulness classes focused on developing mental resilience; gym classes to promote physical health and stress release; and encouragement for all to strive for the elusive work/life balance. All such initiatives should rightfully be applauded.

But is it enough?  And, maybe there is a more important question:

Is the effort being focused at the right point?

Let’s consider a basic medical approach to a patient problem:

  • Recognise the symptoms
  • Diagnose the cause
  • Appropriate treatment

Stress and burnout are symptoms of deeper issues. As a doctor, how happy would you be simply to alleviate symptoms? Yes, it’s an important step. But there is a difference between treatment and cure. It is generally best to tackle the cause. Take it a step further and the ideal is to focus on prevention. Switching effort away from manning rescue teams at the bottom of the cliff and onto building safe paths at the top is a better way of dealing with the problem. So let’s switch attention to the causes of stress.

Getting to the root cause

Just like a patient whose symptoms are exacerbated by multi-morbidity, the causes of stress are multi-factorial: conflicting demands, time pressures, depersonalisation, breaking bad news and demands for improvement, to name just a few. Just as with any other ailment, each of us is more vulnerable to some of these and less affected by others. True, ‘the system’ has a lot to address, change and put right, and there are many issues beyond the control of any one individual. Equally, many aspects of these factors are within the individual’s control – through the development of appropriate skills.

When the symptoms are recognised the key to effective long-term relief is to identify the root cause and deal with it.

Treatment and prevention

The development of skills has an important part to play in both the treatment and prevention of stress and burnout. Consider a doctor who has learned how to communicate effectively with colleagues as well as patients, who is organised and assertive, and who can delegate, tackle underperformance and deal constructively with the conflicting demands of others. They will, as a result, be more resilient than one who is their equal in medical knowledge and technical dexterity but has spent little effort honing these important abilities. The doctor who has developed an understanding of the system and its challenges, along with their teaching, communication, leadership and management skills, will also be an asset that bolsters the overall resilience of their department. They support their team to be more creative, more consistent and to improve in all respects. They become a positive contributor to progress rather than a helpless passenger on a ship cut adrift.

Yes, alleviating symptoms is essential.  But the very best doctors get to the root of the issue and focus on the cause.

What’s at the root of your stress and what are you doing about it?

Stephen McGuire – Head of Development

New Influence & Negotiation Skills Online Course

Difficult conversations are common when you work with people who have differing interests, needs or priorities.  So we’ve introduced a new addition to our collection of online courses for doctors: the Influence & Negotiation Skills Course.

Introductory offer: 1/3 off the standard price – ends Sunday 16th September.

Accredited for 2 CPD points, it’s delivered in 5 short interactive modules.  Here’s a sneak preview of the short introduction video which starts the course.

Click here for more details