The New Professionalism

September 7, 2017

When I first get out of bed, the word unkempt comes to mind, mainly because my hair is unruly. But after I clean up, you wouldn’t call me “kempt,” or my hair “ruly.”  These adjectives don’t, at least in modern use, yield words of opposite meaning when the prefix “un-” is dropped.  The same appears to be the case with “unprofessional.”  When applied to a doctor, for example, it most often signifies behavior that is unethical, unbecoming, and unacceptable for a member of the medical profession.  But when someone does live up to those standards, we rarely use the term “professional” to describe them.  “Professional” in the affirmative is more often used in contrast to “amateur.”  As with the concepts of kemptness and ruliness (neither of which exists, according to my spell checker), the concept of professionalism is most often acknowledged in the breach.  It is easier to note its absence than its presence, which makes it difficult to reinforce a positive notion of professionalism, one of the six core competencies recognized as essential to becoming a successful physician.

It’s also difficult because professionalism in medicine is a complex and slippery concept.  It includes commitment to ethical principles such as integrity, respect, and compassion; service to patients and society that supersedes self-interest; responsiveness to the needs of diverse patients; and accountability to the profession.  Some of these, like integrity and compassion, have changed little since the time of Hippocrates.  But others evolve with time.  For example, there was a time when doctors were expected to put patients first to the degree that they could even endanger themselves.  While some degree of self-sacrifice remains laudable, few at present would condone dangerous self-experimentation like Dr. William Harrington’s exchange transfusion between himself and a thrombocytopenic patient, or the members of Walter Reed’s yellow fever commission intentionally infecting themselves to prove how the disease spread. (Not to mention 100-hour work weeks and 36-hour days.)

Yet the degree to which we are expected to put our patients’ needs ahead of our own continues to be a matter of disagreement.  In part, at least, this appears to reflect generational differences.  Now, unlike some of my Baby Boomer compatriots, I do not subscribe to the notion that Millennials are inherently less altruistic.  If anything, my experience with my own children and their peers suggests exactly the opposite.  But I do believe there are differences in how they show it.  My generation was more willing to sacrifice time, while Gen Y forgoes money (though both only to an extent).  And as medicine has become corporatized and more transactional, those who have known only that environment will see it as the norm.  Physicians who have always practiced in an environment where their contribution to patients is measured in relative value units (RVUs) are naturally going to be influenced by RVU considerations.  Those of us who trained 30 years ago should neither be surprised by this nor feel superior.  (Only a couple of my medical school classmates work for free, after all.)

There are also differences in how we view “accountability to the profession.” Consistent with their overall greater attachment to organizations and institutions, earlier generations were raised with the notion that medicine was a sort of fraternity, and that as members we had an inherent responsibility to it.  If declining membership in medical societies is any indicator, I doubt most Millennials feel that same sort of attachment to the institutions of medicine.  And this isn’t necessarily bad.  Medicine, like many professions, is largely self-regulatory.  The traditional unwillingness to speak ill of a colleague has, fortunately, given way to a greater openness to mutually recognizing, acknowledging, and learning from errors.

Since the more seasoned (i.e., older) of us are the ones in a position both to teach the precepts of professionalism and to judge adherence to them, we need to be aware of how the concept evolves. Otherwise we may fall into the trap – described by George Thorogood in the song “Get a Haircut” (“get a haircut and get a real job”) – of believing our younger colleagues are inherently unkempt, unruly, and unprofessional.


Accountability

August 21, 2017

“Well, I’m an accountant, and consequently too boring to be of interest,” says John Cleese in response to a question in a man-in-the-street interview on Monty Python’s Flying Circus. Accounting, and accountants, have a perhaps-not-entirely-undeserved reputation for being necessary but dull.  (Disclosure – my father was an accountant, so I know whereof I speak.)  Yet accountability, which shares a word root, is all the rage.  A quick Google search turned up half a dozen news headlines containing the word “accountability” from this week alone.

Moreover, at a Children’s Minnesota leadership forum last week, we addressed the topic of accountability, which leaders had raised as a concern in a survey. Specifically, there is a belief that we have trouble with accountability.  Our leaders are not alone: every place I have worked has believed it had a unique problem holding people accountable.  So the problem is clearly not unique.  But is it real?  I think it is, albeit not in the way people think.

Let’s start with what we mean by accountability. It comes from the Latin root meaning to count, calculate, or reckon.  In early use, it signified an ability to explain, specifically to reconcile an amount of money or valuables with which one was trusted.  The concept of consequences was added later; it came to mean being able to prove that one has done right, to justify either reward or punishment.  In current usage, it often focuses solely on the consequence part, and specifically on punishment.  Every time I have had someone tell me an organization has a problem with accountability, what they mean is “So and so does a terrible job and they haven’t been punished or fired yet.”

Which may or may not be true. For one thing, firing is only the end of a long chain of consequences for poor performance, the rest of which may be invisible.  (How many of us could say whether anyone we work with has been given a warning, put on a performance improvement plan, or sacrificed some compensation because of their actions?)  And those consequences are only the final step in the longer process of accountability.  If we look at the full meaning of the word, it includes:

1. Establishing expectations;
2. Communicating expectations;
3. Measuring progress toward expectations (these first three are necessary for the part that means “proving one has done right”);
4. Providing feedback on that progress, and establishing a plan for improvement if expectations are not being met;
5. Identifying gaps that may be contributing to not meeting expectations (either fixing system problems that create barriers or helping the accountable person develop the necessary skills);
6. Applying progressive consequences for continuing to fail to meet expectations.

In other words, accountability is really just what we call “performance management.” Which means it is a process. But too often we think of accountability as an event – that person who is incompetent or lazy or unprofessional loses his or her job.

So do we have an accountability problem? Yes, we do.  Holding ourselves and others accountable is difficult.  First, we don’t always devote the time and effort we need to steps 1-3.  I once had a chair who, despite never meeting me face to face, would send me an annual progress summary that read, in its entirety, “I have serious concerns about your progress toward promotion.”  (Spoiler alert – I did OK.)  As for feedback, that can be hard for multiple reasons.  There is a saying, “Feedback is a gift.”  Yet few of us approach receiving feedback with anything like the enthusiasm we have for Christmas or a birthday.  Too often feedback is criticism, not a gift.  We don’t want to be on the receiving end, and often not on the giving end either.  Especially in pediatric healthcare, where we are all nice and cheerful and positive.  And in a place like ours, where people stick around and have worked their way up through the ranks, they may be giving that “gift” to someone who was once a colleague and friend.  Now imagine having to actually change someone’s pay or relieve them of their job.

Yet at the same time, we usually do most of this quite well. For one thing, accountability is about justifying both punishments and rewards.  If we were truly terrible at it, good people would be fleeing to places that recognized their value and rewarded them.  And yes, there are always going to be challenging cases.  But in an organization like ours that takes its mission of caring for kids seriously, the large majority of problems are actually dealt with quite effectively.  We just don’t advertise them.

So to quote another of my favorite movies, The Princess Bride, when I hear people say we have a problem with accountability, I think “You keep using that word. I don’t think it means what you think it means.”  The actual process of accountability – as opposed to the event of the final punishment that people usually think of as accountability – is, like accountants, dull but necessary, and often in the background.


Shadow of Leadership

July 24, 2017

When I was at Children’s Hospital of Wisconsin, I frequently had to go back and forth between the hospital and our office building. In the middle of the day, when traffic was light, I would often cross against the light, as did most other people.  One day Mike Thiel, our director of security, saw me do this and pulled me aside.  “You know, you have to think of the shadow you cast as a leader.  If you cross against the light, others will do the same, and they’ll think it’s OK not to follow the rules because that’s what you do.”  So from that day on I would dutifully wait for the light to change – my inner New Yorker seething – and lo and behold, everyone else waited too.

Most of us are leaders to some extent – at work it may be in an informal if not a formal role, and certainly in our families and communities. We need to be aware that as leaders we do cast a shadow.  In setting an example, our actions truly speak louder than our words.

The shadow of leadership is at least as important in producing desired results as in avoiding undesired ones (like walking in front of a moving car). The other day one of our staff at Children’s Minnesota ran into me in the hall.  “I took an example from you,” he said, “and bought a bike.” (Most of you know I’m a regular bicycle commuter.)  “First time I’ve had a bike in 20 years,” he said, “and I’ve been riding it everywhere, more than I can ever remember.  It feels great.”

Wow, that felt great to me, too. To make a positive difference by setting an example is the best way to cast that shadow of leadership.  I just hope he doesn’t cross against the light.


Baseball, Hot Dogs, Apple Pie, and Inequality

June 12, 2017

The official name of Obamacare, the Affordable Care Act, is actually a bit of a misnomer. The goal of the legislation was not to lower the cost of health care.  Rather, it was to expand coverage for millions of people without insurance.  It achieved this goal, in part by expanding Medicaid, the government program primarily for the poor, and in part by providing subsidies to individuals purchasing private insurance on newly established exchanges.  And these subsidies did make insurance more affordable for those receiving them.  Thus the ACA sharply reduced income-based disparities in coverage.  But while competition among plans on the new exchanges did temporarily drive down prices, the law did little to address the underlying drivers of high health care spending, and prices are now rising again.  So for all the good it accomplished, “affordable care” is something of a stretch.

On the other hand, the American Health Care Act, passed by the House and now being rammed through the Senate under a cloak of secrecy, fully lives up to its name.  It is as American as it gets.  In rolling back Obamacare, it will restore the vast inequalities in health care that are frankly a hallmark of the American health care “system.”  It not only rolls back the Medicaid expansion, it sharply reduces Medicaid spending to below pre-ACA levels (a report done for the Children’s Hospital Association estimates the negative impact on Medicaid funding for children alone at $48-72 billion).  Moreover, it reduces the subsidies, shifts the remaining subsidies toward those with higher incomes, and gives the rest back in the form of tax cuts for higher-income individuals.  A foundation of the AHCA is to shift decision-making to the states.  Medicaid would take the form of block grants, which states would have great latitude to distribute as they wish.  States could also opt to eliminate many of the coverage requirements of Obamacare, which prevent insurers from discriminating against people with pre-existing conditions or mental health disorders, against women, and against others.

A study from the Urban Institute underscores the risk of this policy.  It showed a strong inverse correlation between a state’s African-American population and the reach of its antipoverty programs.  For example, in Vermont, which has one of the smallest African-American populations (1%), 78% of poor families receive benefits under the Temporary Assistance for Needy Families (TANF) program, compared with only 4% of poor families in Louisiana, which is 32% black.  When decisions about helping the needy are made at the state level, old patterns of inequality re-surface.

Such disparities in health insurance coverage, and in support for addressing needs related to the social determinants of health, lead, not surprisingly, to disparities in health outcomes.  These have been well documented for some time, but in case we need reminding yet again, a study in JAMA Pediatrics demonstrated that children living in areas of high income inequality had higher rates of preventable hospitalizations.  Another recent study, in Health Affairs, showed that the gap in health between the richest and poorest Americans is among the largest in the world: of 32 higher-income countries studied, only Chile and Portugal had a wider disparity.

The Senate leaders may be using stealth in getting their program passed (knowing how very unpopular it is), but by calling it the American Health Care Act, they are making very clear their intentions: less health care for more people.


Happy Nurses Week!

May 11, 2017

By his own admission, it took Arnold Relman, former editor of the New England Journal of Medicine, until age 90 to realize the importance of nurses in providing quality medical care. It took me until a week after starting my internship.  My first rotation was on 3 Orange, the unit for medically complex children (including many ex-preemies).  In many ways, medical school had not prepared me well for residency.  I had never ordered feeds for a healthy baby, much less one with a 27-item problem list.  My first night on call, covering the entire team, I was asked to order a refill on a medication for someone else’s patient.  I checked my sign-out list and wrote (with a pen, on paper) the order; 10 minutes later, the nurse paged me to double-check whether that was really what I wanted to order.  It wasn’t: I had mistakenly ordered a sound-alike medication, at a dose that would have been harmful if administered.  Embarrassed, I returned to the unit to correct the order.  I made some comment about making a rookie mistake.  The nurse just smiled and said, “It won’t be the last, but don’t worry, because we’re all looking out for each other.”

Relman, after being hospitalized for 10 weeks after a fall, wrote a column for the New York Review of Books about his experience, in which he said, “I had never before understood how much good nursing care contributes to patients’ safety and comfort, especially when they are very sick or disabled.  This is a lesson all physicians and hospital administrators should learn.  When nursing is not optimal, patient care is never good.”

Amen. Over the years, I (and my colleagues) have been bailed out by nurses on occasions too numerous to count.  Mostly not because they caught errors – though in the era before computerized order management that was certainly important.  It’s the subtle change in a child’s behavior pattern that made the nurse call me to re-evaluate a patient who was developing hepatic encephalopathy.  It’s the funny movement that the consultant dismissed, which turned out to be decorticate posturing in a post-craniotomy patient.  It’s the question about why I selected a particular test that made me think through and decide on a different one that was just as good but less traumatic for the patient.  It’s the nurse who put a teenager with perplexing symptoms in a room, commented, “She’s acting just like the aspirin ingestions we used to see,” and arrived at the correct diagnosis hours before the physicians.  It’s the insight about family dynamics that allowed me to address concerns I might never have identified on my own.  The list is long.

It’s impossible to overstate my gratitude for all that the many nurses I have worked with over the years have done for our patients. Their job is intellectually, physically, and emotionally challenging, with rewards that are hardly commensurate with the demands.  And I also appreciate what they have done for me: for my education, my professional development, and my job satisfaction.  We share food on the night shift, we laugh and cry together, we brag about and complain about our families, we encourage each other, we look out for each other.  Those interactions, those shared experiences, illustrate what Join Together and Be Remarkable really mean.  Nurses are the embodiment of the Children’s Way.


Comfort Promise

May 8, 2017

No doubt Dr. Aziza, my pediatrician as a kid, was a nice man. But my main memory of him, 50 years later, is of throwing a tantrum and being dragged into his office when I realized I was going to get a shot.  Seriously, I still have a vivid recollection of my terror of that needle. (My mom probably does, too.)  I used to think it was me, that I was particularly fearful of sharp objects and pain.  But I now know that this is actually pretty normal.  What we healthcare providers like to call “iatrogenic pain,” which is a typically obscure way of saying “pain caused by us,” is a significant problem in pediatrics.  Even the youngest infants not only have a predictable negative physiologic response to things like needle sticks, but also suffer lasting effects, including aversion to subsequent healthcare encounters and behavioral distress. In other words, when providers do nasty things to kids – and needle sticks for immunizations and blood draws are the most common nasty thing we do – kids get scarred by it and act out.  (Sound familiar, mom?)

Fortunately, awareness of this problem is growing, and many people are doing something about it.  I am proud to note that Children’s Minnesota has developed what we call the “Comfort Promise.”  This is a commitment to offer all children and families at least one of several evidence-based interventions to minimize the pain of needle sticks when they come to our hospital or clinics.  These interventions include topical numbing medicine, positioning, and behavioral soothing measures.  For young infants, sugar water is also offered.

It does take a little extra time and effort for staff.  But when surveyed, children and families said needle pokes were the most unpleasant part of coming to the hospital. So, living up to our values “Listen, Really Listen” and “Kids First,” in the past couple of years we’ve managed to do this for over 90% of patients in the hospital, and we are now spreading it to the outpatient clinics.

Now if only we could do something about those nasty swabs to test for strep throat….


Race to the Bottom

April 21, 2017

I had the good fortune to hear Dr. Steve Nelson give an eloquent and impassioned talk about equity and racism yesterday, which led me to want to reprise this blog from a few years ago.

I am a racist. There, I said it.

I don’t mean an Archie Bunker-type bigot who hurls invective and spews hate. But I do view the world through the concept of race: the idea that characteristics are bundled together, and that knowing the color of someone’s skin can be informative about what is inside.  That is the essence of racism: the idea that race is determinative, that people of different colors differ in other important ways.  (Some prefer to refer to this as racialism, but let’s just call a racist a racist.)

Now, I didn’t say I believe it; actually, I do not. But being honest with myself, I’d have to admit that when I encounter someone I don’t know, I reflexively begin to make assumptions about them based on their appearance.  I do not consciously accept the concept of race, but my instincts are otherwise.  When I see a patient in the emergency department who is black, I assume they probably live in the city of Milwaukee and are likely to be insured by Medicaid.  I virtually always catch myself, and I work furiously not to allow that initial assumption to enter into my thinking and actions.  But no matter how good I am at suppressing it, I can’t deny it came up.

I’d be willing to bet a decent amount of money that everyone reading this is also a racist. No doubt, you do your best, like me, to overcome it, and you probably don’t ever do or say anything that would be considered “racist” in the common use of that word.  But it’s probably inevitable.  In large part, it is a manifestation of the way our minds process information.  I have written previously about heuristics – mental shortcuts our brains use to reach conclusions more efficiently. These heuristics are based on our prior experiences and on statistical facts about groups.  When a child encounters a dog for the first time, she is unlikely to be fearful.  If her first experience results in being bitten, she will instinctively react with caution to dogs in the future.  Even those of us who have never been bitten are likely to be more leery around pit bulls, based on reports (which it turns out are probably wrong) that the breed accounts for the majority of bites.

We live in a society where, statistically, there is an association between, for example, race and poverty, or race and crime. In that sense, the heuristic isn’t wrong.  It’s true that in our ED, black patients do largely live in the city of Milwaukee, and are disproportionately poor.  We run into trouble in at least two ways.  The first is when we take a true fact about a group and apply it to an individual.  Even if it’s true that black people in this area are less likely to finish school, it is an affront to the inherent worth and dignity of each person to make any assumptions about an individual black person’s educational level.  When we deal with a person, we cannot use mental shortcuts.  But to overcome them we must acknowledge them.

It’s also a short and slippery slope from seeing an association to seeing causation. Many people are too willing to make the leap from “black people are more likely to live in poverty” (a true if unfortunate fact), to “black people are poor because they are black.”  Therein lies the kind of thinking that people commonly associate with the term racism.  And racism in this sense is still too prevalent in 2014.

Just six years ago, in the aftermath of President Obama’s election, we were hearing about how America had become “post-racial.” Now, it seems that race relations are in the worst shape I can remember.  What went wrong?

If the first step toward a solution is admitting there is a problem, we have to accept that we are, nearly universally, racist. It takes a lot of mental effort to override our heuristics.  If we pretend racism is something only overt bigots harbor, it’s too easy to let down our guard.  It also closes off conversation.  The inherent racial thinking that we all have is pretty obvious to most members of racial minorities, but less so to those of us in the majority.  Denying it invalidates their experience and prevents us from building the kind of connections that might mitigate its effects.

I’d love to think we can actually get beyond the idea that skin color has anything to do with any other inherent characteristics – we don’t tend to draw the same conclusions based on hair or eye color, after all. Not that there hasn’t been some progress.  Some medical journals, for example, will not accept analyses based on race unless there is a clear biological explanation (e.g., a study involving actual skin pigmentation).  Too often race is used as shorthand for socioeconomic or educational status; such reporting simply reinforces stereotypes and does nothing to contribute to our understanding.  But race seems such an entrenched part of our way of looking at the world that it’s hard to imagine a “post-racial society” anytime soon.

In the meantime, if rational thinking is to prevail over instinct, we need to accept that regardless of our best intentions, we all view the world through the lens of race. Go ahead, say it.

