The Expertise of Ignorance

If you allow yourself to adopt an uninformed opinion, you’ll be resistant to facts that undermine it if/when they come your way.

One of two things seems to follow the phrase, “I’m not going to get political...” Either the speaker goes on to say something political anyway, or others around him do, no matter how carefully impartial or downright bland his statement was.

Well, I’m not getting political here, and if you want to act as if I did, you’ll have to go to my Twitter page to exercise that second option. I am going to reference the recent US pullout from Afghanistan just as an example of the phenomenon I’m talking about in this week’s column.

Others far more plugged into public sentiment than I am observed, as things unfolded, how amazing it was that such a huge proportion of Americans were certain that the withdrawal had been botched. One podcaster summed it up particularly nicely: Didn’t it seem strange that so many knew things should have been done differently? How could it be that the only folks who didn’t know just happened to be the ones in charge of the operation, and also happened to be the most experienced, credentialed military/strategic experts?

Put another way, if you grab a random person with a strong feeling about the pullout, good or bad, what’s the chance you’ll have someone with any military training/experience whatsoever? Plus a detailed understanding of the situation “on the ground” where the withdrawal took place?

I’d place the likelihood somewhere north of 99% that you’ll have no such individual. Your random selection will speak purely from what s/he’s heard in the news, or from friends/family who seemed to know what they were talking about. Even completely unreliable social media postings—which, some would argue, are increasingly on par with traditional “news” sources—would be filtered through the random person’s “common sense” and “real-life experience” before being shared with you as their strong, confident position.

Guesstimates at best, guesses more likely, based on speculation and scuttlebutt. How could anybody be so sure of themselves (or reliant on the verbal output of those around them) in such circumstances? To the point of getting emotional and thus even more irrational when encountering someone not sharing their conclusions and opinions?

And lest we, personally, sneer from our high horses at such folks regarding their take on the Afghan situation, I’d point out that few if any of us fallible mortals are immune to this phenomenon. Maybe you’re not predisposed to jump on the latest sociopolitical issue, come to quick conclusions, and sound off about it. But how about your radgroup? The referrers who send cases to it? The hospitals or other facilities you cover?

Plenty—nay, most—of the rads I’ve known have the same ability to develop strong opinions based on little to no evidence about whatever the leadership of their radgroups or hospitals is up to:

  • New hardware or software turns out to be less-than? Well, clearly, the folks in charge decided to sacrifice quality for a lower price.
  • Referrers seemingly allowed to order whatever they want, whenever they want it, no matter how inappropriate their requested protocols for the provided clinical histories? Or no histories whatsoever? Clearly, the rads/admins in charge have decided those referrers are more feared/valued than practicing good radiology.
  • Group partners cutting or eliminating bonuses this year? Even next year’s comp-package? Obviously they’re getting greedier and more willing to bleed the associates to fluff up their own bottom line.

The vast majority of the time, these knee-jerk judgments don’t stand up long to the gathering of facts. Start looking into the details just a little bit, and one finds out that the actual situation is nowhere near as simple as it appeared before. At the very least, whoever made the “bad” decision is discovered to have been in a tough spot that maybe had no good outcomes, where it was just a matter of minimizing the bad.

Ignorance can make you feel a lot like an expert. Even knowing that the real world isn’t black/white, it can be a sort of mental comfort-food to act as if it is, rather than gathering and considering evidence that fills in the infinite shades of gray. So much so that, if you’ve allowed yourself to adopt an uninformed opinion, you’ll be resistant to facts that undermine it if/when they come your way. For example:

‘Oh, the radgroup leadership put out a statement about the bonus/comp-cuts. But I can see right through that, since I already decided what their motivations are. Now they’re just trying to do some damage-control with nice, cheap words rather than paying us what we deserve.’

The sneaky thing is that your susceptibility to ignorance-expertise increases with your proximity to the situation. The more of a personal stake in it you have—whether it be material such as your paycheck or mental/emotional such as your preferred political group coming out looking better—the more easily your brain convinces you that you know enough about the situation to come to a reasonable conclusion.

So be on guard all you want—it’s still going to happen to you. The next-best thing to preventing it is to keep your mind open to recognizing when it has happened, and then undoing it. If you find yourself clinging to strong opinions without really having good reasons for them, especially if you’re dismissing real-world evidence that challenges your stance as if you were swatting gnats, you’d do well to consider it a wake-up call. The podcaster I mentioned above has shared that he uses the phrase, “I could be wrong,” as something of a mantra.

Just as an individual can strive to keep himself honest, so too can the entities he might be judging. Being transparent and freely sharing information goes a long way: If you put evidence out there for all to see—ideally on a routine basis, before you have any big announcements—you’re preventing some of the ignorance which might otherwise color you unfavorably in others’ eyes.
