Sometimes letting your work speak for itself is best.
An increasing number of years ago, I was approached by someone from the emergency room. Doctor, noctor, I don’t recall…whoever it was had seen something on a chest X-ray that looked iffy to them, and they wanted me to re-check it. I looked at it again, still saw nothing amiss, and said so.
“Well, what’s that?” came the rejoinder, with an accusing finger pointing at some shadow or other on the image.
An uncomfortable feeling came over me, as I could not confidently name an anatomic structure (or confluence thereof) that was responsible. However, I recognized the shadow as something I’d seen on hundreds, if not thousands, of CXRs gone by, and I knew it wasn’t pathology. I said as much, and that was good enough for my visitor. It had to be, since he didn’t have anybody else to ask (and I never heard about it via quality assurance).
The words I spoke were not “I’m the expert here, and I say it’s fine,” but that would have been another way of saying the same thing. Sufficiently undiplomatic that I’d never actually utter it; heck, I kind of recoil at the thought of proclaiming myself an expert at all. I’ll get back to why in a bit.
Still, “expert” is a reasonable word to apply to anybody competently practicing radiology. Some rads are more expert than others, what with sub-specialty training and years in the field. But compare any one of us to a non-rad, even a physician of another specialty, and we’re experts by comparison.
(Okay, I’ll grant that certain other docs might be better at getting particular bits of info out of imaging pertaining to their sub-specialties. Surgeons looking specifically at things they’re experts at cutting, for instance…but the spine-guy is in for a bad time if he fails to appreciate a renal-cell carcinoma next to the disc he’s sizing up.)
One of the things about seeking an expert’s opinion is that you generally don’t begin by testing his expertise to see if he’s worthy of your trust. (We’re not talking about a hostile “expert witness” whose credibility you want to tear down—this is someone you’re consulting.) If it’s a given that he’s the expert and you’re not, all you risk doing is wasting time and inserting hostility into the proceedings.
Similarly, the expert generally doesn’t want to begin by proving his expertise to you. Remember, he’s not soliciting for business; you came to him, not the other way around. Coming up with ways to showcase one’s expertise can be perceived as “protesting too much,” sowing seeds of doubt where there were none…and, again, might be seen as an unwanted waste of time.
An attending in my fellowship program taught a version of this point. He would tell rads-in-training that the purpose of their reports was not to teach referrers radiology. For instance, don’t explain all the reasons why an incidental renal lesion is a simple cyst, or how you know the liver is fatty; just state the pathology and move on. His main aim was keeping reports concise—but another worthwhile facet is that the more an expert goes out of the way to explain himself, the more he risks looking like not so much of an expert.
Another mentor, further back in my training, observed that folks’ depth of knowledge is surprisingly shallow. Ask them something about their expertise, they’ll probably know it. Ask them a follow-up about the underpinning stuff, they’ll be reasonably familiar. Go a question or two deeper, they’ll start getting shaky.
They might have known the answers at some point—but after their education, they stopped needing to know that info, and it got buried in the cobwebbed recesses of their memory. Given a chance, they could look it up and re-familiarize themselves…but, much like an adult who gets irritated by a kid who keeps asking them “why” questions on a subject, they won’t look so wise or all-knowing in the process.
Thus, if you’re the expert, unnecessarily volunteering to dig deeper into how you know what you do gets you closer to a point where you won’t have answers…and you might just interest your audience enough to ask for them. Why risk that?
Society has generally gotten more wary and less respectful of experts as it is. It seems that every talk show, whether it purports to offer “news” rather than opinion, boasts its own experts…helpfully labeled as such at the bottom of the screen and lathered up by the adjacent talking heads (or pilloried if invited on the show as a token of the opposing view, little more than a strawman). Whatever expertise they actually have always seems to take a distant back seat to the axes they’ve got to grind.
Which gets back to that reluctance I’d mentioned above: The last thing I want to do is come right out and hype myself because that’s just an invitation for circumstances to take me down a peg. If I say anything like, “I’m the expert here,” it raises the point for refutation.
I think the better way is to simply demonstrate your expertise. Matter-of-factly, unceremoniously, as if you do it all the time—which, hopefully, you do. Repeated interactions with you will teach others that, yes, you know what you’re doing…especially if you are honest about being beyond your depth on the occasion that it happens.
Quiet, humble, self-assured competence inspires a lot more confidence than tooting your own horn like a snake-oil salesman claiming to have the goods that everybody else needs.
Follow Editorial Board member Eric Postal, M.D., on Twitter, @EricPostal_MD.