Dr. John J. Sunderland, PhD, MBA
Professor of Radiology, Division of Nuclear Medicine
Carver College of Medicine
University of Iowa
Abstract: The use of radioactive decay, with its particulate and gamma-ray emissions, in medical imaging and therapy dates back to the late 1930s and the use of radioactive iodine. Nuclear medicine expanded substantially in the 1960s with the advent of the gamma camera, and scientific excitement was boosted again with the invention of positron emission tomography (PET scanning) in the late 1970s. These nuclear technologies demonstrated the ability not merely to image anatomy (like X-rays, CT, and later MRI), but to image the actual molecular and biochemical underpinnings of diseases such as cancer (the Warburg Effect – look it up!), heart disease, and Alzheimer's disease.
Clinical use of PET imaging began in the early 1990s. Creighton University had one of the first clinical PET facilities in the US, opening in 1991 on Dorcas Street, complete with its own cyclotron used to produce the radioactive isotopes 18F, 11C, 13N, and 15O. But challenges with Medicare and insurance reimbursement, coupled with regulatory complexities, mostly from the FDA, resulted in slow growth, and even stagnation, of the field.
Beginning around 2012, through advances in radiation detector technology, computing power, corporate investment, and infrastructure building, nuclear imaging and, in particular, radiopharmaceutical therapy have taken off, becoming one of the fastest-growing segments of medicine today.