Medical imaging technology is characterised by continuous, rapid development. Medical imaging is used extensively in the diagnosis, monitoring, and treatment of medical conditions. It refers to a range of imaging technologies, such as projection radiography, computerised tomography (CT), magnetic resonance imaging (MRI), ultrasound (US) and others, which are used to provide information about the morphology (anatomy) and physiology (function) of the human body (Figures 1 to 4).
Figure 1: Conventional radiography (X-rays)
The quality of the image is often assessed by how reliably it represents the anatomy or pathology being imaged, and has a direct impact on the patient’s diagnosis and/or treatment. Suboptimal images may lead to misdiagnosis or incorrect treatment delivery and may necessitate a repeat of the image. Repeat images expose the patient to unnecessary risk. All imaging technologies carry an element of risk. Projection radiography and CT carry the risks associated with ionising radiation, which include both stochastic and non-stochastic effects. MRI uses magnetic fields and radiofrequency electromagnetic energy, with their associated risks. US uses high-frequency sound waves, which can produce heating and potential cavitation in tissues.
Figure 2: Computerised tomography (CT)
Producing high-quality medical images is important for image interpretation: it maximises the diagnostic information obtained and makes it possible to visualise discrete changes in anatomy that indicate early pathological processes. However, higher-quality images normally imply higher radiation doses or risks to patients, since the changes in exposure or scan parameters needed to improve quality often increase radiation dose or the associated imaging risks.
Figure 3: Positron emission tomography (PET) CT
Figure 4: Ultrasound (US)
Optimisation: Limitation of risks
In the case of imaging modalities using ionising radiation, reference doses serve as an indication of doses that are considered higher than would normally be expected for standard radiographic practice. Applying this concept helps radiologists and radiographers to identify and optimise those imaging protocols that deliver unusually high patient doses. Focusing dose reduction strategies on the centres or radiological examinations yielding higher radiation doses than the established norm is an efficient way of reducing radiation dose to patients. The establishment of reference doses is therefore an important step towards the optimisation of patient doses. Publications 60 and 73 of the International Commission on Radiological Protection (ICRP) replaced the term reference dose with diagnostic reference levels (DRLs). The concept was then adopted by the European Union (EU) in its Medical Exposure Directive (MED) of 1997 and again in the Basic Safety Standards Directive of 2014, requiring all member states to promote the establishment and use of DRLs.

Establishing DRLs allows the radiation dose delivered in clinical diagnostic examinations to be assessed and compared with the doses reported by other national and international institutions. DRLs should be reviewed at regular intervals to determine whether radiation dose optimisation is adequate or whether corrective action is required. DRL comparisons are performed for similar-sized patients undergoing the same clinical examinations. DRLs are not intended to compare individual patient doses but to compare average patient doses for a given examination in a particular hospital. DRLs are recommended for those radiological examinations that are most frequently requested or that involve a higher radiation dose, as sketched in the example below.
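The comparison described above can be illustrated with a minimal sketch in Python. All values here are hypothetical assumptions: the examination name, the dose records, and the DRL figure are illustrative only, not data from this article or from any official DRL publication.

```python
# Minimal sketch of a DRL comparison, using hypothetical survey data.
# A facility's typical dose for an examination (here the average over a
# sample of similar-sized patients) is compared against a national DRL.
from statistics import mean

# Hypothetical per-patient entrance surface doses (mGy) for one examination.
abdomen_ap_esd_mGy = [3.1, 4.5, 2.8, 5.2, 3.9, 4.1, 3.4, 4.8, 3.7, 4.3]

NATIONAL_DRL_mGy = 3.5  # illustrative value, not an official DRL

typical_dose = mean(abdomen_ap_esd_mGy)
print(f"Facility average ESD: {typical_dose:.2f} mGy "
      f"(DRL: {NATIONAL_DRL_mGy} mGy)")

if typical_dose > NATIONAL_DRL_mGy:
    # Exceeding a DRL does not mean individual patients were harmed; it
    # flags the protocol or centre as a candidate for optimisation review.
    print("Typical dose exceeds the DRL: review and optimise the protocol.")
else:
    print("Typical dose is within the DRL.")
```

Consistent with the point above, exceeding the DRL triggers a review of the protocol rather than a judgement about any individual patient's dose.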
EU directives require the establishment of DRLs in medical imaging using dose quantities that are simple to obtain and that take scan parameters into account. These quantities should be applicable to all current and new types of imaging technology. Radiation dose descriptors also need to be consistent with other reference doses and dose descriptors already in use, allowing radiation doses to be compared across imaging centres and with doses from other diagnostic imaging examinations.
The criteria for radiation dose to the patient for each projection radiography examination are expressed in the European Guidelines in terms of a reference value of the entrance surface dose (ESD) for a standard-sized patient. The volume CT dose index (CTDIvol) and the dose length product (DLP) have been proposed in the European Guidelines as appropriate dose quantities for the establishment of DRLs in CT. CTDIvol represents the average dose within a scan volume relative to a standardised CT phantom; it is not the dose to a specific patient but a standardised index of the average dose delivered by the scanning series. A complete CT examination consists of several scans through the anatomical region, and a longer scan covers a greater number of slices over a longer anatomical length. The CTDIvol multiplied by the total scan length in centimetres gives the dose length product (DLP), expressed in milligray-centimetres (mGy·cm). Monitoring the DLP provides control over the volume of irradiation and the overall exposure for a CT examination.
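In symbols, the relationship reads as follows; the numerical values in the worked example are purely illustrative, not taken from any guideline.

```latex
% DLP as defined above: CTDIvol (in mGy) times total scan length L (in cm).
\[
  \mathrm{DLP} = \mathrm{CTDI_{vol}} \times L
\]
% Illustrative example (hypothetical values): a CTDIvol of 10 mGy over a
% 40 cm scan length gives
\[
  \mathrm{DLP} = 10~\mathrm{mGy} \times 40~\mathrm{cm} = 400~\mathrm{mGy{\cdot}cm}
\]
```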
DRLs do not signify optimum performance, and reduction of radiation doses below the DRLs should always be pursued in line with the ALARA (as low as reasonably achievable) principle, with due attention to the potential loss of clinical information with any radiation dose reduction.
Optimisation: Diagnostic image quality
The standard of image quality in medical imaging depends mainly on the preferences and requirements of the reporting radiologists. Radiologists may opt for aesthetically pleasing, low-noise, high-quality images and may be reluctant to reduce radiation dose or imaging risks. This potential for subjectivity increases the possibility of producing medical images that exceed the clinical requirement for adequate diagnosis, and hence the need for optimisation, keeping radiation doses as low as reasonably achievable, commensurate with the medical purpose. A keen understanding of image quality evaluation tools is therefore needed, so that good image quality can be recognised and balanced against low radiation dose or imaging risks. Low radiation dose and low imaging risks should be optimised in a way that is consistent with image quality, ensuring that the radiation dose and imaging risk are ‘as low as reasonably achievable below the appropriate dose/risk constraints, with economic and social factors being considered’. Effective and scientifically accepted methods of assessing image quality are needed to implement such optimised imaging protocols.
The important question here is, how can we ensure acceptable image quality while remaining confident in our diagnosis, using the lowest possible radiation dose or imaging risks?
The aim of image quality review during radiation dose or imaging risk reduction experiments should therefore be to provide an image that is suitable for the clinical task and delivers sufficient information to the radiologist, permitting a medical decision to be taken with an acceptable degree of assurance at the lowest radiation dose or imaging risk to the patient. Investigating new methods of radiation dose and imaging risk reduction in medical imaging has always been a priority and is part of the fundamental principle of optimisation. Often, simple low-cost measures are available to reduce radiation doses and imaging risks without loss of diagnostic information. Studies investigating the effect of such measures on radiation dose, imaging risk and image quality have either tackled specific exposure or scan parameters individually or measured the effect on image quality using phantoms. However, phantom studies alone may not reflect the true quality characteristics of the medical image in representing anatomical structures.
Evaluation of anatomical criteria, such as those outlined in the European Commission’s documents EUR 16260 and EUR 16262, considers both the anatomy of the area under examination and physical image criteria, including structures to be clearly visualised and discriminated, reflecting the contrast differences between tissues that are essential for the detection of pathologies. The underlying assumption in the use of these criteria is that if normal anatomy is faithfully reproduced, pathology will also be demonstrated.
To limit any uncertainties in interpretation, observer performance tests on optimised medical images should be carried out, testing the visualisation of anatomical structures or known pathologies. Observer performance methods such as image criteria (IC) studies, visual grading analysis (VGA) and receiver operating characteristic (ROC) analysis are established approaches to the analysis of image quality. IC and VGA are useful in most cases, where patient examinations present normal anatomy. ROC is of value in determining whether a specific technique or technology is sufficient for the identification of specific known pathologies. ROC is not always feasible, as obtaining an image database of sufficient size containing pathologies of varying subtlety may be difficult.
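As a minimal illustration of how a ROC analysis summarises observer performance, the sketch below assumes scikit-learn is available and uses entirely hypothetical ground-truth labels and observer confidence ratings; none of these values come from the article.

```python
# Minimal ROC sketch with hypothetical observer data (requires scikit-learn).
# In a ROC study, observers rate their confidence that a pathology is
# present; the area under the ROC curve (AUC) summarises how well a given
# technique or dose level supports detection of that pathology.
from sklearn.metrics import roc_auc_score

# Ground truth for ten hypothetical images: 1 = pathology present, 0 = normal.
truth = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]

# Observer confidence ratings (1-5 scale) for the same images, read at a
# reduced-dose protocol (values are illustrative only).
ratings_reduced_dose = [5, 4, 3, 4, 2, 1, 3, 2, 5, 1]

auc = roc_auc_score(truth, ratings_reduced_dose)
print(f"AUC at reduced dose: {auc:.2f}")  # 1.0 = perfect, 0.5 = chance level

# Comparing AUCs between dose levels indicates whether a dose reduction
# degrades detection of the pathology of interest.
```

Repeating the reading session at each candidate dose level and comparing the resulting AUCs is what makes ROC suitable for judging whether a technique remains adequate after dose reduction.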
Conclusion
Adherence to the diagnostic requirements of each radiographic examination will ensure that diagnostic effectiveness does not suffer from the optimisation strategies implemented.
 References available on request
Dr Francis Zarb