Compensating Filters and Radiation Dose in the Digital Era
Mar 23 2012, 7:37 PM EDT | Post edited: Mar 23 2012, 8:03 PM EDT

It is a given that the inherent latitude of digital systems enables us to visualise a far wider range of densities than film/screen technology.
However, the use of compensating filters also reduces dose to the patient by absorbing the excess radiation before it reaches the less dense regions of the subject. In so doing, scatter is reduced, which also reduces "noise" and therefore improves image quality.
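The attenuation described above can be sketched with the Beer-Lambert law. The filter material and coefficient below are illustrative assumptions only (an aluminium wedge with an assumed round-number linear attenuation coefficient, not measured data):

```python
import math

def transmitted_intensity(i0, mu_per_cm, thickness_cm):
    """Beer-Lambert attenuation: intensity remaining after the beam
    passes through an absorber of the given thickness."""
    return i0 * math.exp(-mu_per_cm * thickness_cm)

# Hypothetical values for illustration: a 1 cm aluminium wedge over
# the thin end of the subject, with mu ~ 0.5 /cm assumed for the
# effective beam energy.
i0 = 100.0  # relative incident intensity
filtered = transmitted_intensity(i0, mu_per_cm=0.5, thickness_cm=1.0)
print(f"Relative intensity after filter: {filtered:.1f}")
```

With those assumed numbers roughly 40% of the beam over the thin region is absorbed by the filter rather than by (or scattered within) the patient.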
From the film/screen days, optimum kV was also employed to modify the latitude of the "radiation image" before it impinged upon the receptor, enabling the sensitometric response of the receptor to depict the range of structural densities of the subject in the processed image.
With intensifying screens of calcium tungstate, and later the rare earths such as terbium-activated gadolinium oxysulfide, the kV response was pretty much linear throughout the medical diagnostic X-ray spectrum from 40 to 150 kV. I believe this is no longer the case and that the phosphors used in digital systems have a peak sensitivity at around 60 to 80 kV, requiring more exposure outside this range to achieve the same Exposure Index (or equivalent).
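A minimal sketch of that idea, assuming a bell-shaped sensitivity curve peaking at 70 kV. The curve shape, peak, and width are hypothetical stand-ins for illustration only; the real response data, as noted below, are proprietary:

```python
import math

def relative_sensitivity(kv, peak_kv=70.0, width=30.0):
    """Assumed Gaussian-shaped relative phosphor sensitivity,
    normalised to 1.0 at the peak kV."""
    return math.exp(-((kv - peak_kv) / width) ** 2)

def exposure_correction(kv):
    """Extra exposure factor needed to reach the same Exposure
    Index as at the sensitivity peak (under the assumed curve)."""
    return 1.0 / relative_sensitivity(kv)

for kv in (50, 70, 110):
    print(f"{kv} kV: sensitivity {relative_sensitivity(kv):.2f}, "
          f"exposure factor {exposure_correction(kv):.2f}")
```

Under such a curve, operating well above the peak would demand a substantially larger exposure factor for the same Exposure Index, which is the dose concern raised below.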
This information seems to be proprietary to the various manufacturers, so it is not that easy to confirm.
It does raise the point that an exposure chart management policy recommending raising chart kVs across the board may be misplaced and counter to ALARA.