Significance and Use
4.1 This practice is intended to be used by the DDA user to measure and record the baseline performance of an acquired DDA in order to monitor its performance throughout its service as an imaging system. This practice is not intended to be used as an “acceptance test” of a DDA.
4.2 This practice defines the tests to be performed and their required intervals. Also defined are the methods of tabulating results that DDA users will complete following initial baselining of the DDA system. These tests will also be performed periodically at the stated required intervals to evaluate the DDA system and determine whether the system remains within acceptable operational limits as established in this practice and agreed between the user and the cognizant engineering organization (CEO).
4.3 There are several factors that affect the quality of a DDA image, including the basic spatial resolution, geometric unsharpness, scatter, signal-to-noise ratio, contrast sensitivity, contrast-to-noise ratio, image lag, and, for some types of DDAs, burn-in. There are several additional factors and settings that can affect these results (for example, integration time, detector parameters, imaging software, and even X-ray radiation quality). Additionally, detector correction techniques may have an impact on the quality of the image. This practice delineates tests for each of the properties listed herein and establishes standard techniques for assuring repeatability throughout the lifecycle testing of the DDA.
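As an informal illustration of two of the metrics named above, the sketch below computes a signal-to-noise ratio and a contrast-to-noise ratio from pixel regions using one common textbook definition (mean over standard deviation, and mean difference over pooled noise). This is not the normative measurement method of this practice; region selection, detector corrections, and the exact formulas are governed by the standard's test procedures. The function names and the synthetic pixel values are assumptions for illustration only.

```python
import numpy as np

def snr(region):
    """Signal-to-noise ratio of a nominally uniform region:
    mean pixel value divided by sample standard deviation."""
    region = np.asarray(region, dtype=float)
    return region.mean() / region.std(ddof=1)

def cnr(region_a, region_b):
    """Contrast-to-noise ratio between two regions (e.g. a groove of the
    five-groove wedge vs. adjacent base material): absolute difference of
    means divided by the pooled standard deviation."""
    a = np.asarray(region_a, dtype=float)
    b = np.asarray(region_b, dtype=float)
    noise = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2.0)
    return abs(a.mean() - b.mean()) / noise

# Synthetic example data, not real DDA output:
rng = np.random.default_rng(0)
base = rng.normal(10000.0, 100.0, size=(50, 50))    # background region
groove = rng.normal(9600.0, 100.0, size=(50, 50))   # lower-signal region
print(f"SNR = {snr(base):.1f}, CNR = {cnr(base, groove):.1f}")
```

Trending such values over the service life of the detector, under fixed exposure conditions, is what allows long-term stability monitoring as described in 4.1 and 4.2.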
Scope
1.1 This practice covers the baseline and periodic performance evaluation of Digital Detector Array (DDA) systems used for industrial radiography. It is intended to ensure that the evaluation of image quality, as far as this is influenced by the DDA system, meets the needs of users and their customers, and enables process control to monitor the long-term stability of the DDA system.
1.2 This practice specifies the fundamental parameters of DDA systems to be measured to determine baseline performance, and to track the long-term stability of the DDA system.
1.3 The DDA system tests specified in this practice shall be completed upon acceptance of the system from the manufacturer to baseline the performance of the DDA. Periodic performance testing shall then be used to monitor long-term stability of the system in order to identify when an action needs to be taken due to system degradation beyond a certain defined level.
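The baseline-then-monitor process in 1.3 can be sketched as a simple comparison of a periodic measurement against the recorded baseline. The 10 % action threshold below is purely illustrative; actual acceptance levels are established per this practice and by agreement between the user and the CEO, and the function name is an assumption for this sketch.

```python
def within_limits(baseline, current, max_drop_fraction=0.10):
    """Return True if the current measured value (e.g. an SNR from a
    periodic test) has not dropped more than the allowed fraction below
    the recorded baseline value.

    The 10% default threshold is illustrative only; real action limits
    come from the practice and the user/CEO agreement."""
    return current >= baseline * (1.0 - max_drop_fraction)

# With a baseline SNR of 250, a periodic reading of 230 is still within
# a 10% limit (230 >= 225), while 220 would trigger an action:
print(within_limits(250.0, 230.0))
print(within_limits(250.0, 220.0))
```

The value of this scheme is that degradation is detected against the detector's own as-received performance rather than against a generic specification, which is why the initial baseline measurement is mandatory.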
1.4 Two types of phantoms, the duplex plate and the five-groove wedge, are used for testing as specified herein. The use of these two types of phantoms is not intended to exclude the use of other phantom configurations. In the event the tests or phantoms specified herein are not sufficient or appropriate, the user, in coordination with the cognizant engineering organization (CEO), may develop additional or modified tests, test objects, phantoms, or image quality indicators to evaluate the DDA system performance. Acceptance levels for these alternate test methods shall be determined by agreement between the user and the CEO.
1.5 The user of this practice shall consider that energies higher than 450 keV may require different test methods or modifications to the test methods described here. This practice is not intended for use with isotopes.
1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety, health, and environmental practices and determine the applicability of regulatory limitations prior to use.
1.7 This international standard was developed in accordance with internationally recognized principles on standardization established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued by the World Trade Organization Technical Barriers to Trade (TBT) Committee.