Significance and Use
4.1 This practice is intended to be used by the NDT-using organization to measure the baseline performance of the DDA and to monitor its performance throughout its service life as an NDT imaging system.
4.2 It is to be understood that the DDA has already been selected and purchased by the user from a manufacturer based on the inspection needs at hand. This practice is not intended to be used as an “acceptance test” of the DDA, but rather to establish a performance baseline that will enable periodic performance tracking while the system is in service.
4.3 Although many of the properties listed in this standard have metrics similar to those found in Practice , the data collection methods are not identical, and comparisons between values acquired with each standard should not be made.
4.4 This practice defines the tests to be performed and the required intervals at which they are to be performed. Also defined are the methods of tabulating results that DDA users will complete following initial baselining of the DDA system. These tests shall also be performed periodically at the stated intervals to determine whether the DDA system remains within the acceptable operational limits established in this practice or agreed between the user and the cognizant engineering organization (CEO).
4.5 Several factors affect the quality of a DDA image, including spatial resolution, geometrical unsharpness, scatter, signal-to-noise ratio, contrast sensitivity (contrast-to-noise ratio), image lag, and burn-in. Several additional factors and settings (for example, integration time, detector parameters, or imaging software) affect these results. Calibration techniques may also have an impact on the quality of the image. This practice delineates tests for each of the properties listed herein and establishes standard techniques for assuring repeatability throughout the lifecycle testing of the DDA.
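As an illustrative sketch only (not part of this practice), the two ratio metrics named above are commonly computed from pixel data as follows. The region choices, the use of the background standard deviation as the noise term, and the function names here are assumptions for illustration; the practice itself defines the authoritative measurement procedures.

```python
import numpy as np


def signal_to_noise_ratio(region: np.ndarray) -> float:
    """SNR of a nominally uniform region: mean pixel value over its
    standard deviation (one common convention; an assumption here)."""
    return float(np.mean(region) / np.std(region))


def contrast_to_noise_ratio(feature: np.ndarray, background: np.ndarray) -> float:
    """CNR: magnitude of the mean difference between a feature region and the
    background, divided by the background noise (standard deviation)."""
    return float(abs(np.mean(feature) - np.mean(background)) / np.std(background))


if __name__ == "__main__":
    # Synthetic 64x64 regions standing in for extracted image ROIs.
    rng = np.random.default_rng(0)
    background = rng.normal(loc=1000.0, scale=10.0, size=(64, 64))
    feature = rng.normal(loc=1100.0, scale=10.0, size=(64, 64))
    print(f"SNR ~ {signal_to_noise_ratio(background):.1f}")
    print(f"CNR ~ {contrast_to_noise_ratio(feature, background):.1f}")
```

With the synthetic values above (mean 1000, noise 10, feature offset 100), the sketch yields an SNR near 100 and a CNR near 10, which shows how the two metrics respond differently to overall signal level versus feature contrast.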
1.1 This practice describes the evaluation of DDA systems for industrial radiology. It is intended to ensure that the evaluation of image quality, insofar as it is influenced by the DDA system, meets the needs of users and their customers, and enables process control and long-term stability of the DDA system.
1.2 This practice specifies the fundamental parameters of Digital Detector Array (DDA) systems to be measured to determine baseline performance and to track the long-term stability of the DDA system.
1.3 The DDA system performance tests specified in this practice shall be completed upon acceptance of the system from the manufacturer and at the intervals specified in this practice to monitor long-term stability of the system. The intent of these tests is to monitor the system for performance degradation and to identify when action must be taken because the system has degraded beyond an established level.
1.4 The use of the gages provided in this standard is mandatory for each test. In the event these tests or gages are not sufficient, the user, in coordination with the cognizant engineering organization (CEO), may develop additional or modified tests, test objects, gages, or image quality indicators to evaluate the DDA system. Acceptance levels for these alternate tests shall be determined by agreement between the user, CEO, and manufacturer.
1.5 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety, health, and environmental practices and determine the applicability of regulatory limitations prior to use.
1.6 This international standard was developed in accordance with internationally recognized principles on standardization established in the Decision on Principles for the Development of International Standards, Guides and Recommendations issued by the World Trade Organization Technical Barriers to Trade (TBT) Committee.