Radiography Standards

by Claudia Kropas-Hughes, Cliff Bueno, Bill Meade and Fred Morro

Part 1: X-Ray Vision

Just as most people are familiar with X-ray examinations in medicine, used to check for broken bones or other internal changes, X-rays are also used to check the internal structure of industrial parts. ASTM Committee E07 on Nondestructive Testing deals with this technology every day.

The committee is responsible for authoring and maintaining standards for a broad spectrum of industrial applications, ensuring that the latest available technologies can be used to improve process controls. One of its subcommittees, E07.01 on Radiology, specializes in practices, guides and test methods involving penetrating radiation, commonly known as X-rays.

Radiology is a procedure for producing a radiographic image by placing a source of penetrating radiation on one side of a specimen and a detector on the other side. The radiation sources most typically used are X-ray tubes; the detectors, however, range from sheets of photographic film to solid-state materials that register X-rays.

The initial use of X-rays was in the generation of a “radiograph.” A radiograph was created by using an X-ray tube to illuminate a part with X-rays, with a photographic film placed on the opposite side of the part. The film captures the X-rays that pass through the part, generating a radiograph. Once the film is photographically processed, it is placed on an illuminated screen for viewing by a human inspector. The parts of the film that receive more radiation, where the section is thinner or a cavity is present, appear darker because the film density there is higher.
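Why thinner sections darken the film more can be sketched with the exponential attenuation law, I = I0 · e^(-μt). The short Python example below uses purely illustrative numbers; the attenuation coefficient and intensities are assumptions for demonstration, not measured data or values from any standard.

```python
import math

def transmitted_intensity(i0, mu, thickness_cm):
    """Beer-Lambert attenuation: I = I0 * exp(-mu * t).

    i0           -- incident X-ray intensity (arbitrary units)
    mu           -- linear attenuation coefficient of the material (1/cm)
    thickness_cm -- material thickness along the beam path (cm)
    """
    return i0 * math.exp(-mu * thickness_cm)

# Illustrative value only: mu = 0.6/cm is a stand-in, not a measured
# coefficient for any real material or X-ray energy.
MU = 0.6
full_wall = transmitted_intensity(1000.0, MU, 2.0)    # solid 2 cm section
with_cavity = transmitted_intensity(1000.0, MU, 1.5)  # 0.5 cm internal cavity

# More radiation reaches the film behind the cavity, so that region of
# the radiograph develops darker (higher film density).
assert with_cavity > full_wall
```

The same relation explains why denser or thicker regions appear lighter on the processed film: less radiation gets through, so the film density there stays lower.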

The radiograph permitted humans, for the first time, to see inside a part, without having to cut open the material, and this became a powerful tool in both medicine and industry. One of the first industrial applications for X-rays was in the process control of fusion welds in 1927. And with the start of World War II, and the need to conserve materials, radiograph generation for the quality control inspection of manufactured parts boomed.

Film radiography, with its photochemical processing, pushed industry to examine ways to detect X-rays without film. In the last 20-30 years, extensive work has been performed on X-ray detector technology that is electrical rather than chemical in nature. This was the advent of digital radiography: generating a radiograph-like image using electrical detectors that store the image information on a computer. The next two sections cover some of the fascinating aspects of X-ray inspection going digital in the late 20th and early 21st centuries.

ASTM Standards Activity

Subcommittee E07.01 provides an extensive collection of standards on the use of X-rays for industrial inspections. This subcommittee provides guidance for:

• Manufacture and application of X-ray sources;
• Manufacture, use and storage of film used for radiography;
• Description and use of electrical detection devices used in digital radiography;
• Description and use of integrated X-ray systems, such as computed tomography scanners; and
• Application-specific practices for industrial inspection problems.

Part 2: Is There a DR in the House?

“DR” in this case stands for digital radiography, a means of using X-rays to examine the internal structure of parts while producing a computerized, digital record of the image.

The medical community has led the development of digital X-ray imaging, where demand for imaging systems supports significant investment in developing the tools. Spin-offs from the medical community have brought digital imaging technology to the industrial radiography community.

The digital image, by its nature, provides numerical results important for metrology and thickness measurements. The development of a wide range of digital X-ray imaging products complements the recent digital revolution, providing image data and results that can be incorporated into the massive digital manufacturing and services databases that have emerged to help manage the life cycles of products and structures.
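One way a digital pixel value can support thickness metrology is by inverting the attenuation relation I = I0 · e^(-μt) to solve for t. The sketch below is illustrative only; the function name, the attenuation coefficient, and the pixel values are all assumptions, not taken from any standard or product.

```python
import math

def estimate_thickness(i0, i_measured, mu):
    """Invert I = I0 * exp(-mu * t) to get t = ln(I0 / I) / mu.

    i0         -- unattenuated (open-beam) pixel value
    i_measured -- digitized pixel value behind the part
    mu         -- assumed linear attenuation coefficient (1/cm)
    """
    return math.log(i0 / i_measured) / mu

MU = 0.6        # illustrative attenuation coefficient, 1/cm
I0 = 1000.0     # open-beam pixel value
pixel = 301.2   # pixel value measured behind the part

t = estimate_thickness(I0, pixel, MU)
print(f"estimated wall thickness: {t:.2f} cm")  # ~2.00 cm
```

In practice a system would be calibrated against step wedges of known thickness rather than relying on a tabulated coefficient, but the inversion idea is the same.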

In the field of industrial digital radiography, there is really no single standard X-ray system that addresses all applications. A major consideration in system design is the choice of the digital X-ray detection device itself. There are almost as many choices of detectors as there are ways to configure the overall test system.

Digital Detectors

The following list shows the categories of detectors used in digital radiography today.

• X-ray Image Intensifiers — An X-ray phosphor (a luminescent material that fluoresces upon X-ray illumination) is deposited onto a photocathode that is electrostatically scanned and intensified. The resulting electron beam impinges on an output phosphor, where the image is converted back to light. This image is then captured by a TV camera for display.
• Solid State Detectors (Semiconductors) — Semiconductors become conductors when exposed to ionizing radiation. These detectors are typically used in very low-energy systems (e.g., X-ray diffractometers).
• Silicon-Based Detectors — An X-ray phosphor is coupled to a pixelized silicon readout device such as a photodiode, a CCD, or a CMOS device. The device can be amorphous or crystalline. The phosphor can be viewed either directly or through lens optics, as dictated by the architecture of the silicon device and the needs of the inspection. The pixelized readout device can be either a linear array or an area array. The image signals are digitized and transferred to a computer for display.
• Direct Conversion Devices (e.g., Amorphous Selenium Flat Panels) — In these panels, X-rays are absorbed directly in a photoconductive material without needing conversion to light. The charge carriers are then transferred, typically to a silicon readout device. As above, the image is digitized and transferred to a computer for display.

ASTM Standards Activity

Subcommittee E07.01 has been developing standards for digital radiography for almost 20 years. This subcommittee develops standards that provide guidance in evaluating different X-ray digital systems, methods of calibrating systems, and practices for use and implementation.

Part 3: Pixels and Bits

It’s the new millennium. We have the Internet, digital mobile telephones, digital music, digital television, and digital photography. Digital technology has really come of age. One of the more exciting developments is digital X-ray imaging.

The most commonly used medical digital X-ray technology is computed radiography. With thousands of systems being used for medical diagnostics, the natural progression was to advance this technology for the more demanding industrial applications.

Picture This

Two industrial radiographers are at work. One is a military field radiographer inspecting unexploded ordnance. The other is at a power plant examining pipes and tubing for corrosion and weld quality.

Both have very important tasks at hand. Both are, needless to say, in less-than-comfortable environments. Both will use computed radiography to perform their inspections.

The military inspector will get immediate results as to the condition of the unexploded device. He can make references to thousands of images of similar devices on his hard drive. His time in a very dangerous environment is significantly reduced. His ability to enhance the digital X-ray images makes his job of providing critical information to the disposal team easier. The disposal team can electronically transmit the images to other disposal experts all over the world, almost in real time.

The power plant inspector can radiograph more linear footage of pipeline in less time. Because he requires less radiation to perform the same tasks, the people around him are safer. Because he is using penetrating radiation, he has no need to remove protective insulation to contact the pipe walls. There is less potential for exposure to hazardous materials. Because he is using computed radiography, he can use digital measuring tools to determine the amounts of wall loss due to corrosion and erosion. He can then send his clients his inspection results on a CD or DVD. That’s the digital age. That’s computed radiography.

Behind CR Technology

Computed radiography technology is based on flexible, re-usable imaging plates (photostimulable storage phosphors) whose chemical composition stores latent images when exposed to X-ray, gamma, linear accelerator, or neutron radiation. The stored images are subsequently extracted by laser excitation, which causes the phosphor to luminesce. During this process, the image data are collected, reconstructed, and displayed on a video workstation. The phosphors can be re-used thousands of times with no degradation of image quality. Imaging plate technology is rapidly becoming the leading replacement for conventional film applications.

The benefits of an industrial radiographic process without the costs associated with film, photographic chemistry, environmental compliance, and film storage are further enhanced by the ability to achieve the required results in less time, at lower radiation doses, with digital image archiving and networking. Image interpreters can now share information and make decisions without needing film hard copies or having to travel.

Of course, one of the key ingredients of any digital imaging device is the software. Powerful algorithms have enabled computed radiography to rival film in its ability to image very fine flaws. Software also allows for images to be enhanced and magnified, facilitating flaw conspicuity, and assisting the radiographic interpreter in a very demanding, tedious task.

ASTM Standards Activity

E07.01 has developed standards for the use of this technology, which include a standard guide and a standard practice. Current activities also include the development of digital image files of reference radiographic images, which provide visual inspection standards of flaws to the nondestructive testing community.

Part 4: CT Scans

Computed tomography scans were originally inspired by the medical community’s need to examine patients internally without surgery. CT is a method that uses a computer to reconstruct an image of a cross-sectional plane (slice) through an object from multiple radiographic data sets taken at that slice plane. The fundamental difference between CT and conventional radiography is shown in Figure 1. If an internal feature is detected in conventional projection radiography, its position along the line of sight between the source and the film is unknown. Somewhat better positional information can be determined by making additional radiographs from several viewing angles and triangulating. This triangulation is a rudimentary, manual form of tomographic reconstruction. In essence, a CT image is the result of triangulating every point in the plane from many different directions.
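The idea of triangulating every point from many directions can be illustrated with a toy unfiltered backprojection. The grid, feature value, and two viewing angles below are invented for illustration; real CT reconstruction uses many angles and a filtering step.

```python
# A tiny 4x4 "slice" with a dense feature (value 5.0) at row 1, column 2.
slice_true = [[0.0] * 4 for _ in range(4)]
slice_true[1][2] = 5.0

# Two parallel projections: ray sums along rows (0 deg) and columns (90 deg).
proj_rows = [sum(row) for row in slice_true]
proj_cols = [sum(col) for col in zip(*slice_true)]

# Unfiltered backprojection: smear each projection back across the grid
# and sum. Two angles merely illustrate the triangulation idea; they are
# far too few for a faithful reconstruction.
backprojection = [[proj_rows[r] + proj_cols[c] for c in range(4)]
                  for r in range(4)]

# The reconstruction peaks where the smears intersect: the feature's
# true position.
peak = max(((r, c) for r in range(4) for c in range(4)),
           key=lambda rc: backprojection[rc[0]][rc[1]])
print(peak)  # (1, 2)
```

Each projection alone only constrains the feature to a line; combining projections from different directions pins down its position in the plane, which is exactly the triangulation described above.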

CT image values are proportional to the density of the material under examination. This is one of the principal virtues of the technology and the reason that image data are often thought of as representing the distribution of material density within the object being inspected. This made CT of particular interest to industry: it provides a method of examining a component by imaging thin slices of that component, all without cutting into the part.

CT Scanners

There are currently five types of CT scanners commercially available. These scanners are identified by the mechanical configuration that provides the relative motion between the test article, the source, and the detectors. It makes no difference, at least in principle, whether the test object is moved systematically relative to the source and detectors, or the source and detectors are moved relative to the test object. Physical considerations such as the weight or size of the test article are the primary factors in determining the most appropriate motion and, in turn, the system selected.

ASTM Standards Activity

Subcommittee E07.01 has developed standards for CT over the last 15 years. The standards available include a guide for CT that is an extensive tutorial on the subject and a practice for CT examination, which provides users with information on applying the CT technique in practice. There are also standards on calibrating density measurements, evaluating the resolution of a CT system, and even a practice on how a new user should select a CT scanner for a specific inspection application.

Copyright 2003, ASTM

Claudia V. Kropas-Hughes (all articles) is the nondestructive examination research leader for the Air Force at Wright-Patterson Air Force Base, West Carrollton, Ohio. She has been a member of ASTM Committee E07 for over 10 years, specializing in radiology.

Cliff Bueno (part two) is a staff scientist at GE’s Global Research Center in Niskayuna, N.Y., where he helps to develop new digital radiography systems and technology. Prior to working at GE, Cliff was a nondestructive testing scientist at Lockheed Martin’s Advanced Research Laboratory in Palo Alto, Calif.

Bill Meade (part one) is a level III nondestructive examination specialist with Boeing Commercial Aircraft in Seattle, Wash. He is a member of ASTM Committee E07 and has played a lead role in the creation of ASTM standards for both film and non-film radiographic methods.

Fred Morro (part three) is the director of Digital Radiography Products for nondestructive testing applications for Fujifilm NDT Systems. He has been active in industrial radiography for 35 years. He is a graduate of the University of New Haven and chairs the committee on Digital Radiography Digital Reference Images.