Published: Jan 1982
The history of the early development and growth of applications for high-alloy iron-chromium-nickel castings over the past 75 years is reviewed. Following separate paths to the now-standard alloy types, the heat-resistant casting alloys evolved from the 80Ni-20Cr electrical-resistance alloy, whereas the corrosion-resistant grades grew out of the 12 percent chromium cutlery steel. Increasing demand from the automobile and chemical industries following World War I spurred much experimentation with various iron-chromium-nickel combinations, leading to many “proprietary” alloys. First steps toward eliminating the confusion of grades were taken during the Depression of the 1930s by the National Recovery Administration, which established a code for the alloy foundries. Eight ASTM specifications covering several of the corrosion- and heat-resistant grades were adopted in 1935, reduced to four in 1939, and replaced by two in 1946. In 1941 the Alloy Casting Institute, then conducting a research program defining the influence of variations in chemical composition on baseline alloy properties, developed a new nomenclature system for the most widely used alloy types to distinguish them from similar wrought alloys. This system and its adaptability to the designation of new alloys are described. ASTM adopted the system in 1953 for the specifications issued in 1946, and it is still in use. The relation of these castings to the industrial production level of the United States in the past is discussed, as well as the outlook for future demands. Continued research will be necessary to meet these demands.
Keywords: castings, high alloy, heat resistant, corrosion resistant, designations, symbols, production, history, iron-chromium-nickel
Consultant, Garden City, N.Y.
Paper ID: STP28432S