The corrosiveness of various residential thermal insulation materials was tested under simulated field conditions in a test wall structure. The test was conducted under controlled conditions typical of winter, with no vapor barrier present, to create relatively severe moisture transport and possible condensation. The house-wall simulation was achieved by constructing a test panel containing 50 compartments into which the various insulation materials were installed; the panel was located in an environmental chamber. The test samples included several cellulosic, glass-fiber, and rockwool insulations, as well as sterile cotton as a control. Steel and copper coupons, together with water-cooled copper pipes, were embedded in the insulation and exposed for six months. Moisture absorption by the insulation was found to be the primary factor causing corrosion, but corrosion also required chemically active components in the insulation. No corrosion occurred in the absence of insulation or in the rockwool and glass-fiber insulations. All cellulose insulations caused some corrosion; in most cases it was minimal, but in a few cases severe pitting resulted. This behavior of the cellulose did not correspond to previous laboratory test results obtained with saturated insulation or with leachants made from the insulation. However, laboratory testing of leachants made from some of the cellulose insulations after the simulated wall test showed a change in pitting tendency, suggesting that time and/or exposure to moisture can alter corrosiveness. This should be explored further.