Dust explosibility assessment in the United Kingdom is in a transitional period as traditional test methods based on the Hartmann apparatus are superseded by methods in line with European codes of practice.
The minimum explosible concentration is measured in the open Hartmann tube by a test method that has severe drawbacks. Recent years have seen the use of a 15-L apparatus and the 20-L sphere as means of generating a dust cloud more representative of the industrial scale. The sensitivity of a dust cloud to ignition in relation to electrostatic hazards is represented by its minimum ignition energy. The corresponding test method is the subject of controversy in terms of the spark-generating mechanism, and various methods may be adopted to simulate a typical “industrial” spark.
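Whatever spark-generating circuit is adopted, the nominal energy of a capacitive discharge is conventionally rated by the stored-energy relation E = ½CV². A minimal sketch of that arithmetic follows; the function name and the capacitance/voltage values are illustrative, not taken from any particular test standard.

```python
def spark_energy_mj(capacitance_pf: float, voltage_kv: float) -> float:
    """Nominal stored energy of a capacitive spark, E = 1/2 * C * V^2.

    capacitance_pf: circuit capacitance in picofarads
    voltage_kv:     charging voltage in kilovolts
    Returns energy in millijoules, the unit in which minimum
    ignition energies of dust clouds are usually quoted.
    """
    c = capacitance_pf * 1e-12   # pF -> F
    v = voltage_kv * 1e3         # kV -> V
    energy_j = 0.5 * c * v * v   # joules
    return energy_j * 1e3        # J -> mJ

# Illustrative example: 100 pF charged to 10 kV stores 5 mJ.
print(spark_energy_mj(100, 10))  # -> 5.0
```

Note that this is only the energy stored in the capacitor; the controversy referred to above concerns how much of it is actually delivered to the spark gap, and in what waveform, which is why different circuits give different apparent minimum ignition energies for the same dust.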
Minimum ignition temperatures of dust clouds are measured in the Godbert-Greenwald furnace. However, this apparatus has the disadvantage of not having an independent ignition source to enable minimum explosible concentration or ignition energy measurements to be made at elevated temperatures. A 1.2-L furnace based on a U.S. Bureau of Mines design partly overcomes this problem.
The determination of explosion pressures and rates of pressure rise is essential for the optimum design of explosion protection systems. Methods for measuring these parameters have undergone radical change in recent years, and although the Hartmann bomb apparatus is still accepted in U.K. industry, its use is nowadays not recommended; the 1-m³ or 20-L sphere vessels are preferred.
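The basis on which data from the 1-m³ and 20-L vessels are held to be comparable, and transferable to full-scale plant, is the widely used cube-root scaling law, which normalises the maximum rate of pressure rise by vessel volume to give a nominally volume-independent explosibility index:

\[
K_{St} = \left(\frac{\mathrm{d}P}{\mathrm{d}t}\right)_{\max} \cdot V^{1/3}
\]

where \(V\) is the test vessel volume. The small Hartmann bomb does not yield \(\left(\mathrm{d}P/\mathrm{d}t\right)_{\max}\) values consistent with this law, which is one reason its use for vent sizing is no longer recommended.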
The lack of a dedicated British Standard on dust explosibility testing is one reason for the apparently confused attitude toward the subject in many parts of industry. The use of a single vessel, such as the 20-L sphere, to measure all dust explosibility parameters (except minimum ignition temperature) is strongly advocated: it gives data that can be reliably applied to full-scale situations, which the small Hartmann tube cannot be expected to provide.