Problems of partial or corona discharges in cables were recognized by a number of electrical engineering pioneers. As early as 1898, Fessenden performed experiments that showed the danger of air bubbles in solid insulation. Perrine reported in 1902 that the failure of cable insulation was sometimes due to the presence of spaces filled with rarefied gases. In 1912, Petersen called attention to the fact that air films in a dielectric of specific inductive capacity, or dielectric constant, k, are subjected to a stress k times that in the surrounding medium, and ionization may therefore occur in them at comparatively low voltages. He also stated that ions are shot from these films into the surrounding medium. Dubsky reported in 1919 that he had measured the dielectric strength of thin air films between glass plates. He then applied these data theoretically to assumed gas spaces in solid dielectrics and showed the conditions under which partial discharge was likely to occur. Shanklin and Matson, also in 1919, reported that they had measured the ionization voltage in actual insulation designs by the dielectric loss method. In the case of paper cables, evidence was given showing that a true ionization occurs. However, the exact nature of this ionization, its position, and the possibilities of serious damage were not shown. Many of the studies into the nature of partial or corona discharges began with the application of the cathode-ray tube in the 1930s.

The purpose of this chapter is to briefly describe the early experiences of partial discharge problems in cables and the subsequent development of methods of detection, then to discuss the techniques used to standardize partial discharge testing and the measurement limitations, and finally to provide guidance for the interpretation of present-day measurements in cable systems.
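Petersen's observation about the stress in an air film can be illustrated with a short sketch. It follows from the continuity of the normal electric displacement D = εE across the film boundary, so a thin gas film normal to the field sees roughly k times the field in the solid. The numerical values below (a working stress in the solid, a permittivity for impregnated paper, and a rough breakdown field for air) are illustrative assumptions, not figures from the text.

```python
def gas_film_stress(e_solid_kv_per_mm: float, k: float) -> float:
    """Field in a thin gas film normal to the applied field (kV/mm),
    assuming the normal displacement D = epsilon * E is continuous
    across the solid/gas boundary."""
    return k * e_solid_kv_per_mm

# Assumed illustrative values (not from the original text):
e_solid = 1.0   # kV/mm, a modest working stress in the solid dielectric
k_paper = 3.5   # approximate dielectric constant of impregnated paper
e_air_breakdown = 3.0  # kV/mm, rough breakdown strength of air at 1 atm

e_gas = gas_film_stress(e_solid, k_paper)
print(f"stress in gas film: {e_gas:.1f} kV/mm")
print("exceeds approximate air breakdown:", e_gas > e_air_breakdown)
```

Under these assumptions the film is stressed well beyond the breakdown strength of air even though the solid itself is lightly stressed, which is why ionization can occur at comparatively low voltages.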
Philadelphia Electric Company, Philadelphia, Pa.