Introduction
One of the major concerns for engineers in seismically active regions is the prevention of damage caused by earthquake-induced soil liquefaction. Shear waves propagating through loose, saturated soils cause contractive volumetric strains which, in the absence of rapid drainage, generate positive excess pore pressures and reduce effective stress. Liquefaction countermeasures are based either on changing soil properties (by densifying, grouting or replacing the soil) or on controlling soil behaviour (by installations such as drains or walls). Catalogues of the field performance of remediation techniques have been analysed by, amongst others, Mitchell et al. (1995) and Hausler and Sitar (2001). The latter authors conclude that “while there is a growing database of observed field performance, it is still limited by the fact that large earthquakes are infrequent and, unfortunately, the quality of the data from many of the sites is marginal”. Centrifuge testing is therefore a logical method of performing controlled experiments in this problem area to establish the efficacy of a given liquefaction remediation technique. In this paper the performance of drains in relieving excess pore pressures following soil liquefaction is investigated.
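The mechanism described above can be illustrated numerically. The sketch below is not taken from the paper: it simply applies Terzaghi's effective stress principle and the excess pore pressure ratio r_u = Δu/σ′_v0 (which approaches 1 as a soil element approaches initial liquefaction) to a hypothetical element of saturated sand.

```python
# Illustrative sketch (hypothetical values, not from the paper):
# Terzaghi effective stress and the excess pore pressure ratio r_u,
# a common measure of how close a soil element is to liquefaction.
# Units: kPa, depths in m.

def effective_stress(total_stress, pore_pressure):
    """Terzaghi's principle: sigma' = sigma - u."""
    return total_stress - pore_pressure

def excess_pore_pressure_ratio(excess_u, initial_effective_stress):
    """r_u = excess pore pressure / initial vertical effective stress.
    r_u -> 1 indicates initial liquefaction (effective stress -> 0)."""
    return excess_u / initial_effective_stress

# Hypothetical element 5 m deep in saturated sand:
sigma_v = 5 * 19.0           # total vertical stress (unit weight ~19 kN/m^3)
u_static = 5 * 9.81          # hydrostatic pore pressure
sigma_v0_eff = effective_stress(sigma_v, u_static)   # ~46 kPa

# Shaking generates excess pore pressure faster than it can drain:
delta_u = 40.0
print(round(excess_pore_pressure_ratio(delta_u, sigma_v0_eff), 2))  # 0.87
```

Drains aim to keep r_u low by shortening the drainage path, so that excess pore pressure dissipates during, rather than long after, shaking.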
Charts for the selection of drain spacing for vertical drains as liquefaction remediation were first produced by Seed and Booker (1977), although the discussion by Pickering (1978) implies that engineers had already implemented the method. Two assumptions stand out from their analysis: that vertical dissipation could be ignored, and that the permeability of the drain relative to the soil could be treated as infinite. The latter assumption has since been shown to be unconservative. Real drains have finite permeability, and the charts produced by Iai et al. (1988), Matsubara et al. (1988) and Onoue (1988) showed that the drain permeability must be several orders of magnitude higher than that of the surrounding sand for the assumption to hold. In producing these charts the authors necessarily considered vertical seepage, and therefore circumvented the need for the first assumption. Despite these improvements, the case study by Kerwin and Stone (1997) suggests that the original Seed and Booker (1977) charts are still in use.
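For reference, the purely radial dissipation analysis underlying the Seed and Booker (1977) charts is commonly written in a form such as the following (the notation here is assumed for illustration and is not taken from this paper):

```latex
\frac{k_h}{\gamma_w m_v}
\left( \frac{\partial^2 u}{\partial r^2} + \frac{1}{r}\frac{\partial u}{\partial r} \right)
= \frac{\partial u}{\partial t} - \frac{\partial u_g}{\partial N}\frac{\partial N}{\partial t}
```

where $u$ is the excess pore pressure, $k_h$ the horizontal soil permeability, $\gamma_w$ the unit weight of water, $m_v$ the coefficient of volume compressibility, and $\partial u_g / \partial N$ the rate of seismic pore pressure generation per loading cycle $N$. The two assumptions noted above appear here directly: there is no vertical ($z$) dissipation term, and the drain boundary at $r = r_d$ is treated as fully drained ($u = 0$), i.e. the drain permeability is effectively infinite.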
The charts of Iai et al. (1988) and Matsubara et al. (1988) were experimentally validated using a model “unit cell”: each drain is assumed to be responsible only for the cylindrical soil volume immediately surrounding it, with adjacent drains draining the soil beyond. It is unclear how predictable drain performance is outside this unit cell concept, for example if the drain group contains a small number of drains, or for drains at the edge of a group that are relied on for protection. There is also discussion in the literature (Boulanger et al., 1998; Papadimitriou et al., 2007) about the reliability of design chart methods, given the simplicity of some of their assumptions and their applicability to field situations. These issues provide the motivation for the research presented here.
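The unit cell idealisation assigns each drain an equivalent soil cylinder whose cross-sectional area matches the tributary area of the drain grid. The sketch below (an illustration based on standard vertical drain practice, not on this paper) converts a grid spacing into the equivalent unit cell diameter:

```python
import math

# Sketch (standard drain-design geometry, not from the paper): the unit
# cell around a drain is the circle of equal area to the tributary area
# of one drain in the installation grid.

def equivalent_diameter(spacing, pattern="square"):
    """Equal-area unit cell diameter for a drain grid of given spacing.

    square grid:     d_e = sqrt(4/pi) * s          ~ 1.128 s
    triangular grid: d_e = sqrt(2*sqrt(3)/pi) * s  ~ 1.050 s
    """
    if pattern == "square":
        return math.sqrt(4.0 / math.pi) * spacing
    if pattern == "triangular":
        return math.sqrt(2.0 * math.sqrt(3.0) / math.pi) * spacing
    raise ValueError(f"unknown pattern: {pattern}")

# Hypothetical 1.5 m spacing:
print(round(equivalent_diameter(1.5, "square"), 3))      # 1.693
print(round(equivalent_diameter(1.5, "triangular"), 3))  # 1.575
```

The concept breaks down exactly where the questions above arise: a drain at the edge of a small group has no neighbouring drains to close its unit cell, so its tributary volume is larger and less well defined than this geometry assumes.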
The aim of this research is therefore to identify the effect on drain performance of the catchment area each drain serves. Centrifuge testing is used to collect data, which are then used to explain drain operation and to determine the amount of fluid conducted by individual drains. Two design charts are then evaluated. This leads to recommendations about the situations in which drains are appropriate, and to guidance for the effective planning of drain groups.