
Pathogenic Contributions from OSWS
Humans who are infected with or are carriers of disease discharge pathogenic organisms in wastewater. Pathogenic organisms can be bacteria, parasites such as protozoa and helminths, or viruses. Pathogenic bacteria of human origin cause diseases of the gastrointestinal tract such as typhoid fever, dysentery, diarrhea, and cholera. Table 1 shows the typical concentrations found in septic tank effluent and untreated wastewater and the corresponding infectious doses (Crites and Tchobanoglous, 1998).
Protozoa such as Cryptosporidium parvum and Giardia lamblia can cause severe diarrhea, stomach cramps, nausea, and vomiting; these symptoms can be of great concern in people with compromised immune systems. Helminthic parasites (adults or eggs) can also infect humans, and some of these parasites are resistant to environmental stresses and can survive usual wastewater disinfection procedures. Enteric viruses can be present in the fecal matter of infected humans. The most important ones are the enteroviruses (e.g., poliovirus), Norwalk viruses, rotaviruses, caliciviruses, and hepatitis A virus.
Fecal coliform are usually used as indicator organisms of pathogenic water contamination because they are easy to test for and are more numerous than the pathogens themselves. However, their presence may not directly indicate that other pathogens are present. A study conducted by Sobsey and Scandura (1981) at various septic system sites on the coast of North Carolina found that fecal coliform movement through the soil differed significantly from viral movement. Viruses were found at distances up to 35 m from the drainfield where no fecal coliform were detected. In addition, fecal coliform counts did not coincide with virus counts at closer distances. However, the presence of coliform bacteria always coincided with the presence of virus in the soil.
Several processes can act to remove pathogens from septic effluent once it is applied to the soil. Pathogens can be retained in the soil by entrapment or filtering, soil adsorption, and natural die-off. Pathogen filtering is enhanced as soil pore size decreases. Helminths and other protozoa (12 µm - 400 µm) are the largest and are comparable in size to sand particles (20 µm - 2000 µm). Protozoa and some bacteria (10 µm - 100 µm) are similar in size to silt particles (6 µm - 60 µm). Generally, bacteria (0.2 µm - 5 µm) are the size of fine silt and coarse clay particles (2 µm - 6 µm). Finally, viruses (0.02 µm - 0.25 µm) are the size of very fine clay particles (0.01 µm - 0.1 µm). Peterson and Ward (1989) modeled pathogen filtering and found that it occurred mainly for the large pathogens (protozoa and helminths), which are trapped in the soil pore spaces. Bacterial filtering also occurs, but to a lesser extent, and where bacteria fail to be sieved by the soil, so do viruses.
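As a rough illustration of the straining argument above, the sketch below compares the size ranges quoted in this paragraph against a given soil pore diameter. The classification rule and the function name are hypothetical simplifications for illustration, not a model from the cited studies:

```python
# Representative pathogen size ranges (min, max diameters in micrometers),
# taken from the text above.
PATHOGEN_SIZES_UM = {
    "helminths/protozoa": (12.0, 400.0),
    "bacteria": (0.2, 5.0),
    "viruses": (0.02, 0.25),
}

def likely_filtered(pathogen: str, pore_diameter_um: float) -> str:
    """Crude straining estimate (assumption: a particle is strained only
    when its diameter exceeds the pore diameter):
    'yes'     - even the smallest members of the class are strained,
    'partial' - only the largest members are strained,
    'no'      - the whole class passes through the pore."""
    lo, hi = PATHOGEN_SIZES_UM[pathogen]
    if lo > pore_diameter_um:
        return "yes"
    if hi > pore_diameter_um:
        return "partial"
    return "no"

# Coarser pores (sand-like) strain little; finer pores (clay-like) strain more.
for pore in (100.0, 10.0, 1.0):
    print(f"pore {pore} um:",
          {p: likely_filtered(p, pore) for p in PATHOGEN_SIZES_UM})
```

Consistent with Peterson and Ward's observation, only the largest organisms are strained by sand-sized pores, while viruses pass through nearly everything but the finest clay pores.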
Adsorption of microorganisms in the soil is maximized when conditions such as uniform effluent distribution, development of a surface clogging mat, well-drained soils, and moisture deficits are present in a septic tank system (Reneau et al., 1989). These conditions are typical of unsaturated flow. Studies such as Reneau et al. (1989) and Pekdeger and Matthess (1983) proposed that adsorption is driven by cation bridging between negatively charged microorganisms and soil colloids, which would otherwise repel each other. Soil column studies such as Drewry and Eliassen (1968) and Carlson et al. (1968) confirmed that high ionic strength in soils enhanced virus adsorption. Thus, the ionic strength of the water is an important feature that can maximize pathogen adsorption onto soil particles.
On the other hand, pathogens can desorb when lower ionic strength waters, such as rainfall, come into contact with soils of high ionic strength (Reneau et al., 1989). Scandura and Sobsey (1981) performed laboratory experiments on viral movement through soil columns. They observed that simulated rain (distilled water) applied to soil columns resulted in more viral movement along the column. Sandy soils showed high viral elution concentrations (~10,000 MPN/100 ml - 100,000 MPN/100 ml), similar to the concentrations of the applied septic tank effluent (8000 MPN/100 ml - 10,000 MPN/100 ml). However, Sobsey et al. (1980) reported that soils with a clay content of 30% did not present viral movement or elution. Thus, lower ionic strength waters can drive pathogen movement through soils, depending on soil structural characteristics.
Aerobic conditions are unfavorable to septic bacteria and viruses and promote the survival of aerobic soil bacteria. Eventually the larger pathogens die off as they are subject to predation by aerobic soil bacteria. When anaerobic conditions occur, survival shifts in favor of the septic anaerobes. Survival competition was suggested by studies such as Sobsey et al. (1980), which found that for eight different types of soils, virus survival was longer in sterile distilled water-soil suspensions than in the presence of aerobic bacteria. Other studies, such as Bagdasaryan (1964), Romero (1970), and Hurst et al. (1980), confirm this finding.
Soil structure can influence pathogen movement through soil pores, and clays can be more effective than sands in reducing pathogens. Crane and Moore (1983) reported that under unsaturated flow conditions bacterial populations could be reduced by up to 95% within the first 1-5 cm of soil, and also noted that clays were more effective than sand in bacterial removal. Sobsey et al. (1980) found that clays achieved 99.99% reduction of viruses while sandy and organic soils achieved only 95 to 99.8% reductions.
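The percent reductions quoted here (and in the removal figures later in this section) can be converted to the log10 reductions commonly used in treatment literature. A minimal sketch of the arithmetic, with illustrative function names:

```python
import math

def log10_reduction(percent_removal: float) -> float:
    """Convert a percent removal (e.g. 99.99) to a log10 reduction."""
    return -math.log10(1.0 - percent_removal / 100.0)

def percent_removal(log_reduction: float) -> float:
    """Convert a log10 reduction back to a percent removal."""
    return 100.0 * (1.0 - 10.0 ** (-log_reduction))

# Figures cited above:
print(log10_reduction(99.99))  # clays (Sobsey et al., 1980): 4-log removal
print(log10_reduction(95.0))   # ~1.3-log removal
print(log10_reduction(99.8))   # ~2.7-log removal
```

The comparison makes clear that a 99.99% reduction is not marginally better than 95%; it is roughly a thousandfold lower surviving virus concentration.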
Soil moisture content and temperature can influence pathogen survival in the soil. Reneau et al. (1989) reviewed studies such as Gerba et al. (1975) and Hurst et al. (1980), which found that under controlled laboratory conditions moist soil and low temperatures favored bacterial and viral survival. In addition, Sobsey and Scandura (1981) found that winter temperatures and water table conditions favored the survival of viruses in sands and sandy loams. Viruses were detected up to 59 days after inoculation during winter compared to 41 days during the summer.
Several water quality surveys have assessed the occurrence of groundwater and surface water contamination with pathogens. These surveys pointed to OSWS as the likely cause of contamination; however, many of them did not actually evaluate the extent of the contamination that originated from OSWS. Bicki et al. (1984) summarized some of these water quality surveys, such as Woodward et al. (1961) and Davis and Stephenson (1970), among others. Woodward et al. (1961) reported on groundwater contamination from OSWS in 39 communities around the Minneapolis-St. Paul, Minnesota area. The soils were described as till, sand, gravel, or fractured, jointed, solution-riddled limestone. Groundwater wells showed elevated NO3- concentrations (11% of surveyed wells had concentrations greater than 10 mg/l NO3-N) and pathogenic contamination. Davis and Stephenson (1970) indicated that 51% of 194 private wells surveyed in Bartow County, Georgia were contaminated with bacteria, with OSWS the likely cause.
Bowers (1980) performed a study in Henry County, Indiana, reporting that streams and ditches were contaminated with fecal bacteria. Bacterial counts were as high as 3.9 × 10^6 MPN/100 ml and as low as 10 MPN/100 ml, with an average count of 564,000 ± 300,000 MPN/100 ml. A large percentage (78%) of the soils in this county had low permeability rates, presented ponding conditions, and were underlain by glacial till at a depth of 36 inches. This study summarized reports of several infectious diseases such as diarrhea, hepatitis, infectious hepatitis, viral meningitis, and encephalitis. The author argued that prior to 1978 no permits for OSWS had been denied and that permits were issued after the systems were already built in place. However, contributions of OSWS to surface or subsurface waters were not identified.
Other studies available in the literature did assess the contributions from OSWS to pollution of surface water and groundwater. Some of these studies (U.S. EPA, 1975; Brandes, 1972; Wilson, 1982) were reported by Bicki et al. (1984). The U.S. EPA (1975) reported on studies performed in Florida and North Carolina. These studies showed high concentrations of fecal coliform in surface waters at Punta Gorda and Big Pine Key, Florida, and at Atlantic Beach and Surf City, North Carolina. These sites comprised dense housing developments with OSWS built in close proximity to surface waters. Effluent from OSWS reached surface waters, as evidenced by dye tracer studies that followed the path of Rhodamine WT dye from house drains to septic tanks and detected the dye in the nearby canal. At Punta Gorda the detection time was 25 hours, while at Big Pine Key it took 110 to 150 hours for the dye to reach the canal. At Atlantic Beach and Surf City the dye was detected after 4 hours and 60 hours, respectively. The background coliform concentrations at Punta Gorda and Big Pine Key were 203 MPN/100 ml and 10 MPN/100 ml, respectively. However, canal concentrations at Punta Gorda ranged from 436 to 871 MPN/100 ml in a residentially undeveloped canal and from 176 to 1809 MPN/100 ml in a developed canal. At Big Pine Key the coliform concentrations in an undeveloped canal were not different from the background, but in a developed canal the concentrations ranged from 14 to 32 MPN/100 ml. At Atlantic Beach the mean surface water concentrations were 3400, 400, and 360 MPN/100 ml at the end, middle, and mouth of a canal with high residential density. At Surf City a septic plume was traced with dye, and concentrations greater than 2.4 × 10^6 MPN/100 ml were found at the mouth of a canal.
Brandes (1972) reported on OSWS located 27 to 53 ft from Lake Chemong, Ontario, Canada. Soils were sandy loam and silt loam fill materials with stones and boulders, and the water table depth was 5 to 7 ft. The effluent from the OSWS contained average concentrations of 8 × 10^6 and 4.7 × 10^6 MPN/100 ml for total and fecal coliform, respectively. The concentrations reported in the groundwater at 5 ft from the drainfield were 8 × 10^6 and 2.4 × 10^6 MPN/100 ml for total and fecal coliform, respectively. At distances of 22 and 34 ft from the drainfields the fecal coliform concentrations dropped to 1500 and 100 MPN/100 ml, respectively.
Wilson (1982) monitored the effluent flow from eight systems located on moderately well to somewhat poorly drained soils. Artificial drainage was practiced in these soils using a perimeter drainage system. Tile drains located 20 ft from the drainfield were placed 6 ft below the soil surface. This system effectively lowered the water table to 2.5 ft below the drainfield. Discharge from the tile outlet had a wide range of total coliform concentrations, from 470 to 2380 MPN/100 ml, with an average of 1468 MPN/100 ml. The average fecal coliform concentration was 202 MPN/100 ml, with a range of 47 to 484 MPN/100 ml.
Kerfoot and Skinner (1981) used a septic leachate detector on the shoreline of Crystal Lake, Benzie Co., Michigan. The soils were drained sands to loamy sands on outwash plains and till plains. This study detected three types of plumes originating in OSWS: erupting plumes, dormant plumes, and surface water plumes. The plumes were detected using an instrument sensitive to ultraviolet (UV) fluorescent organics derived from surfactants, softeners, and natural degradation products that persist in low-oxygen conditions. Kerfoot and Skinner (1981) defined erupting plumes as subsurface discharges of effluent with a higher dissolved solids content than the background, dormant plumes as subsurface seepage of effluent with no apparent difference in dissolved solids from the background, and surface water plumes as discharges of effluent that could be traced back to the shoreline. This study found high densities of erupting plumes on both the northeastern and southeastern shores of the lake. The west shore had no plumes, which coincided with the fact that many housing units on this shore were located far from the shoreline and the groundwater flow direction ran from the lake toward the shoreline. This reversed groundwater flow on the west shore was explained as groundwater movement from Crystal Lake to Lake Michigan, which lies 6 meters lower in altitude. The east bank was devoid of plumes, which matched the presence of a sewered town. Measurements of groundwater flow showed that dormant plumes coincided with reversed flow (from the lake into the shoreline). Coliform content ranged from 100 to 9300 MPN/100 ml for total coliform and from 10 to 120 MPN/100 ml for fecal coliform.
Sobsey and Scandura (1981) conducted a study on OSWS located in Craven Co. and New Hanover Co., North Carolina. The sites in Craven Co. had sandy soils varying to clays and loams. These sites had very low organic matter content, varying from <0.1 to 5.4%. The soils were acidic, with pH varying from 4.1 to 6.5, and had low ionic content, varying from 1.2 to 8.5 meq/100 cc. Depth to the water table from the soil surface averaged 0.7 to 4.8 ft. Fecal coliform content of the septic plume varied from <2 to 3218 MPN/100 ml in wells placed 5 ft from the drainfield, and from <2 to 1600 MPN/100 ml in wells placed 50 ft from the drainfield. Fecal coliform concentrations reported during the winter and spring experiments were markedly lower than those reported in the summer. Sobsey and Scandura (1981) observed that groundwater movement was generally in one distinct direction.
Sobsey and Scandura (1981) also monitored the groundwater wells for specific viruses (BE-1 and E-1) chosen as tracers. The study found that detection of fecal coliform was not a good indicator of the presence of viruses, especially at long distances from the drainfield. Only a weak correlation (r = 0.645) was found between virus concentrations and fecal coliform concentrations in the well water, and the authors considered this correlation inconclusive due to significant differences in sample population and timing of events. Similarly, a weak correlation (r = 0.667) was found between virus concentrations in the wells and the distances from the wells to the drainfield. In addition, most of the groundwater samples containing viruses were from wells located 5 ft from the drainfield. Sobsey and Scandura (1981) also found that the relationship between rainfall occurrence and virus movement in these systems was unclear. There was no significant correlation between the appearance or non-appearance of viruses in the wells after a rainfall event, nor between the intensity of a rainfall event and the observed virus concentrations. Similar observations were made for coliform organisms. Sobsey and Scandura (1981) and Scandura and Sobsey (1997) reported a direct correlation (r = 0.918) between the frequency of virus isolations in the groundwater wells and the groundwater pH values. The authors explained this result as septic plume effluent mixing with acidic groundwater, which allowed viruses to survive longer, especially under saturated conditions. In addition, the authors concluded that high pH values in the groundwater of typically acid soils can indicate extensive contamination by septic effluent.
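Correlation coefficients like those reported by Sobsey and Scandura come from the standard Pearson product-moment formula applied to paired well samples. A minimal sketch with hypothetical (invented) data, shown only to illustrate the computation, not to reproduce the study's results:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two equal-length
    sequences: covariance divided by the product of standard deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical paired well samples:
# virus counts (pfu/100 ml) and fecal coliform counts (MPN/100 ml)
virus = [0, 2, 5, 1, 8, 0, 3]
coliform = [4, 50, 300, 2, 900, 120, 60]
print(round(pearson_r(virus, coliform), 3))
```

An r near 0.6, as in the study, means the two indicators track each other only loosely, which is why the authors treated coliform counts as an unreliable proxy for virus presence.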
Hagedorn et al. (1978) used antibiotic-resistant bacteria to monitor the degree of movement and subsequent groundwater contamination by an OSWS drainfield under saturated conditions. Antibiotic-resistant strains were used in order to differentiate fecal coliform originating in OSWS from other fecal bacteria and to determine flow movement rates. Bacteria were inoculated at concentrations ranging from 3 × 10^8 to 5 × 10^8 MPN/ml into deep pits constructed to simulate drainfields. The soils were described as silt loam varying to clay loam, with pH levels from 5.4 to 6.0, organic matter content from 0.6 to 5.2%, and cation content from 11.2 to 35.1 meq/100 g. The bacteria were monitored for 32 days using wells installed in concentric rings at 50 cm and 100 cm from the inoculation point. Diffusion of the antibiotic-resistant bacteria occurred in all directions for the first 50 cm from the inoculation point, with the northeastern and eastern directions determined to be the preferential directions. A more complex network of wells was installed to follow the preferential directions, with samples taken at several distances (3, 5, 15, and 30 m) from the drainfield at 50 cm depth from the ground surface. During the first day after inoculation, bacteria traveled 3 to 5 m at both sites. The authors observed that rainfall events helped the bacterial population move downgradient as a front or pulse through the soil. The pulse traveled through the groundwater wells, and its peak concentration attenuated along its path due to groundwater dilution and soil filtration of bacteria. The authors concluded that these bacteria survived in appreciable numbers throughout the 32-day sampling period.
Rahe et al. (1978) investigated bacterial movement in the event that a drainfield becomes submerged in a perched water table. This study inoculated three distinct bacterial strains, at a concentration of 1.4 × 10^9 MPN/ml, into three horizontal drains installed in the A, B, and C horizons at two sites located on a hillslope in western Benton Co., Oregon. The soils at the first site were described as silt loam varying to massive clay. Soils at the second site were silty clay loam overlying fractured saprolite located at a depth of 2 ft. At the first site, soil pH varied from 5.6 to 6.1, organic matter content from 0.9% to 4.8%, and cation content from 12.4 to 20.85 meq/100 g. At the second site, soil pH varied from 4.7 to 5.2, organic matter content from 3% to 4.3%, and cation content from 17.5 to 52.63 meq/100 g. Monitoring wells were placed at depths varying from 0.4 ft to 6.5 ft and at distances downgradient of the drainfield varying from 8.2 ft to 50 ft. Artificial water tables were maintained and monitored by applying water with sprinklers. Background concentrations were negligible. The second site showed rapid movement of bacteria, with little to no dilution or diffusion observed. Movement was observed in great quantities at depths between 0.4 ft and 6.5 ft and at distances from the drainfield of 16.5 ft to 50 ft. The study reported concentrations varying from 0 to 10^4 MPN/ml at 2 hours, 4 hours, and 12 hours after inoculation. Downgradient distances presented concentrations up to 10^4 MPN/ml, especially at times beyond 12 hours after inoculation.
Results at the first site showed a much slower movement rate, which the authors attributed to a lower hydraulic gradient and hydraulic conductivity. Lower numbers of bacteria were recovered due to greater soil filtration. Bacterial counts varied from 0 to 10^3 MPN/ml after 12 hours; these numbers were obtained from wells located up to 16.5 ft downgradient and within 1 ft of the ground surface. Bacterial counts up to 10^2 MPN/ml were observed after 48 and 72 hours in downgradient wells, but only within the first 2.6 ft of soil. After 56 hours, only the wells located at 16.5 ft from the drainfield showed bacterial counts up to 10^2 MPN/ml in the first 2.6 ft of soil. The authors concluded that macropores played a very important role in bacterial movement in flooded soils. In addition, the authors recommended that the true saturated hydraulic conductivity be assessed for soils with a significant percentage of macropores. The study warned that OSWS installed on soils with fractured saprolite could contribute significantly to pathogenic contamination of surface waters.
Ijzerman et al. (1993) inoculated two antibiotic-resistant Escherichia coli strains and two coliphages into low-pressure distribution (LPD) systems installed in William Co., Virginia. The soils at this site were loam varying to silty clay loam overlying shale rock; the soils were very shallow, with shale rock located only 2 ft from the soil surface. The study analyzed three independent LPD systems operating under different actual loading rates of 4.1, 7.7, and 16.71 l/m2/d. The fate and transport of the biological tracers below each system was monitored through a network of sampling wells located at various depths (0-3.3 ft) below the trenches, which were installed at a depth of 1 ft from the soil surface. The bacterial population inoculated was 7.8 × 10^6 colony forming units (cfu) per milliliter during the summer and 2.0 × 10^7 cfu/ml during the winter, and the concentration of coliphage used in the summer inoculation was 1 × 10^4 plaque forming units (pfu) per milliliter. Ijzerman et al. (1993) reported that at both experimental sites, during the summer period the bacterial tracer was observed more frequently at levels >100 cfu/50 ml than the coliphage tracer. During the winter, both tracers were observed at levels of >100 cfu/50 ml and 100-1000 pfu/50 ml for bacteria and coliphages, respectively. The authors found that during the summer the system with the lowest loading rate had the greatest removal, 99.9%, compared to 99% for the greater loading rates. During the winter all loading rates achieved a high level of tracer retention, but the lowest loading rate was the most effective. The winter season proved to favor both coliphage and bacterial survival at lower depths beyond 72 hours after inoculation, and the authors concluded that low temperatures increased the survival of both tracers.
Cogger et al. (1998) evaluated wastewater treatment by septic systems functioning on fine sand typical of the conditions found at Atlantic Beach, North Carolina. The study assessed the effects of loading rate and water table depth on bacterial counts. Two drainfields, designated upper and lower according to topographic conditions, were used under loading rates of 1, 4, and 16 cm/d. Monitoring wells were installed around the trenches at depths of 4.9 to 6.2 ft for shallow wells, which were coupled with deeper wells. The study monitored nutrient and pathogen movement through the sandy soils directly beneath the drainfields. Both fields had a variable separation distance between the trench bottom and the water table: the upper field presented a separation of 2-3 ft 72% of the time, while the water table was 1-1.1 ft from the lower field 50% of the time and the trenches were nearly saturated 10% of the time. The upper field displayed bacterial concentrations varying from 10^1.3 MPN/l to 10^2.4 MPN/l. These numbers were much lower than those observed in the lower field (10^1.5 MPN/l to 10^5 MPN/l). The authors explained these differences as a result of water table proximity: the lower field had nearly saturated and anaerobic conditions, which favored pathogen survival. Concentrations in the upper field also varied with loading rate; a concentration of 10^2.4 MPN/l was observed for the highest rate compared with 10^1.3 MPN/l for the lowest rate. Yet during a second period with drier conditions and deeper water tables, these marked tendencies were not observed in either drainfield. The authors reported log10 reductions for the different pathogens dosed. Viruses such as BE-1 and coliphages in the upper field showed removals of 99.96% and 99.87%, respectively. In contrast, fecal coliform and fecal streptococci in the upper field showed lower removals of 99.7% and 90.0%, respectively. Similar results were observed for the lower field.
The authors did not monitor pathogen or nutrient movement away from the drainfield, even though the ocean and a marsh were located only 33 ft away.
Please address any questions to Dr. David Lindbo.
This page (http://www.ces.ncsu.edu/plymouth/septic/98cardonapath.html) was created by Vera MacConnell, Research Technician I, on March 1, 1999. Last updated on 6/29/00 by Roland O. Coburn, Research Tech. I.