Massey Documents by Type

Permanent URI for this community: https://mro.massey.ac.nz/handle/10179/294

Search Results

Now showing 1 - 3 of 3
  • Item
    An approach to a field drainage problem by laboratory examination of selected properties of undisturbed soil cores : thesis presented at Massey University of Manawatu in part fulfilment of the requirements for the degree of Master of Agricultural Science
    (Massey University, 1964) Baker, Christopher John
    For many years, soil drainage investigators have, from a practical viewpoint, had to content themselves with expert appraisal of certain direct and indirect soil and environmental characteristics in order to ascertain the cause of a particular drainage problem. In a great many instances, observations of vegetative composition, topography and general soil type, aided by aerial photography and local experience, give completely adequate information. Normally, derivation of conclusions from such observations is based on well-established principles and the recognition of general broad classes of the cause of mal-drainage conditions. Such classes may be grouped as: (I) where the infiltration capacity of a soil is inadequate to deal with the amount of water supplied to the surface, because of topography, abnormal rainfall, or an inherent inability of the soil to transmit water internally; (II) where the groundwater table rises to a height detrimental to vegetative survival and/or soil structure, or where its presence hinders the function of a free-draining subsoil; and (III) where a similar situation exists due to a perched or elevated groundwater table. The allocation of a particular drainage problem to one or more of these broad classes is not usually difficult, but identification of causal processes within classes presents quite another problem. Often, drainage investigators have been content to evolve general treatments for each class, and, as a basic rule, such procedures have, more often than not, proved reasonably effective. However, with the increasing intensification of pastoral and agricultural farming, the fundamental causes of individual mal-drainage conditions must be positively identified and rectified within the broadly classified groups.
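    The three broad classes named in this abstract amount to a simple decision rule. The following is a minimal, hypothetical Python sketch of that allocation step; the boolean field indicators are assumptions introduced for illustration, not measurements or criteria from the thesis itself.

    ```python
    from enum import Enum

    class MalDrainageClass(Enum):
        """The three broad classes of mal-drainage named in the abstract."""
        INADEQUATE_INFILTRATION = "I"   # surface supply exceeds infiltration capacity
        HIGH_GROUNDWATER_TABLE = "II"   # true groundwater table detrimentally high
        PERCHED_WATER_TABLE = "III"     # perched or elevated water table

    def broad_class(surface_ponding: bool, watertable_high: bool,
                    table_is_perched: bool) -> MalDrainageClass | None:
        """Hypothetical rule-of-thumb allocation of a drainage problem to a
        broad class. As the abstract notes, this step is rarely difficult;
        identifying the causal process *within* a class is what motivates
        the laboratory examination of undisturbed soil cores."""
        if watertable_high:
            return (MalDrainageClass.PERCHED_WATER_TABLE if table_is_perched
                    else MalDrainageClass.HIGH_GROUNDWATER_TABLE)
        if surface_ponding:
            return MalDrainageClass.INADEQUATE_INFILTRATION
        return None  # no broad-class drainage problem identified
    ```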
  • Item
    Measurement and modelling of chloride and sulphate leaching from a mole-drained soil : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Soil Science at Massey University
    (Massey University, 1991) Heng, Lee Kheng
    A study of the leaching of sulphate, chloride, nitrate and the associated cations was carried out over two winter periods on three mole-drained paddocks on a yellow-grey earth (Tokomaru silt loam). The paddocks were occasionally grazed by sheep. At the beginning of each drainage season, potassium chloride (KCl) and sulphur fertilizer, as either single superphosphate (SSP) or elemental sulphur (S°), were applied to two of the paddocks, while the third served as a control. The amount of KCl applied was 200 kg ha⁻¹ yr⁻¹, while sulphur was applied at 50 kg S ha⁻¹ in 1988 and 30 kg S ha⁻¹ in 1989. Drain flow from the fertilized paddocks was measured with V-notch weirs and sampled using proportional samplers. Large differences in total drainage flow were measured in the two years, with values of approximately 280 mm in 1988 and 110 mm in 1989. Significant amounts of the chloride added through both fertilizer and rainfall were leached, amounting to approximately 105 kg Cl ha⁻¹ yr⁻¹. The leaching of sulphate-S depended on the form of S fertilizer applied, the quantity of drainage, and the rate of mineralization. Leaching losses of 17 and 3.4 kg S ha⁻¹ were measured from the SSP- and S°-fertilized paddocks, respectively, in 1988, and 9.4 and 3.5 kg S ha⁻¹ in 1989. The study showed that applying SSP just before the drainage season increased leaching losses of sulphate-S substantially. Although the particle sizes of S° used in both years were much larger than those specified by the Ministry of Agriculture and Fisheries of New Zealand, there appeared to be no difference in the yield response to S applied either as SSP or as S°. The reduced leaching of S when applied as S° resulted from the slow oxidation of S° to sulphate-S. Relatively little nitrate-N was leached, the amount ranging from 11 to 17 kg N ha⁻¹ yr⁻¹. Losses of potassium were less than 10 kg K ha⁻¹ yr⁻¹, despite the large quantity applied as KCl fertilizer immediately before winter. However, a large amount of calcium was leached, between 31 and 52 kg ha⁻¹ yr⁻¹. The amount of magnesium leached was between 9 and 15 kg ha⁻¹ yr⁻¹. A considerable quantity of sodium was leached, around 58 kg Na ha⁻¹ in 1988 and 31 kg Na ha⁻¹ in 1989, roughly equal to the input in rainfall. Good mass balances were obtained for both chloride and sulphate, and the measured moles of negative and positive charge from anions and cations in the leachate were almost equal.

    Three models were developed for the leaching of chloride and sulphate-S. The first divided the soil water into mobile and immobile phases and used a fairly detailed soil water flow model. It was able to simulate the concentration of chloride in the drainage closely, even just after the application of chloride fertilizer and during highly preferential flow conditions induced by heavy rain. The second, a transfer function model assuming a log-normal probability density function (pdf) of solute travel pathway lengths, simulated the leaching of chloride reasonably well in 1988; the prediction was less satisfactory in 1989 and during highly preferential flow. By adapting the pdf for chloride to sulphate and by taking adsorption of sulphate into account, this model could predict the sulphate concentration in the drainage quite successfully. The third, simpler model used the idea that the two-dimensional flow geometry around the mole drain implies that the drainage concentration approximately equals the average soil solution concentration. Despite its simplicity, it was able to simulate the leaching of chloride and sulphate as well as the transfer function model. Irrespective of the model used, the net mineralization rate of soil organic sulphur emerged as an important factor in predicting the leaching of sulphate.
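    Of the three models, the transfer function model is the most compact to sketch. Below is a minimal, illustrative Python version under a log-normal pdf of travel pathway lengths expressed in mm of cumulative drainage; the parameter values and the retardation treatment of sulphate adsorption are assumptions for this sketch, not the values or formulation fitted in the thesis.

    ```python
    from scipy.stats import lognorm

    # Assumed parameters: median travel pathway length ~60 mm of cumulative
    # drainage, ln-scale spread 0.8 (illustrative only, not fitted values).
    MEDIAN_MM, SIGMA = 60.0, 0.8
    travel_pdf = lognorm(s=SIGMA, scale=MEDIAN_MM)

    def drain_concentration_mg_per_L(applied_kg_per_ha: float,
                                     cum_drainage_mm: float,
                                     retardation: float = 1.0) -> float:
        """Flux concentration of a surface-applied solute in mole-drain flow.

        Transfer-function model: the mass leaving per mm of drainage is the
        applied mass times the log-normal pdf of travel pathway length,
        evaluated at the cumulative drainage. For a non-reactive tracer
        (chloride) retardation = 1; sulphate adsorption can be mimicked by
        a retardation factor > 1 that stretches the drainage axis.
        1 ha·mm of water = 10,000 L and 1 kg = 1e6 mg, hence the factor 100.
        """
        i_eff = cum_drainage_mm / retardation
        mass_flux = applied_kg_per_ha * travel_pdf.pdf(i_eff) / retardation  # kg/(ha·mm)
        return mass_flux * 100.0  # mg/L

    # Fraction of an application leached by season's end, e.g. after the
    # ~280 mm of drainage measured in 1988:
    print(f"leached fraction: {travel_pdf.cdf(280.0):.2f}")
    ```

    The simpler two-dimensional-flow model described above would instead report the drainage concentration directly as the average soil solution concentration, which is why it needs no travel-distance distribution at all.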
  • Item
    A study of the quality of artificial drainage under intensive dairy farming and the improved management of farm dairy effluent using 'deferred irrigation' : a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Soil Science, Institute of Natural Resources, Massey University, Palmerston North, New Zealand
    (Massey University, 2005) Houlbrooke, David John
    The last decade has been a period of great expansion and land use intensification for the New Zealand dairy farming industry, with a 44% increase in national dairy cow numbers. Intensive dairy farming is now considered a major contributor to the deterioration in the quality of surface and ground water resources in some regions of New Zealand. Previous research has demonstrated that intensive dairy farming is responsible for accelerated contamination of waterways by nutrients, suspended solids, pathogenic organisms and faecal material. A number of common dairy farming practices increase the risk of nutrient leaching. In particular, farm dairy effluent (FDE) has been implicated as a major contributor to the degradation of water quality. With the introduction of the Resource Management Act in 1991, the preferred treatment for FDE shifted away from traditional two-pond systems to land application. However, on most farms, irrigation of FDE has occurred on a daily basis, often without regard for soil moisture status. It has therefore been commonplace for partially treated effluent to drain through and/or run off soils and contaminate fresh water bodies. The objectives of this thesis were to design and implement a sustainable land application system for FDE on difficult-to-manage, mole- and pipe-drained soils, and to assess the impacts of FDE application, urea application and cattle grazing events on nutrient losses via artificial drainage and surface runoff from dairy cattle grazed pasture. To meet these objectives, a research field site was established on Massey University's No. 4 Dairy Farm near Palmerston North. The soil type was Tokomaru silt loam, a Fragiaqualf with poor natural drainage. Eight experimental plots (each 40 m × 40 m) were established with two treatments. Four of the plots represented standard farm practice, including grazing and fertiliser regimes. The other four plots were subjected to the same farm practices but without fertiliser application, and were instead irrigated with FDE. Each plot had an isolated mole and pipe drainage system. Four surface runoff plots (each 5 m × 10 m) were established as subplots (two on the fertilised plots and two on the plots irrigated with FDE) in the final year of the study. Plots were instrumented to allow continuous monitoring of drainage and surface runoff and the collection of water samples for nutrient analyses.

    An application of 25 mm of FDE to a soil with a limited soil water deficit, simulating a 'daily' irrigation regime, resulted in considerable drainage of partially treated FDE. Approximately 70% of the applied FDE left the experimental plots, as 10 mm of drainage and 8 mm of surface runoff. The resulting concentrations of N and P in drainage and runoff were approximately 45% and 80% of the original concentrations in the applied FDE, respectively. From this single irrigation event, a total of 12.1 kg N ha⁻¹ and 1.9 kg P ha⁻¹ was lost to surface water, representing 45% of the expected annual N loss and 100% of the expected annual P loss. An improved system for applying farm dairy effluent to land, called 'deferred irrigation', was successfully developed and implemented at the research site. Deferred irrigation involves the storage of effluent in a two-pond system during periods of small soil moisture deficits and the scheduling of irrigation at times of suitable soil water deficits. Deferred irrigation of FDE all but eliminated direct drainage losses, with on average <1% of the volume of effluent and nutrients applied leaving the experimental plots. Adopting an approach of applying 'little and often' resulted in no drainage and, therefore, zero direct loss of the nutrients applied. A modelling exercise, using the APSIM simulation model, was conducted to study the feasibility of practising deferred irrigation at the farm scale on No. 4 Dairy Farm. Using climate data for the past 30 years, this simulation exercise demonstrated that applying small application depths of FDE, such as 15 mm or less, made it possible to schedule irrigations earlier in spring and decreased the required effluent storage capacity. A travelling irrigator commonly used to apply FDE (a rotating irrigator) was found to produce 2- to 3-fold differences in application depth, increasing the risk of generating FDE-contaminated drainage. Newer irrigator technology (an oscillating travelling irrigator) provided a more uniform application pattern, allowing greater confidence that an irrigation depth less than the soil water deficit could be applied. This allowed a greater volume to be irrigated whilst avoiding direct drainage of FDE when the soil moisture deficit is low in early spring and late autumn. A recommendation arising from this work is that, during such periods of low soil water deficits, all irrigators should be set to travel at their fastest speed (lowest application depth) to minimise the potential for direct drainage of partially treated FDE and the associated nutrient losses.

    The average concentrations of N and P in both the 2002 and 2003 winter mole and pipe drainage water from grazed dairy pastures were well above the levels required to prevent aquatic weed growth in fresh water bodies. Total N losses from plots representing standard farm practice were 28 kg N ha⁻¹ and 34 kg N ha⁻¹ for 2003 and 2004, respectively. Total P losses in 2003 and 2004 were 0.35 kg P ha⁻¹ and 0.7 kg P ha⁻¹, respectively. Surface runoff was measured in 2003 and contributed a further 3.0 kg N ha⁻¹ and 0.6 kg P ha⁻¹. A number of common dairy farm practices immediately increased the losses of N and P in the artificial drainage water. Recent grazing events increased NO₃⁻-N and DIP concentrations in drainage by approximately 5 mg litre⁻¹ and 0.1 mg litre⁻¹, respectively. The duration between the grazing and drainage events influenced the form of N loss, owing to a likely urine contribution when grazing and drainage coincide, but had little impact on the total quantity of N lost. Nitrogen loss from an early spring application of urea in 2002 was minimal, whilst a mid-June application in 2003 resulted in an increased loss of NO₃⁻-N throughout 80 mm of cumulative drainage, suggesting that careful timing of urea applications in winter is required to prevent unnecessary N leaching. Storage and deferred irrigation of FDE during the lactation season caused no real increase in either the total N concentrations or the total N losses in the winter drainage water of 2002 and 2003. In contrast, land application of FDE using the deferred irrigation system resulted in a gradual increase in total P losses over the 2002 and 2003 winter drainage seasons; however, this increase represents less than 4% of the P applied in FDE during the lactation season. An assessment of likely losses of nutrients at a whole-farm scale suggests that standard dairy farming practice (particularly intensive cattle grazing) is responsible for the great majority of N and P loss at the farm scale. When expressed as a proportion of whole-farm losses, only a very small quantity of N is lost under an improved land treatment technique for FDE such as deferred irrigation. The management of FDE plays a greater role in the likely P loss at the farm scale, with a 5% contribution to whole-farm P losses from deferred irrigation.
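    To make the deferred-irrigation scheduling rule concrete, here is a minimal, hypothetical water-balance sketch in Python: store FDE while the soil water deficit is small, and irrigate only when a full application depth fits within the deficit. The bucket model, parameter values, and pond bookkeeping (in depth-equivalent mm) are assumptions for illustration, not the APSIM configuration used in the thesis.

    ```python
    APP_DEPTH_MM = 15.0    # small application depth favoured by the simulations
    MAX_DEFICIT_MM = 50.0  # assumed maximum soil water deficit of the root zone

    def step_day(deficit_mm: float, pond_mm: float,
                 rain_mm: float, et_mm: float) -> tuple[float, float, float]:
        """Advance the soil water deficit one day, then irrigate FDE from pond
        storage only if the applied depth will be retained in the root zone."""
        # Rain reduces the deficit, evapotranspiration increases it; the deficit
        # is bounded by 0 (field capacity) and the root-zone maximum.
        deficit_mm = min(max(deficit_mm - rain_mm + et_mm, 0.0), MAX_DEFICIT_MM)
        irrigated_mm = 0.0
        if pond_mm >= APP_DEPTH_MM and deficit_mm >= APP_DEPTH_MM:
            # The core of deferred irrigation: the application never exceeds
            # the deficit, so no direct drainage of partially treated effluent.
            irrigated_mm = APP_DEPTH_MM
            deficit_mm -= irrigated_mm
            pond_mm -= irrigated_mm
        return deficit_mm, pond_mm, irrigated_mm

    # Example day: a 20 mm deficit and 30 mm (depth-equivalent) of stored FDE,
    # with 2 mm of rain and 1 mm of evapotranspiration -> one 15 mm application.
    print(step_day(20.0, 30.0, rain_mm=2.0, et_mm=1.0))
    ```

    Run over a daily climate record, a loop like this also exposes the storage trade-off reported above: smaller application depths open more irrigable days in spring and so reduce the pond capacity the system must carry.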