Browsing Hydrology & Water Resources Technical Reports by Title
Now showing items 100–104 of 104

Traditional Aquifer Tests: Comparing Apples to Oranges?
Traditional analysis of aquifer tests uses the hydrograph observed at one well in response to pumping at another to estimate the transmissivity and storage coefficient of an aquifer. The analysis relies on Theis' or Jacob's approximate solution, which assumes aquifer homogeneity. Aquifers, however, are inherently heterogeneous at many scales. If the observation well taps a low-permeability zone while the pumping well sits in a high-permeability zone, the situation contradicts the homogeneity assumption embedded in the traditional analysis. This raises a practical but important question: what do we actually derive from the traditional analysis? We answer this question using numerical experiments in synthetic aquifers. Results of the experiments indicate that the effective transmissivity, Teff, and storage coefficient, Seff, vary with time, as do the principal directions of the transmissivity, but both approach the geometric means of the aquifer properties at large times. Analysis of the transmissivity (T) and storage coefficient (S) estimated from the hydrograph of a single observation well shows that at early times both estimates vary with time. At late times, both approach local averages near the observation well: the estimated T approaches but does not equal Teff, representing an average over a broad area in the vicinity of the observation well, while the estimated S converges to a value dominated by the storage coefficient near the observation well (i.e., its averaging area is much smaller than that of the T estimate).
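The Theis solution and Jacob's late-time approximation that underlie the traditional analysis can be sketched numerically. This is a minimal illustration of the homogeneous-aquifer formulas the abstract refers to; the pumping rate, aquifer parameters, and well spacing below are assumed values, not results from the report:

```python
import numpy as np
from scipy.special import exp1  # exponential integral E1(u) = Theis well function W(u)

def theis_drawdown(r, t, Q, T, S):
    """Drawdown s(r, t) from the Theis solution for a homogeneous aquifer.

    r: distance to pumping well [m], t: time [s], Q: pumping rate [m^3/s],
    T: transmissivity [m^2/s], S: storage coefficient [-].
    """
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

def jacob_drawdown(r, t, Q, T, S):
    """Cooper-Jacob straight-line approximation, valid at late time (small u)."""
    return Q / (4.0 * np.pi * T) * np.log(2.25 * T * t / (r**2 * S))

# Assumed illustrative values: Q = 0.01 m^3/s, T = 1e-3 m^2/s, S = 1e-4,
# an observation well 50 m from the pumping well, after 1 day of pumping.
s_theis = theis_drawdown(50.0, 86400.0, 0.01, 1e-3, 1e-4)
s_jacob = jacob_drawdown(50.0, 86400.0, 0.01, 1e-3, 1e-4)
```

At these values u is small, so the two drawdowns nearly coincide; fitting either formula to field hydrographs yields the time-varying T and S estimates the abstract discusses.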

WATER QUALITY IN THE LOWER COLORADO RIVER AND THE EFFECT OF RESERVOIRS
Comparison of the power spectra of TDS time series from different locations on the Lower Colorado River is useful for showing changes in salinity and for indicating the physical factors influencing it. Similarities between the power spectra of the Lee Ferry and Grand Canyon time series indicated that lateral inputs and evaporation do not greatly influence the salinity cycle; the salinity change within this reach was approximated by a constant concentration change of 66.6 ppm. A similar model form was used for the Hoover Dam to Parker Dam reach. Dissimilarities between power spectra indicated that additional inputs are significant and must be accounted for in any model of such reaches. The model for Lake Mead required compensation for evaporation and for the inputs of the Virgin River and Las Vegas Wash. The modeled salinity increase between Parker Dam and Yuma included a trend factor to allow for the effect of irrigation return flows and seepage. The cross-covariance function was used to approximate the time lag between data stations. Time series statistics, including coherence, response function spectra, and overall unit response, proved useful for estimating salinity in a river system.
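The spectral and cross-covariance techniques described above can be sketched on synthetic data. The series below (an annual salinity cycle, a two-month station-to-station lag, and a constant concentration change) are fabricated for illustration; only the 66.6 ppm figure comes from the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly TDS series (fabricated for illustration): an annual
# salinity cycle plus noise; the downstream station lags upstream by 2 months
# and carries a constant 66.6 ppm concentration change.
n, true_lag = 240, 2
t = np.arange(n)
upstream = 700.0 + 80.0 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 10, n)
downstream = np.roll(upstream, true_lag) + 66.6

def power_spectrum(x):
    """One-sided periodogram of a demeaned series (frequency in cycles/month)."""
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    return np.fft.rfftfreq(len(x)), power

def crosscov_lag(x, y, max_lag=12):
    """Lag (in samples) at which the cross-covariance of x and y peaks."""
    x, y = x - x.mean(), y - y.mean()
    m = len(x)
    lags = np.arange(-max_lag, max_lag + 1)
    cc = [np.mean(x[:m - k] * y[k:]) if k >= 0 else np.mean(x[-k:] * y[:m + k])
          for k in lags]
    return int(lags[np.argmax(cc)])

freqs, p_up = power_spectrum(upstream)
peak_freq = freqs[np.argmax(p_up)]             # dominant peak at the annual cycle (1/12)
est_lag = crosscov_lag(upstream, downstream)   # recovers the 2-month lag
```

Comparing `p_up` against the downstream spectrum is the kind of similarity/dissimilarity check the abstract describes: a reach that only adds a constant concentration leaves the spectrum's shape essentially unchanged.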

WATERBUD: A SPREADSHEET-BASED MODEL OF THE WATER BUDGET AND WATER MANAGEMENT SYSTEMS OF THE UPPER SAN PEDRO RIVER BASIN, ARIZONA
This report describes the development and application of a spreadsheet-based model, named WATERBUD, of the water budget and water management systems of the Upper San Pedro River Basin in southeastern Arizona.

WORTH OF DATA USED IN DIGITAL-COMPUTER MODELS OF GROUNDWATER BASINS
Two digital-computer models of the ground-water reservoir of the Tucson basin, in south-central Arizona, were constructed to study errors in digital models and to evaluate the worth of additional basic data to such models. The two models differ primarily in degree of detail: the large-scale model consists of 1,890 nodes at a 1/2-mile spacing, and the small-scale model consists of 509 nodes at a 1-mile spacing. Potential errors in the Tucson basin models were classified as errors associated with computation, errors associated with mathematical assumptions, and errors in basic data: the model parameters of storage coefficient and transmissivity, initial water levels, and discharge and recharge. The study focused on evaluating the worth of additional basic data to the small-scale model. A basic form of statistical decision theory was used to compute the expected error in predicted water levels and the expected worth of sample data (expected reduction in error) over the whole model associated with uncertainty in a model variable at one given node. Discrete frequency distributions with largely subjectively determined parameters were used to characterize the tested variables. Ninety-one variables at sixty-one different locations in the model were tested, using six separate error criteria. Of the tested variables, 67 were chosen because their expected errors were likely to be large and, for comparison, 24 were chosen because their expected errors were not likely to be particularly large. Of the uncertain variables, discharge/recharge and transmissivity have the largest expected errors (averaging 155 and 115 feet, respectively, per 509 nodes for the criterion of absolute value of error) and expected sample worths (averaging 29 and 14 feet, respectively, per 509 nodes); initial water level and storage coefficient have lesser values.
Of the more certain variables, transmissivity and initial water level generally have the largest expected errors (a maximum of 73 feet per 509 nodes) and expected sample worths (a maximum of 12 feet per 509 nodes), whereas storage coefficient and discharge/recharge have smaller values. These results likely are not typical of those from many ground-water basins and may apply only to the Tucson basin. The largest expected errors are associated with nodes at which values of discharge/recharge are large or at which prior estimates of transmissivity are very uncertain. Large expected sample worths are associated with variables that have large expected errors or that could be sampled with relatively little uncertainty. Results are similar for all six of the error criteria used. Tests were made of the sensitivity of the method to such simplifications and assumptions as the type of distribution function assumed for a variable, the values of the estimated standard deviations of the distributions, and the number and spacing of the elements of each distribution. The results are sensitive to all of these assumptions and therefore likely are correct only in order of magnitude. However, the ranking of the types of variables by magnitude of expected error and expected sample worth is not sensitive to the assumptions, so the general conclusions on the relative effects of errors in different variables likely are valid. Limited studies of error propagation indicated that errors in predicted water levels associated with extremely erroneous values of a variable commonly are less than 4 feet per node at a distance of 1 mile from the tested node. This suggests that, in many cases, prediction errors associated with errors in basic data are not a major problem in digital modeling.
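The decision-theoretic computation described above can be sketched for a single uncertain variable. The scalar head-versus-transmissivity relation and the discrete distribution below are assumed purely for illustration; the actual study propagated uncertainty through the full basin models:

```python
import numpy as np

# Hypothetical stand-in for the model's predicted water level at one node as a
# function of one uncertain variable (transmissivity). The real study used a
# full groundwater model; this scalar relation is assumed for illustration.
def predicted_head(T):
    return 2400.0 - 50.0 * np.log(T)  # head in feet, T in ft^2/day (assumed)

# Discrete frequency distribution for transmissivity at the tested node
# (values and subjective probabilities are assumed).
T_values = np.array([500.0, 1000.0, 2000.0, 4000.0])
probs = np.array([0.2, 0.4, 0.3, 0.1])

T_best = float(np.sum(probs * T_values))  # prior best estimate (the mean)
h_best = predicted_head(T_best)

# Expected error (absolute-value criterion): probability-weighted |error| in
# the predicted head when the model uses T_best but the true value is T_i.
expected_error = float(np.sum(probs * np.abs(predicted_head(T_values) - h_best)))

# For a perfect measurement the residual error vanishes, so the expected worth
# of the sample (expected reduction in error) equals expected_error; an
# imperfect sample would leave some residual expected error.
sample_worth = expected_error
```

Summing such per-node quantities over the model, under each of several error criteria, gives rankings like those reported above.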