• Drought and grazing: IV. Blue grama and western wheatgrass

      Eneboe, E. J.; Sowell, B. F.; Heitschmidt, R. K.; Karl, M. G.; Haferkamp, M. R. (Society for Range Management, 2002-03-01)
      An understanding of the impacts of grazing during and following drought on rangeland ecosystems is critical for developing effective drought management strategies. This study was designed to examine the effects of drought and grazing on blue grama [Bouteloua gracilis (H.B.K.) Lag. ex Griffiths] and western wheatgrass [Pascopyrum smithii (Rydb.) Löve] tiller growth dynamics. Research was conducted from 1993 to 1996 at the Fort Keogh Livestock and Range Research Laboratory near Miles City, Mont. An automated rainout shelter was used during 1994 to impose a severe late spring to early fall (May to October) drought on 6 of 12 non-weighing lysimeters, each 5 × 10 m. Twice-replicated grazing treatments were: 1) grazed both the year of (1994) and the year after (1995) drought; 2) grazed the year of and rested the year after drought; and 3) no grazing either year. Drought had minimal impact on tiller relative growth rates of plants grazed twice, although it reduced (P ≤ 0.01) rates of axillary tiller emergence for blue grama (79%) and western wheatgrass (91%). Defoliation periodically increased relative growth rates (P ≤ 0.05) and tiller emergence (P ≤ 0.01) of both species. Neither drought nor grazing affected tiller densities or tiller replacement rates of either species, nor did they affect productivity of blue grama. Drought, however, reduced (P ≤ 0.01) productivity of western wheatgrass by 50% in 1994, whereas grazing reduced productivity (P ≤ 0.01) by 46% in 1994 and 69% in 1995. Moderate stocking levels (40-50% utilization) during and after drought did not adversely affect the sustainability of these dominant native grasses.
    • Herbicide residues and perennial grass establishment on perennial pepperweed sites

      Young, J. A.; Clements, C. D.; Blank, R. R. (Society for Range Management, 2002-03-01)
      Perennial pepperweed (Lepidium latifolium L.) is a creeping-rooted exotic weed that has infested native hay meadows, riparian areas, and agronomic fields throughout the western United States. This highly invasive species causes major losses in forage quality and creates numerous management problems. On many infested sites, perennial pepperweed forms a near monoculture. Sustainable suppression programs require the establishment of a competitive perennial species, and seedlings of such a species cannot be established without some substantial initial reduction in perennial pepperweed stands through weed control. Because tillage is not feasible with this creeping-rooted species, herbicidal weed control is the primary option. Experience has shown that the massive and extensive root system of perennial pepperweed cannot be completely eliminated with one application of a herbicide, so repeated applications of a selective herbicide are required after the seedlings of a competitive perennial species are established. Perennial pepperweed is a broadleaf species that is somewhat susceptible to applications of 2,4-D. Therefore, the choice of revegetation species is limited to a perennial grass that is resistant to 2,4-D applied at low rates as a seedling and at moderate rates once established. The saline/alkaline nature of the soils where perennial pepperweed is often found limits the adapted perennial grasses to tall wheatgrass (Elytrigia elongata [Host] Nevski). The herbicide chlorsulfuron has been shown to be more effective than 2,4-D in initially controlling perennial pepperweed. We determined that applications of chlorsulfuron at rates sufficient to control perennial pepperweed left herbicide residues that severely reduced or eliminated the establishment of tall wheatgrass seedlings. Application of 2,4-D at flower budding of perennial pepperweed (June), followed by seeding tall wheatgrass in the fall (October) and application of low rates of 2,4-D over the wheatgrass seedlings the next spring (May), gave the best grass seedling establishment and suppression of the perennial weed.
    • Seed longevity and seeding strategies affect sagebrush revegetation

      Booth, D. T. (Society for Range Management, 2002-03-01)
      Three hypotheses were tested relating to Wyoming big sagebrush (Artemisia tridentata Nutt. ssp. wyomingensis Beetle & Young) revegetation on coal-mined land in Wyoming: (1) that fourwing saltbush (Atriplex canescens (Pursh) Nutt.) seeded at > 2.2 kg pure live seed (pls)/ha would exclude sagebrush; (2) the contrasting view that the saltbush, as a "pioneer plant", facilitates sagebrush stand development by promoting beneficial soil microbiological activity; and (3) that sagebrush stand development would be greater on fresh-stripped than on stored topsoil. The hypotheses were tested by comparing stand development on field plots: 1) seeded to sagebrush in February 1992 and March 1993; 2) fallowed in 1992 and seeded to sagebrush in March 1993; 3) seeded to 'Wytana' fourwing saltbush in November 1991 with sagebrush over-seeded in March 1993; and 4) not seeded. The experimental design was a randomized complete block with split plots of stored and fresh-stripped topsoil and with 3 replications. New sagebrush plants were detected annually through 4 post-seeding spring counts. Seed efficiency was affected by seeding strategy, but efficiency, density, and height were not affected by topsoil source. Proximity to saltbush did not affect sagebrush heights. The results imply that a 'Wytana' density ≤ 5 seedlings/m² is unlikely to deter or promote development of the sagebrush stand, but it will significantly increase total-shrub seed efficiency and density. Seeding strategies, particularly pre-sowing fallow and mixed-species seedings, will likely have a greater influence on sagebrush revegetation than will topsoil source when topsoils are handled as they were in this study.
    • Seasonal dynamics of prairie sandreed rhizome development

      Reece, P. E.; Nixon, J. S.; Moser, L. E.; Waller, S. S. (Society for Range Management, 2002-03-01)
      Multiple generations of rhizome-connected tillers stabilize soils and produce measurable amounts of herbage on sandy rangeland throughout the world. However, little is known about the dynamics of rhizome development in these clonal plant species. Seasonal relationships between foliar characteristics and rhizomes of prairie sandreed [Calamovilfa longifolia (Hook.) Scribn.] were examined on sands range sites at 30-day intervals from May through September in 1989 and 1990 at the University of Nebraska Panhandle Experimental Range near Scottsbluff. Quadrats were excavated each year from two 5 × 5 Latin square macroplots in each of 2 grazing histories, long-term rest or current-year deferment. Under dry conditions in 1989, a 65% reduction in the length of new rhizomes during July preceded a 64% reduction in live tillers in August, after which rhizome length and live tiller density were unchanged and mean tiller weight increased during September. When average precipitation occurred in 1990, a 25% reduction in live tillers and concurrent increases in new rhizome length and mean tiller weight occurred during July. Rhizome bud densities increased throughout the growing season at different but predictable rates (R² ≥ 0.95) for the 2 grazing histories, regardless of precipitation. Length of new rhizomes was highly correlated (R² = 0.91) with live herbage throughout the growing season. Measurable increases in total rhizome length did not occur until live herbage of prairie sandreed exceeded a threshold of about 50 g m⁻². The maximum increase in length of new rhizomes per unit of live herbage was about 10 cm g⁻¹ near 100 g m⁻². Given the species' dependence on vegetative reproduction and relatively high palatability to beef cattle, periodic or repeated years of full growing-season deferment may be the only reliable method of obtaining measurable increases in prairie sandreed populations.
    • Vegetation responses following wildfire on grazed and ungrazed sagebrush semi-desert

      West, N. E.; Yorks, T. P. (Society for Range Management, 2002-03-01)
      A 20-year set of cover data on sagebrush semi-desert plant communities responding to wildfire and livestock grazing near Mills in central Utah provided an opportunity to compare the assumptions and adaptability of classical and state-and-transition models for describing secondary succession. Cover data were organized and analyzed by plant species, growth forms, and other ground cover classes. Graphical analysis, ordination (employing semi-strong hybrid multi-dimensional scaling), regression, and analysis of variance were used to determine whether the patterns observed were best described as community change (tightly linked species) or individualistic change (each species acting independently). Distinct differences in total plant cover, growth form, and species composition were found between the burned (both grazed and ungrazed) and the unburned and grazed treatments. Conventional graphical and statistical analyses of burned and ungrazed plots showed greater and earlier expansion of perennial grasses and then relatively less cover-weighted compositional change in recent years compared to the other treatments. Vegetation on none of the treatments appears to have stabilized toward the pre-burn sagebrush semi-desert, a new state, or the potential natural community for the site involved. Pathways of change reflected in the ordinations have been complex in all treatments. The only obvious trends in responses of individual species were the response to fire and the inverse relationship of cheatgrass to total perennial vegetative cover. All this evidence points to few tight linkages between species or growth-form groups and thus favors viewing these patterns individualistically. While the state-and-transition model allows greater flexibility than the classical model in the depiction of plant community and individual species changes consequent to any management action, it does not apply readily everywhere, as exemplified by this case study.
    • Northern dry mixed prairie responses to summer wildfire and drought

      Erichsen-Arychuk, C.; Bork, E. W.; Bailey, A. W. (Society for Range Management, 2002-03-01)
      In August 1994, wildfire burned 6,500 ha of native Dry Mixed Prairie in southeastern Alberta. The following year, a study was initiated to monitor the recovery of the major plant communities. Burning was followed by 3 successive years of drought, reducing total vegetative cover by 10%. Exposed soil increased to a high of 23% 3 years after the fire. Litter and grass production were reduced through 1997, with the greatest decline in 1995, when grass production on burned and unburned areas averaged 890 and 1,468 kg ha⁻¹, respectively. Of the major forage species, Stipa spp. and Koeleria macrantha (Ledeb.) J.A. Schultes f. were affected by burning for a single year and Agropyron spp. for 2 years. Both Agropyron and Stipa abundance displayed interactions with topographic position in response to fire. In 1995, Agropyron increased on uplands with burning from 90 to 143 kg ha⁻¹ but decreased on lowlands from 383 to 238 kg ha⁻¹, a pattern repeated in 1996. In contrast, Stipa declined at both positions, but only for a single year. Where livestock grazing occurred after the fire, forage removal was greater on burned areas. Drought, in combination with summer wildfire, reduced Dry Mixed Prairie productivity and ground cover for several years and intensified livestock grazing, highlighting the need for changes in rangeland management under these conditions.
    • Bud viability in perennial grasses: water stress and defoliation effects

      Flemmer, A. C.; Busso, C. A.; Fernández, O. A. (Society for Range Management, 2002-03-01)
      Effects of the timing and frequency of defoliation under different levels of soil water availability were evaluated on bud metabolic activity and subsequent outgrowth in the desirable (i.e., palatable) Stipa clarazii Ball. and S. tenuis Phil. in competition with the undesirable (i.e., unpalatable) S. gynerioides Phil. Field studies on these native, perennial tussock grasses were conducted from 1995 to 1997 in temperate, semiarid Argentina. Our working hypotheses were: 1) axillary bud activation and subsequent tillering are lower under water stress than under higher soil moisture conditions in S. clarazii, S. tenuis, and S. gynerioides; 2) compared to undefoliated controls, activation and subsequent outgrowth of axillary buds reach similar or greater values in S. clarazii and S. tenuis after early defoliation (vegetative stage of development) but not after late defoliation (during internode elongation) or increased defoliation frequency (vegetative + internode elongation) within each growing cycle, the greatest reduction of axillary bud metabolic activity and outgrowth being expected on plants defoliated twice; and 3) axillary bud activation and succeeding tillering are greater in the undefoliated S. gynerioides when the desirable species are defoliated than when they remain undefoliated. With a few exceptions, our results led us to reject all 3 hypotheses. Responses of axillary buds were in general specific to the species and treatments, to sampling time, and to the cumulative effects of previous treatments. This makes plant responses of these species difficult to predict under natural field conditions, where plants are often defoliated under water stress.
    • Day and night grazing by cattle in the Sahel

      Ayantunde, A. A.; Fernández-Rivera, S.; Hiernaux, P. H.; Van Keulen, H.; Udo, H. M. J. (Society for Range Management, 2002-03-01)
      The influence of night grazing on feeding behavior, nutrition, and performance of cattle was studied. Twenty-four steers weighing 367 kg (SD = 76) grazed either from 0900 to 1900 (day grazers), from 2100 to 0700 (night grazers), or from 0900 to 1900 and 2400 to 0400 (day-and-night grazers) for 13 weeks. Four esophageally fistulated steers were used in a cross-over design to sample the diet selected during the day and at night. No differences (P > 0.05) were observed between the diets selected in the day and at night. As the season progressed, the fiber components of the diet increased (P < 0.01) while nitrogen content and in sacco dry matter disappearance declined (P < 0.01). Actual grazing times (min day⁻¹, SE = 16) were 352, 376, and 476 for day, night, and day-and-night grazers, respectively. Day-and-night grazers had a higher intake of organic matter than either day or night grazers. Night grazers had the lowest forage intake and also the slowest rate of consumption. Steers that grazed at night had the lowest water intake: 22.7 liters day⁻¹ (SE = 1.5) in week 4 and 19.9 liters day⁻¹ (SE = 1.1) in week 8. Average weight changes (g day⁻¹, SE = 62) were -435, -548, and -239 for day, night, and day-and-night grazers, respectively. These results show that during the dry season, grazing exclusively at night cannot substitute for daytime grazing but is rather complementary to it. Timing (day or night) of grazing did not affect diet selection, but nocturnal grazing decreased the need for water.
    • Nutritional value and intake of prickly pear by goats

      McMillan, Z.; Scott, C. B.; Taylor, C. A.; Huston, J. E. (Society for Range Management, 2002-03-01)
      Prickly pear (Opuntia sp.) is both a benefit and a hindrance to the livestock industry in the southwestern U.S. It competes with herbaceous forage but is sometimes used as emergency feed during drought. Spineless prickly pear (O. ficus-indica Engelm. and O. rufida Engelm.) has been planted in some regions of the Southwest, but little is known about its nutritional value. Our objectives were to determine: (1) the nutritional value of both spined (O. macrorhiza Engelm.) and spineless prickly pear (O. rufida Engelm.); (2) whether goats can be conditioned to eat prickly pear after prescribed burning; and (3) whether goats would consume prickly pear when alternative forage was available. In Experiment 1, 8 goats were placed in metabolism stalls and fed either spineless or spined prickly pear with singed spines in both summer and winter. Intake, digestibility, and nitrogen balance were measured. In Experiment 2, 18 goats were placed in individual pens, and 9 were fed spineless prickly pear to determine if this increased acceptance of spined prickly pear with singed spines. In the third experiment, we varied the amount of alfalfa pellets fed to goats (below, near, and above maintenance) to determine if the level of alfalfa intake affected prickly pear intake. Spineless prickly pear was higher (P < 0.05) in digestibility and crude protein than singed prickly pear, but nitrogen balance was similar for goats consuming the 2 species. Goats ate more spineless prickly pear on an as-fed basis, but on a dry matter basis, intake was similar. Familiarity with spineless prickly pear increased (P < 0.05) subsequent intake of singed prickly pear. Level of alfalfa intake did not affect prickly pear intake. We concluded that both species are moderately nutritious, that spineless prickly pear is more digestible than spined prickly pear, and that once a preference for prickly pear has developed, goats may continue to eat it even when other forage is available.
    • Intensive-early stocking for yearling cattle in the Northern Great Plains

      Grings, E. E.; Heitschmidt, R. K.; Short, R. E.; Haferkamp, M. R. (Society for Range Management, 2002-03-01)
      A 3-year study was conducted to evaluate grazing strategies for production of growing cattle during summer on Northern Great Plains rangeland. Crossbred yearling steers (N = 123 per year, average initial weight = 275 kg) were allotted to 1 of 2 treatments replicated in 3 pastures. Treatments were season-long grazing at recommended stocking rates assuming a 4-month grazing period, or intensive-early grazing of pastures stocked at the same rate assuming only a 2-month grazing season. Precipitation in 1993 was 169% of normal, resulting in greater forage quality than in other years, and no differences were observed in weight gains between treatments during 1993. In 1994 and 1995, steers in the intensive-early stocked pastures gained less weight during the 2 months of grazing than did those in the season-long stocked pastures; however, gain per hectare was greater in the intensive-early stocked pastures. Intensive-early stocking with growing steers may be a viable means to overcome limited forage quality during late summer in the Northern Great Plains and to maximize forage utilization in years of abundant forage.
    • Evaluating the ecological relevance of habitat maps for wild herbivores

      Stalmans, M. E.; Witkowski, E. T. F.; Balkwill, K. (Society for Range Management, 2002-03-01)
      Informed management of large herbivores depends largely on how well habitat availability and suitability are understood. The aims of this study were to quantify and map the distribution of sour and mixed grasslands in the 48,000-ha Songimvelo Game Reserve, Mpumalanga, South Africa. Mixed grassland retains its forage quality, and hence its ability to sustain animal production, for longer in the year than sour grassland. An unsupervised classification technique was applied to a LANDSAT 5 TM image acquired in 1993. The probability that each resulting cluster represented either sour or mixed grassland was calculated from the proportional allocation of 428 sample plots. The 2 resulting probability maps were combined into a single image by selecting, for each output pixel, the class with the maximum posterior probability. The accuracy of the vegetation map was assessed by ground-truthing with an independent set of 85 plots, which yielded a correct classification of 84.8% for the sour and 76.9% for the mixed plots. The mixed grasslands covered only 31.0% of the area but accounted for 66.1% of the game stocking. Water is widely distributed and is not a limiting factor in habitat selection. Based on a GIS analysis, the qualitative difference between mixed and sour grasslands overrides quantitative differences in forage availability, fire history, and human disturbance in influencing herbivore distribution. The integration of field data and satellite imagery into a GIS thus offers a powerful tool for the objective quantification and mapping of available habitat.
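The per-pixel combination step described in the abstract (keeping, for each output pixel, the grassland class whose posterior probability is larger) can be sketched as follows. The probability values here are hypothetical stand-ins for the maps derived from the 428 sample plots, not the study's data.

```python
import numpy as np

# Hypothetical per-pixel posterior probability maps for the two grassland
# classes (values in [0, 1]); the real maps came from the proportional
# allocation of sample plots across the unsupervised clusters.
p_sour = np.array([[0.9, 0.4],
                   [0.2, 0.7]])
p_mixed = np.array([[0.1, 0.6],
                    [0.8, 0.3]])

# Stack the probability maps and, for each pixel, assign the class with
# the maximum posterior probability (0 = sour, 1 = mixed).
stacked = np.stack([p_sour, p_mixed])
class_map = stacked.argmax(axis=0)
```

The same pattern extends to any number of classes: stack one probability map per class and take `argmax` along the class axis.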
    • Elk management strategies and profitability of beef cattle ranches

      Torstenson, W. L. F.; Tess, M. W.; Knight, J. E. (Society for Range Management, 2002-03-01)
      Computer simulation was used to determine the effects of wild elk (Cervus elaphus) on available forage, cattle herd size, and ranch gross margin in southwestern Montana beef cow-calf production systems. Data collected from 5 southwestern Montana ranches were used to develop input parameters for bio-economic models of elk forage harvest and beef production. Input parameters described ranch resources, animal inventories, and animal management. Cattle herd size ranged from 241 to 1,147 head. Elk numbers varied by season within ranch and ranged from 49 to 421 head. Ranches were simulated as currently managed with elk present and with 10, 20, 30, and 100% of the elk removed. Simulated management scenarios were replicated 10 times. Data from each ranch were analyzed by one-way analysis of variance. Cattle herd size, gross margin, and available forage increased significantly (P < 0.05) when all elk were removed; however, the magnitude of these effects differed among ranches. Removal of all elk permitted cattle herd size to increase from 7 to 32% across ranches. Annual costs of elk on the 5 ranches (i.e., the increase in gross margin from elk removal) ranged from $5,949 to $21,152. On an AUM basis, elk costs ranged from $8.55 to $14.51. Three management alternatives were evaluated for their potential to recover elk costs: Montana's Block Management Program, coordinated exchange of forage use, and leasing of hunting access. For each ranch, at least one of these management strategies could recover all estimated costs of providing elk habitat. Elk can significantly reduce profits for cow-calf ranches in southwestern Montana. Elk impacts on beef enterprise profits are closely associated with efficiency of resource use by cattle, i.e., ranches with lower unit costs of production lose more gross margin by providing forage for elk than do ranches with higher production costs.
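The AUM-based cost reported in the abstract is, in effect, the annual change in gross margin divided by the forage the elk consume. A minimal sketch follows; the herd size, forage demand, and total cost are assumed round numbers for illustration, not the paper's values.

```python
# Hypothetical ranch: annual elk cost = increase in gross margin when all
# elk are removed; the AUM-based cost divides by the forage elk consume.
elk_cost_total = 12000.0      # $/year, assumed for illustration
elk_head = 250                # average elk present, assumed
aum_per_elk_year = 6.0        # assumed elk forage demand, AUM/head/year

elk_aums = elk_head * aum_per_elk_year      # total elk forage use, AUM/year
cost_per_aum = elk_cost_total / elk_aums    # $/AUM
```

With these assumed inputs the elk cost works out to $8 per AUM, the same order as the $8.55 to $14.51 range reported for the 5 ranches.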
    • Cull cow management and its implications for cow-calf profitability

      Little, R. D.; Williams, A. R.; Lacy, R. C.; Forrest, C. S. (Society for Range Management, 2002-03-01)
      Selling culled breeding livestock is often viewed as "just another chore." Most cull sales are made in the fall, after calves are weaned and cows are pregnancy-checked and found open. Since cull cow sales comprise from 15 to 30% of a cow-calf enterprise's gross revenue, perhaps they should be viewed as a potential profit center. This paper uses enterprise budgets and sensitivity analyses to illustrate cull cow management strategies that overcome certain physical and economic factors limiting the profitability of fall cow sales. The key limiting physical factor is often poor body condition, which results from the combined effects of lactation and deteriorating forage quality. The key economic factor is a seasonal price low, generated by a large beef supply in the fall. The results suggest the potential, with adequate low-cost feedstuffs, to increase net returns by properly managing cull breeding stock. In only 1 year during the 10-year period 1990-1999 was selling cull cows in the fall the more profitable option. Over that period, the net present value of spring cull sales averaged about $30 per cow more than selling cull cows in the fall.
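The fall-versus-spring comparison rests on simple discounted budgeting: spring receipts, net of winter feed costs, are discounted back to the fall decision date and compared with an immediate fall sale. A sketch with illustrative prices, weights, and costs follows; none of these figures come from the paper's budgets.

```python
# Hypothetical comparison of selling a cull cow in the fall versus feeding
# her over winter and selling in the spring. All figures are illustrative.
fall_price = 0.80        # $/kg, assumed seasonal price low
spring_price = 0.95      # $/kg, assumed seasonal price recovery
fall_weight = 450.0      # kg, thin cow at weaning (assumed)
spring_weight = 520.0    # kg, after regaining body condition (assumed)
feed_cost = 60.0         # $ total winter feed, assumes low-cost feedstuffs
annual_rate = 0.08       # assumed annual discount rate
months_fed = 6

fall_revenue = fall_price * fall_weight
spring_net = spring_price * spring_weight - feed_cost
# Discount spring receipts back to the fall decision date.
npv_spring = spring_net / (1 + annual_rate) ** (months_fed / 12)
advantage = npv_spring - fall_revenue    # > 0 favors holding until spring
```

Under these assumptions the spring sale dominates; the decision flips whenever feed costs rise or the spring price premium narrows enough to erase `advantage`.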
    • Perspectives on water flow and the interpretation of FLIR images

      Larson, S. L.; Larson, L. L.; Larson, P. A. (Society for Range Management, 2002-03-01)
      Airborne infrared thermal radiography has been proposed as a tool for monitoring water temperature along the network of streams and rivers that compose a watershed. Proponents of this method correlate vegetative shadows on a stream channel with reduced infrared radiation (IR) reception in the radiographic data to suggest that the water temperature is reduced in such areas. Two methods are employed to demonstrate that this interpretation of the data is in error. First, the fundamental principles of thermodynamics are used to show that if the stream is in fact flowing, the water affected by any cooling process cannot remain in the vicinity where it was cooled. Second, temperature data taken from a stream channel are used to show that the water flowing in the channel is essentially unaffected by the patterns of vegetative shade on the surface of the channel.
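The thermodynamic argument can be checked with a back-of-envelope calculation: water flows past a shade patch so quickly that the blocked solar input cannot measurably change its temperature. All inputs below are assumed round numbers for illustration, not values from the paper.

```python
# Order-of-magnitude check: how much can a vegetative shadow cool water
# that is flowing past it? All inputs are illustrative assumptions.
shade_length = 10.0      # m, length of the shaded channel reach
velocity = 0.5           # m/s, stream flow speed
depth = 0.3              # m, mean water depth
delta_flux = 500.0       # W/m^2, solar input blocked by the shade
rho = 1000.0             # kg/m^3, density of water
c_p = 4186.0             # J/(kg K), specific heat of water

residence_time = shade_length / velocity          # seconds under the shadow
# Energy deficit per unit surface area, mixed through the water column:
delta_T = delta_flux * residence_time / (rho * c_p * depth)
```

With these assumptions the water spends 20 seconds under the shadow and cools by well under 0.01 K, far below what an airborne sensor could resolve, consistent with the abstract's conclusion.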