Rye vs. Oat Cover Crops on a Manured Field: Environmental Benefits Vary Greatly

Chris Graham, Harold van Es, and Bob Schindelbeck, Department of Crop and Soil Sciences, Cornell University

Land application of manure creates conditions conducive to significant environmental losses of nutrients. Manure application supplies large amounts of nitrogen and phosphorus, often resulting in excess residual levels, especially after drier growing seasons. Losses are especially acute in the following winter and spring, when excess water from snowmelt and rain promotes runoff and erosion of P, leaching of nitrate, and emissions of nitrous oxide from denitrification. The latter is a significant greenhouse gas concern.

Cover crops are increasingly adopted for various purposes, including to suppress weeds, reduce runoff and erosion, build soil health, provide nitrogen (from legumes), or immobilize leftover nitrates.  For manured fields, winter cover crops may have special benefits by limiting P losses through reduced runoff and erosion, and by scavenging residual N and making it unavailable for leaching and denitrification.

In this study, we tested the ability of oat (Avena sativa L.) and winter rye (Secale cereale L.) cover crops to reduce nutrient losses through multiple potential pathways during the early winter and spring in a soil with a history of manure application. Winter rye and oats were selected for their popularity in the northeastern USA and for their difference in winter tolerance. Oats establish well in the fall but are winter killed in our climate, which eliminates the need to terminate their growth in the spring. Rye, on the other hand, survives our winters and resumes active growth early in the spring. Both cover crops provide soil cover and take up residual N from the previous growing season, thereby reducing both N and P losses. We hypothesized that rye, because it grows later into the fall and re-establishes in the spring, is more effective at reducing environmental losses than oats.

Methods

This study was conducted on a working dairy farm located in Central New York using a field with a recent history of manure application. The soil at the research site is an Ovid silt loam with 4% average organic matter content in the surface soil and pH of 7.1. During the previous three years, manure was applied in April 2008, October 2009 and April 2010 (final application before study commenced) at total N rates of 145, 170, and 100 lbs per acre, respectively.

Winter rye and oats were broadcast seeded on 24 September 2010, after corn silage harvest, in a spatially-balanced complete block design at a rate of 100 lbs per acre. Along with control plots, each cover crop treatment was replicated four times for a total of twelve plots. Quadrats of rye and oats were subsequently harvested on 3 December 2010 and analyzed for N uptake; roots were harvested to a depth of 6 inches. Soil samples were taken on 3 December, 14 March, 7 April, and 28 April from the 0-to-6 and 6-to-12 inch soil layers for mineral N analysis. On the latter two dates, soil material was also collected for measurement of nitrous oxide emission potential using a method involving simulated rainfall (to induce denitrification) and 96-hour incubation at seasonal temperatures (50°F for 7 April and 60°F for 28 April). Soil water was sampled at 20 inch depth using a tension lysimeter to determine the nitrate content.

Results

Table 1.

Cover Crop Biomass and N Contents

The rye cover crop produced much higher levels of biomass than the oats during the fall season after seeding, as measured on 3 December (Table 1). Aboveground biomass was three times greater in the rye plots than in the oat plots, as the rye grew more vigorously and was not affected by frost kill. The larger surface biomass for rye implies that it provides greater benefits for reducing runoff, erosion, and P losses. Rye nitrogen uptake was also much higher than that of oats: 23.5 vs. 8.7 lbs per acre, about 2.7 times as much. On 28 April, the rye had accumulated more than twice the biomass measured on 3 December, but total N uptake was similar (about 25 lbs per acre; Table 1).

Figure 1.

Nitrate Leaching

Cover crop effects on nitrate concentrations below the root zone (20 inch depth) varied considerably (Figure 1). Rye markedly and significantly decreased NO3-N concentrations compared to the Control and Oats treatments. Concentrations under oats were in fact about the same as in the plots without a cover crop, indicating that oats provided no benefit for reducing leaching. Over the spring season, average measured nitrate levels were 43, 52, and 1 mg NO3-N per liter for the Control, Oat, and Rye plots, respectively.

Figure 2.

Nitrous Oxide Emissions

While variability was high, both spatially and temporally, significant treatment effects on nitrous oxide emissions were found, and these effects changed as the spring season progressed (Figure 2). The Oats treatment produced results similar to the Control throughout the sampling period, while Rye decreased N2O emissions in late April after a high initial flux earlier in the month. Higher emissions were measured at the early sampling from plots with cover crops, which had a relatively fresh carbon source that promotes denitrification. Reductions in the Rye plots later in April were presumably the result of a smaller soil nitrate pool, as the rye cover crop had taken up much of the released N. At the final sampling, average emissions from the Rye treatment were roughly half of those from the Oats treatment.

Conclusions

The results of this study are clear:  During the winter and spring period when field N and P losses can be high, rye cover crops show great potential to mitigate negative environmental effects. The rye accumulated much greater biomass than oats in the fall, providing better winter cover to reduce runoff, erosion, and P loss potential. Rye also had a very strong positive impact on reducing nitrate leaching in the soil profile, as nitrate concentrations at 20 inch depth were extremely low throughout the sampling period. Oats showed no improvements in reducing nitrate leaching compared to the no-cover crop option.

Rye did not show reduced nitrous oxide emissions resulting from a simulated heavy rainfall event in early April, but showed a 70% decrease later in the month when it was actively taking up N and producing biomass. Oats had winter killed and therefore averaged consistently high emissions throughout the spring period.

In all, the rye cover crop had significantly greater positive effects in terms of reducing P and N loss potentials, while the benefits of the oats were minimal. Although results may vary seasonally, the winter hardy rye cover crop should be given strong preference over oats when the primary objective is to reduce nutrient losses to the environment.

Acknowledgements:  This research was supported through a grant from the USDA Northeast Region Sustainable Agriculture Research and Education program.  We are grateful for the collaboration of John Fleming of Hardie Farms in Lansing, NY.


Winter-Forage Small Grains to Boost Feed Supply: Not Just a Cover Crop Anymore!

Tom Kilcer1,2, Shona Ort1, Quirine Ketterings1, and Karl Czymmek1,3
1Nutrient Management Spear Program, Dept. of Animal Science, Cornell University, 2Advanced Ag Systems, 3PRODAIRY, Dept. of Animal Science, Cornell University

Many NY dairies will need to rebuild forage inventory going into 2013. Some farms are starting to take advantage of winter grains for spring harvest before corn planting. Properly managed, these crops can supply 2-4 tons of dry matter per acre (Table 1), and in some fields in 2012 we measured up to 5 tons of dry matter of high quality forage from small grains planted after corn silage, even with little growth in the fall.

Table 1: Biomass fall and spring for winter cereals seeded in fall 2011 at locations across New York State. Since these are not side by side comparisons in the same field, the averages illustrate yield ranges and should not be compared directly to each other.

Crop: The main options are winter wheat, cereal rye or winter triticale. In 2011, we measured yields of all three species in a trial at the Valatie Research Farm in eastern NY and the results were very similar (2.31, 1.92, and 1.96 tons DM/acre for rye, triticale and wheat, respectively, sampled at the optimal harvest time for forage). The data for 2012 are shown in Table 2. Triticale yielded between rye (highest biomass) and wheat (lowest biomass) consistently in both years. Triticale is very resistant to lodging when harvested for forage and has the best nutrition profile of the three crops.

Figure 1: A late planted crop can still generate high quality and high yielding forage in the spring. The pictures show triticale at one of the western NY sites in fall of 2011 (left; 0.2 tons DM/acre December 14, 2011) and at harvest time (right; 2.0 tons DM/acre May 11, 2012).

Planting:
Winter grains are very well suited to no-till and will do nicely with a coat of manure after corn silage. Planting with a grain drill or air seeder is the best option to assure a good stand and to maximize value from certified seed. The crop should be planted as soon after corn silage as possible, ideally mid- to late September. The comparison at the Valatie Research Farm suggests that earlier planting produces significantly higher biomass in the fall, followed by high forage yields in the spring. However, all cereals produced more than 2.5 tons/acre DM (more than 7 tons/acre silage equivalent) even when seeded in October with very little fall biomass production (Table 2). The later the planting, the more critical it is that the seed be placed 1-1.5 inches deep to prevent spring heaving from decimating the stand.

Table 2: Yield for fall seeded winter cereals grown as cover/double crop at the Valatie Research Farm. Seeding took place 10/5/2011 or 9/16/2011. The above ground biomass was harvested 5/2/2012.

Fertilization:
Fields with a manure history and a coat of manure applied after corn silage before, with, or shortly after planting will not need any starter fertilizer in most circumstances. For optimum yield, the crop may need some available N (supplied by fertilizer, e.g. UAN or urea) when dormancy breaks in the spring. We have seen applications in the range of 50-100 pounds of actual N work well. We will be doing more testing to home in on a spring N guideline and invite farmers to participate in on-farm trials in the spring of 2013 to determine how much fertilizer N is needed for optimal economic yield.

Harvest:
Flag leaf stage supports very high milk production with good yields. More biomass will be added through early head emergence, so harvest timing will depend on farm goals and weather conditions.

Bottom line:
Winter small grains are easy to grow and, when harvested for forage in spring, make excellent feed and can provide a significant boost to forage inventories. Act now to secure a seed supply.

 


Alfalfa Fall Harvest Guidelines in NY – Should They Change?

J.H. Cherney, D.J.R. Cherney, and P.R. Peterson, Cornell University

Fall harvest management is one of the factors affecting the ability of alfalfa to overwinter successfully. Other factors include the age of the stand, the winter hardiness and disease ratings of the cultivar, the length of cutting intervals throughout the season, soil pH, soil K level, soil drainage, and whether growth is left to catch snow. Once we have planted a stand of alfalfa or alfalfa-grass, the primary two persistence factors we can control are soil K level and fall cutting management.

Good Old Days
For a number of decades, the policy for alfalfa fall harvest was to insist on a no-cut fall rest period of 4-6 weeks before the first killing frost. This critical fall period allowed root reserves to be replenished and minimized the chances that cutting management would negatively impact overwintering. Adequate time to replenish root reserves was considered 10% bloom by some researchers, while others assumed that 8-10” of top growth in the fall assured maximum root reserve storage, prior to the first killing frost. It also left significant alfalfa residue to facilitate insulating snow catch.

What is a “Killing Frost”?
The temperature at which alfalfa essentially stops all growth is somewhere between 24 and 28°F. Sheaffer (MN) suggested the first killing frost was 28°F, Tesar (MI) considered it 26.6°F (-3°C), while Undersander (WI) considered a killing frost as 4 or more hours at 24°F. Other studies have used 25°F as the definition of first killing frost. This can greatly impact the date of “first killing frost”. In Ithaca, NY, for example, the latest “first killing frost” date for 30 years of weather data occurred Nov. 5 at 28°F vs. Dec. 10 at 25°F. When accumulating Growing Degree Days (GDD) until first killing frost, a low temperature such as 25°F is not reasonable, as all alfalfa varieties with appropriate winter hardiness ratings for the region would have gone dormant well before Dec. 10.

Fall Alfalfa Harvest Management, 1980’s
During the 1980’s, numerous studies in Canada and the northern USA investigated alfalfa fall harvest management. Research in southern Saskatchewan found that a third cut between Aug. 25 and Sep. 20 reduced spring yields, compared to an Oct. 1 cut. McKenzie et al. (1980) determined that a second cut from Aug. to mid-Sep. consistently reduced future yields in central Alberta, but not in northern Alberta. In Minnesota, Marten (1980) concluded that a third harvest anytime in September would not reduce persistence, assuming it was a winter hardy variety on well-drained soils high in K, and there was consistent snow cover. In Michigan, Tesar (1981) also concluded that a third cut in September or early October was not harmful.

Tesar and Yager (1985) suggested that a third cut in September in the northern USA was not harmful as long as there was adequate time for replenishment of carbohydrate reserves between the second and third cuttings. Sheaffer et al. (1986) concluded that fall cutting does increase the risk of long-term stand loss, but that fall cutting will provide short-term higher yields and high quality. They also concluded that length of harvest interval and number of harvests during the growing season were as important as the final harvest date.

Root Reserves Assessed with GDD
The first attempt to quantify carbohydrate reserves between second and third cuttings of alfalfa based on GDD occurred in Canada. Research in Quebec by Belanger et al. showed that it may be acceptable to cut during the critical fall rest period in September, as long as there was an interval of approximately 500 GDD (base 5°C) between the fall harvest and the previous harvest. For forage crops in the USA, GDD are calculated using base 41, with heat units accumulated above a daily average temperature of 41°F (5°C). The two bases do not generate the same number of GDD units: 500 GDD base 5°C is equivalent to 900 GDD base 41°F.
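To make the conversion concrete, here is a minimal sketch (using a hypothetical day's temperatures, not actual weather data) of daily GDD under the two bases; because a Fahrenheit degree is 5/9 of a Celsius degree, the base 41°F value is always 1.8 times the base 5°C value, which is where the 500 versus 900 GDD equivalence comes from.

```python
# A minimal sketch of daily growing degree day (GDD) calculation under the
# two bases; daily mean temperature is approximated as (Tmax + Tmin) / 2 and
# days with a mean at or below the base contribute zero.

def gdd_base41_f(tmax_f, tmin_f):
    """One day's GDD (base 41 F) from max/min temperatures in Fahrenheit."""
    return max(0.0, (tmax_f + tmin_f) / 2.0 - 41.0)

def gdd_base5_c(tmax_c, tmin_c):
    """One day's GDD (base 5 C) from max/min temperatures in Celsius."""
    return max(0.0, (tmax_c + tmin_c) / 2.0 - 5.0)

def f_to_c(t_f):
    return (t_f - 32.0) * 5.0 / 9.0

# Hypothetical early-September day: high of 72 F, low of 54 F.
day_f = gdd_base41_f(72.0, 54.0)                 # 22.0 GDD base 41 F
day_c = gdd_base5_c(f_to_c(72.0), f_to_c(54.0))  # ~12.2 GDD base 5 C

# The Fahrenheit-based total is always 1.8x the Celsius-based total,
# so 500 GDD base 5 C corresponds to 900 GDD base 41 F.
print(day_f, day_c, day_f / day_c)               # 22.0  12.2...  1.8
```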

Current NY Guidelines
The sum of the above research results caused NY fall alfalfa harvest recommendations to change about 20 years ago to “Allow a rest period of 6 to 7 weeks between the last two cuts”. A similar recommendation in PA of “At least 45 days between the last two cuts” was also adopted. This recommendation has not changed in NY for the past 20 years. Keep in mind that any cutting management options during the critical fall rest period must involve healthy stands of better adapted winter hardy varieties with multiple pest resistance.

Application of the 500 GDD Criteria
A comparison of the Quebec 500 GDD base 5°C rest period can be made with the currently recommended “6-7 week rest period”. By selecting the years with the least and most GDD accumulated during August and September, a range in days for the rest period can be calculated, based on a 500 GDD interval between the last two cuts (Fig. 1 & 2). If cutting on Sep. 1, the 500 GDD interval prior to Sep. 1 is about 5 weeks (Table 1). If cutting Sep. 30, the 500 GDD interval prior to Sep. 30 is 6 to 7 weeks. The rate of decline in GDD units per day in the fall is similar for central and northern NY (Fig. 3 & 4; Table 1).

All X- and Y-axis date combinations below the shaded boxes in Fig. 1 and 2 identify the rest period interval that will result in 500 GDD before the September cut with high confidence. These date combinations resulted in 500 GDD for all 30 years of weather data. All X- and Y-axis date combinations above the shaded box in Fig. 1 and 2 will be very unlikely to accumulate 500 GDD, as this never happened in 30 years. For example, in Ithaca (Fig. 1) if alfalfa is cut on Aug. 2, it is Sept. 12 before you are out of the rest period shaded zone. Using the 500 GDD concept, our current 6-7 week rest period is appropriate for cutting at the end of September, but could be reduced to approximately a 5 week rest period if cutting Sep. 1. For rest periods based on GDD, the later it is in the season, the longer it will take to accumulate 500 GDD (Fig. 3 & 4).
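The sketch below illustrates, under assumed (not measured) daily mean temperatures, how the 500 GDD interval between the last two cuts translates into a calendar rest period: starting from a planned last-cut date, it walks backward through the daily record until 500 GDD base 5°C have accumulated.

```python
from datetime import date, timedelta

def latest_prior_cut(daily_tmean_c, last_cut, target_gdd=500.0):
    """Latest date the previous cut can be taken so that at least target_gdd
    GDD (base 5 C) accumulate before a planned last cut.

    daily_tmean_c: dict mapping datetime.date -> daily mean temperature (C).
    """
    total, day = 0.0, last_cut
    while total < target_gdd:
        day -= timedelta(days=1)
        total += max(0.0, daily_tmean_c[day] - 5.0)
    return day

# Hypothetical daily means: 20 C through August, cooling through September.
tmean = {}
for i in range(122):  # 1 June through 30 September
    d = date(2011, 6, 1) + timedelta(days=i)
    tmean[d] = 20.0 if d.month < 9 else 20.0 - 0.25 * d.day

print(latest_prior_cut(tmean, date(2011, 9, 1)))   # about 5 weeks before Sep. 1
print(latest_prior_cut(tmean, date(2011, 9, 30)))  # about 6 weeks before Sep. 30
```

With these assumed temperatures, the backward walk reproduces the pattern described above: the later in the season the last cut is taken, the more calendar days are needed to reach 500 GDD.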

Applying the 500 GDD Interval to the Critical Fall Rest Period before 1st Frost
It has been suggested to apply the Quebec research to the period preceding the first frost, to help define a “no-cut” time interval prior to the first frost. The first assumption is that alfalfa needs 500 GDD (base 5°C) after cutting to build up root reserves. A second assumption is that it is safe to cut alfalfa if fewer than 200 GDD (base 5°C) remain before the first killing frost, as there would be insufficient regrowth to use up enough storage carbohydrates to negatively affect alfalfa persistence. We present this system as an example, even though we were not able to find any evidence in the scientific literature concerning the 200 GDD assumption. A similar example of this concept can be found in Michigan literature (http://www.agweather.geo.msu.edu/agwx/articles/article-09.html), although that example incorrectly used GDD base 41. Using the 500/200 GDD criteria, we can approximate the odds that fall mowing will not cause winter injury.

Approximate probabilities of either accumulating over 500 GDD (base 5°C) or less than 200 GDD (base 5°C) after a cut on a particular fall date can be calculated from long-term weather data (30 consecutive years) for a particular site (Fig. 5 & 6). Four dates can be determined that approximate 0 and 100% chances of either more than 500 GDD after fall cutting or less than 200 GDD after fall cutting. For this exercise, we are assuming that the first occurrence of 28°F is a “killing frost”. A killing frost in Watertown occurs on average 9 days earlier than in Ithaca (Table 1).

Four dates (a, b, c, d; Fig. 5 & 6) are identified by calculating the following:
a. Year with earliest killing frost date: subtract 500 GDD base 5°C (from Sep. 20, 1993).
b. Year with latest killing frost date: subtract 200 GDD base 5°C (from Oct. 28, 2001).
c. Year with latest killing frost date: subtract 500 GDD base 5°C (from Oct. 28, 2001).
d. Year with earliest killing frost date: subtract 200 GDD base 5°C (from Sep. 20, 1993).

For long term weather data, these dates correspond to:
a. Latest calendar date resulting in >500 GDD base 5°C after fall cutting.
b. Earliest calendar date resulting in <200 GDD base 5°C after fall cutting.
c. Earliest calendar date resulting in <500 GDD base 5°C after fall cutting.
d. Latest calendar date resulting in >200 GDD base 5°C after fall cutting.

To simplify the display, we then assume a linear relationship between 0% and 100% chances that fall cutting will not cause winter injury. Statistical probabilities could be calculated individually for each day, but the results would not provide clear guidelines. The rate of GDD accumulation into the fall gradually decreases and is not perfectly linear (Fig. 3 & 4), but for practical purposes a linear display suffices. Cutting on Aug. 31, Sep. 1, or Sep. 2, the odds of either accumulating >500 GDD or accumulating <200 GDD in Watertown, NY are approximately zero. Using this system, the date that would maximize the chances of winter injury due to cutting is Sep. 1 in Watertown, and Sep. 6 in Ithaca.
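A minimal sketch of how such probabilities could be computed from a 30-year daily weather record, assuming a 28°F low defines the first killing frost: a year is counted as "safe" for a given cut date when the GDD base 5°C accumulated between that date and the first killing frost are either above 500 or below 200. The data structure and field names are illustrative assumptions, not the exact procedure used for Fig. 5 & 6.

```python
from datetime import date

BASE_C = 5.0
KILL_F = 28.0  # first killing frost assumed here to be the first fall low of 28 F

def f_to_c(t_f):
    return (t_f - 32.0) * 5.0 / 9.0

def fraction_safe(years, cut_month, cut_day):
    """Fraction of years in which cutting on the given date is 'safe' under the
    500/200 GDD criteria: regrowth either accumulates >500 GDD base 5 C before
    the first killing frost, or <200 GDD so reserves are barely drawn down.

    years: dict mapping year -> list of (date, tmax_f, tmin_f), sorted by date.
    """
    safe = 0
    for yr, records in years.items():
        cut = date(yr, cut_month, cut_day)
        gdd = 0.0
        for day, tmax_f, tmin_f in records:
            if day <= cut:
                continue
            if tmin_f <= KILL_F:   # first killing frost ends accumulation
                break
            tmean_c = (f_to_c(tmax_f) + f_to_c(tmin_f)) / 2.0
            gdd += max(0.0, tmean_c - BASE_C)
        if gdd >= 500.0 or gdd <= 200.0:
            safe += 1
    return safe / len(years)

# e.g. fraction_safe(thirty_year_records, 9, 1) for a Sep. 1 cut in Watertown
```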

Comparing the Systems
Compare Fig. 4 (interval to 1st frost) to Fig. 2 (interval between last two cuts). If alfalfa was mowed on July 25, and then mowed again on Sep. 1 in Watertown, the chances of winter injury due to cutting are near zero for Fig. 2 (with 500 GDD accumulated between the last two cuts all 30 years). So under one system (Fig. 4), Sep. 1 would be the worst date to cut alfalfa in Watertown, while under the other system (Fig. 2), Sep. 1 can be a very safe date to cut alfalfa.

It is possible that both systems are reasonable. Allowing a 500 GDD interval before a Sep. 1 cut would make a Sep. 1 cut relatively safe. On the other hand, not allowing 500 GDD before a Sep. 1 cut might make this the worst possible time to cut an alfalfa stand. Keep in mind that winter damage to alfalfa is an accumulation of insults. A weakened stand will be considerably more susceptible to damage from intensive harvest management, as well as mowing during the critical fall rest period.

Reasons to be more Conservative in NY vs. the Midwest
There are several issues more specific to the Northeast/New England, which will likely have an impact on the chances of fall cutting affecting long-term alfalfa persistence. The basic requirement for any cutting of alfalfa during the critical fall period is that near ideal conditions exist. That is, you have a healthy, very winter hardy variety with high soil K, good soil drainage, and good snow cover over the winter. Good soil drainage in NY is often not the case, and consistent snow cover is never guaranteed. In northern NY there is also the possibility of alfalfa snout beetle and/or brown root rot damage, which could greatly affect the consequences of cutting during the fall period.

Reasons to be less Conservative in NY vs. the Midwest
Another NY-specific issue is that of species mixtures. Most alfalfa in the Midwest is sown in pure stands, while over 85% of the alfalfa sown in NY is in mixture with perennial grasses. For mixed stands with alfalfa, growers may be somewhat less risk averse than with pure stands when it comes to the chances that fall cutting will shorten the persistence of the alfalfa component. Losing alfalfa more quickly from a mixed stand is not quite as catastrophic as losing alfalfa from a pure stand. With the availability of Roundup Ready alfalfa, the frequency of pure alfalfa stands in the Midwest is likely to increase. Because NY has few prime alfalfa soils, it is less likely that RR-alfalfa will greatly increase the proportion of pure alfalfa stands in NY.

Conclusions
Our historical understanding of alfalfa root reserves provides evidence for maintaining a Critical Fall Rest Period for alfalfa. Applying the 500 GDD criteria to the Critical Fall Rest Period, however, results in an average rest period before 1st killing frost exceeding 7 weeks. Past research data provide evidence that a sufficient rest interval between the last two cuts allows us to take the last cut during the critical rest period. There does not appear to be evidence to change our basic logic for fall harvest of alfalfa. Some fine tuning of the rest interval between the last two cuts can be made using Fig. 1 and 2. The above suggestions are for healthy stands. If a stand is not healthy, a more conservative harvest management may increase the chances of stand survival.


Corn Emergence When Planting in April a Few Days Before a Snow Storm

Bill Cox, Phil Atkins and Geoff Reeves, Department of Crop and Soil Sciences, Cornell University

March 2012 was the warmest March on record across much of the USA (13 degrees above normal for most of NY). Surprisingly, a couple of growers in NY planted limited corn acreage during the week of March 19th when daytime temperatures averaged about 75 degrees. Farmer testimony indicated satisfactory emergence for the March-planted corn. Many other growers, however, elected to wait until the next warm spell, which occurred during the week of April 15th when daytime temperatures averaged about 70 degrees. Farmer testimonies, however, were somewhat mixed for the corn planted during this week with some replanting reported, especially in poorly drained areas of a field. We planted two studies that week: our corn silage hybrid trial with 82 entries on April 20th at the Aurora Research Farm in Cayuga County and a 10-acre seeding rate study on April 18th just northwest of Auburn in Cayuga County.

Table 1. Weather conditions at the Aurora Research Farm and the Auburn airport from April 15-April 30th in 2012. Emboldened date indicates the weather conditions on the day of planting for studies discussed in this article.

Weather conditions for the first 10 days after planting at Aurora changed drastically (Table 1); note that daily weather is recorded the following morning at 8:00 AM, so the April 20th data at Aurora are recorded as April 21st data, when the high temperature was 78. At Aurora, the high temperature the day after planting was 54, and only 2 days above 50 degrees were recorded over the next 8 days (64 on April 26th, reported as April 27th data, and 53 on April 29th, reported as April 30th data). More importantly, only 24 hours after planting, Aurora received a cold 0.6 inches of rain, followed by 0.86 inches of precipitation in the form of a 5-inch snow storm 48 hours after planting. Another 0.20 inches of precipitation occurred the following day, 72 hours after planting, when the high temperature was only 36 degrees. Also, note that low temperatures dipped down to 26 degrees for two nights about a week after planting. Obviously, weather conditions were conducive to imbibitional chilling damage during the initiation of the emergence process, cold stress during the emergence process, and drowning out of corn seeds shortly after planting in poorly-drained areas of a field.

When averaged across the 82 hybrids entered in the study, the stand establishment rate (the number of established plants at the V4 stage in two rows of the 20-foot plot length, divided by the 86 seeds planted from each seed packet) averaged 85.4% (Table 2). Stand establishment averaged from about 84 to about 89% for the 12 seed companies that entered hybrids. Of the 82 hybrids entered in the study, only six had stand establishment rates of less than 80% on this drained Lima silt loam soil. Obviously, most modern hybrids can withstand the rigors of cold and wet weather conditions, even 5 inches of snow, shortly after planting (Fig. 1 and 2).

Table 2. Stand establishment rates of 82 hybrids from 12 seed companies planted on April 20th, 2012, 2 days before a 5-inch snow storm.

At the field-scale study, where soil conditions are more variable, we counted the number of established corn plants at the V5 stage along the entire length of one row (~800 feet) at each seeding rate for the two hybrids (9807HR from Pioneer and DKC49-94 from DEKALB) evaluated in this study. When averaged across hybrids and seeding rates, the stand establishment rate averaged 84.6%. Stand establishment varied from about 83 to 87% between hybrids and from about 84 to 87% across seeding rates (Table 3). This site did experience two warm days (highs of 74 and 76, Table 1) 2 days after planting, so conditions were not quite as harsh. On the other hand, low temperatures dipped down to 24 degrees for two nights and 26 degrees another night about 10 days after planting. In addition, this site received about 4 inches of snow a few days after planting. So the 84% stand establishment rate on this production field was quite satisfactory given the conditions. I will add that in a 50 by 100 foot low spot in the third replication of the study no corn emerged (not accounted for in the data because it was a seeding rate study), so certainly the excessively wet conditions after planting had a major impact on stand establishment rates.

Table 3. Plant populations at the fifth leaf stage (V5) of a DEKALB and a Pioneer hybrid at four seeding rates in a field-scale study planted on April, 18th, 2012 a few miles northwest of Auburn, NY in Cayuga County.

So, what does the stand establishment data from 2012 tell us? First, most if not all modern hybrids have excellent cold tolerance and perhaps tolerance to imbibitional chilling (an elusive phenomenon that I am not sure I have ever observed). On the other hand, modern hybrids have limited tolerance to flooded soil conditions shortly after planting, as observed in the field-scale study. So obviously, soil drainage conditions should be a major factor when considering early planting dates (an early planting date lengthens the time that corn is in the period vulnerable to flooded soil conditions, from planting to about the V5 stage). Another factor to consider is planting depth. We only plant at about a 1.5 inch depth in April, especially when cool and wet conditions are forecasted for the immediate future. Many growers mentioned that their planting depth was at the 2-inch soil depth when planting the week of April 15th, which may have contributed to the poor stand establishment reported by some farmers in some poorly drained areas of a field or on heavy soils.

Fig. 1. Aurora corn silage hybrid trial on May 11th, 2012, planted on April 20th.

What happens if soil conditions are dry in mid-April next year and once again ideal for planting? I will again recommend beginning planting anytime after April 10-15, provided your location does not experience late spring killing frosts (< 28 degrees after May 15th or so) and your soils are well-drained and do not readily flood. In other words, I recommend planting fields with good drainage that are not in frost pockets anytime after April 10-15 at a soil depth of about 1.5-1.75 inches. I wouldn’t plant much deeper in April unless you are looking for moisture.

Fig. 2. Counting emerged corn plants at the V4 stage in the Aurora corn silage hybrid trial on May 31, 2012.

Phosphorus Saturation versus the New York P Index? Impact on Manure and Fertilizer Management in New York State

Julia Knight1, Quirine Ketterings1, Karl Czymmek1,2, and Rich Wildman3
1Nutrient Management Spear Program, Department of Animal Science, Cornell University, 2PRODAIRY, Dept. of Animal Science, Cornell University, and 3Agricultural Consulting Service Inc.

Introduction
Phosphorus enrichment of surface waters, leading to algal blooms and other issues related to eutrophication, continues to be a problem in a number of locations. Runoff from agricultural fields can contribute to P loading of surface waters, and management tools and policies have been developed to manage runoff risk. In 1999, New York (NY) introduced its first Concentrated Animal Feeding Operation (CAFO) Permit. This was followed by release of the NY Phosphorus Index (NY-P Index; USDA-EPA, 1999; Czymmek et al., 2003) and establishment of a statewide on-farm research partnership in 2001. State policy requires implementation of the Natural Resources Conservation Service (NRCS)-NY 590 nutrient management standard on all farms with a CAFO Permit as well as animal feeding operations (AFOs) receiving state or federal cost share funds for manure storage and other related practices. Since 2001, the NY-P Index has been a required element of the NY 590 nutrient management standard.

In May of 2009 President Obama signed an Executive Order to intensify efforts to protect and restore the Chesapeake Bay and its watershed. This Order resulted from the belief that there had not been sufficient progress in restoring the health of the Bay and its watershed in the past 25 years. As a result of the Order, USEPA published Guidance for Federal Land Management in the Chesapeake Bay Watershed (“Guidance document”) on May 12, 2010. This document states that managing P through state-based P runoff indices is flawed and results in over-application of P to cropland. In the Guidance document, USEPA replaced the P index approach with a Psat approach based on a 20% Psat cutoff for manure or fertilizer application (USEPA, 2010) (Figure 1).  While only applicable on federal lands at this point, it is viewed by some as a potential precursor to more widespread implementation on private lands.

Figure 1. Guidance for Federal Land Management in the Chesapeake Bay Watershed: 1.2.2 Implementation Measures for Agriculture in the Chesapeake Bay Watershed to Control Nonpoint Source Nutrient and Sediment Pollution, USEPA.

During the review period for the Guidance document, USEPA received input from numerous organizations, including academic members of SERA-17. This group consists of research scientists, policy makers, extension personnel, and educators with the mission to develop and promote innovative solutions to minimize P losses from agriculture by supporting: (1) information exchange between research, extension, and regulatory communities; (2) recommendations for P management and research; and (3) initiatives that address P loss in agriculture (http://www.sera17.ext.vt.edu/). The SERA-17 scientists questioned the validity of the use of a Psat based cutoff for land application of manure and/or fertilizer, raised concerns that the Psat approach does not consider landscape position (a critical component of P loss), and pointed out that various Psat methodologies provide significantly different results. Despite these comments, USEPA published the 20% Psat cutoff in the Guidance document (http://www.epa.gov/owow_keep/NPS/chesbay502/pdf/chesbay_responsetocomments.pdf):

 “EPA recognizes that Psat is an important feature that could improve the usability of the P index in long term nutrient management planning, particularly where P leaching is the primary environmental concern. EPA does not recommend any one methodology for determining Psat. We understand that the methods used to determine Psat are depended upon the chemical features of the extracts and do not provide conversion factors between the methods mentioned. EPA understands that the method of P analysis should always be clearly described in any presentation of Psat or soil test P. Also, while Psat and soil P are correlated, by determining the P application based on P-Sat, EPA’s recommendation will still allow application beyond realistic yield goals in areas where Psat is lower than 20 percent; soil P is a more conservative estimate for P applications.”

The implementation of the Psat cutoff for P application to federal land, and the potential for implementation of a similar cutoff for all agricultural land, motivated a project to compare the impact of a Psat approach on P fertilizer and manure application cutoffs with that of our current NY-P Index approach. Specifically, our goal was to evaluate whether a Mehlich-3 derived Psat (P/[Fe+Al]) could be converted to a particular Cornell Morgan P value and, if so, to determine the potential 20% Psat cutoff for manure application.

What Did We Do?
In total, 91 soil samples were tested for Cornell Morgan (Morgan, 1941) and Mehlich-3 (Mehlich, 1984) extractable P, Fe, Al, and Ca. The Psat was determined as P/[Fe+Al]*100 (molar ratios) according to Kleinman and Sharpley (2002). As mentioned, there are different methods for estimating Psat. The ratio of Mehlich-3 extractable P over Fe+Al was selected as the most likely candidate for implementation because it is a commonly available agronomic test, despite evidence that this method (1) is unsuitable for calcareous soils found in parts of NYS, and (2) requires soil specific calibrations. Samples were collected from New York farms identified in conjunction with Agricultural Consulting Services, Inc. (ACS). Samples were air-dried and ground to pass a 2 mm sieve prior to laboratory analysis. Regression analyses were performed to determine if Morgan data could be correlated to Psat and, if so, at what Cornell Morgan soil test level a Mehlich-3 derived P saturation of 20% was obtained.
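For readers who want to reproduce the calculation, the sketch below shows the molar-ratio Psat computation from Mehlich-3 concentrations; the sample values are hypothetical, not from the study data set.

```python
# A minimal sketch of the P saturation calculation used here:
# Psat (%) = P / (Fe + Al) * 100 on a molar basis, with Mehlich-3
# extractable P, Fe, and Al reported in mg/kg of soil.

MOLAR_MASS = {"P": 30.97, "Fe": 55.85, "Al": 26.98}  # g/mol

def mehlich3_psat(p_mgkg, fe_mgkg, al_mgkg):
    """P saturation (%) from Mehlich-3 P, Fe, and Al concentrations (mg/kg)."""
    p_mol = p_mgkg / MOLAR_MASS["P"]
    fe_mol = fe_mgkg / MOLAR_MASS["Fe"]
    al_mol = al_mgkg / MOLAR_MASS["Al"]
    return p_mol / (fe_mol + al_mol) * 100.0

# Hypothetical sample: 180 mg/kg P, 250 mg/kg Fe, 900 mg/kg Al.
print(round(mehlich3_psat(180.0, 250.0, 900.0), 1))  # about 15.4 %
```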

What Did We Find?
Across all soil samples, a P saturation of 20% corresponded to a Cornell Morgan P of 86 lbs/acre (Figure 2). This Cornell Morgan value was somewhat higher than the 56 lbs P/acre (Cornell Morgan test) reported for 59 soil samples from the Delaware River Watershed in 1999 (Kleinman et al., 1999; assuming that Psat based on Mehlich-3 equals 0.7 times Psat derived from the oxalate extraction according to Kleinman and Sharpley, 2002), and similar to the 80 lbs P/acre (Maine Modified Morgan test) for 106 soil samples submitted to the Maine Soil Testing Service (Ohno et al., 2007). The New York data also show a wide range in soil test P equivalents; for example, of the 7 soils with a Psat of 20%, corresponding Cornell Morgan P levels ranged from 56 to 172 lbs P/acre with a median value of 71 lbs P/acre. Similarly, soils with a Cornell Morgan P of 75-85 lbs/acre corresponded to a Psat ranging anywhere from 16 to 38%.
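The regression step can be illustrated with a short sketch: fit a least-squares line relating Cornell Morgan P to Psat and evaluate it at 20% saturation. The paired values below are made up for illustration only; the fit to the actual 91-sample data set gave about 86 lbs/acre.

```python
import numpy as np

# Hypothetical (made-up) paired observations; the study used 91 samples.
psat = np.array([5.0, 8.0, 12.0, 15.0, 18.0, 22.0, 27.0, 33.0])          # %
morgan_p = np.array([18.0, 28.0, 45.0, 60.0, 75.0, 95.0, 120.0, 150.0])  # lbs/acre

# Least-squares line relating Cornell Morgan P to Mehlich-3 derived Psat.
slope, intercept = np.polyfit(psat, morgan_p, 1)

# Morgan P corresponding to the 20% saturation cutoff (about 86 lbs/acre
# was obtained with the actual study data).
print(round(slope * 20.0 + intercept, 1))
```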

Figure 2: Relationship between the Cornell Morgan P test and P saturation derived from Mehlich-3 data (P/(Fe+Al) in molar ratio).

Implications
The implementation of a Psat cutoff of 20% for manure application instead of the NY-P Index will not impact manure application to high risk fields with a Cornell Morgan soil test of 80 lbs/acre or more, as the current P Index already does not allow manure application to those fields: the NY-PI score will be 100 or more if the transport factor is 1.0. Given that a very low percentage of NY fields test greater than 80 lbs P/acre (about 5%), implementation of a Psat cutoff in NY will have minimal effect on manure application practices. However, it could adversely impact farms with fields that have very high soil test P but low transport risk. Such Psat-based policy purports to address manure disposal (i.e. application beyond what would be most optimal for P resource management) but will increase the use of purchased fertilizer because it does not account for the fertilizer value of N and K in the manure. Further, we do not believe implementing the Psat cutoff in NY offers real environmental benefit because, as a chemical test alone, it fails to account for the key, field specific risk considerations of landscape position and the relationship of the field to surface waters.

Conclusions
Implementation of a Psat approach will restrict P application on very high P fields with a low NY-PI transport risk. On average, across all soils in the study, a Psat of 20% corresponded to a Morgan soil test P level of 86 lbs/acre, just above the current cutoff for P application for fields with a high transport risk. This means that implementation of a Psat approach would eliminate manure and fertilizer application to fields with a Cornell Morgan P of 86 lbs/acre or more, independent of the risk of transport of this soil test P to surface or groundwater. We do not recommend the application of the Psat approach in NY, as it will increase costs for some farms while being unlikely to offer a corresponding environmental benefit.

References

  • Czymmek, K.J., Q.M. Ketterings, L.D. Geohring, and G.L. Albrecht. 2003. The New York Phosphorus Index User’s Guide and Documentation. CSS Extension Bulletin E03-13. 64 pp. Available: http://nmsp.cals.cornell.edu/publications/extension/PI_User_Manual.pdf [29 January 2012].
  • Kleinman, P.J.A., R.B. Bryant, and W.S. Reid. 1999. Development of pedotransfer functions to quantify phosphorus saturation of agricultural soils. Journal of Environmental Quality 28: 2026-2030.
  • Kleinman, P.J.A., and A.N. Sharpley. 2002. Estimating soil phosphorus sorption saturation from Mehlich-3 data. Communications in Soil Science and Plant Analysis 33: 1825-1839.
  • Ohno, T., B.R. Hoskins, and M.S. Erich. 2007. Soil organic matter effects on plant available and water soluble phosphorus. Biology and Fertility of Soils 43: 683-690.
  • United States Department of Agriculture and Environmental Protection Agency (USDA-EPA). 1999. Unified National Strategy for Animal Feeding Operations. Washington DC. Available: http://www.epa.gov/npdes/pubs/finafost.pdf [23 April 2012].
  • United States Department of Agriculture and Environmental Protection Agency (USDA-EPA). 2010. Guidance for Federal Land Management in the Chesapeake Bay Watershed. Chapter 2. Agriculture. EPA841-R-10-002. Washington DC. Available: http://www.epa.gov/owow_keep/NPS/chesbay502/pdf/chesbay_chap02.pdf [23 April 2012].

Acknowledgments
This work was supported by the Cornell University Agricultural Experiment Station (CUAES) and in-kind contributions by Agricultural Consulting Service Inc. For questions about these results contact Quirine M. Ketterings at 607-255-3061 or qmk2@cornell.edu, and/or visit the Cornell Nutrient Management Spear Program website at: http://nmsp.cals.cornell.edu/.


New York P Index Survey: What Caused Impressive Improvements in the NYS P Balance?

Quirine Ketterings1 and Karl Czymmek1,2

1Nutrient Management Spear Program, 2PRODAIRY, Department of Animal Science, Cornell University

Introduction
The New York Phosphorus Index (NY-PI) was introduced in 2001. Since then, phosphorus (P) fertilizer sales (farm use) declined from 36,506 tons of P2O5 in 2001 (19.5 lbs P2O5/acre) to 18,610 tons P2O5 in 2009 (10.2 lbs P2O5/acre). In 2011, we surveyed Certified Nutrient Management Plan (CNMP) developers certified through the New York State Agricultural Environmental Management (AEM) program to evaluate their perceptions of the drivers of this change in P use. All 24 planners responded to the survey, allowing us to document: (1) farms and acres covered by CNMPs and changes in management practices and soil test levels; and (2) planner perceptions of the drivers of these changes since the introduction of the NY-PI in 2001. The survey contained questions related to (1) farms and acres for which CNMPs were developed in 2010; (2) time and effort needed to do a NY-PI assessment for a field; (3) impact of NY-PI field assessment on changes in manure and/or fertilizer practices; and (4) changes in soil test P levels after 2001 when the NY-PI was introduced. In addition, planners were asked what they would tell policy makers about why farmers made changes and what policies and programs are needed to continue progress. The 24 CNMP planners consisted of 18 from the private sector, 5 from Soil and Water Conservation Districts (SWCD), and one from Cornell Cooperative Extension based in the New York City Watershed. One of the SWCD planners works with a private sector planner and their joint response is included in the private sector planner category.

Table 1: Percent of all acres and farms under nutrient management planning in 2010 in New York planned by Soil and Water Conservation Districts (5 planners), Cornell Cooperative Extension (1 planner, New York City Watershed), and private sector planners (18 planners).

Results and Discussion
Farm Sizes

The private sector planners were responsible for CNMPs covering 88% of all CNMP cropland and 76% of all farms with a CNMP (Table 1). Although private sector planners also planned most of the new plans in 2010 (74% of all acres, 62% of all farms), 22% of all acres newly planned in 2010 were farms in the NYC Watershed. The SWCDs planned less than 10% of all farmland and farms.

The private sector and the SWCD planners worked primarily with CAFO-farms (200 cows or more) with average farm size exceeding 800 acres/farm. The planner from the NYC Watershed worked primarily with smaller operations (<200 acres/farm and 50-80 cows per farm) (Table 2).

Table 2: Total acres and number of farms as well as farm size for farms with certified nutrient management plans in 2010 planned by Soil and Water Conservation Districts (SWCD, 5 planners), Cornell Cooperative Extension (CCE, 1 planner, working in the New York City (NYC) Watershed), and private sector planners (18 planners).

About one-third of all the farms for which CNMPs were developed in 2010 did not meet the minimum size requirements to qualify as a medium or large CAFO but were in state or federal programs that required a CNMP. Most of the farms in the NYC Watershed are in this category. For both private sector planners and SWCD planners, new plans developed in 2010 tended to be for smaller farms (Table 2), consistent with the 100% compliance for CAFO farms in NY and the expansion of CNMP planning to smaller farms involved in federal or state programs.

Time Required for NY-PI
The time needed to complete an NY-PI assessment for a field varied from 10 to 90 min, mostly dependent on whether the assessment was for a new field (and included determination of dominant slope and flow distance to streams), or if the assessment was an update from a previous year. Averaged across all planner responses, 40 min per field was needed, although 50% of all planners indicated assessments could be done within 30 min. About 40% estimated they needed 30-60 min per field, while 10% said more than 1 hour per field was needed. These differences might reflect differences in field topography (complex slopes, multiple flow paths etc.).

Fields Impacted by NY-PI
The planners estimated that management of 17% of acres under nutrient management planning was altered because of an initially very high or high NY-PI score. As a result of NY-PI implementation, manure was reallocated to fields that would otherwise not have received manure (as indicated by 77% of the planners). The most frequent changes made in manure management were changes in timing and rate (86% of the planners ranked timing and rate as the top two changes made). Changes in method of application were less common (ranked in the top two by only 13% of the planners). According to 65% of the planners, the introduction of the NY-PI resulted in an increase in both acres per farm and the amount of exported manure. Forty-three percent of the planners indicated that NY-PI based planning decreased average soil test P levels over time and 48% said the percentage of fields classified as very high in soil test P decreased. The introduction of the NY-PI did not change cow numbers per farm or poultry litter use over time, according to 57% and 78% of the planners, respectively.

Soil Test P Trends
Only 5% of the fields represented in the assessment tested above 80 lbs/acre Morgan extractable P, the level at which the NY-PI exceeds 100 if the transport risk from the field is high, and slightly less than ten times the agronomic critical level for most crops. Of the total cropland area, 4% could not receive manure under NY regulations because the NY-PI already exceeded 100 without the manure application.

Figure 1: Planner perceptions of the drivers of the drastic reduction in P fertilizer sales for on-farm use in New York. Planners were asked to rank the drivers from 1-5, with 1-2 considered an important contributor to the change over time and 4-5 not an important contributor. Cost of fertilizer and on-farm research were identified as the most important drivers (23 respondents).

Perceptions of Drivers
The two most important drivers for the changes in fertilizer use observed by NY planners were the price of fertilizer and the on-farm research partnership that showed that no additional starter P was needed if the soil test was classified as high or very high in P (Figure 1).

The reply related to fertilizer prices most likely reflects recent memory of the 2008 peak in fertilizer prices, although actual fertilizer sales had been decreasing over time prior to the 2008 price spike. Other reasons included greater use of soil testing for fertilizer use decisions, the expansion of manure application options in the state, awareness of the link between animal numbers and the acres needed to apply the manure generated by the animals, improvements in herd nutrition, and the onset of a regulatory environment. One planner pointed out the importance of involving stakeholders when addressing environmental concerns:

“The history of collaboration and trust between the public, academic, and private sector stakeholders in New York State has led to a track record of efficient problem solving. Involve stakeholders in the process and hold them accountable to create real solution.”

Policy Message
Some planners pointed to improvements made in NY, the farms’ investment in protection of the environment, and the role of the NY-PI in achieving such improvements. Others pointed to the need for partnership, science-based guidelines, and funding for applied research and planner and farmer training:

“The bottom line for the success that we have seen in NYS is because of the “systems” approach taken by the state and not just focusing on one problem area. Not only was phosphorus looked at, but, nitrogen and now potassium research is ongoing. On-farm research is one of the major “keys” to have “real data” from a true farm field setting with actual weather and field conditions with specified goals being measured. This approach has proven to be successful within all farming regions of NYS.”

“Another “key” to the success in NYS is that ALL agencies have collectively worked together in providing funding for research, data collection and analysis, training and educational programs for certified CNMP planners and farmers along with assistance for implementation of all needed conservation practices. Funding at the state and federal levels is the life blood for continued success that NYS has experienced thus far.”

The role of qualified professionals was stressed by several of the planners:

“The support of a skilled and knowledgeable planner using good information and effective tools applying the right strategies in the right places at the right times has been critical in helping NY farmers achieve reduced environmental impact. The P-Index applied by trained Nutrient Management Planners helps farms implement practices that are both environmentally effective and economically feasible.”

Also pointed out were the needs for farm-specific solutions and flexibility to address the challenges in nutrient management:

“Farmers need to know why changes are required, but they need flexibility to manage with day to day changes.  Economics will continue to be major driver.”

In addition, the need for research and improvement of tools for management in general was pointed out in the planner responses:

“We need to continue to use science based technology such as the P index, N index, etc. rather than using broad restrictions to nutrient management planning (i.e. no winter spreading)”

“Keep supporting our farms with research and training programs.”

Planners referred to benefits of the collaborative approach to P management among dairy farms in NY for other industries, and/or called for action by other sectors of agriculture:

“The system is working!! Good research coupled with effective communication and on-farm planning has brought incredible benefits to New York agriculture. It goes beyond livestock agriculture. I know of several successful landscape businesses that never apply any P fertilizer to lawns anymore.”

Enforcement of regulations was identified as a key component as well in achieving improvements at the farm level:

“I think however that the biggest driver of changes in terms of nutrient management, amount of manure applied and reduction of overall P applications is due to the fact that the DEC is enforcing the CNMP. We have been doing CNMPs in NY since […], most (and maybe all) of our medium and large CAFO clients have been inspected several times, and as the competence of the inspectors increase, and the inspections became more thorough, the attitude of the farmers was to look for the recommendations and to make sure that they actually applied what was there. […]”

Others indicated the need for continued support for planners, training, and on-farm research:

“With the success that has been obtained in NYS, I would recommend continuing the current programs that are in place with more funding devoted to enhancing our farm producers viability, farmers know their farms and fields better than anyone else, including government officials, but they continually need assistance with improving production and lessening environmental risk through state and federal programs so new technology and implementation practices or best management practices can be adopted in a timely and financially stable manner.”

Conclusions
The key ingredients for success identified by the CNMP planners were: (1) statewide awareness of environmental issues driven by both regulations and extension programming/training; (2) development and implementation of science-based and practical tools (like the NY-PI) that allow for farm-specific solutions to the challenges; (3) demonstrated need for or benefits of alternative management practices (i.e. an on-farm research partnership that addresses relevant questions and on-farm research that results in credible answers); (4) accountability; (5) state enforcement of regulations; and (6) the presence of economically feasible solutions. The success story of NY reflects a recognition of the need for change by both farmers and farm advisors, an interest in exploring management alternatives while looking for win-win approaches (e.g. reduced fertilizer use, re-evaluation of dairy rations, etc.), and a willingness by farmers and farm advisors to contribute to on-farm research that generated reliable data and believable results (with a trust-based farmer-advisor-researcher relationship as the foundation). We conclude that the NY-PI contributed to the successful reduction in P use in NY by being acceptable to farmers and farm advisors as a risk assessment tool, by being directionally correct (it made sense), and by allowing farms to design farm-specific solutions. The story of NY shows that change can be obtained via policy, incentives, measuring, and monitoring.

Acknowledgments
For questions about these results contact Quirine Ketterings at 607-255-3061 or qmk2@cornell.edu, and/or visit the Cornell NMSP website at: http://nmsp.cals.cornell.edu/.

 
