
EDF’s NutrientStar Reports Do Not Reflect Adapt-N Capabilities

Harold van Es, professor
Soil and Crop Sciences Section, School of Integrative Plant Science, Cornell University
hmv1@cornell.edu

Key Points

  • The NutrientStar program uses evaluation methodologies built on an old flat-rate paradigm that penalize new adaptive 4R-Plus approaches, such as Adapt-N.
  • For on-farm testing, NutrientStar relies on crop consultants who in some cases used inappropriate practices or made erroneous assumptions that affected the trial outcomes.
  • NutrientStar results for Adapt-N contradict findings from other research reports and the experience of many practitioners.
  • The TED framework suggests a level of precision that masks the underlying weaknesses of the trial data.
  • NutrientStar reports do not reflect the use potential of Adapt-N and head farmers in the wrong direction, away from the advanced 4R-Plus management tools that are needed to solve the pressing nitrogen problem.

The NutrientStar initiative by the Environmental Defense Fund aims to provide science-based guidance on nutrient management tools and products. This fills a need for independent evaluation of the many new N management technologies that are hitting the market. A key feature of the initiative is the use of on-farm strip trials to evaluate tool performance. Results are reported on their website and can also be evaluated through the TED tool (Technology Extrapolation Domains Geospatial Framework), allowing for regionally targeted information.

We at Cornell University have supported NutrientStar from its inception, providing advice and sharing data. We also helped EDF explore metrics for sustainable nitrogen management. However, in the context of the evaluation of our Adapt-N tool, we developed serious concerns with their methodologies and reporting, which we conveyed to the NutrientStar team. These issues potentially affect farmer adoption of new technologies that help implement 4R-Plus management strategies and achieve environmental goals. We believe it is important that users of NutrientStar reports be aware of the following serious concerns with their approach:

  1. The Adapt-N technology embraces dynamic-adaptive N management with ongoing field monitoring during the growing season. This approach was well demonstrated when Adapt-N won the Tulane Nutrient Reduction Grand Challenge Prize. NutrientStar, however, evaluated the tool under the old paradigm through fixed N response trials and post-season assessment of the optimum rates. This ignores that growers can use such technologies to adjust N inputs and risk levels during the season in response to in-season weather. Trial results are therefore not based on the actual use potential of Adapt-N. This was most apparent in wetter seasons, when many early sidedress applications – often at stages V3-V4, too early for effective use of the tool – were followed by significant rainfall and N losses. Ongoing monitoring of the field N status would have indicated N deficiencies that were not addressed in the trials. In those cases, Adapt-N appears to underestimate the crop nitrogen requirement, which would not have happened with appropriate use and consideration of additional N applications. NutrientStar still reports those results, and these mistrials strongly impact the average estimated farmer returns because yield losses have a high negative effect on profits.
  2. The NutrientStar program mostly relies on crop consultants to implement the on-farm trials. Many are experienced professionals, but their ability to use the Adapt-N tool effectively varied greatly. This was reflected in the very diverse outcomes of the on-farm trials for each regional consultant (average profit ranged from -$131 to +$40). Upon review of the data, we found inconsistencies in data entry and implementation practices with some of the consultants, notably incorrect assumptions about yield potential and rooting depth. In many cases these errors strongly influenced results, which therefore do not represent the best use of the tool. This “naive” approach by some consultants (we were not given an opportunity to provide training) also negates the notion that the use of more sophisticated management tools like Adapt-N improves with practice and local adaptation. It is also out of line with results from many rigorous on-farm evaluations of the tool that showed win-win outcomes (see: 1, 2, 3, 4, 5).
  3. The NutrientStar reports compare our technology with farmer rates. However, this was not a direct side-by-side comparison but an indirect assessment based on curve-fitting procedures. The trials actually involved multiple pre-selected nitrogen rates, which were fitted with a yield response curve; the Adapt-N and farmer rates were then evaluated by where they fell on that curve. NutrientStar uses quadratic functions, which are known to produce optimum N rate estimates that are too high. The results were therefore affected by the functions chosen to fit the curves, which biased the evaluation against rates that optimize economic and environmental considerations. Moreover, NutrientStar emphasizes relative profits (returns to nitrogen) rather than the N reductions and environmental gains from the tested technologies. Many trials involved minor profit tradeoffs with significant environmental gains, as indicated by the fact that virtually all trials showed increases in nutrient use efficiency.
  4. The farmer rates did not reflect a representative range of real farmer practices, but were generally based on consultant recommendations for farmers who use a fixed N management approach with in-season application technology. The NutrientStar results therefore underestimate the potential benefits of technologies like Adapt-N in helping farmers optimize 4R-Plus management through its advanced reporting features.
  5. The TED framework used with NutrientStar allows a user to look at the apparent regional adaptability of a technology. However, the results in different types of growing regions are highly confounded with the practices of the regional consultants who conducted the trials. For example, in some regions Adapt-N appears to work much worse than in others, but this mostly reflects the respective consultants’ research practices (points 1 and 2 above). The notion that a technology works better or worse in a particular TED region is therefore an inaccurate representation of its actual regional performance potential. We found Adapt-N performance to be similarly good across many regions of the country.
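The curve-fitting concern in point 3 can be illustrated with a small numerical sketch. The example below is hypothetical and uses invented response parameters and prices, not NutrientStar data: it generates noise-free yields from a plateau-type (Mitscherlich) response, fits a quadratic to them, and computes the economic optimum N rate (EONR) from each. It demonstrates the general tendency of quadratic fits to overstate the optimum rate relative to a plateau-type response.

```python
import numpy as np

# Hypothetical plateau-type "true" yield response (bu/ac as a function of lb N/ac).
YMAX, Y0, K = 200.0, 120.0, 0.02

def mitscherlich(n):
    """Yield rises with N and levels off toward YMAX."""
    return YMAX - (YMAX - Y0) * np.exp(-K * n)

rates = np.array([0, 40, 80, 120, 160, 200, 240], dtype=float)  # pre-selected N rates
yields = mitscherlich(rates)                                    # noise-free for clarity

# Fit a quadratic yield response curve: y = a + b*N + c*N^2
c, b, a = np.polyfit(rates, yields, 2)

# EONR: the rate where marginal yield value equals marginal N cost,
# i.e. dY/dN = price_ratio. Price ratio is illustrative only.
price_ratio = 0.15  # ($/lb N) / ($/bu corn)

# Quadratic: dY/dN = b + 2cN  =>  N* = (price_ratio - b) / (2c)
eonr_quadratic = (price_ratio - b) / (2.0 * c)

# True response: dY/dN = (YMAX - Y0) * K * exp(-K*N) = price_ratio
eonr_true = -np.log(price_ratio / ((YMAX - Y0) * K)) / K

print(f"quadratic-fit EONR: {eonr_quadratic:.0f} lb N/ac")
print(f"true EONR:          {eonr_true:.0f} lb N/ac")
```

With these invented parameters, the quadratic-fit EONR lands roughly 30 lb N/ac above the optimum implied by the plateau-type response itself, even though the quadratic fits the observed yields closely. Trials evaluated against such a curve will score lower N rates as "under-fertilized" relative to an inflated optimum.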

We believe that the NutrientStar reports create a false impression by appearing quantitatively rigorous while having serious methodological weaknesses. They do not represent the true capabilities of the Adapt-N technology and might discourage farmers and consultants from using technology that can move the industry forward toward effective implementation of the 4R-Plus strategy. That strategy requires N management tools that integrate all components (formulation, placement, timing, soil health, water management, etc.), as well as local soil and weather conditions, to determine an optimum N rate recommendation. Adapt-N does exactly that, and provides additional insights that help farmers benefit from new management changes. Our question is: Why does the Environmental Defense Fund create inaccurate negative reports on promising technologies that could actually help it achieve its goal of creating workable environmental solutions?
