- Author: Yoni Cooperman
Alfalfa has a long and storied history in California agriculture. First introduced during the gold rush of 1849-1850, the crop has flourished, and California now leads the nation in alfalfa production. Between 2013 and 2015, an average of 815,000 acres of alfalfa were harvested in the state. Statewide alfalfa yields reached 5,451,000 tons in 2015, and California now accounts for over 9% of total U.S. production. Alfalfa serves as an important source of hay while also proving useful as a 'green manure' that can provide nitrogen for following crops. That said, alfalfa has relatively high water demands, a particularly vexing issue given the current statewide drought. While some 98% of alfalfa cropping systems use flood irrigation, sub-surface drip irrigation (SSDI) has become an attractive option for growers seeking to reduce water usage. The type of irrigation used can also influence nitrous oxide (N2O) emissions. N2O is a potent greenhouse gas 300 times more powerful than carbon dioxide in warming the planet, and agriculture accounts for more than 60% of statewide N2O emissions from human activity. SSDI has been shown to reliably reduce N2O emissions in other cropping systems.
UC Davis Land, Air, and Water Resources graduate student Ryan Byrnes recently completed a year-long study investigating the potential for SSDI to mitigate N2O emissions from alfalfa production. The study compared rates of N2O production in side-by-side alfalfa systems, one using check flood irrigation and the other a newly installed SSDI system. “We found that yearly emissions were significantly reduced by the adoption of SSDI.” Sustained soil moisture drives N2O emissions, and he noted that “SSDI keeps the soil surface dry, potentially reducing N2O emissions.” While large bursts of N2O emissions are often observed following the rewetting of dry soil, Ryan pointed out that “the pulse emissions in the SSDI plots [were] lower than in the flood plots.” The emissions following the first rains were not high enough in the SSDI plots to offset the reductions observed through the rest of the season. While additional studies are needed to draw definitive conclusions, the findings of this study are encouraging. Stay tuned to this space for reports on future studies investigating the potential for SSDI to mitigate statewide N2O emissions.
- Author: Yoni Cooperman
Sequestering Carbon in the Soil Using Biochar
Soils store three times more carbon than exists in the atmosphere. Plants absorb atmospheric carbon during photosynthesis, so the return of plant residues to the soil contributes to soil carbon. While much of this carbon ultimately returns to the atmosphere as soil microbes decompose carbon-based plant biomass and release carbon dioxide, soil carbon stores can increase if the rate of carbon inputs exceeds the rate of microbial decomposition. Carbon sequestration refers to this process of storing carbon in soil organic matter, thus removing carbon dioxide from the atmosphere.
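The balance described above can be sketched as a simple stock-and-flow relationship: soil carbon grows only when inputs outpace decomposition. A minimal illustration (the rates below are made-up numbers, not measurements):

```python
def soil_carbon_change(carbon_inputs, decomposition_losses):
    """Net annual change in soil carbon stock (e.g., tons C per hectare per year)."""
    return carbon_inputs - decomposition_losses

# Illustrative numbers only: sequestration occurs when inputs exceed losses.
print(soil_carbon_change(3.0, 2.5))  # 0.5 -> net sequestration
print(soil_carbon_change(2.0, 2.5))  # -0.5 -> net loss to the atmosphere
```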
Biochar is produced by burning organic material at high temperatures with little to no oxygen available. The potential of using biochar to sequester carbon in the soil has received considerable research attention in recent years as part of efforts to develop climate-smart agricultural practices. Because the majority of biochar is carbon (70-80%), it can potentially contribute more carbon than plant residue (approximately 40% carbon) of similar mass. Furthermore, around 60% of this biochar carbon is highly stable and therefore resists decomposition more than plant material that has not been processed into biochar. That said, many questions remain as to the effectiveness of biochar application in sequestering carbon.
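To make the comparison above concrete, the percentages quoted in this paragraph can be combined into a rough per-ton estimate of stable carbon (a back-of-the-envelope sketch using mid-range values, not study data):

```python
# Rough comparison of carbon contributed per ton of amendment,
# using the approximate percentages quoted above.
biochar_c_fraction = 0.75       # biochar is ~70-80% carbon
biochar_stable_fraction = 0.60  # ~60% of that carbon is highly stable
residue_c_fraction = 0.40       # plant residue is ~40% carbon

tons_applied = 1.0
biochar_stable_c = tons_applied * biochar_c_fraction * biochar_stable_fraction
residue_total_c = tons_applied * residue_c_fraction

print(f"Stable C from 1 ton of biochar: {biochar_stable_c:.2f} tons")  # 0.45 tons
print(f"Total C from 1 ton of residue:  {residue_total_c:.2f} tons")   # 0.40 tons, much of which decomposes
```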
(For more information about biochar, check out our recent blog post)
Persistence of Biochar Carbon in Soil
While biochar does contain high levels of carbon, there remains uncertainty as to how long that carbon will persist in the soil following application. The inherent characteristics of the biochar, as dictated by feedstock and pyrolysis conditions, interact with climatic conditions such as precipitation and temperature to influence how long biochar carbon remains stored in the soil. Recent studies suggest that shorter pyrolysis times and higher pyrolysis temperatures make for more recalcitrant biochar (i.e., it persists longer in the soil). However, there are trade-offs, as these pyrolysis conditions produce less biochar per unit of feedstock. As is so often the case, soil texture plays a key role in determining the persistence of biochar carbon. Biochar becomes stabilized in the soil by interacting with soil particles, and clay particles, with more surface area for biochar to interact with, are more effective at stabilizing it.
The Priming Effect
A number of studies have observed an increase in the rate of organic matter decomposition following biochar application. This so-called “priming effect” complicates efforts to sequester carbon, as the increase in microbial activity could push decomposition rates above carbon input rates (see figure above). While the exact mechanism responsible for this effect has not been conclusively identified, it may result from the stimulation of microbial activity as microbes utilize the carbon and nitrogen present in biochar.
Biochar remains a hot topic with regard to increasing soil carbon stores and helping fight climate change. However, many questions must be answered before definitive conclusions can be drawn about the conditions under which biochar positively contributes to soil carbon sequestration.
- Author: Yoni Cooperman
- Contributor: Deirdre Griffin
In the ongoing quest to develop sustainable agricultural practices, growers are looking for new and inventive technologies. In this blog post, we'll focus on biochar, one such technology that has been a focus of intense research in recent years. Biochar is produced by burning organic material at temperatures as high as 1600°F with little to no oxygen available. Biochar is often a by-product of energy production, but it can also be produced solely for use as a soil amendment.
There are a few reasons growers might incorporate biochar into their cropping systems. Biochar's high surface area allows it to act as a reservoir of water while increasing the retention of nutrients such as calcium, magnesium, and ammonium. This is especially useful in sandier soils with low cation exchange capacity. Biochar can also serve as a liming agent to increase soil pH, which increases nutrient availability in acidic soils. Additionally, biochars with high ash content can contain calcium and potassium that plants can use. Biochar inputs are also high in carbon. Stay tuned to this blog for another post highlighting the potential for biochar to increase soil carbon storage.
Feedstock – the organic material used to produce biochar – varies widely. Common feedstocks include wood chips, nut shells, and grasses; biochar can also be produced from manures. In California, nut shells are a potentially useful feedstock given the state's large nut industry. Both feedstock and production temperature influence how biochar will behave in the soil. Dr. Sanjai Parikh's lab at the University of California, Davis has developed a biochar database that includes both of these characteristics.
Initial interest in biochar stemmed from the study of the Terra Preta soils in South America. These generally low-fertility, acidic oxisols were able to sustain higher productivity than nearby non-Terra Preta soils while also accumulating organic matter. One of the reasons for this productivity was the addition of charcoal by indigenous farmers thousands of years ago. The hope is to mimic this effect in a modern agricultural setting.
Like most agricultural practices, biochar presents some challenges for effective integration into a cropping system. As with compost or manure, it can be difficult to predict when nutrients from biochar will become plant-available or how a given char will interact with a particular soil. Different soil types require different rates of biochar application. For example, a clay loam would require more biochar than a sandy soil to achieve the same pH increase, owing to the clay loam's higher buffering capacity (see figure below).
UC Davis Soils and Biogeochemistry graduate student Deirdre Griffin is researching how soil microbes respond to biochar additions. She explains that “while biochars can sometimes serve as a source of labile carbon to spur microbial activity, some chars can give off inhibitory compounds that may reduce microbial activity.” In particular, she is looking at whether biochars with high sorption capacity (i.e. the ability to hold on to compounds in the soil) can interfere with signaling between legumes and soil bacteria that fix nitrogen and make it available to plants. She is careful to note that “others have found biochars to increase nodulation in legumes.”
All in all, “the leaders in the field recognize that while there are many benefits of biochar, there can also be negative impacts…There was a burst of [research] excitement followed by some backlash, and now things are starting to even out.” Biochar can serve as a tool for sustainable production systems, but it isn't appropriate for every situation. Continued research will illuminate what types of biochar are suitable for different soils.
Tuesday, November 17, 2016 from 8 AM - 4:30 PM.
Topics that will be covered include:
- Getting maximum value from soil and water testing
- Comparing fertilizer sources
- Irrigation water quality effects on soil management
- Nitrogen management and environmental protection
While the course assumes that attendees have a basic background in soil science, participants can expect to gain insight into nutrient management at a time of fluctuating fertilizer costs, uncertainty over the water supply, and increased governmental regulation.
The course will be held at the Walter A. Buehler Alumni Center in Davis, CA.
- Author: Jordon Wade
- Contributor: Hannah Waterhouse
- Contributor: Martin Burger
In order to be accurate and effective, fertilizer recommendations must factor in a wide range of considerations, from the site-specific to the climatic. To help guide these decisions, “the 4 R's” have been developed: Right rate, Right place, Right time, and Right form. These 4 R's can be used in tandem to pursue a given goal, whether that is maximizing yield, maximizing profitability, minimizing adverse environmental effects, or some combination of these. However, the specific recommendations will vary according to farm- or field-specific factors, such as climate, soil mineralogy, crop choice, or labor constraints. As such, it is difficult to make “best management” prescriptions across regions.
Several UC Davis researchers (Hannah Waterhouse, Martin Burger, and Will Horwath) recently investigated 3 of the 4 R's of corn production over two years (2013-2014) on a farm near Stockton in the San Joaquin Valley. They were particularly interested in how nitrogen fertilizer rate, placement, and timing affected nitrous oxide (N2O) emissions. They also compared emissions and yields between drip- and furrow-irrigated corn.
Right Rate: For both years of this study, fertilization rates were adjusted using the preplant (or residual) nitrogen levels, which were 65 lbs/ac in 2013 and 77 lbs/ac in 2014. These residual nitrogen levels were subtracted from the target fertilization rates to provide an equal level of available N across years. To learn more about calculating residual nitrogen rates, visit our page on residual nitrogen budgeting. Overall, emissions increased with increasing rate, although there was a high degree of variability. Yield-scaled emissions, which allow emissions to be examined in terms of agronomic efficiency, also increased as N rates increased. Using the corn stalk nitrate test in 2014, they found no N deficiency except a marginal one at the 65 lbs/ac rate. At the highest rates (227 lbs/ac and 307 lbs/ac), the corn stalk nitrate test found greatly excessive levels of plant-available N.
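The rate adjustment described above is straightforward bookkeeping: the residual N credit is subtracted from a target level of available N, and emissions can then be expressed per unit of yield. A minimal sketch (the 280 lbs/ac target and the yield-scaling numbers below are hypothetical values for illustration, not rates from the study):

```python
def fertilizer_rate(target_available_n, residual_n):
    """Fertilizer N to apply (lbs/ac) after crediting preplant residual N."""
    return max(target_available_n - residual_n, 0.0)

def yield_scaled_emissions(n2o_emitted, crop_yield):
    """N2O emitted per unit of yield, an index of agronomic efficiency."""
    return n2o_emitted / crop_yield

# Residual N measured in the study: 65 lbs/ac (2013) and 77 lbs/ac (2014).
# With a hypothetical target of 280 lbs/ac of available N:
print(fertilizer_rate(280.0, 65.0))  # 215.0 lbs/ac to apply in 2013
print(fertilizer_rate(280.0, 77.0))  # 203.0 lbs/ac to apply in 2014
```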
Right Place: They also looked at the effect of applying fertilizer in a single band or a double band. They applied fertilizer at the same rate (202 lbs/ac in 2013 and 227 lbs/ac in 2014) on either one side (1-band) or both sides (2-band) of the corn plant line. Comparing the two, they saw twice the emissions from the single band in 2013 and 3-4 times the emissions in 2014, without any difference in yield. There was also much higher residual nitrogen with the 1-band application, meaning the 2-band treatment achieved a higher fertilizer use efficiency.
Right Time: In both years of the study, the majority of the fertilizer was applied as a sidedress, during the V2 stage of crop growth in 2013 (202 lbs/ac) and during V4/V6 in 2014 (227 lbs/ac). The nitrification inhibitor AgrotainPlus helped keep the fertilizer in the less mobile ammonium form for longer, to better sync nitrogen supply with crop nitrogen demand. In the first year (2013), the application of fertilizer and nitrification inhibitor at V2 was a bit too early and did not reduce emissions. In 2014, the fertilizer and nitrification inhibitor were timed to better coincide with crop N demand and reduced emissions by 60%, although no yield difference was observed. This better syncing also resulted in an “excess” reading from the stalk nitrate test, suggesting that fertilization rates could likely be decreased in subsequent years.
These results were supported in another field trial of corn by the same group of researchers in Yolo County, where AgrotainPlus also decreased emissions by approximately 50% in the sandier, coarser soils. In that study, AgrotainPlus also decreased easily-leached residual nitrate by 10 lbs/ac.
Irrigation Method: In 2013 and 2014, irrigation types were compared in the 202 lbs/ac and 227 lbs/ac treatments, respectively. Using subsurface drip to supply fertilizer and irrigation to the corn resulted in a 50-80% reduction in nitrous oxide emissions relative to the furrow-irrigated field. The drip system also produced double the grain yield of furrow-irrigated corn in 2013, but no difference in total yield when growing for silage in 2014.
While the results of this study are subject to much of the inherent variability associated with agricultural studies, they support much of the current body of knowledge and show that California is not an exception. The central take-home messages from this research (well supported by other studies) are:
- Testing for residual nitrate prior to planting helps to adjust fertilizer recommendations to minimize environmental effects, such as nitrous oxide emissions.
- Concentrating N fertilizer (especially ammonia/ammonium) into a single applied band will greatly increase emissions and decrease your fertilizer N use efficiency.
- Nitrification inhibitors can substantially decrease nitrous oxide emissions and increase your fertilizer N efficiency. Although they might not increase yields, they have the potential to increase N cycling within the system.
- Using subsurface drip irrigation can increase your yields (especially grain yields) while cutting your N2O emissions in half.