Evaluation of a model for predicting Avena fatua and Descurainia sophia seed emergence in winter rapeseed

Avena fatua and Descurainia sophia are two important annual weeds throughout winter rapeseed (Brassica napus L.) production systems in the semiarid region of Iran. Timelier and more accurate control of both species may be developed with a better understanding of their emergence patterns. Non-linear regression techniques are usually unable to accurately predict field emergence under such environmental conditions. The objectives of this research were to evaluate the emergence patterns of A. fatua and D. sophia and to determine whether emergence could be predicted using cumulative soil thermal time in degree days (CTT). In the present work, cumulative seedling emergence recorded in a winter rapeseed field over three growing seasons was fitted against CTT using Weibull and Gompertz functions. Based on the adjusted coefficient of determination (R²adj), root mean square error (RMSE) and corrected Akaike information criterion (AICd), the Weibull model provided a better fit than the Gompertz model for both species across the 2013 to 2016 seasons. Maximum emergence of A. fatua occurred 70-119 days after sowing (329-426 °Cd), while that of D. sophia occurred 119-134 days after sowing rapeseed (373-470 °Cd). Both models can aid future study of A. fatua and D. sophia emergence and assist growers and agricultural professionals in planning timely and more accurate control of these species.


Introduction
Wild oat (Avena fatua) and flixweed (Descurainia sophia) are noxious weed species distributed worldwide that cause severe yield and quality losses in cereal and oilseed crops in temperate and semiarid climates (Holm et al., 1977; Blackshaw et al., 1981). In the semiarid region of Iran, A. fatua and D. sophia are major weeds in winter cereals and rapeseed (Brassica napus L.). Their field emergence patterns show great year-to-year variability, mainly due to highly unpredictable precipitation regimes as well as a complex seed bank dormancy behaviour regulated by both genetic and environmental factors.
The timing and progression of seedling emergence are important determinants of weed competitiveness, susceptibility to control measures, and reproductive success (Blackshaw et al., 1981;Forcella et al., 2000). Early weed emergence relative to the crop allows weeds to compete better with crops. Many authors have reported that the magnitude of crop yield losses from crop-weed competition, among other factors, depends on the time of weed seedling emergence relative to that of the crop (Chikoye et al., 1995;Knezevic et al., 1997;Moechnig et al., 2003). Hence, to control weeds adequately, especially with limited use of herbicides, farmers need to know the timing and extent of weed seedling emergence before and during the growing season. Armed with such knowledge, farmers can better time the allocation of their resources and energies to actual weed problems, either through hand labour, work animals, and mechanized implements or through herbicides (Ekeleme et al., 2005).
Field emergence predictive models are essential tools for the development of weed management support systems aimed to design sustainable weed control programs while optimizing crop yield. Such models should be able to minimize the degree of uncertainty on the estimation of the time and magnitude of seedling emergence (Forcella et al., 2000). Various emergence models have been used to describe seed emergence, specifically, the thermal-time model (TT), which has been used extensively (Probert, 1992;Bradford, 2002;Royo-Esnal et al., 2010). In these models, average air or soil temperature above a specified threshold is accumulated over the days until weed emergence (Royo-Esnal et al., 2010).
Use of TT in emergence models became successful with the realization that emergence can be represented by a simple continuous cumulative sigmoidal curve, but only if the upper few centimeters of field soil remained continuously moist, either through irrigation or natural rainfall (Forcella et al., 2000).
The objective of this study was to develop models of seedling emergence for A. fatua and D. sophia present in winter rapeseed crop and to examine whether TT is an appropriate variable for describing the timing of emergence of their seedlings.

Experimental sites and design
Field experiments were conducted between 2013 and 2016 in a winter rapeseed field at the experimental farm of Islamic Azad University, Karaj Branch, Iran. The site is located at latitude 35°45´ N, longitude 51°6´ E, 1313 m above sea level, in a semiarid climate. The soil type was a silty clay (10.33% sand, 46.33% silt, 43.34% clay), with 0.98% organic matter and a pH of 7.4.
The experimental site had been under a continuous rapeseed-corn crop rotation for more than 10 years and had A. fatua and D. sophia infestations. To prepare the field, it was irrigated and, once wet enough, mouldboard ploughed to a depth of 30 cm in August, followed by disking to slice plant residue and incorporate fertilizers into the soil. Fertilizers were applied at 150:80:80 N:P:K kg/ha using urea (46% N), diammonium phosphate (18% N, 46% P2O5) and potassium sulfate (50% K2O) as the sources of N, P and K, respectively, in each experimental year. The full dose of P and K and one third of the N were applied before sowing and incorporated; the remaining N was applied at the end of the rosette stage and at the beginning of flowering. In all three years, rapeseed (cv. Okapi) was sown on October 17-20. Each experimental plot (6.5×2.3 m) included 6 planting rows, 6 m long and spaced 30 cm apart. Average crop density was 80 plants/m². In each of the three years, irrigation was applied at a rate of 3500 m³/ha at five rapeseed growth stages: emergence, shooting, flowering, pod setting and grain filling.
D. sophia and A. fatua emergence data were recorded weekly, from sowing to harvesting of rapeseed, in 20 randomly located one square meter quadrats. After counting, seedlings were removed from the soil with minimal soil disturbance. No herbicides were applied in any of the three growing seasons for weed control in the experimental plots. Soil temperature was recorded every hour during the experiment with a data-logger placed at a depth of 5 cm below the soil surface (Sharma et al., 1976; Yousefi et al., 2014). Daily rainfall data were obtained from a meteorological station located 7 km from the experimental field (Fig. 1).

Model description
The cumulative soil thermal time (CTT) was estimated daily during the growing season with the following equation (Leblanc et al., 2003; Leguizamon et al., 2005; Izquierdo et al., 2013):

CTT = Σ (Tmean − Tbase), summed from i = 1 to n    [1]

where Tmean is the daily average soil temperature (°C), Tbase is the lowest temperature at which A. fatua and D. sophia germinate, and n is the number of days after sowing. Tbase was 1 °C for A. fatua (Cousens et al., 1992) and 2.5 °C for D. sophia (Kiemnec & McInnis, 2002). This method is accurate as long as the minimum temperature remains above the base temperature; if the minimum temperature falls below the base temperature, negative values are obtained. Therefore, when the minimum temperature was below Tbase, no thermal time was assumed to accumulate (Leguizamon et al., 2005).
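The thermal-time accumulation described above (daily mean soil temperature minus the base temperature, with sub-base days contributing zero) can be sketched as follows. This is a minimal illustration, not the authors' code, and the temperature values are invented:

```python
# Illustrative sketch of daily cumulative soil thermal time (CTT),
# with days below the base temperature contributing zero degree days.
def cumulative_thermal_time(daily_mean_temps, t_base):
    """Return a list of CTT values (°Cd), one per day after sowing.

    daily_mean_temps: daily average soil temperature at 5 cm depth (°C)
    t_base: species-specific base temperature (°C), e.g. 1.0 for
            A. fatua or 2.5 for D. sophia
    """
    ctt, total = [], 0.0
    for t_mean in daily_mean_temps:
        # Days with Tmean below Tbase accumulate no thermal time
        total += max(t_mean - t_base, 0.0)
        ctt.append(total)
    return ctt

# Example: five days of (invented) mean soil temperatures, A. fatua Tbase = 1 °C
print(cumulative_thermal_time([10.0, 8.0, 0.5, 12.0, 15.0], 1.0))
# → [9.0, 16.0, 16.0, 27.0, 41.0]
```

Note that the sub-base day (0.5 °C) leaves the running total unchanged rather than subtracting from it, matching the convention of Leguizamon et al. (2005).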
Two different functions were tested for describing the time trend of weed seedling emergence, Weibull (Eq. [2]) and Gompertz (Eq. [3]):

Y = 100 {1 − exp[−b (CTT − z)^c]}    [2]

Y = 100 exp{−exp[−b (CTT − m)]}    [3]

where Y represents the cumulative percentage of emergence, b is the rate of increase of emergence once it has started, z is the time of first emergence, c is a parameter determining the shape of the curve, and m represents the point of inflection on the x axis. Weibull and Gompertz distributions have been used commonly in the development of weed emergence models (Izquierdo et al., 2013; Royo-Esnal et al., 2015). Model parameters for both species were estimated with SigmaPlot 12.0 using a non-linear regression fitting routine.
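As a hedged sketch of the fitting step, the two sigmoidal functions can also be fitted with SciPy rather than SigmaPlot's routine. The exact parameterisations and the observations below are illustrative assumptions, not the authors' data:

```python
# Sketch: fitting Weibull and Gompertz emergence curves to (CTT, %) data.
# Functional forms and data are illustrative, not the study's fitted values.
import numpy as np
from scipy.optimize import curve_fit

def weibull(ctt, b, z, c):
    # Y = 100 * (1 - exp(-b * (CTT - z)^c)); no emergence before time z
    x = np.clip(ctt - z, 0.0, None)
    return 100.0 * (1.0 - np.exp(-b * x**c))

def gompertz(ctt, b, m):
    # Y = 100 * exp(-exp(-b * (CTT - m))); m is the inflection point
    return 100.0 * np.exp(-np.exp(-b * (ctt - m)))

# Synthetic cumulative-emergence observations (CTT in °Cd, Y in %)
ctt = np.array([50, 100, 150, 200, 250, 300, 350, 400, 450], dtype=float)
y = np.array([0, 2, 10, 30, 55, 78, 92, 98, 100], dtype=float)

pw, _ = curve_fit(weibull, ctt, y, p0=[1e-4, 50.0, 2.0],
                  bounds=([1e-8, 0.0, 0.5], [1.0, 200.0, 10.0]))
pg, _ = curve_fit(gompertz, ctt, y, p0=[0.02, 250.0])
print("Weibull  b, z, c:", pw)
print("Gompertz b, m  :", pg)
```

Comparing the residuals of the two fitted curves on the same data set is the basis for the model comparison reported in Table 1.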

Models analysis
In all cases, goodness of fit was assessed by means of the adjusted coefficient of determination (R²adj), the root mean square error (RMSE) and the corrected Akaike information criterion (AICd).
The general definition of AICd provided in Qi & Zhang (2001) was adopted (Eq. [4]):

AICd = N ln(SSE/N) + d·m    [4]

where m is the number of parameters of the model, N is the number of observations, SSE is the sum of squared errors, and d is a user-defined constant that allows tuning of the penalty term.
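A minimal sketch of computing the three goodness-of-fit measures follows. The AICd expression used here, N·ln(SSE/N) + d·m, is a reconstruction of the Qi & Zhang (2001) form and should be verified against that source; the observed/predicted values are invented:

```python
# Illustrative computation of R²adj, RMSE and AICd for a fitted model.
import math

def fit_statistics(observed, predicted, n_params, d=2.0):
    n = len(observed)
    sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    rmse = math.sqrt(sse / n)
    mean_obs = sum(observed) / n
    sst = sum((o - mean_obs) ** 2 for o in observed)
    r2 = 1.0 - sse / sst
    # Adjusted R² penalises additional parameters
    r2_adj = 1.0 - (1.0 - r2) * (n - 1) / (n - n_params - 1)
    # Reconstructed AICd; d = 2 recovers the classic AIC penalty
    aic_d = n * math.log(sse / n) + d * n_params
    return r2_adj, rmse, aic_d

# Invented emergence percentages: observed vs. model-predicted
obs = [0, 5, 20, 50, 80, 95, 100]
pred = [1, 6, 18, 52, 78, 96, 99]
r2_adj, rmse, aic_d = fit_statistics(obs, pred, n_params=3)
print(round(r2_adj, 4), round(rmse, 3), round(aic_d, 2))
```

Lower RMSE and AICd and higher R²adj indicate the better model, which is how the Weibull and Gompertz fits were ranked here.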
The Weibull model provided a better fit, based on R²adj, RMSE and AICd, than the Gompertz model for both species across the 2013 to 2016 seasons (Table 1).

Discussion
We used the 2013 to 2016 data to select the model that generally produced the lowest RMSE and AICd and also the highest R²adj for cumulative TT at the target emergence stage (Table 1). In the semiarid region of Iran, A. fatua and D. sophia show an irregular seedling emergence behaviour along the season and great variability among years, mainly due to a highly unpredictable precipitation regime, a fluctuating thermal environment and seed dormancy level variations within the population.

Figure 2. Avena fatua (upper graphs) and Descurainia sophia (lower graphs) cumulative seedling emergence as predicted by cumulative soil thermal time (CTT) during three consecutive seasons in Iranian rapeseed. Observed (symbols) and predicted (lines) emergence data according to a Weibull function.
According to these results, the Weibull function was chosen to describe A. fatua and D. sophia emergence as a function of TT. Our results partially agree with Yousefi et al. (2013), who compared three models (Gompertz, Logistic and Weibull) for D. sophia and found that the Weibull model gave the best fit under Iranian conditions. Chantre et al. (2012) reported that the Weibull model provided a better fit than the other models for A. fatua seedling emergence in Argentina. However, Gonzalez-Diaz et al. (2007) reported that the Logistic model provided the best fit for A. fatua seedling emergence in Spain. The differences between A. fatua and D. sophia emergence patterns under semiarid conditions might be attributed mainly to a highly unpredictable precipitation regime, a fluctuating thermal environment and seed dormancy level variations (Chantre et al., 2012). In addition to weather conditions, differences in soil management (such as cultivation operations and working depth) may affect the accuracy of the model by varying the vertical movement of seeds within the soil profile (Grundy, 2003).
In all years, maximum emergence of A. fatua was reached 70-119 days after sowing, after 329-426 °Cd had accumulated (Fig. 2). From the farmer's point of view, seedlings that emerge at the onset of the rainy season, in autumn, are of minor importance, as they are suppressed by cultivation during seedbed preparation for sowing. However, this cultivation stimulates germination, and new seedlings emerge simultaneously with the crop (Izquierdo et al., 2013). A wide range of days to maximum wild oat emergence has been reported. For example, Yousefi et al. (2014) in Iran found that 100% emergence of A. fatua was reached 84 days (>500 °Cd) after the sowing date. Martinson et al. (2007) in the USA indicated that 100% emergence of A. fatua was reached 28 to 42 days (400-600 °Cd) after initial emergence. The differences in reported time to maximum emergence are most likely due to weather conditions and possibly dormancy. Sharma et al. (1976) found that maximum emergence of wild oat in the USA was reached 17 days after seeding and that no further emergence occurred 30 days after seeding. Non-dormant A. fatua seeds may undergo secondary dormancy if conditions for their germination are unfavourable. A. fatua prefers cool climates and moist soil conditions (Sharma & Vanden Born, 1978; Yousefi et al., 2014). Our experimental site presents warmer and drier conditions than the USA, which could affect secondary dormancy, leading to an extended emergence period. Imam & Allard (1965) observed genetic variability within wild oat populations in the same region as well as across regions. Such variability within the species is another potential explanation for the reported range of emergence temperatures. Mickelson & Grey (2006) found that wild oat seed mortality increased linearly as soil water content increased. In our study, A. fatua emergence in 2015/16 started earlier than in the 2013/14 and 2014/15 seasons. The later onset of field emergence of A. fatua in 2013/14 and 2014/15 was also markedly influenced by rainfall and soil temperatures. Wild oat tends to prefer cool, moist soil conditions for emergence (Sharma & Vanden Born, 1978). Sexsmith (1969) determined that temperature had a greater effect than soil moisture on wild oat seed dormancy. In this three-year study, soil temperatures just prior to wild oat emergence were higher than 20 °C (Fig. 1). Therefore, soil moisture is an important factor in wild oat emergence under semiarid conditions. In our experiments, the beginning of the 2013/14 season was drier than the following seasons; rainfall during the 60 days after sowing winter rapeseed in 2013/14, 2014/15 and 2015/16 was 34, 40 and 73 mm, respectively (Fig. 1). Patterns of emergence in relation to rainfall, cultivation and tillage system differ because of climate and management variability (Ogg & Dawson, 1984; Cardina & Hook, 1989).

Results showed that maximum emergence of D. sophia was reached 119-134 days after sowing (late autumn), after 373-470 °Cd (Fig. 2). Best (1977) reported that seeds of D. sophia germinated most readily in late autumn and early spring, while germination was rare during summer. D. sophia emergence during 2014/15 and 2015/16 was earlier than in the 2013/14 season. Important rainfall events coinciding with high autumn soil temperatures occurred at the beginning of emergence in the 2014/15 and 2015/16 seasons, which probably promoted early emergence (Fig. 1). Possibly, the dormancy release rate was increased by hydration events in the soil (Gallagher et al., 2004). In 2013/14, soil temperature was lower before emergence compared with 2014/15 and 2015/16. In winter annuals like D. sophia, although emergence is inhibited under water stress conditions, no secondary dormancy appears to develop if temperature and light conditions are adequate.
In the case of winter annuals, high autumn temperatures promote the full loss of dormancy and increased emergence, while low winter temperatures may, wholly or partially depending on the species, prevent loss of dormancy (Baskin & Baskin, 1989).
In summary, the main objective of this study was to develop an explicit predictive model of emergence for A. fatua and D. sophia seedlings in the semiarid region of Iran using meteorological data readily available to farmers and practitioners. The ability to accurately predict emergence of both species in rapeseed has practical implications for the timing and efficacy of post-emergence herbicide applications. Many growers would prefer to time control operations in their fields as soon as 100% emergence of both species has occurred. However, validation of such a predictive model for A. fatua and D. sophia would require repeating the emergence phenology study in different years and perhaps in different locations.
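As a practical illustration of this herbicide-timing use case, a fitted Weibull emergence curve can be inverted to estimate the CTT at which a target emergence percentage is reached. The functional form Y = 100·(1 − exp(−b·(CTT − z)^c)) and the parameter values below are hypothetical, not fitted values from this study:

```python
# Sketch: invert an assumed Weibull emergence curve to find the CTT (°Cd)
# at which cumulative emergence reaches a target percentage.
import math

def ctt_at_emergence(target_pct, b, z, c):
    """CTT at which cumulative emergence reaches target_pct (0 < pct < 100)."""
    if not 0 < target_pct < 100:
        raise ValueError("target must be strictly between 0 and 100%")
    # Solve 1 - exp(-b * (x - z)^c) = Y/100 for x
    return z + (-math.log(1.0 - target_pct / 100.0) / b) ** (1.0 / c)

# Hypothetical parameters (b: rate, z: first-emergence time, c: shape)
b, z, c = 3e-5, 40.0, 2.0
for pct in (50, 90, 95):
    print(pct, "% emergence at ~", round(ctt_at_emergence(pct, b, z, c), 1), "°Cd")
```

A grower could compare such a threshold (e.g. the CTT for 90-95% emergence) against the running CTT computed from soil temperature records to decide when most seedlings have emerged and control is likely to be most effective.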