Data from: Mortality versus survival in drought‐affected Aleppo pine forest depends on the extent of rock cover and soil stoniness
Data files
Feb 25, 2019 version (63.14 KB)
- Preisler et al. 2019_ BAI-tree rings_Figure 3.xlsx
- Preisler et al. 2019_ Bucket model results_Figure 5.xlsx
- Preisler et al. 2019_ Climatic panel_Figure 1.xlsx
- Preisler et al. 2019_ Stoniness and root density_Figure 4.xlsx
- Preisler et al. 2019_ WP_figure 2.xlsx
Abstract
Drought-related tree mortality has become a widespread phenomenon in forests around the globe. Recent drought years led to 5-10% mortality in the semi-arid pine forest of Yatir (Israel). The distribution of dead trees was, however, highly heterogeneous, with parts of the forest showing >80% dead trees (D plots) and others with mostly live trees (L plots). At the tree level, visible stress was associated with low predawn leaf water potential during the dry season (-2.8 MPa vs. -2.3 MPa in non-stressed trees), shorter needles (5.5 vs. 7.7 cm), and lower chlorophyll content (0.6 vs. 1.0 mg g-1 dw). Trends in tree-ring widths reflected differences in stress intensity (30% narrower rings in stressed than in unstressed trees), which could be identified 15-20 years prior to mortality. At the plot scale, no differences in topography, soil type, tree age, or stand density could explain the contrast in mortality between the D and L plots; it could be explained only by the higher surface rock cover and greater stoniness across the soil profile in the L plots. Simple bucket-model simulations using the site's long-term hydrological data supported the idea that these differences could result in higher soil water content (m3/m3) in the L plots and extend the time above wilting point by several months across the long dry season. Accounting for subsurface heterogeneity is therefore critical to assessing stand-level response to drought and projecting tree survival, and can inform management strategies in regions undergoing drying climate trends.
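As a reading aid for the bucket-model results file ("Bucket model results_Figure 5"), the abstract's core argument can be illustrated with a minimal dry-season drawdown sketch. This is not the published bucket model: the function, the parameter values (profile depth, stone fractions, wilting point, daily water use), and the assumption that both plot types end the wet season with the same depth of stored water are illustrative assumptions. The sketch only shows that the same stored water held in a stonier profile occupies a smaller fine-soil volume, so its volumetric content is higher and reaches the wilting point later.

```python
# Minimal dry-season drawdown sketch (illustrative only; NOT the published
# bucket model). Assumes both plot types end the wet season with the same
# depth of stored water (mm) and lose the same amount per day; all numbers
# below are hypothetical.

def days_above_wilting(initial_store_mm, et_mm_per_day,
                       profile_depth_mm, stone_fraction,
                       theta_wilting=0.12):
    """Count days until the volumetric water content of the fine-soil
    fraction, theta = store / (depth * (1 - stone_fraction)) in m3/m3,
    drops to the wilting point."""
    fine_soil_depth = profile_depth_mm * (1.0 - stone_fraction)
    store = float(initial_store_mm)
    days = 0
    while store > 0.0 and store / fine_soil_depth > theta_wilting:
        store -= et_mm_per_day            # daily evapotranspiration draw
        days += 1
    return days

# Same stored water and daily water use; only profile stoniness differs.
for f in (0.1, 0.5):                      # hypothetical D-like vs. L-like stone fractions
    d = days_above_wilting(initial_store_mm=200.0, et_mm_per_day=1.0,
                           profile_depth_mm=1500.0, stone_fraction=f)
    print(f"stone fraction {f:.1f}: ~{d} days above wilting point")
```

With these assumed numbers the stonier profile stays above the wilting point roughly ten weeks longer, which matches the direction of the effect described in the abstract; the actual magnitudes in the dataset come from the site's measured hydrological record.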