Abstract
Landscape-scale vegetation stripes (tiger bush) observed on the gentle slopes of semi-arid regions are useful indicators of future ecosystem degradation and catastrophic shifts towards desert. Mathematical models such as the Klausmeier model—a set of coupled partial differential equations describing vegetation and water densities in space and time—are central to understanding their formation and development. One assumption made for mathematical simplicity is that seeds disperse locally, via a diffusion term. In fact, a large body of work focuses on fitting dispersal ‘kernels’, probability density functions for seed dispersal distance, to empirical data for different species and modes of dispersal. In this paper, we address this discrepancy by analysing an extended Klausmeier model in which long-distance seed dispersal is represented by a non-local convolution term in place of diffusion, and by assessing its effect on the resilience of striped patterns. Many authors report a slow uphill migration of stripes, but others report no detectable migration speed. We show that long-distance seed dispersal permits the formation of patterns with a very slow (possibly undetectable) migration speed, and even of stationary patterns, which could explain the inconsistencies in the empirical data. More generally, we show that the resilience of patterns to reduced rainfall can vary significantly with the rate of seed dispersal and the width of the dispersal kernel, and we compare a selection of ecologically relevant kernels to examine this variation in pattern resilience.
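For context, a commonly used non-dimensional form of the Klausmeier model on a hillslope is sketched below; the exact scaling and notation vary between papers, so this should be read as an illustrative form rather than the precise system analysed in the article:

```latex
\begin{align}
\underbrace{\frac{\partial u}{\partial t}}_{\text{plant growth}}
  &= \underbrace{w u^2}_{\text{water uptake}}
   - \underbrace{B u}_{\text{plant loss}}
   + \underbrace{\frac{\partial^2 u}{\partial x^2}}_{\text{local seed dispersal}}, \\
\underbrace{\frac{\partial w}{\partial t}}_{\text{water balance}}
  &= \underbrace{A}_{\text{rainfall}}
   - \underbrace{w}_{\text{evaporation}}
   - \underbrace{w u^2}_{\text{uptake}}
   + \underbrace{\nu \frac{\partial w}{\partial x}}_{\text{downhill flow}},
\end{align}
```

where \(u\) is plant density and \(w\) is water density. A non-local extension of the kind described in the abstract replaces the diffusion term by a convolution against a dispersal kernel \(\phi\), for example \(C\,(\phi * u - u)\) with

```latex
(\phi * u)(x,t) = \int_{-\infty}^{\infty} \phi(x - y)\, u(y,t)\, \mathrm{d}y,
```

where \(\phi\) is a probability density function for seed dispersal distance and \(C\) a dispersal rate; the specific kernels and parameters compared in the paper are given in the full text.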
| Original language | English |
| --- | --- |
| Pages (from-to) | 151-161 |
| Number of pages | 11 |
| Journal | Journal of Theoretical Biology |
| Volume | 481 |
| Early online date | 4 Oct 2018 |
| DOIs | |
| Publication status | Published - 21 Nov 2019 |