This is one of the 100 recipes of the IPython Cookbook, the definitive guide to high-performance scientific computing and data science in Python.
import numpy as np
import matplotlib as mpl
import matplotlib.pyplot as plt
%matplotlib inline
N = 100 # maximum population size
a = .5/N # birth rate
b = .5/N # death rate
nsteps = 1000
x = np.zeros(nsteps)
x[0] = 25
for t in range(nsteps - 1):
    if 0 < x[t] < N-1:
        # Is there a birth?
        birth = np.random.rand() <= a*x[t]
        # Is there a death?
        death = np.random.rand() <= b*x[t]
        # We update the population size.
        x[t+1] = x[t] + 1*birth - 1*death
    # The evolution stops if we reach $0$ or $N$.
    else:
        x[t+1] = x[t]
plt.figure(figsize=(6,3));
plt.plot(x);
We see that, at each time step, the population size can stay stable, increase by 1, or decrease by 1.
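Since the birth and death events are independent Bernoulli draws, the one-step transition probabilities from a given population size can be written down explicitly. Here is a minimal sketch (the helper `transition_probs` is not part of the recipe, just an illustration):

```python
import numpy as np

N = 100
a = b = .5 / N  # symmetric birth and death rates, as above

def transition_probs(x):
    """One-step transition probabilities from population size x,
    assuming birth and death are independent Bernoulli events."""
    p_birth = a * x  # probability that a birth occurs
    p_death = b * x  # probability that a death occurs
    p_up = p_birth * (1 - p_death)    # net +1: birth without death
    p_down = p_death * (1 - p_birth)  # net -1: death without birth
    p_stay = 1 - p_up - p_down        # net 0: neither event, or both
    return p_up, p_stay, p_down

p_up, p_stay, p_down = transition_probs(25)
```

Because $a = b$, the chain moves up and down with equal probability: its expected increment is zero as long as it stays away from the boundaries.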
Now, we simulate many independent trials of this Markov chain. We could run the previous simulation in a loop over trials, but that would be very slow (two nested `for` loops). Instead, we vectorize the simulation by considering all independent trials at once. There is a single loop over time; at every time step, we update all trials simultaneously with vectorized operations. The vector x now contains the population sizes of all trials at a particular time. At initialization, the population sizes are set to random numbers between $0$ and $N$.
ntrials = 100
x = np.random.randint(size=ntrials,
                      low=0, high=N)
def simulate(x, nsteps):
    """Run the simulation."""
    for _ in range(nsteps - 1):
        # Which trials to update?
        upd = (0 < x) & (x < N-1)
        # In which trials do births occur?
        birth = 1*(np.random.rand(ntrials) <= a*x)
        # In which trials do deaths occur?
        death = 1*(np.random.rand(ntrials) <= b*x)
        # We update the population size for all trials.
        x[upd] += birth[upd] - death[upd]
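The key trick in `simulate` is the boolean mask `upd`, which restricts the in-place update to the trials that are still active. A toy sketch of this NumPy idiom, on a hand-made array (the values here are arbitrary, chosen only to show the effect):

```python
import numpy as np

N = 10
x = np.array([0, 3, 5, 7, 10])
upd = (0 < x) & (x < N - 1)   # boolean mask: [False, True, True, True, False]
delta = np.array([1, 1, -1, 1, 1])
x[upd] += delta[upd]          # only the masked entries change
# x is now [0, 4, 4, 8, 10]: the boundary entries 0 and 10 are untouched
```

Fancy indexing with a boolean mask on both sides of `+=` keeps the frozen trials exactly as they are, without any Python-level branching.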
bins = np.linspace(0, N, 25);
plt.figure(figsize=(12,3));
nsteps_list = [10, 1000, 10000]
for i, nsteps in enumerate(nsteps_list):
    plt.subplot(1, len(nsteps_list), i + 1);
    simulate(x, nsteps)
    plt.hist(x, bins=bins);
    plt.xlabel("Population size");
    if i == 0:
        plt.ylabel("Histogram");
    plt.title("{0:d} time steps".format(nsteps));
Whereas the population sizes initially look uniformly distributed between $0$ and $N$, they appear to converge to $0$ or $N$ after a sufficiently long time. This is because the states $0$ and $N$ are absorbing: once reached, the chain cannot leave those states. Furthermore, these states can be reached from any other state.
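We can check this numerically. Because $a = b$, the chain is a martingale while it stays in the interior, so the probability of being absorbed at the top boundary should be close to the initial size divided by that boundary. A minimal sketch, not from the recipe, using a smaller $N$ so that virtually every trial is absorbed within the simulated horizon:

```python
import numpy as np

# Smaller chain so absorption happens quickly; rates stay symmetric.
N = 20
a = b = .5 / N
ntrials = 2000
x0 = 5
x = np.full(ntrials, x0)
for _ in range(20000):
    upd = (0 < x) & (x < N - 1)  # only non-absorbed trials evolve
    birth = 1*(np.random.rand(ntrials) <= a*x)
    death = 1*(np.random.rand(ntrials) <= b*x)
    x[upd] += birth[upd] - death[upd]

# Fraction of trials absorbed at a boundary: should be close to 1.
frac_absorbed = np.mean((x == 0) | (x == N - 1))
# Martingale argument: P(absorbed at top) should be close to x0 / (N - 1).
frac_top = np.mean(x == N - 1)
```

Note that with the update condition `x < N-1`, the top absorbing state of the simulated chain is effectively $N-1$, which is why the comparison above uses `N - 1` rather than `N`.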
You'll find all the explanations, figures, references, and much more in the book (to be released later this summer).
IPython Cookbook, by Cyrille Rossant, Packt Publishing, 2014 (500 pages).