Branching processes: Martingales and stopping rules

A martingale is an abstraction of a "fair game." If a stochastic process is a martingale, some problems can be greatly simplified.

We continue our study of branching processes. $X_n$ is the random variable representing the population size of the $n$-th generation (with the initial condition $X_0 = 1$).


Martingales

Consider the following conditional expectation: 

$$E(X_{n+1} \mid X_0, X_1, \dots, X_n) = \sum_x x \Pr(X_{n+1} = x \mid X_0, X_1, \dots, X_n).$$

Here the sum is over all possible values of $X_{n+1}$. Note that this conditional expectation is itself a random variable, as it is conditioned on the random variables $X_0, X_1, \dots, X_n$ (not on their values). This expectation does not really depend on $X_0, X_1, \dots, X_{n-1}$, but it does depend on $X_n$ (the Markov property!). Each of the $X_n$ individuals produces $\mu_1$ offspring on average, so the above conditional expectation should be

$$E(X_{n+1} \mid X_0, X_1, \dots, X_n) = X_n \mu_1.$$

Let's introduce a new random variable:

$$Z_n = \frac{X_n}{\mu_1^n} = X_n \mu_1^{-n}.$$

Then, the above conditional expectation can be rearranged into

$$E(Z_{n+1}\mu_1^{n+1} \mid X_0, \dots, X_n) = Z_n \mu_1^{n+1}.$$

Cancelling the common factor $\mu_1^{n+1}$, we obtain

$$E(Z_{n+1} \mid X_0, \dots, X_n) = Z_n. \tag{Eq:Martingale}$$

Thus, the conditional expectation of $Z_{n+1}$ given $X_0, \dots, X_n$ is $Z_n$. In general, a sequence of random variables $\{Z_n\}$ that satisfies Eq. (Eq:Martingale) is called a martingale (with respect to the sequence of random variables $\{X_n\}$).
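The martingale property implies $E(Z_n) = E(Z_0) = 1$ for every $n$, which we can check numerically. The following is a minimal Monte Carlo sketch; the offspring distribution (0, 1, or 2 children with probabilities 0.2, 0.5, 0.3, giving $\mu_1 = 1.1$) is an illustrative assumption, not taken from the text.

```python
import random

def estimate_EZ(mu, n, trials=20000, seed=42):
    """Monte Carlo estimate of E(Z_n) = E(X_n / mu^n) for a branching process.
    Offspring distribution (an illustrative assumption): each individual has
    0, 1, or 2 children with probabilities 0.2, 0.5, 0.3, so mu = 1.1."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        x = 1  # X_0 = 1
        for _ in range(n):
            # each of the x individuals produces offspring independently
            x = sum(rng.choices([0, 1, 2], weights=[0.2, 0.5, 0.3], k=x)) if x else 0
        total += x / mu**n
    return total / trials

print(estimate_EZ(mu=1.1, n=8))  # should stay close to E(Z_0) = 1
```

Because $\{Z_n\}$ is a martingale, the sample mean hovers near 1 for every generation $n$, even though $E(X_n) = \mu_1^n$ itself grows.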

The gambling martingale

The term "martingale" originates from a strategy in gambling (see Figure 1 below). A gambler bets 1 dollar in the first game and doubles the bet in each subsequent game: 2 dollars in the second game, 4 dollars in the third, and so on. Suppose her first win comes at the $n$-th bet. The winning bet of $2^{n-1}$ dollars pays back $2^n$ dollars, while she has staked $1 + 2 + 4 + \cdots + 2^{n-1} = 2^n - 1$ dollars in total. Thus, her net gain is $2^n - (2^n - 1) = 1$ dollar. This is a guaranteed method of winning! (Not recommended, for obvious reasons.)
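The arithmetic above can be confirmed by simulating the doubling strategy directly. This is a small sketch (the helper name `net_after_first_win` is ours): whatever the sequence of losses, the net gain at the first win is always exactly 1 dollar.

```python
import random

def net_after_first_win(rng):
    """Play the doubling strategy: bet 1, 2, 4, ... until the first win.
    Returns the net gain (payout of the winning bet minus everything staked)."""
    staked = 0
    bet = 1
    while True:
        staked += bet
        if rng.random() < 0.5:       # win: the bet pays back double the stake
            return 2 * bet - staked  # = 2^n - (2^n - 1) = 1
        bet *= 2                     # lose: double the next bet

rng = random.Random(0)
print({net_after_first_win(rng) for _ in range(1000)})  # {1}
```

Every one of the 1000 plays nets exactly 1 dollar, regardless of how long the losing streak lasts (which is why the strategy requires unbounded capital).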

Figure 1: The gambling martingale.


Let $Z_n$ be the gambler's total asset (or debt) after the $n$-th bet. Each $Z_n$ is a random variable. Their possible values are
$$Z_0 = \{1\}, \quad Z_1 = \{0, 2\}, \quad Z_2 = \{-2, 0, 2, 4\}, \quad Z_3 = \{-6, -4, -2, 0, 2, 4, 6, 8\}, \quad \dots$$
In general, $Z_n = \{-2^n + 2, -2^n + 4, \dots, 2^n - 2, 2^n\}$, or
$$Z_n = \{-2^n + 2m + 2 \mid m = 0, 1, 2, \dots, 2^n - 1\}$$
for $n = 1, 2, 3, \dots$. Each of these $2^n$ outcomes has probability $1/2^n$ (distinct win/loss sequences give distinct totals, by uniqueness of binary representations). Thus, the (unconditional) expectation of $Z_n$ is
$$E(Z_n) = \sum_{m=0}^{2^n - 1} \frac{1}{2^n}(-2^n + 2m + 2) = 1.$$
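Since there are only $2^n$ equally likely win/loss sequences, the value set of $Z_n$ and its expectation can be verified by brute-force enumeration. A short sketch (the helper name `zn_values` is ours):

```python
from itertools import product

def zn_values(n):
    """All outcomes of Z_n: start with 1 dollar and bet 2^(k-1) at game k,
    gaining or losing the stake on each coin flip. One outcome per
    win/loss sequence; all outcomes are distinct and equally likely."""
    return sorted(1 + sum(s * 2**k for k, s in enumerate(signs))
                  for signs in product((-1, 1), repeat=n))

vals = zn_values(3)
print(vals)                   # [-6, -4, -2, 0, 2, 4, 6, 8]
print(sum(vals) / len(vals))  # E(Z_3) = 1.0
```

The enumeration reproduces the value set $\{-2^n + 2m + 2\}$ and confirms $E(Z_n) = 1$ exactly, since the $\pm$ terms cancel in pairs.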
The conditional expectation of, say, $Z_2$ given $Z_0, Z_1$ is calculated as follows (see Figure 1):
$$\begin{aligned}
E(Z_2 \mid (Z_0, Z_1) = (1, 2)) &= 4\Pr(Z_2 = 4 \mid (Z_0, Z_1) = (1, 2)) + 0\Pr(Z_2 = 0 \mid (Z_0, Z_1) = (1, 2))\\
&= 4 \cdot \tfrac{1}{2} + 0 \cdot \tfrac{1}{2} = 2,\\
E(Z_2 \mid (Z_0, Z_1) = (1, 0)) &= 2\Pr(Z_2 = 2 \mid (Z_0, Z_1) = (1, 0)) + (-2)\Pr(Z_2 = -2 \mid (Z_0, Z_1) = (1, 0))\\
&= 2 \cdot \tfrac{1}{2} - 2 \cdot \tfrac{1}{2} = 0.
\end{aligned}$$
Hence,
$$E(Z_2 \mid Z_0, Z_1) = \{0, 2\} = Z_1.$$
Similarly (exercise!), the conditional expectation of $Z_3$ given $Z_0, Z_1, Z_2$ is
$$E(Z_3 \mid Z_0, Z_1, Z_2) = \{-2, 0, 2, 4\} = Z_2.$$
In general, we have
$$E(Z_{n+1} \mid Z_0, Z_1, \dots, Z_n) = Z_n.$$
Thus, $\{Z_n\}$ is a martingale with respect to itself.

Example (Eg:RW). Let $X_n$, $n = 0, 1, 2, \dots$, be the random variable for the position of a walker in a symmetric random walk with $X_0 = 0$. Let's show that $X_n$ is a martingale with respect to itself, that is,
$$E(X_{n+1} \mid X_n) = X_n.$$
To show this, note that, whatever the value of $X_n$, the value of $X_{n+1}$ is either $X_n + 1$ or $X_n - 1$, each with probability $\tfrac{1}{2}$. Therefore,
$$E(X_{n+1} \mid X_n) = \tfrac{1}{2}(X_n + 1) + \tfrac{1}{2}(X_n - 1) = X_n.$$

Stopping rules

Example (Eg:Y). Consider a random process $\{X_r\}$, $r = 0, 1, 2, \dots$. We decide to stop when specified conditions are met. For example:
  1. We may want to stop at a specific time $r = T$. In this case, $T$ is known, and $E(X_T)$ is of interest.
  2. We may want to stop when $X_r$ becomes greater than or equal to a certain value. In this case, the value (or range) of $X_T$ is known when the process stops, but the stopping time $T$ is unknown, and we are interested in the expected stopping time $E(T)$.
Consider a symmetric random walk (a Markov chain) that starts at position $k$. Let $X_n$, $n = 0, 1, 2, \dots$, be the random variable for the position of the walk at step $n$. The random variable $Y_n = X_n^2 - n$ is a martingale with respect to $X_n$. That is,
$$E(Y_{n+1} \mid X_n) = Y_n.$$
To see this, note that $X_{n+1}$ is either $X_n + 1$ or $X_n - 1$, each with probability $\tfrac{1}{2}$ (see Example (Eg:RW) above). Therefore,
$$E(Y_{n+1} \mid X_n) = \tfrac{1}{2}(X_n + 1)^2 + \tfrac{1}{2}(X_n - 1)^2 - (n + 1) = X_n^2 + 1 - (n + 1) = X_n^2 - n = Y_n.$$
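Both martingale identities can be checked exactly, since conditioning on $X_n$ leaves only two equally likely next positions. A minimal sketch (the helper name `cond_exp_next` is ours):

```python
def cond_exp_next(x_n, f):
    """Exact conditional expectation E(f(X_{n+1}) | X_n = x_n) for a symmetric
    random walk: the next position is x_n + 1 or x_n - 1, each w.p. 1/2."""
    return 0.5 * f(x_n + 1) + 0.5 * f(x_n - 1)

# X_n is a martingale: E(X_{n+1} | X_n) = X_n
print(cond_exp_next(7, lambda x: x))              # 7.0

# Y_n = X_n^2 - n is a martingale: E(X_{n+1}^2 - (n+1) | X_n) = X_n^2 - n
n, x = 5, 3
print(cond_exp_next(x, lambda v: v * v) - (n + 1))  # 4.0 = 3^2 - 5
```

The second print reproduces the cancellation in the equation above: averaging $(X_n + 1)^2$ and $(X_n - 1)^2$ adds exactly 1, which the $-(n+1)$ term absorbs.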

Theorem (Stopping Theorem)

If $\{Z_n\}$ is a martingale with respect to $\{X_n\}$, $n = 0, 1, \dots$, and $T$ is a stopping time such that
  1. $\Pr(T < \infty) = 1$,
  2. $E(|Z_T|) < \infty$,
  3. $E(Z_n \mid T > n)\Pr(T > n) \to 0$ as $n \to \infty$,
then $E(Z_T) = E(Z_0)$.
Proof. Omitted. (See also Optional stopping theorem.) ■

Example. A symmetric random walk starts at $X_0 = k$. Suppose the walk stops at step $n = T$ when either $X_T = 0$ or $X_T = a$ is satisfied, for some fixed $a > k$ (this is the Gambler's Ruin problem, remember?). What is the expected stopping time $E(T)$? From Example (Eg:Y) above, we know that $Y_n = X_n^2 - n$ is a martingale with respect to $X_n$. We can show that this martingale satisfies all the conditions of the Stopping Theorem (exercise!). Thus,
$$E(Y_T) = E(Y_0) = E(X_0^2 - 0) = k^2.$$
On the other hand, we have
$$\begin{aligned}
E(Y_T) &= E(X_T^2 - T) = E(X_T^2) - E(T)\\
&= \left[0^2 \times \Pr(X_T = 0) + a^2 \times \Pr(X_T = a)\right] - E(T)\\
&= a^2 \times \frac{k}{a} - E(T) = ak - E(T),
\end{aligned}$$
where we used the result from the Gambler's Ruin problem that $\Pr(X_T = a) = k/a$. Combining these results, we have the expected stopping time:
$$E(T) = ak - k^2 = k(a - k).$$
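A quick simulation makes both results concrete. This is a sketch; the parameters $k = 3$, $a = 10$ are arbitrary illustrative choices, and `ruin_time` is our own helper.

```python
import random

def ruin_time(k, a, rng):
    """Run a symmetric random walk from position k until it hits 0 or a.
    Returns (stopping time T, final position X_T)."""
    x, t = k, 0
    while 0 < x < a:
        x += rng.choice((-1, 1))
        t += 1
    return t, x

rng = random.Random(0)
k, a, trials = 3, 10, 20000
results = [ruin_time(k, a, rng) for _ in range(trials)]
mean_T = sum(t for t, _ in results) / trials
p_win = sum(x == a for _, x in results) / trials
print(mean_T)  # should be close to k*(a-k) = 21
print(p_win)   # should be close to k/a = 0.3
```

The sample mean of $T$ approaches $k(a - k) = 21$ and the hitting probability approaches $k/a = 0.3$, matching the Stopping Theorem calculation and the Gambler's Ruin result.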
