Martingale is an abstraction of "fair play." If a stochastic process is a martingale, some problems can be greatly simplified.
We continue our study of branching processes. $X_n$ is the random variable representing the population size of the $n$-th generation ($X_0 = 1$ is the initial condition).
Martingales
Consider the following conditional expectation:
$$E[X_{n+1} \mid X_0, X_1, \ldots, X_n] = \sum_{k} k\, P(X_{n+1} = k \mid X_0, X_1, \ldots, X_n).$$
Here the sum is over all possible values $k$ of $X_{n+1}$. Note that this conditional expectation itself is a random variable as it is conditioned on the random variables $X_0, X_1, \ldots, X_n$ (not on their values). This expectation doesn't really depend on $X_0, X_1, \ldots, X_{n-1}$, but it does depend on $X_n$ (the Markov property!). Each individual out of the $X_n$ individuals produces $\mu$ offspring on average, so the above conditional expectation should be
$$E[X_{n+1} \mid X_0, X_1, \ldots, X_n] = \mu X_n.$$
Let's introduce a new random variable:
$$Y_n = \frac{X_n}{\mu^n}.$$
Then, the above conditional expectation can be rearranged into
$$E[\mu^{n+1} Y_{n+1} \mid X_0, X_1, \ldots, X_n] = \mu \cdot \mu^n Y_n = \mu^{n+1} Y_n.$$
Cancelling the common factor $\mu^{n+1}$, we obtain
$$E[Y_{n+1} \mid X_0, X_1, \ldots, X_n] = Y_n. \tag{Eq:Martingale}$$
Thus, the conditional expectation of $Y_{n+1}$ is $Y_n$. In general, a sequence of random variables $Y_0, Y_1, Y_2, \ldots$ that satisfies Eq. (Eq:Martingale) is called a martingale (with respect to the random variable sequence $X_0, X_1, X_2, \ldots$).
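To see Eq. (Eq:Martingale) in action, here is a small Monte Carlo sketch. The function name, the Poisson offspring distribution, and all parameter values are illustrative choices (the martingale property holds for any offspring distribution with mean $\mu$):

```python
import math
import random

def simulate_branching(mu, generations, trials, seed=0):
    """Estimate E[Y_n] = E[X_n / mu^n] for a Galton-Watson branching process."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's algorithm for sampling a Poisson(lam) variate.
        L, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    totals = [0.0] * (generations + 1)
    for _ in range(trials):
        x = 1               # X_0 = 1
        totals[0] += 1.0    # Y_0 = X_0 / mu^0 = 1
        for n in range(1, generations + 1):
            x = sum(poisson(mu) for _ in range(x))  # each individual reproduces
            totals[n] += x / mu**n                  # Y_n = X_n / mu^n
    return [t / trials for t in totals]

means = simulate_branching(mu=1.5, generations=6, trials=2000)
# Since Y_n is a martingale, every E[Y_n] should stay close to Y_0 = 1,
# even though E[X_n] = mu^n grows geometrically.
```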
The gambling martingale
The term "martingale" originates from a strategy in gambling (see Figure 1 below). A gambler bets 1 dollar in the first game. She doubles the bet in each subsequent game. That is, the bet is 2 dollars in the second game, 4 dollars in the third, etc. Assume that her first win is at the $n$-th bet. This means she wins $2^{n-1}$ dollars after losing $1 + 2 + \cdots + 2^{n-2} = 2^{n-1} - 1$ dollars. Thus, in total, she has won 1 ($= 2^{n-1} - (2^{n-1} - 1)$) dollar. This is a guaranteed method of winning! (Not recommended for obvious reasons.)
Figure 1: The gambling martingale.
Let $Y_n$ be the gambler's total asset (or debt) at the $n$-th bet. Each $Y_n$ is a random variable. Their possible values are:
$$Y_1 \in \{1, -1\}, \quad Y_2 \in \{1, -3\}, \quad Y_3 \in \{1, -7\}, \quad \ldots$$
In general, $Y_n = 1$ (she has won a bet by the $n$-th game and stopped betting), or
$$Y_n = -(1 + 2 + \cdots + 2^{n-1}) = 1 - 2^n$$
(she has lost all $n$ bets) for $n = 1, 2, 3, \ldots$. The all-loss outcome has probability $1/2^n$, so $Y_n = 1$ has probability $1 - 1/2^n$. Thus, the (unconditional) expectation of $Y_n$ is
$$E[Y_n] = 1 \cdot \left(1 - \frac{1}{2^n}\right) + (1 - 2^n) \cdot \frac{1}{2^n} = 0.$$
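A quick sanity check of $E[Y_n] = 0$: the sketch below simulates the doubling strategy directly (the function name, seed, and trial count are arbitrary choices for illustration):

```python
import random

def simulate_martingale_bets(n_bets, trials, seed=1):
    """Simulate the doubling strategy: bet 1, 2, 4, ... until the first win,
    then stop. Return the average asset E[Y_n] after n_bets games."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        asset, bet, won = 0, 1, False
        for _ in range(n_bets):
            if won:
                break              # she stops betting after her first win
            if rng.random() < 0.5:
                asset += bet       # win: total asset becomes exactly +1
                won = True
            else:
                asset -= bet       # loss: debt deepens
                bet *= 2           # double the next bet
        total += asset
    return total / trials

avg = simulate_martingale_bets(n_bets=5, trials=100_000)
print(avg)
# E[Y_n] = 0 for every n: the "guaranteed" 1-dollar win is exactly offset
# by the rare all-loss path, which costs 2^n - 1 dollars.
```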
The conditional expectation of, say, $Y_3$ given $Y_2$ is calculated as follows (see Figure 1). If $Y_2 = 1$, she has already won and no longer bets, so $Y_3 = 1$. If $Y_2 = -3$, she bets 4 dollars, so $Y_3 = -3 + 4 = 1$ or $Y_3 = -3 - 4 = -7$, each with probability $1/2$. Hence,
$$E[Y_3 \mid Y_2 = 1] = 1, \qquad E[Y_3 \mid Y_2 = -3] = \frac{1}{2} \cdot 1 + \frac{1}{2} \cdot (-7) = -3.$$
In either case, $E[Y_3 \mid Y_2] = Y_2$.
Similarly (exercise!), the conditional expectation of $Y_2$ given $Y_1$ is
$$E[Y_2 \mid Y_1] = Y_1.$$
In general, we have
$$E[Y_{n+1} \mid Y_1, Y_2, \ldots, Y_n] = Y_n.$$
Thus, $\{Y_n\}$ is a martingale with respect to itself.
Example (Eg:RW). Let $X_n$ be the random variable for the position of a walker in a symmetric random walk with $X_0 = 0$. Let's show that $\{X_n\}$ is a martingale with respect to itself. That is,
$$E[X_{n+1} \mid X_0, X_1, \ldots, X_n] = X_n.$$
To show this, note that, given whatever value of $X_n$, the value of $X_{n+1}$ is either $X_n + 1$ or $X_n - 1$, each with probability $1/2$. Therefore,
$$E[X_{n+1} \mid X_0, X_1, \ldots, X_n] = \frac{1}{2}(X_n + 1) + \frac{1}{2}(X_n - 1) = X_n.$$
□
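The martingale property of the symmetric walk can also be checked empirically: simulate many walks, group them by the value of $X_n$, and average $X_{n+1}$ within each group (the function name and parameters below are illustrative):

```python
import random
from collections import defaultdict

def check_rw_martingale(n, trials, seed=2):
    """Empirically check E[X_{n+1} | X_n = x] = x for a symmetric random walk."""
    rng = random.Random(seed)
    sums = defaultdict(float)
    counts = defaultdict(int)
    for _ in range(trials):
        x = 0                           # X_0 = 0
        for _ in range(n):
            x += rng.choice((-1, 1))    # symmetric +/-1 steps
        x_next = x + rng.choice((-1, 1))
        sums[x] += x_next               # accumulate X_{n+1} per value of X_n
        counts[x] += 1
    return {x: sums[x] / counts[x] for x in sorted(counts)}

cond_means = check_rw_martingale(n=4, trials=200_000)
# For each observed value x of X_4, the conditional mean of X_5 is close to x.
```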
Stopping rules
Example (Eg:Y). Consider a random process $X_0, X_1, X_2, \ldots$. We decide to stop the process at time $T$ when specified conditions are met.
- We may want to stop at a specific time. In this case, $T$ is known, and $E[X_T]$ is of interest.
- We may want to stop when $X_n$ becomes greater than or equal to a certain value. In this case, the value (or range) of $X_T$ is known when the process stops, but the stopping time $T$ is unknown, and we are interested in $E[T]$, the expected stopping time.
Consider a symmetric (Markov chain) random walk that starts at position $X_0 = k$. Let $X_n$ be the random variable of the position of the walk at step $n$. The random variable $Y_n = X_n^2 - n$ is a martingale with respect to $X_n$. That is,
$$E[Y_{n+1} \mid X_0, X_1, \ldots, X_n] = Y_n.$$
To see this, note that $X_{n+1}$ is either $X_n + 1$ or $X_n - 1$, each with probability $1/2$ (see the above Example (Eg:RW)). Therefore,
$$E[Y_{n+1} \mid X_0, X_1, \ldots, X_n] = \frac{1}{2}\left[(X_n + 1)^2 - (n + 1)\right] + \frac{1}{2}\left[(X_n - 1)^2 - (n + 1)\right] = X_n^2 + 1 - (n + 1) = X_n^2 - n = Y_n.$$
□
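Since $Y_n = X_n^2 - n$ is a martingale, its expectation stays at $Y_0 = k^2$ for all $n$. A brief simulation sketch (function name, seed, and parameters are arbitrary illustrative choices):

```python
import random

def mean_Yn(k, n, trials, seed=3):
    """Estimate E[Y_n] = E[X_n^2 - n] for a symmetric walk started at X_0 = k."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        x = k
        for _ in range(n):
            x += rng.choice((-1, 1))   # symmetric +/-1 steps
        total += x * x - n             # Y_n = X_n^2 - n
    return total / trials

est = mean_Yn(k=3, n=10, trials=100_000)
print(est)  # close to Y_0 = k^2 = 9, for any choice of n
```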
Theorem (Stopping Theorem)
If $\{Y_n\}$ is a martingale with respect to $\{X_n\}$ and $T$ is a well-defined stopping time such that $P(T < \infty) = 1$, $E[|Y_T|] < \infty$, and $E[Y_n \mid T > n]\, P(T > n) \to 0$ as $n \to \infty$, then
$$E[Y_T] = E[Y_0].$$
Example. The symmetric random walk starts at $X_0 = k$ with $0 < k < N$. Suppose the walk stops at step $T$ if either $X_T = 0$ or $X_T = N$ is satisfied (this is the Gambler's Ruin problem. Remember?). What is the expected stopping time, $E[T]$? From the above Example (Eg:Y), we know that $Y_n = X_n^2 - n$ is a martingale with respect to $X_n$. We can show that this martingale satisfies all the conditions for the Stopping Theorem (exercise!). Thus,
$$E[Y_T] = E[Y_0] = k^2.$$
On the other hand, we have
$$E[Y_T] = E[X_T^2 - T] = E[X_T^2] - E[T] = N^2 \cdot \frac{k}{N} - E[T] = Nk - E[T],$$
where we used the result from the Gambler's Ruin Problem that $P(X_T = N) = k/N$ (and $X_T = 0$ otherwise). By combining these results, we have the expected stopping time:
$$E[T] = Nk - k^2 = k(N - k).$$
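The prediction $E[T] = k(N - k)$ is easy to test by simulating the ruin process directly (the function name, seed, and the particular values $k = 3$, $N = 10$ are illustrative choices):

```python
import random

def mean_stopping_time(k, N, trials, seed=4):
    """Average stopping time T of a symmetric walk started at k,
    stopped when it first hits 0 or N."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        x, t = k, 0
        while 0 < x < N:               # keep walking until absorbed
            x += rng.choice((-1, 1))
            t += 1
        total += t
    return total / trials

k, N = 3, 10
avg_T = mean_stopping_time(k, N, trials=20_000)
print(avg_T, k * (N - k))
# The simulated mean should be close to k(N - k) = 21.
```

Note the simulation needs no knowledge of the martingale argument; the agreement of the two numbers is what the Stopping Theorem predicts.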