Poisson process: Introduction

 We consider stochastic processes with continuous time. For example,

  • The number of births (or deaths) in a population by time t.
  • The number of phone calls at an office by time t.
  • The number of radioactive particles detected by a Geiger counter by time t.
The Poisson process is a basic stochastic process that can be used to model the above phenomena. Let's define it.

The Poisson process

Definition

Let N(t) be a time-dependent random variable taking non-negative integer values, and let λ > 0 be a constant. Here, t represents time. Suppose the probability mass function of N(t) is given by

$$p_n(t) = \Pr(N(t) = n) = \frac{(\lambda t)^n e^{-\lambda t}}{n!}, \quad (t \geq 0,\; n = 0, 1, 2, \dots). \tag{Eq:PoissonProc}$$
A stochastic process {N(t)} characterized by the probability distribution (Eq:PoissonProc) is called a Poisson process.
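For concreteness, here is a minimal Python sketch that evaluates (Eq:PoissonProc) numerically and checks that the probabilities sum to (approximately) 1. The function name poisson_pmf and the parameter values are just for illustration.

```python
from math import exp, factorial

def poisson_pmf(n, t, lam):
    """p_n(t) = (lam*t)^n * exp(-lam*t) / n!, as in (Eq:PoissonProc)."""
    return (lam * t) ** n * exp(-lam * t) / factorial(n)

lam, t = 0.1, 30.0  # example rate and time
probs = [poisson_pmf(n, t, lam) for n in range(100)]
print(sum(probs))   # ~1.0: the p_n(t) form a probability distribution over n
```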

Example: Let N(t) be the number of phone calls at an office by time t. Then N(t) can be modeled as a Poisson process. (The same applies to other examples above.)

Why is it called the "Poisson" process? Because the probability distribution $p_n(t)$ is the Poisson distribution with parameter $\lambda t$, which is the mean of N(t). You should check that
$$\mu(t) = E[N(t)] = \sum_{n=0}^{\infty} n\, p_n(t) = \lambda t, \tag{Eq:Mean}$$
where $E[\cdot]$ represents the expectation value of the random variable in its argument. The variance is
$$V[N(t)] = E\!\left[(N(t) - \mu(t))^2\right] = \lambda t. \tag{Eq:Var}$$
What can we learn from this?
  • According to (Eq:Mean), the mean of N(t) is proportional to time t. Thus, on average, N(t) grows linearly with time.
  • According to (Eq:Var), the variance of N(t) is also proportional to time t. Thus, as t increases, the spread among different samples of N(t) becomes larger.
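As a quick numerical sanity check of (Eq:Mean) and (Eq:Var), one can draw samples of N(t) at a fixed time t directly from the Poisson distribution with parameter λt; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t = 0.1, 50.0                            # example rate and time
samples = rng.poisson(lam * t, size=100_000)  # N(t) ~ Poisson(lam * t)

print(samples.mean())  # ~ lam*t = 5.0, cf. (Eq:Mean)
print(samples.var())   # ~ lam*t = 5.0, cf. (Eq:Var)
```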
Here is a sample path of a Poisson process (with λ=0.1).



We can see step-wise increases of N(t) at random intervals.

Simulating Poisson processes

(See this page for details.)
How can we simulate a Poisson process with parameter λ, such as in the figure above? Here, I explain a rough, approximate method without going into details. In later posts, we will see why this method correctly (yet approximately) simulates the Poisson process and why it is a rough approximation. Since we cannot represent continuous variables on a computer, we must discretize them. In the present case, it is the time variable t that needs to be discretized. So, we consider discrete time steps:
t = 0, δt, 2δt, 3δt, …, where δt is some small positive real number (floating-point number). As the initial condition, we assume that N(0) = 0.
Now, suppose we have N(kδt) = n for some k = 0, 1, 2, …. We determine the value of N((k+1)δt) in the following manner:
  1. Generate a random number r from the uniform distribution in [0, 1) (i.e., between 0 and 1, including 0, excluding 1).
  2. If r<λδt, then we set N((k+1)δt)=n+1, otherwise we set N((k+1)δt)=n.
By this procedure, the value of N((k+1)δt) is incremented by 1 with probability λδt, and stays the same (N((k+1)δt) = n) with probability 1 − λδt. Let's see an example (a Python sketch of the full procedure follows the example).
  1. Suppose we have λ=0.1 and δt=0.1. Thus, λδt=0.01.
  2. We start from N(0)=0.
  3. Generate a random number from [0, 1). Say, we obtained r=0.2358. Since this is greater than λδt=0.01, we set N(δt)=N(0)+0=0.
  4. Generate another random number. Say we obtained r=0.0052103<λδt. Then, we set N(2δt)=N(δt)+1=1.
  5. Generate another random number. Say we obtained r=0.025139>λδt. Then, we set N(3δt)=N(2δt)+0=1.
  6. We repeat this procedure over and over.
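Here is a minimal Python sketch of the procedure above. The function name simulate_poisson_approx and its arguments are just for illustration.

```python
import random

def simulate_poisson_approx(lam, dt, n_steps, seed=None):
    """Approximate Poisson-process path on the time grid t = 0, dt, 2*dt, ...

    At each step, N increases by 1 with probability lam*dt (Step 2 above);
    otherwise it stays the same. Intended for small lam*dt.
    """
    rng = random.Random(seed)
    path = [0]                      # initial condition N(0) = 0
    for _ in range(n_steps):
        r = rng.random()            # uniform random number in [0, 1)
        path.append(path[-1] + (1 if r < lam * dt else 0))
    return path

# Example: lam = 0.1, dt = 0.1, simulated up to t = 100
path = simulate_poisson_approx(lam=0.1, dt=0.1, n_steps=1000, seed=42)
print(path[-1])  # N(100); on average about lam * t = 10
```

The returned list contains N(0), N(δt), N(2δt), …, so plotting it against k·δt gives a step-wise path like the figure above.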

Since the quantity λδt serves as a probability, this method is applicable only when λδt ≤ 1 (otherwise λδt (> 1) cannot be interpreted as a probability). For example, if we have λ = 20 and δt = 0.1, then λδt = 2 > 1, so that r < λδt for any random number r ∈ [0, 1). This means N(kδt) is always incremented by 1 regardless of the random numbers, which is nonsense. To use a larger λ value, we replace r < λδt with $r < 1 - e^{-\lambda \delta t}$ (< 1) in Step 2. Note that when λδt is sufficiently small, the Maclaurin expansion gives
$$1 - e^{-\lambda \delta t} = \lambda \delta t + O(\delta t^2).$$
That is, λδt is the first-order approximation of $1 - e^{-\lambda \delta t}$. In a later post, we will see why $1 - e^{-\lambda \delta t}$ is indeed the correct probability of an increment. (Hint: $1 - e^{-\lambda \delta t}$ is the cumulative distribution function of the exponential distribution with parameter λ, evaluated at δt.)
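A minimal variant of the earlier sketch that uses $1 - e^{-\lambda \delta t}$ as the increment probability (again, the function name is just for illustration):

```python
import math
import random

def simulate_poisson_exact_prob(lam, dt, n_steps, seed=None):
    """Like simulate_poisson_approx, but increments with probability
    1 - exp(-lam*dt), which stays below 1 for any lam*dt > 0."""
    rng = random.Random(seed)
    p_inc = 1.0 - math.exp(-lam * dt)
    path = [0]                      # initial condition N(0) = 0
    for _ in range(n_steps):
        path.append(path[-1] + (1 if rng.random() < p_inc else 0))
    return path

# Even for lam*dt = 2, the increment probability is 1 - e^{-2} ≈ 0.86 < 1
path = simulate_poisson_exact_prob(lam=20, dt=0.1, n_steps=100, seed=1)
```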
