Poisson process: Arrival times

Let us view the Poisson process \(\{N(t)\}\) through the phone call problem: \(N(t)\) is the random variable representing the number of phone calls received by time \(t\). The arrival time \(T_n\) of the \(n\)-th call is then defined as the earliest time at which \(N(t) = n\). The inter-arrival time \(Q_n = T_n - T_{n-1}\) is the time between two successive calls (i.e., between the \((n-1)\)-th and the \(n\)-th). In particular, \(Q_1 = T_1 - 0 = T_1\).

We have \[T_n = Q_1 + Q_2 + \cdots + Q_n.\tag{Eq:QnTn} \]

Note that the \(\{Q_n\}\) are independent and identically distributed (i.i.d.) random variables. Therefore, we first determine the distribution of \(Q_n\) and then combine them via (Eq:QnTn) to find the distribution of \(T_n\).
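Before the derivation, here is a minimal simulation sketch of this setup. The rate \(\lambda = 2\) and the number of calls are arbitrary choices, and the exponential inter-arrival law used to generate the \(Q_n\)'s anticipates the result derived below.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0      # rate parameter lambda (arbitrary choice for illustration)
n_calls = 10   # number of calls to simulate

# Inter-arrival times Q_1, ..., Q_n; as derived below, they are i.i.d. Exp(lam)
Q = rng.exponential(scale=1.0 / lam, size=n_calls)

# Arrival times T_n = Q_1 + ... + Q_n, per (Eq:QnTn)
T = np.cumsum(Q)

# N(t) = number of calls received by time t
def N(t):
    return np.searchsorted(T, t, side="right")

print("T_5 =", T[4])        # arrival time of the 5th call
print("N(T_5) =", N(T[4]))  # exactly 5 calls have arrived by time T_5
```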

Inter-arrival times \(Q_n\)

Now, observe the following.

  • The probability of having no call by time \(t\) is \[p_0(t) = e^{-\lambda t}.\]
  • Thus, the probability of having the first call by time \(t\) is\[\begin{eqnarray}\Pr(T_1\leq t) &=& \Pr(\text{``1 or more calls arrived by time $t$''})\\ &=& \Pr(N(t) \geq 1)\\ &=& p_1(t) + p_2(t) + \cdots = \sum_{n=1}^{\infty}p_n(t)\\ &=& 1 - p_0(t)\\ &=& 1 - e^{-\lambda t}. \end{eqnarray}\]
  • Since the increments of the Poisson process are independent and stationary, the process starts afresh at each arrival. Hence all \(Q_n\)'s are identically distributed, and \[\Pr(Q_n \leq t) = \Pr(Q_1 \leq t) = \Pr(T_1 \leq t).\] In other words, each call is the "first" call since the last call.
  • Thus, the cumulative distribution function of \(Q_n\) is \[\Pr(Q_n \leq t) = 1 - e^{-\lambda t}.\] Accordingly, the density function of \(Q_n\) is \[\rho_{Q_n}(t) = \frac{{d}\Pr(Q_n \leq t)}{{d}t} = \lambda e^{-\lambda t}.\]

  • Therefore, the inter-arrival time \(Q_n\) follows the exponential distribution with parameter \(\lambda\). The mean and variance of \(Q_n\) are \[\begin{eqnarray} \mathbb{E}(Q_n) &=& \frac{1}{\lambda},\tag{eq:mean}\\ \mathbb{V}(Q_n) &=& \frac{1}{\lambda^2}\tag{eq:var}. \end{eqnarray}\]
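The following short numerical check (a sketch; \(\lambda = 2\), the sample size, and the test point \(t\) are arbitrary) compares the empirical distribution of \(Q_n\) with the exponential law, and the sample mean and variance with (eq:mean) and (eq:var).

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 2.0
Q = rng.exponential(scale=1.0 / lam, size=100_000)  # draws of Q_n ~ Exp(lam)

# Empirical CDF at t versus the theoretical value 1 - exp(-lam * t)
t = 0.5
print("Pr(Q <= t) empirical:  ", (Q <= t).mean())
print("Pr(Q <= t) theoretical:", 1 - np.exp(-lam * t))

# Sample mean and variance versus 1/lam and 1/lam^2
print("mean:", Q.mean(), "vs", 1 / lam)
print("var: ", Q.var(), "vs", 1 / lam**2)
```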

Arrival times \(T_n\)

Since we have (Eq:QnTn) above and the \(Q_n\)'s are i.i.d., it is convenient to use the moment generating function to find the distribution of \(T_n\). In fact, we have the following well-known theorem.

Theorem [The sum of exponential variates is a gamma variate]

If i.i.d. random variables \(X_1, X_2, \cdots, X_n\) follow the exponential distribution Exp(\(\lambda\)), then their sum \(Y = X_1 + X_2 + \cdots + X_n\) follows the gamma distribution Gamma(\(n, \lambda\)) (with \(n\) and \(\lambda\) being the shape and rate parameters, respectively). Accordingly, the density function of \(Y\) is \[\rho_{Y}(y) = \frac{\lambda(\lambda y)^{n-1}e^{-\lambda y}}{(n-1)!}.\]
Proof. Exercise. ■
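As a hint, one way to carry out the exercise is the moment generating function approach mentioned above. For \(X \sim\) Exp(\(\lambda\)) and \(s < \lambda\), \[M_X(s) = \mathbb{E}(e^{sX}) = \int_0^{\infty}e^{st}\lambda e^{-\lambda t}\,dt = \frac{\lambda}{\lambda - s}.\] Since \(X_1, X_2, \cdots, X_n\) are independent, \[M_Y(s) = \prod_{i=1}^{n}M_{X_i}(s) = \left(\frac{\lambda}{\lambda - s}\right)^n,\] which is the moment generating function of Gamma(\(n, \lambda\)). As the moment generating function uniquely determines the distribution, the claim follows.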

Using this theorem, we find that \(T_n\) follows a gamma distribution and its density function is \[\rho_{T_n}(t) = \frac{\lambda(\lambda t)^{n-1}e^{-\lambda t}}{(n-1)!}.\]
The mean and variance of \(T_n\) are \[\begin{eqnarray} \mathbb{E}(T_n) &=& \frac{n}{\lambda} (= n\mathbb{E}(Q_n)),\\ \mathbb{V}(T_n) &=& \frac{n}{\lambda^2} (= n\mathbb{V}(Q_n)), \end{eqnarray}\] as expected.
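As a sanity check (a sketch; \(n = 5\), \(\lambda = 2\), and the sample size are arbitrary choices), the simulated mean and variance of \(T_n\) should match \(n/\lambda\) and \(n/\lambda^2\), and summing exponentials should agree with sampling Gamma(\(n, \lambda\)) directly.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, n = 2.0, 5

# T_n as the sum of n i.i.d. Exp(lam) inter-arrival times, per (Eq:QnTn)
T_n = rng.exponential(scale=1.0 / lam, size=(100_000, n)).sum(axis=1)
print("mean:", T_n.mean(), "vs n/lam   =", n / lam)
print("var: ", T_n.var(), "vs n/lam^2 =", n / lam**2)

# Direct draws from Gamma(shape=n, scale=1/lam) should look the same
G = rng.gamma(shape=n, scale=1.0 / lam, size=100_000)
print("gamma mean:", G.mean(), "gamma var:", G.var())
```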
