To start, consider a Bernoulli process: a discrete-time series in which each value is IID Bernoulli. We can then study some of its properties:

- The waiting time for the 1st event is distributed **geometrically**.
- The waiting time for the nth event is distributed as the sum of geometric distributions, which is **negative binomial**.
- The number of events in a given period of time is distributed **binomially**.

(You should be able to derive these distributions easily.)
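As a quick sanity check, the first and third facts can be verified by simulation. This is a minimal sketch (the function names and parameter values here are mine, not from the post): draw a Bernoulli process and compare empirical means against the geometric mean $1/p$ and the binomial mean $np$.

```python
import random

def first_event_time(p, rng):
    """Number of trials until the 1st event: should be Geometric(p)."""
    t = 1
    while rng.random() >= p:
        t += 1
    return t

def count_events(p, n_steps, rng):
    """Number of events in n_steps trials: should be Binomial(n_steps, p)."""
    return sum(rng.random() < p for _ in range(n_steps))

rng = random.Random(0)
p = 0.2
waits = [first_event_time(p, rng) for _ in range(100_000)]
counts = [count_events(p, 50, rng) for _ in range(10_000)]

print(sum(waits) / len(waits))    # should be close to 1/p = 5
print(sum(counts) / len(counts))  # should be close to 50 * p = 10
```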

(Note that this is not specific to a time series -- one has the same results for a spatial lattice or for some general abstract set of points.)


How would one generalize this to continuous time? With continuous time you can't really talk about the result at each point in time being "Bernoulli", or about the points being IID. But let's do it anyway. Suppose an event has a $\mu\,dt$ chance of occurring in each timespan $dt$. Then the chance that the first event occurs at time $t$ is (by the geometric distribution) ${(1 - \mu \,dt)^{t/dt}}\mu \,dt$, which tends to $\mu {e^{ - \mu t}}\, dt$ as $dt \to 0$, i.e. a probability density $\mu e^{-\mu t}$. This is called the **exponential distribution**.
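This limit can be seen numerically. Below is a sketch (names and parameter values are my own): discretize time into small steps of size $dt$, fire events with probability $\mu\,dt$ per step, and compare the first-event time against the exponential mean $1/\mu$ and survival function $e^{-\mu t}$.

```python
import random, math

def first_event_time_continuous(mu, dt, rng):
    """Time of 1st event in a dt-discretized process: ~ Exponential(mu) as dt -> 0."""
    t = 0.0
    while rng.random() >= mu * dt:
        t += dt
    return t

rng = random.Random(1)
mu, dt = 2.0, 1e-3
samples = [first_event_time_continuous(mu, dt, rng) for _ in range(20_000)]

print(sum(samples) / len(samples))  # should be close to 1/mu = 0.5
# Empirical survival P(T > 0.5) vs the exponential survival exp(-mu * 0.5):
frac = sum(s > 0.5 for s in samples) / len(samples)
print(frac, math.exp(-mu * 0.5))
```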

The waiting time till the nth event is analogously the sum of $n$ exponential waiting times, and its distribution can be computed through the standard MGF route. It is left as an exercise to the reader to show that the sum of $\alpha$ IID exponential random variables, each with rate $\mu$, is given by the **Gamma distribution** $\Gamma(\alpha, \mu)$, with density (writing $\beta$ for the rate):

$$\Gamma(\alpha,\beta):\quad f(t) = \frac{1}{(\alpha - 1)!}\,\beta^\alpha t^{\alpha-1}e^{-\beta t}$$
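The Gamma claim can also be checked by simulation. A sketch (names and parameters are my own): the sum of $\alpha$ IID exponentials with rate $\mu$ should have the Gamma mean $\alpha/\mu$ and variance $\alpha/\mu^2$.

```python
import random

def nth_event_time(alpha, mu, rng):
    """Waiting time to the alpha-th event: sum of alpha IID Exp(mu) draws."""
    return sum(rng.expovariate(mu) for _ in range(alpha))

rng = random.Random(2)
alpha, mu = 4, 2.0
samples = [nth_event_time(alpha, mu, rng) for _ in range(100_000)]

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # should be close to alpha/mu = 2.0 and alpha/mu**2 = 1.0
```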

What does our notion of independence translate to? The idea that whether an event occurs at some time should not depend on whether one occurred at any other time. The natural way to write this in a way that makes sense for continuous distributions is in terms of the waiting time $T$ for the first event:

\[P(T>t+s|T>t) = P(T>s)\]

This is known as **memorylessness**. Indeed, one can check that the only memoryless discrete distribution is the geometric, and the only memoryless continuous distribution is the exponential.
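The property $P(T>t+s \mid T>t) = P(T>s)$ can be verified numerically for the exponential. A sketch (parameter values here are arbitrary, chosen by me):

```python
import random, math

rng = random.Random(3)
mu, t, s = 1.5, 0.4, 0.7
samples = [rng.expovariate(mu) for _ in range(200_000)]

past_t = [x for x in samples if x > t]
cond = sum(x > t + s for x in past_t) / len(past_t)  # P(T > t+s | T > t)
uncond = sum(x > s for x in samples) / len(samples)  # P(T > s)
print(cond, uncond, math.exp(-mu * s))  # all three should roughly agree
```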

OK -- what about the number of events in the continuous case? Well, in some interval of size $T$, the probability that the number of events equals some $n$ is (by the binomial distribution):

\[\binom{T/dt}{n} {(\mu \,dt)^n}{(1 - \mu \,dt)^{T/dt - n}}\]

Taking the limit $dt \to 0$, it is easy to see that this equals:

\[\frac{(\mu T)^n e^{-\mu T}}{n!}\]

Which is the **Poisson distribution** with rate parameter $\mu T$.
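To close the loop, one can simulate a Poisson process directly by summing exponential inter-arrival gaps, and compare the distribution of counts in a window against the Poisson pmf. A sketch (names and parameters are mine):

```python
import random, math

def count_in_window(mu, T, rng):
    """Count events of a rate-mu process in [0, T] via exponential gaps."""
    t, n = rng.expovariate(mu), 0
    while t <= T:
        n += 1
        t += rng.expovariate(mu)
    return n

rng = random.Random(4)
mu, T = 1.0, 3.0
counts = [count_in_window(mu, T, rng) for _ in range(100_000)]

lam = mu * T
for n in range(6):
    empirical = counts.count(n) / len(counts)
    poisson = lam ** n * math.exp(-lam) / math.factorial(n)
    print(n, round(empirical, 4), round(poisson, 4))  # columns should match closely
```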

Here's a table of analogies between memoryless discrete and continuous processes:


| | Discrete time | Continuous time |
|---|---|---|
| Overall phenomenon | Bernoulli process | Poisson process |
| Single event | Bernoulli distribution | - |
| Waiting time (1st event) | Geometric distribution | Exponential distribution |
| Waiting time (nth event) | Negative binomial distribution | Gamma distribution |
| Number of events | Binomial distribution | Poisson distribution |

But isn't the continuous analog of the binomial distribution (and of many other distributions) the normal distribution? Don't conflate discrete time with discrete number. The binomial and Poisson distributions above are both discrete distributions; the Poisson is simply the relevant one for the continuous-time process. Discreteness of time and discreteness of the count are completely unrelated notions.
