Random Walk and Differential Equations (I)

Consider discrete points on the real line, indexed by the integers … -3, -2, -1, 0, 1, 2, … . A drunken man starts at position 0 at time 0. At each time step, he may move one step to the left or right (each with probability p, where p < 1/2), or stay still (with probability 1-2p).

This goes on indefinitely, and we wish to find the probability distribution of his location at time t, i.e. for each m, what is the probability u(m, t) that he is found at position m at time t? Clearly, for each t, \sum_{m\in\mathbf{Z}} u(m,t) = 1 since the man must be found somewhere.

To solve it exactly, one can use generating functions. Indeed, the recurrence relation in time gives:

u(m, t+1) = p\cdot u(m-1, t) + p\cdot u(m+1, t) + (1-2p) u(m, t),  (*)

for each integer m, non-negative integer t. So if we consider the power series

P_t(x) = \ldots + u(-1, t)x^{-1} + u(0, t) + u(1, t)x + u(2,t)x^2 + \ldots = \sum_{m=-\infty}^\infty u(m,t)x^m,

then the above recurrence relation in u(m, t) can be expressed succinctly as a recurrence relation in P_t(x):

P_{t+1}(x) = (p\cdot x + p\cdot x^{-1} + (1-2p)) P_t(x),

with P_0(x) = 1. This expression has certain advantages in computation:

Example 1

Let x=1: this gives P_{t+1}(1) = P_t(1). But we already know this, since P_t(1) is the sum of u(m, t) over all m, i.e. it equals 1.
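To make this concrete, here is a small Python sketch that iterates the recurrence (*) starting from u(0, 0) = 1 and checks that the total probability stays 1 at every step (the values p = 0.3 and 50 steps are arbitrary choices):

```python
# A sketch of the recurrence (*): start from u(0, 0) = 1 and apply
# u(m, t+1) = p*u(m-1, t) + p*u(m+1, t) + (1-2p)*u(m, t).
# The values p = 0.3 and 50 steps are arbitrary choices.

def step(u, p):
    """One time step of the recurrence (*)."""
    new = {}
    for m, prob in u.items():
        new[m + 1] = new.get(m + 1, 0.0) + p * prob       # step right
        new[m - 1] = new.get(m - 1, 0.0) + p * prob       # step left
        new[m] = new.get(m, 0.0) + (1 - 2 * p) * prob     # stay still
    return new

p = 0.3
u = {0: 1.0}   # the man starts at position 0 with certainty
for t in range(50):
    u = step(u, p)
    # P_t(1) = sum of all u(m, t), which must stay equal to 1
    assert abs(sum(u.values()) - 1.0) < 1e-12
```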

Example 2

Differentiate to obtain:

P_{t+1}'(x) = (p - px^{-2})P_t(x) + (px + px^{-1} + (1-2p)) P_t'(x).

Substitute x=1 to obtain P_{t+1}'(1) = P_t'(1). Since P_0'(1) = 0, we also have P_t'(1) = 0 for all t. Again, we already know this since P_t'(1) = \sum_m m\cdot u(m,t) represents the expected position of our friend, which is always 0 since he’s just as inclined to go left as he is to go right. In probability notation, we have P_t'(1) = E(X_t), where X_t is the random variable for our friend’s location at time t.

Example 3

Differentiate yet again to obtain:

P_{t+1}''(x) = 2px^{-3} P_t(x) + 2(p-px^{-2})P_t'(x) + (px + px^{-1} + (1-2p))P_t''(x).

Substitute x=1 to obtain P_{t+1}''(1) = 2p P_t(1) + P_t''(1) = P_t''(1) + 2p. Since P_0''(1) = 0, we have P_t''(1) = 2pt. But

P_t''(x) = \sum_{m=-\infty}^\infty u(m,t)m(m-1)x^{m-2} \implies P_t''(1) = E(X_t(X_t-1)),

so from E(X_t) = 0 we can calculate the variance via

\text{Var}(X_t) = E(X_t^2) - E(X_t)^2 = 2pt.
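Both moment identities are easy to verify numerically. A sketch that iterates the recurrence (*) and checks E(X_t) = 0 and Var(X_t) = 2pt (the values p = 0.25 and T = 40 are arbitrary choices):

```python
# Numerically verify E(X_T) = 0 and Var(X_T) = 2pT by iterating the
# recurrence (*). The values p = 0.25 and T = 40 are arbitrary choices.
p, T = 0.25, 40
u = {0: 1.0}
for _ in range(T):
    new = {}
    for m, prob in u.items():
        new[m + 1] = new.get(m + 1, 0.0) + p * prob
        new[m - 1] = new.get(m - 1, 0.0) + p * prob
        new[m] = new.get(m, 0.0) + (1 - 2 * p) * prob
    u = new

mean = sum(m * q for m, q in u.items())            # E(X_T) = P_T'(1)
var = sum(m * m * q for m, q in u.items()) - mean ** 2
assert abs(mean) < 1e-12
assert abs(var - 2 * p * T) < 1e-9                 # Var(X_T) = 2pT
```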

Example 4

Suppose the drunken man’s house is at position m = 5. What is the value of E((X_t - 5)^2) at time t?

Answer: let’s expand

E((X_t - 5)^2) = E(X_t^2 - 10X_t + 25) = E(X_t^2) - 10 E(X_t) + 25 = 25 + 2pt.
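A quick numerical check of this answer, using the same recurrence as before (p = 0.3 and T = 30 are arbitrary choices):

```python
# Check E((X_T - 5)^2) = 25 + 2pT against the recurrence (*).
# The values p = 0.3 and T = 30 are arbitrary choices.
p, T = 0.3, 30
u = {0: 1.0}
for _ in range(T):
    new = {}
    for m, prob in u.items():
        for d, w in [(1, p), (-1, p), (0, 1 - 2 * p)]:
            new[m + d] = new.get(m + d, 0.0) + w * prob
    u = new

msd = sum((m - 5) ** 2 * q for m, q in u.items())  # mean squared distance to 5
assert abs(msd - (25 + 2 * p * T)) < 1e-9
```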


As the reader can guess, successive differentiation gives us the higher moments E(X_t^n) for various n. In fact, since

P_{t+k}(x) = (p\cdot x + p\cdot x^{-1} + (1-2p))^k P_t(x),

we can even consider relations across wider time steps.
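In particular, with P_0(x) = 1 the coefficient of x^m in (px + px^{-1} + (1-2p))^k is exactly u(m, k). A Python sketch comparing this k-step polynomial power against k single steps of the recurrence (p = 0.2 and k = 6 are arbitrary choices):

```python
# The relation P_{t+k} = A(x)^k P_t(x), with A(x) = px + px^{-1} + (1-2p),
# collapses k elementary steps into one multiplication by A(x)^k.
# The values p = 0.2 and k = 6 are arbitrary choices.

def convolve(f, g):
    """Multiply two Laurent polynomials given as {exponent: coefficient}."""
    h = {}
    for i, a in f.items():
        for j, b in g.items():
            h[i + j] = h.get(i + j, 0.0) + a * b
    return h

p, k = 0.2, 6
A = {-1: p, 0: 1 - 2 * p, 1: p}   # coefficients of p*x^{-1} + (1-2p) + p*x

# k-step multiplier: the coefficients of A(x)^k
kernel = {0: 1.0}
for _ in range(k):
    kernel = convolve(kernel, A)

# k single steps of the recurrence (*) from u(0, 0) = 1
u = {0: 1.0}
for _ in range(k):
    new = {}
    for m, prob in u.items():
        new[m + 1] = new.get(m + 1, 0.0) + p * prob
        new[m - 1] = new.get(m - 1, 0.0) + p * prob
        new[m] = new.get(m, 0.0) + (1 - 2 * p) * prob
    u = new

# The coefficient of x^m in A(x)^k is exactly u(m, k).
assert all(abs(kernel[m] - u[m]) < 1e-12 for m in kernel)
```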

Notice that we didn’t have to start with a fixed starting position for the drunken man. In fact, we could start with any sequence of non-negative reals u(m, 0), as long as they sum to 1. However, care must be taken to ensure that the higher moments are well-defined. E.g. if the probability distribution starts with

u(m, 0) = \frac{6}{m^2\pi^2} for m>0 (u(m, 0) = 0 if m ≤ 0),

then the expected value is undefined, since \sum_{m=1}^\infty m\cdot u(m,0) = \frac{6}{\pi^2} \sum_{m=1}^\infty \frac 1 m, which is the renowned harmonic series and diverges. Roughly speaking, as m → ±∞, u(m, 0) should approach 0 exponentially fast in order for the higher moments to make sense.
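One can watch this divergence numerically: the partial sums of \sum_m m\cdot u(m,0) grow like (6/\pi^2)\log N, never settling toward a limit. A small sketch:

```python
import math

# Partial sums of sum_m m * u(m, 0) = (6/pi^2) * (1 + 1/2 + 1/3 + ...)
# grow like (6/pi^2) * log N, so the expected value has no finite limit.

def partial_mean(N):
    """Sum of m * u(m, 0) over m = 1..N for u(m, 0) = 6/(m^2 pi^2)."""
    return (6 / math.pi ** 2) * sum(1.0 / m for m in range(1, N + 1))

# Each tenfold increase in N adds roughly (6/pi^2) * ln 10, about 1.4.
assert partial_mean(100_000) - partial_mean(10_000) > 1
```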

Higher-Dimensional Case

Let’s assume now that our drunken friend can walk in a two-dimensional plane. So the probability of him being at location (m, n) at time t is denoted u(m, n, t). At each time step, our friend can walk a unit distance in any of the four standard directions: north, south, east or west, each with probability p (where p < 1/4). The probability that he remains still for one time step is 1-4p.

The recurrence relation in t is easy to write:

u(m, n, t+1) = p(u(m-1, n, t) + u(m+1, n, t) + u(m, n-1, t) + u(m, n+1, t)) + (1-4p)u(m, n, t).

Proceeding as before, let’s define a two-parameter power series:

P_t(x, y) = \sum_{m,n\in\mathbf{Z}} u(m,n,t) x^m y^n.

The above recurrence relation then translates to:

P_{t+1}(x,y) = (px + px^{-1} + py + py^{-1} +(1-4p))P_t(x,y).

Let the random variable denoting his location at time t be (X_t, Y_t). In order to extract useful information, we’ll need to do partial differentiation. For convenience, we’ll denote A(x,y) = px + px^{-1} + py + py^{-1} + (1-4p).

\frac{\partial P_{t+1}}{\partial x} = (p - px^{-2})P_t + A\cdot \frac{\partial P_t}{\partial x},

\frac{\partial P_{t+1}}{\partial y} = (p - py^{-2})P_t + A\cdot\frac{\partial P_t}{\partial y}.

The second derivatives are:

\frac{\partial^2 P_{t+1}}{\partial x^2} = 2px^{-3} P_t + 2(p - px^{-2})\frac{\partial P_t}{\partial x} + A\frac{\partial^2 P_t}{\partial x^2},

\frac{\partial^2 P_{t+1}}{\partial y^2} = 2py^{-3} P_t + 2(p - py^{-2})\frac{\partial P_t}{\partial y} + A\frac{\partial^2 P_t}{\partial y^2},

\frac{\partial^2 P_{t+1}}{\partial x\partial y} = (p-py^{-2})\frac{\partial P_t}{\partial x} + (p-px^{-2})\frac{\partial P_t}{\partial y} + A\frac{\partial^2 P_t}{\partial x\partial y}.

Substituting x=1, y=1 gives the following useful values:

\left.\frac{\partial P_t}{\partial x}\right|_{1,1} = E(X_t), \ \left.\frac{\partial P_t}{\partial y}\right|_{1,1} = E(Y_t),

\left.\frac{\partial^2 P_t}{\partial x^2}\right|_{1,1} = E(X_t^2 - X_t),\ \left.\frac{\partial^2 P_t}{\partial y^2}\right|_{1,1} = E(Y_t^2 - Y_t),\ \left.\frac{\partial^2 P_t}{\partial x\partial y}\right|_{1,1} = E(X_t Y_t).
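Substituting x = y = 1 into the recurrences above (where A(1,1) = 1) gives E(X_t) = E(Y_t) = 0, E(X_t^2 - X_t) = E(Y_t^2 - Y_t) = 2pt, and E(X_t Y_t) = 0, exactly as in the one-dimensional case. A Python sketch verifying these values numerically (p = 0.15, with p < 1/4, and T = 25 are arbitrary choices):

```python
# Verify the two-dimensional moments read off at (x, y) = (1, 1):
# E(X_T) = 0, E(X_T^2 - X_T) = 2pT, E(X_T * Y_T) = 0.
# The values p = 0.15 (note p < 1/4) and T = 25 are arbitrary choices.
p, T = 0.15, 25
u = {(0, 0): 1.0}
moves = [(1, 0, p), (-1, 0, p), (0, 1, p), (0, -1, p), (0, 0, 1 - 4 * p)]
for _ in range(T):
    new = {}
    for (m, n), prob in u.items():
        for dm, dn, w in moves:
            key = (m + dm, n + dn)
            new[key] = new.get(key, 0.0) + w * prob
    u = new

EX = sum(m * q for (m, n), q in u.items())
EXX = sum(m * (m - 1) * q for (m, n), q in u.items())   # E(X_T(X_T - 1))
EXY = sum(m * n * q for (m, n), q in u.items())
assert abs(EX) < 1e-9
assert abs(EXX - 2 * p * T) < 1e-9                      # = 2pT
assert abs(EXY) < 1e-9
```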

In the next article, we’ll consider the limiting case, where the space and time intervals approach zero, and look at the corresponding differential equations.
