Safekipedia

Stochastic process

Adapted from Wikipedia · Adventurer experience

A 3D visualization of a random path called Brownian motion, showing how particles can move in space over time.

In probability theory and related fields, a stochastic or random process is a special kind of mathematical object. It is made up of a group of random variables that change over time. These processes help us understand systems that seem to change in unpredictable ways, like the growth of a bacterial population, the ups and downs of an electrical current, or how a gas molecule moves.

A computer-simulated realization of a Wiener or Brownian motion process on the surface of a sphere. The Wiener process is widely considered the most studied and central stochastic process in probability theory.

Stochastic processes are used in many areas, including biology, chemistry, physics, computer science, and even finance. Two important examples are the Wiener process, which models how tiny particles move in a liquid, and the Poisson process, which helps predict how many events happen in a certain time, like phone calls in a day.

These processes can also be thought of as random functions because they deal with changes over time. Depending on their properties, stochastic processes can be grouped into different types, such as random walks and Markov processes. Studying them uses many parts of mathematics, making it an exciting and important area of research.

Introduction

A stochastic or random process is a group of random variables linked to different points, often thought of as moments in time. Each random variable takes its values in the same set, called the state space. This space can be simple, like the whole numbers, or more complex, like points on a line or in space.

A single computer-simulated sample function or realization, among other terms, of a three-dimensional Wiener or Brownian motion process for time 0 ≤ t ≤ 2. The index set of this stochastic process is the non-negative real numbers, while its state space is three-dimensional Euclidean space.

Stochastic processes can be grouped in different ways. One common way is by looking at the set used to link the random variables. If this set is countable, like the whole numbers, the process is said to be in discrete time. If the set is a stretch of real numbers, it is in continuous time. Processes in discrete time are often easier to study. We can also group processes by their state space: whether it uses whole numbers, real numbers, or points in space.

Examples

Bernoulli process

Main article: Bernoulli process

A Bernoulli process is a simple type of random process. It is a list of random events. Each event is separate and has the same chance of happening. Each event can have one of two results, like flipping a coin and getting heads or tails. The chance of getting heads might be half, or any other chance you choose.
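As a sketch, a Bernoulli process can be simulated in a few lines of Python. The function name and parameters below are illustrative choices, not part of any standard library.

```python
import random

def bernoulli_process(n_trials, p_heads=0.5, seed=0):
    """Simulate a Bernoulli process: independent trials, same chance each time."""
    rng = random.Random(seed)
    # 1 means "heads", 0 means "tails"
    return [1 if rng.random() < p_heads else 0 for _ in range(n_trials)]

flips = bernoulli_process(1000, p_heads=0.5)
fraction_heads = sum(flips) / len(flips)
```

Over many trials, the fraction of heads should settle near the chosen probability.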

Random walk

Main article: Random walk

A random walk is another type of random process. Imagine taking steps forward or backward on a number line. At each step, you might move forward with one chance and backward with another. Over many steps, this makes a path that looks random but follows some rules. This idea is used in many areas, like studying how molecules move or in money-related models.
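The walk described above can be sketched in Python, with each step chosen at random. The names below are illustrative.

```python
import random

def random_walk(n_steps, p_forward=0.5, seed=1):
    """Walk on the number line: +1 with chance p_forward, otherwise -1."""
    rng = random.Random(seed)
    position = 0
    path = [position]
    for _ in range(n_steps):
        position += 1 if rng.random() < p_forward else -1
        path.append(position)
    return path

path = random_walk(100)
```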

Wiener process

Main article: Wiener process

The Wiener process is an important random process named after Norbert Wiener. It is used to model things that change continuously over time, like the movement of tiny particles in liquid. Unlike a random walk, which moves in separate jumps, the Wiener process changes continuously; its paths are unbroken but very jagged, never smooth or straight. This process is very useful in science, finance, and many other fields.
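A Wiener process can be approximated by adding up many small Gaussian steps. This discretized sketch is illustrative: each increment has mean 0 and variance equal to the small time step.

```python
import random

def wiener_path(n_steps=1000, total_time=1.0, seed=2):
    """Approximate a Wiener process by summing small Gaussian increments."""
    rng = random.Random(seed)
    dt = total_time / n_steps
    w = 0.0
    path = [w]
    for _ in range(n_steps):
        # Each increment is normal with mean 0 and standard deviation sqrt(dt)
        w += rng.gauss(0.0, dt ** 0.5)
        path.append(w)
    return path

path = wiener_path()
```

Shrinking the time step makes the simulated path a closer and closer stand-in for true Brownian motion.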

Poisson process

Main article: Poisson process

The Poisson process is used to model events that happen randomly over time, like counting how many cars pass by in an hour or how many calls come into a call center. It tells us the chance of a certain number of events happening in a given time. This process can be changed to have events happen at steady rates or to change rates over time, making it very flexible for different kinds of models.
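One standard way to simulate a Poisson process uses the fact that waiting times between events are exponentially distributed. The sketch below is illustrative.

```python
import random

def poisson_arrival_times(rate, horizon, seed=3):
    """Event times of a Poisson process with the given rate, up to horizon."""
    rng = random.Random(seed)
    t = 0.0
    times = []
    while True:
        # Waiting time to the next event is exponential with mean 1/rate
        t += rng.expovariate(rate)
        if t > horizon:
            return times
        times.append(t)

arrivals = poisson_arrival_times(rate=5.0, horizon=10.0)
```

With a rate of 5 events per unit time over 10 time units, about 50 events are expected on average.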

Definitions

A stochastic process is a way to model things that change randomly over time. Imagine rolling a die many times — each roll is random, but we can look for patterns in the results.

Stochastic processes help us understand random changes, like how bacteria grow, how electricity behaves in a wire, or how tiny particles move in liquid. These processes help scientists and mathematicians study unpredictable events.

Further examples

Markov processes and chains

Main article: Markov chain

Markov processes are special kinds of random processes where the next value depends only on the current value, not on past values. This means that what happens next depends only on where you are now. Examples include simple models of particle movement and weather models where tomorrow's forecast is made from today's conditions alone.
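A Markov chain can be simulated from a table of transition probabilities. The two-state "weather" chain and its numbers below are made up for illustration.

```python
import random

# Tomorrow's weather depends only on today's (illustrative probabilities)
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def simulate_chain(start, n_days, seed=4):
    """Follow the chain for n_days, choosing each next state at random."""
    rng = random.Random(seed)
    state, history = start, [start]
    for _ in range(n_days):
        next_states, probs = zip(*transitions[state])
        state = rng.choices(next_states, weights=probs)[0]
        history.append(state)
    return history

history = simulate_chain("sunny", 10)
```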

Martingale

Main article: Martingale (probability theory)

A martingale is a type of random process where the expected future value is equal to the current value. This idea is used in fair games.
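The fair-game idea can be checked empirically: in a game where each round wins or loses one unit with equal chance, the average wealth over many games should stay where it started. This sketch is illustrative.

```python
import random

def fair_game_wealth(n_rounds, seed):
    """Wealth after n_rounds of a fair game: win or lose 1 with equal chance."""
    rng = random.Random(seed)
    wealth = 0
    for _ in range(n_rounds):
        wealth += rng.choice([-1, 1])
    return wealth

# Averaged over many independent games, the final wealth stays near the start (0)
finals = [fair_game_wealth(50, seed=s) for s in range(2000)]
mean_final = sum(finals) / len(finals)
```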

Lévy process

Main article: Lévy process

Lévy processes are generalizations of random walks that run in continuous time. Their steps are independent of each other and behave the same way over time intervals of equal length.

Random field

Main article: Random field

A random field is a collection of random values spread out over space.

Point process

Main article: Point process

A point process is a way to describe random points spread out over space or time.

History

Early probability theory

Probability theory began with games of chance thousands of years ago, but people didn’t study how likely different outcomes were until much later. In 1654, French mathematicians Pierre de Fermat and Blaise Pascal wrote letters about probability, inspired by a gambling problem. Before that, an Italian mathematician named Gerolamo Cardano wrote about probability in the 1500s, but his work wasn’t published until 1663.

After Cardano, Jakob Bernoulli wrote a big book called Ars Conjectandi in 1713. His book helped more people learn about probability. Even with smart mathematicians like Pierre-Simon Laplace, Abraham de Moivre, Carl Gauss, Siméon Poisson, and Pafnuty Chebyshev studying it, probability theory wasn’t seen as a real part of mathematics until the 1900s.

Statistical mechanics

In the 1800s, scientists started studying how tiny particles behave in things like gases. They thought of gases as lots of particles moving around randomly. In 1859, James Clerk Maxwell showed how gas particles move in random directions and speeds. Later, scientists like Ludwig Boltzmann and Josiah Gibbs built on this idea, which would later help Albert Einstein with his work on Brownian movement.

Mathematician Joseph Doob did early work on the theory of stochastic processes, making fundamental contributions, particularly in the theory of martingales. His book Stochastic Processes is considered highly influential in the field of probability theory.

Measure theory and probability theory

In 1900, a big meeting of mathematicians happened in Paris. One mathematician, David Hilbert, asked for new ways to understand physics and probability using something called “axioms.” Soon after, mathematicians created something called measure theory to help study probability. In 1925, a French mathematician named Paul Lévy wrote the first book about probability using these new ideas.

In the 1920s and 1930s, mathematicians in Russia, like Andrei Kolmogorov, helped create the modern foundations of probability theory. Kolmogorov wrote a very important book in 1933 that used measure theory to explain probability rules clearly.

Birth of modern probability theory

The book Kolmogorov wrote in 1933 is considered the start of modern probability theory. After that, many mathematicians, including Joseph Doob, William Feller, and Paul Lévy, did important work to advance the study of probability and stochastic processes. World War II slowed this progress for a while, but after the war, the study grew again.

Stochastic processes after World War II

After World War II, mathematicians kept exploring stochastic processes. One big development was by Kiyosi Itô, who started working on stochastic calculus in the 1940s. This helped connect stochastic processes with other areas of math.

Discoveries of specific stochastic processes

Even before formal definitions, people discovered specific stochastic processes. For example, the Bernoulli process, named after Jakob Bernoulli, models something like flipping a biased coin. It was first studied in the early 1700s.

Another important process is the Wiener process, which describes Brownian motion, the random movement of tiny particles in liquid. Albert Einstein used it in 1905 to explain this phenomenon, and Norbert Wiener, after whom the process is named, later gave it a precise mathematical definition.

The Poisson process, named after Siméon Poisson, is used to model events that happen randomly over time, like phone calls arriving at a switchboard. It was first applied by A.K. Erlang in 1909 to study telephone networks.

Markov processes, named after Andrey Markov, are used when the next event only depends on the current state, not past states. Markov started studying these in the early 1900s.

Lévy processes, including the Wiener and Poisson processes, were studied by Paul Lévy starting in the 1930s. These processes help describe many random events in mathematics and science.

Mathematical construction

In mathematics, we need to build and prove that things like stochastic processes really exist. There are two main ways to do this. One way looks at a special space of functions and maps them to probabilities. The other way uses a result called Kolmogorov's existence theorem. It says that if the probabilities we assign to every finite group of time points fit together consistently, then a stochastic process with those probabilities really exists.

When dealing with processes that change over continuous time, some tricky math problems can show up. For example, two different processes might look the same at certain points but behave differently overall. To solve these problems, mathematicians sometimes assume the process is "separable," meaning its behavior is mostly decided by checking a smaller, countable set of points. Another method, created by Anatoliy Skorokhod and Andrei Kolmogorov, uses special spaces of functions to build these processes.

Application

Applications in Finance

Black-Scholes Model

A famous use of stochastic processes in finance is the Black-Scholes model for pricing options. Created by Fischer Black, Myron Scholes, and Robert Merton, this model uses geometric Brownian motion, a type of stochastic process, to describe how asset prices change.

The model assumes that the price of a stock follows a special stochastic process. This helps in pricing options and has greatly influenced financial markets. It is still widely used because it is simple and practical.
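The Black-Scholes price of a European call option can be computed directly with Python's standard library. The variable names below are illustrative.

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(s, k, t, r, sigma):
    """Black-Scholes price of a European call: spot s, strike k,
    time to expiry t, risk-free rate r, volatility sigma."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

price = black_scholes_call(s=100.0, k=100.0, t=1.0, r=0.05, sigma=0.2)
```

For these inputs (a one-year at-the-money option with 20% volatility and a 5% rate), the formula gives a price of about 10.45.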

Stochastic Volatility Models

Another important use of stochastic processes in finance is in stochastic volatility models, which try to show how market volatility changes over time. The Heston model is a well-known example, allowing the volatility of asset prices to follow its own stochastic process.

These models are more flexible than the Black-Scholes model and can better match real-world observations.
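As a sketch, the Heston model can be simulated step by step with a simple Euler scheme. The parameter values and the "full truncation" trick below are illustrative choices, not part of the model's definition.

```python
import math
import random

def heston_path(s0=100.0, v0=0.04, kappa=2.0, theta=0.04, xi=0.3,
                rho=-0.7, r=0.05, horizon=1.0, n_steps=1000, seed=9):
    """Euler sketch of the Heston model: the variance v follows its own
    mean-reverting stochastic process, correlated with the price s."""
    rng = random.Random(seed)
    dt = horizon / n_steps
    s, v = s0, v0
    for _ in range(n_steps):
        z1 = rng.gauss(0.0, 1.0)
        # Correlate the two random shocks with correlation rho
        z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
        v_pos = max(v, 0.0)  # "full truncation": keep the variance usable
        s *= math.exp((r - 0.5 * v_pos) * dt + math.sqrt(v_pos * dt) * z1)
        v += kappa * (theta - v_pos) * dt + xi * math.sqrt(v_pos * dt) * z2
    return s

final_price = heston_path()
```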

Applications in Biology

Population Dynamics

Stochastic processes are also used in biology, especially in population dynamics. Unlike models that assume predictable changes, stochastic models include randomness in births, deaths, and movement. The birth-death process is a simple model that shows how populations change due to random events. These models are important for small populations, like endangered species or tiny microbial groups.
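A birth-death process can be simulated event by event with the Gillespie method, where the waiting time to the next event is exponential. The rates and names below are illustrative.

```python
import random

def birth_death(pop, birth_rate, death_rate, horizon, seed=6):
    """Gillespie-style simulation: each individual gives birth at rate
    birth_rate and dies at rate death_rate, until horizon or extinction."""
    rng = random.Random(seed)
    t = 0.0
    while pop > 0:
        total_rate = pop * (birth_rate + death_rate)
        t += rng.expovariate(total_rate)
        if t > horizon:
            break
        if rng.random() < birth_rate / (birth_rate + death_rate):
            pop += 1  # a birth happened
        else:
            pop -= 1  # a death happened
    return pop

final_pop = birth_death(pop=10, birth_rate=1.0, death_rate=1.0, horizon=5.0)
```

With equal birth and death rates, small populations like this one can easily drift to extinction by chance, which is exactly why stochastic models matter for endangered species.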

Another example is the branching process, which models how a population grows when each individual reproduces on its own. This is useful in studying disease spread.
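A branching process can be sketched generation by generation. The offspring probabilities below are made-up illustrative numbers, chosen so each individual has 1.2 offspring on average.

```python
import random

def branching_sizes(n_generations, seed=7):
    """Generation sizes of a branching process started from one ancestor."""
    rng = random.Random(seed)
    offspring_counts = [0, 1, 2]
    weights = [0.25, 0.30, 0.45]  # average offspring = 1.2 (illustrative)
    sizes = [1]
    for _ in range(n_generations):
        # Each individual in the current generation reproduces independently
        children = sum(rng.choices(offspring_counts, weights)[0]
                       for _ in range(sizes[-1]))
        sizes.append(children)
        if children == 0:  # the population has died out
            break
    return sizes

sizes = branching_sizes(20)
```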

Applications in Computer Science

Randomized Algorithms

Stochastic processes are important in computer science, especially in randomized algorithms. These algorithms use random inputs to make problem-solving easier or better. For example, Markov chains are used in probabilistic algorithms for tasks like optimization and sampling, such as in search engines. These methods help manage uncertainty in large datasets and are used in areas like cryptography and artificial intelligence.

Queuing Theory

Stochastic processes are also used in queuing theory, which models the random arrival and service of tasks in a system. This is important for network traffic and server management. Queuing models help predict delays, manage resources, and improve performance in web servers and communication networks. They are essential for designing efficient data centers and cloud computing systems.
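A single-server queue with random arrivals and random service times can be simulated with the Lindley recursion. The names and rates below are illustrative.

```python
import random

def mm1_waits(arrival_rate, service_rate, n_customers, seed=8):
    """Waiting times in an M/M/1 queue via the Lindley recursion:
    next wait = max(0, wait + service - time until next arrival)."""
    rng = random.Random(seed)
    wait, waits = 0.0, []
    for _ in range(n_customers):
        waits.append(wait)
        service = rng.expovariate(service_rate)
        interarrival = rng.expovariate(arrival_rate)
        wait = max(0.0, wait + service - interarrival)
    return waits

waits = mm1_waits(arrival_rate=0.8, service_rate=1.0, n_customers=10000)
avg_wait = sum(waits) / len(waits)
```

With arrival rate 0.8 and service rate 1.0, queuing theory predicts a long-run average wait of 0.8 / 0.2 = 4 time units, and the simulated average should land in that neighborhood.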


This article is a child-friendly adaptation of the Wikipedia article on Stochastic process, available under CC BY-SA 4.0.

Images from Wikimedia Commons. Tap any image to view credits and license.