In probability theory and statistics, a Markov chain or Markov process
is a stochastic process describing a sequence of possible events in
which the probability of each event depends only on the state attained
in the previous event. Informally, this may be thought of as, "What
happens next depends only on the state of affairs now." A countably
infinite sequence, in which the chain moves state at discrete time
steps, gives a discrete-time Markov chain (DTMC). A continuous-time
process is called a continuous-time Markov chain (CTMC). Markov
processes are named in honor of the Russian mathematician Andrey
Markov.

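As a concrete illustration of a discrete-time Markov chain, here is a
minimal sketch in Python; the two-state "weather" model, its transition
probabilities, and the step function are illustrative assumptions, not
taken from the text.

import random

# Hypothetical two-state weather DTMC; the states and probabilities
# below are illustrative assumptions.
TRANSITIONS = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state):
    # The next state depends only on the current state (the Markov
    # property): we sample from the current state's transition row.
    next_states = list(TRANSITIONS[state])
    weights = [TRANSITIONS[state][s] for s in next_states]
    return random.choices(next_states, weights=weights)[0]

state = "sunny"
for _ in range(10):
    state = step(state)
    print(state)
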
Markov chains have many applications as statistical models of
real-world processes. They provide the basis for general stochastic
simulation methods known as Markov chain Monte Carlo, which are used
for simulating sampling from complex probability distributions, and
have found application in areas including Bayesian statistics, biology,
chemistry, economics, finance, information theory, physics, signal
processing, and speech processing.

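To make the Markov chain Monte Carlo idea concrete, the following is a
minimal random-walk Metropolis sketch; the target density (an
unnormalized standard normal) and the proposal width are illustrative
assumptions, not part of the text.

import math
import random

def target(x):
    # Unnormalized standard normal density (illustrative assumption).
    return math.exp(-0.5 * x * x)

def metropolis(n_samples, step_size=1.0):
    # Each proposal depends only on the current point, so the accepted
    # samples form a Markov chain whose stationary distribution is the
    # (normalized) target.
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + random.uniform(-step_size, step_size)
        # Symmetric proposal: accept with probability
        # min(1, target(proposal) / target(x)).
        if random.random() < target(proposal) / target(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis(10_000)
print(sum(samples) / len(samples))  # sample mean should be near 0
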
The adjectives Markovian and Markov are used to describe something that
is related to a Markov process.

A Markov process is a stochastic process that satisfies the Markov
property (sometimes characterized as "memorylessness"). In simpler
terms, it is a process for which predictions can be made regarding
future outcomes based solely on its present state and, most
importantly, such predictions are just as good as the ones that could
be made knowing the process's full history. In other words, conditional
on the present state of the system, its future and past states are
independent.

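Stated symbolically, the standard discrete-time formulation of the
Markov property (a common textbook statement, not taken verbatim from
this text) is:

\[
\Pr(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0)
  = \Pr(X_{n+1} = x \mid X_n = x_n)
\]
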
A Markov chain is a type of Markov process that has either a discrete
state space or a discrete index set (often representing time), but the
precise definition of a Markov chain varies. For example, it is common
to define a Markov chain as a Markov process in either discrete or
continuous time with a countable state space (thus regardless of the
nature of time), but it is also common to define a Markov chain as a
Markov process having discrete time and either a countable or
continuous state space (thus regardless of the state space).