
Markov process

Pronunciation: mar'kof

Definition: a stochastic process such that the conditional probability distribution for the state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
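In symbols, for a discrete-time process with states X_0, X_1, X_2, ... (a notation assumed here for illustration; the entry itself gives only the verbal definition), the defining Markov property can be sketched in LaTeX as:

% Markov property: the conditional distribution of the next state,
% given the entire history, depends only on the present state.
\[
P\bigl(X_{n+1} = x \mid X_n = x_n,\, X_{n-1} = x_{n-1},\, \ldots,\, X_0 = x_0\bigr)
  = P\bigl(X_{n+1} = x \mid X_n = x_n\bigr)
\]

The continuous-time analogue conditions the state at a future time t on the full past trajectory and likewise reduces to conditioning on the present state alone.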
