Markov process

Pronunciation: mar'kof

Definition: a stochastic process such that the conditional probability distribution for the state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
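The "memoryless" property in the definition can be illustrated with a small sketch (not part of the original entry): a two-state chain in which the next state is drawn using only the current state, never the earlier history. The states and transition probabilities here are hypothetical.

```python
import random

# Hypothetical two-state weather chain: transition probabilities depend
# only on the current state, which is exactly the Markov property.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Draw the next state from the current state's distribution."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Simulate n steps; each step uses only the latest state, not the path."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path
```

Because `step` receives only the current state, conditioning on the full path would give the same distribution for the next state, which is the defining property above.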

© Copyright 2017 Wolters Kluwer. All Rights Reserved. Review Date: Sep 19, 2016.