Summary
English Synonyms:
  1. Markov chain

Detailed Synonyms for Markov chain in English

Markov chain [the ~] noun

  1. the Markov chain; the Markoff chain
    – a Markov process for which the parameter is discrete time values

Related Definitions for "Markov chain":

  1. a Markov process for which the parameter is discrete time values (see the sketch below)
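
As a minimal sketch of this definition: the process is indexed by the discrete time values 1, 2, 3, ..., and the state at each step depends only on the current state. The Python example below simulates such a chain; the two-state weather model, its state names, and its transition probabilities are illustrative assumptions, not part of the definition.

```python
import random

# Hypothetical two-state model; each row of transition probabilities sums to 1.
transitions = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Draw the next state; it depends only on the current state (the Markov property)."""
    choices, weights = zip(*transitions[state])
    return random.choices(choices, weights=weights)[0]

def simulate(start, n_steps):
    """Walk the chain over the discrete time values 1 .. n_steps."""
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 10))
```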

Related Synonyms for Markov chain