Markov Chain (Noun)
Meaning
A Markov process in which the parameter takes discrete time values.
Classification
Nouns denoting natural processes.
Synonyms
Markoff Chain
Hypernyms
Markoff Process
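The definition above can be illustrated with a short sketch: a discrete-time Markov chain advances in discrete steps, and the next state depends only on the current state via a transition-probability table. The weather states and probabilities below are illustrative assumptions, not part of the entry.

```python
import random

# Illustrative transition table (assumed example, not canonical):
# each row gives the probabilities of moving to the next state.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Pick the next state using the current state's probability row."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Run the chain for n discrete time steps and return the state path."""
    random.seed(seed)
    state = start
    path = [state]
    for _ in range(n):
        state = step(state)
        path.append(state)
    return path

print(simulate("sunny", 5))
```

Because the parameter (time) is discrete, each call to `step` is one tick of the chain; the full path has `n + 1` entries including the start state.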