Definition and Synonyms for マルコフ連鎖
1. マルコフ連鎖
   パラメーターが離散時間値であるマルコフ過程

   Markov chain
   a Markov process for which the parameter takes discrete time values

Synonyms: マルコフ連鎖
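The definition above can be sketched in code: a discrete-time Markov chain steps through states using only the current state. This is a minimal illustrative sketch, not from the dictionary entry; the two-state weather model and its probabilities are hypothetical.

```python
import random

# Hypothetical two-state transition table: each state maps to a list of
# (next_state, probability) pairs. The values are illustrative only.
transitions = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state):
    """Draw the next state using only the current state (the Markov property)."""
    states, weights = zip(*transitions[state])
    return random.choices(states, weights=weights)[0]

def simulate(start, n):
    """Walk the chain for n discrete time steps, returning the visited path."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

print(simulate("sunny", 5))
```

The parameter here is the step index 0, 1, 2, …, which is what "discrete time values" refers to in the definition.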
Meanings for each kanji in マルコフ連鎖
» 連: take along; lead; join; connect; party; gang; clique
» 鎖: chain; irons; connection
Categories マルコフ連鎖 is a member of
1. マルコフ過程
   未来の状態の分布は現在の状態のみに依存し、どのように現在の状態に達したかには依存しないという単純確率過程

   Markov process
   a simple stochastic process in which the distribution of future states depends only on the present state and not on how the present state was reached
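The "depends only on the present state" clause can be checked empirically: in a chain generated from a transition matrix, the observed next-state frequencies after two different two-step histories agree whenever the histories end in the same state. The two-state model and probabilities below are hypothetical, chosen only to make the check concrete.

```python
import random

# Hypothetical transition matrix: P[current][next] is the probability of
# moving from `current` to `next`.
P = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}}

# Generate one long sample path from the chain.
rng = random.Random(42)
chain = ["A"]
for _ in range(200_000):
    cur = chain[-1]
    chain.append(rng.choices(list(P[cur]), weights=list(P[cur].values()))[0])

def freq_next_A(prev, cur):
    """Fraction of transitions into 'A' among steps whose previous two
    states were (prev, cur)."""
    hits = [chain[i + 2] for i in range(len(chain) - 2)
            if chain[i] == prev and chain[i + 1] == cur]
    return sum(s == "A" for s in hits) / len(hits)

# Both estimates approximate P["B"]["A"] = 0.4: the state two steps back
# has no influence once the present state is known.
print(freq_next_A("A", "B"), freq_next_A("B", "B"))
```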
|
This site uses the EDICT and KANJIDIC dictionary files. These files are the property of the Electronic Dictionary Research and Development Group, and are used in conformance with the Group's licence.