Markov chain
Collins Dictionary
1. N a sequence of events, the probability for each of which is dependent only on the event immediately preceding it [statistics]
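The defining property above can be illustrated with a small sketch: a two-state chain whose next state is sampled using only the current state's transition probabilities. The weather states and the probability values here are invented for illustration, not from the source.

```python
import random

# Hypothetical two-state Markov chain: each row gives the transition
# probabilities out of one state, and they are all the sampler ever consults.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    """Sample the next state using only the current state's row,
    i.e. independent of any earlier history."""
    r = rng.random()
    cumulative = 0.0
    for state, p in TRANSITIONS[current].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # guard against floating-point rounding

def simulate(start, steps, seed=0):
    """Generate a chain of `steps` transitions from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

Note that `simulate` never looks further back than `chain[-1]`; that restriction is exactly the "dependent only on the event immediately preceding it" condition in the definition.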