Name | Description | Type | Package | Framework |
--- | --- | --- | --- | --- |
BaumWelch | Implements the Baum-Welch algorithm for estimating the parameters of a discrete HMM from observations. | Class | com.numericalmethod.suanshu.stats.hmm.discrete | SuanShu |
BetaMixtureDistribution | The HMM states use the Beta distribution to model the observations. | Class | com.numericalmethod.suanshu.stats.hmm.mixture.distribution | SuanShu |
BetaMixtureDistribution.Lambda | | Class | com.numericalmethod.suanshu.stats.hmm.mixture.distribution | SuanShu |
BinomialMixtureDistribution | The HMM states use the Binomial distribution to model the observations. | Class | com.numericalmethod.suanshu.stats.hmm.mixture.distribution | SuanShu |
BinomialMixtureDistribution.Lambda | | Class | com.numericalmethod.suanshu.stats.hmm.mixture.distribution | SuanShu |
DiscreteHMM | This is the discrete hidden Markov model as defined by Rabiner. | Class | com.numericalmethod.suanshu.stats.hmm.discrete | SuanShu |
ExponentialMixtureDistribution | The HMM states use the Exponential distribution to model the observations. | Class | com.numericalmethod.suanshu.stats.hmm.mixture.distribution | SuanShu |
ForwardBackwardProcedure | The forward-backward procedure is an inference algorithm for hidden Markov models which computes the posterior marginals of all hidden state variables given a sequence of observations. | Class | com.numericalmethod.suanshu.stats.hmm | SuanShu |
GammaMixtureDistribution | The HMM states use the Gamma distribution to model the observations. | Class | com.numericalmethod.suanshu.stats.hmm.mixture.distribution | SuanShu |
GammaMixtureDistribution.Lambda | | Class | com.numericalmethod.suanshu.stats.hmm.mixture.distribution | SuanShu |
HiddenMarkovModel | | Class | com.numericalmethod.suanshu.stats.hmm | SuanShu |
HmmInnovation | An HMM innovation consists of a state and an observation in the state. | Class | com.numericalmethod.suanshu.stats.hmm | SuanShu |
HMMRNG | In a (discrete) hidden Markov model, the state is not directly visible, but output, dependent on the state, is visible. | Class | com.numericalmethod.suanshu.stats.hmm | SuanShu |
LogNormalMixtureDistribution | The HMM states use the Log-Normal distribution to model the observations. | Class | com.numericalmethod.suanshu.stats.hmm.mixture.distribution | SuanShu |
LogNormalMixtureDistribution.Lambda | | Class | com.numericalmethod.suanshu.stats.hmm.mixture.distribution | SuanShu |
MixtureDistribution | This is the conditional distribution of the observations in each state (possibly differently parameterized) of a mixture hidden Markov model. | Interface | com.numericalmethod.suanshu.stats.hmm.mixture.distribution | SuanShu |
MixtureHMM | This is the mixture hidden Markov model (HMM). | Class | com.numericalmethod.suanshu.stats.hmm.mixture | SuanShu |
MixtureHMMEM | The EM algorithm is used to find the unknown parameters of a hidden Markov model (HMM) by making use of the forward-backward algorithm. | Class | com.numericalmethod.suanshu.stats.hmm.mixture | SuanShu |
MixtureHMMEM.TrainedModel | | Class | com.numericalmethod.suanshu.stats.hmm.mixture | SuanShu |
NormalMixtureDistribution | The HMM states use the Normal distribution to model the observations. | Class | com.numericalmethod.suanshu.stats.hmm.mixture.distribution | SuanShu |
NormalMixtureDistribution.Lambda | | Class | com.numericalmethod.suanshu.stats.hmm.mixture.distribution | SuanShu |
PoissonMixtureDistribution | The HMM states use the Poisson distribution to model the observations. | Class | com.numericalmethod.suanshu.stats.hmm.mixture.distribution | SuanShu |
Viterbi | The Viterbi algorithm is a dynamic programming algorithm for finding the most likely sequence of hidden states (the Viterbi path) that results in a sequence of observed events. | Class | com.numericalmethod.suanshu.stats.hmm | SuanShu |
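To make the table concrete, below is a minimal, self-contained sketch of the Viterbi algorithm that the `Viterbi` class implements. Note this is an illustration of the algorithm itself, not the SuanShu API: the class `ViterbiSketch`, the method `viterbi`, and the toy Healthy/Fever model in `main` are all made up for this example, and the parameter layout (`pi`, `A`, `B`) is an assumption, not SuanShu's actual constructor signature.

```java
// Illustrative Viterbi implementation in log space; not the SuanShu API.
public class ViterbiSketch {

    /**
     * Returns the most likely hidden-state sequence (the Viterbi path).
     *
     * @param obs observation indices, obs[t] in [0, B[0].length)
     * @param pi  initial state probabilities, pi[i]
     * @param A   transition probabilities, A[i][j] = P(next state j | state i)
     * @param B   emission probabilities, B[i][k] = P(observation k | state i)
     */
    public static int[] viterbi(int[] obs, double[] pi, double[][] A, double[][] B) {
        int T = obs.length, N = pi.length;
        double[][] delta = new double[T][N]; // best log-probability of a path ending in state i at time t
        int[][] psi = new int[T][N];         // back-pointers to reconstruct the path

        // Initialization: start in state i and emit the first observation.
        for (int i = 0; i < N; i++)
            delta[0][i] = Math.log(pi[i]) + Math.log(B[i][obs[0]]);

        // Recursion: extend the best partial path into each state j.
        for (int t = 1; t < T; t++) {
            for (int j = 0; j < N; j++) {
                int best = 0;
                double bestVal = Double.NEGATIVE_INFINITY;
                for (int i = 0; i < N; i++) {
                    double v = delta[t - 1][i] + Math.log(A[i][j]);
                    if (v > bestVal) { bestVal = v; best = i; }
                }
                delta[t][j] = bestVal + Math.log(B[j][obs[t]]);
                psi[t][j] = best;
            }
        }

        // Termination and backtracking along the stored back-pointers.
        int[] path = new int[T];
        int last = 0;
        for (int i = 1; i < N; i++)
            if (delta[T - 1][i] > delta[T - 1][last]) last = i;
        path[T - 1] = last;
        for (int t = T - 1; t > 0; t--) path[t - 1] = psi[t][path[t]];
        return path;
    }

    public static void main(String[] args) {
        // Toy model (hypothetical data): states {0=Healthy, 1=Fever},
        // observations {0=normal, 1=cold, 2=dizzy}.
        double[] pi = {0.6, 0.4};
        double[][] A = {{0.7, 0.3}, {0.4, 0.6}};
        double[][] B = {{0.5, 0.4, 0.1}, {0.1, 0.3, 0.6}};
        int[] path = viterbi(new int[]{0, 1, 2}, pi, A, B);
        System.out.println(java.util.Arrays.toString(path)); // prints [0, 0, 1]
    }
}
```

Working in log space, as above, avoids the numerical underflow that plagues naive probability products over long observation sequences; the `ForwardBackwardProcedure` and `BaumWelch` classes in the table address the complementary problems of computing state posteriors and estimating the model parameters.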