Neural Networks, IEEE - INNS - ENNS International Joint Conference on

Abstract

Both hidden Markov models (HMMs) and recurrent neural networks (RNNs) have been applied to sequence recognition problems. While HMMs are easy to train, they generally do not perform satisfactorily on difficult recognition problems. RNNs, on the other hand, are excellent recognizers but are very hard to train. Hybrid HMM/NN approaches aim to exploit the strengths of both paradigms while avoiding their respective weaknesses. This paper proposes a novel approach to combining HMMs with RNNs. We discuss an algorithm for directly mapping a trained HMM into an RNN architecture and derive a gradient-descent learning algorithm for knowledge refinement.
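The paper's specific mapping is not given in the abstract, but the underlying observation can be illustrated: the HMM forward recursion alpha_t = (A^T alpha_{t-1}) * b(o_t) is itself a recurrent state update, so a trained HMM's parameters (initial distribution, transition matrix, emission matrix) can seed the weights of a recurrent layer that is then refined by gradient descent. The sketch below is illustrative only and is not the authors' construction; all names are assumptions.

```python
import numpy as np

def hmm_log_likelihood(pi, A, B, obs):
    """HMM forward algorithm written as a recurrent update.

    pi  : (S,)   initial state distribution
    A   : (S, S) transition matrix, A[i, j] = P(state j | state i)
    B   : (S, V) emission matrix,  B[i, k] = P(obs k | state i)
    obs : sequence of observation indices
    Returns log P(obs) under the HMM.
    """
    # "Hidden state" at t = 0, analogous to an RNN's initial h_0.
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    log_p = np.log(c)
    alpha = alpha / c                  # rescale to avoid underflow
    for o in obs[1:]:
        # Recurrent update: like h_t = f(W h_{t-1}, x_t) with
        # W = A^T and a multiplicative "input gate" B[:, o].
        alpha = (A.T @ alpha) * B[:, o]
        c = alpha.sum()
        log_p += np.log(c)
        alpha = alpha / c
    return log_p

# Example: a 2-state HMM scoring a short observation sequence.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],
              [0.2, 0.8]])
print(hmm_log_likelihood(pi, A, B, [0, 1, 0, 0]))
```

Because every step of this recurrence is differentiable in pi, A, and B, gradient descent can refine the HMM-initialized weights exactly as one trains an ordinary RNN, which is the general idea behind such hybrid approaches.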