Abstract
We introduce a novel approach to handwritten word recognition using multilevel hidden Markov models (MLHMMs). An MLHMM is a doubly embedded network of HMMs: each character is modeled by a lower-level HMM, while a word is modeled by a higher-level HMM built from the character models. In the character model, observations are associated with transitions rather than with states. By applying a technique we call 'tied transitions', in which segments with the same semantic meaning share parameters, we build a character model with only 4 states, 5 observations (or symbols), and 7 transitions. Because the states themselves carry no semantic meaning, the standard re-estimation algorithm remains applicable. At the character level, the model with the highest likelihood is chosen as the recognition result, so the character model is a purely model-discriminant HMM (MD-HMM) approach. For the word model, on the other hand, both the MD-HMM and the path-discriminant HMM (PD-HMM) approaches are used, and their respective performances are demonstrated.
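To make the character-model structure concrete, the following is a minimal sketch of a transition-emitting ("Mealy-type") HMM with 4 states, 5 symbols, and 7 transitions, where transitions sharing a tie class share one emission distribution. The topology, tie-class assignments, and random parameters are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

# Illustrative character HMM: 4 states, 5 symbols, 7 transitions.
# Observations are attached to transitions, and transitions in the
# same tie class share an emission distribution ("tied transitions").
# The tie assignments below are invented for illustration only.
N_STATES, N_SYMBOLS = 4, 5

# (src, dst, tie_class); a left-to-right topology with self-loops.
transitions = [
    (0, 0, 0), (0, 1, 1),
    (1, 1, 0), (1, 2, 1),
    (2, 2, 0), (2, 3, 1),
    (3, 3, 2),
]
N_TIES = 3

rng = np.random.default_rng(0)
# One emission distribution per tie class (each row sums to 1).
B = rng.dirichlet(np.ones(N_SYMBOLS), size=N_TIES)

# Per-transition probabilities, normalised over each source state's arcs.
A = {}
for i in range(N_STATES):
    outs = [t for t in transitions if t[0] == i]
    probs = rng.dirichlet(np.ones(len(outs)))
    for (src, dst, _), p in zip(outs, probs):
        A[(src, dst)] = p

def log_likelihood(obs):
    """Forward algorithm for a transition-emitting HMM:
    start in state 0, require termination in the final state."""
    alpha = np.zeros(N_STATES)
    alpha[0] = 1.0
    ll = 0.0
    for o in obs:
        new = np.zeros(N_STATES)
        for (i, j, tie) in transitions:
            # Emission probability depends on the transition's tie class.
            new[j] += alpha[i] * A[(i, j)] * B[tie, o]
        norm = new.sum()
        ll += np.log(norm)          # accumulate in log scale to avoid underflow
        alpha = new / norm
    return ll + np.log(alpha[-1])   # end-state constraint

obs = [0, 2, 1, 4, 3, 3]            # a toy observation sequence
ll = log_likelihood(obs)
```

In the MD-HMM setting described above, one such model would be trained per character class, and recognition would simply pick the class whose model assigns the input the highest log-likelihood; the PD-HMM word-level approach instead compares paths through a single composite network.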