HMM-Based Vietnamese Speech Synthesis


Son Trinh (The University of Information Technology, Ho Chi Minh City, Vietnam) and Kiem Hoang (The University of Information Technology, Ho Chi Minh City, Vietnam)
Copyright © 2015 | Pages: 15
DOI: 10.4018/IJSI.2015100103


This paper describes improving the naturalness of HMM-based speech synthesis for the Vietnamese language. In this synthesis method, trajectories of speech parameters are generated from trained hidden Markov models, and a final speech waveform is synthesized from those parameters. The main objective of the development is to achieve maximum naturalness in the output speech through three key points. First, the system uses a high-quality recorded Vietnamese speech database appropriate for training, especially for the statistical parametric approach. Second, prosodic information such as tone, part of speech (POS), and features based on the characteristics of the Vietnamese language is added to ensure the quality of the synthetic speech. Third, the system uses STRAIGHT, which has demonstrated high-quality voice manipulation and has been successfully incorporated into HMM-based speech synthesis. The results show that the speech produced by the system compares favorably with other Vietnamese TTS systems trained on the same speech data.
Article Preview

2. HMM-Based Speech Synthesis

HMM-based speech synthesis is one of the most widely researched synthesis methods. In this approach, HMMs (hidden Markov models) are trained on a natural speech database. The system consists of a training part and a synthesis part (Tokuda et al., 2004).

Figure 1. HMM-based speech synthesis system (Tokuda et al., 2004)
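The synthesis part described above can be sketched in miniature. The snippet below is an illustrative simplification, not the authors' implementation: each context-dependent HMM state is assumed to hold a mean vector of speech parameters (e.g., mel-cepstral coefficients), and a frame-level trajectory is formed by repeating each state's mean for its predicted duration. A real system such as this one would additionally smooth the trajectory using delta features (MLPG) and drive a vocoder like STRAIGHT with the result; the state means and durations here are hypothetical values.

```python
def generate_trajectory(state_means, state_durations):
    """Concatenate each state's mean parameter vector for its predicted
    duration (in frames), yielding a frame-level parameter trajectory.
    A full system would smooth this with delta features (MLPG)."""
    trajectory = []
    for mean, duration in zip(state_means, state_durations):
        # Repeat the state mean once per frame of the predicted duration.
        trajectory.extend([list(mean)] * duration)
    return trajectory

# Hypothetical 2-state model with 2-dimensional parameter vectors:
means = [[0.1, 0.2], [0.4, 0.5]]
durations = [3, 2]  # predicted frames per state
traj = generate_trajectory(means, durations)
# traj now holds 5 frames: 3 copies of the first mean, 2 of the second
```

This piecewise-constant trajectory is exactly what MLPG improves upon: without delta constraints the parameters jump at state boundaries, which is one source of the "buzzy" quality the paper's STRAIGHT-based vocoding and trajectory smoothing address.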

