doi: 10.4304/jsw.8.12.3222-3228
Active Learning for Prediction of Prosodic Word Boundaries in Chinese TTS Using Maximum Entropy Markov Model
College of Computer and Information Engineering, Tianjin Normal University, Tianjin, China
Abstract—For a Chinese speech synthesis system, hierarchical prosody structure generation is a key component. The prosodic word, which is the basic prosodic unit, plays an important role in the naturalness and intelligibility of a Chinese text-to-speech system. However, obtaining human annotations of prosodic words to train a supervised system can be laborious and costly. To overcome this, we explore active learning techniques with the goal of reducing the amount of human-annotated data needed to attain a given level of performance. In this paper, an Active Maximum Entropy Markov Model (AMEMM) is used to predict prosodic word boundaries in unrestricted Chinese text. Experiments show that, in most of the cases considered, active selection strategies for labeling prosodic word boundaries match or exceed the performance of random data selection.
Index Terms—Prosodic Word, Text-to-Speech System (TTS), Active Learning, Maximum Entropy Markov Model.
Cite: Ziping Zhao, Xirong Ma, "Active Learning for Prediction of Prosodic Word Boundaries in Chinese TTS Using Maximum Entropy Markov Model," Journal of Software vol. 8, no. 12, pp. 3222-3228, 2013.
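As a concrete illustration of the approach described in the abstract, the sketch below shows one round of pool-based active learning for prosodic word boundary tagging with an MEMM-style local classifier. The feature templates, the toy data, and the least-confidence selection strategy are illustrative assumptions rather than the paper's actual configuration; decoding is greedy here, whereas a full MEMM would typically use Viterbi decoding.

# Hedged sketch: pool-based active learning for prosodic word boundary tagging
# with an MEMM-style local maximum entropy classifier (hypothetical features
# and data; the paper's exact feature set and selection strategy may differ).
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tags: "B" = prosodic word boundary after this token, "N" = no boundary.
def token_features(words, i, prev_tag):
    return {
        "w": words[i],
        "w-1": words[i - 1] if i > 0 else "<s>",
        "w+1": words[i + 1] if i + 1 < len(words) else "</s>",
        "t-1": prev_tag,  # MEMM: local classifier conditions on previous tag
    }

def train_memm(labeled):
    """labeled: list of (words, tags) pairs; trains the local maxent model."""
    X, y = [], []
    for words, tags in labeled:
        for i in range(len(words)):
            prev = tags[i - 1] if i > 0 else "<s>"
            X.append(token_features(words, i, prev))
            y.append(tags[i])
    model = make_pipeline(DictVectorizer(), LogisticRegression(max_iter=1000))
    model.fit(X, y)
    return model

def greedy_decode(model, words):
    """Greedy left-to-right decoding (a full MEMM would use Viterbi)."""
    tags, confidences = [], []
    prev = "<s>"
    for i in range(len(words)):
        probs = model.predict_proba([token_features(words, i, prev)])[0]
        best = probs.argmax()
        prev = model.classes_[best]
        tags.append(prev)
        confidences.append(probs[best])
    return tags, confidences

def least_confidence(model, words):
    """Sentence-level uncertainty: 1 minus the least confident token decision."""
    _, conf = greedy_decode(model, words)
    return 1.0 - min(conf)

def active_learning_round(model, pool, k=2):
    """Select the k most uncertain sentences from the unlabeled pool."""
    scored = sorted(pool, key=lambda s: least_confidence(model, s), reverse=True)
    return scored[:k]

# Tiny toy example (hypothetical data, for illustration only).
labeled = [
    (["我", "们", "今", "天"], ["N", "B", "N", "B"]),
    (["天", "气", "很", "好"], ["N", "B", "N", "B"]),
]
pool = [["他", "们", "明", "天"], ["心", "情", "不", "错"], ["今", "天", "下", "雨"]]

model = train_memm(labeled)
to_annotate = active_learning_round(model, pool, k=2)
print("Sentences selected for manual annotation:", to_annotate)

The selected sentences would then be annotated by a human, added to the labeled set, and the model retrained, repeating until the desired performance is reached or the annotation budget is exhausted.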