Learning Chinese Word Segmentation Based on Bidirectional GRU-CRF and CNN Network Model

Chenghai Yu (Zhejiang Sci-Tech University, Zhejiang, China), Shupei Wang (Zhejiang Sci-Tech University, Zhejiang, China) and Jiajun Guo (Zhejiang Sci-Tech University, Zhejiang, China)
Copyright: © 2019 |Pages: 16
DOI: 10.4018/IJTHI.2019070104

Abstract

Chinese word segmentation is the foundation of Chinese natural language processing (NLP). With the development of deep learning, various neural network models have been applied to Chinese word segmentation. However, current neural network models still suffer from hand-crafted feature extraction, non-standard word weighting, an inability to use long-distance information effectively, and long training times. To address these problems, this article presents a CNN-Bidirectional GRU-CRF neural network model (CBiGCN), which breaks through the window limit of conventional methods and achieves truly end-to-end processing, using a five-tag set, a bias-variable-weight greedy strategy, and the Goldstein-Armijo conditions. The model has a simple structure and is easy to implement: it learns features automatically, removes much of the task-specific effort spent on hand-crafted features and data pre-processing, and makes effective use of context information. The authors evaluate their system on two corpora for Chinese word segmentation. The experiments verify that the new model obtains better segmentation results while greatly reducing training time.

1. Introduction

In the field of Chinese NLP, word segmentation is indispensable for intelligent question-answering systems, speech recognition, and machine translation, because it is a prerequisite for further analysis and processing of Chinese sentences. Unlike English text, Chinese text has no naturally recognizable separators between words, and Chinese also exhibits polysemy. We therefore rely on word segmentation to obtain the key information in Chinese text, which makes it an important research direction in Chinese NLP. Since the problem was first posed, many experts and scholars have devoted themselves to this important and basic line of research. With the wide application of neural networks, Chinese word segmentation methods have transitioned from earlier dictionary-based and statistical methods to neural network methods; the application of machine learning and deep learning in particular has made segmentation more effective. In 2015, Prof. Ze-Wen Liu from Tsinghua University proposed a model based on a Linear-Chain Conditional Random Field (LCCRF) for Chinese word segmentation (Liu, Ding, & Li, 2015), which effectively optimizes feature selection and tagging and reduces the time and space complexity of model training. Several months later, Xinchi Chen, Xipeng Qiu, Chenxi Zhu, Pengfei Liu, and Xuanjing Huang (2015) from Fudan University proposed a Long Short-Term Memory (LSTM) model for Chinese word segmentation at the Conference on Empirical Methods in Natural Language Processing, which improved on the results of traditional methods. In 2016, Xuezhe Ma and Eduard Hovy from Carnegie Mellon University proposed a neural sequence-tagging method for POS tagging at the annual meeting of the Association for Computational Linguistics (ACL), which brought a new idea to Chinese word segmentation.
In the same year, Yao and Huang (2016) adopted a Bidirectional LSTM neural network model, which requires no prior knowledge or pre-processing, to improve the accuracy of Chinese word segmentation. However, CRF models rely heavily on extracted features and task-specific resources, while the LSTM has a complex structure and suffers from long training and prediction times. In 2017, researchers from Xiamen University proposed a Gated Recurrent Unit (GRU) neural network for Chinese word segmentation (Li, Duan, & Xu, 2017).

Both the GRU and LSTM models are extensions of the recurrent neural network, and both have met their goals on a number of tasks, but the GRU has a simpler structure than the LSTM. However, both models default to unidirectional processing, so when applied to Chinese word segmentation they implicitly treat later words as more important than earlier ones. This is not appropriate for segmentation, because the relative weight of past and future information is uncertain and varies from sentence to sentence. Therefore, in this article we propose a new model, CBiGCN, which combines a Bidirectional GRU (which weighs context from both directions of the text), a CRF (which obtains a globally optimal label sequence from globally normalized probabilities), and a CNN. The model inherits the advantages of its components and performs Chinese word segmentation with end-to-end sequence tagging. Experiments show that the proposed model significantly improves both segmentation accuracy and training speed.
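To make the end-to-end sequence-tagging formulation concrete, the sketch below shows how segmented words map to per-character tags and back. The paper uses a five-tag set; this illustration uses the common four-tag BMES scheme (B=begin, M=middle, E=end, S=single-character word) purely to show the tag-and-decode idea, and is not the authors' code: a neural model such as CBiGCN would predict the tag sequence, which is then decoded into words.

```python
def words_to_tags(words):
    """Convert a list of words into per-character BMES tags."""
    tags = []
    for w in words:
        if len(w) == 1:
            tags.append("S")
        else:
            tags.append("B")
            tags.extend("M" * (len(w) - 2))  # zero or more middle tags
            tags.append("E")
    return tags

def tags_to_words(chars, tags):
    """Decode a predicted tag sequence back into words."""
    words, buf = [], ""
    for ch, t in zip(chars, tags):
        buf += ch
        if t in ("E", "S"):  # word boundary
            words.append(buf)
            buf = ""
    if buf:                  # tolerate a truncated final word
        words.append(buf)
    return words

sentence = ["我", "喜欢", "自然语言处理"]  # "I", "like", "natural language processing"
tags = words_to_tags(sentence)
print(tags)  # ['S', 'B', 'E', 'B', 'M', 'M', 'M', 'M', 'E']
assert tags_to_words("".join(sentence), tags) == sentence
```

Under this formulation, segmentation reduces to assigning one tag per character, which is exactly the kind of sequence-labeling task a BiGRU-CRF is suited to: the BiGRU scores each character's tags using context from both directions, and the CRF selects the globally best tag sequence.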
