Journal on Communications ›› 2018, Vol. 39 ›› Issue (2): 53-64.doi: 10.11959/j.issn.1000-436x.2018024


Neural network model for dependency parsing incorporating global vector feature

Hengjun WANG1,Nianwen SI1(),Yulong SONG2,Yidong SHAN1   

  1. The Third Institute, PLA Information Engineering University, Zhengzhou 450001, China
  2. 73671 Army, Luan 237000, China
  • Revised: 2017-12-08 Online: 2018-02-01 Published: 2018-03-28

Abstract:

In the proposed model, an LSTM and a piecewise CNN were used to extract word vector features and global vector features, respectively. The two feature sets were then fed into a feedforward network for training, using a probabilistic training method. Compared with the original dependency parsing model, the proposed model places more emphasis on global features and uses all potential dependency trees to update the model parameters. Experiments on the Chinese Penn Treebank 5 (CTB5) dataset show that, compared with parsing models using only an LSTM or only a CNN, the proposed model maintains relatively low model complexity while achieving higher accuracy.
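The scoring scheme described above can be illustrated with a minimal NumPy sketch. This is a hypothetical reconstruction, not the authors' implementation: the LSTM and CNN outputs are stubbed with random matrices, the three-segment piecewise max pooling (split around the candidate head and dependent positions) and the feedforward scorer are assumptions, and the softmax over candidate heads only hints at the probabilistic training over all potential trees.

```python
import numpy as np

rng = np.random.default_rng(0)

def piecewise_max_pool(conv_out, i, j):
    """Piecewise max pooling: split per-position CNN outputs into three
    segments around the candidate head (i) and dependent (j) positions,
    then max-pool each segment (hypothetical segmentation scheme)."""
    lo, hi = sorted((i, j))
    segments = [conv_out[: lo + 1], conv_out[lo : hi + 1], conv_out[hi:]]
    return np.concatenate([seg.max(axis=0) for seg in segments])

# Toy dimensions; real feature sizes are not given in the abstract.
n, d_lstm, d_conv, d_hidden = 6, 8, 5, 16
lstm_feats = rng.standard_normal((n, d_lstm))  # per-word LSTM features (stub)
conv_feats = rng.standard_normal((n, d_conv))  # per-position CNN outputs (stub)

# One-hidden-layer feedforward scorer over concatenated features.
W1 = rng.standard_normal((d_hidden, 2 * d_lstm + 3 * d_conv))
w2 = rng.standard_normal(d_hidden)

def arc_score(head, dep):
    """Score a candidate head->dependent arc from LSTM word features
    plus the piecewise-pooled global CNN feature."""
    x = np.concatenate([lstm_feats[head], lstm_feats[dep],
                        piecewise_max_pool(conv_feats, head, dep)])
    return w2 @ np.tanh(W1 @ x)

# Probabilistic flavour: normalize arc scores over all candidate heads
# for one dependent, so every potential attachment contributes gradient.
dep = 2
scores = np.array([arc_score(h, dep) for h in range(n) if h != dep])
probs = np.exp(scores - scores.max())
probs /= probs.sum()
```

Each candidate arc thus sees both local (LSTM) and global (piecewise CNN) evidence before scoring, which is the combination the abstract credits for the accuracy gain.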

Key words: dependency parsing, graph-based model, long short-term memory network, convolutional neural network, feature

