[1] XU W D , AULI M , CLARK S . CCG supertagging with a recurrent neural network[C]// The 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Short Papers). 2015: 250-255.
[2] HINTON G E . Learning distributed representations of concepts[C]// The Eighth Annual Conference of the Cognitive Science Society. Amherst, MA, 1986: 1-12.
[3] YAO Y S , HUANG Z . Bi-directional LSTM recurrent neural network for Chinese word segmentation[J]. arXiv:1602.04874v1[cs.LG], 2016.
[4] CHIU J P C , NICHOLS E . Named entity recognition with bidirectional LSTM-CNNs[J]. Transactions of the Association for Computational Linguistics, 2016,4: 357-370.
[5] HOCHREITER S , SCHMIDHUBER J . Long short-term memory[J]. Neural Computation, 1997,9(8): 1735-1780.
[6] TAN M , XIANG B , ZHOU B W . LSTM-based deep learning models for non-factoid answer selection[C]// ICLR 2016.
[7] KIPERWASSER E , GOLDBERG Y . Simple and accurate dependency parsing using bidirectional LSTM feature representations[J]. Transactions of the Association for Computational Linguistics, 2016,4: 313-327.
[8] ZEILER M D . ADADELTA:an adaptive learning rate method[J]. arXiv:1212.5701v1[cs.LG], 2012.
[9] MIKOLOV T , SUTSKEVER I , CHEN K ,et al. Distributed representations of words and phrases and their compositionality[J]. Advances in Neural Information Processing Systems, 2013,26: 3111-3119.
[10] MAAS A L , DALY R E , PHAM P T ,et al. Learning word vectors for sentiment analysis[C]// The 49th Annual Meeting of the Association for Computational Linguistics:Human Language Technologies. 2011: 142-150.
[11] CHO K . Natural language understanding with distributed representation[J]. Nato Asi, 2015,147: 139-155.
[12] SANTOS C D , TAN M , XIANG B . Attentive pooling networks[J]. arXiv:1602.03609v1[cs.CL], 2016.
[13] SEVERYN A , MOSCHITTI A . Modeling relational information in question-answer pairs with convolutional neural networks[J]. arXiv:1604.01178v1[cs.CL], 2016.
[14] CROSS J , HUANG L . Incremental parsing with minimal features using bi-directional LSTM[C]// The 54th Annual Meeting of the Association for Computational Linguistics. 2016: 32-37.
[15] TURIAN J , RATINOV L , BENGIO Y . Word representations:a simple and general method for semi-supervised learning[C]// The 48th Annual Meeting of the Association for Computational Linguistics. 2010: 384-394.
[16] SUGAWARA H , TAKAMURA H , SASANO R ,et al. Context representation with word embeddings for WSD[M]// Computational Linguistics. Singapore: Springer, 2015: 108-119.
[17] SUNDERMEYER M , SCHLÜTER R , NEY H . LSTM neural networks for language modeling[J]. Interspeech, 2012,31(43): 601-608.
[18] WANG P L , QIAN Y , SOONG F K ,et al. A unified tagging solution:bidirectional LSTM recurrent neural network with word embedding[J]. arXiv:1511.00215v1[cs.CL], 2015.
[19] HUANG Z H , XU W , YU K . Bidirectional LSTM-CRF models for sequence tagging[J]. arXiv:1508.01991v1[cs.CL], 2015.
[20] LING W , LUÍS T , MARUJO L ,et al. Finding function in form:compositional character models for open vocabulary word representation[C]// The 2015 Conference on Empirical Methods in Natural Language Processing. 2015: 1520-1530.
[21] RAO A , SPASOJEVIC N . Actionable and political text classification using word embeddings and LSTM[J]. arXiv:1607.02501v1[cs.CL], 2016.