Journal on Communications ›› 2019, Vol. 40 ›› Issue (12): 41-50.doi: 10.11959/j.issn.1000-436x.2019195

• Papers •

Self-correcting complex semantic analysis method based on pre-training mechanism

Qing LI1,Jiang ZHONG1(),Lili LI2,Qi LI3   

  1. 1 College of Computer Science, Chongqing University, Chongqing 400044, China
    2 School of Civil Engineering,Chongqing University,Chongqing 400044,China
    3 Department of Computer Science and Engineering,Shaoxing University,Shaoxing 312000,China
  • Revised: 2019-10-30 Online: 2019-12-25 Published: 2020-01-16
  • Supported by:
    Fundamental Research Funds for the Central Universities(2018CDYJSY0055);The National Key Research and Development Program of China(2017YFB1402400);The National Key Research and Development Program of China(CYB18058);Chongqing Technological Innovation and Application Demonstration Project(cstc2018jszx-cyzdX0086)

Abstract:

In the process of knowledge service, content resources must be fragmented, refined, and reorganized to meet the needs of intelligent, knowledge-enabled management. By deeply analyzing and mining the knowledge, technology, experience, and information hidden in semantics, the proposed approach breaks through the bottleneck of traditional Text-to-SQL semantic parsing. A method named PT-Sem2SQL, based on a pre-training mechanism, was proposed. An MT-DNN pre-training mechanism combined with Kullback-Leibler divergence was designed to deepen contextual semantic understanding, and a dedicated enhancement module was designed to capture the position of contextual semantic information within a sentence. The execution of the generation model was further optimized by a self-correcting method that resolves erroneous outputs during decoding. Experimental results show that PT-Sem2SQL effectively improves the parsing of complex semantics, and its accuracy outperforms that of related work.
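The abstract does not give the exact formulation of the Kullback-Leibler term, but KL divergence is commonly used to align a model's output distribution with that of a pre-trained (e.g. MT-DNN-style) teacher model. A minimal sketch, assuming hypothetical teacher/student distributions:

```python
import math

def kl_divergence(p, q):
    """KL(p || q) between two discrete probability distributions given as lists."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical example: the pre-trained teacher's softmax output over 3 classes
# versus the downstream model's output; the KL term penalizes divergence.
teacher = [0.7, 0.2, 0.1]
student = [0.5, 0.3, 0.2]
kl_loss = kl_divergence(teacher, student)  # added to the task loss during training
```

In a distillation-style setup, this term would be weighted and summed with the standard task loss; the exact weighting in PT-Sem2SQL is not specified here.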

Key words: Text-to-SQL, semantic parsing, natural language processing, complex event processing

