Abstract
Syntactic and semantic parsing has been investigated for decades and remains a primary topic in the natural language processing community. This article presents a brief survey of the topic. Parsing encompasses many tasks, which are difficult to cover fully; here we focus on two of the most popular formalizations: constituent parsing and dependency parsing. Constituent parsing mainly targets syntactic analysis, while dependency parsing can handle both syntactic and semantic analysis. This article briefly reviews representative models of constituent parsing and dependency parsing, as well as dependency graph parsing with rich semantics. In addition, we review closely related topics such as cross-domain, cross-lingual, and joint parsing models, parser applications, and corpus development for parsing.
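As a rough illustration of the two formalizations contrasted above (a sketch for this survey's reader, not material from the article itself), a constituent parse can be written as a nested phrase-structure tree, while a dependency parse assigns each word a head and a relation. The relation labels below follow the Universal Dependencies style and the three-word sentence is an invented example:

```python
# Two encodings of the same sentence, "She eats fish".

# Constituent parse: nested phrase structure as (label, children...) tuples.
constituent = ("S",
               ("NP", ("PRP", "She")),
               ("VP", ("VBZ", "eats"),
                      ("NP", ("NN", "fish"))))

# Dependency parse: one (head, relation) pair per token, 1-indexed heads;
# head 0 marks the root of the tree.
tokens = ["She", "eats", "fish"]
dependencies = [(2, "nsubj"),  # She  <- eats
                (0, "root"),   # eats <- ROOT
                (2, "obj")]    # fish <- eats

def leaves(tree):
    """Collect the terminal words of a constituent tree, left to right."""
    if isinstance(tree, str):
        return [tree]
    words = []
    for child in tree[1:]:
        words.extend(leaves(child))
    return words

# Both structures describe the same token sequence.
assert leaves(constituent) == tokens
```

Constituent parsing recovers the nested bracketing; dependency parsing recovers the head/relation pairs, whose arcs generalize naturally to the semantic dependency graphs surveyed later.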
References
Manning C D, Schütze H. Foundations of Statistical Natural Language Processing. Massachusetts: The MIT Press, 1999
Kübler S, McDonald R, Nivre J. Dependency parsing. Synthesis Lectures on Human Language Technologies, 2009, 1: 1–127
Zong C Q. Statistical Natural Language Processing. Beijing: Tsinghua University Press, 2013
Jurafsky D, Martin J H. Speech and Language Processing. 3rd ed. New Jersey: Prentice Hall PTR, 2019
Yamada K, Knight K. A syntax-based statistical translation model. In: Proceedings of Association for Computational Linguistics, 39th Annual Meeting and 10th Conference of the European Chapter. Toulouse, 2001. 523–530
Chan Y S, Roth D. Exploiting syntactico-semantic structures for relation extraction. In: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies. Portland, 2011. 551–560
Zou H, Tang X, Xie B, et al. Sentiment classification using machine learning techniques with syntax features. In: Proceedings of 2015 International Conference on Computational Science and Computational Intelligence. IEEE, 2015. 175–179
Henderson J. Discriminative training of a neural network statistical parser. In: Proceedings of the 42nd Annual Meeting of the Association for Computational Linguistics. Barcelona, 2004. 95–102
Collins M. Three generative, lexicalised models for statistical parsing. In: Proceedings of the 35th Annual Meeting of the Association for Computational Linguistics and 8th Conference of the European Chapter of the Association for Computational Linguistics. Madrid, 1997. 16–23
Charniak E. A maximum-entropy-inspired parser. In: Proceedings of the 1st Meeting of the North American Chapter of the Association for Computational Linguistics. Seattle, 2000
McClosky D, Charniak E, Johnson M. Effective self-training for parsing. In: Proceedings of Human Language Technology Conference of the North American Chapter of the Association of Computational Linguistics. New York, 2006. 152–159
Petrov S, Klein D. Improved inference for unlexicalized parsing. In: Proceedings of Human Language Technology Conference of the North American Chapter of the Association of Computational Linguistics. Rochester, 2007. 404–411
Hall D, Durrett G, Klein D. Less grammar, more features. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. Baltimore, 2014. 228–237
Sagae K, Lavie A. A classifier-based parser with linear run-time complexity. In: Proceedings of the Ninth International Workshop on Parsing Technology. Vancouver, 2005. 125–132
Zhu M, Zhang Y, Chen W, et al. Fast and accurate shift-reduce constituent parsing. In: Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics. Sofia, 2013. 434–443
Socher R, Bauer J, Manning C D, et al. Parsing with compositional vector grammars. In: Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics. Sofia, 2013. 455–465
Durrett G, Klein D. Neural CRF parsing. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing. Beijing, 2015. 302–312
Stern M, Andreas J, Klein D. A minimal span-based neural constituency parser. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Vancouver, 2017. 818–827
Kitaev N, Klein D. Constituency parsing with a self-attentive encoder. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Melbourne, 2018. 2676–2686
Wang Z, Mi H, Xue N. Feature optimization for constituent parsing via neural networks. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing. Beijing, 2015. 1138–1147
Watanabe T, Sumita E. Transition-based neural constituent parsing. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing. Beijing, 2015. 1169–1179
Dyer C, Kuncoro A, Ballesteros M, et al. Recurrent neural network grammars. In: Proceedings of The 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. San Diego, 2016. 199–209
Cross J, Huang L. Span-based constituency parsing with a structure-label system and provably optimal dynamic oracles. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. Austin, 2016. 1–11
Liu J, Zhang Y. In-order transition-based constituent parsing. Trans Assoc Comput Linguist, 2017, 5: 413–424
Fried D, Klein D. Policy gradient as a proxy for dynamic oracles in constituency parsing. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Melbourne, 2018. 469–476
Kitaev N, Klein D. Tetra-tagging: Word-synchronous parsing with linear-time inference. ArXiv: 1904.09745
Shen Y, Lin Z, Jacob A P, et al. Straight to the tree: Constituency parsing with neural syntactic distance. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Melbourne, 2018. 1171–1180
Teng Z, Zhang Y. Two local models for neural constituent parsing. In: Proceedings of the 27th International Conference on Computational Linguistics. Santa Fe, 2018. 119–132
Vilares D, Abdou M, Søgaard A. Better, faster, stronger sequence tagging constituent parsers. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Minneapolis, 2019. 3372–3383
Zhou J, Zhao H. Head-driven phrase structure grammar parsing on Penn Treebank. In: Proceedings of the 57th Conference of the Association for Computational Linguistics. Florence, 2019. 2396–2408
Mrini K, Dernoncourt F, Bui T, et al. Rethinking self-attention: An interpretable self-attentive encoder-decoder parser. ArXiv: 1911.03875
Klein D, Manning C D. Accurate unlexicalized parsing. In: Proceedings of the 41st Annual Meeting of the Association for Computational Linguistics. Sapporo, 2003. 423–430
Socher R, Manning C D, Ng A Y. Learning continuous phrase representations and syntactic parsing with recursive neural networks. In: Proceedings of the NIPS-2010 Deep Learning and Unsupervised Feature Learning Workshop. Vancouver, 2010. 1–9
Gaddy D, Stern M, Klein D. What’s going on in neural constituency parsers? An analysis. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. New Orleans, 2018. 999–1010
Peters M, Neumann M, Iyyer M, et al. Deep contextualized word representations. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. New Orleans, 2018. 2227–2237
Devlin J, Chang M W, Lee K, et al. BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Minneapolis, 2019. 4171–4186
Ratnaparkhi A. A linear observed time statistical parser based on maximum entropy models. In: Proceedings of Second Conference on Empirical Methods in Natural Language Processing. Providence, 1997
Zhang Y, Clark S. Transition-based parsing of the Chinese Treebank using a global discriminative model. In: Proceedings of the 11th International Workshop on Parsing Technologies. Paris, 2009. 162–171
Collins M. Discriminative training methods for hidden Markov models: Theory and experiments with perceptron algorithms. In: Proceedings of the 2002 Conference on Empirical Methods in Natural Language Processing. Philadelphia, 2002. 1–8
Cross J, Huang L. Incremental parsing with minimal features using bi-directional LSTM. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin, 2016. 32–37
Coavoux M, Crabbé B. Neural greedy constituent parsing with dynamic oracles. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin, 2016. 172–182
Goldberg Y, Nivre J. A dynamic oracle for arc-eager dependency parsing. In: Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning. Jeju Island, 2012. 959–976
Coavoux M, Crabbé B, Cohen S B. Unlexicalized transition-based discontinuous constituency parsing. Trans Assoc Comput Linguist, 2019, 7: 73–89
Fernández-González D, Gómez-Rodríguez C. Faster shift-reduce constituent parsing with a non-binary, bottom-up strategy. Artif Intell, 2019, 275: 559–574
Vinyals O, Kaiser Ł, Koo T, et al. Grammar as a foreign language. In: Proceedings of Advances in Neural Information Processing Systems. Montreal, 2015. 2773–2781
Choe D K, Charniak E. Parsing as language modeling. In: Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. Austin, 2016. 2331–2336
Gómez-Rodríguez C, Vilares D. Constituent parsing as sequence labeling. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels, 2018. 1314–1324
Vilares D, Strzyz M, Søgaard A, et al. Parsing as pretraining. ArXiv: 2002.01685
Yang Z, Dai Z, Yang Y, et al. XLNet: Generalized autoregressive pretraining for language understanding. In: Proceedings of Thirty-third Conference on Neural Information Processing Systems. Vancouver, 2019. 5754–5764
McClosky D, Charniak E, Johnson M. When is self-training effective for parsing? In: Proceedings of the 22nd International Conference on Computational Linguistics. Manchester, 2008. 561–568
Candito M, Crabbé B. Improving generative statistical parsing with semi-supervised word clustering. In: Proceedings of the 11th International Workshop on Parsing Technologies. Paris, 2009. 138–141
Collins M, Koo T. Discriminative reranking for natural language parsing. Comput Linguist, 2005, 31: 25–70
Huang L. Forest reranking: Discriminative parsing with non-local features. In: Proceedings of the 46th Annual Meeting of the Association for Computational Linguistics. Columbus, 2008. 586–594
Hajic J, Ciaramita M, Johansson R, et al. The CoNLL-2009 shared task: Syntactic and semantic dependencies in multiple languages. In: Proceedings of the Thirteenth Conference on Computational Natural Language Learning. Boulder, 2009. 1–18
Oepen S, Kuhlmann M, Miyao Y, et al. SemEval 2015 task 18: Broad-coverage semantic dependency parsing. In: Proceedings of the 9th International Workshop on Semantic Evaluation. Denver, 2015. 915–926
Nivre J, McDonald R. Integrating graph-based and transition-based dependency parsers. In: Proceedings of the 46th Annual Meeting of the Association for Computational Linguistics. Columbus, 2008. 950–958
McDonald R, Crammer K, Pereira F. Online large-margin training of dependency parsers. In: Proceedings of the 43rd Annual Meeting of the Association for Computational Linguistics. Ann Arbor, 2005. 91–98
Carreras X. Experiments with a higher-order projective dependency parser. In: Proceedings of the CoNLL Shared Task Session of EMNLP-CoNLL. Prague, 2007
Nivre J. An efficient algorithm for projective dependency parsing. In: Proceedings of the Eighth International Conference on Parsing Technologies. Nancy, 2003
Yamada H, Matsumoto Y. Statistical dependency analysis with support vector machines. In: Proceedings of the Eighth International Conference on Parsing Technologies. Nancy, 2003
Li Z, Cai J, He S, et al. Seq2seq dependency parsing. In: Proceedings of the 27th International Conference on Computational Linguistics. Santa Fe, 2018. 3203–3214
McDonald R, Pereira F. Online learning of approximate dependency parsing algorithms. In: Proceedings of the 11th Conference of the European Chapter of the Association for Computational Linguistics. Trento, 2006
Koo T, Carreras X, Collins M. Simple semi-supervised dependency parsing. In: Proceedings of the 46th Annual Meeting of the Association for Computational Linguistics. Columbus, 2008. 595–603
Chen W, Kazama J, Uchimoto K, et al. Improving dependency parsing with subtrees from auto-parsed data. In: Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing. Singapore, 2009. 570–579
Bohnet B. Very high accuracy and fast dependency parsing is not a contradiction. In: Proceedings of the 23rd International Conference on Computational Linguistics. Beijing, 2010. 89–97
Koo T, Collins M. Efficient third-order dependency parsers. In: Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics. Uppsala, 2010. 1–11
Ma X, Zhao H. Fourth-order dependency parsing. In: Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning. Jeju Island, 2012. 785–796
Nivre J. Algorithms for deterministic incremental dependency parsing. Comput Linguist, 2008, 34: 513–553
Zhang Y, Clark S. A tale of two parsers: Investigating and combining graph-based and transition-based dependency parsing. In: Proceedings of the 2008 Conference on Empirical Methods in Natural Language Processing. Honolulu, 2008. 562–571
Zhang Y, Nivre J. Transition-based dependency parsing with rich nonlocal features. In: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies. Portland, 2011. 188–193
Pei W, Ge T, Chang B. An effective neural network model for graph-based dependency parsing. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing. Beijing, 2015. 313–322
Zhang Z, Zhao H, Qin L. Probabilistic graph-based dependency parsing with convolutional neural network. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin, 2016. 1382–1392
Wang W, Chang B. Graph-based dependency parsing with bidirectional LSTM. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin, 2016. 2306–2315
Kiperwasser E, Goldberg Y. Simple and accurate dependency parsing using bidirectional LSTM feature representations. Trans Assoc Comput Linguist, 2016, 4: 313–327
Dozat T, Manning C D. Deep biaffine attention for neural dependency parsing. ArXiv: 1611.01734
Li Y, Li Z, Zhang M, et al. Self-attentive biaffine dependency parsing. In: Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence. Macao, 2019. 5067–5073
Ji T, Wu Y, Lan M. Graph-based dependency parsing with graph neural networks. In: Proceedings of the 57th Conference of the Association for Computational Linguistics. Florence, 2019. 2475–2485
Chen D, Manning C. A fast and accurate dependency parser using neural networks. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha, 2014. 740–750
Dyer C, Ballesteros M, Ling W, et al. Transition-based dependency parsing with stack long short-term memory. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing. Beijing, 2015. 334–343
Zhou H, Zhang Y, Huang S, et al. A neural probabilistic structured-prediction model for transition-based dependency parsing. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing. Beijing, 2015. 1213–1222
Andor D, Alberti C, Weiss D, et al. Globally normalized transition-based neural networks. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin, 2016. 2442–2452
Ballesteros M, Dyer C, Goldberg Y, et al. Greedy transition-based dependency parsing with stack LSTMs. Comput Linguist, 2017, 43: 311–347
Ma X, Hu Z, Liu J, et al. Stack-pointer networks for dependency parsing. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Melbourne, 2018. 1403–1414
Kiperwasser E, Goldberg Y. Easy-first dependency parsing with hierarchical tree LSTMs. Trans Assoc Comput Linguist, 2016, 4: 445–461
Strzyz M, Vilares D, Gómez-Rodríguez C. Viable dependency parsing as sequence labeling. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Minneapolis, 2019. 717–723
Kulmizev A, de Lhoneux M, Gontrum J, et al. Deep contextualized word embeddings in transition-based and graph-based dependency parsing — a tale of two parsers revisited. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Hong Kong, 2019. 2755–2768
McDonald R. Discriminative learning and spanning tree algorithms for dependency parsing. Dissertation for Doctoral Degree. Philadelphia: University of Pennsylvania, 2006
Bohnet B. Very high accuracy and fast dependency parsing is not a contradiction. In: Proceedings of the 23rd International Conference on Computational Linguistics. Beijing, 2010. 89–97
Lei T, Xin Y, Zhang Y, et al. Low-rank tensors for scoring dependency structures. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. Baltimore, 2014. 1381–1391
Gómez-Rodríguez C, Nivre J. Divisible transition systems and multiplanar dependency parsing. Comput Linguist, 2013, 39: 799–845
Nivre J. Non-projective dependency parsing in expected linear time. In: Proceedings of the 47th Annual Meeting of the Association for Computational Linguistics and the 4th International Joint Conference on Natural Language Processing of the AFNLP. Singapore, 2009. 351–359
Sartorio F, Satta G, Nivre J. A transition-based dependency parser using a dynamic parsing strategy. In: Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics. Sofia, 2013. 135–144
Noji H, Miyao Y. Left-corner transitions on dependency parsing. In: Proceedings of the 25th International Conference on Computational Linguistics. Dublin, 2014. 2140–2150
Huang L, Sagae K. Dynamic programming for linear-time incremental parsing. In: Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics. Uppsala, 2010. 1077–1086
Kuhlmann M, Gómez-Rodríguez C, Satta G. Dynamic programming algorithms for transition-based dependency parsers. In: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies. Portland, 2011. 673–682
Goldberg Y, Sartorio F, Satta G. A tabular method for dynamic oracles in transition-based parsing. Trans Assoc Comput Linguist, 2014, 2: 119–130
Gómez-Rodríguez C, Sartorio F, Satta G. A polynomial-time dynamic oracle for non-projective dependency parsing. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha, 2014. 917–927
Ballesteros M, Dyer C, Smith N A. Improved transition-based parsing by modeling characters instead of words with LSTMs. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Lisbon, 2015. 349–359
de Lhoneux M, Ballesteros M, Nivre J. Recursive subtree composition in LSTM-based dependency parsing. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Minneapolis, 2019. 1566–1576
Fernández-González D, Gómez-Rodríguez C. A dynamic oracle for linear-time 2-planar dependency parsing. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. New Orleans, 2018. 386–392
Sun W, Wan X. Data-driven, PCFG-based and pseudo-PCFG-based models for Chinese dependency parsing. Trans Assoc Comput Linguist, 2013, 1: 301–314
Goldberg Y, Elhadad M. An efficient algorithm for easy-first non-directional dependency parsing. In: Proceedings of Human Language Technologies: Conference of the North American Chapter of the Association of Computational Linguistics. Los Angeles, 2010. 742–750
Zhou G, Zhao J, Liu K, et al. Exploiting web-derived selectional preference to improve statistical dependency parsing. In: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies. Portland, 2011. 1556–1565
Turian J, Ratinov L A, Bengio Y. Word representations: A simple and general method for semi-supervised learning. In: Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics. Uppsala, 2010. 384–394
Chen W, Zhang Y, Zhang M. Feature embedding for dependency parsing. In: Proceedings of the 25th International Conference on Computational Linguistics. Dublin, 2014. 816–826
Chen W, Kawahara D, Uchimoto K, et al. Dependency parsing with short dependency relations in unlabeled data. In: Proceedings of Third International Joint Conference on Natural Language Processing. Hyderabad, 2008
Chen W, Zhang M, Li H. Utilizing dependency language models for graph-based dependency parsing models. In: Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics. Jeju Island, 2012. 213–222
Søgaard A, Rishøj C. Semi-supervised dependency parsing using generalized tri-training. In: Proceedings of the 23rd International Conference on Computational Linguistics. Beijing, 2010. 1065–1073
Li Z, Zhang M, Chen W. Ambiguity-aware ensemble training for semi-supervised dependency parsing. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. Baltimore, 2014. 457–467
Li Z, Liu T, Che W. Exploiting multiple treebanks for parsing with quasi-synchronous grammars. In: Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics. Jeju Island, 2012. 675–684
Guo J, Che W, Wang H, et al. A universal framework for inductive transfer parsing across multi-typed treebanks. In: Proceedings of the 26th International Conference on Computational Linguistics. Osaka, 2016. 12–22
Jiang X, Li Z, Zhang B, et al. Supervised treebank conversion: Data and approaches. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Melbourne, 2018. 2706–2716
Oepen S, Kuhlmann M, Miyao Y, et al. SemEval 2014 task 8: Broad-coverage semantic dependency parsing. In: Proceedings of the 8th International Workshop on Semantic Evaluation. Dublin, 2014. 63–72
Che W, Shao Y, Liu T, et al. SemEval-2016 task 9: Chinese semantic dependency parsing. In: Proceedings of the 10th International Workshop on Semantic Evaluation. San Diego, 2016. 1074–1080
Surdeanu M, Johansson R, Meyers A, et al. The CoNLL 2008 shared task on joint parsing of syntactic and semantic dependencies. In: Proceedings of the Twelfth Conference on Computational Natural Language Learning. Manchester, 2008. 159–177
Che W, Li Z, Li Y, et al. Multilingual dependency-based syntactic and semantic parsing. In: Proceedings of the Thirteenth Conference on Computational Natural Language Learning. Boulder, 2009. 49–54
Johansson R. Statistical bistratal dependency parsing. In: Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing. Singapore, 2009. 561–569
Titov I, Henderson J, Merlo P, et al. Online graph planarisation for synchronous parsing of semantic and syntactic dependencies. In: Proceedings of the 21st International Joint Conference on Artificial Intelligence. Pasadena, 2009
Henderson J, Merlo P, Titov I, et al. Multilingual joint parsing of syntactic and semantic dependencies with a latent variable model. Comput Linguist, 2013, 39: 949–998
Swayamdipta S, Ballesteros M, Dyer C, et al. Greedy, joint syntactic-semantic parsing with stack LSTMs. In: Proceedings of the 20th SIGNLL Conference on Computational Natural Language Learning. Berlin, 2016. 187–197
Lluís X, Carreras X, Màrquez L. Joint arc-factored parsing of syntactic and semantic dependencies. Trans Assoc Comput Linguist, 2013, 1: 219–230
Sun W, Du Y, Kou X, et al. Grammatical relations in Chinese: GB-ground extraction and data-driven parsing. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. Baltimore, 2014. 446–456
Du Y, Zhang F, Zhang X, et al. Peking: Building semantic dependency graphs with a hybrid parser. In: Proceedings of the 9th International Workshop on Semantic Evaluation. Denver, 2015. 927–931
Almeida M S, Martins A F. Lisbon: Evaluating TurboSemanticParser on multiple languages and out-of-domain data. In: Proceedings of the 9th International Workshop on Semantic Evaluation. Denver, 2015. 970–973
Peng H, Thomson S, Smith N A. Deep multitask learning for semantic dependency parsing. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Vancouver, 2017. 2037–2048
Wang Y, Che W, Guo J, et al. A neural transition-based approach for semantic dependency graph parsing. In: Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence. New Orleans, 2018
Dozat T, Manning C D. Simpler but more accurate semantic dependency parsing. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Melbourne, 2018. 484–490
Wang X, Huang J, Tu K. Second-order semantic dependency parsing with end-to-end neural networks. In: Proceedings of the 57th Conference of the Association for Computational Linguistics. Florence, 2019. 4609–4618
Thomson S, O’Connor B, Flanigan J, et al. CMU: Arc-factored, discriminative semantic dependency parsing. In: Proceedings of the 8th International Workshop on Semantic Evaluation. Dublin, 2014. 176–180
Kuhlmann M, Jonsson P. Parsing to noncrossing dependency graphs. Trans Assoc Comput Linguist, 2015, 3: 559–570
Cao J, Huang S, Sun W, et al. Parsing to 1-endpoint-crossing, pagenumber-2 graphs. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Vancouver, 2017. 2110–2120
Cao J, Huang S, Sun W, et al. Quasi-second-order parsing for 1-endpoint-crossing, pagenumber-2 graphs. In: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Copenhagen, 2017. 24–34
Sun W, Du Y, Wan X. Parsing for grammatical relations via graph merging. In: Proceedings of the 21st Conference on Computational Natural Language Learning. Vancouver, 2017. 26–35
Sun W, Cao J, Wan X. Semantic dependency parsing via book embedding. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Vancouver, 2017. 828–838
Ribeyre C, Villemonte de la Clergerie E, Seddah D. Alpage: Transition-based semantic graph parsing with syntactic features. In: Proceedings of the 8th International Workshop on Semantic Evaluation. Dublin, 2014. 97–103
Kanerva J, Luotolahti J, Ginter F. Turku: Semantic dependency parsing as a sequence classification. In: Proceedings of the 9th International Workshop on Semantic Evaluation. Denver, 2015. 965–969
Sagae K, Tsujii J. Shift-reduce dependency DAG parsing. In: Proceedings of the 22nd International Conference on Computational Linguistics. Manchester, 2008. 753–760
Tokgöz A, Eryiğit G. Transition-based dependency DAG parsing using dynamic oracles. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing: Student Research Workshop. Beijing, 2015. 22–27
Zhang X, Du Y, Sun W, et al. Transition-based parsing for deep dependency structures. Comput Linguist, 2016, 42: 353–389
Gildea D, Satta G, Peng X. Cache transition systems for graph parsing. Comput Linguist, 2018, 44: 85–118
Buys J, Blunsom P. Robust incremental neural semantic graph parsing. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Vancouver, 2017. 1215–1226
Agić Ž, Koller A. Potsdam: Semantic dependency parsing by bidirectional graph-tree transformations and syntactic parsing. In: Proceedings of the 8th International Workshop on Semantic Evaluation. Dublin, 2014. 465–470
Schluter N, Søgaard A, Elming J, et al. Copenhagen-malmö: Tree approximations of semantic parsing problems. In: Proceedings of the 8th International Workshop on Semantic Evaluation. Dublin, 2014. 213–217
Agić Ž, Koller A, Oepen S. Semantic dependency graph parsing using tree approximations. In: Proceedings of the 11th International Conference on Computational Semantics. London, 2015
McClosky D, Charniak E, Johnson M. Reranking and self-training for parser adaptation. In: Proceedings of the 21st International Conference on Computational Linguistics and 44th Annual Meeting of the Association for Computational Linguistics. Sydney, 2006. 337–344
Sagae K. Self-training without reranking for parser domain adaptation and its impact on semantic role labeling. In: Proceedings of Workshop on Domain Adaptation for Natural Language Processing. Uppsala, 2010. 37–44
Kawahara D, Uchimoto K. Learning reliability of parses for domain adaptation of dependency parsing. In: Proceedings of Third International Joint Conference on Natural Language Processing. Hyderabad, 2008
Chen W, Wu Y, Isahara H. Learning reliable information for dependency parsing adaptation. In: Proceedings of the 22nd International Conference on Computational Linguistics. Manchester, 2008. 113–120
Yu J, Elkaref M, Bohnet B. Domain adaptation for dependency parsing via self-training. In: Proceedings of the 14th International Conference on Parsing Technologies. Bilbao, 2015. 1–10
Steedman M, Hwa R, Clark S, et al. Example selection for bootstrapping statistical parsers. In: Proceedings of Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics. Edmonton, 2003. 236–243
Sagae K, Tsujii J. Dependency parsing and domain adaptation with LR models and parser ensembles. In: Proceedings of the 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning. Prague, 2007. 1044–1050
Plank B, van Noord G. Effective measures of domain similarity for parsing. In: Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies. Portland, 2011. 1566–1576
Yang H, Zhuang T, Zong C. Domain adaptation for syntactic and semantic dependency parsing using deep belief networks. Trans Assoc Comput Linguist, 2015, 3: 271–282
McClosky D, Charniak E, Johnson M. Automatic domain adaptation for parsing. In: Proceedings of Human Language Technologies: Conference of the North American Chapter of the Association for Computational Linguistics. Los Angeles, 2010. 28–36
Reichart R, Rappoport A. Self-training for enhancement and domain adaptation of statistical parsers trained on small datasets. In: Proceedings of the 45th Annual Meeting of the Association for Computational Linguistics. Prague, 2007. 616–623
Daumé III H. Frustratingly easy domain adaptation. In: Proceedings of the 45th Annual Meeting of the Association for Computational Linguistics. Prague, 2007. 256–263
Finkel J R, Manning C D. Hierarchical Bayesian domain adaptation. In: Proceedings of Human Language Technologies: Conference of the North American Chapter of the Association for Computational Linguistics. Boulder, 2009. 602–610
Ganin Y, Lempitsky V. Unsupervised domain adaptation by back-propagation. In: Proceedings of the 32nd International Conference on Machine Learning. Lille, 2015. 1180–1189
Sano M, Manabe H, Noji H, et al. Adversarial training for cross-domain universal dependency parsing. In: Proceedings of the CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies. Vancouver, 2017. 71–79
Joshi V, Peters M, Hopkins M. Extending a parser to distant domains using a few dozen partially annotated examples. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. Melbourne, 2018. 1190–1199
Flannery D, Mori S. Combining active learning and partial annotation for domain adaptation of a Japanese dependency parser. In: Proceedings of the 14th International Conference on Parsing Technologies. Bilbao, 2015. 11–19
Li Z, Peng X, Zhang M, et al. Semi-supervised domain adaptation for dependency parsing. In: Proceedings of the 57th Conference of the Association for Computational Linguistics. Florence, 2019. 2386–2395
Zeman D, Resnik P. Cross-language parser adaptation between related languages. In: Proceedings of the Third International Joint Conference on Natural Language Processing-08 Workshop on NLP for Less Privileged Languages. Hyderabad, 2008
McDonald R, Petrov S, Hall K. Multi-source transfer of delexicalized dependency parsers. In: Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing. Edinburgh, 2011. 62–72
Cohen S B, Das D, Smith N A. Unsupervised structure prediction with non-parallel multilingual guidance. In: Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing. Edinburgh, 2011. 50–61
Naseem T, Barzilay R, Globerson A. Selective sharing for multilingual dependency parsing. In: Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics. Jeju Island, 2012. 629–637
Täckström O, McDonald R, Uszkoreit J. Cross-lingual word clusters for direct transfer of linguistic structure. In: Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Montreal, 2012. 477–487
Guo J, Che W, Yarowsky D, et al. Cross-lingual dependency parsing based on distributed representations. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing. Beijing, 2015. 1234–1244
Zhang Y, Barzilay R. Hierarchical low-rank tensors for multilingual transfer parsing. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Lisbon, 2015. 1857–1867
Guo J, Che W, Yarowsky D, et al. A representation learning framework for multi-source transfer parsing. In: Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence. Phoenix, 2016
Wick M, Kanani P, Pocock A. Minimally-constrained multilingual embeddings via artificial code-switching. In: Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence. Phoenix, 2016
Schuster T, Ram O, Barzilay R, et al. Cross-lingual alignment of contextual word embeddings, with applications to zero-shot dependency parsing. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Minneapolis, 2019. 1599–1613
Wang Y, Che W, Guo J, et al. Cross-lingual BERT transformation for zero-shot dependency parsing. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Hong Kong, 2019. 5721–5727
Wu S, Dredze M. Beto, bentz, becas: The surprising cross-lingual effectiveness of BERT. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Hong Kong, 2019. 833–844
Lample G, Conneau A. Cross-lingual language model pretraining. ArXiv: 1901.07291, 2019
Wu S, Conneau A, Li H, et al. Emerging cross-lingual structure in pretrained language models. ArXiv: 1911.01464, 2019
Snyder B, Naseem T, Barzilay R. Unsupervised multilingual grammar induction. In: Proceedings of the 47th Annual Meeting of the Association for Computational Linguistics and the 4th International Joint Conference on Natural Language Processing of the AFNLP. Singapore, 2009. 73–81
Jiang W, Liu Q, Lv Y. Relaxed cross-lingual projection of constituent syntax. In: Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing. Edinburgh, 2011. 1192–1201
Hwa R, Resnik P, Weinberg A, et al. Bootstrapping parsers via syntactic projection across parallel texts. Nat Lang Eng, 2005, 11: 311–325
Li Z, Zhang M, Chen W. Soft cross-lingual syntax projection for dependency parsing. In: Proceedings of the 25th International Conference on Computational Linguistics. Dublin, 2014. 783–793
Ma X, Xia F. Unsupervised dependency parsing with transferring distribution via parallel guidance and entropy regularization. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. Baltimore, 2014. 1337–1348
Schlichtkrull M, Søgaard A. Cross-lingual dependency parsing with late decoding for truly low-resource languages. In: Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics. Valencia, 2017. 220–229
Rasooli M S, Collins M. Density-driven cross-lingual transfer of dependency parsers. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Lisbon, 2015. 328–338
Agić Ž, Johannsen A, Plank B, et al. Multilingual projection for parsing truly low-resource languages. Trans Assoc Comput Linguist, 2016, 4: 301–312
Jiang W, Liu Q, Supnithi T. Joint learning of constituency and dependency grammars by decomposed cross-lingual induction. In: Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence. Austin, 2015
Tiedemann J, Agić Ž, Nivre J. Treebank translation for cross-lingual parser induction. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha, 2014. 130–140
Tiedemann J, Agić Ž. Synthetic treebanking for cross-lingual dependency parsing. J Artif Intell Res, 2016, 55: 209–248
Zhang M, Zhang Y, Fu G. Cross-lingual dependency parsing using code-mixed TreeBank. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Hong Kong, 2019. 997–1006
Rasooli M S, Collins M. Cross-lingual syntactic transfer with limited resources. Trans Assoc Comput Linguist, 2017, 5: 279–293
Wang D, Eisner J. Synthetic data made to order: The case of parsing. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels, 2018. 1325–1337
Rasooli M S, Collins M. Low-resource syntactic transfer with unsupervised source reordering. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Minneapolis, 2019. 3845–3856
Smith D A, Smith N A. Bilingual parsing with factored estimation: Using English to parse Korean. In: Proceedings of the 2004 Conference on Empirical Methods in Natural Language Processing. Barcelona, 2004. 49–56
Burkett D, Klein D. Two languages are better than one (for syntactic parsing). In: Proceedings of the 2008 Conference on Empirical Methods in Natural Language Processing. Honolulu, 2008. 877–886
Ammar W, Mulcaire G, Ballesteros M, et al. Many languages, one parser. Trans Assoc Comput Linguist, 2016, 4: 431–444
Smith A, Bohnet B, de Lhoneux M, et al. 82 treebanks, 34 models: Universal dependency parsing with multi-treebank models. In: Proceedings of the 22nd Conference on Computational Natural Language Learning. Brussels, 2018. 113–123
Kondratyuk D, Straka M. 75 languages, 1 model: Parsing universal dependencies universally. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Hong Kong, 2019. 2779–2795
Li J, Zhou G, Ng H T. Joint syntactic and semantic parsing of Chinese. In: Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics. Uppsala, 2010. 1108–1117
Wang Z, Xue N. Joint POS tagging and transition-based constituent parsing in Chinese with non-local features. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. Baltimore, 2014. 733–742
Li Z, Zhang M, Che W, et al. Joint models for Chinese POS tagging and dependency parsing. In: Proceedings of the 2011 Conference on Empirical Methods in Natural Language Processing. Edinburgh, 2011. 1180–1191
Li Z, Zhang M, Che W, et al. A separately passive-aggressive training algorithm for joint POS tagging and dependency parsing. In: Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning. Jeju Island, 2012. 1681–1698
Hatori J, Matsuzaki T, Miyao Y, et al. Incremental joint POS tagging and dependency parsing in Chinese. In: Proceedings of Fifth International Joint Conference on Natural Language Processing. Chiang Mai, 2011. 1216–1224
Bohnet B, Nivre J. A transition-based system for joint part-of-speech tagging and labeled non-projective dependency parsing. In: Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning. Jeju Island, 2012. 1455–1465
Alberti C, Weiss D, Coppola G, et al. Improved transition-based parsing and tagging with neural networks. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Lisbon, 2015. 1354–1359
Zhang Y, Weiss D. Stack-propagation: Improved representation learning for syntax. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin, 2016. 1557–1566
Yang L, Zhang M, Liu Y, et al. Joint POS tagging and dependency parsing with transition-based neural networks. IEEE/ACM Trans Audio Speech Language Process, 2017, 26: 1352–1358
Luo X. A maximum entropy Chinese character based parser. In: Proceedings of the Conference on Empirical Methods in Natural Language Processing. Sapporo, 2003. 192–199
Zhao H. Character-level dependencies in Chinese: Usefulness and learning. In: Proceedings of the 12th Conference of the European Chapter of the Association for Computational Linguistics. Athens, 2009. 879–887
Hatori J, Matsuzaki T, Miyao Y, et al. Incremental joint approach to word segmentation, POS tagging, and dependency parsing in Chinese. In: Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics. Jeju Island, 2012. 1045–1053
Li Z, Zhou G. Unified dependency parsing of Chinese morphological and syntactic structures. In: Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning. Jeju Island, 2012. 1445–1454
Zhang M, Zhang Y, Che W, et al. Chinese parsing exploiting characters. In: Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics. Sofia, 2013. 125–134
Zhang M, Zhang Y, Che W, et al. Character-level Chinese dependency parsing. In: Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. Baltimore, 2014. 1326–1336
Zhang Y, Li C, Barzilay R, et al. Randomized greedy inference for joint segmentation, POS tagging and dependency parsing. In: Proceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Denver, 2015. 42–52
Zheng X, Peng H, Chen Y, et al. Character-based parsing with convolutional neural network. In: Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence. Buenos Aires, 2015
Li H, Zhang Z, Ju Y, et al. Neural character-level dependency parsing for Chinese. In: Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence. New Orleans, 2018
Yan H, Qiu X, Huang X. A unified model for joint Chinese word segmentation and dependency parsing. ArXiv: 1904.04697, 2019
Johansson R, Nugues P. Dependency-based semantic role labeling of PropBank. In: Proceedings of the 2008 Conference on Empirical Methods in Natural Language Processing. Honolulu, 2008. 69–78
Strubell E, Verga P, Andor D, et al. Linguistically-informed self-attention for semantic role labeling. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels, 2018. 5027–5038
Zhang M, Zhang J, Su J. Exploring syntactic features for relation extraction using a convolution tree kernel. In: Proceedings of Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics. New York, 2006. 288–295
Miwa M, Bansal M. End-to-end relation extraction using LSTMs on sequences and tree structures. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin, 2016. 1105–1116
Tai K S, Socher R, Manning C D. Improved semantic representations from tree-structured long short-term memory networks. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing. Beijing, 2015. 1556–1566
Zhang M, Li Z, Fu G, et al. Syntax-enhanced neural machine translation with syntax-aware word representations. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Minneapolis, 2019. 1151–1161
Liu Y, Liu Q, Lin S. Tree-to-string alignment template for statistical machine translation. In: Proceedings of the 21st International Conference on Computational Linguistics and 44th Annual Meeting of the Association for Computational Linguistics. Sydney, 2006. 609–616
Qiu L, Zhang Y. ZORE: A syntax-based system for Chinese open relation extraction. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha, 2014. 1870–1880
Che W, Zhang M, Liu T, et al. A hybrid convolution tree kernel for semantic role labeling. In: Proceedings of the 21st International Conference on Computational Linguistics and 44th Annual Meeting of the Association for Computational Linguistics. Sydney, 2006. 73–80
Yang X, Su J, Tan C L. Kernel-based pronoun resolution with structured syntactic knowledge. In: Proceedings of the 21st International Conference on Computational Linguistics and 44th Annual Meeting of the Association for Computational Linguistics. Sydney, 2006. 41–48
Zhou G, Zhang M, Ji D H, et al. Tree kernel-based relation extraction with context-sensitive structured parse tree information. In: Proceedings of the 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning. Prague, 2007. 728–736
Zhang M, Li H. Tree kernel-based SVM with structured syntactic knowledge for BTG-based phrase reordering. In: Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing. Singapore, 2009. 698–707
Johansson R, Nugues P. The effect of syntactic representation on semantic role labeling. In: Proceedings of the 22nd International Conference on Computational Linguistics. Manchester, 2008. 393–400
Xu Y, Mou L, Li G, et al. Classifying relations via long short term memory networks along shortest dependency paths. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Lisbon, 2015. 1785–1794
Roth M, Lapata M. Neural semantic role labeling with dependency path embeddings. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Berlin, 2016. 1192–1202
Mou L, Peng H, Li G, et al. Discriminative neural sentence modeling by tree-based convolution. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. Lisbon, 2015. 2315–2325
Zhang X, Lu L, Lapata M. Top-down tree long short-term memory networks. In: Proceedings of The 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. San Diego, 2016. 310–320
Teng Z, Zhang Y. Head-lexicalized bidirectional tree LSTMs. Trans Assoc Comput Linguist, 2017, 5: 163–177
Li J, Xiong D, Tu Z, et al. Modeling source syntax for neural machine translation. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. Vancouver, 2017. 688–697
Wu S, Zhou M, Zhang D. Improved neural machine translation with source syntax. In: Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence. Melbourne, 2017. 4179–4185
Zhang M, Zhang Y, Fu G. End-to-end neural relation extraction with global optimization. In: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Copenhagen, 2017. 1730–1740
Yu N, Zhang M, Fu G. Transition-based neural RST parsing with implicit syntax features. In: Proceedings of the 27th International Conference on Computational Linguistics. Santa Fe, 2018. 559–570
Bastings J, Titov I, Aziz W, et al. Graph convolutional encoders for syntax-aware neural machine translation. In: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Copenhagen, 2017. 1957–1967
Zhang Y, Qi P, Manning C D. Graph convolution over pruned dependency trees improves relation extraction. In: Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels, 2018. 2205–2215
Marcheggiani D, Bastings J, Titov I. Exploiting semantics in neural machine translation with graph convolutional networks. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. New Orleans, 2018. 486–492
Marcus M P, Santorini B, Marcinkiewicz M A. Building a large annotated corpus of English: The Penn Treebank. Comput Linguist, 1993, 19: 313–330
Xue N, Xia F, Chiou F D, et al. The Penn Chinese Treebank: Phrase structure annotation of a large corpus. Nat Lang Eng, 2005, 11: 207–238
Johansson R, Nugues P. Extended constituent-to-dependency conversion for English. In: Proceedings of the 16th Nordic Conference of Computational Linguistics. Tartu, 2007. 105–112
Johansson R, Nugues P. LTH: Semantic structure extraction using non-projective dependency trees. In: Proceedings of the 4th International Workshop on Semantic Evaluations. Prague, 2007. 227–230
Zhenghua L, Wanxiang C, Ting L. A study on the conversion of phrase-structural trees into dependencies. J Chin Inform Process, 2008, 22: 14–19
De Marneffe M C, MacCartney B, Manning C D, et al. Generating typed dependency parses from phrase structure parses. In: Proceedings of the Fifth International Conference on Language Resources and Evaluation. Genoa, 2006. 449–454
Tateisi Y, Yakushiji A, Ohta T, et al. Syntax annotation for the GENIA corpus. In: Proceedings of Natural Language Processing — IJCNLP 2005, Second International Joint Conference. Jeju Island, 2005. 222–227
Kong L, Schneider N, Swayamdipta S, et al. A dependency parser for tweets. In: Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha, 2014. 1001–1012
De Marneffe M C, Dozat T, Silveira N, et al. Universal Stanford dependencies: A cross-linguistic typology. In: Proceedings of the Ninth International Conference on Language Resources and Evaluation. Reykjavik, 2014. 4585–4592
Petrov S, Das D, McDonald R. A universal part-of-speech tagset. In: Proceedings of the Eighth International Conference on Language Resources and Evaluation. Istanbul, 2012
McDonald R, Nivre J, Quirmbach-Brundage Y, et al. Universal dependency annotation for multilingual parsing. In: Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics. Sofia, 2013. 92–97
Nivre J, Agić Ž, Aranzabe M J, et al. Universal Dependencies 1.2. 2015
Nivre J, Agić Ž, Ahrenberg L, et al. Universal Dependencies 2.1. 2017
Chen K J, Luo C C, Chang M C, et al. Sinica Treebank: Design criteria, representational issues and implementation. In: Abeillé A, ed. Treebanks: Building and Using Parsed Corpora, Chapter 13. Dordrecht: Kluwer, 2003
Qiang Z. Annotation scheme for Chinese treebank. J Chin Inform Process, 2004, 18: 1–8
Zhan W. The application of treebank to assist Chinese grammar instruction: A preliminary investigation. J Tech Chin Language Teach, 2012, 3: 16–29
Liu T, Ma J, Li S. Building a dependency treebank for improving Chinese parser. J Chin Language Comput, 2006, 16: 207–224
Che W, Li Z, Liu T. Chinese Dependency Treebank 1.0 ldc2012t05. Philadelphia: Linguistic Data Consortium, 2012
Qiu L, Zhang Y, Jin P, et al. Multi-view Chinese treebanking. In: Proceedings of the 25th International Conference on Computational Linguistics. Dublin, 2014. 257–268
Qiu L, Zhang Y, Zhang M. Dependency tree representations of predicate-argument structures. In: Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence. Phoenix, 2016
Buchholz S, Marsi E. CoNLL-X shared task on multilingual dependency parsing. In: Proceedings of the Tenth Conference on Computational Natural Language Learning. New York, 2006. 149–164
Nivre J, Hall J, Kübler S, et al. The CoNLL 2007 shared task on dependency parsing. In: Proceedings of the 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning. Prague, 2007. 915–932
Zeman D, Popel M, Straka M, et al. CoNLL 2017 shared task: Multilingual parsing from raw text to universal dependencies. In: Proceedings of the 21st Conference on Computational Natural Language Learning. Vancouver, 2017. 1–19
Zeman D, Hajic J, Popel M, et al. CoNLL 2018 shared task: Multilingual parsing from raw text to universal dependencies. In: Proceedings of the 22nd Conference on Computational Natural Language Learning. Brussels, 2018. 1–21
Petrov S, McDonald R. Overview of the 2012 shared task on parsing the web. In: Proceedings of First Workshop on Syntactic Analysis of Non-Canonical Language (SANCL). Montreal, 2012
Peng X, Li Z, Zhang M, et al. Overview of the NLPCC 2019 shared task: Cross-domain dependency parsing. In: Proceedings of Natural Language Processing and Chinese Computing — 8th CCF International Conference. Dunhuang, 2019. 760–771
Che W, Zhang M, Shao Y, et al. SemEval-2012 Task 5: Chinese semantic dependency parsing. In: Proceedings of the 6th International Workshop on Semantic Evaluation. Montreal, 2012. 378–384
Oepen S, Abend O, Hajic J, et al. MRP 2019: Cross-framework meaning representation parsing. In: Proceedings of the Shared Task on Cross-Framework Meaning Representation Parsing at the 2019 Conference on Natural Language Learning. Hong Kong, 2019. 1–27
This work was supported by the National Natural Science Foundation of China (Grant Nos. 61602160 and 61672211).
Zhang, M. A survey of syntactic-semantic parsing based on constituent and dependency structures. Sci. China Technol. Sci. 63, 1898–1920 (2020). https://doi.org/10.1007/s11431-020-1666-4