This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.