[1] 陈刚.“数字人文”与历史地理信息化研究[J].南京社会科学,2014(3):136-142.
[2] 夏翠娟.中国历史地理数据在图书馆数字人文项目中的开放应用研究[J].中国图书馆学报,2017,43(2):40-53.
[3] 常博林,万晨,李斌,等.基于词和实体标注的古籍数字人文知识库的构建与应用——以《资治通鉴·周秦汉纪》为例[J].图书情报工作,2021,65(22):134-142.
[4] 沈雪莹,欧石燕,卢彤彤.中国古代文人生平知识图谱构建与应用——以李白和杜甫为例[J].数字图书馆论坛,2023,19(8):1-14.
[5] BOL P K. GIS, prosopography and history[J]. Annals of GIS, 2012, 18(1): 3-15.
[6] 张琪,王东波,黄水清,等.史书多维知识重组与可视化研究——以《史记》为对象[J].情报学报,2022,41(2):131-141.
[7] 李晓敏,王昊,李跃艳,等.数字人文视域下中国行政区划地名演化知识库构建及分析研究[J].数据分析与知识发现,2022,6(11):139-153.
[8] 徐蒙蒙.地方志时空数据组织与应用[D].南京:南京师范大学,2015.
[9] 胡颖.家谱GIS中古今地名的时空关系研究[D].南京:南京师范大学,2008.
[10] 李章超,何琳,喻雪寒.基于事理图谱的典籍内容知识组织与应用——以《左传》为例[J].图书馆论坛,2024,44(4):125-137.
[11] 马晓雯,何琳,刘建斌,等.基于BiLSTM的古籍事件句触发词分类方法研究[J].农业图书情报学报,2021,33(9):27-36.
[12] 刘忠宝,党建飞,张志剑.《史记》历史事件自动抽取与事理图谱构建研究[J].图书情报工作,2020,64(11):116-124.
[13] 喻雪寒,何琳,徐健.基于RoBERTa-CRF的古文历史事件抽取方法研究[J].数据分析与知识发现,2021,5(7):26-35.
[14] 喻雪寒,何琳,王献琪.基于机器阅读理解的古文事件抽取研究[J].情报学报,2023,42(3):316-326.
[15] Zhang J, Wei Y, Zhu Y, et al. Self-adaptive prompt-tuning for event extraction in ancient Chinese literature[C]//2023 International Joint Conference on Neural Networks (IJCNN). 2023: 1-8.
[16] 王彦莹,王昊,朱惠,等.基于文本生成技术的历史古籍事件识别模型构建研究[J].图书情报工作,2023,67(3):119-130.
[17] 王东波,刘畅,朱子赫,等.SikuBERT与SikuRoBERTa:面向数字人文的《四库全书》预训练模型构建及应用研究[J].图书馆论坛,2022,42(6):31-43.
[18] Xunzi-LLM-of-Chinese-classics/XunziALLM[CP/OL]. Xunzi-LLM-of-Chinese-classics, 2024[2024-05-19]. https://github.com/Xunzi-LLM-of-Chinese-classics/XunziALLM.
[19] CHGIS. V6 time series prefecture points[DS/OL]. Harvard Dataverse, 2017[2024-08-20]. https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/WW1PD6.
[20] CHGIS. V6 time series prefecture polygons[DS/OL]. Harvard Dataverse, 2017[2024-08-27]. https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/I0Q7SM.
[21] Che W, Li Z, Liu T. LTP: a Chinese language technology platform[C]//Coling 2010: Demonstrations. Beijing, China: Coling 2010 Organizing Committee, 2010: 13-16.
[22] Wei C, Feng Z, Huang S, et al. CHED: a cross-historical dataset with a logical event schema for classical Chinese event detection[C]//Sun M, Qin B, Qiu X, et al. Chinese Computational Linguistics. Singapore: Springer Nature, 2023: 289-305.
[23] Gururangan S, Marasovic A, Swayamdipta S, et al. Don't stop pretraining: adapt language models to domains and tasks[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Online: Association for Computational Linguistics, 2020: 8342-8360.
[24] Lcclabblcu/CHED[CP/OL]. lcclabblcu, 2024[2024-08-15]. https://github.com/lcclabblcu/CHED.
[25] Wang D, Liu C, Zhao Z, et al. GujiBERT and GujiGPT: construction of intelligent information processing foundation language models for ancient texts[EB/OL]. (2023-07-11)[2024-08-16]. https://arxiv.org/abs/2307.05354v1.
[26] hiyouga/LLaMA-Factory: unify efficient fine-tuning of 100+ LLMs[EB/OL]. [2024-05-20]. https://github.com/hiyouga/LLaMA-Factory.
[27] Lin L, Wu M, Shen X, et al. Multi-model classical Chinese event trigger word recognition driven by incremental pre-training[C]//Lin H, Tan H, Li B. Proceedings of the 23rd Chinese National Conference on Computational Linguistics (Volume 3: Evaluations). Taiyuan, China: Chinese Information Processing Society of China, 2024: 178-190.
[28] 耿云冬,张逸勤,刘欢,等.面向数字人文的中国古代典籍词性自动标注研究——以SikuBERT预训练模型为例[J].图书馆论坛,2022,42(6):55-63.
[29] Reimers N, Gurevych I. Sentence-BERT: sentence embeddings using siamese BERT-networks[C]//Inui K, Jiang J, Ng V, et al. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). Hong Kong, China: Association for Computational Linguistics, 2019: 3982-3992.