Fine-Grained Meetup Events Extraction Through Context-Aware Event Argument Positioning and Recognition

International Journal of Computational Intelligence Systems, Volume 17, Issue 1, pp. 1–16 (December 2024)
Research Article | Open Access | Published online: 26 November 2024
DOI: 10.1007/s44196-024-00697-0 | ISSN: 1875-6883 | Publisher: Springer
Copyright: 2024 The Author(s)

Authors
Yuan-Hao Lin (luff543@gmail.com), Department of Computer Science and Information Engineering, National Central University, Taoyuan City, Taiwan
Chia-Hui Chang (chiahui@g.ncu.edu.tw), Department of Computer Science and Information Engineering, National Central University, Taoyuan City, Taiwan
Hsiu-Min Chuang (showmin@cycu.edu.tw), Information and Computer Engineering, Chung Yuan Christian University, Taoyuan, Taiwan

Abstract
Extracting meetup events from social network posts or webpage announcements is the core technology for building event search services on the Web. While event extraction in English achieves good performance under sentence-level evaluation [1], training data auto-labeled via distant supervision is not accurate enough for word-level event extraction because event titles are long [2]. Moreover, meetup event titles are more complex and diverse than the targets of trigger-word-based event extraction, so event title extraction usually performs worse than traditional named entity recognition (NER). In this paper, we propose a context-aware meetup event extraction (CAMEE) framework that applies a sentence-level event argument positioning model to locate event fields (title, venue, dates, etc.) within a message and then performs word-level extraction of the event title, venue, and date. Experimental results show that adding sentence-level event argument positioning as a filtering step improves word-level event field extraction from 0.726 to 0.743 macro-F1, outperforming large language models such as GPT-4-turbo (0.549 F1) and the state-of-the-art NER model SoftLexicon (0.733 F1). When evaluated on the main event extraction task, the proposed model achieves 0.784 macro-F1.

Keywords: Meetup event extraction; Context-aware event extraction; Event argument positioning; Event argument recognition
Subjects: Computational Intelligence; Artificial Intelligence; Mathematical Logic and Foundations; Control, Robotics, Mechatronics
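The abstract describes CAMEE as a two-stage pipeline: a sentence-level event argument positioning model first decides which sentences of a post are likely to carry each event field, and only the retained sentences are passed to a word-level recognizer that extracts title, venue, and date spans. The sketch below is a minimal illustration of that control flow only; `extract_event`, `position_sentences`, `tag_spans`, and the toy stand-ins are hypothetical names introduced here for illustration, not the paper's implementation (which this page does not include).

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

# Event fields handled by the word-level stage (per the abstract: title, venue, date).
FIELDS = ("title", "venue", "date")


@dataclass
class MeetupEvent:
    """Aggregated event record assembled from word-level predictions."""
    fields: Dict[str, List[str]] = field(default_factory=lambda: {f: [] for f in FIELDS})


def extract_event(
    message: str,
    position_sentences: Callable[[List[str]], List[Dict[str, bool]]],
    tag_spans: Callable[[str], List[Tuple[str, str]]],
) -> MeetupEvent:
    """Two-stage CAMEE-style flow: sentence-level event argument positioning
    filters the message, then a word-level tagger labels only retained sentences."""
    sentences = [s.strip() for s in message.splitlines() if s.strip()]

    # Stage 1: sentence-level positioning -- per-sentence, per-field relevance decisions.
    relevance = position_sentences(sentences)

    event = MeetupEvent()
    for sentence, relevant in zip(sentences, relevance):
        if not any(relevant.values()):
            continue  # the filtering step: drop sentences carrying no event argument
        # Stage 2: word-level recognition on the retained sentence.
        for span_text, label in tag_spans(sentence):
            if relevant.get(label, False):
                event.fields[label].append(span_text)
    return event


if __name__ == "__main__":
    # Toy rule-based stand-ins so the sketch runs end to end; they are NOT the paper's models.
    def toy_positioner(sentences: List[str]) -> List[Dict[str, bool]]:
        return [
            {
                "title": s.endswith("!"),
                "venue": "Hall" in s,
                "date": any(ch.isdigit() for ch in s),
            }
            for s in sentences
        ]

    def toy_tagger(sentence: str) -> List[Tuple[str, str]]:
        spans = []
        if sentence.endswith("!"):
            spans.append((sentence.rstrip("!"), "title"))
        if "Hall" in sentence:
            spans.append(("City Hall", "venue"))
        if "2024" in sentence:
            spans.append(("2024-12-01", "date"))
        return spans

    post = "Winter Maker Meetup!\nVenue: City Hall\nDate: 2024-12-01\nFree snacks provided."
    print(extract_event(post, toy_positioner, toy_tagger).fields)
```

Injecting the two stages as callables mirrors the filtering role the abstract attributes to sentence-level positioning: the word-level tagger never sees sentences judged to contain no event argument, which is the step reported to lift macro-F1 from 0.726 to 0.743.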
References
1. Wang, Q., Kanagal, B., Garg, V., Sivakumar, D.: Constructing a comprehensive events database from the web. In: Proceedings of the 28th ACM International Conference on Information and Knowledge Management (CIKM '19), pp. 229–238. Association for Computing Machinery, New York, NY, USA (2019). https://doi.org/10.1145/3357384.3357986
2. Lin, Y.-H., Chang, C.-H., Chuang, H.-M.: EventGo! Mining events through semi-supervised event title recognition and pattern-based venue/date coupling. J. Inf. Sci. Eng. 39(3), 655–670 (2023). https://doi.org/10.6688/JISE.20230339(2).0014
3. Foley, J., Bendersky, M., Josifovski, V.: Learning to extract local events from the web. In: Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '15), pp. 423–432. Association for Computing Machinery, New York, NY, USA (2015). https://doi.org/10.1145/2766462.2767739
4. Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of NAACL-HLT 2019, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). https://doi.org/10.18653/v1/N19-1423
5. Zhou, P., Shi, W., Tian, J., Qi, Z., Li, B., Hao, H., Xu, B.: Attention-based bidirectional long short-term memory networks for relation classification. In: Proceedings of ACL 2016, Volume 2 (Short Papers), pp. 207–212. Association for Computational Linguistics, Berlin, Germany (2016). https://doi.org/10.18653/v1/P16-2034
6. Lei, J., Zhang, Q., Wang, J., Luo, H.: BERT based hierarchical sequence classification for context-aware microblog sentiment analysis. In: International Conference on Neural Information Processing, pp. 376–386. Springer, Sydney, NSW, Australia (2019)
7. Du, X., Cardie, C.: Event extraction by answering (almost) natural questions. In: Proceedings of EMNLP 2020, pp. 671–683. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.emnlp-main.49
8. Sun, Y., Wang, S., Feng, S., Ding, S., Pang, C., Shang, J., Liu, J., Chen, X., Zhao, Y., Lu, Y., Liu, W., Wu, Z., Gong, W., Liang, J., Shang, Z., Sun, P., Liu, W., Xuan, O., Yu, D., Tian, H., Wu, H., Wang, H.: ERNIE 3.0: Large-scale knowledge enhanced pre-training for language understanding and generation. arXiv preprint arXiv:2107.02137 (2021)
9. Ma, R., Peng, M., Zhang, Q., Wei, Z., Huang, X.: Simplify the usage of lexicon in Chinese NER. In: Proceedings of ACL 2020, pp. 5951–5960. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.528
10. Dean-Hall, A., Clarke, C.L., Simone, N., Kamps, J., Thomas, P., Voorhees, E.: Overview of the TREC 2013 contextual suggestion track. In: Proceedings of the Twenty-Second Text REtrieval Conference (TREC 2013), NIST Special Publication 500-302. National Institute of Standards and Technology, Gaithersburg, Maryland, USA (2013). http://trec.nist.gov/pubs/trec22/papers/CONTEXT.OVERVIEW.pdf
11. Doddington, G., Mitchell, A., Przybocki, M., Ramshaw, L., Strassel, S., Weischedel, R.: The automatic content extraction (ACE) program – tasks, data, and evaluation. In: Proceedings of the Fourth International Conference on Language Resources and Evaluation (LREC '04). European Language Resources Association, Lisbon, Portugal (2004). http://www.lrec-conf.org/proceedings/lrec2004/pdf/5.pdf
12. Xiang, W., Wang, B.: A survey of event extraction from text. IEEE Access 7, 173111–173137 (2019). https://doi.org/10.1109/ACCESS.2019.2956831
13. Li, Q., Li, J., Sheng, J., Cui, S., Wu, J., Hei, Y., Peng, H., Guo, S., Wang, L., Beheshti, A., Yu, P.S.: A survey on deep learning event extraction: Approaches and applications. IEEE Trans. Neural Netw. Learn. Syst. 35, 6301–6321 (2021). https://doi.org/10.1109/TNNLS.2022.3213168
14. Chen, Y., Xu, L., Liu, K., Zeng, D., Zhao, J.: Event extraction via dynamic multi-pooling convolutional neural networks. In: Proceedings of ACL-IJCNLP 2015, Volume 1 (Long Papers), pp. 167–176. Association for Computational Linguistics, Beijing, China (2015). https://doi.org/10.3115/v1/P15-1017
15. Nguyen, T.H., Cho, K., Grishman, R.: Joint event extraction via recurrent neural networks. In: Proceedings of NAACL-HLT 2016, pp. 300–309. Association for Computational Linguistics, San Diego, California (2016). https://doi.org/10.18653/v1/N16-1034
16. Tian, C., Zhao, Y., Ren, L.: A Chinese event relation extraction model based on BERT. In: 2019 2nd International Conference on Artificial Intelligence and Big Data (ICAIBD), pp. 271–276. IEEE (2019)
17. Radford, A., Wu, J., Child, R., Luan, D., Amodei, D., Sutskever, I.: Language models are unsupervised multitask learners. OpenAI Blog 1(8), 9 (2019)
18. Brown, T., Mann, B., Ryder, N., Subbiah, M., Kaplan, J.D., Dhariwal, P., Neelakantan, A., Shyam, P., Sastry, G., Askell, A.: Language models are few-shot learners. Adv. Neural Inf. Process. Syst. 33, 1877–1901 (2020)
19. Raffel, C., Shazeer, N., Roberts, A., Lee, K., Narang, S., Matena, M., Zhou, Y., Li, W., Liu, P.J.: Exploring the limits of transfer learning with a unified text-to-text transformer. J. Mach. Learn. Res. 21(1), 5485–5551 (2020)
20. Lu, Y., Lin, H., Xu, J., Han, X., Tang, J., Li, A., Sun, L., Liao, M., Chen, S.: Text2Event: Controllable sequence-to-structure generation for end-to-end event extraction. arXiv preprint arXiv:2106.09232 (2021)
21. Wei, K., Sun, X., Zhang, Z., Zhang, J., Zhi, G., Jin, L.: Trigger is not sufficient: Exploiting frame-aware knowledge for implicit event argument extraction. In: Proceedings of ACL-IJCNLP 2021, Volume 1 (Long Papers), pp. 4672–4682. Association for Computational Linguistics, Online (2021). https://doi.org/10.18653/v1/2021.acl-long.360
22. Wei, K., Sun, X., Zhang, Z., Jin, L., Zhang, J., Lv, J., Guo, Z.: Implicit event argument extraction with argument-argument relational knowledge. IEEE Trans. Knowl. Data Eng. 35(9), 8865–8879 (2023). https://doi.org/10.1109/TKDE.2022.3218830
23. Ritter, A., Mausam, Etzioni, O., Clark, S.: Open domain event extraction from Twitter. In: Proceedings of the 18th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '12), pp. 1104–1112. Association for Computing Machinery, New York, NY, USA (2012). https://doi.org/10.1145/2339530.2339704
24. Mani, I., Wilson, G.: Robust temporal processing of news. In: Proceedings of the 38th Annual Meeting of the Association for Computational Linguistics, pp. 69–76. Association for Computational Linguistics, Hong Kong (2000). https://doi.org/10.3115/1075218.1075228
25. Chen, Y., Liu, S., Zhang, X., Liu, K., Zhao, J.: Automatically labeled data generation for large scale event extraction. In: Proceedings of ACL 2017, Volume 1 (Long Papers), pp. 409–419. Association for Computational Linguistics, Vancouver, Canada (2017). https://doi.org/10.18653/v1/P17-1038
26. Wei, K., Yang, Y., Jin, L., Sun, X., Zhang, Z., Zhang, J., Li, X., Zhang, L., Liu, J., Zhi, G.: Guide the many-to-one assignment: Open information extraction via IoU-aware optimal transport. In: Proceedings of ACL 2023, Volume 1 (Long Papers), pp. 4971–4984. Association for Computational Linguistics, Toronto, Canada (2023). https://doi.org/10.18653/v1/2023.acl-long.272
27. Abebe, M.A., Tekli, J., Getahun, F., Chbeir, R., Tekli, G.: Generic metadata representation framework for social-based event detection, description, and linkage. Knowl.-Based Syst. 188 (2020). https://doi.org/10.1016/j.knosys.2019.06.025
28. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, Ł., Polosukhin, I.: Attention is all you need. Adv. Neural Inf. Process. Syst. 30 (2017)
29. Chang, C.-H., Liao, Y.-C., Yeh, T.: Event source page discovery via policy-based RL with multi-task neural sequence model. In: International Conference on Web Information Systems Engineering, pp. 597–606. Springer (2022)
30. Lample, G., Ballesteros, M., Subramanian, S., Kawakami, K., Dyer, C.: Neural architectures for named entity recognition. In: Proceedings of NAACL-HLT 2016, pp. 260–270. Association for Computational Linguistics, San Diego, California (2016). https://doi.org/10.18653/v1/N16-1030
'&l=' + l : ''; j.async = true; j.src = 'https://www.googletagmanager.com/gtm.js?id=' + i + dl; f.parentNode.insertBefore(j, f); })(window, document, 'script', 'dataLayer', 'GTM-MRVXSHQ'); } } </script> <script> (function (w, d, t) { function cc() { var h = w.location.hostname; var e = d.createElement(t), s = d.getElementsByTagName(t)[0]; if (h.indexOf('springer.com') > -1 && h.indexOf('biomedcentral.com') === -1 && h.indexOf('springeropen.com') === -1) { if (h.indexOf('link-qa.springer.com') > -1 || h.indexOf('test-www.springer.com') > -1) { e.src = 'https://cmp.springer.com/production_live/en/consent-bundle-17-52.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-MRVXSHQ')"); } else { e.src = 'https://cmp.springer.com/production_live/en/consent-bundle-17-52.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-MRVXSHQ')"); } } else if (h.indexOf('biomedcentral.com') > -1) { if (h.indexOf('biomedcentral.com.qa') > -1) { e.src = 'https://cmp.biomedcentral.com/production_live/en/consent-bundle-15-36.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-MRVXSHQ')"); } else { e.src = 'https://cmp.biomedcentral.com/production_live/en/consent-bundle-15-36.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-MRVXSHQ')"); } } else if (h.indexOf('springeropen.com') > -1) { if (h.indexOf('springeropen.com.qa') > -1) { e.src = 'https://cmp.springernature.com/production_live/en/consent-bundle-16-34.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-MRVXSHQ')"); } else { e.src = 'https://cmp.springernature.com/production_live/en/consent-bundle-16-34.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-MRVXSHQ')"); } } else if (h.indexOf('springernature.com') > -1) { if (h.indexOf('beta-qa.springernature.com') > -1) { e.src = 'https://cmp.springernature.com/production_live/en/consent-bundle-49-43.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-NK22KLS')"); } else { e.src = 'https://cmp.springernature.com/production_live/en/consent-bundle-49-43.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-NK22KLS')"); } } else { e.src = '/oscar-static/js/cookie-consent-es5-bundle-cb57c2c98a.js'; e.setAttribute('data-consent', h); } s.insertAdjacentElement('afterend', e); } cc(); })(window, document, 'script'); </script> <link rel="canonical" href="https://link.springer.com/article/10.1007/s44196-024-00697-0"/> <script type="application/ld+json">{"mainEntity":{"headline":"Fine-Grained Meetup Events Extraction Through Context-Aware Event Argument Positioning and Recognition","description":"Extracting meetup events from social network posts or webpage announcements is the core technology to build event search services on the Web. While event extraction in English achieves good performance in sentence-level evaluation [1], the quality of auto-labeled training data via distant supervision is not good enough for word-level event extraction due to long event titles [2]. Additionally, meetup event titles are more complex and diverse than trigger-word-based event extraction. Therefore, the performance of event title extraction is usually worse than that of traditional named entity recognition (NER). 
0-2.667h6.666Zm4-10.667a3.333 3.333 0 0 1 3.334 3.334v6.666a1.333 1.333 0 1 1-2.667 0V4.667A.667.667 0 0 0 23.333 4H8.667A.667.667 0 0 0 8 4.667v6.666a1.333 1.333 0 1 1-2.667 0V4.667a3.333 3.333 0 0 1 3.334-3.334h14.666Zm-4 5.334a1.333 1.333 0 0 1 0 2.666h-6.666a1.333 1.333 0 1 1 0-2.666h6.666Z"/></symbol><symbol id="icon-eds-arrow-up" viewBox="0 0 24 24"><path fill-rule="evenodd" d="m13.002 7.408 4.88 4.88a.99.99 0 0 0 1.32.08l.09-.08c.39-.39.39-1.03 0-1.42l-6.58-6.58a1.01 1.01 0 0 0-1.42 0l-6.58 6.58a1 1 0 0 0-.09 1.32l.08.1a1 1 0 0 0 1.42-.01l4.88-4.87v11.59a.99.99 0 0 0 .88.99l.12.01c.55 0 1-.45 1-1V7.408z" class="layer"/></symbol><symbol id="icon-eds-checklist" viewBox="0 0 32 32"><path d="M19.2 1.333a3.468 3.468 0 0 1 3.381 2.699L24.667 4C26.515 4 28 5.52 28 7.38v19.906c0 1.86-1.485 3.38-3.333 3.38H7.333c-1.848 0-3.333-1.52-3.333-3.38V7.38C4 5.52 5.485 4 7.333 4h2.093A3.468 3.468 0 0 1 12.8 1.333h6.4ZM9.426 6.667H7.333c-.36 0-.666.312-.666.713v19.906c0 .401.305.714.666.714h17.334c.36 0 .666-.313.666-.714V7.38c0-.4-.305-.713-.646-.714l-2.121.033A3.468 3.468 0 0 1 19.2 9.333h-6.4a3.468 3.468 0 0 1-3.374-2.666Zm12.715 5.606c.586.446.7 1.283.253 1.868l-7.111 9.334a1.333 1.333 0 0 1-1.792.306l-3.556-2.333a1.333 1.333 0 1 1 1.463-2.23l2.517 1.651 6.358-8.344a1.333 1.333 0 0 1 1.868-.252ZM19.2 4h-6.4a.8.8 0 0 0-.8.8v1.067a.8.8 0 0 0 .8.8h6.4a.8.8 0 0 0 .8-.8V4.8a.8.8 0 0 0-.8-.8Z"/></symbol><symbol id="icon-eds-citation" viewBox="0 0 36 36"><path d="M23.25 1.5a1.5 1.5 0 0 1 1.06.44l8.25 8.25a1.5 1.5 0 0 1 .44 1.06v19.5c0 2.105-1.645 3.75-3.75 3.75H18a1.5 1.5 0 0 1 0-3h11.25c.448 0 .75-.302.75-.75V11.873L22.628 4.5H8.31a.811.811 0 0 0-.8.68l-.011.13V16.5a1.5 1.5 0 0 1-3 0V5.31A3.81 3.81 0 0 1 8.31 1.5h14.94ZM8.223 20.358a.984.984 0 0 1-.192 1.378l-.048.034c-.54.36-.942.676-1.206.951-.59.614-.885 1.395-.885 2.343.115-.028.288-.042.518-.042.662 0 1.26.237 1.791.711.533.474.799 1.074.799 1.799 0 .753-.259 1.352-.777 1.799-.518.446-1.151.669-1.9.669-1.006 0-1.812-.293-2.417-.878C3.302 28.536 3 27.657 3 26.486c0-1.115.165-2.085.496-2.907.331-.823.734-1.513 1.209-2.071.475-.558.971-.997 1.49-1.318a6.01 6.01 0 0 1 .347-.2 1.321 1.321 0 0 1 1.681.368Zm7.5 0a.984.984 0 0 1-.192 1.378l-.048.034c-.54.36-.942.676-1.206.951-.59.614-.885 1.395-.885 2.343.115-.028.288-.042.518-.042.662 0 1.26.237 1.791.711.533.474.799 1.074.799 1.799 0 .753-.259 1.352-.777 1.799-.518.446-1.151.669-1.9.669-1.006 0-1.812-.293-2.417-.878-.604-.586-.906-1.465-.906-2.636 0-1.115.165-2.085.496-2.907.331-.823.734-1.513 1.209-2.071.475-.558.971-.997 1.49-1.318a6.01 6.01 0 0 1 .347-.2 1.321 1.321 0 0 1 1.681.368Z"/></symbol><symbol id="icon-eds-i-access-indicator" viewBox="0 0 16 16"><circle cx="4.5" cy="11.5" r="3.5" style="fill:currentColor"/><path fill-rule="evenodd" d="M4 3v3a1 1 0 0 1-2 0V2.923C2 1.875 2.84 1 3.909 1h5.909a1 1 0 0 1 .713.298l3.181 3.231a1 1 0 0 1 .288.702v7.846c0 .505-.197.993-.554 1.354a1.902 1.902 0 0 1-1.355.569H10a1 1 0 1 1 0-2h2V5.64L9.4 3H4Z" clip-rule="evenodd" style="fill:#222"/></symbol><symbol id="icon-eds-i-github-medium" viewBox="0 0 24 24"><path d="M 11.964844 0 C 5.347656 0 0 5.269531 0 11.792969 C 0 17.003906 3.425781 21.417969 8.179688 22.976562 C 8.773438 23.09375 8.992188 22.722656 8.992188 22.410156 C 8.992188 22.136719 8.972656 21.203125 8.972656 20.226562 C 5.644531 20.929688 4.953125 18.820312 4.953125 18.820312 C 4.417969 17.453125 3.625 17.101562 3.625 17.101562 C 2.535156 16.378906 3.703125 16.378906 3.703125 16.378906 C 4.914062 16.457031 5.546875 17.589844 5.546875 17.589844 C 
6.617188 19.386719 8.339844 18.878906 9.03125 18.566406 C 9.132812 17.804688 9.449219 17.277344 9.785156 16.984375 C 7.132812 16.710938 4.339844 15.695312 4.339844 11.167969 C 4.339844 9.878906 4.8125 8.824219 5.566406 8.003906 C 5.445312 7.710938 5.03125 6.5 5.683594 4.878906 C 5.683594 4.878906 6.695312 4.566406 8.972656 6.089844 C 9.949219 5.832031 10.953125 5.703125 11.964844 5.699219 C 12.972656 5.699219 14.003906 5.835938 14.957031 6.089844 C 17.234375 4.566406 18.242188 4.878906 18.242188 4.878906 C 18.898438 6.5 18.480469 7.710938 18.363281 8.003906 C 19.136719 8.824219 19.589844 9.878906 19.589844 11.167969 C 19.589844 15.695312 16.796875 16.691406 14.125 16.984375 C 14.558594 17.355469 14.933594 18.058594 14.933594 19.171875 C 14.933594 20.753906 14.914062 22.019531 14.914062 22.410156 C 14.914062 22.722656 15.132812 23.09375 15.726562 22.976562 C 20.480469 21.414062 23.910156 17.003906 23.910156 11.792969 C 23.929688 5.269531 18.558594 0 11.964844 0 Z M 11.964844 0 "/></symbol><symbol id="icon-eds-i-limited-access" viewBox="0 0 16 16"><path fill-rule="evenodd" d="M4 3v3a1 1 0 0 1-2 0V2.923C2 1.875 2.84 1 3.909 1h5.909a1 1 0 0 1 .713.298l3.181 3.231a1 1 0 0 1 .288.702V6a1 1 0 1 1-2 0v-.36L9.4 3H4ZM3 8a1 1 0 0 1 1 1v1a1 1 0 1 1-2 0V9a1 1 0 0 1 1-1Zm10 0a1 1 0 0 1 1 1v1a1 1 0 1 1-2 0V9a1 1 0 0 1 1-1Zm-3.5 6a1 1 0 0 1-1 1h-1a1 1 0 1 1 0-2h1a1 1 0 0 1 1 1Zm2.441-1a1 1 0 0 1 2 0c0 .73-.246 1.306-.706 1.664a1.61 1.61 0 0 1-.876.334l-.032.002H11.5a1 1 0 1 1 0-2h.441ZM4 13a1 1 0 0 0-2 0c0 .73.247 1.306.706 1.664a1.609 1.609 0 0 0 .876.334l.032.002H4.5a1 1 0 1 0 0-2H4Z" clip-rule="evenodd"/></symbol><symbol id="icon-eds-i-subjects-medium" viewBox="0 0 24 24"><g id="icon-subjects-copy" stroke="none" stroke-width="1" fill-rule="evenodd"><path d="M13.3846154,2 C14.7015971,2 15.7692308,3.06762994 15.7692308,4.38461538 L15.7692308,7.15384615 C15.7692308,8.47082629 14.7015955,9.53846154 13.3846154,9.53846154 L13.1038388,9.53925278 C13.2061091,9.85347965 13.3815528,10.1423885 13.6195822,10.3804178 C13.9722182,10.7330539 14.436524,10.9483278 14.9293854,10.9918129 L15.1153846,11 C16.2068332,11 17.2535347,11.433562 18.0254647,12.2054189 C18.6411944,12.8212361 19.0416785,13.6120766 19.1784166,14.4609738 L19.6153846,14.4615385 C20.932386,14.4615385 22,15.5291672 22,16.8461538 L22,19.6153846 C22,20.9323924 20.9323924,22 19.6153846,22 L16.8461538,22 C15.5291672,22 14.4615385,20.932386 14.4615385,19.6153846 L14.4615385,16.8461538 C14.4615385,15.5291737 15.5291737,14.4615385 16.8461538,14.4615385 L17.126925,14.460779 C17.0246537,14.1465537 16.8492179,13.857633 16.6112344,13.6196157 C16.2144418,13.2228606 15.6764136,13 15.1153846,13 C14.0239122,13 12.9771569,12.5664197 12.2053686,11.7946314 C12.1335167,11.7227795 12.0645962,11.6485444 11.9986839,11.5721119 C11.9354038,11.6485444 11.8664833,11.7227795 11.7946314,11.7946314 C11.0228431,12.5664197 9.97608778,13 8.88461538,13 C8.323576,13 7.78552852,13.2228666 7.38881294,13.6195822 C7.15078359,13.8576115 6.97533988,14.1465203 6.8730696,14.4607472 L7.15384615,14.4615385 C8.47082629,14.4615385 9.53846154,15.5291737 9.53846154,16.8461538 L9.53846154,19.6153846 C9.53846154,20.932386 8.47083276,22 7.15384615,22 L4.38461538,22 C3.06762347,22 2,20.9323876 2,19.6153846 L2,16.8461538 C2,15.5291721 3.06762994,14.4615385 4.38461538,14.4615385 L4.8215823,14.4609378 C4.95831893,13.6120029 5.3588057,12.8211623 5.97459937,12.2053686 C6.69125996,11.488708 7.64500941,11.0636656 8.6514968,11.0066017 L8.88461538,11 C9.44565477,11 9.98370225,10.7771334 10.3804178,10.3804178 
C10.6184472,10.1423885 10.7938909,9.85347965 10.8961612,9.53925278 L10.6153846,9.53846154 C9.29840448,9.53846154 8.23076923,8.47082629 8.23076923,7.15384615 L8.23076923,4.38461538 C8.23076923,3.06762994 9.29840286,2 10.6153846,2 L13.3846154,2 Z M7.15384615,16.4615385 L4.38461538,16.4615385 C4.17220099,16.4615385 4,16.63374 4,16.8461538 L4,19.6153846 C4,19.8278134 4.17218833,20 4.38461538,20 L7.15384615,20 C7.36626945,20 7.53846154,19.8278103 7.53846154,19.6153846 L7.53846154,16.8461538 C7.53846154,16.6337432 7.36625679,16.4615385 7.15384615,16.4615385 Z M19.6153846,16.4615385 L16.8461538,16.4615385 C16.6337432,16.4615385 16.4615385,16.6337432 16.4615385,16.8461538 L16.4615385,19.6153846 C16.4615385,19.8278103 16.6337306,20 16.8461538,20 L19.6153846,20 C19.8278229,20 20,19.8278229 20,19.6153846 L20,16.8461538 C20,16.6337306 19.8278103,16.4615385 19.6153846,16.4615385 Z M13.3846154,4 L10.6153846,4 C10.4029708,4 10.2307692,4.17220099 10.2307692,4.38461538 L10.2307692,7.15384615 C10.2307692,7.36625679 10.402974,7.53846154 10.6153846,7.53846154 L13.3846154,7.53846154 C13.597026,7.53846154 13.7692308,7.36625679 13.7692308,7.15384615 L13.7692308,4.38461538 C13.7692308,4.17220099 13.5970292,4 13.3846154,4 Z" id="Shape" fill-rule="nonzero"/></g></symbol><symbol id="icon-eds-small-arrow-left" viewBox="0 0 16 17"><path stroke="currentColor" stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M14 8.092H2m0 0L8 2M2 8.092l6 6.035"/></symbol><symbol id="icon-eds-small-arrow-right" viewBox="0 0 16 16"><g fill-rule="evenodd" stroke="currentColor" stroke-linecap="round" stroke-linejoin="round" stroke-width="2"><path d="M2 8.092h12M8 2l6 6.092M8 14.127l6-6.035"/></g></symbol><symbol id="icon-orcid-logo" viewBox="0 0 40 40"><path fill-rule="evenodd" d="M12.281 10.453c.875 0 1.578-.719 1.578-1.578 0-.86-.703-1.578-1.578-1.578-.875 0-1.578.703-1.578 1.578 0 .86.703 1.578 1.578 1.578Zm-1.203 18.641h2.406V12.359h-2.406v16.735Z"/><path fill-rule="evenodd" d="M17.016 12.36h6.5c6.187 0 8.906 4.421 8.906 8.374 0 4.297-3.36 8.375-8.875 8.375h-6.531V12.36Zm6.234 14.578h-3.828V14.53h3.703c4.688 0 6.828 2.844 6.828 6.203 0 2.063-1.25 6.203-6.703 6.203Z" clip-rule="evenodd"/></symbol></svg> </div> <a class="c-skip-link" href="#main">Skip to main content</a> <header class="eds-c-header" data-eds-c-header> <div class="eds-c-header__container" data-eds-c-header-expander-anchor> <div class="eds-c-header__brand"> <a href="https://link.springer.com" data-test=springerlink-logo data-track="click_imprint_logo" data-track-context="unified header" data-track-action="click logo link" data-track-category="unified header" data-track-label="link" > <img src="/oscar-static/images/darwin/header/img/logo-springer-nature-link-3149409f62.svg" alt="Springer Nature Link"> </a> </div> <a class="c-header__link eds-c-header__link" id="identity-account-widget" href='https://idp.springer.com/auth/personal/springernature?redirect_uri=https://link.springer.com/article/10.1007/s44196-024-00697-0?'><span class="eds-c-header__widget-fragment-title">Log in</span></a> </div> <nav class="eds-c-header__nav" aria-label="header navigation"> <div class="eds-c-header__nav-container"> <div class="eds-c-header__item eds-c-header__item--menu"> <a href="#eds-c-header-nav" class="eds-c-header__link" data-eds-c-header-expander> <svg class="eds-c-header__icon" width="24" height="24" aria-hidden="true" focusable="false"> <use xlink:href="#icon-eds-i-menu-medium"></use> </svg><span>Menu</span> </a> </div> <div class="eds-c-header__item 
eds-c-header__item--inline-links"> <a class="eds-c-header__link" href="https://link.springer.com/journals/" data-track="nav_find_a_journal" data-track-context="unified header" data-track-action="click find a journal" data-track-category="unified header" data-track-label="link" > Find a journal </a> <a class="eds-c-header__link" href="https://www.springernature.com/gp/authors" data-track="nav_how_to_publish" data-track-context="unified header" data-track-action="click publish with us link" data-track-category="unified header" data-track-label="link" > Publish with us </a> <a class="eds-c-header__link" href="https://link.springernature.com/home/" data-track="nav_track_your_research" data-track-context="unified header" data-track-action="click track your research" data-track-category="unified header" data-track-label="link" > Track your research </a> </div> <div class="eds-c-header__link-container"> <div class="eds-c-header__item eds-c-header__item--divider"> <a href="#eds-c-header-popup-search" class="eds-c-header__link" data-eds-c-header-expander data-eds-c-header-test-search-btn> <svg class="eds-c-header__icon" width="24" height="24" aria-hidden="true" focusable="false"> <use xlink:href="#icon-eds-i-search-medium"></use> </svg><span>Search</span> </a> </div> <div id="ecommerce-header-cart-icon-link" class="eds-c-header__item ecommerce-cart" style="display:inline-block"> <a class="eds-c-header__link" href="https://order.springer.com/public/cart" style="appearance:none;border:none;background:none;color:inherit;position:relative"> <svg id="eds-i-cart" class="eds-c-header__icon" xmlns="http://www.w3.org/2000/svg" height="24" width="24" viewBox="0 0 24 24" aria-hidden="true" focusable="false"> <path fill="currentColor" fill-rule="nonzero" d="M2 1a1 1 0 0 0 0 2l1.659.001 2.257 12.808a2.599 2.599 0 0 0 2.435 2.185l.167.004 9.976-.001a2.613 2.613 0 0 0 2.61-1.748l.03-.106 1.755-7.82.032-.107a2.546 2.546 0 0 0-.311-1.986l-.108-.157a2.604 2.604 0 0 0-2.197-1.076L6.042 5l-.56-3.17a1 1 0 0 0-.864-.82l-.12-.007L2.001 1ZM20.35 6.996a.63.63 0 0 1 .54.26.55.55 0 0 1 .082.505l-.028.1L19.2 15.63l-.022.05c-.094.177-.282.299-.526.317l-10.145.002a.61.61 0 0 1-.618-.515L6.394 6.999l13.955-.003ZM18 19a2 2 0 1 0 0 4 2 2 0 0 0 0-4ZM8 19a2 2 0 1 0 0 4 2 2 0 0 0 0-4Z"></path> </svg><span>Cart</span><span class="cart-info" style="display:none;position:absolute;top:10px;right:45px;background-color:#C65301;color:#fff;width:18px;height:18px;font-size:11px;border-radius:50%;line-height:17.5px;text-align:center"></span></a> <script>(function () { var exports = {}; if (window.fetch) { "use strict"; Object.defineProperty(exports, "__esModule", { value: true }); exports.headerWidgetClientInit = void 0; var headerWidgetClientInit = function (getCartInfo) { document.body.addEventListener("updatedCart", function () { updateCartIcon(); }, false); return updateCartIcon(); function updateCartIcon() { return getCartInfo() .then(function (res) { return res.json(); }) .then(refreshCartState) .catch(function (_) { }); } function refreshCartState(json) { var indicator = document.querySelector("#ecommerce-header-cart-icon-link .cart-info"); /* istanbul ignore else */ if (indicator && json.itemCount) { indicator.style.display = 'block'; indicator.textContent = json.itemCount > 9 ? '9+' : json.itemCount.toString(); var moreThanOneItem = json.itemCount > 1; indicator.setAttribute('title', "there ".concat(moreThanOneItem ? "are" : "is", " ").concat(json.itemCount, " item").concat(moreThanOneItem ? 
"s" : "", " in your cart")); } return json; } }; exports.headerWidgetClientInit = headerWidgetClientInit; headerWidgetClientInit( function () { return window.fetch("https://cart.springer.com/cart-info", { credentials: "include", headers: { Accept: "application/json" } }) } ) }})()</script> </div> </div> </div> </nav> </header> <article lang="en" id="main" class="app-masthead__colour-21"> <section class="app-masthead " aria-label="article masthead"> <div class="app-masthead__container"> <div class="app-article-masthead u-sans-serif js-context-bar-sticky-point-masthead" data-track-component="article" data-test="masthead-component"> <div class="app-article-masthead__info"> <nav aria-label="breadcrumbs" data-test="breadcrumbs"> <ol class="c-breadcrumbs c-breadcrumbs--contrast" itemscope itemtype="https://schema.org/BreadcrumbList"> <li class="c-breadcrumbs__item" id="breadcrumb0" itemprop="itemListElement" itemscope="" itemtype="https://schema.org/ListItem"> <a href="/" class="c-breadcrumbs__link" itemprop="item" data-track="click_breadcrumb" data-track-context="article page" data-track-category="article" data-track-action="breadcrumbs" data-track-label="breadcrumb1"><span itemprop="name">Home</span></a><meta itemprop="position" content="1"> <svg class="c-breadcrumbs__chevron" role="img" aria-hidden="true" focusable="false" width="10" height="10" viewBox="0 0 10 10"> <path d="m5.96738168 4.70639573 2.39518594-2.41447274c.37913917-.38219212.98637524-.38972225 1.35419292-.01894278.37750606.38054586.37784436.99719163-.00013556 1.37821513l-4.03074001 4.06319683c-.37758093.38062133-.98937525.38100976-1.367372-.00003075l-4.03091981-4.06337806c-.37759778-.38063832-.38381821-.99150444-.01600053-1.3622839.37750607-.38054587.98772445-.38240057 1.37006824.00302197l2.39538588 2.4146743.96295325.98624457z" fill-rule="evenodd" transform="matrix(0 -1 1 0 0 10)"/> </svg> </li> <li class="c-breadcrumbs__item" id="breadcrumb1" itemprop="itemListElement" itemscope="" itemtype="https://schema.org/ListItem"> <a href="/journal/44196" class="c-breadcrumbs__link" itemprop="item" data-track="click_breadcrumb" data-track-context="article page" data-track-category="article" data-track-action="breadcrumbs" data-track-label="breadcrumb2"><span itemprop="name">International Journal of Computational Intelligence Systems</span></a><meta itemprop="position" content="2"> <svg class="c-breadcrumbs__chevron" role="img" aria-hidden="true" focusable="false" width="10" height="10" viewBox="0 0 10 10"> <path d="m5.96738168 4.70639573 2.39518594-2.41447274c.37913917-.38219212.98637524-.38972225 1.35419292-.01894278.37750606.38054586.37784436.99719163-.00013556 1.37821513l-4.03074001 4.06319683c-.37758093.38062133-.98937525.38100976-1.367372-.00003075l-4.03091981-4.06337806c-.37759778-.38063832-.38381821-.99150444-.01600053-1.3622839.37750607-.38054587.98772445-.38240057 1.37006824.00302197l2.39538588 2.4146743.96295325.98624457z" fill-rule="evenodd" transform="matrix(0 -1 1 0 0 10)"/> </svg> </li> <li class="c-breadcrumbs__item" id="breadcrumb2" itemprop="itemListElement" itemscope="" itemtype="https://schema.org/ListItem"> <span itemprop="name">Article</span><meta itemprop="position" content="3"> </li> </ol> </nav> <h1 class="c-article-title" data-test="article-title" data-article-title="">Fine-Grained Meetup Events Extraction Through Context-Aware Event Argument Positioning and Recognition</h1> <ul class="c-article-identifiers"> <li class="c-article-identifiers__item" data-test="article-category">Research Article</li> <li 
class="c-article-identifiers__item"> <a href="https://www.springernature.com/gp/open-research/about/the-fundamentals-of-open-access-and-open-research" data-track="click" data-track-action="open access" data-track-label="link" class="u-color-open-access" data-test="open-access">Open access</a> </li> <li class="c-article-identifiers__item"> Published: <time datetime="2024-11-26">26 November 2024</time> </li> </ul> <ul class="c-article-identifiers c-article-identifiers--cite-list"> <li class="c-article-identifiers__item"> <span data-test="journal-volume">Volume 17</span>, article number <span data-test="article-number">296</span>, (<span data-test="article-publication-year">2024</span>) </li> <li class="c-article-identifiers__item c-article-identifiers__item--cite"> <a href="#citeas" data-track="click" data-track-action="cite this article" data-track-category="article body" data-track-label="link">Cite this article</a> </li> </ul> <div class="app-article-masthead__buttons" data-test="download-article-link-wrapper" data-track-context="masthead"> <div class="c-pdf-container"> <div class="c-pdf-download u-clear-both u-mb-16"> <a href="/content/pdf/10.1007/s44196-024-00697-0.pdf" class="u-button u-button--full-width u-button--primary u-justify-content-space-between c-pdf-download__link" data-article-pdf="true" data-readcube-pdf-url="true" data-test="pdf-link" data-draft-ignore="true" data-track="content_download" data-track-type="article pdf download" data-track-action="download pdf" data-track-label="button" data-track-external download> <span class="c-pdf-download__text">Download PDF</span> <svg aria-hidden="true" focusable="false" width="16" height="16" class="u-icon"><use xlink:href="#icon-eds-i-download-medium"/></svg> </a> </div> </div> <p class="app-article-masthead__access"> <svg width="16" height="16" focusable="false" role="img" aria-hidden="true"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-check-filled-medium"></use></svg> You have full access to this <a href="https://www.springernature.com/gp/open-research/about/the-fundamentals-of-open-access-and-open-research" data-track="click" data-track-action="open access" data-track-label="link">open access</a> article</p> </div> </div> <div class="app-article-masthead__brand"> <a href="/journal/44196" class="app-article-masthead__journal-link" data-track="click_journal_home" data-track-action="journal homepage" data-track-context="article page" data-track-label="link"> <picture> <source type="image/webp" media="(min-width: 768px)" width="120" height="159" srcset="https://media.springernature.com/w120/springer-static/cover-hires/journal/44196?as=webp, https://media.springernature.com/w316/springer-static/cover-hires/journal/44196?as=webp 2x"> <img width="72" height="95" src="https://media.springernature.com/w72/springer-static/cover-hires/journal/44196?as=webp" srcset="https://media.springernature.com/w144/springer-static/cover-hires/journal/44196?as=webp 2x" alt=""> </picture> <span class="app-article-masthead__journal-title">International Journal of Computational Intelligence Systems</span> </a> <a href="https://www.springer.com/journal/44196/aims-and-scope" class="app-article-masthead__submission-link" data-track="click_aims_and_scope" data-track-action="aims and scope" data-track-context="article page" data-track-label="link"> Aims and scope <svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" 
xlink:href="#icon-eds-i-arrow-right-medium"></use></svg> </a> <a href="https://submission.springernature.com/new-submission/44196/3" class="app-article-masthead__submission-link" data-track="click_submit_manuscript" data-track-context="article masthead on springerlink article page" data-track-action="submit manuscript" data-track-label="link"> Submit manuscript <svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-arrow-right-medium"></use></svg> </a> </div> </div> </div> </section> <div class="c-article-main u-container u-mt-24 u-mb-32 l-with-sidebar" id="main-content" data-component="article-container"> <main class="u-serif js-main-column" data-track-component="article body"> <div class="c-context-bar u-hide" data-test="context-bar" data-context-bar aria-hidden="true"> <div class="c-context-bar__container u-container"> <div class="c-context-bar__title"> Fine-Grained Meetup Events Extraction Through Context-Aware Event Argument Positioning and Recognition </div> <div data-test="inCoD" data-track-context="sticky banner"> <div class="c-pdf-container"> <div class="c-pdf-download u-clear-both u-mb-16"> <a href="/content/pdf/10.1007/s44196-024-00697-0.pdf" class="u-button u-button--full-width u-button--primary u-justify-content-space-between c-pdf-download__link" data-article-pdf="true" data-readcube-pdf-url="true" data-test="pdf-link" data-draft-ignore="true" data-track="content_download" data-track-type="article pdf download" data-track-action="download pdf" data-track-label="button" data-track-external download> <span class="c-pdf-download__text">Download PDF</span> <svg aria-hidden="true" focusable="false" width="16" height="16" class="u-icon"><use xlink:href="#icon-eds-i-download-medium"/></svg> </a> </div> </div> </div> </div> </div> <div class="c-article-header"> <header> <ul class="c-article-author-list c-article-author-list--short" data-test="authors-list" data-component-authors-activator="authors-list"><li class="c-article-author-list__item"><a data-test="author-name" data-track="click" data-track-action="open author" data-track-label="link" href="#auth-Yuan_Hao-Lin-Aff1" data-author-popup="auth-Yuan_Hao-Lin-Aff1" data-author-search="Lin, Yuan-Hao">Yuan-Hao Lin</a><sup class="u-js-hide"><a href="#Aff1">1</a></sup>, </li><li class="c-article-author-list__item"><a data-test="author-name" data-track="click" data-track-action="open author" data-track-label="link" href="#auth-Chia_Hui-Chang-Aff1" data-author-popup="auth-Chia_Hui-Chang-Aff1" data-author-search="Chang, Chia-Hui" data-corresp-id="c1">Chia-Hui Chang<svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-mail-medium"></use></svg></a><sup class="u-js-hide"><a href="#Aff1">1</a></sup> &amp; </li><li class="c-article-author-list__item"><a data-test="author-name" data-track="click" data-track-action="open author" data-track-label="link" href="#auth-Hsiu_Min-Chuang-Aff2" data-author-popup="auth-Hsiu_Min-Chuang-Aff2" data-author-search="Chuang, Hsiu-Min">Hsiu-Min Chuang</a><sup class="u-js-hide"><a href="#Aff2">2</a></sup> </li></ul> <div data-test="article-metrics"> <ul class="app-article-metrics-bar u-list-reset"> </ul> </div> <div class="u-mt-32"> </div> </header> </div> <div data-article-body="true" data-track-component="article body" class="c-article-body"> <section aria-labelledby="Abs1" data-title="Abstract" 
lang="en"><div class="c-article-section" id="Abs1-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="Abs1">Abstract</h2><div class="c-article-section__content" id="Abs1-content"><p>Extracting meetup events from social network posts or webpage announcements is the core technology to build event search services on the Web. While event extraction in English achieves good performance in sentence-level evaluation [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 1" title="Wang, Q., Kanagal, B., Garg, V., Sivakumar, D.: Constructing a comprehensive events database from the web. In: Proceedings of the 28th ACM International Conference on Information and Knowledge Management. CIKM ’19, pp. 229–238. Association for Computing Machinery, New York, NY, USA (2019). &#xA; https://doi.org/10.1145/3357384.3357986&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR1" id="ref-link-section-d43096310e340">1</a>], the quality of auto-labeled training data via distant supervision is not good enough for word-level event extraction due to long event titles [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 2" title="Lin, Y.-H., Chang, C.-H., Chuang, H.-M.: Eventgo! mining events through semi-supervised event title recognition and pattern-based venue/date coupling. J. Inf. Sci. Eng. 39(3), 655–670 (2023). &#xA; https://doi.org/10.6688/JISE.20230339(2).0014&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR2" id="ref-link-section-d43096310e343">2</a>]. Additionally, meetup event titles are more complex and diverse than trigger-word-based event extraction. Therefore, the performance of event title extraction is usually worse than that of traditional named entity recognition (NER). In this paper, we propose a context-aware meetup event extraction (CAMEE) framework that incorporates a sentence-level event argument positioning model to locate event fields (i.e., title, venue, dates, etc.) within a message and then perform word-level event title, venue, and date extraction. Experimental results show that adding sentence-level event argument positioning as a filtering step improves the word-level event field extraction performance from 0.726 to 0.743 macro-F1, outperforming large language models like GPT-4-turbo (with 0.549 F1) and SOTA NER model SoftLexicon (with 0.733 F1). 
1 Introduction

Looking for local events to attend is a common need for most people when traveling or exploring a city. Extracting meetup events from the Web is therefore crucial to building a meetup event search service. A meetup event is characterized by its title, hosting organization, date, location, target participants, registration fee, and so on. This study aims to design an efficient model for extracting four key elements of meetup events: title, location, start date, and end date. As shown in Fig. 1, we highlight the title in blue, the start and end dates in orange and purple, and the venue in green within a Facebook fan page post.

Fig. 1: An event post with the title boxed in blue, start/end dates in orange/purple, venue in green, and a non-target entity underlined (top: in Chinese; bottom: in English)

Since a webpage or a social network post may mention multiple meetup events, we define a target event as an event that includes a title and at least one other piece of event information, such as the event location, start date, or end date. Only events that meet this criterion are labeled as target events for extraction. In Fig. 1, the underlined text is another event name, but it is not a target event that we aim to extract. When multiple target events co-occur on one page, associating the recognized locations and dates with the correct event title is challenging. Therefore, we focus on the extraction of target events.
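To make the target-event criterion concrete, the following minimal sketch (our own illustration, not code from the paper; the EventCandidate class and its field names are assumptions) checks whether a candidate qualifies:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EventCandidate:
    title: Optional[str] = None       # event title span
    venue: Optional[str] = None       # event location span
    start_date: Optional[str] = None  # start-date span
    end_date: Optional[str] = None    # end-date span

def is_target_event(e: EventCandidate) -> bool:
    """A candidate is a target event only if it has a title plus at least one
    other piece of event information (venue, start date, or end date)."""
    return bool(e.title) and any([e.venue, e.start_date, e.end_date])

# A bare event name with no venue or date is not a target event.
print(is_target_event(EventCandidate(title="Riverside Jazz Night")))                # False
print(is_target_event(EventCandidate(title="Riverside Jazz Night", venue="Hall")))  # True
```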
Existing meetup event extraction methods primarily focus on extraction at the webpage level. For example, Foley et al. [3] use distant supervision to automatically label the ClueWeb12 corpus based on 211K unique events leveraged from http://schema.org/event records. They adopt a bottom-up approach to recognize three basic event elements, "What," "When," and "Where," through a LIBLINEAR sentence classifier based on 13 features. However, the precision of event field extraction for these three elements is only 0.36, 0.32, and 0.66 for "What," "When," and "Where," respectively.

To improve the performance of event extraction, Wang et al. [1] proposed an event extraction pipeline divided into two phases. The first phase consists of three modules that predict whether a web page contains any event information, decide whether the page contains a single event or multiple events, and extract the event title by classifying text nodes of more than 20 words; the event date and location are extracted through a joint extraction method. The second phase includes multiple-event extraction through repeated patterns, event consolidation, and wrapper induction, which take the raw event extractions as input to generate events with high confidence.

On the other hand, Lin et al. [2] targeted word-level event extraction and proposed the task of extracting local events from Facebook fan page posts. They modeled event extraction as a sequence labeling problem similar to named entity recognition (NER) through distant supervision with 109K seed events. However, word-level event extraction is hindered because the automatic tagging of event titles cannot be precise for long titles. Although model-based auto-labeling is exploited to improve the quality of the training data, word-level event title extraction achieves only 0.573 F1 using BERT-based sequence labeling.

The performance gap between sentence-level (coarse-grained) event extraction in English and word-level (fine-grained) event extraction in Chinese can be attributed to several reasons. First, exact matching of lengthy titles frequently results in numerous false negatives, while approximate matching may yield excessive false positives; this issue is magnified in word-level event extraction. Second, non-English meetup event sources for distant supervision are typically scarce and lack popularity on social networks, rendering distant supervision infeasible.
Despite the availability of manually annotated data, extracting events at the word level remains a formidable challenge. Even the most advanced large language models, such as GPT-4, achieve only an F1 score of 0.549. This challenge arises because not all entities mentioned in the text pertain to our task of extracting meetup event information; that is, our targets are the event arguments described in the text rather than all mentioned entities. Moreover, word-level extraction of meetup events suffers from data sparsity: the percentage of sentences labeled with event titles, locations, and start/end dates is extremely low.

Furthermore, unlike traditional event extraction, where event arguments are usually mentioned in the same sentence, meetup event arguments (especially event titles) are often scattered across multiple sentences and may be mixed with descriptions of other meetup events. Performance is therefore hampered when contextual information is disregarded. All of these issues make meetup event extraction an arduous endeavor.

To address these challenges, we propose the context-aware meetup event extraction (CAMEE) framework. The framework is designed to handle the specific difficulties presented by meetup events: event titles are often lengthy and full of details, and relevant information is frequently scattered across multiple sentences, making key information extraction more difficult. CAMEE operates in two phases: event positioning and event argument extraction. In the first phase, a sentence-level model identifies whether a given sentence contains relevant event information, such as the event's title, date, or location; this reduces the impact of irrelevant content and helps focus on key information. In the second phase, a word-level model jointly identifies the boundaries and types of event arguments using the Joint Boundary and Type Recognition (JBTR) approach, which addresses boundary ambiguity and type overlap by learning boundaries and types simultaneously. Through this dual-layered approach, CAMEE effectively filters out irrelevant entities, enhancing the accuracy and completeness of event extraction (a minimal sketch of the two-phase pipeline follows the contribution list below). The contributions of this work can be summarized in three parts:

- We introduce a context-aware event argument positioning model based on the BERT architecture for locating event arguments at the sentence level. Experimental results show that the proposed model, the Context-Aware Multi-Label Classifier (CAMLC), outperforms other sentence-level detection models (e.g., BERT-CLS [4], BERT-Att-BiLSTM-RC [4, 5], and H-BERT-MLP [6]) and achieves the highest performance on the sentence-level event argument positioning task, with a macro-F1 score of 0.780.

- For word-level event field recognition, we adopt a multitask learning model that decouples event field extraction into two subproblems: boundary identification and argument type categorization. The proposed Joint Boundary and Type Recognition (JBTR) model outperforms BERT-QA [7] and ERNIE [8] by 14 to 28% and is comparable to the lexically enhanced BERT-based SoftLexicon model [9] (0.726 vs. 0.733 macro-F1).

- Through the CAMEE framework, the two-stage model achieves an overall performance of 0.743 macro-F1, outperforming BERT-based SoftLexicon [9] by 1% macro-F1 (0.743 vs. 0.733). In terms of top-1 extraction, i.e., P/R/F@1, the two-stage method achieves 0.784 macro-F1.
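As referenced above, the two-phase design can be sketched as follows. This is a minimal illustration under our own assumptions: the names (camee_pipeline, SentencePositioner, ArgumentExtractor), the 0.5 score threshold, and the one-sentence context window are ours, not details from the paper.

```python
from typing import Callable, Dict, List, Tuple

# Argument types targeted in this work: title, venue, start date, end date.
ARG_TYPES = ("title", "venue", "start_date", "end_date")

# Stage 1 (CAMLC-like): a sentence-level multi-label positioner that returns,
# for each sentence, a score per argument type it might contain.
SentencePositioner = Callable[[List[str]], List[Dict[str, float]]]

# Stage 2 (JBTR-like): a word-level extractor that returns
# (argument_type, text_span) pairs for one sentence given its local context.
ArgumentExtractor = Callable[[str, List[str]], List[Tuple[str, str]]]


def camee_pipeline(sentences: List[str],
                   positioner: SentencePositioner,
                   extractor: ArgumentExtractor,
                   threshold: float = 0.5) -> Dict[str, List[str]]:
    """Two-phase extraction: (1) keep only sentences predicted to carry some
    event argument; (2) run word-level extraction on those sentences only."""
    event: Dict[str, List[str]] = {t: [] for t in ARG_TYPES}
    scores = positioner(sentences)  # phase 1: sentence-level positioning
    for i, (sent, score) in enumerate(zip(sentences, scores)):
        if max(score.get(t, 0.0) for t in ARG_TYPES) < threshold:
            continue  # filtered out: the sentence carries no event argument
        context = sentences[max(0, i - 1): i + 2]  # neighbouring sentences as context
        for arg_type, span in extractor(sent, context):  # phase 2: word-level extraction
            event[arg_type].append(span)
    return event


# Dummy components, for illustration only.
dummy_positioner = lambda sents: [{"title": 1.0} if "Festival" in s else {} for s in sents]
dummy_extractor = lambda sent, ctx: [("title", sent)]
print(camee_pipeline(["2024 Food Festival", "Follow our page!"],
                     dummy_positioner, dummy_extractor))
# {'title': ['2024 Food Festival'], 'venue': [], 'start_date': [], 'end_date': []}
```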
<a data-track="click" data-track-label="link" data-track-action="section anchor" href="/article/10.1007/s44196-024-00697-0#Sec18">5</a> concludes the article and suggests future work.</p></div></div></section><section data-title="Related Work"><div class="c-article-section" id="Sec2-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="Sec2"><span class="c-article-section__title-number">2 </span>Related Work</h2><div class="c-article-section__content" id="Sec2-content"><p>Finding local events in a new city can be a search service for users to explore the city. Like GIS and geo-social search, event search or recommendation is also a location-based service that aims to meet users’ information needs due to mobility. An event search system may recommend a kid-friendly event at a children’s amusement park to users. The motivation is similar to the research on contextual suggestions by Dean-Hall et al. [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 10" title="Dean-Hall, A., Clarke, C.L., Simone, N., Kamps, J., Thomas, P., Voorhees, E.: Overview of the TREC 2013 contextual suggestion track. In: Voorhees, E. (ed.) Proceedings of The Twenty-Second Text REtrieval Conference, TREC 2013, Gaithersburg, Maryland, USA, November 19-22, 2013. NIST Special Publication, vol. 500-302. National Institute of Standards and Technology (NIST), Gaithersburg, Maryland, USA (2013). &#xA; http://trec.nist.gov/pubs/trec22/papers/CONTEXT.OVERVIEW.pdf&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR10" id="ref-link-section-d43096310e496">10</a>], which aims to provide users with a better search experience. Constructing an event database can provide a comprehensive event search service that depicts what people do in a city and, to some extent, reflects the community culture. Therefore, Google researchers [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 1" title="Wang, Q., Kanagal, B., Garg, V., Sivakumar, D.: Constructing a comprehensive events database from the web. In: Proceedings of the 28th ACM International Conference on Information and Knowledge Management. CIKM ’19, pp. 229–238. Association for Computing Machinery, New York, NY, USA (2019). &#xA; https://doi.org/10.1145/3357384.3357986&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR1" id="ref-link-section-d43096310e499">1</a>, <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 3" title="Foley, J., Bendersky, M., Josifovski, V.: Learning to extract local events from the web. In: Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval. SIGIR ’15, pp. 423–432. Association for Computing Machinery, New York, NY, USA (2015). &#xA; https://doi.org/10.1145/2766462.2767739&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR3" id="ref-link-section-d43096310e502">3</a>] have focused on extracting events from semistructured web pages or plain text messages posted on social networks.</p><h3 class="c-article__sub-heading" id="Sec3"><span class="c-article-section__title-number">2.1 </span>Meetup-Event Extraction from the Web</h3><p>Wang et al. 
[<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 1" title="Wang, Q., Kanagal, B., Garg, V., Sivakumar, D.: Constructing a comprehensive events database from the web. In: Proceedings of the 28th ACM International Conference on Information and Knowledge Management. CIKM ’19, pp. 229–238. Association for Computing Machinery, New York, NY, USA (2019). &#xA; https://doi.org/10.1145/3357384.3357986&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR1" id="ref-link-section-d43096310e512">1</a>] used HTML tag paths as landmarks for event extraction rules. They handled single and multiple event pages differently. For pages with multiple events, they exploited repeated patterns, labeling event titles from schema.org as positive examples to train a neural network for title extraction. For single-event pages from the same domain, they clustered similar HTML structures to generate XPath template rules for field extraction. For single-event pages without similar templates, they grouped extracted titles/arguments across sources, removing duplicates to consolidate the best event title, date, and location representations from each cluster. This multi-source consolidation improved extraction quality even when individual page extractions were imprecise.</p><p>Unlike above, Foley et al. [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 3" title="Foley, J., Bendersky, M., Josifovski, V.: Learning to extract local events from the web. In: Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval. SIGIR ’15, pp. 423–432. Association for Computing Machinery, New York, NY, USA (2015). &#xA; https://doi.org/10.1145/2766462.2767739&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR3" id="ref-link-section-d43096310e518">3</a>] do not expect the target data to align perfectly with any structure. Since web pages often contain multiple areas, such as navigation links, banners, main content, advertisements, and footers, they model the task as a scoring problem following a bottom-up approach (from field-level, region-level, to document-level) and greedily grouped extracted fields into disjoint event records with the assumption that predicted events should not overlap. The field set scoring function included the field-level scoring for each field (i.e., text span) and the joint scoring of field occurrences in a region. However, Foley et al. use simple SVM methods to get the score for each event field. Thus, the performance does not achieve the precision needed for real-world applications.</p><p>Essentially, Foley et al. [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 3" title="Foley, J., Bendersky, M., Josifovski, V.: Learning to extract local events from the web. In: Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval. SIGIR ’15, pp. 423–432. Association for Computing Machinery, New York, NY, USA (2015). &#xA; https://doi.org/10.1145/2766462.2767739&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR3" id="ref-link-section-d43096310e524">3</a>] and Wang et al. 
[<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 1" title="Wang, Q., Kanagal, B., Garg, V., Sivakumar, D.: Constructing a comprehensive events database from the web. In: Proceedings of the 28th ACM International Conference on Information and Knowledge Management. CIKM ’19, pp. 229–238. Association for Computing Machinery, New York, NY, USA (2019). &#xA; https://doi.org/10.1145/3357384.3357986&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR1" id="ref-link-section-d43096310e527">1</a>] handled the problem based on text nodes parsed from HTML DOM trees, i.e., the output is a coarse-grained text string output, which may cover other irrelevant information. Foley et al. conducted a multiclass classification on each text field divided by HTML tags. Since a text node may contain more than one field and other information, using multiclass classification does not seem reasonable. Thus, the performance for event title extraction is not high (36% precision). To improve performance, Wang et al. ignore text nodes with less than 20 words to reduce the number of negative examples for event title classification and improve performance to 84% precision. As for event date and address extraction, they trained an independent binary classifier using pattern-based approaches. Since there could be multiple titles, venues, and start and end dates in a webpage, Foley et al. [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 3" title="Foley, J., Bendersky, M., Josifovski, V.: Learning to extract local events from the web. In: Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval. SIGIR ’15, pp. 423–432. Association for Computing Machinery, New York, NY, USA (2015). &#xA; https://doi.org/10.1145/2766462.2767739&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR3" id="ref-link-section-d43096310e530">3</a>] ranked predicted nodes by scores and greedily output the highest-scoring event on a page.</p><p>On the contrary, Lin et al. [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 2" title="Lin, Y.-H., Chang, C.-H., Chuang, H.-M.: Eventgo! mining events through semi-supervised event title recognition and pattern-based venue/date coupling. J. Inf. Sci. Eng. 39(3), 655–670 (2023). &#xA; https://doi.org/10.6688/JISE.20230339(2).0014&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR2" id="ref-link-section-d43096310e536">2</a>] focused on word-level event extraction from posts on social media networks to obtain a more fine-grained output. They used seed-based distant supervision to prepare two training corpora (based on Google search snippets and posts in the Facebook Fan group) using Facebook Event and Citytalk. Because exact matches could be used to annotate only a small percentage of posts relating to events, they introduced longest common subsequence (LCS)-based matching and the filtering of core words to overcome noise in approximate matching. They achieved 0.565 F1 in extracting the title of the events. 
2.2 Traditional (Closed Domain) Event Extraction from Plain Texts

Unlike meetup-type events that users can add to their calendars, traditional event extraction dates back to the Message Understanding Conference (MUC) series, which aimed to develop systems capable of recognizing named entities (e.g., people, organizations, and locations) and extracting events (such as military conflicts, changes in corporate management, and joint ventures). Following the MUC series, the Automatic Content Extraction (ACE) project [11] was launched in the early 2000s as part of DARPA's Translingual Information Detection, Extraction, and Summarization (TIDES) program.

According to the task definition of event extraction provided by ACE 2005, event mentions are triggered by single verbs or nouns and are associated with other entities, referred to as arguments, that describe changes in event states, including who, what, when, where, and how [12, 13]. The ACE 2005 event corpus predefines eight event types and 33 event subtypes. Typical closed-domain event extraction methods involve four subtasks: trigger word identification, event type classification, argument identification, and argument role classification, which can be executed sequentially or simultaneously.
While classical approaches such as the Dynamic Multi-pooling Convolutional Neural Network (DMCNN) [14] and the Joint Recurrent Neural Network (JRNN) [15] apply event type classification or argument role labeling to each word, others [16] utilize sequence labeling models to identify both triggers and argument roles.

One benefit of sequence labeling is that lexical information can be exploited to recognize trigger words as well as argument roles, improving model performance. For example, the BERT-based SoftLexicon model [9] introduces a simple yet effective method for incorporating lexical information into character representations. Compatible with BERT, it further enhances performance by leveraging lexicons of proper nouns and common entity names, significantly boosting precision and recall.
Meanwhile, large pre-trained language models such as BERT [4], GPT [17, 18], and T5 [19] have revolutionized model building in deep learning. BERT-like models, founded upon auto-encoding language modeling, exhibit outstanding performance on single-task natural language understanding (NLU) problems [6, 7, 9].
Of particular note is their excellent performance on the ACE 2005 event extraction task, surpassing the T5-based generation model of [20].

Furthermore, traditional event extraction methods typically focus on identifying event arguments from explicit mentions in the text, relying heavily on clearly stated event triggers and arguments. In real-world scenarios, however, event arguments are not always explicitly mentioned, making implicit event argument extraction increasingly important. Prior work in this area mainly focused on capturing direct relationships between arguments and event triggers. The FEAE framework [21] instead introduces a reasoning method based on event frames, using related arguments as clues to guide the extraction of implicit arguments; it integrates a curriculum knowledge distillation strategy and achieves state-of-the-art performance on the RAMS dataset. Similarly, the AREA framework [22] proposes a model for implicit event argument extraction based on argument-argument relational reasoning; by incorporating knowledge distillation and curriculum learning, it achieves strong performance on the RAMS and WikiEvents datasets.

2.3 Open Domain Event Extraction

With the popularity of social networks, the task of event extraction has shifted to user-generated content on social networks and blogs. The new goal is to extract open-domain events (such as epidemic spread, natural disaster damage reports, riots, etc.) for information understanding.
propose TwiCal [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 23" title="Ritter, A., Mausam, Etzioni, O., Clark, S.: Open domain event extraction from twitter. In: Proceedings of the 18th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. KDD ’12, pp. 1104–1112. Association for Computing Machinery, New York, NY, USA (2012). &#xA; https://doi.org/10.1145/2339530.2339704&#xA; &#xA; . &#xA; https://doi.org/10.1145/2339530.2339704&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR23" id="ref-link-section-d43096310e634">23</a>] to extract, summarize, and classify important events. Events in TwiCal are represented by 4-tuples, including named entities, event phrases, calendar dates, and event types. The authors apply a named entity tagger trained on in-domain Twitter data, train a CRF-based sequence model to extract event phrases using 1000 manually annotated tweets, and use TempEx [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 24" title="Mani, I., Wilson, G.: Robust temporal processing of news. In: Proceedings of the 38th Annual Meeting of the Association for Computational Linguistics, pp. 69–76. Association for Computational Linguistics, Hong Kong (2000). &#xA; https://doi.org/10.3115/1075218.1075228&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR24" id="ref-link-section-d43096310e637">24</a>] to parse explicit calendar-referenced time expressions.</p><p>In contrast to supervised training for event phrase extraction, Chen et al. [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 25" title="Chen, Y., Liu, S., Zhang, X., Liu, K., Zhao, J.: Automatically labeled data generation for large scale event extraction. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 409–419. Association for Computational Linguistics, Vancouver, Canada (2017). &#xA; https://doi.org/10.18653/v1/P17-1038&#xA; &#xA; . &#xA; https://aclanthology.org/P17-1038&#xA; &#xA; " href="/article/10.1007/s44196-024-00697-0#ref-CR25" id="ref-link-section-d43096310e643">25</a>] proposed distant supervision to automatically generate labeled data for large-scale event extraction by jointly using world knowledge (Freebase) and linguistic knowledge (FrameNet). The labeled data preparation involves four steps: (i) key argument detection via Freebase; (ii) trigger word detection by labeling sentences in Wikipedia; (iii) trigger word filtering and expansion by FrameNet; and (iv) automatic labeled data generation by the Soft Distant Supervision. The experimental result showed that the model trained with large-scale auto-labeled data is competitive with models trained with human-annotated data from the ACE corpus [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 14" title="Chen, Y., Xu, L., Liu, K., Zeng, D., Zhao, J.: Event extraction via dynamic multi-pooling convolutional neural networks. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pp. 167–176. Association for Computational Linguistics, Beijing, China (2015). 
However, earlier Open Information Extraction (OIE) systems often struggled with accuracy because they could not model dependencies among extractions. Sequence-generation-based methods have shown improved performance but are limited by autoregressive decoding, which can reduce efficiency. OIE methods have therefore gradually evolved from traditional sequence labeling approaches to detection-based models. For example, Wei et al. [26] exploited parallel tuple extraction, highlighted the label-assignment challenges in such models, and proposed a novel IoU-aware Optimal Transport method to improve label assignment in OIE.

Beyond these extraction techniques, detecting, describing, and linking events across heterogeneous social media has become a new research topic. For example, Abebe et al. [27] proposed SEDDaL, a framework that uses a Metadata Representation Space Model (MRSM) with temporal, spatial, and semantic dimensions to uniformly represent diverse social media events. The framework then applies similarity evaluation, event detection, and relationship identification to create a knowledge graph of interconnected events, demonstrating a new application of open-domain event extraction.

3 Proposed Methods

The focus of this study is to design an effective model for extracting meetup events. Specifically, we want to extract the title, location, and start/end date of the main event mentioned in posts from Facebook fan groups, as shown in Fig. 1.
Similar to relation extraction tasks, which involve NER and relation classification between entities, event extraction methods typically follow a pipeline that first identifies trigger words and arguments and then performs event type and argument role classification, as shown in Fig. 2a.

However, extracting meetup events poses unique challenges compared to traditional event extraction: (1) meetup event titles are often verbose and difficult to identify via single trigger words; the vocabulary must accommodate community jargon (e.g., Dining In The Dark) and event-related details (such as dates, person names, etc.) beyond simple triggers. (2) The titles, dates, and locations of meetups are scattered across multiple sentences within a post, while existing methods operate at the single-sentence level. (3) Meetup extraction focuses specifically on event arguments such as titles, times, and venues, a narrower scope than the full set of entity mentions captured by named entity recognition; event-irrelevant entities therefore need to be excluded to avoid incorrect recognition of event relations.

Fig. 2 Comparison of bottom-up traditional event extraction approach and top-down CAMEE framework

Based on this analysis, we propose the context-aware meetup event extraction (CAMEE) framework (see Fig. 2b), which consists of a sentence-level model for locating event arguments (i.e., predicting, via four binary outputs, whether each sentence contains the title, location, start date, and end date of an event) and a multi-task argument recognition model (which extracts the boundary of each event field and predicts its type).
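As a reading aid, the following sketch shows how the two stages fit together: the sentence-level positioner filters sentences, and the word-level recognizer runs only on the surviving ones. The objects and method names (positioner.predict, recognizer.extract) are illustrative placeholders for the models of Sects. 3.1 and 3.2, not the authors' actual API.

```python
# A minimal sketch of the two-stage CAMEE flow described above. The objects
# and method names (positioner.predict, recognizer.extract) are illustrative
# placeholders, not an actual API.

def extract_meetup_event_fields(post_sentences, positioner, recognizer):
    """Stage 1 filters sentences likely to contain event arguments;
    stage 2 runs word-level argument recognition only on those sentences."""
    fields = []
    # Stage 1: sentence-level, context-aware multi-label positioning over the
    # whole post -> one 4-dim {0,1} vector per sentence
    # (Title, Venue, StartDate, EndDate).
    flags = positioner.predict(post_sentences)
    for sentence, flag in zip(post_sentences, flags):
        if not any(flag):
            continue                     # skip sentences with no event argument
        # Stage 2: word-level boundary + type recognition inside the sentence.
        fields.extend(recognizer.extract(sentence))
    return fields
```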
3.1 Sentence-Level Model for Event Argument Positioning

As mentioned above, traditional event extraction models are limited to single-sentence inputs, while existing meetup event extraction models only extract sentence-level event fields. To address this limitation, we propose event argument positioning at the sentence level, which uses contextual information from adjacent sentences to predict whether any event arguments are present in a sentence. Because a sentence may simultaneously contain the event title, event venue, and start or end date, we frame the problem as a multi-label classification task.

Fig. 3 Sentence-level event argument positioning by context-aware multi-label classifier (CAMLC)

The sentence-level model for locating event arguments is shown in Fig. 3.
Let <span class="mathjax-tex">\(L=\{Title\)</span>, <i>Venue</i>, <i>StartDate</i>, <span class="mathjax-tex">\(EndDate\}\)</span> be the set of argument roles for event extraction and <span class="mathjax-tex">\(\textbf{s} = (\textbf{s}_1, \textbf{s}_2, \ldots , \textbf{s}_{|\textbf{s}|})\)</span> denotes a message, where each <span class="mathjax-tex">\(\textbf{s}_i\)</span> is a sentence in <span class="mathjax-tex">\(\textbf{s}\)</span>, represented by <span class="mathjax-tex">\({\textbf{s}_i}= (w_{i1}, \ldots , w_{ik})\)</span>, <span class="mathjax-tex">\(k=|{\textbf{s}_i}|\)</span>. Each sentence is associated with a subset of <i>L</i>, represented by a vector <span class="mathjax-tex">\(\textbf{y}_i=[y_{i1}, y_{i2}, y_{i3}, y_{i4}]\)</span> where <span class="mathjax-tex">\(y_{ij}\)</span> = 1 if the sentence <span class="mathjax-tex">\(\textbf{s}_i\)</span> contains an event argument <span class="mathjax-tex">\(L_j\)</span> and 0 otherwise.</p><p>To represent a sentence, we adopt a pre-trained BERT-based encoding module to produce sentence-level representations. A special token [CLS] is added in front of every sentence, i.e., <span class="mathjax-tex">\(w_{i0}=[CLS]\)</span> for all <i>i</i>. The output of BERT’s bidirectional transformer on the [CLS] token, <span class="mathjax-tex">\(T_i=BERT({{{\textbf {s}}}_i}) \in R^d\)</span>, is then used as the sentence representation for sentence <span class="mathjax-tex">\(\textbf{s}_i\)</span>.</p><p>To highlight the position of the sentence, we incorporate positional embeddings employing a sinusoidal curve encoding method [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 28" title="Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, Ł., Polosukhin, I.: Attention is all you need. Advances in neural information processing systems 30 (2017)" href="/article/10.1007/s44196-024-00697-0#ref-CR28" id="ref-link-section-d43096310e1341">28</a>] as BERT.</p><div id="Equ1" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$\begin{aligned} T_i = T_i + PositionEmbedding(i) \end{aligned}$$</span></div><div class="c-article-equation__number"> (1) </div></div><p>Next, all sentence representations <span class="mathjax-tex">\(\textbf{T}=[T_1, T_2, \ldots , T_{|\textbf{s}|}]\)</span> will pass through a Gated Recurrent Unit (GRU) layer to capture contextual information from adjacent sentences. We stack two GRU networks to get information from backward and forward states simultaneously. Formally,</p><div id="Equ2" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$\begin{aligned} \begin{array}{ll} \overrightarrow{z_i} = \overrightarrow{GRU} (\overrightarrow{z_{i-1}},{T_i}) \\ \overleftarrow{z_i} = \overleftarrow{GRU} (\overleftarrow{z_{i+1}},{T_i}) \\ z_i = \overrightarrow{z_i}\oplus \overleftarrow{z_i} \end{array} \end{aligned}$$</span></div><div class="c-article-equation__number"> (2) </div></div><p>where <span class="mathjax-tex">\(\overrightarrow{z_i}\)</span> and <span class="mathjax-tex">\(\overleftarrow{z_i}\)</span> (with <span class="mathjax-tex">\(d_r\)</span> hidden neurons) denote the hidden states of the forward and backward GRU at the <i>i</i>-th time step. 
Let <span class="mathjax-tex">\(\theta _r\)</span> denote the parameters of Bi-GRU, we define the output of Bi-GRU function <span class="mathjax-tex">\(G_r(\textbf{T};\theta _r)=[z_1, z_2, \ldots , z_{\mathbf{|s|}}]\in R^{\mathbf{|s|} \times 2d_r}\)</span>. Finally, two fully connected layers with parameters <span class="mathjax-tex">\(W_l\)</span> and <span class="mathjax-tex">\(\textbf{b}_l\)</span>, <span class="mathjax-tex">\(l=1, 2\)</span> are added to predict whether each sentence <span class="mathjax-tex">\(\textbf{s}_i\)</span> is associated with label <span class="mathjax-tex">\(\textbf{y}_i\)</span> by</p><div id="Equ3" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$\begin{aligned} {\hat{\textbf{y}}}=\sigma (G_r(\textbf{T};\theta _r)*W_1+\textbf{b}_{\textbf{1}})*W_2+\textbf{b}_{\mathbf{2)}} \end{aligned}$$</span></div><div class="c-article-equation__number"> (3) </div></div><p>where <span class="mathjax-tex">\({\hat{\textbf{y}}=\{{\hat{y}}_1, {\hat{y}}_2, \ldots , {\hat{y}}_{|s|}\}}\in R^{|\textbf{s}|*4}\)</span> and <span class="mathjax-tex">\(\sigma \)</span> denotes the sigmoid function. For layer one: <span class="mathjax-tex">\(W_1\in R^{2d_r\times d_f}\)</span>, and <span class="mathjax-tex">\(\textbf{b}_{\textbf{1}}\in R^{d_f}\)</span>. For layer two: <span class="mathjax-tex">\(W_2 \in R^{d_f\times 4}\)</span>, and <span class="mathjax-tex">\(\textbf{b}_{\textbf{2}}\in R^4\)</span>. We use the cross-entropy to evaluate the loss for each message <i>s</i> as our loss function:</p><div id="Equ4" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$\begin{aligned} L(\textbf{s}) = - \frac{1}{4*|\mathbf{s|}}\sum _{i=1}^{\mathbf{|s|}} \sum _{j=1}^4 y_{ij}log({\hat{y}}_{ij}) +(1 - y_{ij})log( 1 - {\hat{y}}_{ij})\nonumber \\ \end{aligned}$$</span></div><div class="c-article-equation__number"> (4) </div></div><h4 class="c-article__sub-heading c-article__sub-heading--small" id="Sec8"><span class="c-article-section__title-number">3.1.1 </span>Implementations</h4><p>There are two possible implementations of BERT sentence-level models. One can encode all sentences by one BERT and fine-tune the model during training (see Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00697-0#Fig4">4</a>a) [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 6" title="Lei, J., Zhang, Q., Wang, J., Luo, H.: Bert based hierarchical sequence classification for context-aware microblog sentiment analysis. In: International Conference on Neural Information Processing, pp. 376–386. Springer, Sydney, NSW, Australia (2019). Springer" href="/article/10.1007/s44196-024-00697-0#ref-CR6" id="ref-link-section-d43096310e2802">6</a>] by utilizing average pooling with 50 window size to extract sentence vectors for each sentence. However, simply using average pooling may lead to the loss of some crucial semantic information within sentences and may not accurately encode each sentence independently, thus limiting the model’s grasp of the internal details of each sentence. Another approach is to encode each sentence by one BERT model and fine-tune the model with the pre-trained BERT frozen during training (see Fig. 
<a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00697-0#Fig4">4</a>b) [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 29" title="Chang, C.-H., Liao, Y.-C., Yeh, T.: Event source page discovery via policy-based rl with multi-task neural sequence model. In: International Conference on Web Information Systems Engineering, pp. 597–606 (2022). Springer" href="/article/10.1007/s44196-024-00697-0#ref-CR29" id="ref-link-section-d43096310e2808">29</a>]. In other words, this approach employs BERT as feature extraction such that the pre-trained BERT does not participate in the downstream tasks training (no gradient feedback).</p><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-4" data-title="Fig. 4"><figure><figcaption><b id="Fig4" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 4</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><a class="c-article-section__figure-link" data-test="img-link" data-track="click" data-track-label="image" data-track-action="view figure" href="/article/10.1007/s44196-024-00697-0/figures/4" rel="nofollow"><picture><source type="image/webp" srcset="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00697-0/MediaObjects/44196_2024_697_Fig4_HTML.png?as=webp"><img aria-describedby="Fig4" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00697-0/MediaObjects/44196_2024_697_Fig4_HTML.png" alt="figure 4" loading="lazy" width="685" height="403"></picture></a></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-4-desc"><p>Three possible implementations of sentence-level sequential output: <b>a</b> encoding of all sentences by one BERT [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 6" title="Lei, J., Zhang, Q., Wang, J., Luo, H.: Bert based hierarchical sequence classification for context-aware microblog sentiment analysis. In: International Conference on Neural Information Processing, pp. 376–386. Springer, Sydney, NSW, Australia (2019). Springer" href="/article/10.1007/s44196-024-00697-0#ref-CR6" id="ref-link-section-d43096310e2824">6</a>] <b>b</b> encoding of each sentence by one BERT without fine-tuning [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 29" title="Chang, C.-H., Liao, Y.-C., Yeh, T.: Event source page discovery via policy-based rl with multi-task neural sequence model. In: International Conference on Web Information Systems Engineering, pp. 597–606 (2022). 
Springer" href="/article/10.1007/s44196-024-00697-0#ref-CR29" id="ref-link-section-d43096310e2830">29</a>] <b>c</b> encoding of each sentence by one BERT with fine-tuning</p></div></div><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="article-link" data-track="click" data-track-label="button" data-track-action="view figure" href="/article/10.1007/s44196-024-00697-0/figures/4" data-track-dest="link:Figure4 Full size image" aria-label="Full size image figure 4" rel="nofollow"><span>Full size image</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>On the contrary, our implementation (see Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00697-0#Fig4">4</a>c) records both the forward propagation and vector values of sentences each time BERT is used as a sentence encoder. We use TensorFlow’s while_loop function to retain tensor values across iterations during the forward pass, ensuring that gradient updates can be calculated accurately during backpropagation. This enables the calculation of gradient changes during the backward process, allowing BERT to utilize the gradient information from each sentence to update its weights and better capture semantic relationships across sentences. By harnessing this gradient information, BERT’s weights are adjusted to enhance further its ability to capture semantic information. Specifically, we adopt a sequence-to-sequence GRU structure, which has the advantage that the loss includes the output of the GRU at each time step. This means that the error gradient of the model will flow backward from the output of each time step, allowing multiple fine-tuning of BERT.</p><h3 class="c-article__sub-heading" id="Sec9"><span class="c-article-section__title-number">3.2 </span>Word-Level Argument Recognition Model</h3><p>Following the previous section, we will perform argument boundary recognition if the event argument positioning indicates a sentence is marked as “yes” for any event field. Otherwise, we will skip it to avoid false positives. Note that argument recognition is similar to named entity recognition. However, this task can involve nested structures and mutually exclusive categories. Nested structures mean some labels may be contained within others in the sequence labeling task. Mutually exclusive categories mean each input can only belong to one class—the classes are mutually exclusive. For instance, date can be categorized as either “start date” or “end date”, but it cannot simultaneously belong to both.</p><p>To handle both nested structures and mutually exclusive categories effectively, we propose combining conditional random fields (CRF) and softmax. First, we use CRF to identify the boundaries of entities or event arguments, separating the sequence into chunks. 
3.2 Word-Level Argument Recognition Model

Following the previous stage, we perform argument boundary recognition only if the event argument positioning marks a sentence as "yes" for at least one event field; otherwise, we skip the sentence to avoid false positives. Argument recognition is similar to named entity recognition, but the task can involve nested structures and mutually exclusive categories. Nested structures mean that some labels may be contained within others in the sequence labeling task. Mutually exclusive categories mean that each input can belong to only one class; for instance, a date can be categorized as either "start date" or "end date", but not both.

To handle both nested structures and mutually exclusive categories effectively, we propose combining a conditional random field (CRF) with a softmax classifier. First, the CRF identifies the boundaries of entities or event arguments, separating the sequence into chunks. Then, a softmax vote over the tokens of each chunk classifies it as a particular entity or event argument type.

Specifically, for each sentence $\textbf{x} = (w_1, \ldots, w_{|x|})$ with $|x|$ words, the output is divided into two parts: the boundaries of the event arguments $\textbf{y}^{\textbf{B}} = \{y_1^B, y_2^B, \ldots, y_{|x|}^B\}$ and the types of the event arguments $\textbf{y}^{\textbf{T}} = \{y_1^T, y_2^T, \ldots, y_{|x|}^T\}$. The former uses the tag set $L_B = \{B, I, E, S, O\}$, denoting Begin, Inside, End, Single, and Outside, i.e., $y_i^B \in L_B$; the latter is formulated as a five-label classification problem, i.e., $y_i^T \in \{0, 1\}^5$ with $|\textbf{y}_i^T| = 1$, where each dimension denotes one of the argument roles $L_T = \{Title, Venue, StartDate, EndDate\}$ or None.

Fig. 5 Word-level event argument recognition by joint boundary and type recognition (JBTR)
<a data-track="click" data-track-label="link" data-track-action="equation anchor" href="/article/10.1007/s44196-024-00697-0#Equ7">7</a>):</p><div id="Equ5" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$\begin{aligned} &amp; L( \textbf{x}) = L_{boundary}( \textbf{x}) + L_{type}( \textbf{x}) \end{aligned}$$</span></div><div class="c-article-equation__number"> (5) </div></div><div id="Equ6" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$\begin{aligned} &amp; L_{boundary}( \textbf{x})=-log p(\textbf{y}^{\textbf{B}} | \textbf{x}) \end{aligned}$$</span></div><div class="c-article-equation__number"> (6) </div></div><div id="Equ7" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$\begin{aligned} &amp; L_{type}( \textbf{x})=-\sum _{i=1}^{| \textbf{x}|}\sum _{j=1}^5 y_{ij}^T\log p(y_{ij}^T| \textbf{x}) \end{aligned}$$</span></div><div class="c-article-equation__number"> (7) </div></div><p>where the <span class="mathjax-tex">\(p(\textbf{y}^{\textbf{B}} | \textbf{x})\)</span> is calculated by a conditional random field (CRF) layer for boundary identification with a transition matrix <i>T</i> and an omission matrix <i>O</i> (see Eqs. <a data-track="click" data-track-label="link" data-track-action="equation anchor" href="/article/10.1007/s44196-024-00697-0#Equ8">8</a> and  <a data-track="click" data-track-label="link" data-track-action="equation anchor" href="/article/10.1007/s44196-024-00697-0#Equ9">9</a>), and <span class="mathjax-tex">\(p(\textbf{y}^{\textbf{T}} | \textbf{x})\)</span> is calculated by a fully connected layer with softmax output.</p><div id="Equ8" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$\begin{aligned} &amp; p(\textbf{y}^{\textbf{B}} | \textbf{x}) = \frac{e^{s(\textbf{x, y}^{\textbf{B}})}}{\sum _{\textbf{y}}{e^{s( \textbf{x,y})}}}, \quad \end{aligned}$$</span></div><div class="c-article-equation__number"> (8) </div></div><div id="Equ9" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$\begin{aligned} &amp; s(\textbf{x, y}) = \sum _{i=1}^{| \textbf{x}|} T_{y_{i-1}, y_{i}} + \sum _{i=1}^{| \textbf{x}|} O_{w_i, y_i} \end{aligned}$$</span></div><div class="c-article-equation__number"> (9) </div></div><p>By combining the output of these two parts, we can extract the event title, event location, and start and end date in the message. More specifically, given the multi-task predictions, we first follow the boundary prediction to decide the string to be output and use the majority vote to determine the event field based on the type prediction of each token in the boundary. For example, as illustrated in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00697-0#Fig6">6</a>, consider the sentence “1/13 – 31 <img src="//media.springernature.com/lw100/springer-static/image/art%3A10.1007%2Fs44196-024-00697-0/MediaObjects/44196_2024_697_Figa_HTML.gif" style="width:100px;max-width:none;" alt="">” (which means “Painting and Calligraphy Charity Exhibition from January 13th to 31st”). The CRF boundary identification (the second row in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00697-0#Fig6">6</a>) results in the entire sentence as a single boundary. 
However, the softmax classification assigns the first three tokens "1/13" to the start date, the following three tokens " – " to other, "31" to the end date, and the remaining six tokens (the Chinese title) to the event title. By majority voting, the entire sentence is ultimately classified as the event title.

Fig. 6 Example of combining boundary prediction and type prediction
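To make the decoding step concrete, here is a small Python sketch of how a boundary tag sequence and per-token type predictions can be combined by majority vote. The tag strings and the toy example follow the BIESO/role scheme above, but the function itself is an illustrative reconstruction, not the authors' released code.

```python
# Illustrative reconstruction of the boundary + type decoding with majority
# voting; the toy tokens below are stand-ins for the character-level example
# in Fig. 6.
from collections import Counter

def decode_event_fields(tokens, boundary_tags, type_preds):
    """Combine CRF boundary tags (B/I/E/S/O) with per-token type predictions
    (Title/Venue/StartDate/EndDate/None) by majority vote within each chunk.
    Unclosed chunks at the end of the sequence are ignored in this sketch."""
    fields, chunk = [], []
    for tok, b_tag, t_pred in zip(tokens, boundary_tags, type_preds):
        if b_tag == "O":
            continue
        chunk.append((tok, t_pred))
        if b_tag in ("E", "S"):                       # chunk closes here
            votes = Counter(t for _, t in chunk)
            label = votes.most_common(1)[0][0]        # majority vote over token types
            if label != "None":
                fields.append(("".join(t for t, _ in chunk), label))
            chunk = []
    return fields

# Toy example in the spirit of Fig. 6: one chunk spans the whole sentence and
# its tokens are typed mostly as Title, so the chunk is emitted as the title.
tokens = ["1/13", "-", "31", "Charity", "Painting", "Exhibition"]
boundary = ["B", "I", "I", "I", "I", "E"]
types = ["StartDate", "None", "EndDate", "Title", "Title", "Title"]
print(decode_event_fields(tokens, boundary, types))
# -> [('1/13-31CharityPaintingExhibition', 'Title')]
```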
4 Experiments

In this section, we first describe the preparation of the training dataset used in our experiments (Sect. 4.1) and then show why distant supervision with auto-labeled data cannot match the performance obtained with manually labeled data (Sect. 4.2). We then report the performance of sentence-level event argument positioning (Sect. 4.3) and word-level event argument extraction (Sect. 4.4). Finally, we show that the two-stage CAMEE framework outperforms other state-of-the-art event extraction methods (Sect. 4.5) and report the performance of main event extraction (Sect. 4.7).

4.1 Preparation of Training Data and Evaluation Metric

We used both automatic and manual labeling to prepare the training data. For automatic labeling, we used events from Accupass (footnote 2), a ticket-selling website, as seeds to automatically label event fields in the corresponding event posts. However, since only 11% and 32% of the posts in the dataset mention the event title and location, respectively, we also manually labeled 2065 posts (footnote 3; 1274 for training, 791 for validation) from Facebook fan pages to compensate for insufficient automatic labeling. The number of sentences containing event titles/venues/dates and the number of respective event fields are shown in Table 1.

Table 1  Datasets from which event fields were extracted

To evaluate the proposed method's performance in extracting event fields, we used the same test data as Lin et al. [2], which consists of 1300 posts (33,991 sentences) from event announcements on Facebook fan pages. Note that half of these posts contain only one event title, while the other half may contain more than one; therefore, the numbers of sentences containing event titles (2010) and event venues (1563) are higher than 1300.
It is worth noting that most event posts had a main event, which is consistent with the nature of event posts on Facebook fan pages.

For sentence-level event argument positioning, we calculated the ratio of the number of correctly predicted sentences to the total number of predicted sentences ($\sum \hat{y}_i^{c}$) and to the number of labeled sentences ($\sum y_i^{c}$) to obtain the precision and recall, respectively:

$$P^{c}=\frac{\sum^{N}_{i=1} y_i^{c}\hat{y}_i^{c}}{\sum^{N}_{i=1}\hat{y}_i^{c}}, \quad R^{c}=\frac{\sum^{N}_{i=1} y_i^{c}\hat{y}_i^{c}}{\sum^{N}_{i=1} y_i^{c}}, \quad F^{c}=\frac{2 \times P^{c} \times R^{c}}{P^{c}+R^{c}} \tag{10}$$

where $N$ is the number of sentences, and $y_i^{c}, \hat{y}_i^{c} \in \{0, 1\}$ are the true and predicted labels of sentence $i$, respectively. We then calculated the harmonic mean of $P^{c}$ and $R^{c}$ as the F1 measure. To assess the overall performance, we used macro-averaging to summarize performance over the $C$ event fields:

$$P_{macro}=\frac{1}{C}\sum^{C}_{c=1} P^{c}, \quad R_{macro}=\frac{1}{C}\sum^{C}_{c=1} R^{c}, \quad F_{macro}=\frac{2 \times P_{macro} \times R_{macro}}{P_{macro}+R_{macro}} \tag{11}$$
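The sentence-level scores of Eqs. (10)–(11) can be computed directly from the binary label matrices. The sketch below mirrors those definitions, including the harmonic mean of the macro-averaged precision and recall; the field set and the toy labels are illustrative, not the paper's data.

```python
import numpy as np

def sentence_level_scores(y_true, y_pred):
    """y_true, y_pred: (N, C) binary arrays of sentence-level labels,
    one column per event field.  Implements Eqs. (10)-(11)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = (y_true * y_pred).sum(axis=0)
    p = tp / np.maximum(y_pred.sum(axis=0), 1)          # P^c
    r = tp / np.maximum(y_true.sum(axis=0), 1)          # R^c
    f = 2 * p * r / np.maximum(p + r, 1e-12)            # F^c
    p_macro, r_macro = p.mean(), r.mean()
    f_macro = 2 * p_macro * r_macro / max(p_macro + r_macro, 1e-12)
    return p, r, f, (p_macro, r_macro, f_macro)

# Hypothetical labels for four fields (title, venue, start date, end date).
y_true = [[1, 0, 1, 0], [0, 1, 0, 0], [1, 0, 1, 1]]
y_pred = [[1, 0, 0, 0], [0, 1, 0, 1], [1, 1, 1, 1]]
print(sentence_level_scores(y_true, y_pred)[-1])        # (P_macro, R_macro, F_macro)
```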
For word-level evaluation of event argument extraction, we used partial precision/recall over the extracted fields $E$ and the true answer fields $A$. Formally, for each extracted field $e$ that overlaps with a true answer field $a$, we define the partial scores $P\_score(e, a)$ and $R\_score(e, a)$ as shown in Eq. (12). Precision and recall are then averaged over all extracted fields $E$ and all true answer fields $A$, respectively, as shown in Eq. (13).

$$P\_score(e, a)= \frac{P(e\cap a)}{|e|}, \quad R\_score(e, a)= \frac{P(e\cap a)}{|a|} \tag{12}$$

$$\text{Precision}=\frac{\sum_{e\in E} P\_score(e, a)}{|E|}, \quad \text{Recall}=\frac{\sum_{a\in A} R\_score(e, a)}{|A|} \tag{13}$$
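A small sketch of this partial-match scoring follows, assuming spans are given as character offsets and that each span is scored against its best-overlapping counterpart (a matching detail the equations leave implicit):

```python
def overlap(e, a):
    """Character-level overlap between two (start, end) spans."""
    return max(0, min(e[1], a[1]) - max(e[0], a[0]))

def partial_precision_recall(extracted, gold):
    """Eqs. (12)-(13): average partial P_score over extracted spans and partial
    R_score over gold spans.  Spans are assumed non-empty (start < end)."""
    p_scores = [max((overlap(e, a) for a in gold), default=0) / (e[1] - e[0]) for e in extracted]
    r_scores = [max((overlap(e, a) for e in extracted), default=0) / (a[1] - a[0]) for a in gold]
    precision = sum(p_scores) / len(extracted) if extracted else 0.0
    recall = sum(r_scores) / len(gold) if gold else 0.0
    return precision, recall

# Hypothetical example: one gold title span, two extracted spans.
print(partial_precision_recall(extracted=[(0, 6), (10, 14)], gold=[(0, 8)]))
```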
<a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00697-0#Fig3">3</a>)</b></figcaption><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="table-link" data-track="click" data-track-action="view table" data-track-label="button" rel="nofollow" href="/article/10.1007/s44196-024-00697-0/tables/2" aria-label="Full size table 2"><span>Full size table</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>As for word-level event extraction, we employed a positive-to-negative ratio of 1:5 and 1:2 for manual-labeled and auto-labeled data, respectively to obtain the best performance. One explanation is that auto-labeled data might contain more noise than manual-labeled data. Note that the performance gap in sentence-level and word-level evaluation urges us to boost JBTR by CAMLC. As shown in Table <a data-track="click" data-track-label="link" data-track-action="table anchor" href="/article/10.1007/s44196-024-00697-0#Tab3">3</a>, since it is more difficult for auto-labeling to achieve accurate labeling at the word level (especially by “exact match”), the performance of the model trained on manual-labeled data is much higher than that trained on auto-labeled data (0.726 vs 0.408 macro F1).</p><div class="c-article-table" data-test="inline-table" data-container-section="table" id="table-3"><figure><figcaption class="c-article-table__figcaption"><b id="Tab3" data-test="table-caption">Table 3 Word-level event argument extraction using JTBR model in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00697-0#Fig5">5</a></b></figcaption><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="table-link" data-track="click" data-track-action="view table" data-track-label="button" rel="nofollow" href="/article/10.1007/s44196-024-00697-0/tables/3" aria-label="Full size table 3"><span>Full size table</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><h3 class="c-article__sub-heading" id="Sec13"><span class="c-article-section__title-number">4.3 </span>Effect of Event Arguments Positioning</h3><p>We compared the proposed CAMLC model with three sentence-level event argument positioning methods: BERT-CLS [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 4" title="Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). &#xA; https://doi.org/10.18653/v1/N19-1423&#xA; &#xA; . 
4.3 Effect of Event Argument Positioning

We compared the proposed CAMLC model with three sentence-level event argument positioning methods: BERT-CLS [4], BERT-Att-BiLSTM-RC [5], and H-BERT-MLP [6]. BERT-CLS and BERT-Att-BiLSTM-RC accept a single-sentence input, while H-BERT-MLP and CAMLC accept multiple-sentence inputs. BERT-CLS fine-tunes BERT and uses the [CLS] token to represent the entire sentence [4]. BERT-Att-BiLSTM-RC uses a BiLSTM and an attention mechanism to model sentence-level relation extraction [4, 5].
H-BERT-MLP [6] takes a multiple-sentence input, separated for context-aware representation, and uses a shared dense layer to predict the output. Since we model the problem as multi-label classification, we equip the output layer with four sigmoid functions and train the model with the same loss function as Eq. (4).

Table 4 shows that, of the two single-sentence input models, BERT-Att-BiLSTM-RC (0.764 F1) outperformed BERT-CLS (0.674 F1), although the latter had the highest recall (0.879). Of the models that accept multi-sentence input, the proposed CAMLC model had almost 10% higher precision than H-BERT-MLP (0.748 vs. 0.655) and slightly lower recall (0.817 vs. 0.836), and therefore a better macro-F1 score (0.780 vs. 0.735).

Table 4  Comparison of sentence-level models for event argument positioning

We also conducted an ablation study on the proposed CAMLC model. As shown in Table 5, macro-F1 drops to 0.575, 0.759, and 0.754 without fine-tuning, the downsampling mechanism, and position encoding, respectively.
Thus, it is vital that the BERT model be fine-tuned during training.

Table 5  Sentence-level model ablation study

4.4 Performance of Word-Level Event Field Extraction

Next, we compared the proposed word-level event argument recognition model with BERT-QA, ERNIE, BERT-based SoftLexicon, and BERT-BiLSTM-CRF. BERT-QA [7] formulates event field extraction as a question-answering (QA) task by asking questions about event triggers and argument roles. ERNIE [8] (footnote 5), a pretraining model released by Baidu, learns prior knowledge about phrases and entities during training through knowledge masking.
BERT-based SoftLexicon [9] is a state-of-the-art (SOTA) entity recognition model that integrates character embeddings (char + bichar), CTB (Chinese Treebank) 6.0 embeddings, BERT embeddings, and lexical information to label Chinese sentences. It enhances the character representations with lexical information drawn from external lexical resources, which provides additional context and helps capture richer semantic and syntactic features for more accurate entity recognition. Finally, the BERT-BiLSTM-CRF model serves as the baseline for sequence labeling: it combines the contextual information captured by the BERT encoder [4] with the sequential dependencies modeled by a BiLSTM [30], making it a robust starting point for sequence-labeling problems.

As shown in Table 6, the proposed JBTR model with the Chinese base BERT outperforms BERT-QA, ERNIE, and BERT-BiLSTM-CRF in macro-F1 and is comparable (0.726) to the BERT-based SoftLexicon model (0.733).
However, our proposed model is much simpler than BERT-based SoftLexicon and does not rely on additional lexical information.

Table 6  Comparison of word-level event argument extraction models

To explain the low performance of BERT-QA, we compare the ACE dataset with our meetup event corpus in Table 7. The ratio of OOV trigger words in ACE 2005 (58/212) is much lower than in the FB event dataset (1888/1889). The ratio of sentences containing event mentions in ACE 2005 (18% = 3136/17,172) is much higher than in the FB event dataset (1.79% = 1117/62,474). Finally, event triggers in ACE 2005 are defined as single words, while in the FB event dataset they are defined as event titles, which are usually phrases.

Table 7  Comparison of the ACE 2005 corpus and the meetup event corpus

In summary, although BERT-QA achieved SOTA performance on the ACE 2005 trigger identification + classification task, its performance drops significantly for meetup event extraction, which requires deeper semantic and contextual understanding. As for ERNIE, it had the highest precision on all event field extraction tasks, verifying the effect of the knowledge masking used during pre-training; however, its low recall also limits the model's generalizability.
We believe the bias introduced by knowledge masking during ERNIE's pre-training explains this behavior, since the titles and venues to be identified are OOV.

4.5 Evaluation of the Two-Stage CAMEE Framework

Tables 4 and 6 support our intuition that coarse-grained event extraction typically performs better than fine-grained event extraction. We therefore use the sentence-level model to filter out sentences that do not contain event arguments and thereby boost the performance of the word-level model, as shown in Fig. 2b. Table 8 compares the four sentence-level event argument positioning models combined with the proposed word-level JBTR model. The macro-F1 of the word-level JBTR model improves to 0.743 with the sentence-level CAMLC model, outperforming BERT-Att-BiLSTM-RC (0.739) and H-BERT-MLP (0.738). In general, word-level event extraction (0.726 macro-F1) improves by 1.2 to 1.7%, with the exception of BERT-CLS: although BERT-CLS has the best sentence-level recall (0.879), its inferior precision (0.546) slightly worsens the JBTR model from 0.726 to 0.724 F1.

Table 8  Performance boosting of JBTR word-level event argument extraction by four sentence-level event argument positioning models (except for BERT-CLS)
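The two-stage idea can be summarized in a few lines. In the sketch below, the `sentence_model.predict` and `word_model.extract` interfaces are hypothetical stand-ins for the CAMLC and JBTR models; only the filter-then-extract control flow reflects the framework described above.

```python
def camee_pipeline(post_sentences, sentence_model, word_model):
    """Two-stage sketch: a sentence-level model first flags sentences that contain
    any event argument, and only those sentences are passed to the word-level
    extractor.  `predict` is assumed to return a dict of field -> bool per
    sentence, and `extract` a list of (span_text, field) pairs."""
    extracted = []
    for sentence in post_sentences:
        field_flags = sentence_model.predict(sentence)        # stage 1: argument positioning
        if any(field_flags.values()):                         # keep candidate sentences only
            extracted.extend(word_model.extract(sentence))    # stage 2: argument recognition
    return extracted
```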
In addition to Table 8, we also examined the effect of the proposed CAMEE framework on the BERT-based SoftLexicon model and on a RoBERTa-large-based JBTR model. As shown in Table 9, when filtered through the sentence-level CAMLC model, both word-level event argument extraction models exhibit a 6.8 to 7.8% improvement in precision but a 5.6 to 10% decrease in recall. Since the BERT-based SoftLexicon model itself has significantly lower recall (0.718) than the BERT-based and RoBERTa-large-based JBTR models (0.795), its final macro-F1 is only 0.704, lower than the 0.743 and 0.750 achieved by the BERT-based and RoBERTa-large-based JBTR models, respectively.

Table 9  Performance boosting of three word-level event argument recognition models by the CAMLC model

4.6 Evaluation of Large Language Models on Meetup Event Extraction Tasks

To see how large language models perform on meetup event extraction, we evaluated GPT-3.5-turbo and GPT-4-turbo on the test set using the zero-shot prompt shown in Appendix A. Table 10 presents the precision (P), recall (R), and F1 scores of each model; GPT-3.5-turbo and GPT-4-turbo reach macro-F1 scores of 0.369 and 0.549, respectively.
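For reference, a zero-shot call of this kind might look like the sketch below. The prompt wording, the JSON output schema, and the lack of post-processing are assumptions made for illustration only; the actual prompt used in the experiments is given in Appendix A.

```python
import json
from openai import OpenAI   # assumes the openai>=1.x Python client

client = OpenAI()           # reads OPENAI_API_KEY from the environment

ZERO_SHOT_PROMPT = (        # hypothetical wording; the real prompt is in Appendix A
    "Extract the main meetup event from the following post. "
    "Return JSON with keys: title, venue, start_date, end_date.\n\nPost:\n{post}"
)

def extract_event(post: str, model: str = "gpt-4-turbo") -> dict:
    """Zero-shot meetup event extraction with a single chat-completion call."""
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": ZERO_SHOT_PROMPT.format(post=post)}],
        temperature=0,
    )
    # In practice the reply may need cleaning before it parses as JSON.
    return json.loads(response.choices[0].message.content)
```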
These results may stem from several factors. First, BERT is an architecture designed specifically for natural language understanding and representation, and after fine-tuning it adapts well to the specific requirements of the task. Second, during fine-tuning the BERT-based models can capture the subtle nuances that matter for meetup event extraction, which enhances their performance on this particular task. While GPT models possess powerful generative capabilities and versatility, their precision and recall on specific tasks may suffer, particularly in scenarios that require high-precision extraction.

Table 10  Evaluating the performance of GPT large language models on meetup event extraction tasks

Table 11  Performance of the main event extraction

4.7 Performance of Main Event Extraction

Finally, a message may contain more than one meetup event, but only the main event is our extraction target. We therefore also evaluated the precision and recall of the top-ranked extraction in each message to assess message-level performance. We define precision at k (P@k) and recall at k (R@k) to evaluate the extraction of the main event from each post. Suppose a system returns a ranked list $R_l = (r^l_1, \ldots, r^l_n)$ for event field $l$. P@k is then defined by matching this list against the golden answer $A_l$ (i.e., the human-annotated labels) for event field $l$ ($l \in L$), as shown in Eq. (14):
$$\text{P@k}=\frac{1}{k}\sum^{k}_{j=1} b^{l}_{j}, \quad \text{R@k}=\frac{1}{|A_l|}\sum^{k}_{j=1} b^{l}_{j} \tag{14}$$

where $b_j^{l}$ is one if $r_j^{l} \in A_l$ and zero otherwise.
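Eq. (14) reduces to a few lines of code once the ranked list and gold answers are available; the example values below are hypothetical.

```python
def precision_recall_at_k(ranked, gold, k):
    """Eq. (14): P@k and R@k for one event field.  `ranked` is the system's
    ranked list of extracted values and `gold` the set of annotated answers."""
    hits = [1 if r in gold else 0 for r in ranked[:k]]      # the b_j^l indicators
    p_at_k = sum(hits) / k
    r_at_k = sum(hits) / len(gold) if gold else 0.0
    return p_at_k, r_at_k

# Main-event evaluation uses k = 1 on the ranked title/venue/date candidates.
print(precision_recall_at_k(["Charity Exhibition", "Workshop"], {"Charity Exhibition"}, k=1))
```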
Table 11 shows the overall performance and the P@1/R@1/F1@1 values of the proposed method with and without the sentence-level model. Filtering with the sentence-level model improves overall precision and F1 from 0.669/0.726 to 0.747/0.743 at the cost of recall. For main event extraction, P@1/R@1/F1@1 improve from 0.736/0.673/0.703 to 0.837/0.738/0.784, an increase of 10.1%/6.5%/8.1%.

5 Conclusion and Future Work

In this paper, we investigate the problem of extracting fine-grained Chinese meetup events from posts on social networks. We found that automatically labeling training data from pre-existing events yields rather limited quality, which can adversely affect performance; manual labeling therefore remains crucial. In addition, compared with traditional named entity recognition or event extraction tasks, the frequency of meetup event sentences is relatively low, so traditional sequence-labeling methods that operate on single sentences perform suboptimally in this scenario. To address this, we propose a two-stage pipeline that improves word-level argument recognition through the sentence-level event argument positioning model CAMLC and the multi-task argument recognition model JBTR. Experimental results show that the proposed method improves extraction performance from 0.726 to 0.743 macro-F1 for all events and from 0.703 to 0.784 macro-F1 for main events. A demo website is available at https://eventgo.widm.csie.ncu.edu.tw/.

Data Availability

The Meetup Events Extraction dataset and code used in this study have been publicly shared to promote transparency and reproducibility of the research findings. The data are available at https://github.com/luff543/CA-BERT-MLP/.

Notes

1. https://www.ldc.upenn.edu/collaborations/past-projects/ace
2. https://www.accupass.com/
3. The dataset and code are available at https://github.com/luff543/CA-BERT-MLP/
4. Google Research BERT: https://github.com/google-research/bert
5. https://github.com/PaddlePaddle/ERNIE/tree/ernie-kit-open-v1.0/applications/tasks/sequence_labeling

References

1. Wang, Q., Kanagal, B., Garg, V., Sivakumar, D.: Constructing a comprehensive events database from the web. In: Proceedings of the 28th ACM International Conference on Information and Knowledge Management (CIKM '19), pp. 229–238. Association for Computing Machinery, New York, NY, USA (2019). https://doi.org/10.1145/3357384.3357986
2. Lin, Y.-H., Chang, C.-H., Chuang, H.-M.: EventGo! Mining events through semi-supervised event title recognition and pattern-based venue/date coupling. J. Inf. Sci. Eng. 39(3), 655–670 (2023). https://doi.org/10.6688/JISE.20230339(2).0014
<a href="https://doi.org/10.6688/JISE.20230339(2).0014" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.6688/JISE.20230339(2).0014">https://doi.org/10.6688/JISE.20230339(2).0014</a></p><p class="c-article-references__links u-hide-print"><a data-track="click_references" rel="nofollow noopener" data-track-label="10.6688/JISE.20230339(2).0014" data-track-item_id="10.6688/JISE.20230339(2).0014" data-track-value="article reference" data-track-action="article reference" href="https://doi.org/10.6688%2FJISE.20230339%282%29.0014" aria-label="Article reference 2" data-doi="10.6688/JISE.20230339(2).0014">Article</a>  <a data-track="click_references" data-track-action="google scholar reference" data-track-value="google scholar reference" data-track-label="link" data-track-item_id="link" rel="nofollow noopener" aria-label="Google Scholar reference 2" href="http://scholar.google.com/scholar_lookup?&amp;title=Eventgo%21%20mining%20events%20through%20semi-supervised%20event%20title%20recognition%20and%20pattern-based%20venue%2Fdate%20coupling&amp;journal=J.%20Inf.%20Sci.%20Eng.&amp;doi=10.6688%2FJISE.20230339%282%29.0014&amp;volume=39&amp;issue=3&amp;pages=655-670&amp;publication_year=2023&amp;author=Lin%2CY-H&amp;author=Chang%2CC-H&amp;author=Chuang%2CH-M"> Google Scholar</a>  </p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="3."><p class="c-article-references__text" id="ref-CR3">Foley, J., Bendersky, M., Josifovski, V.: Learning to extract local events from the web. In: Proceedings of the 38th International ACM SIGIR Conference on Research and Development in Information Retrieval. SIGIR ’15, pp. 423–432. Association for Computing Machinery, New York, NY, USA (2015). <a href="https://doi.org/10.1145/2766462.2767739" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.1145/2766462.2767739">https://doi.org/10.1145/2766462.2767739</a></p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="4."><p class="c-article-references__text" id="ref-CR4">Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: Pre-training of deep bidirectional transformers for language understanding. In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171–4186. Association for Computational Linguistics, Minneapolis, Minnesota (2019). <a href="https://doi.org/10.18653/v1/N19-1423" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.18653/v1/N19-1423">https://doi.org/10.18653/v1/N19-1423</a>. <a href="https://www.aclweb.org/anthology/N19-1423" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="https://www.aclweb.org/anthology/N19-1423">https://www.aclweb.org/anthology/N19-1423</a></p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="5."><p class="c-article-references__text" id="ref-CR5">Zhou, P., Shi, W., Tian, J., Qi, Z., Li, B., Hao, H., Xu, B.: Attention-based bidirectional long short-term memory networks for relation classification. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pp. 
6. Lei, J., Zhang, Q., Wang, J., Luo, H.: BERT based hierarchical sequence classification for context-aware microblog sentiment analysis. In: International Conference on Neural Information Processing, pp. 376–386. Springer, Sydney, NSW, Australia (2019)
7. Du, X., Cardie, C.: Event extraction by answering (almost) natural questions. In: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 671–683. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.emnlp-main.49
8. Sun, Y., Wang, S., Feng, S., Ding, S., Pang, C., Shang, J., Liu, J., Chen, X., Zhao, Y., Lu, Y., Liu, W., Wu, Z., Gong, W., Liang, J., Shang, Z., Sun, P., Liu, W., Xuan, O., Yu, D., Tian, H., Wu, H., Wang, H.: ERNIE 3.0: Large-scale knowledge enhanced pre-training for language understanding and generation. arXiv preprint arXiv:2107.02137 (2021)
9. Ma, R., Peng, M., Zhang, Q., Wei, Z., Huang, X.: Simplify the usage of lexicon in Chinese NER. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pp. 5951–5960. Association for Computational Linguistics, Online (2020). https://doi.org/10.18653/v1/2020.acl-main.528
<a href="https://doi.org/10.18653/v1/2020.acl-main.528" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.18653/v1/2020.acl-main.528">https://doi.org/10.18653/v1/2020.acl-main.528</a>. <a href="https://aclanthology.org/2020.acl-main.528" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="https://aclanthology.org/2020.acl-main.528">https://aclanthology.org/2020.acl-main.528</a></p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="10."><p class="c-article-references__text" id="ref-CR10">Dean-Hall, A., Clarke, C.L., Simone, N., Kamps, J., Thomas, P., Voorhees, E.: Overview of the TREC 2013 contextual suggestion track. In: Voorhees, E. (ed.) Proceedings of The Twenty-Second Text REtrieval Conference, TREC 2013, Gaithersburg, Maryland, USA, November 19-22, 2013. NIST Special Publication, vol. 500-302. National Institute of Standards and Technology (NIST), Gaithersburg, Maryland, USA (2013). <a href="http://trec.nist.gov/pubs/trec22/papers/CONTEXT.OVERVIEW.pdf" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="http://trec.nist.gov/pubs/trec22/papers/CONTEXT.OVERVIEW.pdf">http://trec.nist.gov/pubs/trec22/papers/CONTEXT.OVERVIEW.pdf</a></p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="11."><p class="c-article-references__text" id="ref-CR11">Doddington, G., Mitchell, A., Przybocki, M., Ramshaw, L., Strassel, S., Weischedel, R.: The automatic content extraction (ACE) program – tasks, data, and evaluation. In: Lino, M.T., Xavier, M.F., Ferreira, F., Costa, R., Silva, R. (eds.) Proceedings of the Fourth International Conference on Language Resources and Evaluation (LREC’04). European Language Resources Association (ELRA), Lisbon, Portugal (2004). <a href="http://www.lrec-conf.org/proceedings/lrec2004/pdf/5.pdf" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="http://www.lrec-conf.org/proceedings/lrec2004/pdf/5.pdf">http://www.lrec-conf.org/proceedings/lrec2004/pdf/5.pdf</a></p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="12."><p class="c-article-references__text" id="ref-CR12">Xiang, W., Wang, B.: A survey of event extraction from text. IEEE Access <b>7</b>, 173111–173137 (2019). 
<a href="https://doi.org/10.1109/ACCESS.2019.2956831" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.1109/ACCESS.2019.2956831">https://doi.org/10.1109/ACCESS.2019.2956831</a></p><p class="c-article-references__links u-hide-print"><a data-track="click_references" rel="nofollow noopener" data-track-label="10.1109/ACCESS.2019.2956831" data-track-item_id="10.1109/ACCESS.2019.2956831" data-track-value="article reference" data-track-action="article reference" href="https://doi.org/10.1109%2FACCESS.2019.2956831" aria-label="Article reference 12" data-doi="10.1109/ACCESS.2019.2956831">Article</a>  <a data-track="click_references" data-track-action="google scholar reference" data-track-value="google scholar reference" data-track-label="link" data-track-item_id="link" rel="nofollow noopener" aria-label="Google Scholar reference 12" href="http://scholar.google.com/scholar_lookup?&amp;title=A%20survey%20of%20event%20extraction%20from%20text&amp;journal=IEEE%20Access&amp;doi=10.1109%2FACCESS.2019.2956831&amp;volume=7&amp;pages=173111-173137&amp;publication_year=2019&amp;author=Xiang%2CW&amp;author=Wang%2CB"> Google Scholar</a>  </p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="13."><p class="c-article-references__text" id="ref-CR13">Li, Q., Li, J., Sheng, J., Cui, S., Wu, J., Hei, Y., Peng, H., Guo, S., Wang, L., Beheshti, A., Yu, P.S.: A survey on deep learning event extraction: approaches and applications. IEEE Trans. Neural Netw. Learn. Syst. <b>35</b>, 6301–6321 (2021)</p><p class="c-article-references__links u-hide-print"><a data-track="click_references" rel="nofollow noopener" data-track-label="10.1109/TNNLS.2022.3213168" data-track-item_id="10.1109/TNNLS.2022.3213168" data-track-value="article reference" data-track-action="article reference" href="https://doi.org/10.1109%2FTNNLS.2022.3213168" aria-label="Article reference 13" data-doi="10.1109/TNNLS.2022.3213168">Article</a>  <a data-track="click_references" data-track-action="google scholar reference" data-track-value="google scholar reference" data-track-label="link" data-track-item_id="link" rel="nofollow noopener" aria-label="Google Scholar reference 13" href="http://scholar.google.com/scholar_lookup?&amp;title=A%20survey%20on%20deep%20learning%20event%20extraction%3A%20approaches%20and%20applications&amp;journal=IEEE%20Trans.%20Neural%20Netw.%20Learn.%20Syst.&amp;doi=10.1109%2FTNNLS.2022.3213168&amp;volume=35&amp;pages=6301-6321&amp;publication_year=2021&amp;author=Li%2CQ&amp;author=Li%2CJ&amp;author=Sheng%2CJ&amp;author=Cui%2CS&amp;author=Wu%2CJ&amp;author=Hei%2CY&amp;author=Peng%2CH&amp;author=Guo%2CS&amp;author=Wang%2CL&amp;author=Beheshti%2CA&amp;author=Yu%2CPS"> Google Scholar</a>  </p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="14."><p class="c-article-references__text" id="ref-CR14">Chen, Y., Xu, L., Liu, K., Zeng, D., Zhao, J.: Event extraction via dynamic multi-pooling convolutional neural networks. In: Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pp. 167–176. Association for Computational Linguistics, Beijing, China (2015). 
<a href="https://doi.org/10.3115/v1/P15-1017" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.3115/v1/P15-1017">https://doi.org/10.3115/v1/P15-1017</a>. <a href="https://aclanthology.org/P15-1017" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="https://aclanthology.org/P15-1017">https://aclanthology.org/P15-1017</a></p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="15."><p class="c-article-references__text" id="ref-CR15">Nguyen, T.H., Cho, K., Grishman, R.: Joint event extraction via recurrent neural networks. In: Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 300–309. Association for Computational Linguistics, San Diego, California (2016). <a href="https://doi.org/10.18653/v1/N16-1034" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.18653/v1/N16-1034">https://doi.org/10.18653/v1/N16-1034</a>. <a href="https://aclanthology.org/N16-1034" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="https://aclanthology.org/N16-1034">https://aclanthology.org/N16-1034</a></p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="16."><p class="c-article-references__text" id="ref-CR16">Tian, C., Zhao, Y., Ren, L.: A Chinese event relation extraction model based on bert. In: 2019 2nd International Conference on Artificial Intelligence and Big Data (ICAIBD), pp. 271–276 (2019). IEEE</p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="17."><p class="c-article-references__text" id="ref-CR17">Radford, A., Wu, J., Child, R., Luan, D., Amodei, D., Sutskever, I., et al.: Language models are unsupervised multitask learners. OpenAI Blog <b>1</b>(8), 9 (2019)</p><p class="c-article-references__links u-hide-print"><a data-track="click_references" data-track-action="google scholar reference" data-track-value="google scholar reference" data-track-label="link" data-track-item_id="link" rel="nofollow noopener" aria-label="Google Scholar reference 17" href="http://scholar.google.com/scholar_lookup?&amp;title=Language%20models%20are%20unsupervised%20multitask%20learners&amp;journal=OpenAI%20Blog&amp;volume=1&amp;issue=8&amp;publication_year=2019&amp;author=Radford%2CA&amp;author=Wu%2CJ&amp;author=Child%2CR&amp;author=Luan%2CD&amp;author=Amodei%2CD&amp;author=Sutskever%2CI"> Google Scholar</a>  </p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="18."><p class="c-article-references__text" id="ref-CR18">Brown, T., Mann, B., Ryder, N., Subbiah, M., Kaplan, J.D., Dhariwal, P., Neelakantan, A., Shyam, P., Sastry, G., Askell, A., et al.: Language models are few-shot learners. Adv. Neural. Inf. Process. Syst. 
<b>33</b>, 1877–1901 (2020)</p><p class="c-article-references__links u-hide-print"><a data-track="click_references" data-track-action="google scholar reference" data-track-value="google scholar reference" data-track-label="link" data-track-item_id="link" rel="nofollow noopener" aria-label="Google Scholar reference 18" href="http://scholar.google.com/scholar_lookup?&amp;title=Language%20models%20are%20few-shot%20learners&amp;journal=Adv.%20Neural.%20Inf.%20Process.%20Syst.&amp;volume=33&amp;pages=1877-1901&amp;publication_year=2020&amp;author=Brown%2CT&amp;author=Mann%2CB&amp;author=Ryder%2CN&amp;author=Subbiah%2CM&amp;author=Kaplan%2CJD&amp;author=Dhariwal%2CP&amp;author=Neelakantan%2CA&amp;author=Shyam%2CP&amp;author=Sastry%2CG&amp;author=Askell%2CA"> Google Scholar</a>  </p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="19."><p class="c-article-references__text" id="ref-CR19">Raffel, C., Shazeer, N., Roberts, A., Lee, K., Narang, S., Matena, M., Zhou, Y., Li, W., Liu, P.J.: Exploring the limits of transfer learning with a unified text-to-text transformer. J. Mach. Learn. Res. <b>21</b>(1), 5485–5551 (2020)</p><p class="c-article-references__links u-hide-print"><a data-track="click_references" rel="nofollow noopener" data-track-label="link" data-track-item_id="link" data-track-value="mathscinet reference" data-track-action="mathscinet reference" href="http://www.ams.org/mathscinet-getitem?mr=4138124" aria-label="MathSciNet reference 19">MathSciNet</a>  <a data-track="click_references" data-track-action="google scholar reference" data-track-value="google scholar reference" data-track-label="link" data-track-item_id="link" rel="nofollow noopener" aria-label="Google Scholar reference 19" href="http://scholar.google.com/scholar_lookup?&amp;title=Exploring%20the%20limits%20of%20transfer%20learning%20with%20a%20unified%20text-to-text%20transformer&amp;journal=J.%20Mach.%20Learn.%20Res.&amp;volume=21&amp;issue=1&amp;pages=5485-5551&amp;publication_year=2020&amp;author=Raffel%2CC&amp;author=Shazeer%2CN&amp;author=Roberts%2CA&amp;author=Lee%2CK&amp;author=Narang%2CS&amp;author=Matena%2CM&amp;author=Zhou%2CY&amp;author=Li%2CW&amp;author=Liu%2CPJ"> Google Scholar</a>  </p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="20."><p class="c-article-references__text" id="ref-CR20">Lu, Y., Lin, H., Xu, J., Han, X., Tang, J., Li, A., Sun, L., Liao, M., Chen, S.: Text2event: controllable sequence-to-structure generation for end-to-end event extraction. arXiv preprint <a href="http://arxiv.org/abs/2106.09232" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="http://arxiv.org/abs/2106.09232">arXiv:2106.09232</a> (2021)</p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="21."><p class="c-article-references__text" id="ref-CR21">Wei, K., Sun, X., Zhang, Z., Zhang, J., Zhi, G., Jin, L.: Trigger is not sufficient: exploiting frame-aware knowledge for implicit event argument extraction. In: Zong, C., Xia, F., Li, W., Navigli, R. (eds.) Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pp. 4672–4682. Association for Computational Linguistics, Online (2021). 
<a href="https://doi.org/10.18653/v1/2021.acl-long.360" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.18653/v1/2021.acl-long.360">https://doi.org/10.18653/v1/2021.acl-long.360</a>. <a href="https://aclanthology.org/2021.acl-long.360" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="https://aclanthology.org/2021.acl-long.360">https://aclanthology.org/2021.acl-long.360</a></p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="22."><p class="c-article-references__text" id="ref-CR22">Wei, K., Sun, X., Zhang, Z., Jin, L., Zhang, J., Lv, J., Guo, Z.: Implicit event argument extraction with argument-argument relational knowledge. IEEE Trans. Knowl. Data Eng. <b>35</b>(9), 8865–8879 (2023). <a href="https://doi.org/10.1109/TKDE.2022.3218830" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.1109/TKDE.2022.3218830">https://doi.org/10.1109/TKDE.2022.3218830</a></p><p class="c-article-references__links u-hide-print"><a data-track="click_references" rel="nofollow noopener" data-track-label="10.1109/TKDE.2022.3218830" data-track-item_id="10.1109/TKDE.2022.3218830" data-track-value="article reference" data-track-action="article reference" href="https://doi.org/10.1109%2FTKDE.2022.3218830" aria-label="Article reference 22" data-doi="10.1109/TKDE.2022.3218830">Article</a>  <a data-track="click_references" data-track-action="google scholar reference" data-track-value="google scholar reference" data-track-label="link" data-track-item_id="link" rel="nofollow noopener" aria-label="Google Scholar reference 22" href="http://scholar.google.com/scholar_lookup?&amp;title=Implicit%20event%20argument%20extraction%20with%20argument-argument%20relational%20knowledge&amp;journal=IEEE%20Trans.%20Knowl.%20Data%20Eng.&amp;doi=10.1109%2FTKDE.2022.3218830&amp;volume=35&amp;issue=9&amp;pages=8865-8879&amp;publication_year=2023&amp;author=Wei%2CK&amp;author=Sun%2CX&amp;author=Zhang%2CZ&amp;author=Jin%2CL&amp;author=Zhang%2CJ&amp;author=Lv%2CJ&amp;author=Guo%2CZ"> Google Scholar</a>  </p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="23."><p class="c-article-references__text" id="ref-CR23">Ritter, A., Mausam, Etzioni, O., Clark, S.: Open domain event extraction from twitter. In: Proceedings of the 18th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. KDD ’12, pp. 1104–1112. Association for Computing Machinery, New York, NY, USA (2012). <a href="https://doi.org/10.1145/2339530.2339704" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.1145/2339530.2339704">https://doi.org/10.1145/2339530.2339704</a>. <a href="https://doi.org/10.1145/2339530.2339704" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="https://doi.org/10.1145/2339530.2339704">https://doi.org/10.1145/2339530.2339704</a></p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="24."><p class="c-article-references__text" id="ref-CR24">Mani, I., Wilson, G.: Robust temporal processing of news. In: Proceedings of the 38th Annual Meeting of the Association for Computational Linguistics, pp. 69–76. 
Association for Computational Linguistics, Hong Kong (2000). <a href="https://doi.org/10.3115/1075218.1075228" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.3115/1075218.1075228">https://doi.org/10.3115/1075218.1075228</a></p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="25."><p class="c-article-references__text" id="ref-CR25">Chen, Y., Liu, S., Zhang, X., Liu, K., Zhao, J.: Automatically labeled data generation for large scale event extraction. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 409–419. Association for Computational Linguistics, Vancouver, Canada (2017). <a href="https://doi.org/10.18653/v1/P17-1038" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.18653/v1/P17-1038">https://doi.org/10.18653/v1/P17-1038</a>. <a href="https://aclanthology.org/P17-1038" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="https://aclanthology.org/P17-1038">https://aclanthology.org/P17-1038</a></p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="26."><p class="c-article-references__text" id="ref-CR26">Wei, K., Yang, Y., Jin, L., Sun, X., Zhang, Z., Zhang, J., Li, X., Zhang, L., Liu, J., Zhi, G.: Guide the many-to-one assignment: Open information extraction via IoU-aware optimal transport. In: Rogers, A., Boyd-Graber, J., Okazaki, N. (eds.) Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp. 4971–4984. Association for Computational Linguistics, Toronto, Canada (2023). <a href="https://doi.org/10.18653/v1/2023.acl-long.272" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.18653/v1/2023.acl-long.272">https://doi.org/10.18653/v1/2023.acl-long.272</a>. <a href="https://aclanthology.org/2023.acl-long.272" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="https://aclanthology.org/2023.acl-long.272">https://aclanthology.org/2023.acl-long.272</a></p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="27."><p class="c-article-references__text" id="ref-CR27">Abebe, M.A., Tekli, J., Getahun, F., Chbeir, R., Tekli, G.: Generic metadata representation framework for social-based event detection, description, and linkage. Knowl.-Based Syst. <b>188</b>, 104817 (2020). 
<a href="https://doi.org/10.1016/j.knosys.2019.06.025" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.1016/j.knosys.2019.06.025">https://doi.org/10.1016/j.knosys.2019.06.025</a></p><p class="c-article-references__links u-hide-print"><a data-track="click_references" rel="nofollow noopener" data-track-label="10.1016/j.knosys.2019.06.025" data-track-item_id="10.1016/j.knosys.2019.06.025" data-track-value="article reference" data-track-action="article reference" href="https://doi.org/10.1016%2Fj.knosys.2019.06.025" aria-label="Article reference 27" data-doi="10.1016/j.knosys.2019.06.025">Article</a>  <a data-track="click_references" data-track-action="google scholar reference" data-track-value="google scholar reference" data-track-label="link" data-track-item_id="link" rel="nofollow noopener" aria-label="Google Scholar reference 27" href="http://scholar.google.com/scholar_lookup?&amp;title=Generic%20metadata%20representation%20framework%20for%20social-based%20event%20detection%2C%20description%2C%20and%20linkage&amp;journal=Knowl.-Based%20Syst.&amp;doi=10.1016%2Fj.knosys.2019.06.025&amp;volume=188&amp;publication_year=2020&amp;author=Abebe%2CMA&amp;author=Tekli%2CJ&amp;author=Getahun%2CF&amp;author=Chbeir%2CR&amp;author=Tekli%2CG"> Google Scholar</a>  </p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="28."><p class="c-article-references__text" id="ref-CR28">Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, Ł., Polosukhin, I.: Attention is all you need. Advances in neural information processing systems <b>30</b> (2017)</p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="29."><p class="c-article-references__text" id="ref-CR29">Chang, C.-H., Liao, Y.-C., Yeh, T.: Event source page discovery via policy-based rl with multi-task neural sequence model. In: International Conference on Web Information Systems Engineering, pp. 597–606 (2022). Springer</p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="30."><p class="c-article-references__text" id="ref-CR30">Lample, G., Ballesteros, M., Subramanian, S., Kawakami, K., Dyer, C.: Neural architectures for named entity recognition. In: Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 260–270. Association for Computational Linguistics, San Diego, California (2016). <a href="https://doi.org/10.18653/v1/N16-1030" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.18653/v1/N16-1030">https://doi.org/10.18653/v1/N16-1030</a>. 
<a href="https://aclanthology.org/N16-1030" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="https://aclanthology.org/N16-1030">https://aclanthology.org/N16-1030</a></p></li></ol><p class="c-article-references__download u-hide-print"><a data-track="click" data-track-action="download citation references" data-track-label="link" rel="nofollow" href="https://citation-needed.springer.com/v2/references/10.1007/s44196-024-00697-0?format=refman&amp;flavour=references">Download references<svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-download-medium"></use></svg></a></p></div></div></div></section></div><section data-title="Funding"><div class="c-article-section" id="Fun-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="Fun">Funding</h2><div class="c-article-section__content" id="Fun-content"><p>This work was supported by the National Science and Technology Council, Taiwan, under grant NSTC109-2221-E-008-060-MY3.</p></div></div></section><section aria-labelledby="author-information" data-title="Author information"><div class="c-article-section" id="author-information-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="author-information">Author information</h2><div class="c-article-section__content" id="author-information-content"><h3 class="c-article__sub-heading" id="affiliations">Authors and Affiliations</h3><ol class="c-article-author-affiliation__list"><li id="Aff1"><p class="c-article-author-affiliation__address">Department of Computer Science and Information Engineering, National Central University, No. 300, Zhongda Rd., Zhongli Dist., Taoyuan City, 320317, Taiwan</p><p class="c-article-author-affiliation__authors-list">Yuan-Hao Lin &amp; Chia-Hui Chang</p></li><li id="Aff2"><p class="c-article-author-affiliation__address">Information and Computer Engineering, Chung Yuan Christian University, No. 
200, Zhongbei Rd., Zhongli Dist., Taoyuan, 320314, Taiwan</p><p class="c-article-author-affiliation__authors-list">Hsiu-Min Chuang</p></li></ol><div class="u-js-hide u-hide-print" data-test="author-info"><span class="c-article__sub-heading">Authors</span><ol class="c-article-authors-search u-list-reset"><li id="auth-Yuan_Hao-Lin-Aff1"><span class="c-article-authors-search__title u-h3 js-search-name">Yuan-Hao Lin</span><div class="c-article-authors-search__list"><div class="c-article-authors-search__item c-article-authors-search__list-item--left"><a href="/search?dc.creator=Yuan-Hao%20Lin" class="c-article-button" data-track="click" data-track-action="author link - publication" data-track-label="link" rel="nofollow">View author publications</a></div><div class="c-article-authors-search__item c-article-authors-search__list-item--right"><p class="search-in-title-js c-article-authors-search__text">You can also search for this author in <span class="c-article-identifiers"><a class="c-article-identifiers__item" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=search&amp;term=Yuan-Hao%20Lin" data-track="click" data-track-action="author link - pubmed" data-track-label="link" rel="nofollow">PubMed</a><span class="u-hide"> </span><a class="c-article-identifiers__item" href="http://scholar.google.co.uk/scholar?as_q=&amp;num=10&amp;btnG=Search+Scholar&amp;as_epq=&amp;as_oq=&amp;as_eq=&amp;as_occt=any&amp;as_sauthors=%22Yuan-Hao%20Lin%22&amp;as_publication=&amp;as_ylo=&amp;as_yhi=&amp;as_allsubj=all&amp;hl=en" data-track="click" data-track-action="author link - scholar" data-track-label="link" rel="nofollow">Google Scholar</a></span></p></div></div></li><li id="auth-Chia_Hui-Chang-Aff1"><span class="c-article-authors-search__title u-h3 js-search-name">Chia-Hui Chang</span><div class="c-article-authors-search__list"><div class="c-article-authors-search__item c-article-authors-search__list-item--left"><a href="/search?dc.creator=Chia-Hui%20Chang" class="c-article-button" data-track="click" data-track-action="author link - publication" data-track-label="link" rel="nofollow">View author publications</a></div><div class="c-article-authors-search__item c-article-authors-search__list-item--right"><p class="search-in-title-js c-article-authors-search__text">You can also search for this author in <span class="c-article-identifiers"><a class="c-article-identifiers__item" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=search&amp;term=Chia-Hui%20Chang" data-track="click" data-track-action="author link - pubmed" data-track-label="link" rel="nofollow">PubMed</a><span class="u-hide"> </span><a class="c-article-identifiers__item" href="http://scholar.google.co.uk/scholar?as_q=&amp;num=10&amp;btnG=Search+Scholar&amp;as_epq=&amp;as_oq=&amp;as_eq=&amp;as_occt=any&amp;as_sauthors=%22Chia-Hui%20Chang%22&amp;as_publication=&amp;as_ylo=&amp;as_yhi=&amp;as_allsubj=all&amp;hl=en" data-track="click" data-track-action="author link - scholar" data-track-label="link" rel="nofollow">Google Scholar</a></span></p></div></div></li><li id="auth-Hsiu_Min-Chuang-Aff2"><span class="c-article-authors-search__title u-h3 js-search-name">Hsiu-Min Chuang</span><div class="c-article-authors-search__list"><div class="c-article-authors-search__item c-article-authors-search__list-item--left"><a href="/search?dc.creator=Hsiu-Min%20Chuang" class="c-article-button" data-track="click" data-track-action="author link - publication" data-track-label="link" rel="nofollow">View author publications</a></div><div 
class="c-article-authors-search__item c-article-authors-search__list-item--right"><p class="search-in-title-js c-article-authors-search__text">You can also search for this author in <span class="c-article-identifiers"><a class="c-article-identifiers__item" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=search&amp;term=Hsiu-Min%20Chuang" data-track="click" data-track-action="author link - pubmed" data-track-label="link" rel="nofollow">PubMed</a><span class="u-hide"> </span><a class="c-article-identifiers__item" href="http://scholar.google.co.uk/scholar?as_q=&amp;num=10&amp;btnG=Search+Scholar&amp;as_epq=&amp;as_oq=&amp;as_eq=&amp;as_occt=any&amp;as_sauthors=%22Hsiu-Min%20Chuang%22&amp;as_publication=&amp;as_ylo=&amp;as_yhi=&amp;as_allsubj=all&amp;hl=en" data-track="click" data-track-action="author link - scholar" data-track-label="link" rel="nofollow">Google Scholar</a></span></p></div></div></li></ol></div><h3 class="c-article__sub-heading" id="contributions">Contributions</h3><p>Y.-H. Lin conducted methodology, model training, programming, and writing. C.-H. Chang conducted review and editing and project administration. H.-M. Chuang conducted writing and reviewed. All authors have read and agreed to submit the manuscript.</p><h3 class="c-article__sub-heading" id="corresponding-author">Corresponding author</h3><p id="corresponding-author-list">Correspondence to <a id="corresp-c1" href="mailto:chiahui@g.ncu.edu.tw">Chia-Hui Chang</a>.</p></div></div></section><section data-title="Ethics declarations"><div class="c-article-section" id="ethics-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="ethics">Ethics declarations</h2><div class="c-article-section__content" id="ethics-content"> <h3 class="c-article__sub-heading" id="FPar1">Conflict of interest</h3> <p>All authors have no Conflict of interest to disclose regarding the publication of this study.</p> <h3 class="c-article__sub-heading" id="FPar2">Ethical and Informed Consent for Data Used</h3> <p>Our study adheres to ethical principles and has obtained appropriate informed consent for using the data involved. All data collection and processing procedures comply with international and local laws and regulations, as well as relevant research ethical standards. All personal data involved in the research process has undergone anonymization to safeguard the privacy and data security of participants. 
We are committed to ensuring transparency and legality in data usage, while also respecting the rights and interests of participants.</p> </div></div></section><section data-title="Additional information"><div class="c-article-section" id="additional-information-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="additional-information">Additional information</h2><div class="c-article-section__content" id="additional-information-content"><h3 class="c-article__sub-heading">Publisher's Note</h3><p>Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.</p></div></div></section><section aria-labelledby="appendices"><div class="c-article-section" id="appendices-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="appendices">Prompts Used for GPT-3.5-turbo and GPT-4-turbo</h2><div class="c-article-section__content" id="appendices-content"><h3 class="c-article__sub-heading u-visually-hidden" id="App1">Prompts Used for GPT-3.5-turbo and GPT-4-turbo</h3><div class="c-article-section__figure c-article-section__figure--no-border" data-test="figure" data-container-section="figure" id="figure-c"><figure><div class="c-article-section__figure-content" id="Figc"><div class="c-article-section__figure-item"><div class="c-article-section__figure-content"><picture><img aria-describedby="Figc" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00697-0/MediaObjects/44196_2024_697_Figc_HTML.png" alt="figure c" loading="lazy" width="685" height="127"></picture></div></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-c-desc"></div></div></figure></div><div class="c-article-section__figure c-article-section__figure--no-border" data-test="figure" data-container-section="figure" id="figure-d"><figure><div class="c-article-section__figure-content" id="Figd"><div class="c-article-section__figure-item"><div class="c-article-section__figure-content"><picture><img aria-describedby="Figd" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00697-0/MediaObjects/44196_2024_697_Figd_HTML.png" alt="figure d" loading="lazy" width="685" height="127"></picture></div></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-d-desc"></div></div></figure></div></div></div></section><section data-title="Rights and permissions"><div class="c-article-section" id="rightslink-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="rightslink">Rights and permissions</h2><div class="c-article-section__content" id="rightslink-content"> <p><b>Open Access</b> This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. 
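The prompt text itself is available only as the images above, so it is not reproduced here. As a rough sketch of how a comparable prompt could be issued to GPT-4-turbo through the OpenAI chat-completions client, the snippet below asks for the same event fields (title, venue, date) that the paper evaluates; the instruction wording, the example post, and the JSON output format are illustrative assumptions, not the prompts shown in Figures c and d.

```python
# Illustrative sketch only: the instruction string, example post, and output
# schema below are assumptions for demonstration, not the paper's prompts.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

post = "Join us for the 2024 Taipei AI Meetup on March 9 at the NTU GIS Convention Center!"

# Hypothetical instruction requesting the event fields extracted in the paper.
instruction = (
    "Extract the meetup event fields from the following post. "
    "Return a JSON object with the keys 'title', 'venue', and 'date'. "
    "Use null for any field that is not mentioned.\n\n"
    f"Post: {post}"
)

response = client.chat.completions.create(
    model="gpt-4-turbo",
    temperature=0,  # deterministic output simplifies downstream parsing
    messages=[{"role": "user", "content": instruction}],
)

# e.g. {"title": "2024 Taipei AI Meetup", "venue": "NTU GIS Convention Center", "date": "March 9"}
print(response.choices[0].message.content)
```

Keeping the temperature at 0 makes the response deterministic, which simplifies parsing the returned JSON when comparing model output against gold event fields.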
Rights and permissions

Open Access. This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

About this article
Cite this article

Lin, YH., Chang, CH. & Chuang, HM. Fine-Grained Meetup Events Extraction Through Context-Aware Event Argument Positioning and Recognition. Int J Comput Intell Syst 17, 296 (2024). https://doi.org/10.1007/s44196-024-00697-0

Received: 11 June 2024
Accepted: 04 November 2024
Published: 26 November 2024
DOI: https://doi.org/10.1007/s44196-024-00697-0

Keywords: Meetup event extraction, Context-aware event extraction, Event argument positioning, Event argument recognition
data-gpt-targeting="pos=MPU1;articleid=s44196-024-00697-0;"> </div> </div> </div> </div> </div> <div class="c-reading-companion__panel c-reading-companion__figures c-reading-companion__panel--full-width" id="tabpanel-figures"></div> <div class="c-reading-companion__panel c-reading-companion__references c-reading-companion__panel--full-width" id="tabpanel-references"></div> </div> </div> </aside> </div> </div> </article> <div class="app-elements"> <div class="eds-c-header__expander eds-c-header__expander--search" id="eds-c-header-popup-search"> <h2 class="eds-c-header__heading">Search</h2> <div class="u-container"> <search class="eds-c-header__search" role="search" aria-label="Search from the header"> <form method="GET" action="//link.springer.com/search" data-test="header-search" data-track="search" data-track-context="search from header" data-track-action="submit search form" data-track-category="unified header" data-track-label="form" > <label for="eds-c-header-search" class="eds-c-header__search-label">Search by keyword or author</label> <div class="eds-c-header__search-container"> <input id="eds-c-header-search" class="eds-c-header__search-input" autocomplete="off" name="query" type="search" value="" required> <button class="eds-c-header__search-button" type="submit"> <svg class="eds-c-header__icon" aria-hidden="true" focusable="false"> <use xlink:href="#icon-eds-i-search-medium"></use> </svg> <span class="u-visually-hidden">Search</span> </button> </div> </form> </search> </div> </div> <div class="eds-c-header__expander eds-c-header__expander--menu" id="eds-c-header-nav"> <h2 class="eds-c-header__heading">Navigation</h2> <ul class="eds-c-header__list"> <li class="eds-c-header__list-item"> <a class="eds-c-header__link" href="https://link.springer.com/journals/" data-track="nav_find_a_journal" data-track-context="unified header" data-track-action="click find a journal" data-track-category="unified header" data-track-label="link" > Find a journal </a> </li> <li class="eds-c-header__list-item"> <a class="eds-c-header__link" href="https://www.springernature.com/gp/authors" data-track="nav_how_to_publish" data-track-context="unified header" data-track-action="click publish with us link" data-track-category="unified header" data-track-label="link" > Publish with us </a> </li> <li class="eds-c-header__list-item"> <a class="eds-c-header__link" href="https://link.springernature.com/home/" data-track="nav_track_your_research" data-track-context="unified header" data-track-action="click track your research" data-track-category="unified header" data-track-label="link" > Track your research </a> </li> </ul> </div> <footer > <div class="eds-c-footer" > <div class="eds-c-footer__container"> <div class="eds-c-footer__grid eds-c-footer__group--separator"> <div class="eds-c-footer__group"> <h3 class="eds-c-footer__heading">Discover content</h3> <ul class="eds-c-footer__list"> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://link.springer.com/journals/a/1" data-track="nav_journals_a_z" data-track-action="journals a-z" data-track-context="unified footer" data-track-label="link">Journals A-Z</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://link.springer.com/books/a/1" data-track="nav_books_a_z" data-track-action="books a-z" data-track-context="unified footer" data-track-label="link">Books A-Z</a></li> </ul> </div> <div class="eds-c-footer__group"> <h3 class="eds-c-footer__heading">Publish with us</h3> <ul class="eds-c-footer__list"> <li 
class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://link.springer.com/journals" data-track="nav_journal_finder" data-track-action="journal finder" data-track-context="unified footer" data-track-label="link">Journal finder</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springernature.com/gp/authors" data-track="nav_publish_your_research" data-track-action="publish your research" data-track-context="unified footer" data-track-label="link">Publish your research</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springernature.com/gp/open-research/about/the-fundamentals-of-open-access-and-open-research" data-track="nav_open_access_publishing" data-track-action="open access publishing" data-track-context="unified footer" data-track-label="link">Open access publishing</a></li> </ul> </div> <div class="eds-c-footer__group"> <h3 class="eds-c-footer__heading">Products and services</h3> <ul class="eds-c-footer__list"> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springernature.com/gp/products" data-track="nav_our_products" data-track-action="our products" data-track-context="unified footer" data-track-label="link">Our products</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springernature.com/gp/librarians" data-track="nav_librarians" data-track-action="librarians" data-track-context="unified footer" data-track-label="link">Librarians</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springernature.com/gp/societies" data-track="nav_societies" data-track-action="societies" data-track-context="unified footer" data-track-label="link">Societies</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springernature.com/gp/partners" data-track="nav_partners_and_advertisers" data-track-action="partners and advertisers" data-track-context="unified footer" data-track-label="link">Partners and advertisers</a></li> </ul> </div> <div class="eds-c-footer__group"> <h3 class="eds-c-footer__heading">Our imprints</h3> <ul class="eds-c-footer__list"> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springer.com/" data-track="nav_imprint_Springer" data-track-action="Springer" data-track-context="unified footer" data-track-label="link">Springer</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.nature.com/" data-track="nav_imprint_Nature_Portfolio" data-track-action="Nature Portfolio" data-track-context="unified footer" data-track-label="link">Nature Portfolio</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.biomedcentral.com/" data-track="nav_imprint_BMC" data-track-action="BMC" data-track-context="unified footer" data-track-label="link">BMC</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.palgrave.com/" data-track="nav_imprint_Palgrave_Macmillan" data-track-action="Palgrave Macmillan" data-track-context="unified footer" data-track-label="link">Palgrave Macmillan</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.apress.com/" data-track="nav_imprint_Apress" data-track-action="Apress" data-track-context="unified footer" data-track-label="link">Apress</a></li> </ul> </div> </div> </div> <div class="eds-c-footer__container"> <nav aria-label="footer navigation"> <ul class="eds-c-footer__links"> <li 
class="eds-c-footer__item"> <button class="eds-c-footer__link" data-cc-action="preferences" data-track="dialog_manage_cookies" data-track-action="Manage cookies" data-track-context="unified footer" data-track-label="link"><span class="eds-c-footer__button-text">Your privacy choices/Manage cookies</span></button> </li> <li class="eds-c-footer__item"> <a class="eds-c-footer__link" href="https://www.springernature.com/gp/legal/ccpa" data-track="nav_california_privacy_statement" data-track-action="california privacy statement" data-track-context="unified footer" data-track-label="link">Your US state privacy rights</a> </li> <li class="eds-c-footer__item"> <a class="eds-c-footer__link" href="https://www.springernature.com/gp/info/accessibility" data-track="nav_accessibility_statement" data-track-action="accessibility statement" data-track-context="unified footer" data-track-label="link">Accessibility statement</a> </li> <li class="eds-c-footer__item"> <a class="eds-c-footer__link" href="https://link.springer.com/termsandconditions" data-track="nav_terms_and_conditions" data-track-action="terms and conditions" data-track-context="unified footer" data-track-label="link">Terms and conditions</a> </li> <li class="eds-c-footer__item"> <a class="eds-c-footer__link" href="https://link.springer.com/privacystatement" data-track="nav_privacy_policy" data-track-action="privacy policy" data-track-context="unified footer" data-track-label="link">Privacy policy</a> </li> <li class="eds-c-footer__item"> <a class="eds-c-footer__link" href="https://support.springernature.com/en/support/home" data-track="nav_help_and_support" data-track-action="help and support" data-track-context="unified footer" data-track-label="link">Help and support</a> </li> <li class="eds-c-footer__item"> <a class="eds-c-footer__link" href="https://support.springernature.com/en/support/solutions/articles/6000255911-subscription-cancellations" data-track-action="cancel contracts here">Cancel contracts here</a> </li> </ul> </nav> <div class="eds-c-footer__user"> <p class="eds-c-footer__user-info"> <span data-test="footer-user-ip">8.222.208.146</span> </p> <p class="eds-c-footer__user-info" data-test="footer-business-partners">Not affiliated</p> </div> <a href="https://www.springernature.com/" class="eds-c-footer__link"> <img src="/oscar-static/images/logo-springernature-white-19dd4ba190.svg" alt="Springer Nature" loading="lazy" width="200" height="20"/> </a> <p class="eds-c-footer__legal" data-test="copyright">&copy; 2024 Springer Nature</p> </div> </div> </footer> </div> </body> </html>

Pages: 1 2 3 4 5 6 7 8 9 10