Intelligent Vehicle Violation Detection System Under Human–Computer Interaction and Computer Vision

Yang Ren, MSCS, City University of Seattle, Seattle, USA

International Journal of Computational Intelligence Systems, Volume 17, Issue 1, pp. 1–14. Published 26 February 2024. DOI: 10.1007/s44196-024-00427-6. Open access, © 2024 The Author(s).

Keywords: vehicle violation detection system; computer vision; human–computer interaction; Kalman filtering; mean filtering
Abstract

In view of the current problems of low detection accuracy, poor stability, and slow detection speed in intelligent vehicle violation detection systems, this article applies human–computer interaction and computer vision technology to address them. First, the image data required for the experiments are collected from the BIT-Vehicle dataset and preprocessed using computer vision techniques. Then, Kalman filtering is used to track vehicles, helping to better predict vehicle trajectories within the area to be monitored. Finally, human–computer interaction technology is used to build the system's interactive interface and improve its operability. The violation detection system based on computer vision achieves an accuracy above 96.86% on each of the eight extracted violation types, with an average detection accuracy of 98%. Through computer vision, the system can accurately detect and identify vehicle violations in real time, effectively improving the efficiency and safety of traffic management. The system also pays special attention to human–computer interaction design, providing an intuitive, easy-to-use interface that enables traffic managers to monitor and manage traffic conditions with ease. This innovative intelligent vehicle violation detection system is expected to support the development of traffic management technology in the future.
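To make the preprocessing step concrete: the keywords name mean filtering, and histogram equalization is among the enhancement techniques the paper draws on. Below is a minimal OpenCV sketch of such a pipeline; the file names and the 5×5 kernel size are illustrative assumptions, not the authors' implementation.

    import cv2

    # Assumed preprocessing sketch: mean (box) filtering to suppress noise,
    # then histogram equalization to normalize contrast before detection.
    frame = cv2.imread("vehicle_frame.jpg")          # hypothetical input image
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # work on intensity only
    denoised = cv2.blur(gray, (5, 5))                # 5x5 mean (box) filter
    enhanced = cv2.equalizeHist(denoised)            # spread the intensity histogram
    cv2.imwrite("vehicle_frame_preprocessed.jpg", enhanced)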
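For the tracking step, the abstract describes Kalman filtering to predict a vehicle's trajectory within the monitored area. The sketch below uses OpenCV's cv2.KalmanFilter with a constant-velocity model over a bounding-box centroid; the state layout, time step, and noise covariances are assumptions for illustration rather than the parameters used in the paper.

    import numpy as np
    import cv2

    # Constant-velocity Kalman filter: state [x, y, vx, vy], measurement [x, y].
    kf = cv2.KalmanFilter(4, 2)
    dt = 1.0  # assume one time step per video frame
    kf.transitionMatrix = np.array([[1, 0, dt, 0],
                                    [0, 1, 0, dt],
                                    [0, 0, 1,  0],
                                    [0, 0, 0,  1]], dtype=np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], dtype=np.float32)
    kf.processNoiseCov = 1e-2 * np.eye(4, dtype=np.float32)
    kf.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)

    detections = [(120.0, 80.0), (124.0, 83.0), (129.0, 85.0)]  # toy centroids
    for cx, cy in detections:
        predicted = kf.predict()  # a-priori estimate of the vehicle's position
        kf.correct(np.array([[cx], [cy]], dtype=np.float32))  # fuse the detection
        print("predicted:", predicted[:2].ravel(), "measured:", (cx, cy))

The predicted state keeps a trajectory continuous through missed or noisy detections, which is what lets the system anticipate where a vehicle will be within the region under surveillance.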
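Finally, the abstract emphasizes an intuitive interface for traffic managers. As a rough illustration only, here is a minimal tkinter monitoring window; the fields (time, plate, violation type) and the sample rows are hypothetical, and the paper's actual interface is not described on this page.

    import tkinter as tk
    from tkinter import ttk

    # Hypothetical monitoring window: a table of detected violations and an
    # acknowledge button. Not the paper's UI; a sketch of the idea only.
    root = tk.Tk()
    root.title("Violation monitor (sketch)")

    columns = ("time", "plate", "violation")
    table = ttk.Treeview(root, columns=columns, show="headings", height=8)
    for col, label in zip(columns, ("Time", "Plate", "Violation type")):
        table.heading(col, text=label)
    table.pack(fill="both", expand=True, padx=8, pady=8)

    # In the real system these rows would stream in from the detection pipeline.
    table.insert("", "end", values=("12:01:05", "A12345", "red-light running"))
    table.insert("", "end", values=("12:02:41", "B67890", "illegal lane change"))

    def acknowledge():
        for item in table.selection():
            table.delete(item)

    tk.Button(root, text="Acknowledge", command=acknowledge).pack(pady=(0, 8))
    root.mainloop()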
</script> <script class="js-entry"> if (window.config.mustardcut) { (function(w, d) { window.Component = {}; window.suppressShareButton = false; window.onArticlePage = true; var currentScript = d.currentScript || d.head.querySelector('script.js-entry'); function catchNoModuleSupport() { var scriptEl = d.createElement('script'); return (!('noModule' in scriptEl) && 'onbeforeload' in scriptEl) } var headScripts = [ {'src': '/oscar-static/js/polyfill-es5-bundle-572d4fec60.js', 'async': false} ]; var bodyScripts = [ {'src': '/oscar-static/js/global-article-es5-bundle-dad1690b0d.js', 'async': false, 'module': false}, {'src': '/oscar-static/js/global-article-es6-bundle-e7d03c4cb3.js', 'async': false, 'module': true} ]; function createScript(script) { var scriptEl = d.createElement('script'); scriptEl.src = script.src; scriptEl.async = script.async; if (script.module === true) { scriptEl.type = "module"; if (catchNoModuleSupport()) { scriptEl.src = ''; } } else if (script.module === false) { scriptEl.setAttribute('nomodule', true) } if (script.charset) { scriptEl.setAttribute('charset', script.charset); } return scriptEl; } for (var i = 0; i < headScripts.length; ++i) { var scriptEl = createScript(headScripts[i]); currentScript.parentNode.insertBefore(scriptEl, currentScript.nextSibling); } d.addEventListener('DOMContentLoaded', function() { for (var i = 0; i < bodyScripts.length; ++i) { var scriptEl = createScript(bodyScripts[i]); d.body.appendChild(scriptEl); } }); // Webfont repeat view var config = w.config; if (config && config.publisherBrand && sessionStorage.fontsLoaded === 'true') { d.documentElement.className += ' webfonts-loaded'; } })(window, document); } </script> <script data-src="https://cdn.optimizely.com/js/27195530232.js" data-cc-script="C03"></script> <script data-test="gtm-head"> window.initGTM = function() { if (window.config.mustardcut) { (function (w, d, s, l, i) { w[l] = w[l] || []; w[l].push({'gtm.start': new Date().getTime(), event: 'gtm.js'}); var f = d.getElementsByTagName(s)[0], j = d.createElement(s), dl = l != 'dataLayer' ? 
style="display:none;visibility:hidden"></iframe> </noscript> <!-- End Google Tag Manager (noscript) --> <!-- Google Tag Manager (noscript) --> <noscript data-test="gtm-body"> <iframe src="https://www.googletagmanager.com/ns.html?id=GTM-MRVXSHQ" height="0" width="0" style="display:none;visibility:hidden"></iframe> </noscript> <!-- End Google Tag Manager (noscript) --> <div class="u-visually-hidden" aria-hidden="true" data-test="darwin-icons"> <?xml version="1.0" encoding="UTF-8"?><!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"><svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"><symbol id="icon-eds-i-accesses-medium" viewBox="0 0 24 24"><path d="M15.59 1a1 1 0 0 1 .706.291l5.41 5.385a1 1 0 0 1 .294.709v13.077c0 .674-.269 1.32-.747 1.796a2.549 2.549 0 0 1-1.798.742H15a1 1 0 0 1 0-2h4.455a.549.549 0 0 0 .387-.16.535.535 0 0 0 .158-.378V7.8L15.178 3H5.545a.543.543 0 0 0-.538.451L5 3.538v8.607a1 1 0 0 1-2 0V3.538A2.542 2.542 0 0 1 5.545 1h10.046ZM8 13c2.052 0 4.66 1.61 6.36 3.4l.124.141c.333.41.516.925.516 1.459 0 .6-.232 1.178-.64 1.599C12.666 21.388 10.054 23 8 23c-2.052 0-4.66-1.61-6.353-3.393A2.31 2.31 0 0 1 1 18c0-.6.232-1.178.64-1.6C3.34 14.61 5.948 13 8 13Zm0 2c-1.369 0-3.552 1.348-4.917 2.785A.31.31 0 0 0 3 18c0 .083.031.161.09.222C4.447 19.652 6.631 21 8 21c1.37 0 3.556-1.35 4.917-2.785A.31.31 0 0 0 13 18a.32.32 0 0 0-.048-.17l-.042-.052C11.553 16.348 9.369 15 8 15Zm0 1a2 2 0 1 1 0 4 2 2 0 0 1 0-4Z"/></symbol><symbol id="icon-eds-i-altmetric-medium" viewBox="0 0 24 24"><path d="M12 1c5.978 0 10.843 4.77 10.996 10.712l.004.306-.002.022-.002.248C22.843 18.23 17.978 23 12 23 5.925 23 1 18.075 1 12S5.925 1 12 1Zm-1.726 9.246L8.848 12.53a1 1 0 0 1-.718.461L8.003 13l-4.947.014a9.001 9.001 0 0 0 17.887-.001L16.553 13l-2.205 3.53a1 1 0 0 1-1.735-.068l-.05-.11-2.289-6.106ZM12 3a9.001 9.001 0 0 0-8.947 8.013l4.391-.012L9.652 7.47a1 1 0 0 1 1.784.179l2.288 6.104 1.428-2.283a1 1 0 0 1 .722-.462l.129-.008 4.943.012A9.001 9.001 0 0 0 12 3Z"/></symbol><symbol id="icon-eds-i-arrow-bend-down-medium" viewBox="0 0 24 24"><path d="m11.852 20.989.058.007L12 21l.075-.003.126-.017.111-.03.111-.044.098-.052.104-.074.082-.073 6-6a1 1 0 0 0-1.414-1.414L13 17.585v-12.2C13 4.075 11.964 3 10.667 3H4a1 1 0 1 0 0 2h6.667c.175 0 .333.164.333.385v12.2l-4.293-4.292a1 1 0 0 0-1.32-.083l-.094.083a1 1 0 0 0 0 1.414l6 6c.035.036.073.068.112.097l.11.071.114.054.105.035.118.025Z"/></symbol><symbol id="icon-eds-i-arrow-bend-down-small" viewBox="0 0 16 16"><path d="M1 2a1 1 0 0 0 1 1h5v8.585L3.707 8.293a1 1 0 0 0-1.32-.083l-.094.083a1 1 0 0 0 0 1.414l5 5 .063.059.093.069.081.048.105.048.104.035.105.022.096.01h.136l.122-.018.113-.03.103-.04.1-.053.102-.07.052-.043 5.04-5.037a1 1 0 1 0-1.415-1.414L9 11.583V3a2 2 0 0 0-2-2H2a1 1 0 0 0-1 1Z"/></symbol><symbol id="icon-eds-i-arrow-bend-up-medium" viewBox="0 0 24 24"><path d="m11.852 3.011.058-.007L12 3l.075.003.126.017.111.03.111.044.098.052.104.074.082.073 6 6a1 1 0 1 1-1.414 1.414L13 6.415v12.2C13 19.925 11.964 21 10.667 21H4a1 1 0 0 1 0-2h6.667c.175 0 .333-.164.333-.385v-12.2l-4.293 4.292a1 1 0 0 1-1.32.083l-.094-.083a1 1 0 0 1 0-1.414l6-6c.035-.036.073-.068.112-.097l.11-.071.114-.054.105-.035.118-.025Z"/></symbol><symbol id="icon-eds-i-arrow-bend-up-small" viewBox="0 0 16 16"><path d="M1 13.998a1 1 0 0 1 1-1h5V4.413L3.707 7.705a1 1 0 0 1-1.32.084l-.094-.084a1 1 0 0 1 0-1.414l5-5 .063-.059.093-.068.081-.05.105-.047.104-.035.105-.022L7.94 1l.136.001.122.017.113.03.103.04.1.053.102.07.052.043 5.04 
5.037a1 1 0 1 1-1.415 1.414L9 4.415v8.583a2 2 0 0 1-2 2H2a1 1 0 0 1-1-1Z"/></symbol><symbol id="icon-eds-i-arrow-diagonal-medium" viewBox="0 0 24 24"><path d="M14 3h6l.075.003.126.017.111.03.111.044.098.052.096.067.09.08c.036.035.068.073.097.112l.071.11.054.114.035.105.03.148L21 4v6a1 1 0 0 1-2 0V6.414l-4.293 4.293a1 1 0 0 1-1.414-1.414L17.584 5H14a1 1 0 0 1-.993-.883L13 4a1 1 0 0 1 1-1ZM4 13a1 1 0 0 1 1 1v3.584l4.293-4.291a1 1 0 1 1 1.414 1.414L6.414 19H10a1 1 0 0 1 .993.883L11 20a1 1 0 0 1-1 1l-6.075-.003-.126-.017-.111-.03-.111-.044-.098-.052-.096-.067-.09-.08a1.01 1.01 0 0 1-.097-.112l-.071-.11-.054-.114-.035-.105-.025-.118-.007-.058L3 20v-6a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-arrow-diagonal-small" viewBox="0 0 16 16"><path d="m2 15-.082-.004-.119-.016-.111-.03-.111-.044-.098-.052-.096-.067-.09-.08a1.008 1.008 0 0 1-.097-.112l-.071-.11-.031-.062-.034-.081-.024-.076-.025-.118-.007-.058L1 14.02V9a1 1 0 1 1 2 0v2.584l2.793-2.791a1 1 0 1 1 1.414 1.414L4.414 13H7a1 1 0 0 1 .993.883L8 14a1 1 0 0 1-1 1H2ZM14 1l.081.003.12.017.111.03.111.044.098.052.096.067.09.08c.036.035.068.073.097.112l.071.11.031.062.034.081.024.076.03.148L15 2v5a1 1 0 0 1-2 0V4.414l-2.96 2.96A1 1 0 1 1 8.626 5.96L11.584 3H9a1 1 0 0 1-.993-.883L8 2a1 1 0 0 1 1-1h5Z"/></symbol><symbol id="icon-eds-i-arrow-down-medium" viewBox="0 0 24 24"><path d="m20.707 12.728-7.99 7.98a.996.996 0 0 1-.561.281l-.157.011a.998.998 0 0 1-.788-.384l-7.918-7.908a1 1 0 0 1 1.414-1.416L11 17.576V4a1 1 0 0 1 2 0v13.598l6.293-6.285a1 1 0 0 1 1.32-.082l.095.083a1 1 0 0 1-.001 1.414Z"/></symbol><symbol id="icon-eds-i-arrow-down-small" viewBox="0 0 16 16"><path d="m1.293 8.707 6 6 .063.059.093.069.081.048.105.049.104.034.056.013.118.017L8 15l.076-.003.122-.017.113-.03.085-.032.063-.03.098-.058.06-.043.05-.043 6.04-6.037a1 1 0 0 0-1.414-1.414L9 11.583V2a1 1 0 1 0-2 0v9.585L2.707 7.293a1 1 0 0 0-1.32-.083l-.094.083a1 1 0 0 0 0 1.414Z"/></symbol><symbol id="icon-eds-i-arrow-left-medium" viewBox="0 0 24 24"><path d="m11.272 3.293-7.98 7.99a.996.996 0 0 0-.281.561L3 12.001c0 .32.15.605.384.788l7.908 7.918a1 1 0 0 0 1.416-1.414L6.424 13H20a1 1 0 0 0 0-2H6.402l6.285-6.293a1 1 0 0 0 .082-1.32l-.083-.095a1 1 0 0 0-1.414.001Z"/></symbol><symbol id="icon-eds-i-arrow-left-small" viewBox="0 0 16 16"><path d="m7.293 1.293-6 6-.059.063-.069.093-.048.081-.049.105-.034.104-.013.056-.017.118L1 8l.003.076.017.122.03.113.032.085.03.063.058.098.043.06.043.05 6.037 6.04a1 1 0 0 0 1.414-1.414L4.417 9H14a1 1 0 0 0 0-2H4.415l4.292-4.293a1 1 0 0 0 .083-1.32l-.083-.094a1 1 0 0 0-1.414 0Z"/></symbol><symbol id="icon-eds-i-arrow-right-medium" viewBox="0 0 24 24"><path d="m12.728 3.293 7.98 7.99a.996.996 0 0 1 .281.561l.011.157c0 .32-.15.605-.384.788l-7.908 7.918a1 1 0 0 1-1.416-1.414L17.576 13H4a1 1 0 0 1 0-2h13.598l-6.285-6.293a1 1 0 0 1-.082-1.32l.083-.095a1 1 0 0 1 1.414.001Z"/></symbol><symbol id="icon-eds-i-arrow-right-small" viewBox="0 0 16 16"><path d="m8.707 1.293 6 6 .059.063.069.093.048.081.049.105.034.104.013.056.017.118L15 8l-.003.076-.017.122-.03.113-.032.085-.03.063-.058.098-.043.06-.043.05-6.037 6.04a1 1 0 0 1-1.414-1.414L11.583 9H2a1 1 0 1 1 0-2h9.585L7.293 2.707a1 1 0 0 1-.083-1.32l.083-.094a1 1 0 0 1 1.414 0Z"/></symbol><symbol id="icon-eds-i-arrow-up-medium" viewBox="0 0 24 24"><path d="m3.293 11.272 7.99-7.98a.996.996 0 0 1 .561-.281L12.001 3c.32 0 .605.15.788.384l7.918 7.908a1 1 0 0 1-1.414 1.416L13 6.424V20a1 1 0 0 1-2 0V6.402l-6.293 6.285a1 1 0 0 1-1.32.082l-.095-.083a1 1 0 0 1 .001-1.414Z"/></symbol><symbol id="icon-eds-i-arrow-up-small" 
viewBox="0 0 16 16"><path d="m1.293 7.293 6-6 .063-.059.093-.069.081-.048.105-.049.104-.034.056-.013.118-.017L8 1l.076.003.122.017.113.03.085.032.063.03.098.058.06.043.05.043 6.04 6.037a1 1 0 0 1-1.414 1.414L9 4.417V14a1 1 0 0 1-2 0V4.415L2.707 8.707a1 1 0 0 1-1.32.083l-.094-.083a1 1 0 0 1 0-1.414Z"/></symbol><symbol id="icon-eds-i-article-medium" viewBox="0 0 24 24"><path d="M8 7a1 1 0 0 0 0 2h4a1 1 0 1 0 0-2H8ZM8 11a1 1 0 1 0 0 2h8a1 1 0 1 0 0-2H8ZM7 16a1 1 0 0 1 1-1h8a1 1 0 1 1 0 2H8a1 1 0 0 1-1-1Z"/><path d="M5.545 1A2.542 2.542 0 0 0 3 3.538v16.924A2.542 2.542 0 0 0 5.545 23h12.91A2.542 2.542 0 0 0 21 20.462V3.5A2.5 2.5 0 0 0 18.5 1H5.545ZM5 3.538C5 3.245 5.24 3 5.545 3H18.5a.5.5 0 0 1 .5.5v16.962c0 .293-.24.538-.546.538H5.545A.542.542 0 0 1 5 20.462V3.538Z" clip-rule="evenodd"/></symbol><symbol id="icon-eds-i-book-medium" viewBox="0 0 24 24"><path d="M18.5 1A2.5 2.5 0 0 1 21 3.5v12c0 1.16-.79 2.135-1.86 2.418l-.14.031V21h1a1 1 0 0 1 .993.883L21 22a1 1 0 0 1-1 1H6.5A3.5 3.5 0 0 1 3 19.5v-15A3.5 3.5 0 0 1 6.5 1h12ZM17 18H6.5a1.5 1.5 0 0 0-1.493 1.356L5 19.5A1.5 1.5 0 0 0 6.5 21H17v-3Zm1.5-15h-12A1.5 1.5 0 0 0 5 4.5v11.837l.054-.025a3.481 3.481 0 0 1 1.254-.307L6.5 16h12a.5.5 0 0 0 .492-.41L19 15.5v-12a.5.5 0 0 0-.5-.5ZM15 6a1 1 0 0 1 0 2H9a1 1 0 1 1 0-2h6Z"/></symbol><symbol id="icon-eds-i-book-series-medium" viewBox="0 0 24 24"><path fill-rule="evenodd" d="M1 3.786C1 2.759 1.857 2 2.82 2H6.18c.964 0 1.82.759 1.82 1.786V4h3.168c.668 0 1.298.364 1.616.938.158-.109.333-.195.523-.252l3.216-.965c.923-.277 1.962.204 2.257 1.187l4.146 13.82c.296.984-.307 1.957-1.23 2.234l-3.217.965c-.923.277-1.962-.203-2.257-1.187L13 10.005v10.21c0 1.04-.878 1.785-1.834 1.785H7.833c-.291 0-.575-.07-.83-.195A1.849 1.849 0 0 1 6.18 22H2.821C1.857 22 1 21.241 1 20.214V3.786ZM3 4v11h3V4H3Zm0 16v-3h3v3H3Zm15.075-.04-.814-2.712 2.874-.862.813 2.712-2.873.862Zm1.485-5.49-2.874.862-2.634-8.782 2.873-.862 2.635 8.782ZM8 20V6h3v14H8Z" clip-rule="evenodd"/></symbol><symbol id="icon-eds-i-calendar-acceptance-medium" viewBox="0 0 24 24"><path d="M17 2a1 1 0 0 1 1 1v1h1.5C20.817 4 22 5.183 22 6.5v13c0 1.317-1.183 2.5-2.5 2.5h-15C3.183 22 2 20.817 2 19.5v-13C2 5.183 3.183 4 4.5 4a1 1 0 1 1 0 2c-.212 0-.5.288-.5.5v13c0 .212.288.5.5.5h15c.212 0 .5-.288.5-.5v-13c0-.212-.288-.5-.5-.5H18v1a1 1 0 0 1-2 0V3a1 1 0 0 1 1-1Zm-.534 7.747a1 1 0 0 1 .094 1.412l-4.846 5.538a1 1 0 0 1-1.352.141l-2.77-2.076a1 1 0 0 1 1.2-1.6l2.027 1.519 4.236-4.84a1 1 0 0 1 1.411-.094ZM7.5 2a1 1 0 0 1 1 1v1H14a1 1 0 0 1 0 2H8.5v1a1 1 0 1 1-2 0V3a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-calendar-date-medium" viewBox="0 0 24 24"><path d="M17 2a1 1 0 0 1 1 1v1h1.5C20.817 4 22 5.183 22 6.5v13c0 1.317-1.183 2.5-2.5 2.5h-15C3.183 22 2 20.817 2 19.5v-13C2 5.183 3.183 4 4.5 4a1 1 0 1 1 0 2c-.212 0-.5.288-.5.5v13c0 .212.288.5.5.5h15c.212 0 .5-.288.5-.5v-13c0-.212-.288-.5-.5-.5H18v1a1 1 0 0 1-2 0V3a1 1 0 0 1 1-1ZM8 15a1 1 0 1 1 0 2 1 1 0 0 1 0-2Zm4 0a1 1 0 1 1 0 2 1 1 0 0 1 0-2Zm-4-4a1 1 0 1 1 0 2 1 1 0 0 1 0-2Zm4 0a1 1 0 1 1 0 2 1 1 0 0 1 0-2Zm4 0a1 1 0 1 1 0 2 1 1 0 0 1 0-2ZM7.5 2a1 1 0 0 1 1 1v1H14a1 1 0 0 1 0 2H8.5v1a1 1 0 1 1-2 0V3a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-calendar-decision-medium" viewBox="0 0 24 24"><path d="M17 2a1 1 0 0 1 1 1v1h1.5C20.817 4 22 5.183 22 6.5v13c0 1.317-1.183 2.5-2.5 2.5h-15C3.183 22 2 20.817 2 19.5v-13C2 5.183 3.183 4 4.5 4a1 1 0 1 1 0 2c-.212 0-.5.288-.5.5v13c0 .212.288.5.5.5h15c.212 0 .5-.288.5-.5v-13c0-.212-.288-.5-.5-.5H18v1a1 1 0 0 1-2 0V3a1 1 0 0 1 1-1Zm-2.935 8.246 2.686 2.645c.34.335.34.883 0 
1.218l-2.686 2.645a.858.858 0 0 1-1.213-.009.854.854 0 0 1 .009-1.21l1.05-1.035H7.984a.992.992 0 0 1-.984-1c0-.552.44-1 .984-1h5.928l-1.051-1.036a.854.854 0 0 1-.085-1.121l.076-.088a.858.858 0 0 1 1.213-.009ZM7.5 2a1 1 0 0 1 1 1v1H14a1 1 0 0 1 0 2H8.5v1a1 1 0 1 1-2 0V3a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-calendar-impact-factor-medium" viewBox="0 0 24 24"><path d="M17 2a1 1 0 0 1 1 1v1h1.5C20.817 4 22 5.183 22 6.5v13c0 1.317-1.183 2.5-2.5 2.5h-15C3.183 22 2 20.817 2 19.5v-13C2 5.183 3.183 4 4.5 4a1 1 0 1 1 0 2c-.212 0-.5.288-.5.5v13c0 .212.288.5.5.5h15c.212 0 .5-.288.5-.5v-13c0-.212-.288-.5-.5-.5H18v1a1 1 0 0 1-2 0V3a1 1 0 0 1 1-1Zm-3.2 6.924a.48.48 0 0 1 .125.544l-1.52 3.283h2.304c.27 0 .491.215.491.483a.477.477 0 0 1-.13.327l-4.18 4.484a.498.498 0 0 1-.69.031.48.48 0 0 1-.125-.544l1.52-3.284H9.291a.487.487 0 0 1-.491-.482c0-.121.047-.238.13-.327l4.18-4.484a.498.498 0 0 1 .69-.031ZM7.5 2a1 1 0 0 1 1 1v1H14a1 1 0 0 1 0 2H8.5v1a1 1 0 1 1-2 0V3a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-call-papers-medium" viewBox="0 0 24 24"><g><path d="m20.707 2.883-1.414 1.414a1 1 0 0 0 1.414 1.414l1.414-1.414a1 1 0 0 0-1.414-1.414Z"/><path d="M6 16.054c0 2.026 1.052 2.943 3 2.943a1 1 0 1 1 0 2c-2.996 0-5-1.746-5-4.943v-1.227a4.068 4.068 0 0 1-1.83-1.189 4.553 4.553 0 0 1-.87-1.455 4.868 4.868 0 0 1-.3-1.686c0-1.17.417-2.298 1.17-3.14.38-.426.834-.767 1.338-1 .51-.237 1.06-.36 1.617-.36L6.632 6H7l7.932-2.895A2.363 2.363 0 0 1 18 5.36v9.28a2.36 2.36 0 0 1-3.069 2.25l.084.03L7 14.997H6v1.057Zm9.637-11.057a.415.415 0 0 0-.083.008L8 7.638v5.536l7.424 1.786.104.02c.035.01.072.02.109.02.2 0 .363-.16.363-.36V5.36c0-.2-.163-.363-.363-.363Zm-9.638 3h-.874a1.82 1.82 0 0 0-.625.111l-.15.063a2.128 2.128 0 0 0-.689.517c-.42.47-.661 1.123-.661 1.81 0 .34.06.678.176.992.114.308.28.585.485.816.4.447.925.691 1.464.691h.874v-5Z" clip-rule="evenodd"/><path d="M20 8.997h2a1 1 0 1 1 0 2h-2a1 1 0 1 1 0-2ZM20.707 14.293l1.414 1.414a1 1 0 0 1-1.414 1.414l-1.414-1.414a1 1 0 0 1 1.414-1.414Z"/></g></symbol><symbol id="icon-eds-i-card-medium" viewBox="0 0 24 24"><path d="M19.615 2c.315 0 .716.067 1.14.279.76.38 1.245 1.107 1.245 2.106v15.23c0 .315-.067.716-.279 1.14-.38.76-1.107 1.245-2.106 1.245H4.385a2.56 2.56 0 0 1-1.14-.279C2.485 21.341 2 20.614 2 19.615V4.385c0-.315.067-.716.279-1.14C2.659 2.485 3.386 2 4.385 2h15.23Zm0 2H4.385c-.213 0-.265.034-.317.14A.71.71 0 0 0 4 4.385v15.23c0 .213.034.265.14.317a.71.71 0 0 0 .245.068h15.23c.213 0 .265-.034.317-.14a.71.71 0 0 0 .068-.245V4.385c0-.213-.034-.265-.14-.317A.71.71 0 0 0 19.615 4ZM17 16a1 1 0 0 1 0 2H7a1 1 0 0 1 0-2h10Zm0-3a1 1 0 0 1 0 2H7a1 1 0 0 1 0-2h10Zm-.5-7A1.5 1.5 0 0 1 18 7.5v3a1.5 1.5 0 0 1-1.5 1.5h-9A1.5 1.5 0 0 1 6 10.5v-3A1.5 1.5 0 0 1 7.5 6h9ZM16 8H8v2h8V8Z"/></symbol><symbol id="icon-eds-i-cart-medium" viewBox="0 0 24 24"><path d="M5.76 1a1 1 0 0 1 .994.902L7.155 6h13.34c.18 0 .358.02.532.057l.174.045a2.5 2.5 0 0 1 1.693 3.103l-2.069 7.03c-.36 1.099-1.398 1.823-2.49 1.763H8.65c-1.272.015-2.352-.927-2.546-2.244L4.852 3H2a1 1 0 0 1-.993-.883L1 2a1 1 0 0 1 1-1h3.76Zm2.328 14.51a.555.555 0 0 0 .55.488l9.751.001a.533.533 0 0 0 .527-.357l2.059-7a.5.5 0 0 0-.48-.642H7.351l.737 7.51ZM18 19a2 2 0 1 1 0 4 2 2 0 0 1 0-4ZM8 19a2 2 0 1 1 0 4 2 2 0 0 1 0-4Z"/></symbol><symbol id="icon-eds-i-check-circle-medium" viewBox="0 0 24 24"><path d="M12 1c6.075 0 11 4.925 11 11s-4.925 11-11 11S1 18.075 1 12 5.925 1 12 1Zm0 2a9 9 0 1 0 0 18 9 9 0 0 0 0-18Zm5.125 4.72a1 1 0 0 1 .156 1.405l-6 7.5a1 1 0 0 1-1.421.143l-3-2.5a1 1 0 0 1 1.28-1.536l2.217 1.846 5.362-6.703a1 1 0 
0 1 1.406-.156Z"/></symbol><symbol id="icon-eds-i-check-filled-medium" viewBox="0 0 24 24"><path d="M12 1c6.075 0 11 4.925 11 11s-4.925 11-11 11S1 18.075 1 12 5.925 1 12 1Zm5.125 6.72a1 1 0 0 0-1.406.155l-5.362 6.703-2.217-1.846a1 1 0 1 0-1.28 1.536l3 2.5a1 1 0 0 0 1.42-.143l6-7.5a1 1 0 0 0-.155-1.406Z"/></symbol><symbol id="icon-eds-i-chevron-down-medium" viewBox="0 0 24 24"><path d="M3.305 8.28a1 1 0 0 0-.024 1.415l7.495 7.762c.314.345.757.543 1.224.543.467 0 .91-.198 1.204-.522l7.515-7.783a1 1 0 1 0-1.438-1.39L12 15.845l-7.28-7.54A1 1 0 0 0 3.4 8.2l-.096.082Z"/></symbol><symbol id="icon-eds-i-chevron-down-small" viewBox="0 0 16 16"><path d="M13.692 5.278a1 1 0 0 1 .03 1.414L9.103 11.51a1.491 1.491 0 0 1-2.188.019L2.278 6.692a1 1 0 0 1 1.444-1.384L8 9.771l4.278-4.463a1 1 0 0 1 1.318-.111l.096.081Z"/></symbol><symbol id="icon-eds-i-chevron-left-medium" viewBox="0 0 24 24"><path d="M15.72 3.305a1 1 0 0 0-1.415-.024l-7.762 7.495A1.655 1.655 0 0 0 6 12c0 .467.198.91.522 1.204l7.783 7.515a1 1 0 1 0 1.39-1.438L8.155 12l7.54-7.28A1 1 0 0 0 15.8 3.4l-.082-.096Z"/></symbol><symbol id="icon-eds-i-chevron-left-small" viewBox="0 0 16 16"><path d="M10.722 2.308a1 1 0 0 0-1.414-.03L4.49 6.897a1.491 1.491 0 0 0-.019 2.188l4.838 4.637a1 1 0 1 0 1.384-1.444L6.229 8l4.463-4.278a1 1 0 0 0 .111-1.318l-.081-.096Z"/></symbol><symbol id="icon-eds-i-chevron-right-medium" viewBox="0 0 24 24"><path d="M8.28 3.305a1 1 0 0 1 1.415-.024l7.762 7.495c.345.314.543.757.543 1.224 0 .467-.198.91-.522 1.204l-7.783 7.515a1 1 0 1 1-1.39-1.438L15.845 12l-7.54-7.28A1 1 0 0 1 8.2 3.4l.082-.096Z"/></symbol><symbol id="icon-eds-i-chevron-right-small" viewBox="0 0 16 16"><path d="M5.278 2.308a1 1 0 0 1 1.414-.03l4.819 4.619a1.491 1.491 0 0 1 .019 2.188l-4.838 4.637a1 1 0 1 1-1.384-1.444L9.771 8 5.308 3.722a1 1 0 0 1-.111-1.318l.081-.096Z"/></symbol><symbol id="icon-eds-i-chevron-up-medium" viewBox="0 0 24 24"><path d="M20.695 15.72a1 1 0 0 0 .024-1.415l-7.495-7.762A1.655 1.655 0 0 0 12 6c-.467 0-.91.198-1.204.522l-7.515 7.783a1 1 0 1 0 1.438 1.39L12 8.155l7.28 7.54a1 1 0 0 0 1.319.106l.096-.082Z"/></symbol><symbol id="icon-eds-i-chevron-up-small" viewBox="0 0 16 16"><path d="M13.692 10.722a1 1 0 0 0 .03-1.414L9.103 4.49a1.491 1.491 0 0 0-2.188-.019L2.278 9.308a1 1 0 0 0 1.444 1.384L8 6.229l4.278 4.463a1 1 0 0 0 1.318.111l.096-.081Z"/></symbol><symbol id="icon-eds-i-citations-medium" viewBox="0 0 24 24"><path d="M15.59 1a1 1 0 0 1 .706.291l5.41 5.385a1 1 0 0 1 .294.709v13.077c0 .674-.269 1.32-.747 1.796a2.549 2.549 0 0 1-1.798.742h-5.843a1 1 0 1 1 0-2h5.843a.549.549 0 0 0 .387-.16.535.535 0 0 0 .158-.378V7.8L15.178 3H5.545a.543.543 0 0 0-.538.451L5 3.538v8.607a1 1 0 0 1-2 0V3.538A2.542 2.542 0 0 1 5.545 1h10.046ZM5.483 14.35c.197.26.17.62-.049.848l-.095.083-.016.011c-.36.24-.628.45-.804.634-.393.409-.59.93-.59 1.562.077-.019.192-.028.345-.028.442 0 .84.158 1.195.474.355.316.532.716.532 1.2 0 .501-.173.9-.518 1.198-.345.298-.767.446-1.266.446-.672 0-1.209-.195-1.612-.585-.403-.39-.604-.976-.604-1.757 0-.744.11-1.39.33-1.938.222-.549.49-1.009.807-1.38a4.28 4.28 0 0 1 .992-.88c.07-.043.148-.087.232-.133a.881.881 0 0 1 1.121.245Zm5 0c.197.26.17.62-.049.848l-.095.083-.016.011c-.36.24-.628.45-.804.634-.393.409-.59.93-.59 1.562.077-.019.192-.028.345-.028.442 0 .84.158 1.195.474.355.316.532.716.532 1.2 0 .501-.173.9-.518 1.198-.345.298-.767.446-1.266.446-.672 0-1.209-.195-1.612-.585-.403-.39-.604-.976-.604-1.757 0-.744.11-1.39.33-1.938.222-.549.49-1.009.807-1.38a4.28 4.28 0 0 1 .992-.88c.07-.043.148-.087.232-.133a.881.881 0 0 1 
1.121.245Z"/></symbol><symbol id="icon-eds-i-clipboard-check-medium" viewBox="0 0 24 24"><path d="M14.4 1c1.238 0 2.274.865 2.536 2.024L18.5 3C19.886 3 21 4.14 21 5.535v14.93C21 21.86 19.886 23 18.5 23h-13C4.114 23 3 21.86 3 20.465V5.535C3 4.14 4.114 3 5.5 3h1.57c.27-1.147 1.3-2 2.53-2h4.8Zm4.115 4-1.59.024A2.601 2.601 0 0 1 14.4 7H9.6c-1.23 0-2.26-.853-2.53-2H5.5c-.27 0-.5.234-.5.535v14.93c0 .3.23.535.5.535h13c.27 0 .5-.234.5-.535V5.535c0-.3-.23-.535-.485-.535Zm-1.909 4.205a1 1 0 0 1 .19 1.401l-5.334 7a1 1 0 0 1-1.344.23l-2.667-1.75a1 1 0 1 1 1.098-1.672l1.887 1.238 4.769-6.258a1 1 0 0 1 1.401-.19ZM14.4 3H9.6a.6.6 0 0 0-.6.6v.8a.6.6 0 0 0 .6.6h4.8a.6.6 0 0 0 .6-.6v-.8a.6.6 0 0 0-.6-.6Z"/></symbol><symbol id="icon-eds-i-clipboard-report-medium" viewBox="0 0 24 24"><path d="M14.4 1c1.238 0 2.274.865 2.536 2.024L18.5 3C19.886 3 21 4.14 21 5.535v14.93C21 21.86 19.886 23 18.5 23h-13C4.114 23 3 21.86 3 20.465V5.535C3 4.14 4.114 3 5.5 3h1.57c.27-1.147 1.3-2 2.53-2h4.8Zm4.115 4-1.59.024A2.601 2.601 0 0 1 14.4 7H9.6c-1.23 0-2.26-.853-2.53-2H5.5c-.27 0-.5.234-.5.535v14.93c0 .3.23.535.5.535h13c.27 0 .5-.234.5-.535V5.535c0-.3-.23-.535-.485-.535Zm-2.658 10.929a1 1 0 0 1 0 2H8a1 1 0 0 1 0-2h7.857Zm0-3.929a1 1 0 0 1 0 2H8a1 1 0 0 1 0-2h7.857ZM14.4 3H9.6a.6.6 0 0 0-.6.6v.8a.6.6 0 0 0 .6.6h4.8a.6.6 0 0 0 .6-.6v-.8a.6.6 0 0 0-.6-.6Z"/></symbol><symbol id="icon-eds-i-close-medium" viewBox="0 0 24 24"><path d="M12 1c6.075 0 11 4.925 11 11s-4.925 11-11 11S1 18.075 1 12 5.925 1 12 1Zm0 2a9 9 0 1 0 0 18 9 9 0 0 0 0-18ZM8.707 7.293 12 10.585l3.293-3.292a1 1 0 0 1 1.414 1.414L13.415 12l3.292 3.293a1 1 0 0 1-1.414 1.414L12 13.415l-3.293 3.292a1 1 0 1 1-1.414-1.414L10.585 12 7.293 8.707a1 1 0 0 1 1.414-1.414Z"/></symbol><symbol id="icon-eds-i-cloud-upload-medium" viewBox="0 0 24 24"><path d="m12.852 10.011.028-.004L13 10l.075.003.126.017.086.022.136.052.098.052.104.074.082.073 3 3a1 1 0 0 1 0 1.414l-.094.083a1 1 0 0 1-1.32-.083L14 13.416V20a1 1 0 0 1-2 0v-6.586l-1.293 1.293a1 1 0 0 1-1.32.083l-.094-.083a1 1 0 0 1 0-1.414l3-3 .112-.097.11-.071.114-.054.105-.035.118-.025Zm.587-7.962c3.065.362 5.497 2.662 5.992 5.562l.013.085.207.073c2.117.782 3.496 2.845 3.337 5.097l-.022.226c-.297 2.561-2.503 4.491-5.124 4.502a1 1 0 1 1-.009-2c1.619-.007 2.967-1.186 3.147-2.733.179-1.542-.86-2.979-2.487-3.353-.512-.149-.894-.579-.981-1.165-.21-2.237-2-4.035-4.308-4.308-2.31-.273-4.497 1.06-5.25 3.19l-.049.113c-.234.468-.718.756-1.176.743-1.418.057-2.689.857-3.32 2.084a3.668 3.668 0 0 0 .262 3.798c.796 1.136 2.169 1.764 3.583 1.635a1 1 0 1 1 .182 1.992c-2.125.194-4.193-.753-5.403-2.48a5.668 5.668 0 0 1-.403-5.86c.85-1.652 2.449-2.79 4.323-3.092l.287-.039.013-.028c1.207-2.741 4.125-4.404 7.186-4.042Z"/></symbol><symbol id="icon-eds-i-collection-medium" viewBox="0 0 24 24"><path d="M21 7a1 1 0 0 1 1 1v12.5a2.5 2.5 0 0 1-2.5 2.5H8a1 1 0 0 1 0-2h11.5a.5.5 0 0 0 .5-.5V8a1 1 0 0 1 1-1Zm-5.5-5A2.5 2.5 0 0 1 18 4.5v12a2.5 2.5 0 0 1-2.5 2.5h-11A2.5 2.5 0 0 1 2 16.5v-12A2.5 2.5 0 0 1 4.5 2h11Zm0 2h-11a.5.5 0 0 0-.5.5v12a.5.5 0 0 0 .5.5h11a.5.5 0 0 0 .5-.5v-12a.5.5 0 0 0-.5-.5ZM13 13a1 1 0 0 1 0 2H7a1 1 0 0 1 0-2h6Zm0-3.5a1 1 0 0 1 0 2H7a1 1 0 0 1 0-2h6ZM13 6a1 1 0 0 1 0 2H7a1 1 0 1 1 0-2h6Z"/></symbol><symbol id="icon-eds-i-conference-series-medium" viewBox="0 0 24 24"><path fill-rule="evenodd" d="M4.5 2A2.5 2.5 0 0 0 2 4.5v11A2.5 2.5 0 0 0 4.5 18h2.37l-2.534 2.253a1 1 0 0 0 1.328 1.494L9.88 18H11v3a1 1 0 1 0 2 0v-3h1.12l4.216 3.747a1 1 0 0 0 1.328-1.494L17.13 18h2.37a2.5 2.5 0 0 0 2.5-2.5v-11A2.5 2.5 0 0 0 19.5 2h-15ZM20 6V4.5a.5.5 0 0 
0-.5-.5h-15a.5.5 0 0 0-.5.5V6h16ZM4 8v7.5a.5.5 0 0 0 .5.5h15a.5.5 0 0 0 .5-.5V8H4Z" clip-rule="evenodd"/></symbol><symbol id="icon-eds-i-delivery-medium" viewBox="0 0 24 24"><path d="M8.51 20.598a3.037 3.037 0 0 1-3.02 0A2.968 2.968 0 0 1 4.161 19L3.5 19A2.5 2.5 0 0 1 1 16.5v-11A2.5 2.5 0 0 1 3.5 3h10a2.5 2.5 0 0 1 2.45 2.004L16 5h2.527c.976 0 1.855.585 2.27 1.49l2.112 4.62a1 1 0 0 1 .091.416v4.856C23 17.814 21.889 19 20.484 19h-.523a1.01 1.01 0 0 1-.121-.007 2.96 2.96 0 0 1-1.33 1.605 3.037 3.037 0 0 1-3.02 0A2.968 2.968 0 0 1 14.161 19H9.838a2.968 2.968 0 0 1-1.327 1.597Zm-2.024-3.462a.955.955 0 0 0-.481.73L5.999 18l.001.022a.944.944 0 0 0 .388.777l.098.065c.316.181.712.181 1.028 0A.97.97 0 0 0 8 17.978a.95.95 0 0 0-.486-.842 1.037 1.037 0 0 0-1.028 0Zm10 0a.955.955 0 0 0-.481.73l-.005.156a.944.944 0 0 0 .388.777l.098.065c.316.181.712.181 1.028 0a.97.97 0 0 0 .486-.886.95.95 0 0 0-.486-.842 1.037 1.037 0 0 0-1.028 0ZM21 12h-5v3.17a3.038 3.038 0 0 1 2.51.232 2.993 2.993 0 0 1 1.277 1.45l.058.155.058-.005.581-.002c.27 0 .516-.263.516-.618V12Zm-7.5-7h-10a.5.5 0 0 0-.5.5v11a.5.5 0 0 0 .5.5h.662a2.964 2.964 0 0 1 1.155-1.491l.172-.107a3.037 3.037 0 0 1 3.022 0A2.987 2.987 0 0 1 9.843 17H13.5a.5.5 0 0 0 .5-.5v-11a.5.5 0 0 0-.5-.5Zm5.027 2H16v3h4.203l-1.224-2.677a.532.532 0 0 0-.375-.316L18.527 7Z"/></symbol><symbol id="icon-eds-i-download-medium" viewBox="0 0 24 24"><path d="M22 18.5a3.5 3.5 0 0 1-3.5 3.5h-13A3.5 3.5 0 0 1 2 18.5V18a1 1 0 0 1 2 0v.5A1.5 1.5 0 0 0 5.5 20h13a1.5 1.5 0 0 0 1.5-1.5V18a1 1 0 0 1 2 0v.5Zm-3.293-7.793-6 6-.063.059-.093.069-.081.048-.105.049-.104.034-.056.013-.118.017L12 17l-.076-.003-.122-.017-.113-.03-.085-.032-.063-.03-.098-.058-.06-.043-.05-.043-6.04-6.037a1 1 0 0 1 1.414-1.414l4.294 4.29L11 3a1 1 0 0 1 2 0l.001 10.585 4.292-4.292a1 1 0 0 1 1.32-.083l.094.083a1 1 0 0 1 0 1.414Z"/></symbol><symbol id="icon-eds-i-edit-medium" viewBox="0 0 24 24"><path d="M17.149 2a2.38 2.38 0 0 1 1.699.711l2.446 2.46a2.384 2.384 0 0 1 .005 3.38L10.01 19.906a1 1 0 0 1-.434.257l-6.3 1.8a1 1 0 0 1-1.237-1.237l1.8-6.3a1 1 0 0 1 .257-.434L15.443 2.718A2.385 2.385 0 0 1 17.15 2Zm-3.874 5.689-7.586 7.536-1.234 4.319 4.318-1.234 7.54-7.582-3.038-3.039ZM17.149 4a.395.395 0 0 0-.286.126L14.695 6.28l3.029 3.029 2.162-2.173a.384.384 0 0 0 .106-.197L20 6.864c0-.103-.04-.2-.119-.278l-2.457-2.47A.385.385 0 0 0 17.149 4Z"/></symbol><symbol id="icon-eds-i-education-medium" viewBox="0 0 24 24"><path fill-rule="evenodd" d="M12.41 2.088a1 1 0 0 0-.82 0l-10 4.5a1 1 0 0 0 0 1.824L3 9.047v7.124A3.001 3.001 0 0 0 4 22a3 3 0 0 0 1-5.83V9.948l1 .45V14.5a1 1 0 0 0 .087.408L7 14.5c-.913.408-.912.41-.912.41l.001.003.003.006.007.015a1.988 1.988 0 0 0 .083.16c.054.097.131.225.236.373.21.297.53.68.993 1.057C8.351 17.292 9.824 18 12 18c2.176 0 3.65-.707 4.589-1.476.463-.378.783-.76.993-1.057a4.162 4.162 0 0 0 .319-.533l.007-.015.003-.006v-.003h.002s0-.002-.913-.41l.913.408A1 1 0 0 0 18 14.5v-4.103l4.41-1.985a1 1 0 0 0 0-1.824l-10-4.5ZM16 11.297l-3.59 1.615a1 1 0 0 1-.82 0L8 11.297v2.94a3.388 3.388 0 0 0 .677.739C9.267 15.457 10.294 16 12 16s2.734-.543 3.323-1.024a3.388 3.388 0 0 0 .677-.739v-2.94ZM4.437 7.5 12 4.097 19.563 7.5 12 10.903 4.437 7.5ZM3 19a1 1 0 1 1 2 0 1 1 0 0 1-2 0Z" clip-rule="evenodd"/></symbol><symbol id="icon-eds-i-error-diamond-medium" viewBox="0 0 24 24"><path d="M12.002 1c.702 0 1.375.279 1.871.775l8.35 8.353a2.646 2.646 0 0 1 .001 3.744l-8.353 8.353a2.646 2.646 0 0 1-3.742 0l-8.353-8.353a2.646 2.646 0 0 1 0-3.744l8.353-8.353.156-.142c.424-.362.952-.58 1.507-.625l.21-.008Zm0 2a.646.646 0 0 
0-.38.123l-.093.08-8.34 8.34a.646.646 0 0 0-.18.355L3 12c0 .171.068.336.19.457l8.353 8.354a.646.646 0 0 0 .914 0l8.354-8.354a.646.646 0 0 0-.001-.914l-8.351-8.354A.646.646 0 0 0 12.002 3ZM12 14.5a1.5 1.5 0 0 1 .144 2.993L12 17.5a1.5 1.5 0 0 1 0-3ZM12 6a1 1 0 0 1 1 1v5a1 1 0 0 1-2 0V7a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-error-filled-medium" viewBox="0 0 24 24"><path d="M12.002 1c.702 0 1.375.279 1.871.775l8.35 8.353a2.646 2.646 0 0 1 .001 3.744l-8.353 8.353a2.646 2.646 0 0 1-3.742 0l-8.353-8.353a2.646 2.646 0 0 1 0-3.744l8.353-8.353.156-.142c.424-.362.952-.58 1.507-.625l.21-.008ZM12 14.5a1.5 1.5 0 0 0 0 3l.144-.007A1.5 1.5 0 0 0 12 14.5ZM12 6a1 1 0 0 0-1 1v5a1 1 0 0 0 2 0V7a1 1 0 0 0-1-1Z"/></symbol><symbol id="icon-eds-i-external-link-medium" viewBox="0 0 24 24"><path d="M9 2a1 1 0 1 1 0 2H4.6c-.371 0-.6.209-.6.5v15c0 .291.229.5.6.5h14.8c.371 0 .6-.209.6-.5V15a1 1 0 0 1 2 0v4.5c0 1.438-1.162 2.5-2.6 2.5H4.6C3.162 22 2 20.938 2 19.5v-15C2 3.062 3.162 2 4.6 2H9Zm6 0h6l.075.003.126.017.111.03.111.044.098.052.096.067.09.08c.036.035.068.073.097.112l.071.11.054.114.035.105.03.148L22 3v6a1 1 0 0 1-2 0V5.414l-6.693 6.693a1 1 0 0 1-1.414-1.414L18.584 4H15a1 1 0 0 1-.993-.883L14 3a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-external-link-small" viewBox="0 0 16 16"><path d="M5 1a1 1 0 1 1 0 2l-2-.001V13L13 13v-2a1 1 0 0 1 2 0v2c0 1.15-.93 2-2.067 2H3.067C1.93 15 1 14.15 1 13V3c0-1.15.93-2 2.067-2H5Zm4 0h5l.075.003.126.017.111.03.111.044.098.052.096.067.09.08.044.047.073.093.051.083.054.113.035.105.03.148L15 2v5a1 1 0 0 1-2 0V4.414L9.107 8.307a1 1 0 0 1-1.414-1.414L11.584 3H9a1 1 0 0 1-.993-.883L8 2a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-file-download-medium" viewBox="0 0 24 24"><path d="M14.5 1a1 1 0 0 1 .707.293l5.5 5.5A1 1 0 0 1 21 7.5v12.962A2.542 2.542 0 0 1 18.455 23H5.545A2.542 2.542 0 0 1 3 20.462V3.538A2.542 2.542 0 0 1 5.545 1H14.5Zm-.415 2h-8.54A.542.542 0 0 0 5 3.538v16.924c0 .296.243.538.545.538h12.91a.542.542 0 0 0 .545-.538V7.915L14.085 3ZM12 7a1 1 0 0 1 1 1v6.585l2.293-2.292a1 1 0 0 1 1.32-.083l.094.083a1 1 0 0 1 0 1.414l-4 4a1.008 1.008 0 0 1-.112.097l-.11.071-.114.054-.105.035-.149.03L12 18l-.075-.003-.126-.017-.111-.03-.111-.044-.098-.052-.096-.067-.09-.08-4-4a1 1 0 0 1 1.414-1.414L11 14.585V8a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-file-report-medium" viewBox="0 0 24 24"><path d="M14.5 1a1 1 0 0 1 .707.293l5.5 5.5A1 1 0 0 1 21 7.5v12.962c0 .674-.269 1.32-.747 1.796a2.549 2.549 0 0 1-1.798.742H5.545c-.674 0-1.32-.267-1.798-.742A2.535 2.535 0 0 1 3 20.462V3.538A2.542 2.542 0 0 1 5.545 1H14.5Zm-.415 2h-8.54A.542.542 0 0 0 5 3.538v16.924c0 .142.057.278.158.379.102.102.242.159.387.159h12.91a.549.549 0 0 0 .387-.16.535.535 0 0 0 .158-.378V7.915L14.085 3ZM16 17a1 1 0 0 1 0 2H8a1 1 0 0 1 0-2h8Zm0-3a1 1 0 0 1 0 2H8a1 1 0 0 1 0-2h8Zm-4.793-6.207L13 9.585l1.793-1.792a1 1 0 0 1 1.32-.083l.094.083a1 1 0 0 1 0 1.414l-2.5 2.5a1 1 0 0 1-1.414 0L10.5 9.915l-1.793 1.792a1 1 0 0 1-1.32.083l-.094-.083a1 1 0 0 1 0-1.414l2.5-2.5a1 1 0 0 1 1.414 0Z"/></symbol><symbol id="icon-eds-i-file-text-medium" viewBox="0 0 24 24"><path d="M14.5 1a1 1 0 0 1 .707.293l5.5 5.5A1 1 0 0 1 21 7.5v12.962A2.542 2.542 0 0 1 18.455 23H5.545A2.542 2.542 0 0 1 3 20.462V3.538A2.542 2.542 0 0 1 5.545 1H14.5Zm-.415 2h-8.54A.542.542 0 0 0 5 3.538v16.924c0 .296.243.538.545.538h12.91a.542.542 0 0 0 .545-.538V7.915L14.085 3ZM16 15a1 1 0 0 1 0 2H8a1 1 0 0 1 0-2h8Zm0-4a1 1 0 0 1 0 2H8a1 1 0 0 1 0-2h8Zm-5-4a1 1 0 0 1 0 2H8a1 1 0 1 1 0-2h3Z"/></symbol><symbol id="icon-eds-i-file-upload-medium" 
viewBox="0 0 24 24"><path d="M14.5 1a1 1 0 0 1 .707.293l5.5 5.5A1 1 0 0 1 21 7.5v12.962A2.542 2.542 0 0 1 18.455 23H5.545A2.542 2.542 0 0 1 3 20.462V3.538A2.542 2.542 0 0 1 5.545 1H14.5Zm-.415 2h-8.54A.542.542 0 0 0 5 3.538v16.924c0 .296.243.538.545.538h12.91a.542.542 0 0 0 .545-.538V7.915L14.085 3Zm-2.233 4.011.058-.007L12 7l.075.003.126.017.111.03.111.044.098.052.104.074.082.073 4 4a1 1 0 0 1 0 1.414l-.094.083a1 1 0 0 1-1.32-.083L13 10.415V17a1 1 0 0 1-2 0v-6.585l-2.293 2.292a1 1 0 0 1-1.32.083l-.094-.083a1 1 0 0 1 0-1.414l4-4 .112-.097.11-.071.114-.054.105-.035.118-.025Z"/></symbol><symbol id="icon-eds-i-filter-medium" viewBox="0 0 24 24"><path d="M21 2a1 1 0 0 1 .82 1.573L15 13.314V18a1 1 0 0 1-.31.724l-.09.076-4 3A1 1 0 0 1 9 21v-7.684L2.18 3.573a1 1 0 0 1 .707-1.567L3 2h18Zm-1.921 2H4.92l5.9 8.427a1 1 0 0 1 .172.45L11 13v6l2-1.5V13a1 1 0 0 1 .117-.469l.064-.104L19.079 4Z"/></symbol><symbol id="icon-eds-i-funding-medium" viewBox="0 0 24 24"><path fill-rule="evenodd" d="M23 8A7 7 0 1 0 9 8a7 7 0 0 0 14 0ZM9.006 12.225A4.07 4.07 0 0 0 6.12 11.02H2a.979.979 0 1 0 0 1.958h4.12c.558 0 1.094.222 1.489.617l2.207 2.288c.27.27.27.687.012.944a.656.656 0 0 1-.928 0L7.744 15.67a.98.98 0 0 0-1.386 1.384l1.157 1.158c.535.536 1.244.791 1.946.765l.041.002h6.922c.874 0 1.597.748 1.597 1.688 0 .203-.146.354-.309.354H7.755c-.487 0-.96-.178-1.339-.504L2.64 17.259a.979.979 0 0 0-1.28 1.482L5.137 22c.733.631 1.66.979 2.618.979h9.957c1.26 0 2.267-1.043 2.267-2.312 0-2.006-1.584-3.646-3.555-3.646h-4.529a2.617 2.617 0 0 0-.681-2.509l-2.208-2.287ZM16 3a5 5 0 1 0 0 10 5 5 0 0 0 0-10Zm.979 3.5a.979.979 0 1 0-1.958 0v3a.979.979 0 1 0 1.958 0v-3Z" clip-rule="evenodd"/></symbol><symbol id="icon-eds-i-hashtag-medium" viewBox="0 0 24 24"><path d="M12 1c6.075 0 11 4.925 11 11s-4.925 11-11 11S1 18.075 1 12 5.925 1 12 1Zm0 2a9 9 0 1 0 0 18 9 9 0 0 0 0-18ZM9.52 18.189a1 1 0 1 1-1.964-.378l.437-2.274H6a1 1 0 1 1 0-2h2.378l.592-3.076H6a1 1 0 0 1 0-2h3.354l.51-2.65a1 1 0 1 1 1.964.378l-.437 2.272h3.04l.51-2.65a1 1 0 1 1 1.964.378l-.438 2.272H18a1 1 0 0 1 0 2h-1.917l-.592 3.076H18a1 1 0 0 1 0 2h-2.893l-.51 2.652a1 1 0 1 1-1.964-.378l.437-2.274h-3.04l-.51 2.652Zm.895-4.652h3.04l.591-3.076h-3.04l-.591 3.076Z"/></symbol><symbol id="icon-eds-i-home-medium" viewBox="0 0 24 24"><path d="M5 22a1 1 0 0 1-1-1v-8.586l-1.293 1.293a1 1 0 0 1-1.32.083l-.094-.083a1 1 0 0 1 0-1.414l10-10a1 1 0 0 1 1.414 0l10 10a1 1 0 0 1-1.414 1.414L20 12.415V21a1 1 0 0 1-1 1H5Zm7-17.585-6 5.999V20h5v-4a1 1 0 0 1 2 0v4h5v-9.585l-6-6Z"/></symbol><symbol id="icon-eds-i-image-medium" viewBox="0 0 24 24"><path d="M19.615 2A2.385 2.385 0 0 1 22 4.385v15.23A2.385 2.385 0 0 1 19.615 22H4.385A2.385 2.385 0 0 1 2 19.615V4.385A2.385 2.385 0 0 1 4.385 2h15.23Zm0 2H4.385A.385.385 0 0 0 4 4.385v15.23c0 .213.172.385.385.385h1.244l10.228-8.76a1 1 0 0 1 1.254-.037L20 13.392V4.385A.385.385 0 0 0 19.615 4Zm-3.07 9.283L8.703 20h10.912a.385.385 0 0 0 .385-.385v-3.713l-3.455-2.619ZM9.5 6a3.5 3.5 0 1 1 0 7 3.5 3.5 0 0 1 0-7Zm0 2a1.5 1.5 0 1 0 0 3 1.5 1.5 0 0 0 0-3Z"/></symbol><symbol id="icon-eds-i-impact-factor-medium" viewBox="0 0 24 24"><path d="M16.49 2.672c.74.694.986 1.765.632 2.712l-.04.1-1.549 3.54h1.477a2.496 2.496 0 0 1 2.485 2.34l.005.163c0 .618-.23 1.21-.642 1.675l-7.147 7.961a2.48 2.48 0 0 1-3.554.165 2.512 2.512 0 0 1-.633-2.712l.042-.103L9.108 15H7.46c-1.393 0-2.379-1.11-2.455-2.369L5 12.473c0-.593.142-1.145.628-1.692l7.307-7.944a2.48 2.48 0 0 1 3.555-.165ZM14.43 4.164l-7.33 7.97c-.083.093-.101.214-.101.34 0 .277.19.526.46.526h4.163l.097-.009c.015 0 
.03.003.046.009.181.078.264.32.186.5l-2.554 5.817a.512.512 0 0 0 .127.552.48.48 0 0 0 .69-.033l7.155-7.97a.513.513 0 0 0 .13-.34.497.497 0 0 0-.49-.502h-3.988a.355.355 0 0 1-.328-.497l2.555-5.844a.512.512 0 0 0-.127-.552.48.48 0 0 0-.69.033Z"/></symbol><symbol id="icon-eds-i-info-circle-medium" viewBox="0 0 24 24"><path d="M12 1c6.075 0 11 4.925 11 11s-4.925 11-11 11S1 18.075 1 12 5.925 1 12 1Zm0 2a9 9 0 1 0 0 18 9 9 0 0 0 0-18Zm0 7a1 1 0 0 1 1 1v5h1.5a1 1 0 0 1 0 2h-5a1 1 0 0 1 0-2H11v-4h-.5a1 1 0 0 1-.993-.883L9.5 11a1 1 0 0 1 1-1H12Zm0-4.5a1.5 1.5 0 0 1 .144 2.993L12 8.5a1.5 1.5 0 0 1 0-3Z"/></symbol><symbol id="icon-eds-i-info-filled-medium" viewBox="0 0 24 24"><path d="M12 1c6.075 0 11 4.925 11 11s-4.925 11-11 11S1 18.075 1 12 5.925 1 12 1Zm0 9h-1.5a1 1 0 0 0-1 1l.007.117A1 1 0 0 0 10.5 12h.5v4H9.5a1 1 0 0 0 0 2h5a1 1 0 0 0 0-2H13v-5a1 1 0 0 0-1-1Zm0-4.5a1.5 1.5 0 0 0 0 3l.144-.007A1.5 1.5 0 0 0 12 5.5Z"/></symbol><symbol id="icon-eds-i-journal-medium" viewBox="0 0 24 24"><path d="M18.5 1A2.5 2.5 0 0 1 21 3.5v14a2.5 2.5 0 0 1-2.5 2.5h-13a.5.5 0 1 0 0 1H20a1 1 0 0 1 0 2H5.5A2.5 2.5 0 0 1 3 20.5v-17A2.5 2.5 0 0 1 5.5 1h13ZM7 3H5.5a.5.5 0 0 0-.5.5v14.549l.016-.002c.104-.02.211-.035.32-.042L5.5 18H7V3Zm11.5 0H9v15h9.5a.5.5 0 0 0 .5-.5v-14a.5.5 0 0 0-.5-.5ZM16 5a1 1 0 0 1 1 1v4a1 1 0 0 1-1 1h-5a1 1 0 0 1-1-1V6a1 1 0 0 1 1-1h5Zm-1 2h-3v2h3V7Z"/></symbol><symbol id="icon-eds-i-mail-medium" viewBox="0 0 24 24"><path d="M20.462 3C21.875 3 23 4.184 23 5.619v12.762C23 19.816 21.875 21 20.462 21H3.538C2.125 21 1 19.816 1 18.381V5.619C1 4.184 2.125 3 3.538 3h16.924ZM21 8.158l-7.378 6.258a2.549 2.549 0 0 1-3.253-.008L3 8.16v10.222c0 .353.253.619.538.619h16.924c.285 0 .538-.266.538-.619V8.158ZM20.462 5H3.538c-.264 0-.5.228-.534.542l8.65 7.334c.2.165.492.165.684.007l8.656-7.342-.001-.025c-.044-.3-.274-.516-.531-.516Z"/></symbol><symbol id="icon-eds-i-mail-send-medium" viewBox="0 0 24 24"><path d="M20.444 5a2.562 2.562 0 0 1 2.548 2.37l.007.078.001.123v7.858A2.564 2.564 0 0 1 20.444 18H9.556A2.564 2.564 0 0 1 7 15.429l.001-7.977.007-.082A2.561 2.561 0 0 1 9.556 5h10.888ZM21 9.331l-5.46 3.51a1 1 0 0 1-1.08 0L9 9.332v6.097c0 .317.251.571.556.571h10.888a.564.564 0 0 0 .556-.571V9.33ZM20.444 7H9.556a.543.543 0 0 0-.32.105l5.763 3.706 5.766-3.706a.543.543 0 0 0-.32-.105ZM4.308 5a1 1 0 1 1 0 2H2a1 1 0 1 1 0-2h2.308Zm0 5.5a1 1 0 0 1 0 2H2a1 1 0 0 1 0-2h2.308Zm0 5.5a1 1 0 0 1 0 2H2a1 1 0 0 1 0-2h2.308Z"/></symbol><symbol id="icon-eds-i-mentions-medium" viewBox="0 0 24 24"><path d="m9.452 1.293 5.92 5.92 2.92-2.92a1 1 0 0 1 1.415 1.414l-2.92 2.92 5.92 5.92a1 1 0 0 1 0 1.415 10.371 10.371 0 0 1-10.378 2.584l.652 3.258A1 1 0 0 1 12 23H2a1 1 0 0 1-.874-1.486l4.789-8.62C4.194 9.074 4.9 4.43 8.038 1.292a1 1 0 0 1 1.414 0Zm-2.355 13.59L3.699 21h7.081l-.689-3.442a10.392 10.392 0 0 1-2.775-2.396l-.22-.28Zm1.69-11.427-.07.09a8.374 8.374 0 0 0 11.737 11.737l.089-.071L8.787 3.456Z"/></symbol><symbol id="icon-eds-i-menu-medium" viewBox="0 0 24 24"><path d="M21 4a1 1 0 0 1 0 2H3a1 1 0 1 1 0-2h18Zm-4 7a1 1 0 0 1 0 2H3a1 1 0 0 1 0-2h14Zm4 7a1 1 0 0 1 0 2H3a1 1 0 0 1 0-2h18Z"/></symbol><symbol id="icon-eds-i-metrics-medium" viewBox="0 0 24 24"><path d="M3 22a1 1 0 0 1-1-1V3a1 1 0 0 1 1-1h6a1 1 0 0 1 1 1v7h4V8a1 1 0 0 1 1-1h6a1 1 0 0 1 1 1v13a1 1 0 0 1-.883.993L21 22H3Zm17-2V9h-4v11h4Zm-6-8h-4v8h4v-8ZM8 4H4v16h4V4Z"/></symbol><symbol id="icon-eds-i-news-medium" viewBox="0 0 24 24"><path d="M17.384 3c.975 0 1.77.787 1.77 1.762v13.333c0 .462.354.846.815.899l.107.006.109-.006a.915.915 0 0 0 .809-.794l.006-.105V8.19a1 1 0 0 1 2 
0v9.905A2.914 2.914 0 0 1 20.077 21H3.538a2.547 2.547 0 0 1-1.644-.601l-.147-.135A2.516 2.516 0 0 1 1 18.476V4.762C1 3.787 1.794 3 2.77 3h14.614Zm-.231 2H3v13.476c0 .11.035.216.1.304l.054.063c.101.1.24.157.384.157l13.761-.001-.026-.078a2.88 2.88 0 0 1-.115-.655l-.004-.17L17.153 5ZM14 15.021a.979.979 0 1 1 0 1.958H6a.979.979 0 1 1 0-1.958h8Zm0-8c.54 0 .979.438.979.979v4c0 .54-.438.979-.979.979H6A.979.979 0 0 1 5.021 12V8c0-.54.438-.979.979-.979h8Zm-.98 1.958H6.979v2.041h6.041V8.979Z"/></symbol><symbol id="icon-eds-i-newsletter-medium" viewBox="0 0 24 24"><path d="M21 10a1 1 0 0 1 1 1v9.5a2.5 2.5 0 0 1-2.5 2.5h-15A2.5 2.5 0 0 1 2 20.5V11a1 1 0 0 1 2 0v.439l8 4.888 8-4.889V11a1 1 0 0 1 1-1Zm-1 3.783-7.479 4.57a1 1 0 0 1-1.042 0l-7.48-4.57V20.5a.5.5 0 0 0 .501.5h15a.5.5 0 0 0 .5-.5v-6.717ZM15 9a1 1 0 0 1 0 2H9a1 1 0 0 1 0-2h6Zm2.5-8A2.5 2.5 0 0 1 20 3.5V9a1 1 0 0 1-2 0V3.5a.5.5 0 0 0-.5-.5h-11a.5.5 0 0 0-.5.5V9a1 1 0 1 1-2 0V3.5A2.5 2.5 0 0 1 6.5 1h11ZM15 5a1 1 0 0 1 0 2H9a1 1 0 1 1 0-2h6Z"/></symbol><symbol id="icon-eds-i-notifcation-medium" viewBox="0 0 24 24"><path d="M14 20a1 1 0 0 1 0 2h-4a1 1 0 0 1 0-2h4ZM3 18l-.133-.007c-1.156-.124-1.156-1.862 0-1.986l.3-.012C4.32 15.923 5 15.107 5 14V9.5C5 5.368 8.014 2 12 2s7 3.368 7 7.5V14c0 1.107.68 1.923 1.832 1.995l.301.012c1.156.124 1.156 1.862 0 1.986L21 18H3Zm9-14C9.17 4 7 6.426 7 9.5V14c0 .671-.146 1.303-.416 1.858L6.51 16h10.979l-.073-.142a4.192 4.192 0 0 1-.412-1.658L17 14V9.5C17 6.426 14.83 4 12 4Z"/></symbol><symbol id="icon-eds-i-publish-medium" viewBox="0 0 24 24"><g><path d="M16.296 1.291A1 1 0 0 0 15.591 1H5.545A2.542 2.542 0 0 0 3 3.538V13a1 1 0 1 0 2 0V3.538l.007-.087A.543.543 0 0 1 5.545 3h9.633L20 7.8v12.662a.534.534 0 0 1-.158.379.548.548 0 0 1-.387.159H11a1 1 0 1 0 0 2h8.455c.674 0 1.32-.267 1.798-.742A2.534 2.534 0 0 0 22 20.462V7.385a1 1 0 0 0-.294-.709l-5.41-5.385Z"/><path d="M10.762 16.647a1 1 0 0 0-1.525-1.294l-4.472 5.271-2.153-1.665a1 1 0 1 0-1.224 1.582l2.91 2.25a1 1 0 0 0 1.374-.144l5.09-6ZM16 10a1 1 0 1 1 0 2H8a1 1 0 1 1 0-2h8ZM12 7a1 1 0 0 0-1-1H8a1 1 0 1 0 0 2h3a1 1 0 0 0 1-1Z"/></g></symbol><symbol id="icon-eds-i-refresh-medium" viewBox="0 0 24 24"><g><path d="M7.831 5.636H6.032A8.76 8.76 0 0 1 9 3.631 8.549 8.549 0 0 1 12.232 3c.603 0 1.192.063 1.76.182C17.979 4.017 21 7.632 21 12a1 1 0 1 0 2 0c0-5.296-3.674-9.746-8.591-10.776A10.61 10.61 0 0 0 5 3.851V2.805a1 1 0 0 0-.987-1H4a1 1 0 0 0-1 1v3.831a1 1 0 0 0 1 1h3.831a1 1 0 0 0 .013-2h-.013ZM17.968 18.364c-1.59 1.632-3.784 2.636-6.2 2.636C6.948 21 3 16.993 3 12a1 1 0 1 0-2 0c0 6.053 4.799 11 10.768 11 2.788 0 5.324-1.082 7.232-2.85v1.045a1 1 0 1 0 2 0v-3.831a1 1 0 0 0-1-1h-3.831a1 1 0 0 0 0 2h1.799Z"/></g></symbol><symbol id="icon-eds-i-search-medium" viewBox="0 0 24 24"><path d="M11 1c5.523 0 10 4.477 10 10 0 2.4-.846 4.604-2.256 6.328l3.963 3.965a1 1 0 0 1-1.414 1.414l-3.965-3.963A9.959 9.959 0 0 1 11 21C5.477 21 1 16.523 1 11S5.477 1 11 1Zm0 2a8 8 0 1 0 0 16 8 8 0 0 0 0-16Z"/></symbol><symbol id="icon-eds-i-settings-medium" viewBox="0 0 24 24"><path d="M11.382 1h1.24a2.508 2.508 0 0 1 2.334 1.63l.523 1.378 1.59.933 1.444-.224c.954-.132 1.89.3 2.422 1.101l.095.155.598 1.066a2.56 2.56 0 0 1-.195 2.848l-.894 1.161v1.896l.92 1.163c.6.768.707 1.812.295 2.674l-.09.17-.606 1.08a2.504 2.504 0 0 1-2.531 1.25l-1.428-.223-1.589.932-.523 1.378a2.512 2.512 0 0 1-2.155 1.625L12.65 23h-1.27a2.508 2.508 0 0 1-2.334-1.63l-.524-1.379-1.59-.933-1.443.225c-.954.132-1.89-.3-2.422-1.101l-.095-.155-.598-1.066a2.56 2.56 0 0 1 .195-2.847l.891-1.161v-1.898l-.919-1.162a2.562 2.562 0 0 
1-.295-2.674l.09-.17.606-1.08a2.504 2.504 0 0 1 2.531-1.25l1.43.223 1.618-.938.524-1.375.07-.167A2.507 2.507 0 0 1 11.382 1Zm.003 2a.509.509 0 0 0-.47.338l-.65 1.71a1 1 0 0 1-.434.51L7.6 6.85a1 1 0 0 1-.655.123l-1.762-.275a.497.497 0 0 0-.498.252l-.61 1.088a.562.562 0 0 0 .04.619l1.13 1.43a1 1 0 0 1 .216.62v2.585a1 1 0 0 1-.207.61L4.15 15.339a.568.568 0 0 0-.036.634l.601 1.072a.494.494 0 0 0 .484.26l1.78-.278a1 1 0 0 1 .66.126l2.2 1.292a1 1 0 0 1 .43.507l.648 1.71a.508.508 0 0 0 .467.338h1.263a.51.51 0 0 0 .47-.34l.65-1.708a1 1 0 0 1 .428-.507l2.201-1.292a1 1 0 0 1 .66-.126l1.763.275a.497.497 0 0 0 .498-.252l.61-1.088a.562.562 0 0 0-.04-.619l-1.13-1.43a1 1 0 0 1-.216-.62v-2.585a1 1 0 0 1 .207-.61l1.105-1.437a.568.568 0 0 0 .037-.634l-.601-1.072a.494.494 0 0 0-.484-.26l-1.78.278a1 1 0 0 1-.66-.126l-2.2-1.292a1 1 0 0 1-.43-.507l-.649-1.71A.508.508 0 0 0 12.62 3h-1.234ZM12 8a4 4 0 1 1 0 8 4 4 0 0 1 0-8Zm0 2a2 2 0 1 0 0 4 2 2 0 0 0 0-4Z"/></symbol><symbol id="icon-eds-i-shipping-medium" viewBox="0 0 24 24"><path d="M16.515 2c1.406 0 2.706.728 3.352 1.902l2.02 3.635.02.042.036.089.031.105.012.058.01.073.004.075v11.577c0 .64-.244 1.255-.683 1.713a2.356 2.356 0 0 1-1.701.731H4.386a2.356 2.356 0 0 1-1.702-.731 2.476 2.476 0 0 1-.683-1.713V7.948c.01-.217.083-.43.22-.6L4.2 3.905C4.833 2.755 6.089 2.032 7.486 2h9.029ZM20 9H4v10.556a.49.49 0 0 0 .075.26l.053.07a.356.356 0 0 0 .257.114h15.23c.094 0 .186-.04.258-.115a.477.477 0 0 0 .127-.33V9Zm-2 7.5a1 1 0 0 1 0 2h-4a1 1 0 0 1 0-2h4ZM16.514 4H13v3h6.3l-1.183-2.13c-.288-.522-.908-.87-1.603-.87ZM11 3.999H7.51c-.679.017-1.277.36-1.566.887L4.728 7H11V3.999Z"/></symbol><symbol id="icon-eds-i-step-guide-medium" viewBox="0 0 24 24"><path d="M11.394 9.447a1 1 0 1 0-1.788-.894l-.88 1.759-.019-.02a1 1 0 1 0-1.414 1.415l1 1a1 1 0 0 0 1.601-.26l1.5-3ZM12 11a1 1 0 0 1 1-1h3a1 1 0 1 1 0 2h-3a1 1 0 0 1-1-1ZM12 17a1 1 0 0 1 1-1h3a1 1 0 1 1 0 2h-3a1 1 0 0 1-1-1ZM10.947 14.105a1 1 0 0 1 .447 1.342l-1.5 3a1 1 0 0 1-1.601.26l-1-1a1 1 0 1 1 1.414-1.414l.02.019.879-1.76a1 1 0 0 1 1.341-.447Z"/><path d="M5.545 1A2.542 2.542 0 0 0 3 3.538v16.924A2.542 2.542 0 0 0 5.545 23h12.91A2.542 2.542 0 0 0 21 20.462V7.5a1 1 0 0 0-.293-.707l-5.5-5.5A1 1 0 0 0 14.5 1H5.545ZM5 3.538C5 3.245 5.24 3 5.545 3h8.54L19 7.914v12.547c0 .294-.24.539-.546.539H5.545A.542.542 0 0 1 5 20.462V3.538Z" clip-rule="evenodd"/></symbol><symbol id="icon-eds-i-submission-medium" viewBox="0 0 24 24"><g><path d="M5 3.538C5 3.245 5.24 3 5.545 3h9.633L20 7.8v12.662a.535.535 0 0 1-.158.379.549.549 0 0 1-.387.159H6a1 1 0 0 1-1-1v-2.5a1 1 0 1 0-2 0V20a3 3 0 0 0 3 3h13.455c.673 0 1.32-.266 1.798-.742A2.535 2.535 0 0 0 22 20.462V7.385a1 1 0 0 0-.294-.709l-5.41-5.385A1 1 0 0 0 15.591 1H5.545A2.542 2.542 0 0 0 3 3.538V7a1 1 0 0 0 2 0V3.538Z"/><path d="m13.707 13.707-4 4a1 1 0 0 1-1.414 0l-.083-.094a1 1 0 0 1 .083-1.32L10.585 14 2 14a1 1 0 1 1 0-2l8.583.001-2.29-2.294a1 1 0 0 1 1.414-1.414l4.037 4.04.043.05.043.06.059.098.03.063.031.085.03.113.017.122L14 13l-.004.087-.017.118-.013.056-.034.104-.049.105-.048.081-.07.093-.058.063Z"/></g></symbol><symbol id="icon-eds-i-table-1-medium" viewBox="0 0 24 24"><path d="M4.385 22a2.56 2.56 0 0 1-1.14-.279C2.485 21.341 2 20.614 2 19.615V4.385c0-.315.067-.716.279-1.14C2.659 2.485 3.386 2 4.385 2h15.23c.315 0 .716.067 1.14.279.76.38 1.245 1.107 1.245 2.106v15.23c0 .315-.067.716-.279 1.14-.38.76-1.107 1.245-2.106 1.245H4.385ZM4 19.615c0 .213.034.265.14.317a.71.71 0 0 0 .245.068H8v-4H4v3.615ZM20 16H10v4h9.615c.213 0 .265-.034.317-.14a.71.71 0 0 0 .068-.245V16Zm0-2v-4H10v4h10ZM4 
14h4v-4H4v4ZM19.615 4H10v4h10V4.385c0-.213-.034-.265-.14-.317A.71.71 0 0 0 19.615 4ZM8 4H4.385l-.082.002c-.146.01-.19.047-.235.138A.71.71 0 0 0 4 4.385V8h4V4Z"/></symbol><symbol id="icon-eds-i-table-2-medium" viewBox="0 0 24 24"><path d="M4.384 22A2.384 2.384 0 0 1 2 19.616V4.384A2.384 2.384 0 0 1 4.384 2h15.232A2.384 2.384 0 0 1 22 4.384v15.232A2.384 2.384 0 0 1 19.616 22H4.384ZM10 15H4v4.616c0 .212.172.384.384.384H10v-5Zm5 0h-3v5h3v-5Zm5 0h-3v5h2.616a.384.384 0 0 0 .384-.384V15ZM10 9H4v4h6V9Zm5 0h-3v4h3V9Zm5 0h-3v4h3V9Zm-.384-5H4.384A.384.384 0 0 0 4 4.384V7h16V4.384A.384.384 0 0 0 19.616 4Z"/></symbol><symbol id="icon-eds-i-tag-medium" viewBox="0 0 24 24"><path d="m12.621 1.998.127.004L20.496 2a1.5 1.5 0 0 1 1.497 1.355L22 3.5l-.005 7.669c.038.456-.133.905-.447 1.206l-9.02 9.018a2.075 2.075 0 0 1-2.932 0l-6.99-6.99a2.075 2.075 0 0 1 .001-2.933L11.61 2.47c.246-.258.573-.418.881-.46l.131-.011Zm.286 2-8.885 8.886a.075.075 0 0 0 0 .106l6.987 6.988c.03.03.077.03.106 0l8.883-8.883L19.999 4l-7.092-.002ZM16 6.5a1.5 1.5 0 0 1 .144 2.993L16 9.5a1.5 1.5 0 0 1 0-3Z"/></symbol><symbol id="icon-eds-i-trash-medium" viewBox="0 0 24 24"><path d="M12 1c2.717 0 4.913 2.232 4.997 5H21a1 1 0 0 1 0 2h-1v12.5c0 1.389-1.152 2.5-2.556 2.5H6.556C5.152 23 4 21.889 4 20.5V8H3a1 1 0 1 1 0-2h4.003l.001-.051C7.114 3.205 9.3 1 12 1Zm6 7H6v12.5c0 .238.19.448.454.492l.102.008h10.888c.315 0 .556-.232.556-.5V8Zm-4 3a1 1 0 0 1 1 1v6.005a1 1 0 0 1-2 0V12a1 1 0 0 1 1-1Zm-4 0a1 1 0 0 1 1 1v6a1 1 0 0 1-2 0v-6a1 1 0 0 1 1-1Zm2-8c-1.595 0-2.914 1.32-2.996 3h5.991v-.02C14.903 4.31 13.589 3 12 3Z"/></symbol><symbol id="icon-eds-i-user-account-medium" viewBox="0 0 24 24"><path d="M12 1c6.075 0 11 4.925 11 11s-4.925 11-11 11S1 18.075 1 12 5.925 1 12 1Zm0 16c-1.806 0-3.52.994-4.664 2.698A8.947 8.947 0 0 0 12 21a8.958 8.958 0 0 0 4.664-1.301C15.52 17.994 13.806 17 12 17Zm0-14a9 9 0 0 0-6.25 15.476C7.253 16.304 9.54 15 12 15s4.747 1.304 6.25 3.475A9 9 0 0 0 12 3Zm0 3a4 4 0 1 1 0 8 4 4 0 0 1 0-8Zm0 2a2 2 0 1 0 0 4 2 2 0 0 0 0-4Z"/></symbol><symbol id="icon-eds-i-user-add-medium" viewBox="0 0 24 24"><path d="M9 1a5 5 0 1 1 0 10A5 5 0 0 1 9 1Zm0 2a3 3 0 1 0 0 6 3 3 0 0 0 0-6Zm9 10a1 1 0 0 1 1 1v3h3a1 1 0 0 1 0 2h-3v3a1 1 0 0 1-2 0v-3h-3a1 1 0 0 1 0-2h3v-3a1 1 0 0 1 1-1Zm-5.545-.15a1 1 0 1 1-.91 1.78 5.713 5.713 0 0 0-5.705.282c-1.67 1.068-2.728 2.927-2.832 4.956L3.004 20 11.5 20a1 1 0 0 1 .993.883L12.5 21a1 1 0 0 1-1 1H2a1 1 0 0 1-1-1v-.876c.028-2.812 1.446-5.416 3.763-6.897a7.713 7.713 0 0 1 7.692-.378Z"/></symbol><symbol id="icon-eds-i-user-assign-medium" viewBox="0 0 24 24"><path d="M16.226 13.298a1 1 0 0 1 1.414-.01l.084.093a1 1 0 0 1-.073 1.32L15.39 17H22a1 1 0 0 1 0 2h-6.611l2.262 2.298a1 1 0 0 1-1.425 1.404l-3.939-4a1 1 0 0 1 0-1.404l3.94-4Zm-3.771-.449a1 1 0 1 1-.91 1.781 5.713 5.713 0 0 0-5.705.282c-1.67 1.068-2.728 2.927-2.832 4.956L3.004 20 10.5 20a1 1 0 0 1 .993.883L11.5 21a1 1 0 0 1-1 1H2a1 1 0 0 1-1-1v-.876c.028-2.812 1.446-5.416 3.763-6.897a7.713 7.713 0 0 1 7.692-.378ZM9 1a5 5 0 1 1 0 10A5 5 0 0 1 9 1Zm0 2a3 3 0 1 0 0 6 3 3 0 0 0 0-6Z"/></symbol><symbol id="icon-eds-i-user-block-medium" viewBox="0 0 24 24"><path d="M9 1a5 5 0 1 1 0 10A5 5 0 0 1 9 1Zm0 2a3 3 0 1 0 0 6 3 3 0 0 0 0-6Zm9 10a5 5 0 1 1 0 10 5 5 0 0 1 0-10Zm-5.545-.15a1 1 0 1 1-.91 1.78 5.713 5.713 0 0 0-5.705.282c-1.67 1.068-2.728 2.927-2.832 4.956L3.004 20 11.5 20a1 1 0 0 1 .993.883L12.5 21a1 1 0 0 1-1 1H2a1 1 0 0 1-1-1v-.876c.028-2.812 1.446-5.416 3.763-6.897a7.713 7.713 0 0 1 7.692-.378ZM15 18a3 3 0 0 0 4.294 2.707l-4.001-4c-.188.391-.293.83-.293 
1.293Zm3-3c-.463 0-.902.105-1.294.293l4.001 4A3 3 0 0 0 18 15Z"/></symbol><symbol id="icon-eds-i-user-check-medium" viewBox="0 0 24 24"><path d="M9 1a5 5 0 1 1 0 10A5 5 0 0 1 9 1Zm0 2a3 3 0 1 0 0 6 3 3 0 0 0 0-6Zm13.647 12.237a1 1 0 0 1 .116 1.41l-5.091 6a1 1 0 0 1-1.375.144l-2.909-2.25a1 1 0 1 1 1.224-1.582l2.153 1.665 4.472-5.271a1 1 0 0 1 1.41-.116Zm-8.139-.977c.22.214.428.44.622.678a1 1 0 1 1-1.548 1.266 6.025 6.025 0 0 0-1.795-1.49.86.86 0 0 1-.163-.048l-.079-.036a5.721 5.721 0 0 0-2.62-.63l-.194.006c-2.76.134-5.022 2.177-5.592 4.864l-.035.175-.035.213c-.03.201-.05.405-.06.61L3.003 20 10 20a1 1 0 0 1 .993.883L11 21a1 1 0 0 1-1 1H2a1 1 0 0 1-1-1v-.876l.005-.223.02-.356.02-.222.03-.248.022-.15c.02-.133.044-.265.071-.397.44-2.178 1.725-4.105 3.595-5.301a7.75 7.75 0 0 1 3.755-1.215l.12-.004a7.908 7.908 0 0 1 5.87 2.252Z"/></symbol><symbol id="icon-eds-i-user-delete-medium" viewBox="0 0 24 24"><path d="M9 1a5 5 0 1 1 0 10A5 5 0 0 1 9 1Zm0 2a3 3 0 1 0 0 6 3 3 0 0 0 0-6ZM4.763 13.227a7.713 7.713 0 0 1 7.692-.378 1 1 0 1 1-.91 1.781 5.713 5.713 0 0 0-5.705.282c-1.67 1.068-2.728 2.927-2.832 4.956L3.004 20H11.5a1 1 0 0 1 .993.883L12.5 21a1 1 0 0 1-1 1H2a1 1 0 0 1-1-1v-.876c.028-2.812 1.446-5.416 3.763-6.897Zm11.421 1.543 2.554 2.553 2.555-2.553a1 1 0 0 1 1.414 1.414l-2.554 2.554 2.554 2.555a1 1 0 0 1-1.414 1.414l-2.555-2.554-2.554 2.554a1 1 0 0 1-1.414-1.414l2.553-2.555-2.553-2.554a1 1 0 0 1 1.414-1.414Z"/></symbol><symbol id="icon-eds-i-user-edit-medium" viewBox="0 0 24 24"><path d="m19.876 10.77 2.831 2.83a1 1 0 0 1 0 1.415l-7.246 7.246a1 1 0 0 1-.572.284l-3.277.446a1 1 0 0 1-1.125-1.13l.461-3.277a1 1 0 0 1 .283-.567l7.23-7.246a1 1 0 0 1 1.415-.001Zm-7.421 2.08a1 1 0 1 1-.91 1.78 5.713 5.713 0 0 0-5.705.282c-1.67 1.068-2.728 2.927-2.832 4.956L3.004 20 7.5 20a1 1 0 0 1 .993.883L8.5 21a1 1 0 0 1-1 1H2a1 1 0 0 1-1-1v-.876c.028-2.812 1.446-5.416 3.763-6.897a7.713 7.713 0 0 1 7.692-.378Zm6.715.042-6.29 6.3-.23 1.639 1.633-.222 6.302-6.302-1.415-1.415ZM9 1a5 5 0 1 1 0 10A5 5 0 0 1 9 1Zm0 2a3 3 0 1 0 0 6 3 3 0 0 0 0-6Z"/></symbol><symbol id="icon-eds-i-user-linked-medium" viewBox="0 0 24 24"><path d="M15.65 6c.31 0 .706.066 1.122.274C17.522 6.65 18 7.366 18 8.35v12.3c0 .31-.066.706-.274 1.122-.375.75-1.092 1.228-2.076 1.228H3.35a2.52 2.52 0 0 1-1.122-.274C1.478 22.35 1 21.634 1 20.65V8.35c0-.31.066-.706.274-1.122C1.65 6.478 2.366 6 3.35 6h12.3Zm0 2-12.376.002c-.134.007-.17.04-.21.12A.672.672 0 0 0 3 8.35v12.3c0 .198.028.24.122.287.09.044.2.063.228.063h.887c.788-2.269 2.814-3.5 5.263-3.5 2.45 0 4.475 1.231 5.263 3.5h.887c.198 0 .24-.028.287-.122.044-.09.063-.2.063-.228V8.35c0-.198-.028-.24-.122-.287A.672.672 0 0 0 15.65 8ZM9.5 19.5c-1.36 0-2.447.51-3.06 1.5h6.12c-.613-.99-1.7-1.5-3.06-1.5ZM20.65 1A2.35 2.35 0 0 1 23 3.348V15.65A2.35 2.35 0 0 1 20.65 18H20a1 1 0 0 1 0-2h.65a.35.35 0 0 0 .35-.35V3.348A.35.35 0 0 0 20.65 3H8.35a.35.35 0 0 0-.35.348V4a1 1 0 1 1-2 0v-.652A2.35 2.35 0 0 1 8.35 1h12.3ZM9.5 10a3.5 3.5 0 1 1 0 7 3.5 3.5 0 0 1 0-7Zm0 2a1.5 1.5 0 1 0 0 3 1.5 1.5 0 0 0 0-3Z"/></symbol><symbol id="icon-eds-i-user-multiple-medium" viewBox="0 0 24 24"><path d="M9 1a5 5 0 1 1 0 10A5 5 0 0 1 9 1Zm6 0a5 5 0 0 1 0 10 1 1 0 0 1-.117-1.993L15 9a3 3 0 0 0 0-6 1 1 0 0 1 0-2ZM9 3a3 3 0 1 0 0 6 3 3 0 0 0 0-6Zm8.857 9.545a7.99 7.99 0 0 1 2.651 1.715A8.31 8.31 0 0 1 23 20.134V21a1 1 0 0 1-1 1h-3a1 1 0 0 1 0-2h1.995l-.005-.153a6.307 6.307 0 0 0-1.673-3.945l-.204-.209a5.99 5.99 0 0 0-1.988-1.287 1 1 0 1 1 .732-1.861Zm-3.349 1.715A8.31 8.31 0 0 1 17 20.134V21a1 1 0 0 1-1 1H2a1 1 0 0 1-1-1v-.877c.044-4.343 
3.387-7.908 7.638-8.115a7.908 7.908 0 0 1 5.87 2.252ZM9.016 14l-.285.006c-3.104.15-5.58 2.718-5.725 5.9L3.004 20h11.991l-.005-.153a6.307 6.307 0 0 0-1.673-3.945l-.204-.209A5.924 5.924 0 0 0 9.3 14.008L9.016 14Z"/></symbol><symbol id="icon-eds-i-user-notify-medium" viewBox="0 0 24 24"><path d="M9 1a5 5 0 1 1 0 10A5 5 0 0 1 9 1Zm0 2a3 3 0 1 0 0 6 3 3 0 0 0 0-6Zm10 18v1a1 1 0 0 1-2 0v-1h-3a1 1 0 0 1 0-2v-2.818C14 13.885 15.777 12 18 12s4 1.885 4 4.182V19a1 1 0 0 1 0 2h-3Zm-6.545-8.15a1 1 0 1 1-.91 1.78 5.713 5.713 0 0 0-5.705.282c-1.67 1.068-2.728 2.927-2.832 4.956L3.004 20 11.5 20a1 1 0 0 1 .993.883L12.5 21a1 1 0 0 1-1 1H2a1 1 0 0 1-1-1v-.876c.028-2.812 1.446-5.416 3.763-6.897a7.713 7.713 0 0 1 7.692-.378ZM18 14c-1.091 0-2 .964-2 2.182V19h4v-2.818c0-1.165-.832-2.098-1.859-2.177L18 14Z"/></symbol><symbol id="icon-eds-i-user-remove-medium" viewBox="0 0 24 24"><path d="M9 1a5 5 0 1 1 0 10A5 5 0 0 1 9 1Zm0 2a3 3 0 1 0 0 6 3 3 0 0 0 0-6Zm3.455 9.85a1 1 0 1 1-.91 1.78 5.713 5.713 0 0 0-5.705.282c-1.67 1.068-2.728 2.927-2.832 4.956L3.004 20 11.5 20a1 1 0 0 1 .993.883L12.5 21a1 1 0 0 1-1 1H2a1 1 0 0 1-1-1v-.876c.028-2.812 1.446-5.416 3.763-6.897a7.713 7.713 0 0 1 7.692-.378ZM22 17a1 1 0 0 1 0 2h-8a1 1 0 0 1 0-2h8Z"/></symbol><symbol id="icon-eds-i-user-single-medium" viewBox="0 0 24 24"><path d="M12 1a5 5 0 1 1 0 10 5 5 0 0 1 0-10Zm0 2a3 3 0 1 0 0 6 3 3 0 0 0 0-6Zm-.406 9.008a8.965 8.965 0 0 1 6.596 2.494A9.161 9.161 0 0 1 21 21.025V22a1 1 0 0 1-1 1H4a1 1 0 0 1-1-1v-.985c.05-4.825 3.815-8.777 8.594-9.007Zm.39 1.992-.299.006c-3.63.175-6.518 3.127-6.678 6.775L5 21h13.998l-.009-.268a7.157 7.157 0 0 0-1.97-4.573l-.214-.213A6.967 6.967 0 0 0 11.984 14Z"/></symbol><symbol id="icon-eds-i-warning-circle-medium" viewBox="0 0 24 24"><path d="M12 1c6.075 0 11 4.925 11 11s-4.925 11-11 11S1 18.075 1 12 5.925 1 12 1Zm0 2a9 9 0 1 0 0 18 9 9 0 0 0 0-18Zm0 11.5a1.5 1.5 0 0 1 .144 2.993L12 17.5a1.5 1.5 0 0 1 0-3ZM12 6a1 1 0 0 1 1 1v5a1 1 0 0 1-2 0V7a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-warning-filled-medium" viewBox="0 0 24 24"><path d="M12 1c6.075 0 11 4.925 11 11s-4.925 11-11 11S1 18.075 1 12 5.925 1 12 1Zm0 13.5a1.5 1.5 0 0 0 0 3l.144-.007A1.5 1.5 0 0 0 12 14.5ZM12 6a1 1 0 0 0-1 1v5a1 1 0 0 0 2 0V7a1 1 0 0 0-1-1Z"/></symbol><symbol id="icon-chevron-left-medium" viewBox="0 0 24 24"><path d="M15.7194 3.3054C15.3358 2.90809 14.7027 2.89699 14.3054 3.28061L6.54342 10.7757C6.19804 11.09 6 11.5335 6 12C6 12.4665 6.19804 12.91 6.5218 13.204L14.3054 20.7194C14.7027 21.103 15.3358 21.0919 15.7194 20.6946C16.103 20.2973 16.0919 19.6642 15.6946 19.2806L8.155 12L15.6946 4.71939C16.0614 4.36528 16.099 3.79863 15.8009 3.40105L15.7194 3.3054Z"/></symbol><symbol id="icon-chevron-right-medium" viewBox="0 0 24 24"><path d="M8.28061 3.3054C8.66423 2.90809 9.29729 2.89699 9.6946 3.28061L17.4566 10.7757C17.802 11.09 18 11.5335 18 12C18 12.4665 17.802 12.91 17.4782 13.204L9.6946 20.7194C9.29729 21.103 8.66423 21.0919 8.28061 20.6946C7.89699 20.2973 7.90809 19.6642 8.3054 19.2806L15.845 12L8.3054 4.71939C7.93865 4.36528 7.90098 3.79863 8.19908 3.40105L8.28061 3.3054Z"/></symbol><symbol id="icon-eds-alerts" viewBox="0 0 32 32"><path d="M28 12.667c.736 0 1.333.597 1.333 1.333v13.333A3.333 3.333 0 0 1 26 30.667H6a3.333 3.333 0 0 1-3.333-3.334V14a1.333 1.333 0 1 1 2.666 0v1.252L16 21.769l10.667-6.518V14c0-.736.597-1.333 1.333-1.333Zm-1.333 5.71-9.972 6.094c-.427.26-.963.26-1.39 0l-9.972-6.094v8.956c0 .368.299.667.667.667h20a.667.667 0 0 0 .667-.667v-8.956ZM19.333 12a1.333 1.333 0 1 1 0 2.667h-6.666a1.333 1.333 0 1 1 
0-2.667h6.666Zm4-10.667a3.333 3.333 0 0 1 3.334 3.334v6.666a1.333 1.333 0 1 1-2.667 0V4.667A.667.667 0 0 0 23.333 4H8.667A.667.667 0 0 0 8 4.667v6.666a1.333 1.333 0 1 1-2.667 0V4.667a3.333 3.333 0 0 1 3.334-3.334h14.666Zm-4 5.334a1.333 1.333 0 0 1 0 2.666h-6.666a1.333 1.333 0 1 1 0-2.666h6.666Z"/></symbol><symbol id="icon-eds-arrow-up" viewBox="0 0 24 24"><path fill-rule="evenodd" d="m13.002 7.408 4.88 4.88a.99.99 0 0 0 1.32.08l.09-.08c.39-.39.39-1.03 0-1.42l-6.58-6.58a1.01 1.01 0 0 0-1.42 0l-6.58 6.58a1 1 0 0 0-.09 1.32l.08.1a1 1 0 0 0 1.42-.01l4.88-4.87v11.59a.99.99 0 0 0 .88.99l.12.01c.55 0 1-.45 1-1V7.408z" class="layer"/></symbol><symbol id="icon-eds-checklist" viewBox="0 0 32 32"><path d="M19.2 1.333a3.468 3.468 0 0 1 3.381 2.699L24.667 4C26.515 4 28 5.52 28 7.38v19.906c0 1.86-1.485 3.38-3.333 3.38H7.333c-1.848 0-3.333-1.52-3.333-3.38V7.38C4 5.52 5.485 4 7.333 4h2.093A3.468 3.468 0 0 1 12.8 1.333h6.4ZM9.426 6.667H7.333c-.36 0-.666.312-.666.713v19.906c0 .401.305.714.666.714h17.334c.36 0 .666-.313.666-.714V7.38c0-.4-.305-.713-.646-.714l-2.121.033A3.468 3.468 0 0 1 19.2 9.333h-6.4a3.468 3.468 0 0 1-3.374-2.666Zm12.715 5.606c.586.446.7 1.283.253 1.868l-7.111 9.334a1.333 1.333 0 0 1-1.792.306l-3.556-2.333a1.333 1.333 0 1 1 1.463-2.23l2.517 1.651 6.358-8.344a1.333 1.333 0 0 1 1.868-.252ZM19.2 4h-6.4a.8.8 0 0 0-.8.8v1.067a.8.8 0 0 0 .8.8h6.4a.8.8 0 0 0 .8-.8V4.8a.8.8 0 0 0-.8-.8Z"/></symbol><symbol id="icon-eds-citation" viewBox="0 0 36 36"><path d="M23.25 1.5a1.5 1.5 0 0 1 1.06.44l8.25 8.25a1.5 1.5 0 0 1 .44 1.06v19.5c0 2.105-1.645 3.75-3.75 3.75H18a1.5 1.5 0 0 1 0-3h11.25c.448 0 .75-.302.75-.75V11.873L22.628 4.5H8.31a.811.811 0 0 0-.8.68l-.011.13V16.5a1.5 1.5 0 0 1-3 0V5.31A3.81 3.81 0 0 1 8.31 1.5h14.94ZM8.223 20.358a.984.984 0 0 1-.192 1.378l-.048.034c-.54.36-.942.676-1.206.951-.59.614-.885 1.395-.885 2.343.115-.028.288-.042.518-.042.662 0 1.26.237 1.791.711.533.474.799 1.074.799 1.799 0 .753-.259 1.352-.777 1.799-.518.446-1.151.669-1.9.669-1.006 0-1.812-.293-2.417-.878C3.302 28.536 3 27.657 3 26.486c0-1.115.165-2.085.496-2.907.331-.823.734-1.513 1.209-2.071.475-.558.971-.997 1.49-1.318a6.01 6.01 0 0 1 .347-.2 1.321 1.321 0 0 1 1.681.368Zm7.5 0a.984.984 0 0 1-.192 1.378l-.048.034c-.54.36-.942.676-1.206.951-.59.614-.885 1.395-.885 2.343.115-.028.288-.042.518-.042.662 0 1.26.237 1.791.711.533.474.799 1.074.799 1.799 0 .753-.259 1.352-.777 1.799-.518.446-1.151.669-1.9.669-1.006 0-1.812-.293-2.417-.878-.604-.586-.906-1.465-.906-2.636 0-1.115.165-2.085.496-2.907.331-.823.734-1.513 1.209-2.071.475-.558.971-.997 1.49-1.318a6.01 6.01 0 0 1 .347-.2 1.321 1.321 0 0 1 1.681.368Z"/></symbol><symbol id="icon-eds-i-access-indicator" viewBox="0 0 16 16"><circle cx="4.5" cy="11.5" r="3.5" style="fill:currentColor"/><path fill-rule="evenodd" d="M4 3v3a1 1 0 0 1-2 0V2.923C2 1.875 2.84 1 3.909 1h5.909a1 1 0 0 1 .713.298l3.181 3.231a1 1 0 0 1 .288.702v7.846c0 .505-.197.993-.554 1.354a1.902 1.902 0 0 1-1.355.569H10a1 1 0 1 1 0-2h2V5.64L9.4 3H4Z" clip-rule="evenodd" style="fill:#222"/></symbol><symbol id="icon-eds-i-github-medium" viewBox="0 0 24 24"><path d="M 11.964844 0 C 5.347656 0 0 5.269531 0 11.792969 C 0 17.003906 3.425781 21.417969 8.179688 22.976562 C 8.773438 23.09375 8.992188 22.722656 8.992188 22.410156 C 8.992188 22.136719 8.972656 21.203125 8.972656 20.226562 C 5.644531 20.929688 4.953125 18.820312 4.953125 18.820312 C 4.417969 17.453125 3.625 17.101562 3.625 17.101562 C 2.535156 16.378906 3.703125 16.378906 3.703125 16.378906 C 4.914062 16.457031 5.546875 17.589844 5.546875 17.589844 C 
6.617188 19.386719 8.339844 18.878906 9.03125 18.566406 C 9.132812 17.804688 9.449219 17.277344 9.785156 16.984375 C 7.132812 16.710938 4.339844 15.695312 4.339844 11.167969 C 4.339844 9.878906 4.8125 8.824219 5.566406 8.003906 C 5.445312 7.710938 5.03125 6.5 5.683594 4.878906 C 5.683594 4.878906 6.695312 4.566406 8.972656 6.089844 C 9.949219 5.832031 10.953125 5.703125 11.964844 5.699219 C 12.972656 5.699219 14.003906 5.835938 14.957031 6.089844 C 17.234375 4.566406 18.242188 4.878906 18.242188 4.878906 C 18.898438 6.5 18.480469 7.710938 18.363281 8.003906 C 19.136719 8.824219 19.589844 9.878906 19.589844 11.167969 C 19.589844 15.695312 16.796875 16.691406 14.125 16.984375 C 14.558594 17.355469 14.933594 18.058594 14.933594 19.171875 C 14.933594 20.753906 14.914062 22.019531 14.914062 22.410156 C 14.914062 22.722656 15.132812 23.09375 15.726562 22.976562 C 20.480469 21.414062 23.910156 17.003906 23.910156 11.792969 C 23.929688 5.269531 18.558594 0 11.964844 0 Z M 11.964844 0 "/></symbol><symbol id="icon-eds-i-limited-access" viewBox="0 0 16 16"><path fill-rule="evenodd" d="M4 3v3a1 1 0 0 1-2 0V2.923C2 1.875 2.84 1 3.909 1h5.909a1 1 0 0 1 .713.298l3.181 3.231a1 1 0 0 1 .288.702V6a1 1 0 1 1-2 0v-.36L9.4 3H4ZM3 8a1 1 0 0 1 1 1v1a1 1 0 1 1-2 0V9a1 1 0 0 1 1-1Zm10 0a1 1 0 0 1 1 1v1a1 1 0 1 1-2 0V9a1 1 0 0 1 1-1Zm-3.5 6a1 1 0 0 1-1 1h-1a1 1 0 1 1 0-2h1a1 1 0 0 1 1 1Zm2.441-1a1 1 0 0 1 2 0c0 .73-.246 1.306-.706 1.664a1.61 1.61 0 0 1-.876.334l-.032.002H11.5a1 1 0 1 1 0-2h.441ZM4 13a1 1 0 0 0-2 0c0 .73.247 1.306.706 1.664a1.609 1.609 0 0 0 .876.334l.032.002H4.5a1 1 0 1 0 0-2H4Z" clip-rule="evenodd"/></symbol><symbol id="icon-eds-i-subjects-medium" viewBox="0 0 24 24"><g id="icon-subjects-copy" stroke="none" stroke-width="1" fill-rule="evenodd"><path d="M13.3846154,2 C14.7015971,2 15.7692308,3.06762994 15.7692308,4.38461538 L15.7692308,7.15384615 C15.7692308,8.47082629 14.7015955,9.53846154 13.3846154,9.53846154 L13.1038388,9.53925278 C13.2061091,9.85347965 13.3815528,10.1423885 13.6195822,10.3804178 C13.9722182,10.7330539 14.436524,10.9483278 14.9293854,10.9918129 L15.1153846,11 C16.2068332,11 17.2535347,11.433562 18.0254647,12.2054189 C18.6411944,12.8212361 19.0416785,13.6120766 19.1784166,14.4609738 L19.6153846,14.4615385 C20.932386,14.4615385 22,15.5291672 22,16.8461538 L22,19.6153846 C22,20.9323924 20.9323924,22 19.6153846,22 L16.8461538,22 C15.5291672,22 14.4615385,20.932386 14.4615385,19.6153846 L14.4615385,16.8461538 C14.4615385,15.5291737 15.5291737,14.4615385 16.8461538,14.4615385 L17.126925,14.460779 C17.0246537,14.1465537 16.8492179,13.857633 16.6112344,13.6196157 C16.2144418,13.2228606 15.6764136,13 15.1153846,13 C14.0239122,13 12.9771569,12.5664197 12.2053686,11.7946314 C12.1335167,11.7227795 12.0645962,11.6485444 11.9986839,11.5721119 C11.9354038,11.6485444 11.8664833,11.7227795 11.7946314,11.7946314 C11.0228431,12.5664197 9.97608778,13 8.88461538,13 C8.323576,13 7.78552852,13.2228666 7.38881294,13.6195822 C7.15078359,13.8576115 6.97533988,14.1465203 6.8730696,14.4607472 L7.15384615,14.4615385 C8.47082629,14.4615385 9.53846154,15.5291737 9.53846154,16.8461538 L9.53846154,19.6153846 C9.53846154,20.932386 8.47083276,22 7.15384615,22 L4.38461538,22 C3.06762347,22 2,20.9323876 2,19.6153846 L2,16.8461538 C2,15.5291721 3.06762994,14.4615385 4.38461538,14.4615385 L4.8215823,14.4609378 C4.95831893,13.6120029 5.3588057,12.8211623 5.97459937,12.2053686 C6.69125996,11.488708 7.64500941,11.0636656 8.6514968,11.0066017 L8.88461538,11 C9.44565477,11 9.98370225,10.7771334 10.3804178,10.3804178 
C10.6184472,10.1423885 10.7938909,9.85347965 10.8961612,9.53925278 L10.6153846,9.53846154 C9.29840448,9.53846154 8.23076923,8.47082629 8.23076923,7.15384615 L8.23076923,4.38461538 C8.23076923,3.06762994 9.29840286,2 10.6153846,2 L13.3846154,2 Z M7.15384615,16.4615385 L4.38461538,16.4615385 C4.17220099,16.4615385 4,16.63374 4,16.8461538 L4,19.6153846 C4,19.8278134 4.17218833,20 4.38461538,20 L7.15384615,20 C7.36626945,20 7.53846154,19.8278103 7.53846154,19.6153846 L7.53846154,16.8461538 C7.53846154,16.6337432 7.36625679,16.4615385 7.15384615,16.4615385 Z M19.6153846,16.4615385 L16.8461538,16.4615385 C16.6337432,16.4615385 16.4615385,16.6337432 16.4615385,16.8461538 L16.4615385,19.6153846 C16.4615385,19.8278103 16.6337306,20 16.8461538,20 L19.6153846,20 C19.8278229,20 20,19.8278229 20,19.6153846 L20,16.8461538 C20,16.6337306 19.8278103,16.4615385 19.6153846,16.4615385 Z M13.3846154,4 L10.6153846,4 C10.4029708,4 10.2307692,4.17220099 10.2307692,4.38461538 L10.2307692,7.15384615 C10.2307692,7.36625679 10.402974,7.53846154 10.6153846,7.53846154 L13.3846154,7.53846154 C13.597026,7.53846154 13.7692308,7.36625679 13.7692308,7.15384615 L13.7692308,4.38461538 C13.7692308,4.17220099 13.5970292,4 13.3846154,4 Z" id="Shape" fill-rule="nonzero"/></g></symbol><symbol id="icon-eds-small-arrow-left" viewBox="0 0 16 17"><path stroke="currentColor" stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M14 8.092H2m0 0L8 2M2 8.092l6 6.035"/></symbol><symbol id="icon-eds-small-arrow-right" viewBox="0 0 16 16"><g fill-rule="evenodd" stroke="currentColor" stroke-linecap="round" stroke-linejoin="round" stroke-width="2"><path d="M2 8.092h12M8 2l6 6.092M8 14.127l6-6.035"/></g></symbol><symbol id="icon-orcid-logo" viewBox="0 0 40 40"><path fill-rule="evenodd" d="M12.281 10.453c.875 0 1.578-.719 1.578-1.578 0-.86-.703-1.578-1.578-1.578-.875 0-1.578.703-1.578 1.578 0 .86.703 1.578 1.578 1.578Zm-1.203 18.641h2.406V12.359h-2.406v16.735Z"/><path fill-rule="evenodd" d="M17.016 12.36h6.5c6.187 0 8.906 4.421 8.906 8.374 0 4.297-3.36 8.375-8.875 8.375h-6.531V12.36Zm6.234 14.578h-3.828V14.53h3.703c4.688 0 6.828 2.844 6.828 6.203 0 2.063-1.25 6.203-6.703 6.203Z" clip-rule="evenodd"/></symbol></svg> </div> <a class="c-skip-link" href="#main">Skip to main content</a> <header class="eds-c-header" data-eds-c-header> <div class="eds-c-header__container" data-eds-c-header-expander-anchor> <div class="eds-c-header__brand"> <a href="https://link.springer.com" data-test=springerlink-logo data-track="click_imprint_logo" data-track-context="unified header" data-track-action="click logo link" data-track-category="unified header" data-track-label="link" > <img src="/oscar-static/images/darwin/header/img/logo-springer-nature-link-3149409f62.svg" alt="Springer Nature Link"> </a> </div> <a class="c-header__link eds-c-header__link" id="identity-account-widget" href='https://idp.springer.com/auth/personal/springernature?redirect_uri=https://link.springer.com/article/10.1007/s44196-024-00427-6?'><span class="eds-c-header__widget-fragment-title">Log in</span></a> </div> <nav class="eds-c-header__nav" aria-label="header navigation"> <div class="eds-c-header__nav-container"> <div class="eds-c-header__item eds-c-header__item--menu"> <a href="#eds-c-header-nav" class="eds-c-header__link" data-eds-c-header-expander> <svg class="eds-c-header__icon" width="24" height="24" aria-hidden="true" focusable="false"> <use xlink:href="#icon-eds-i-menu-medium"></use> </svg><span>Menu</span> </a> </div> <div class="eds-c-header__item 
eds-c-header__item--inline-links"> <a class="eds-c-header__link" href="https://link.springer.com/journals/" data-track="nav_find_a_journal" data-track-context="unified header" data-track-action="click find a journal" data-track-category="unified header" data-track-label="link" > Find a journal </a> <a class="eds-c-header__link" href="https://www.springernature.com/gp/authors" data-track="nav_how_to_publish" data-track-context="unified header" data-track-action="click publish with us link" data-track-category="unified header" data-track-label="link" > Publish with us </a> <a class="eds-c-header__link" href="https://link.springernature.com/home/" data-track="nav_track_your_research" data-track-context="unified header" data-track-action="click track your research" data-track-category="unified header" data-track-label="link" > Track your research </a> </div> <div class="eds-c-header__link-container"> <div class="eds-c-header__item eds-c-header__item--divider"> <a href="#eds-c-header-popup-search" class="eds-c-header__link" data-eds-c-header-expander data-eds-c-header-test-search-btn> <svg class="eds-c-header__icon" width="24" height="24" aria-hidden="true" focusable="false"> <use xlink:href="#icon-eds-i-search-medium"></use> </svg><span>Search</span> </a> </div> <div id="ecommerce-header-cart-icon-link" class="eds-c-header__item ecommerce-cart" style="display:inline-block"> <a class="eds-c-header__link" href="https://order.springer.com/public/cart" style="appearance:none;border:none;background:none;color:inherit;position:relative"> <svg id="eds-i-cart" class="eds-c-header__icon" xmlns="http://www.w3.org/2000/svg" height="24" width="24" viewBox="0 0 24 24" aria-hidden="true" focusable="false"> <path fill="currentColor" fill-rule="nonzero" d="M2 1a1 1 0 0 0 0 2l1.659.001 2.257 12.808a2.599 2.599 0 0 0 2.435 2.185l.167.004 9.976-.001a2.613 2.613 0 0 0 2.61-1.748l.03-.106 1.755-7.82.032-.107a2.546 2.546 0 0 0-.311-1.986l-.108-.157a2.604 2.604 0 0 0-2.197-1.076L6.042 5l-.56-3.17a1 1 0 0 0-.864-.82l-.12-.007L2.001 1ZM20.35 6.996a.63.63 0 0 1 .54.26.55.55 0 0 1 .082.505l-.028.1L19.2 15.63l-.022.05c-.094.177-.282.299-.526.317l-10.145.002a.61.61 0 0 1-.618-.515L6.394 6.999l13.955-.003ZM18 19a2 2 0 1 0 0 4 2 2 0 0 0 0-4ZM8 19a2 2 0 1 0 0 4 2 2 0 0 0 0-4Z"></path> </svg><span>Cart</span><span class="cart-info" style="display:none;position:absolute;top:10px;right:45px;background-color:#C65301;color:#fff;width:18px;height:18px;font-size:11px;border-radius:50%;line-height:17.5px;text-align:center"></span></a> <script>(function () { var exports = {}; if (window.fetch) { "use strict"; Object.defineProperty(exports, "__esModule", { value: true }); exports.headerWidgetClientInit = void 0; var headerWidgetClientInit = function (getCartInfo) { document.body.addEventListener("updatedCart", function () { updateCartIcon(); }, false); return updateCartIcon(); function updateCartIcon() { return getCartInfo() .then(function (res) { return res.json(); }) .then(refreshCartState) .catch(function (_) { }); } function refreshCartState(json) { var indicator = document.querySelector("#ecommerce-header-cart-icon-link .cart-info"); /* istanbul ignore else */ if (indicator && json.itemCount) { indicator.style.display = 'block'; indicator.textContent = json.itemCount > 9 ? '9+' : json.itemCount.toString(); var moreThanOneItem = json.itemCount > 1; indicator.setAttribute('title', "there ".concat(moreThanOneItem ? "are" : "is", " ").concat(json.itemCount, " item").concat(moreThanOneItem ? 
"s" : "", " in your cart")); } return json; } }; exports.headerWidgetClientInit = headerWidgetClientInit; headerWidgetClientInit( function () { return window.fetch("https://cart.springer.com/cart-info", { credentials: "include", headers: { Accept: "application/json" } }) } ) }})()</script> </div> </div> </div> </nav> </header> <article lang="en" id="main" class="app-masthead__colour-21"> <section class="app-masthead " aria-label="article masthead"> <div class="app-masthead__container"> <div class="app-article-masthead u-sans-serif js-context-bar-sticky-point-masthead" data-track-component="article" data-test="masthead-component"> <div class="app-article-masthead__info"> <nav aria-label="breadcrumbs" data-test="breadcrumbs"> <ol class="c-breadcrumbs c-breadcrumbs--contrast" itemscope itemtype="https://schema.org/BreadcrumbList"> <li class="c-breadcrumbs__item" id="breadcrumb0" itemprop="itemListElement" itemscope="" itemtype="https://schema.org/ListItem"> <a href="/" class="c-breadcrumbs__link" itemprop="item" data-track="click_breadcrumb" data-track-context="article page" data-track-category="article" data-track-action="breadcrumbs" data-track-label="breadcrumb1"><span itemprop="name">Home</span></a><meta itemprop="position" content="1"> <svg class="c-breadcrumbs__chevron" role="img" aria-hidden="true" focusable="false" width="10" height="10" viewBox="0 0 10 10"> <path d="m5.96738168 4.70639573 2.39518594-2.41447274c.37913917-.38219212.98637524-.38972225 1.35419292-.01894278.37750606.38054586.37784436.99719163-.00013556 1.37821513l-4.03074001 4.06319683c-.37758093.38062133-.98937525.38100976-1.367372-.00003075l-4.03091981-4.06337806c-.37759778-.38063832-.38381821-.99150444-.01600053-1.3622839.37750607-.38054587.98772445-.38240057 1.37006824.00302197l2.39538588 2.4146743.96295325.98624457z" fill-rule="evenodd" transform="matrix(0 -1 1 0 0 10)"/> </svg> </li> <li class="c-breadcrumbs__item" id="breadcrumb1" itemprop="itemListElement" itemscope="" itemtype="https://schema.org/ListItem"> <a href="/journal/44196" class="c-breadcrumbs__link" itemprop="item" data-track="click_breadcrumb" data-track-context="article page" data-track-category="article" data-track-action="breadcrumbs" data-track-label="breadcrumb2"><span itemprop="name">International Journal of Computational Intelligence Systems</span></a><meta itemprop="position" content="2"> <svg class="c-breadcrumbs__chevron" role="img" aria-hidden="true" focusable="false" width="10" height="10" viewBox="0 0 10 10"> <path d="m5.96738168 4.70639573 2.39518594-2.41447274c.37913917-.38219212.98637524-.38972225 1.35419292-.01894278.37750606.38054586.37784436.99719163-.00013556 1.37821513l-4.03074001 4.06319683c-.37758093.38062133-.98937525.38100976-1.367372-.00003075l-4.03091981-4.06337806c-.37759778-.38063832-.38381821-.99150444-.01600053-1.3622839.37750607-.38054587.98772445-.38240057 1.37006824.00302197l2.39538588 2.4146743.96295325.98624457z" fill-rule="evenodd" transform="matrix(0 -1 1 0 0 10)"/> </svg> </li> <li class="c-breadcrumbs__item" id="breadcrumb2" itemprop="itemListElement" itemscope="" itemtype="https://schema.org/ListItem"> <span itemprop="name">Article</span><meta itemprop="position" content="3"> </li> </ol> </nav> <h1 class="c-article-title" data-test="article-title" data-article-title="">Intelligent Vehicle Violation Detection System Under Human–Computer Interaction and Computer Vision</h1> <ul class="c-article-identifiers"> <li class="c-article-identifiers__item" data-test="article-category">Research Article</li> <li 
Intelligent Vehicle Violation Detection System Under Human–Computer Interaction and Computer Vision

Yang Ren

Research Article | Open access | Published: 26 February 2024
International Journal of Computational Intelligence Systems, Volume 17, article number 40 (2024)
Abstract

In view of the current problems of low detection accuracy, poor stability and slow detection speed of intelligent vehicle violation detection systems, this article uses human–computer interaction and computer vision technology to address them. First, the picture data required for the experiment is collected through the BIT-Vehicle dataset and preprocessed with computer vision technology. Then, Kalman filtering is used to track the vehicle, helping to better predict its trajectory in the area to be monitored. Finally, human–computer interaction technology is used to build the system's interactive interface and improve its operability. The violation detection system based on computer vision technology achieves an accuracy of more than 96.86% for each of the eight extracted types of violations, with an average detection accuracy of 98%. Through computer vision technology, the system can accurately detect and identify vehicle violations in real time, effectively improving the efficiency and safety of traffic management. In addition, the system pays special attention to the design of human–computer interaction, providing an intuitive and easy-to-use user interface that enables traffic managers to easily monitor and manage traffic conditions. This innovative intelligent vehicle violation detection system is expected to support the development of traffic management technology in the future.
pre-submission checklist</span> <svg class="app-card-service__link-icon" aria-hidden="true" focusable="false"><use xlink:href="#icon-eds-i-arrow-right-small"></use></svg> </a> <p class="app-card-service__description">Avoid common mistakes on your manuscript.</p> </div> <div class="app-card-service__icon-container"> <svg class="app-card-service__icon" aria-hidden="true" focusable="false"> <use xlink:href="#icon-eds-i-clipboard-check-medium"></use> </svg> </div> </div> <div class="main-content"> <section data-title="Introduction"><div class="c-article-section" id="Sec1-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="Sec1"><span class="c-article-section__title-number">1 </span>Introduction</h2><div class="c-article-section__content" id="Sec1-content"><p>In recent years, people’s lifestyles have been continuously improving, and cars have become more family oriented. More and more people are using private cars for transportation, and cars have become the most important means of transportation. With the increase in the number of cars, there have been some violations. The situation of some people stealing other car license plates and forging fake license plates is becoming increasingly severe in order to avoid police investigation after traffic accidents. Traditional methods and systems for detecting violations not only consume police force and money, but also are very inefficient. At present, most traffic violation detections are based on manual methods, and there are obvious shortcomings in the accuracy and speed of vehicle violation detection systems [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 1" title="Lv, Z., Qiao, L., You, I.: 6G-Enabled network in box for internet of connected vehicles. IEEE Trans. Intellig. Transport. Syst. (2020). 
 https://doi.org/10.1109/TITS.2020.3034817
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR1" id="ref-link-section-d44058815e287">1</a>]. Therefore, this article would use human–computer interaction and computer vision technology to study the violation detection system. Computer vision technology can timely and accurately analyze and understand the collected image data, quickly detect images, and timely discover violation information. Through human–computer interaction technology, multiple information fusion of intelligent vehicle violation detection systems can be achieved. This helps to improve the accuracy, reliability, and fault tolerance of the detection system, reducing problems caused by single sensor failure or misjudgment.</p><p>Vehicle violation is one of the main causes of traffic accidents, which poses a serious threat to the safety of people’s lives and property. Through human–computer interaction and computer vision, the intelligent vehicle violation detection system can detect and correct violations in time, reduce the probability of traffic accidents, and improve the safety of road traffic. At the same time, violations will lead to traffic congestion and delays, affecting people’s travel efficiency and comfort. Through the intelligent vehicle violation detection system, it can reduce the occurrence of violations, improve the smooth degree of traffic flow, and alleviate the urban traffic congestion problem.</p><p>Nowadays, the main means of transportation for people to travel is by car. Cars not only bring many conveniences to people’s lives, but also increase the probability of traffic accidents and violations. Vehicle detection systems can effectively solve this problem [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 2" title="Asadianfam, S., Shamsi, M., Rasouli Kenari, A.: Big data platform of traffic violation detection system: identifying the risky behaviors of vehicle drivers. Multimedia Tools Appl. 79(33–34), 24645–24684 (2020). 
 https://doi.org/10.1007/s11042-020-09099-8
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR2" id="ref-link-section-d44058815e296">2</a>, <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 3" title="Sahraoui, Y., Kerrache, C. A., Korichi, A., Nour, B., Adnane, A., Hussain, R.: “DeepDist: a deep-learning-based IoV framework for real-time objects and distance violation detection.” IEEE Internet Things Magaz. 33, 30–34 (2020) 
 https://doi.org/10.1109/IOTM.0001.2000116
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR3" id="ref-link-section-d44058815e299">3</a>], for which many researchers have conducted research. Maha Vishnu, V. C. proposed a mechanism for detecting violations using traffic videos, detecting accidents through dynamic traffic signal control, and classifying vehicles using flow gradient feature histograms [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 4" title="Maha Vishnu, V.C., Rajalakshmi, M., Nedunchezhian, R.: Intelligent traffic video surveillance and accident detection system with dynamic traffic signal control. Cluster Comput. 215, 135–147 (2018). 
 https://doi.org/10.1007/s10586-017-0974-5
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR4" id="ref-link-section-d44058815e302">4</a>]. Zhang, Rusheng proposed a new reinforcement learning algorithm for detecting intelligent traffic signal control systems, and studied the performance of the system under different traffic flows and road network types. Although the system can effectively reduce the average waiting time of vehicles at intersections, the accuracy of violation checks needs to be improved [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 5" title="Zhang, R., Ishikawa, A., Wang, W., Striner, B., Tonguz, O.K.: Using reinforcement learning with partial vehicle detection for intelligent traffic signal control. IEEE Trans. Intellig. Transport. Syst. 22(1), 404–415 (2020). 
 https://doi.org/10.1109/TITS.2019.2958859
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR5" id="ref-link-section-d44058815e305">5</a>]. Liu Shuo used the YOLOv3 algorithm to detect vehicles in traffic intersection images, improving the model’s detection ability for small target objects such as license plates and faces [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 6" title="Liu Shuo., Gu Yuhai., Rao Wenjun., Wang Juyuan.: “A method for detecting illegal vehicles based on the optimized YOLOv3 algorithm”. J. Chongqing Univer. Technol. (Nat. Sci.) 35.4, 135–141 (2021). 
 https://doi.org/10.3969/j.issn.1674-8425(z).2021.04.018
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR6" id="ref-link-section-d44058815e308">6</a>]. Alagarsamy, Saravanan designed a system to detect and analyze driver violations of traffic rules. The proposed system tracks driver activities and stores violations in a database [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 7" title="Alagarsamy, S., Ramkumar, S., Kamatchi, K., Shankar, H., Kumar, A., Karthick, S., Kumar, P.: “Designing a advanced technique for detection and violation of traffic control system.” J. Crit. Rev. 7.8, 2874–2879 (2020). 
 https://doi.org/10.31838/jcr.07.08.473
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR7" id="ref-link-section-d44058815e312">7</a>]. Bhat, Akhilalakshmi T believed that the machine has completed all tasks including automatic vehicle detection and violations. Traffic records would be collected through closed circuit television recordings and then detected by the system for violations [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 8" title="Bhat, A.T., Rao, M.S., Pai, D.G.: Traffic violation detection in India using genetic algorithm. Glob. Trans. Proc. 2(2), 309–314 (2021). 
 https://doi.org/10.1016/j.gltp.2021.08.056
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR8" id="ref-link-section-d44058815e315">8</a>]. Charran, R. Shree proposed a system to automatically detect car violations and ultimately process tickets by capturing violations and corresponding vehicle numbers in a database [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 9" title="Charran, R. S., Dubey. R. K.: “Two-Wheeler Vehicle Traffic Violations Detection and Automated Ticketing for Indian Road Scenario.” IEEE Trans. Intellig. Transport. Syst. 23.11, 22002–22007 (2022). 
 https://doi.org/10.1109/TITS.2022.3186679
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR9" id="ref-link-section-d44058815e318">9</a>]. Ozkul, Mukremin proposed a traffic violation detection and reporting system. It does not rely on expensive infrastructure or the presence of law enforcement personnel. It determines whether these changes comply with the traffic rules encoded in the system by observing the changes in nearby vehicles [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 10" title="Ozkul, M., Çapuni, I.: Police-less multi-party traffic violation detection and reporting system with privacy preservation. IET Intellig. Trans. Syst. 12(5), 351–358 (2018). 
 https://doi.org/10.1049/iet-its.2017.0122
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR10" id="ref-link-section-d44058815e321">10</a>]. In summary, research on vehicle violation detection has achieved some results, but there are still shortcomings in terms of accuracy in violation detection. Therefore, this article would use human–computer interaction and computer vision technology to study it, in order to improve the accuracy of detection.</p><p>In the literature review section, the comparison table of this literature and other literature is shown in Table <a data-track="click" data-track-label="link" data-track-action="table anchor" href="/article/10.1007/s44196-024-00427-6#Tab1">1</a>.</p><div class="c-article-table" data-test="inline-table" data-container-section="table" id="table-1"><figure><figcaption class="c-article-table__figcaption"><b id="Tab1" data-test="table-caption">Table 1 Comparison of the literature contents</b></figcaption><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="table-link" data-track="click" data-track-action="view table" data-track-label="button" rel="nofollow" href="/article/10.1007/s44196-024-00427-6/tables/1" aria-label="Full size table 1"><span>Full size table</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>Computer vision typically requires extracting and reconstructing three-dimensional data from two-dimensional data, so that computers can understand the environment and respond or react. It uses computers to simulate lively objects for preprocessing, view and analyze the characteristics of the target to determine its situation, thereby achieving target detection and planning functions [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 11" title="Abbas, A. F., Sheikh, U. U., Al-Dhief, F. T., Mohd, M. N. H.: “A comprehensive review of vehicle detection using computer vision.” TELKOMNIKA (Telecommunication Computing Electronics and Control) 19.3, 838–850 (2021). 
 https://doi.org/10.12928/telkomnika.v19i3.12880
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR11" id="ref-link-section-d44058815e477">11</a>, <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 12" title="Guo, M.-H., Xu, T.X., Liu, J.J., Liu, Z.N., Jiang, P.T., Mu, T.J., et al.: Attention mechanisms in computer vision: a survey. Comput. Visual Media 8(3), 331–368 (2022). 
 https://doi.org/10.1007/s41095-022-0271-y
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR12" id="ref-link-section-d44058815e480">12</a>]. Yang Zi believed that computer vision technology has been able to detect vehicles on the road. For this purpose, research was conducted on vehicle detection in different environments, and it was found that due to the variability of road driving environment, computer vision technology has effectively improved the accuracy of monitoring automobile illegal behavior [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 13" title="Yang, Z., Pun-Cheng, L.S.C.: Vehicle detection in intelligent transportation systems and its applications under varying environments: a review. Image Vis. Comput. 6(9), 143–154 (2018). 
 https://doi.org/10.1016/j.imavis.2017.09.008
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR13" id="ref-link-section-d44058815e483">13</a>]. Kumar, Aman proposed an intelligent traffic violation detection system. It helped detect violations in different scenarios and provide corresponding alerts based on the corresponding types of violations [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 14" title="Kumar, A., Kundu, S., Kumar, S., Tiwari, U. K., Kalra, J.: “S-tvds: Smart traffic violation detection system for Indian traffic scenario.” Int. J. Innovat. Technol. Explor. Eng. (IJITEE) 8.4S3, 6–10 (2019). 
 https://doi.org/10.35940/ijitee.D1002.0384S319
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR14" id="ref-link-section-d44058815e486">14</a>]. Mahmud Yusuf Tanrikulu Ensuring sound arrangements when organizing and promoting vaccines for people with dementia and establishing support mechanisms where public health measures designed to control the spread of the virus have profound and often tragic consequences for people with dementia, their families and caregivers [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 15" title="Tutsoy, O., Tanrikulu. M. Y.: “Priority and age specific vaccination algorithm for the pandemic diseases: a comprehensive parametric prediction model.” BMC Med. Inform. Decis. Mak. 22.1, 4 (2022). 
 https://doi.org/10.13140/RG.2.2.25044.32646
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR15" id="ref-link-section-d44058815e489">15</a>]. Tutsoy, Onder established constrained multidimensional mathematics and meta-heuristic algorithms based on graph theory to learn the unknown parameters of large-scale epidemiological model, with the specified parameters and the coupling parameters of the optimization problem. The results obtained under equal conditions show that the mathematical optimization algorithm CM-RLS is better than the MA algorithm [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 16" title="Tutsoy, O.: Graph theory based large-scale machine learning with multi-dimensional constrained optimization approaches for exact epidemiological modelling of pandemic diseases. IEEE Trans. Pattern Analysis Mach. Intelligence (2023). 
 https://doi.org/10.1109/TPAMI.2023.3256421
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR16" id="ref-link-section-d44058815e493">16</a>]. Arabi, Saeed provided a practical and comprehensive engineering vehicle detection solution based on deep learning and computer vision [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 17" title="Arabi, S., Haghighat, A., Sharma, A.: A deep-learning-based computer vision solution for construction vehicle detection. Comput. Aided Civil Infrastr. Eng. 35(7), 753–767 (2020). 
 https://doi.org/10.1111/mice.12530
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR17" id="ref-link-section-d44058815e496">17</a>]. Overall, using computer vision technology can effectively improve the accuracy of detection.</p><p>Among the accidents that occur every year, the proportion of accidents caused by violations is very high, which may directly cause financial losses of billions of yuan. In order to strengthen the detection of cars and avoid traffic accidents, in addition to strengthening legal education for drivers, a system for detecting illegal vehicles is also needed to help better correct driver violations. This article would use human–computer interaction and computer vision technology to construct an intelligent vehicle violation detection system. It first utilizes self-photography and public databases to collect the image dataset required for the experiment. Then, this article uses computer vision technology to preprocess a portion of the data images. It mainly involves noise filtering and image enhancement processing, followed by the use of Kalman filtering for vehicle tracking research. Finally, based on human–computer interaction technology for system interface design, this article effectively improves the performance of the violation detection system through the above steps. This also improves the accuracy and speed of the system’s detection.</p><p>The novelty of the intelligent vehicle violation detection system based on human–computer interaction and computer vision is mainly reflected in the following aspects: 1. Comprehensive use of human–computer interaction and computer vision technology: the system combines human–computer interaction and computer vision two technologies, so that the system can not only automatically detect and identify illegal behaviors, but also provide intuitive monitoring interface for traffic managers through human–computer interaction, so as to realize efficient human–machine cooperation. 2. Focus on user-friendly design: compared with the traditional monitoring system, the system pays more attention to user-friendly design. Through the human–computer interaction technology, users can intuitively view the information of the illegal vehicles, and respond quickly. This greatly improves the efficiency and convenience of traffic management. 3. Dynamic adjustment and adaptive learning ability: the system has the ability of dynamic adjustment and adaptive learning, and can automatically adjust the working mode according to different environments and conditions to ensure the accuracy of detection. In addition, the system can also improve its ability to detect violations through continuous learning and training.</p></div></div></section><section data-title="Vehicle Violation Image Data Based on Computer Vision-Related Technology"><div class="c-article-section" id="Sec2-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="Sec2"><span class="c-article-section__title-number">2 </span>Vehicle Violation Image Data Based on Computer Vision-Related Technology</h2><div class="c-article-section__content" id="Sec2-content"><p>Computer vision is a general term that involves any visual content computation [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 18" title="Wiley, V., Lucas, T.: “Computer vision and image processing: a paper review.” Int. J. Artif. Intellig. Res. 2.1, 29–36 (2018).
 https://doi.org/10.29099/ijair.v2i1.42
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR18" id="ref-link-section-d44058815e513">18</a>, <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 19" title="Tian, H., Wang, T., Liu, Y., Qiao, X., Li, Y.: Computer vision technology in agricultural automation—a review. Inform. Process. Agric. 7(1), 1–19 (2020). 
 https://doi.org/10.1016/j.inpa.2019.09.006
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR19" id="ref-link-section-d44058815e516">19</a>]. It learns how to help computers gain a higher level of understanding from digital images or videos, and extract valuable information from the real world for decision-making purposes. In order to better detect violations of intelligent vehicles and improve the accuracy and real-time performance of violations detection, it is necessary to process the front image before detecting the image.</p><h3 class="c-article__sub-heading" id="Sec3"><span class="c-article-section__title-number">2.1 </span>Data Collection</h3><p>Data reliability mainly refers to the accuracy and completeness of the data. For the vehicle violation detection system, the accuracy of the data is very important, because the wrong detection results may lead to misjudgment or missed judgment. During data acquisition, there may be outliers, duplicate data, or incomplete data, requiring data cleaning to remove these invalid or low-quality data. For the data in the training set, annotation is required, that is, each sample is classified or marked by professionals. The annotation process needs to follow uniform rules and standards to ensure the accuracy and consistency of the data.</p><p>There are two sources of data for this article. One is online data, and the other is how to divide the size of photos taken with a mobile phone into four parts. The datasets used are the BIT-Vehicle dataset from Beijing University of Technology [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 20" title="Qianlong, D., Wei, S.: Model recognition based on improved sparse stack coding. Comput. Eng. Appl. 56(1), 136–141 (2020). 
 https://doi.org/10.1109/ACCESS.2020.2997286
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR20" id="ref-link-section-d44058815e529">20</a>] and the Apollo Scape dataset [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 21" title="Huang, X., Wang, P., Cheng, X., Zhou, D., Geng, Q., Yang, R.: The apolloscape open dataset for autonomous driving and its application. IEEE Trans. Pattern Anal. Mach. Intell. 42(10), 2702–2719 (2019). 
 https://doi.org/10.1109/TPAMI.2019.2926463
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR21" id="ref-link-section-d44058815e532">21</a>]. The paper extracted 49,223 images of size 1920 × 1080 or 1600 × 1200 from them, which were taken from cameras at intersections. Part of the image data listed in this article is shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig1">1</a>.</p><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-1" data-title="Fig. 1"><figure><figcaption><b id="Fig1" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 1</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><a class="c-article-section__figure-link" data-test="img-link" data-track="click" data-track-label="image" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/1" rel="nofollow"><picture><source type="image/webp" srcset="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig1_HTML.jpg?as=webp"><img aria-describedby="Fig1" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig1_HTML.jpg" alt="figure 1" loading="lazy" width="685" height="411"></picture></a></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-1-desc"><p>Partial dataset image data</p></div></div><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="article-link" data-track="click" data-track-label="button" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/1" data-track-dest="link:Figure1 Full size image" aria-label="Full size image figure 1" rel="nofollow"><span>Full size image</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>As shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig1">1</a>, this article randomly selected a portion of data from the dataset studied, each with different characteristics. Some of the vehicles in the pictures are illegal, while others are not. Based on these datasets, all the datasets in this study were photos taken by mobile phones and other public databases, with a total of 49,521 images and 56,372 annotation boxes.</p><h3 class="c-article__sub-heading" id="Sec4"><span class="c-article-section__title-number">2.2 </span>Image Preprocessing</h3><h4 class="c-article__sub-heading c-article__sub-heading--small" id="Sec5"><span class="c-article-section__title-number">2.2.1 </span>Image Denoising</h4><p>During the recording and transmission process of automotive digital images, they are affected by factors such as their own equipment and external noise, resulting in images with noise effects. Pixel blocks or elements that appear abrupt in an image can be understood as being affected by noise. This would seriously affect the quality of car images, making them unclear, so it is necessary to filter the images. 
Due to the difference between the grayscale values of noise in the image and those of nearby normal pixels, the mean filtering method utilizes this feature to replace the grayscale value of the center pixel with the average of all pixel grayscale values in its neighborhood. This reduces the impact of image noise and achieves the filtering effect [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 22" title="Rakshit, M., Das, S.: An efficient ECG denoising methodology using empirical mode decomposition and adaptive switching mean filter. Biomed. Signal Process. Control 40(3), 140–148 (2018). 
 https://doi.org/10.1016/j.bspc.2017.09.020
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR22" id="ref-link-section-d44058815e573">22</a>, <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 23" title="Shengchun, W., Jin, Li., Shanshan, Hu.: A calculation method for floating datum of complex surface area based on non-local mean filtering. Prog. Geophys. 33(5), 1985–1988 (2018). 
 https://doi.org/10.6038/pg2018CC0132
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR23" id="ref-link-section-d44058815e576">23</a>]. The mathematical expression is</p><div id="Equ1" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$h_{{\left( {a,b} \right)}} = \frac{1}{N}\mathop \sum \limits_{{\left( {n,m} \right) \in k}} f_{{\left( {a,b} \right)}} \left( {a - n,b - m} \right)$$</span></div><div class="c-article-equation__number"> (1) </div></div><p>In the above formula, <span class="mathjax-tex">\(k\)</span> represents the set of all pixels in the neighborhood determined by the <span class="mathjax-tex">\(\left(a,b\right)\)</span> point. <span class="mathjax-tex">\(N\)</span> represents the number of pixels in the neighborhood determined by <span class="mathjax-tex">\(\left(a,b\right)\)</span> points, while <span class="mathjax-tex">\({f}_{\left(a,b\right)}\)</span> represents the grayscale value of the original image at <span class="mathjax-tex">\(\left(a,b\right)\)</span> points. <span class="mathjax-tex">\({h}_{\left(a,b\right)}\)</span> represents the grayscale value of the image after mean filtering for the pixel. The comparison of the effects of mean filtering on the original image is shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig2">2</a>.</p><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-2" data-title="Fig. 2"><figure><figcaption><b id="Fig2" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 2</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><a class="c-article-section__figure-link" data-test="img-link" data-track="click" data-track-label="image" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/2" rel="nofollow"><picture><img aria-describedby="Fig2" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig2_HTML.png" alt="figure 2" loading="lazy" width="685" height="456"></picture></a></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-2-desc"><p>Comparison of the effects of mean filtering on the original image</p></div></div><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="article-link" data-track="click" data-track-label="button" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/2" data-track-dest="link:Figure2 Full size image" aria-label="Full size image figure 2" rel="nofollow"><span>Full size image</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><h4 class="c-article__sub-heading c-article__sub-heading--small" id="Sec6"><span class="c-article-section__title-number">2.2.2 </span>Image Contrast Enhancement</h4><p>The cameras used for vehicle image collection are in different environments, which greatly affects the photos taken. Therefore, improving the contrast of vehicle violation images is very important. In the same image, certain areas have lower pixel values, making these features unclear. 
<h4 class="c-article__sub-heading c-article__sub-heading--small" id="Sec6"><span class="c-article-section__title-number">2.2.2 </span>Image Contrast Enhancement</h4><p>The cameras used for vehicle image collection are in different environments, which greatly affects the photos taken. Therefore, improving the contrast of vehicle violation images is very important. In the same image, certain areas have lower pixel values, making these features unclear. This can be improved by equalizing the histogram of the target image, which is beneficial for subsequent processing. The method of histogram equalization is simple and effective and has been widely used in image enhancement. This article also uses this method to improve image contrast [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 24" title="Dyke, R.M., Hormann, K.: Histogram equalization using a selective filter. Vis. Comput. 39(12), 6221–6235 (2023). 
 https://doi.org/10.1007/s00371-022-02723-8
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR24" id="ref-link-section-d44058815e884">24</a>, <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 25" title="Agrawal, S., Panda, R., Mishro, P. K., Abraham, A.: “A novel joint histogram equalization based image contrast enhancement.” J. King Saud University Comput. Inform. Sci. 34.4, 1172–1182 (2022). 
 https://doi.org/10.1016/j.jksuci.2019.05.010
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR25" id="ref-link-section-d44058815e887">25</a>]. The principle of histogram equalization is to widen the grayscale values with more pixels in image processing and merge the grayscale values with smaller pixels, thereby improving the contrast of the image and making it clearer [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 26" title="Vijayalakshmi, D., Nath, M.K.: A novel contrast enhancement technique using gradient-based joint histogram equalization. Circuits Syst. Signal Process. 40(8), 3929–3967 (2021). 
 https://doi.org/10.1007/s00034-021-01655-3
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR26" id="ref-link-section-d44058815e890">26</a>]. The comparison before and after histogram equalization and the corresponding image histograms are shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig3">3</a>.</p><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-3" data-title="Fig. 3"><figure><figcaption><b id="Fig3" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 3</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><a class="c-article-section__figure-link" data-test="img-link" data-track="click" data-track-label="image" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/3" rel="nofollow"><picture><source type="image/webp" srcset="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig3_HTML.png?as=webp"><img aria-describedby="Fig3" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig3_HTML.png" alt="figure 3" loading="lazy" width="685" height="547"></picture></a></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-3-desc"><p>Comparison before and after histogram equalization and corresponding image histograms</p></div></div><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="article-link" data-track="click" data-track-label="button" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/3" data-track-dest="link:Figure3 Full size image" aria-label="Full size image figure 3" rel="nofollow"><span>Full size image</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>Parameter updates or learning rules are a key part in machine learning that describes how a model adjusts its internal parameters to improve performance against new data. In supervised learning, this often involves minimizing a loss function that measures the difference between the model predictions and the actual labels. For the vehicle violation detection system, these labels indicate whether there is any violation in the image and what type of violation it is. Invite professional annotators or teams to mark the collected vehicle images one by one. The labeling process needs to follow the unified labeling criteria to ensure the accuracy of each sample label. To ensure the accuracy of annotation, annotated data can be spot-sampled and verified to detect and correct possible errors.</p></div></div></section><section data-title="Vehicle Tracking on Kalman Filter"><div class="c-article-section" id="Sec7-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="Sec7"><span class="c-article-section__title-number">3 </span>Vehicle Tracking on Kalman Filter</h2><div class="c-article-section__content" id="Sec7-content"><p>The detection and tracking technology of moving objects is one of the research hotspots in the fields of digital image processing and recognition, as well as computer vision. 
</div></div></section><section data-title="Vehicle Tracking on Kalman Filter"><div class="c-article-section" id="Sec7-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="Sec7"><span class="c-article-section__title-number">3 </span>Vehicle Tracking on Kalman Filter</h2><div class="c-article-section__content" id="Sec7-content"><p>The detection and tracking of moving objects is one of the research hotspots in the fields of digital image processing and recognition, as well as computer vision. It has many applications in various aspects of human life. Computer vision technology has also made great progress in vehicle tracking and brings many benefits. The first thing that a violation detection system based on computer vision needs to do is to detect and recognize each moving object in the image, and then analyze the specific situation of the moving object according to the necessary criteria to determine whether further action is needed. Therefore, this article uses Kalman filtering to study vehicle tracking. Based on estimation theory, Kalman filtering introduces the concept of state equations and establishes a system state model [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 27" title="Pei, Y., Biswas, S., Fussell, D.S., Pingali, K.: An elementary introduction to Kalman filtering. Commun. ACM 62(11), 122–133 (2019). 
 https://doi.org/10.1145/3363294
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR27" id="ref-link-section-d44058815e926">27</a>, <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 28" title="Fang, H., Tian, N., Wang, Y., Zhou, M., Haile, M.A.: Nonlinear Bayesian estimation: From Kalman filtering to a broader horizon. IEEE/CAA J. Automat. Sinica 5(2), 401–417 (2018). 
 https://doi.org/10.1109/JAS.2017.7510808
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR28" id="ref-link-section-d44058815e929">28</a>].</p><p>The Kalman filter is described by a state equation and an observation equation. A discrete dynamic system can be considered to consist of two systems, and the expression is as follows:</p><div id="Equ2" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$Y_{h} = XY_{h - 1} + CF_{h} + M_{h - 1}$$</span></div><div class="c-article-equation__number"> (2) </div></div><p>Among them, <span class="mathjax-tex">\({\text{X}}\)</span> is the transition matrix used to measure the transition from the state of the system <span class="mathjax-tex">\({\text{h}}-1\)</span> to the state at that time <span class="mathjax-tex">\({\text{h}}\)</span>. <span class="mathjax-tex">\({{\text{M}}}_{{\text{h}}-1}\)</span> is the zero mean used to represent the state model error, and the observation system can be expressed by the following observation equation, as shown as follows:</p><div id="Equ3" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$V_{h} = F_{h} Y_{h} + X_{h}$$</span></div><div class="c-article-equation__number"> (3) </div></div><p>In the equation, h is called the observation matrix. <span class="mathjax-tex">\({V}_{h}\)</span> represents the value of the matrix.</p><p>The principle of Kalman filtering is linear least squares error estimation. Due to the fact that the noise is Gaussian white noise, its limitation on target movement is very high [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 29" title="Liu, S., Wang, Z., Chen, Y., Wei, G.: Protocol-based unscented Kalman filtering in the presence of stochastic uncertainties. IEEE Trans. Autom. Control 65(3), 1303–1309 (2019). 
 https://doi.org/10.1109/TAC.2019.2929817
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR29" id="ref-link-section-d44058815e1180">29</a>, <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 30" title="Huang, Y., Zhang, Y., Zhao, Y., Shi, P., Chambers, J.A.: A novel outlier-robust Kalman filtering framework based on statistical similarity measure. IEEE Trans. Autom. Control 66(6), 2677–2692 (2020). 
 https://doi.org/10.1109/TAC.2020.3011443
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR30" id="ref-link-section-d44058815e1183">30</a>]. The simulation results of Kalman filtering trajectory for uniform linear motion are shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig4">4</a>.</p><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-4" data-title="Fig. 4"><figure><figcaption><b id="Fig4" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 4</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><a class="c-article-section__figure-link" data-test="img-link" data-track="click" data-track-label="image" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/4" rel="nofollow"><picture><source type="image/webp" srcset="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig4_HTML.png?as=webp"><img aria-describedby="Fig4" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig4_HTML.png" alt="figure 4" loading="lazy" width="685" height="533"></picture></a></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-4-desc"><p>Simulation results of Kalman filtering trajectory for uniform linear motion</p></div></div><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="article-link" data-track="click" data-track-label="button" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/4" data-track-dest="link:Figure4 Full size image" aria-label="Full size image figure 4" rel="nofollow"><span>Full size image</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>In order to narrow the search range of the target vehicle in the new image, it is necessary to first use a certain algorithm to estimate the approximate position of the target vehicle in the new image frame and determine its matching range. The Kalman filter tracking model continuously predicts, corrects, and adjusts target violations through calculations [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 31" title="Xiangyu, K., Xiaopeng, Z., Xuanyong, Z., et al.: Adaptive dynamic state estimation of distribution network based on interacting multiple model [J]. IEEE Trans. Sustain. Energy APR 13(2), 643–652 (2022)" href="/article/10.1007/s44196-024-00427-6#ref-CR31" id="ref-link-section-d44058815e1209">31</a>]. Due to the small time delay between adjacent frames in the image, the speed and direction of target motion do not immediately change. 
Based on this characteristic, the matching of the target vehicle is determined as follows:</p><div id="Equ4" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$\overline{{a_{i} }} = \overline{{a_{3} }} + \left( {\overline{{a_{3} }} - \overline{{a_{2} }} } \right) + \left( {\left( {\overline{{a_{3} }} - \overline{{a_{2} }} } \right) - \left( {\overline{{a_{2} }} - \overline{{a_{1} }} } \right)} \right)$$</span></div><div class="c-article-equation__number"> (4) </div></div><div id="Equ5" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$\overline{{b_{i} }} = \overline{{b_{3} }} + \left( {\overline{{b_{3} }} - \overline{{b_{2} }} } \right) + \left( {\left( {\overline{{b_{3} }} - \overline{{b_{2} }} } \right) - \left( {\overline{{b_{2} }} - \overline{{b_{1} }} } \right)} \right)$$</span></div><div class="c-article-equation__number"> (5) </div></div><div id="Equ6" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$Q = \sqrt {(\overline{{a_{i} }} - \overline{{a_{3} }} )^{2} + (\overline{{b_{i} }} - \overline{{b_{3} }} )^{2} }$$</span></div><div class="c-article-equation__number"> (6) </div></div><p><span class="mathjax-tex">\((\overline{{a }_{i}},\overline{{b }_{i}})\)</span> is the predicted centroid coordinate of the target vehicle in the next frame of a continuous image sequence; <span class="mathjax-tex">\(Q\)</span> is the radius of the target vehicle search area in the next image. The circular area with the predicted centroid as the center and <span class="mathjax-tex">\(Q\)</span> as the radius serves as the matching range of the target vehicle in the next frame of the image, avoiding search and matching over the entire image. This not only increases the accuracy of tracking but also improves its real-time performance. That is to say, when completing target matching, the vehicle to be matched must meet the following condition:</p><div id="Equ7" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$d = \sqrt {(\overline{{a_{i} }} - \overline{{a_{j} }} )^{2} + (\overline{{b_{i} }} - \overline{{b_{j} }} )^{2} } < Q$$</span></div><div class="c-article-equation__number"> (7) </div></div><p><span class="mathjax-tex">\((\overline{{a }_{j}},\overline{{b }_{j}})\)</span> is the centroid coordinate of the vehicle to be matched; <span class="mathjax-tex">\({\text{d}}\)</span> is the distance between the centroid of the vehicle to be matched and the predicted centroid of the target vehicle. For target vehicles that have just entered the image frame and have not yet created a tracking sequence, this method cannot narrow the search range due to the lack of prior information. Instead, feature sets can be used directly to match within the entire image range.</p>
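<p>The prediction and gating of Eqs. (4)–(7) can be sketched in a few lines of Python (an illustrative sketch only, not the paper’s implementation; function names are hypothetical):</p><pre><code class="language-python">import math

def predict_centroid(c1, c2, c3):
    """Second-order extrapolation of the next centroid from the last three
    observed centroids (a1, b1), (a2, b2), (a3, b3), as in Eqs. (4)-(5)."""
    (a1, b1), (a2, b2), (a3, b3) = c1, c2, c3
    ai = a3 + (a3 - a2) + ((a3 - a2) - (a2 - a1))
    bi = b3 + (b3 - b2) + ((b3 - b2) - (b2 - b1))
    return (ai, bi)

def within_gate(c3, predicted, candidate):
    """Eq. (6): search radius Q; Eq. (7): accept a candidate centroid
    only if its distance d to the prediction is smaller than Q."""
    Q = math.dist(predicted, c3)
    d = math.dist(predicted, candidate)
    return d &lt; Q
</code></pre>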
<p>The Kalman filter tracking effect is shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig5">5</a>.</p><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-5" data-title="Fig. 5"><figure><figcaption><b id="Fig5" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 5</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><a class="c-article-section__figure-link" data-test="img-link" data-track="click" data-track-label="image" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/5" rel="nofollow"><picture><source type="image/webp" srcset="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig5_HTML.jpg?as=webp"><img aria-describedby="Fig5" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig5_HTML.jpg" alt="figure 5" loading="lazy" width="685" height="518"></picture></a></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-5-desc"><p>Kalman filter tracking effect</p></div></div><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="article-link" data-track="click" data-track-label="button" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/5" data-track-dest="link:Figure5 Full size image" aria-label="Full size image figure 5" rel="nofollow"><span>Full size image</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>As shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig5">5</a>, Kalman filtering can still track the target vehicles effectively even when the traffic flow is high. Based on the results of object detection, this article frames the foreground blobs in the figure, as shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig5">5</a>. The red rectangle is the smallest box that can frame the foreground, and its centroid coordinates can be calculated; the overall matching condition is:</p><div id="Equ8" class="c-article-equation"><div class="c-article-equation__content"><span class="mathjax-tex">$$dif = x\sqrt {(\overline{{a_{i} }} - \overline{{a_{3} }} )^{2} + (\overline{{b_{i} }} - \overline{{b_{3} }} )^{2} } + \gamma \left| {M - M_{i} } \right| + \beta \left| {\overline{H} - \overline{{H_{i} }} } \right| < t$$</span></div><div class="c-article-equation__number"> (8) </div></div><p>Among them, <span class="mathjax-tex">\(M\)</span> and <span class="mathjax-tex">\({M}_{i}\)</span>, respectively, represent the sizes of the target and of the vehicle to be matched, and <span class="mathjax-tex">\(\overline{H}\)</span> and <span class="mathjax-tex">\(\overline{{H}_{i}}\)</span> their average grayscale values; <span class="mathjax-tex">\(x\)</span>, <span class="mathjax-tex">\(\gamma\)</span>, <span class="mathjax-tex">\(\beta\)</span> are the weighting coefficients; <span class="mathjax-tex">\(t\)</span> is the threshold, and <span class="mathjax-tex">\(dif\)</span> is the degree of dissimilarity. The smaller <span class="mathjax-tex">\(dif\)</span> is, the higher the degree of matching between the two targets, and the greater the likelihood that they are the same target vehicle.</p><p>Based on the speed and direction of the vehicle and the elapsed time, the estimated new position of the vehicle can be calculated. The new speed of the vehicle can likewise be estimated from the speed and direction in the previous state and the time that has passed. The actual observed location is obtained from sensor data on the vehicle’s true position. In some cases, control inputs, such as acceleration or braking commands, may be introduced to influence the speed and direction of the vehicle. These control inputs can be used as additional inputs to the Kalman filter. The Kalman filter uses noise models to describe both the measurement noise and the process noise. These noise models can be tuned using historical data and experience to optimize filter performance.</p>
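<p>For illustration, a minimal constant-velocity Kalman tracker in Python might look as follows (a sketch only: the matrices loosely mirror the roles of the symbols in Eqs. (2) and (3), and all numerical values are assumptions to be tuned, not the paper’s settings):</p><pre><code class="language-python">import numpy as np

dt = 1 / 6  # frame interval; the tracking video below is recorded at 6 fps

# State Y = [a, b, va, vb]: centroid position and velocity (constant-velocity model).
X = np.array([[1., 0., dt, 0.],   # state transition matrix, cf. Eq. (2)
              [0., 1., 0., dt],
              [0., 0., 1., 0.],
              [0., 0., 0., 1.]])
F = np.array([[1., 0., 0., 0.],   # observation matrix, cf. Eq. (3)
              [0., 1., 0., 0.]])
Mcov = np.eye(4) * 1e-2           # process-noise covariance (tunable)
Rcov = np.eye(2) * 1.0            # measurement-noise covariance (tunable)

def kalman_step(y, P, v):
    """One predict/correct cycle given the measured centroid v = [a, b]."""
    # Predict: propagate the state and its covariance with the motion model.
    y = X @ y
    P = X @ P @ X.T + Mcov
    # Correct: blend the prediction with the observed centroid.
    S = F @ P @ F.T + Rcov
    K = P @ F.T @ np.linalg.inv(S)   # Kalman gain
    y = y + K @ (v - F @ y)
    P = (np.eye(4) - K @ F) @ P
    return y, P
</code></pre><p>The gain K automatically weighs the prediction against the measurement: the larger the assumed measurement noise, the more the filter trusts its own motion model.</p>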
<p>When conducting vehicle tracking, the first step is to determine the vehicle’s operating parameter values and initial values through multiple on-site tests. Then, this article calculates the features from the vehicle detection results, establishes a centroid position estimation model, and performs search matching. This article selects a video for vehicle tracking, recorded at a frequency of 6 frames per second. Vehicles entering the tracking sequence are marked, and the same vehicle is identified with the same number. The effect of the tracking processing is shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig6">6</a>.</p><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-6" data-title="Fig. 6"><figure><figcaption><b id="Fig6" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 6</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><a class="c-article-section__figure-link" data-test="img-link" data-track="click" data-track-label="image" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/6" rel="nofollow"><picture><source type="image/webp" srcset="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig6_HTML.png?as=webp"><img aria-describedby="Fig6" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig6_HTML.png" alt="figure 6" loading="lazy" width="685" height="592"></picture></a></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-6-desc"><p>Effect of tracking processing</p></div></div><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="article-link" data-track="click" data-track-label="button" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/6" data-track-dest="link:Figure6 Full size image" aria-label="Full size image figure 6" rel="nofollow"><span>Full size image</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>Figure <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig6">6</a> shows the result of the vehicle tracking process based on the Kalman filter prediction tracking method. Figure <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig6">6</a>(a–d) show the motion of two moving vehicles, numbered 4 and 5, respectively. From Fig. 
<a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig6">6</a>(a–d), it can be seen that the area of the car has changed due to the car moving away from the camera along the lane. Therefore, the block diagram of the tracking algorithm based on Kalman filtering has also changed. It ensures that the size of the tracking window and the size of the moving vehicle are basically consistent at each moment. In addition, from the tracking window of the moving vehicle in the figure, it can be seen that the upper and lower positions of the tracking window are relatively accurate.</p></div></div></section><section data-title="Design of Intelligent Vehicle Violation Detection System"><div class="c-article-section" id="Sec8-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="Sec8"><span class="c-article-section__title-number">4 </span>Design of Intelligent Vehicle Violation Detection System</h2><div class="c-article-section__content" id="Sec8-content"><h3 class="c-article__sub-heading" id="Sec9"><span class="c-article-section__title-number">4.1 </span>System Requirements and Architecture</h3><p>With the development of humanity and the advancement of technology, the number of vehicles continues to increase, leading to an increasing number of traffic problems, and illegal driving is one of the main reasons for traffic accidents. Therefore, establishing an inspection system that can detect illegal vehicles in a timely manner is very important [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 32" title="Go, M.J., Park, M., Yeo, J.: “Detecting vehicles that are illegally driving on road shoulders using faster R-CNN.” J. Korea Instit. Intellig. Trans. Syst. 21.1, 105–122 (2022). 
 https://doi.org/10.12815/kits.2022.21.1.105
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR32" id="ref-link-section-d44058815e2289">32</a>, <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 33" title="Saritha, M., Rajalakshmi, S., Angel Deborah, S., Milton, R.S., Thirumla Devi, S., Vrithika, M., et al.: RFID-based traffic violation detection and traffic flow analysis system. Int. J. Pure Appl. Math. 118(20), 319–328 (2018). 
 https://doi.org/10.1007/s11042-020-09714-8
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR33" id="ref-link-section-d44058815e2292">33</a>]. In the era of advanced technology, traffic management would not be easy to succeed if it relies on numerous police officers as before. In order to efficiently complete the detection of road violations, it is necessary to understand the intelligent vehicle violation detection system. This system typically uses computer vision and image processing technology to identify and detect vehicle violations, saving police force and time [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 34" title="Agarwal, P., Chopra, K., Kashif, M., Kumari, V.: Implementing ALPR for detection of traffic violations: a step towards sustainability. Procedia Comput. Sci. 13(2), 738–743 (2018). 
 https://doi.org/10.1016/j.procs.2018.05.085
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR34" id="ref-link-section-d44058815e2295">34</a>, <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 35" title="Santhosh, K.K., Dogra, D.P., Roy, P.P.: Anomaly detection in road traffic using visual surveillance: A survey. ACM Comput. Surveys (CSUR) 53(6), 1–26 (2020). 
 https://doi.org/10.1145/3417989
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR35" id="ref-link-section-d44058815e2298">35</a>]. At the same time, it has good stability and can work for a long time. The detailed information is shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig7">7</a>.</p><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-7" data-title="Fig. 7"><figure><figcaption><b id="Fig7" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 7</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><a class="c-article-section__figure-link" data-test="img-link" data-track="click" data-track-label="image" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/7" rel="nofollow"><picture><source type="image/webp" srcset="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig7_HTML.png?as=webp"><img aria-describedby="Fig7" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig7_HTML.png" alt="figure 7" loading="lazy" width="685" height="714"></picture></a></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-7-desc"><p>Architecture diagram of intelligent vehicle violation detection system</p></div></div><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="article-link" data-track="click" data-track-label="button" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/7" data-track-dest="link:Figure7 Full size image" aria-label="Full size image figure 7" rel="nofollow"><span>Full size image</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>As shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig7">7</a>, the system deploys cameras in areas or intersections that need to be monitored. It can take photos of vehicles based on photo triggering patterns, and then send the captured photos to the central processing system. It utilizes communication networks to collect images from the automotive data storage system for transmission and storage.</p><p>The violation detection module detects the vehicle’s speed and violation behavior based on the tracking results [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 36" title="Kousar, S., Aslam, F., Kausar, N., Pamucar, D., Addis, G.M.: Fault diagnosis in regenerative braking system of hybrid electric vehicles by using semigroup of finite-state deterministic fully intuitionistic fuzzy automata. Comput. Intellig. Neurosci. (2022). 
 https://doi.org/10.1155/2022/3684727
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR36" id="ref-link-section-d44058815e2331">36</a>, <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 37" title="Liu, Y., Zhong, S., Kausar, N., Zhang, C., Mohammadzadeh, A., Pamucar, D.: “A stable fuzzy-based computational model and control for inductions motors.” Cmes-Comput. Model. Eng. Sci. 138, 793–812 (2024). 
 https://doi.org/10.32604/cmes.2023.028175
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR37" id="ref-link-section-d44058815e2334">37</a>]. It can use computer vision technology to locate and recognize vehicle information, and then calculate the vehicle’s speed based on driver’s license information. If the speed of a car exceeds the threshold or is different from the speed of surrounding cars, it is determined that the car has violated regulations. The main characteristic parameter settings of the corresponding control are shown in Table <a data-track="click" data-track-label="link" data-track-action="table anchor" href="/article/10.1007/s44196-024-00427-6#Tab2">2</a>:</p><div class="c-article-table" data-test="inline-table" data-container-section="table" id="table-2"><figure><figcaption class="c-article-table__figcaption"><b id="Tab2" data-test="table-caption">Table 2 Control property parameter table</b></figcaption><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="table-link" data-track="click" data-track-action="view table" data-track-label="button" rel="nofollow" href="/article/10.1007/s44196-024-00427-6/tables/2" aria-label="Full size table 2"><span>Full size table</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><h3 class="c-article__sub-heading" id="Sec10"><span class="c-article-section__title-number">4.2 </span>Illegal Parking of Vehicles</h3><p>At present, there is no unified and effective detection system for illegal parking, and research on illegal parking is still ongoing. This article compares the changes in background pixel values before and after vehicles enters the area of interest, and uses computer vision technology to construct an intelligent vehicle violation detection system. When an illegal car enters the area of interest and is illegally parked, the target car would stay for a period of time, so the problem is transformed into detecting a stationary target. Usually, the pixel values of each point in the background image of the region of interest can remain unchanged for a long time, but the background itself can change in a short period of time due to environmental influences. Therefore, when an object moves, the pixel value would undergo a significant change, and this change would be greater than the amplitude change of the background pixel itself [<a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 38" title="Rafiq, N., Yaqoob, N., Kausar, N., Shams, M., Mir, N.A., Gaba, Y.U., Khan, N.: Computer-based fuzzy numerical method for solving engineering and real-world applications. Math. Prob. Eng. (2021). 
 https://doi.org/10.1155/2021/6916282
 
 " href="/article/10.1007/s44196-024-00427-6#ref-CR38" id="ref-link-section-d44058815e2679">38</a>, <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 39" title="Shams, M., Rafiq, N., Kausar, N., Mir, N. A., Alalyani, A.: “Computer oriented numerical scheme for solving engineering problems.” Comput. Syst. Sci. Eng. 42.2, 689–701 (2022). 
 https://doi.org/10.32604/csse.2022.022269
 
 ." href="/article/10.1007/s44196-024-00427-6#ref-CR39" id="ref-link-section-d44058815e2682">39</a>]. Therefore, when significant changes are detected in pixels within the region of interest, it can be determined that the target in front has entered the region of interest. When the background value pixel changes rapidly and significantly, and returns to the initial background state value in a short period of time, it indicates that a moving target has passed through the region of interest but has not stopped. When the background value changes rapidly and its pixel value remains unchanged for a period of time, it indicates that a moving object has entered and stopped, and there may be a violation of parking regulations.</p><h3 class="c-article__sub-heading" id="Sec11"><span class="c-article-section__title-number">4.3 </span>System Interface Under Human–Computer Interaction</h3><p>Each system is an implementation of human–computer interaction. The user interface is a platform for sending and exchanging information between users and computers. This platform should not only meet the needs of work information interaction, but also interact without the need for information, and the information processing speed should also meet the requirements. The intelligent vehicle violation detection system based on human–computer interaction and computer vision technology is similar to traffic police speed measurement for speeding recognition. Each intersection equipped with detection equipment is divided into two groups. Based on the driving distance and required speed of the two intersections, the starting time for the same vehicle to pass through a single intersection machine can be set. The intersection compares the difference time with the starting time based on the vehicle. If the measured value is greater than the starting time, it is considered that the vehicle is driving normally. If the measured value is less than the starting time, it is considered that the vehicle is speeding and an alarm is sent to the traffic police teams closest to the intersection where the vehicle passes through based on the network. At the same time, photos of passing through two intersections are stored as law enforcement evidence. The software system processing interface for speeding vehicles is shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig8">8</a>.</p><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-8" data-title="Fig. 8"><figure><figcaption><b id="Fig8" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 
<h3 class="c-article__sub-heading" id="Sec11"><span class="c-article-section__title-number">4.3 </span>System Interface Under Human–Computer Interaction</h3><p>Each system is an implementation of human–computer interaction. The user interface is the platform through which users and the computer exchange information. This platform should not only meet the information-interaction needs of the task but also remain easy to operate, and its information processing speed should meet the requirements. The intelligent vehicle violation detection system based on human–computer interaction and computer vision technology identifies speeding in a way similar to traffic police speed measurement. The detection devices installed at intersections are divided into groups of two. Based on the driving distance between the two intersections and the required speed, a threshold travel time for the same vehicle to pass from one intersection to the other can be set. The measured travel time is compared with this threshold: if the measured value is greater than the threshold, the vehicle is considered to be driving normally; if the measured value is less than the threshold, the vehicle is considered to be speeding, and an alarm is sent over the network to the traffic police team closest to the intersection the vehicle passes through. At the same time, the photos taken at the two intersections are stored as law enforcement evidence. The software system processing interface for speeding vehicles is shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig8">8</a>.</p><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-8" data-title="Fig. 8"><figure><figcaption><b id="Fig8" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 8</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><a class="c-article-section__figure-link" data-test="img-link" data-track="click" data-track-label="image" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/8" rel="nofollow"><picture><source type="image/webp" srcset="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig8_HTML.png?as=webp"><img aria-describedby="Fig8" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig8_HTML.png" alt="figure 8" loading="lazy" width="685" height="527"></picture></a></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-8-desc"><p>System processing interface for speeding vehicles</p></div></div><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="article-link" data-track="click" data-track-label="button" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/8" data-track-dest="link:Figure8 Full size image" aria-label="Full size image figure 8" rel="nofollow"><span>Full size image</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div></div></div></section><section data-title="Performance of Intelligent Vehicle Violation Detection System"><div class="c-article-section" id="Sec12-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="Sec12"><span class="c-article-section__title-number">5 </span>Performance of Intelligent Vehicle Violation Detection System</h2><div class="c-article-section__content" id="Sec12-content"><p>For intelligent vehicle violation detection systems, configuring advanced computers can ensure the success of training and the accuracy of results, and training requires both the memory and the computing power of the graphics card. Therefore, the training was run on a laboratory server. The hardware configuration, operating system, and related software environment of the laboratory server are shown in Table <a data-track="click" data-track-label="link" data-track-action="table anchor" href="/article/10.1007/s44196-024-00427-6#Tab3">3</a>.</p><div class="c-article-table" data-test="inline-table" data-container-section="table" id="table-3"><figure><figcaption class="c-article-table__figcaption"><b id="Tab3" data-test="table-caption">Table 3 Software and hardware configuration of laboratory computers</b></figcaption><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="table-link" data-track="click" data-track-action="view table" data-track-label="button" rel="nofollow" href="/article/10.1007/s44196-024-00427-6/tables/3" aria-label="Full size table 3"><span>Full size table</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>Parameter updates, or learning rules, are a key part of machine learning: they describe how a model adjusts its internal parameters to improve performance on new data. 
In supervised learning, this often involves minimizing a loss function that measures the difference between the model predictions and the actual labels. For the vehicle violation detection system, these labels indicate whether there is any violation in the image and what type of violation it is. Professional annotators or teams are invited to mark the collected vehicle images one by one. The labeling process needs to follow unified labeling criteria to ensure that each sample’s label is accurate. To ensure the accuracy of annotation, the annotated data can be spot-checked and verified to detect and correct possible errors.</p><p>This article uses computer vision technology to process vehicle violation images and then combines human–computer interaction to construct the system. The constructed system outperforms other traditional detection technologies. To capture illegal vehicles, the system can install cameras beside or above the road for image collection; a converter in the computer platform then converts the captured video into images, which are stored in computer memory. Computer vision technology is then used to process the collected video images to determine whether a vehicle has engaged in illegal behavior, and any violation discovered must be handled promptly. In order to further demonstrate the performance superiority of the system studied in this article, it is compared with intelligent vehicle violation detection systems based on Video Image Processing (VIP) technology, Magnetic Induction Coil (MIC) technology, and Digital Image Processing (DIP) technology. The specific performance comparison is shown in Table <a data-track="click" data-track-label="link" data-track-action="table anchor" href="/article/10.1007/s44196-024-00427-6#Tab4">4</a>.</p><div class="c-article-table" data-test="inline-table" data-container-section="table" id="table-4"><figure><figcaption class="c-article-table__figcaption"><b id="Tab4" data-test="table-caption">Table 4 Comparison of performance between video detection technology and traditional detection technology</b></figcaption><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="table-link" data-track="click" data-track-action="view table" data-track-label="button" rel="nofollow" href="/article/10.1007/s44196-024-00427-6/tables/4" aria-label="Full size table 4"><span>Full size table</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>As shown in Table <a data-track="click" data-track-label="link" data-track-action="table anchor" href="/article/10.1007/s44196-024-00427-6#Tab4">4</a>, compared with the other vehicle violation detection systems, the system constructed in this article using human–computer interaction and computer vision technology has superior overall performance, with significant advantages. The system has a simple operating interface and complete functions, such as recognition and tracking for the various vehicle violation detection tasks. It does not require users to spend a lot of time familiarizing themselves with it and is easy to get started with. At the same time, its detection accuracy is higher than that of the other systems, its stability is better, and the accuracy of its warnings for illegal vehicles is also higher. 
<p>The training set is the dataset used to train and optimize the vehicle violation detection system. It should contain all the types of violations of interest so that the system learns to identify and classify each situation, and its data should be annotated, i.e., each sample should carry a label indicating whether it is a violation. The test set is used to evaluate the performance of the vehicle violation detection system, i.e., to measure how the system performs on unseen data. The validation set is used to tune the hyperparameters and select the best model: by evaluating the model on the validation set, the optimal hyperparameter combination and the best model can be found. All of these sets are updated regularly to reflect changes in the data. This study focuses on training accuracy.</p><p>There are many types of vehicle violations, such as driving against traffic, crossing the yellow line, illegal parking, obscuring the license plate, running red lights, speeding, improper lane changes, and failing to yield to pedestrians, and detecting these violations is very important. This article utilizes Computer Vision (CV) technology to construct an intelligent vehicle violation detection system that detects these types of violations with high accuracy. The research data are sourced from the database images mentioned above, and these captured photos are tested. The experimental data for the above 8 violation types are shown in Table <a data-track="click" data-track-label="link" data-track-action="table anchor" href="/article/10.1007/s44196-024-00427-6#Tab5">5</a>.</p><div class="c-article-table" data-test="inline-table" data-container-section="table" id="table-5"><figure><figcaption class="c-article-table__figcaption"><b id="Tab5" data-test="table-caption">Table 5 Experimental data on 8 types of violations</b></figcaption><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="table-link" data-track="click" data-track-action="view table" data-track-label="button" rel="nofollow" href="/article/10.1007/s44196-024-00427-6/tables/5" aria-label="Full size table 5"><span>Full size table</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>As shown in Table <a data-track="click" data-track-label="link" data-track-action="table anchor" href="/article/10.1007/s44196-024-00427-6#Tab5">5</a>, a total of 8 violation types are selected and numbered 1 to 8. A different amount of image data was extracted to detect each violation, totaling 49,521 images; among them, there are 9687 images of failing to yield to pedestrians, while illegal parking has the least image data, with only 2681 images.</p><p>The violation detection system studied in this article is used to detect the 8 types of violations, with 100 experiments conducted; the final result is the average accuracy over the 100 detection runs. The results are compared with the violation detection systems based on VIP, MIC, and DIP, and the specific comparison is shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig9">9</a>.</p>
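<p>This averaging protocol can be sketched as follows; <code>run_detection</code> is a hypothetical stand-in for a single detection experiment, and the accuracy values it returns are illustrative, not the paper's measurements.</p><pre><code class="language-python"># Hedged sketch of the evaluation protocol described above: each violation
# type is detected in 100 experiments and the per-type accuracies are
# averaged. run_detection is a hypothetical placeholder for one experiment.
import random
from statistics import mean

VIOLATION_TYPES = range(1, 9)   # the 8 violation types, numbered 1-8

def run_detection(violation_type: int) -> float:
    """Placeholder for one experiment; returns the accuracy of one run."""
    return random.uniform(0.9686, 1.0)   # illustrative values only

per_type_accuracy = {
    v: mean(run_detection(v) for _ in range(100))   # average of 100 runs
    for v in VIOLATION_TYPES
}
overall = mean(per_type_accuracy.values())          # average across types
print(per_type_accuracy, overall)
</code></pre>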
<div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-9" data-title="Fig. 9"><figure><figcaption><b id="Fig9" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 9</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><a class="c-article-section__figure-link" data-test="img-link" data-track="click" data-track-label="image" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/9" rel="nofollow"><picture><source type="image/webp" srcset="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig9_HTML.png?as=webp"><img aria-describedby="Fig9" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig9_HTML.png" alt="figure 9" loading="lazy" width="685" height="499"></picture></a></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-9-desc"><p>Comparison of detection accuracy of different vehicle violation detection systems</p></div></div><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="article-link" data-track="click" data-track-label="button" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/9" data-track-dest="link:Figure9 Full size image" aria-label="Full size image figure 9" rel="nofollow"><span>Full size image</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>In Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig9">9</a>, the x-axis represents the violation number and the y-axis represents the detection accuracy. The vehicle violation detection system in this study detects the different types of violations with much higher accuracy than the other violation detection systems. The CV-based violation detection system achieves an accuracy of over 96.86% for each of the 8 extracted violation types, whereas the accuracies of the VIP-, MIC-, and DIP-based systems remain below 91.92%, 87.27%, and 92.35%, respectively. For running red lights, the system studied in this article reaches an accuracy of 99.69%, approaching 100%; this is 8.81, 14.51, and 10.4 percentage points higher than the VIP-, MIC-, and DIP-based systems, respectively. The average accuracy of the CV-based violation detection system over the 8 violation types is 98%, which is 7.55, 12.49, and 7.87 percentage points higher than the VIP-, MIC-, and DIP-based systems, respectively.</p><p>With the continuous increase in the number of cars, more and more image data are generated. Quickly finding evidence of vehicle violations in this massive image data is crucial for investigating and punishing driver violations and for reducing accident rates. The vehicle violation detection system studied in this article detects images at a faster speed and with a shorter detection time.</p>
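<p>A minimal sketch of how such per-batch detection times can be measured is shown below; <code>detect_violations</code> is a hypothetical placeholder for the system's detection routine, since the paper does not publish its timing code.</p><pre><code class="language-python"># Hedged sketch of measuring detection time for a batch of images (the
# paper reports times in ms for batches of up to 200 images).
import time

def detect_violations(image) -> list:
    """Placeholder detector; returns the violations found in one image."""
    return []

def timed_detection(images) -> float:
    """Return the total detection time for a batch of images, in ms."""
    start = time.perf_counter()
    for image in images:
        detect_violations(image)
    return (time.perf_counter() - start) * 1000.0

batch = [object()] * 200            # e.g., a batch of 200 images
print(f"{timed_detection(batch):.2f} ms")
</code></pre>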
<p>To further highlight the superiority of the system studied in this article in terms of image detection speed, it is compared with the VIP-, MIC-, and DIP-based violation detection systems. The specific comparison results are shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig10">10</a>.</p><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-10" data-title="Fig. 10"><figure><figcaption><b id="Fig10" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 10</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><a class="c-article-section__figure-link" data-test="img-link" data-track="click" data-track-label="image" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/10" rel="nofollow"><picture><source type="image/webp" srcset="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig10_HTML.png?as=webp"><img aria-describedby="Fig10" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs44196-024-00427-6/MediaObjects/44196_2024_427_Fig10_HTML.png" alt="figure 10" loading="lazy" width="685" height="562"></picture></a></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-10-desc"><p>Comparison of image detection time for different vehicle violation detection systems</p></div></div><div class="u-text-right u-hide-print"><a class="c-article__pill-button" data-test="article-link" data-track="click" data-track-label="button" data-track-action="view figure" href="/article/10.1007/s44196-024-00427-6/figures/10" data-track-dest="link:Figure10 Full size image" aria-label="Full size image figure 10" rel="nofollow"><span>Full size image</span><svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-chevron-right-small"></use></svg></a></div></figure></div><p>In Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig10">10</a>, the x-axis represents the number of images to be detected and the y-axis represents the time required for detection. As shown in Fig. <a data-track="click" data-track-label="link" data-track-action="figure anchor" href="/article/10.1007/s44196-024-00427-6#Fig10">10</a>, the system studied in this article takes much less time than the other violation detection systems to detect the extracted images; its detection speed is faster. The MIC-based detection system takes much longer than the other three. When the number of detected images is 150 or fewer, the VIP-based detection system requires more time than the CV-based system but less than the MIC- and DIP-based systems; when the number of detected images is 160 or more, it requires more time than the CV- and DIP-based systems but less than the MIC-based system. When the number of detected images is 10, the system studied in this article requires 5.54 ms, which is 1.58 ms, 4.24 ms, and 3.05 ms less than the VIP-, MIC-, and DIP-based systems, respectively.
When the number of detected images is 200, the system studied in this article requires 68.39 ms, which is 48.03 ms, 55.17 ms, and 28.73 ms less than the VIP-, MIC-, and DIP-based systems, respectively.</p></div></div></section><section data-title="Conclusions"><div class="c-article-section" id="Sec13-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="Sec13"><span class="c-article-section__title-number">6 </span>Conclusions</h2><div class="c-article-section__content" id="Sec13-content"><p>Deaths caused by traffic accidents in China number in the hundreds of thousands each year, among the highest in the world, and most of these accidents are caused by driver violations. Therefore, using computer vision technology to detect, record, and penalize vehicle violations is of great significance for easing the law-enforcement burden, managing the traffic environment, preventing accidents, and ensuring pedestrian safety. The intelligent vehicle violation detection system based on human–computer interaction and computer vision detects and records vehicle violations through these two technologies. The system studied in this article used computer vision technology to preprocess the extracted images and then applied Kalman filtering to track vehicles. It then used the intelligent interaction interface built with human–computer interaction technology to detect, record, and count vehicle violations and presented the final results to the administrator. The system can therefore recognize traffic signs, detect vehicle violations, and provide useful information and warnings to drivers through human–computer interaction. This can effectively improve traffic safety, reduce manual patrol and monitoring costs, and improve the accuracy and efficiency of violation detection. Intelligent vehicle violation detection systems based on human–computer interaction and computer vision may nevertheless encounter difficulties. Possible challenges and corresponding solutions are as follows: computer vision systems require a large amount of annotated data for model training; if annotated data are limited in practice, the dataset can be expanded with techniques such as data augmentation, and semi-supervised or unsupervised learning can be used to exploit unannotated data. To make the system easy to use and understand, an efficient and intuitive human–machine interface needs to be designed; the optimal design scheme can be found through user research and design thinking.</p>
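<p>As a concrete illustration of the Kalman filtering step summarized above, the following is a minimal constant-velocity tracking sketch; the state layout and noise covariances are assumptions for illustration, not the paper's published parameters.</p><pre><code class="language-python"># Minimal constant-velocity Kalman filter for tracking a vehicle's image
# position; a sketch only -- the noise levels here are assumed values.
import numpy as np

dt = 1.0                                  # time step between frames
F = np.array([[1, 0, dt, 0],              # state transition: x, y, vx, vy
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
H = np.array([[1, 0, 0, 0],               # we only measure position (x, y)
              [0, 1, 0, 0]], dtype=float)
Q = np.eye(4) * 1e-2                      # process noise (assumed)
R = np.eye(2) * 1.0                       # measurement noise (assumed)

x = np.zeros(4)                           # initial state estimate
P = np.eye(4) * 10.0                      # initial estimate covariance

def kalman_step(z):
    """One predict/update cycle given a measured position z = (x, y)."""
    global x, P
    x = F @ x                             # predict state
    P = F @ P @ F.T + Q                   # predict covariance
    y = z - H @ x                         # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ y                         # update state with measurement
    P = (np.eye(4) - K @ H) @ P           # update covariance
    return x[:2]                          # filtered position estimate

for z in [np.array([10., 5.]), np.array([12., 6.]), np.array([14., 7.])]:
    print(kalman_step(z))
</code></pre>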
</div></div></section> </div> <section data-title="Data Availability"><div class="c-article-section" id="data-availability-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="data-availability">Data Availability</h2><div class="c-article-section__content" id="data-availability-content"> <p>Data are available upon reasonable request.</p> </div></div></section>
<a href="https://doi.org/10.1155/2021/6916282" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.1155/2021/6916282">https://doi.org/10.1155/2021/6916282</a></p><p class="c-article-references__links u-hide-print"><a data-track="click_references" rel="nofollow noopener" data-track-label="10.1155/2021/6916282" data-track-item_id="10.1155/2021/6916282" data-track-value="article reference" data-track-action="article reference" href="https://doi.org/10.1155%2F2021%2F6916282" aria-label="Article reference 38" data-doi="10.1155/2021/6916282">Article</a> <a data-track="click_references" data-track-action="google scholar reference" data-track-value="google scholar reference" data-track-label="link" data-track-item_id="link" rel="nofollow noopener" aria-label="Google Scholar reference 38" href="http://scholar.google.com/scholar_lookup?&title=Computer-based%20fuzzy%20numerical%20method%20for%20solving%20engineering%20and%20real-world%20applications&journal=Math.%20Prob.%20Eng.&doi=10.1155%2F2021%2F6916282&publication_year=2021&author=Rafiq%2CN&author=Yaqoob%2CN&author=Kausar%2CN&author=Shams%2CM&author=Mir%2CNA&author=Gaba%2CYU&author=Khan%2CN"> Google Scholar</a> </p></li><li class="c-article-references__item js-c-reading-companion-references-item" data-counter="39."><p class="c-article-references__text" id="ref-CR39">Shams, M., Rafiq, N., Kausar, N., Mir, N. A., Alalyani, A.: “Computer oriented numerical scheme for solving engineering problems.” Comput. Syst. Sci. Eng. <b>42.2</b>, 689–701 (2022). <a href="https://doi.org/10.32604/csse.2022.022269" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="10.32604/csse.2022.022269">https://doi.org/10.32604/csse.2022.022269</a>.</p></li></ol><p class="c-article-references__download u-hide-print"><a data-track="click" data-track-action="download citation references" data-track-label="link" rel="nofollow" href="https://citation-needed.springer.com/v2/references/10.1007/s44196-024-00427-6?format=refman&flavour=references">Download references<svg width="16" height="16" focusable="false" role="img" aria-hidden="true" class="u-icon"><use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-download-medium"></use></svg></a></p></div></div></div></section></div><section data-title="Funding"><div class="c-article-section" id="Fun-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="Fun">Funding</h2><div class="c-article-section__content" id="Fun-content"><p>This work was supported by City University of Seattle.</p></div></div></section><section aria-labelledby="author-information" data-title="Author information"><div class="c-article-section" id="author-information-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="author-information">Author information</h2><div class="c-article-section__content" id="author-information-content"><h3 class="c-article__sub-heading" id="affiliations">Authors and Affiliations</h3><ol class="c-article-author-affiliation__list"><li id="Aff1"><p class="c-article-author-affiliation__address">MSCS, City University of Seattle, Seattle, Washington, 98102, USA</p><p class="c-article-author-affiliation__authors-list">Yang Ren</p></li></ol><div class="u-js-hide u-hide-print" data-test="author-info"><span class="c-article__sub-heading">Authors</span><ol 
class="c-article-authors-search u-list-reset"><li id="auth-Yang-Ren-Aff1"><span class="c-article-authors-search__title u-h3 js-search-name">Yang Ren</span><div class="c-article-authors-search__list"><div class="c-article-authors-search__item c-article-authors-search__list-item--left"><a href="/search?dc.creator=Yang%20Ren" class="c-article-button" data-track="click" data-track-action="author link - publication" data-track-label="link" rel="nofollow">View author publications</a></div><div class="c-article-authors-search__item c-article-authors-search__list-item--right"><p class="search-in-title-js c-article-authors-search__text">You can also search for this author in <span class="c-article-identifiers"><a class="c-article-identifiers__item" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=search&term=Yang%20Ren" data-track="click" data-track-action="author link - pubmed" data-track-label="link" rel="nofollow">PubMed</a><span class="u-hide"> </span><a class="c-article-identifiers__item" href="http://scholar.google.co.uk/scholar?as_q=&num=10&btnG=Search+Scholar&as_epq=&as_oq=&as_eq=&as_occt=any&as_sauthors=%22Yang%20Ren%22&as_publication=&as_ylo=&as_yhi=&as_allsubj=all&hl=en" data-track="click" data-track-action="author link - scholar" data-track-label="link" rel="nofollow">Google Scholar</a></span></p></div></div></li></ol></div><h3 class="c-article__sub-heading" id="corresponding-author">Corresponding author</h3><p id="corresponding-author-list">Correspondence to <a id="corresp-c1" href="mailto:renyangoffical@163.com">Yang Ren</a>.</p></div></div></section><section data-title="Ethics declarations"><div class="c-article-section" id="ethics-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="ethics">Ethics declarations</h2><div class="c-article-section__content" id="ethics-content"> <h3 class="c-article__sub-heading" id="FPar1">Conflict of Interest</h3> <p>There are no potential competing interests in my paper.</p> </div></div></section><section data-title="Additional information"><div class="c-article-section" id="additional-information-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="additional-information">Additional information</h2><div class="c-article-section__content" id="additional-information-content"><h3 class="c-article__sub-heading">Publisher's Note</h3><p>Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.</p></div></div></section><section data-title="Rights and permissions"><div class="c-article-section" id="rightslink-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="rightslink">Rights and permissions</h2><div class="c-article-section__content" id="rightslink-content"> <p><b>Open Access</b> This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. 
If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article
Cite this article

Ren, Y. Intelligent Vehicle Violation Detection System Under Human–Computer Interaction and Computer Vision. Int J Comput Intell Syst 17, 40 (2024). https://doi.org/10.1007/s44196-024-00427-6

Received: 27 November 2023

Accepted: 28 January 2024

Published: 26 February 2024

DOI: https://doi.org/10.1007/s44196-024-00427-6
class="js-get-share-url c-article-share-box__button" type="button" id="get-share-url" data-track="click" data-track-label="button" data-track-external="" data-track-action="get shareable link">Get shareable link</button><div class="js-no-share-url-container u-display-none" hidden=""><p class="js-c-article-share-box__no-sharelink-info c-article-share-box__no-sharelink-info">Sorry, a shareable link is not currently available for this article.</p></div><div class="js-share-url-container u-display-none" hidden=""><p class="js-share-url c-article-share-box__only-read-input" id="share-url" data-track="click" data-track-label="button" data-track-action="select share url"></p><button class="js-copy-share-url c-article-share-box__button--link-like" type="button" id="copy-share-url" data-track="click" data-track-label="button" data-track-action="copy share url" data-track-external="">Copy to clipboard</button></div><p class="js-c-article-share-box__additional-info c-article-share-box__additional-info"> Provided by the Springer Nature SharedIt content-sharing initiative </p></div></div><h3 class="c-article__sub-heading">Keywords</h3><ul class="c-article-subject-list"><li class="c-article-subject-list__subject"><span><a href="/search?query=Vehicle%20violation%20detection%20system&facet-discipline="Engineering"" data-track="click" data-track-action="view keyword" data-track-label="link">Vehicle violation detection system</a></span></li><li class="c-article-subject-list__subject"><span><a href="/search?query=Computer%20vision&facet-discipline="Engineering"" data-track="click" data-track-action="view keyword" data-track-label="link">Computer vision</a></span></li><li class="c-article-subject-list__subject"><span><a href="/search?query=Human%E2%80%93computer%20interaction&facet-discipline="Engineering"" data-track="click" data-track-action="view keyword" data-track-label="link">Human–computer interaction</a></span></li><li class="c-article-subject-list__subject"><span><a href="/search?query=Kalman%20filtering&facet-discipline="Engineering"" data-track="click" data-track-action="view keyword" data-track-label="link">Kalman filtering</a></span></li><li class="c-article-subject-list__subject"><span><a href="/search?query=Mean%20filtering&facet-discipline="Engineering"" data-track="click" data-track-action="view keyword" data-track-label="link">Mean filtering</a></span></li></ul><div data-component="article-info-list"></div></div></div></div></div></section> </div> </main> <div class="c-article-sidebar u-text-sm u-hide-print l-with-sidebar__sidebar" id="sidebar" data-container-type="reading-companion" data-track-component="reading companion"> <aside> <div class="app-card-service" data-test="article-checklist-banner"> <div> <a class="app-card-service__link" data-track="click_presubmission_checklist" data-track-context="article page top of reading companion" data-track-category="pre-submission-checklist" data-track-action="clicked article page checklist banner test 2 old version" data-track-label="link" href="https://beta.springernature.com/pre-submission?journalId=44196" data-test="article-checklist-banner-link"> <span class="app-card-service__link-text">Use our pre-submission checklist</span> <svg class="app-card-service__link-icon" aria-hidden="true" focusable="false"><use xlink:href="#icon-eds-i-arrow-right-small"></use></svg> </a> <p class="app-card-service__description">Avoid common mistakes on your manuscript.</p> </div> <div class="app-card-service__icon-container"> <svg class="app-card-service__icon" 
aria-hidden="true" focusable="false"> <use xlink:href="#icon-eds-i-clipboard-check-medium"></use> </svg> </div> </div> <div data-test="collections"> </div> <div data-test="editorial-summary"> </div> <div class="c-reading-companion"> <div class="c-reading-companion__sticky" data-component="reading-companion-sticky" data-test="reading-companion-sticky"> <div class="c-reading-companion__panel c-reading-companion__sections c-reading-companion__panel--active" id="tabpanel-sections"> <div class="u-lazy-ad-wrapper u-mt-16 u-hide" data-component-mpu><div class="c-ad c-ad--300x250"> <div class="c-ad__inner"> <p class="c-ad__label">Advertisement</p> <div id="div-gpt-ad-MPU1" class="div-gpt-ad grade-c-hide" data-pa11y-ignore data-gpt data-gpt-unitpath="/270604982/springerlink/44196/article" data-gpt-sizes="300x250" data-test="MPU1-ad" data-gpt-targeting="pos=MPU1;articleid=s44196-024-00427-6;"> </div> </div> </div> </div> </div> <div class="c-reading-companion__panel c-reading-companion__figures c-reading-companion__panel--full-width" id="tabpanel-figures"></div> <div class="c-reading-companion__panel c-reading-companion__references c-reading-companion__panel--full-width" id="tabpanel-references"></div> </div> </div> </aside> </div> </div> </article> <div class="app-elements"> <div class="eds-c-header__expander eds-c-header__expander--search" id="eds-c-header-popup-search"> <h2 class="eds-c-header__heading">Search</h2> <div class="u-container"> <search class="eds-c-header__search" role="search" aria-label="Search from the header"> <form method="GET" action="//link.springer.com/search" data-test="header-search" data-track="search" data-track-context="search from header" data-track-action="submit search form" data-track-category="unified header" data-track-label="form" > <label for="eds-c-header-search" class="eds-c-header__search-label">Search by keyword or author</label> <div class="eds-c-header__search-container"> <input id="eds-c-header-search" class="eds-c-header__search-input" autocomplete="off" name="query" type="search" value="" required> <button class="eds-c-header__search-button" type="submit"> <svg class="eds-c-header__icon" aria-hidden="true" focusable="false"> <use xlink:href="#icon-eds-i-search-medium"></use> </svg> <span class="u-visually-hidden">Search</span> </button> </div> </form> </search> </div> </div> <div class="eds-c-header__expander eds-c-header__expander--menu" id="eds-c-header-nav"> <h2 class="eds-c-header__heading">Navigation</h2> <ul class="eds-c-header__list"> <li class="eds-c-header__list-item"> <a class="eds-c-header__link" href="https://link.springer.com/journals/" data-track="nav_find_a_journal" data-track-context="unified header" data-track-action="click find a journal" data-track-category="unified header" data-track-label="link" > Find a journal </a> </li> <li class="eds-c-header__list-item"> <a class="eds-c-header__link" href="https://www.springernature.com/gp/authors" data-track="nav_how_to_publish" data-track-context="unified header" data-track-action="click publish with us link" data-track-category="unified header" data-track-label="link" > Publish with us </a> </li> <li class="eds-c-header__list-item"> <a class="eds-c-header__link" href="https://link.springernature.com/home/" data-track="nav_track_your_research" data-track-context="unified header" data-track-action="click track your research" data-track-category="unified header" data-track-label="link" > Track your research </a> </li> </ul> </div> <footer > <div class="eds-c-footer" > <div 
class="eds-c-footer__container"> <div class="eds-c-footer__grid eds-c-footer__group--separator"> <div class="eds-c-footer__group"> <h3 class="eds-c-footer__heading">Discover content</h3> <ul class="eds-c-footer__list"> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://link.springer.com/journals/a/1" data-track="nav_journals_a_z" data-track-action="journals a-z" data-track-context="unified footer" data-track-label="link">Journals A-Z</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://link.springer.com/books/a/1" data-track="nav_books_a_z" data-track-action="books a-z" data-track-context="unified footer" data-track-label="link">Books A-Z</a></li> </ul> </div> <div class="eds-c-footer__group"> <h3 class="eds-c-footer__heading">Publish with us</h3> <ul class="eds-c-footer__list"> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://link.springer.com/journals" data-track="nav_journal_finder" data-track-action="journal finder" data-track-context="unified footer" data-track-label="link">Journal finder</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springernature.com/gp/authors" data-track="nav_publish_your_research" data-track-action="publish your research" data-track-context="unified footer" data-track-label="link">Publish your research</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springernature.com/gp/open-research/about/the-fundamentals-of-open-access-and-open-research" data-track="nav_open_access_publishing" data-track-action="open access publishing" data-track-context="unified footer" data-track-label="link">Open access publishing</a></li> </ul> </div> <div class="eds-c-footer__group"> <h3 class="eds-c-footer__heading">Products and services</h3> <ul class="eds-c-footer__list"> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springernature.com/gp/products" data-track="nav_our_products" data-track-action="our products" data-track-context="unified footer" data-track-label="link">Our products</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springernature.com/gp/librarians" data-track="nav_librarians" data-track-action="librarians" data-track-context="unified footer" data-track-label="link">Librarians</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springernature.com/gp/societies" data-track="nav_societies" data-track-action="societies" data-track-context="unified footer" data-track-label="link">Societies</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springernature.com/gp/partners" data-track="nav_partners_and_advertisers" data-track-action="partners and advertisers" data-track-context="unified footer" data-track-label="link">Partners and advertisers</a></li> </ul> </div> <div class="eds-c-footer__group"> <h3 class="eds-c-footer__heading">Our imprints</h3> <ul class="eds-c-footer__list"> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.springer.com/" data-track="nav_imprint_Springer" data-track-action="Springer" data-track-context="unified footer" data-track-label="link">Springer</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.nature.com/" data-track="nav_imprint_Nature_Portfolio" data-track-action="Nature Portfolio" data-track-context="unified footer" data-track-label="link">Nature Portfolio</a></li> <li 
class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.biomedcentral.com/" data-track="nav_imprint_BMC" data-track-action="BMC" data-track-context="unified footer" data-track-label="link">BMC</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.palgrave.com/" data-track="nav_imprint_Palgrave_Macmillan" data-track-action="Palgrave Macmillan" data-track-context="unified footer" data-track-label="link">Palgrave Macmillan</a></li> <li class="eds-c-footer__item"><a class="eds-c-footer__link" href="https://www.apress.com/" data-track="nav_imprint_Apress" data-track-action="Apress" data-track-context="unified footer" data-track-label="link">Apress</a></li> </ul> </div> </div> </div> <div class="eds-c-footer__container"> <nav aria-label="footer navigation"> <ul class="eds-c-footer__links"> <li class="eds-c-footer__item"> <button class="eds-c-footer__link" data-cc-action="preferences" data-track="dialog_manage_cookies" data-track-action="Manage cookies" data-track-context="unified footer" data-track-label="link"><span class="eds-c-footer__button-text">Your privacy choices/Manage cookies</span></button> </li> <li class="eds-c-footer__item"> <a class="eds-c-footer__link" href="https://www.springernature.com/gp/legal/ccpa" data-track="nav_california_privacy_statement" data-track-action="california privacy statement" data-track-context="unified footer" data-track-label="link">Your US state privacy rights</a> </li> <li class="eds-c-footer__item"> <a class="eds-c-footer__link" href="https://www.springernature.com/gp/info/accessibility" data-track="nav_accessibility_statement" data-track-action="accessibility statement" data-track-context="unified footer" data-track-label="link">Accessibility statement</a> </li> <li class="eds-c-footer__item"> <a class="eds-c-footer__link" href="https://link.springer.com/termsandconditions" data-track="nav_terms_and_conditions" data-track-action="terms and conditions" data-track-context="unified footer" data-track-label="link">Terms and conditions</a> </li> <li class="eds-c-footer__item"> <a class="eds-c-footer__link" href="https://link.springer.com/privacystatement" data-track="nav_privacy_policy" data-track-action="privacy policy" data-track-context="unified footer" data-track-label="link">Privacy policy</a> </li> <li class="eds-c-footer__item"> <a class="eds-c-footer__link" href="https://support.springernature.com/en/support/home" data-track="nav_help_and_support" data-track-action="help and support" data-track-context="unified footer" data-track-label="link">Help and support</a> </li> <li class="eds-c-footer__item"> <a class="eds-c-footer__link" href="https://support.springernature.com/en/support/solutions/articles/6000255911-subscription-cancellations" data-track-action="cancel contracts here">Cancel contracts here</a> </li> </ul> </nav> <div class="eds-c-footer__user"> <p class="eds-c-footer__user-info"> <span data-test="footer-user-ip">8.222.208.146</span> </p> <p class="eds-c-footer__user-info" data-test="footer-business-partners">Not affiliated</p> </div> <a href="https://www.springernature.com/" class="eds-c-footer__link"> <img src="/oscar-static/images/logo-springernature-white-19dd4ba190.svg" alt="Springer Nature" loading="lazy" width="200" height="20"/> </a> <p class="eds-c-footer__legal" data-test="copyright">© 2024 Springer Nature</p> </div> </div> </footer> </div> </body> </html>