Search results for: ADAS

href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="ADAS"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 25</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: ADAS</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">25</span> Self-Calibration of Fish-Eye Camera for Advanced Driver Assistance Systems</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Atef%20Alaaeddine%20Sarraj">Atef Alaaeddine Sarraj</a>, <a href="https://publications.waset.org/abstracts/search?q=Brendan%20Jackman"> Brendan Jackman</a>, <a href="https://publications.waset.org/abstracts/search?q=Frank%20Walsh"> Frank Walsh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Tomorrow’s car will be more automated and increasingly connected. Innovative and intuitive interfaces are essential to accompany this functional enrichment. For that, today the automotive companies are competing to offer an advanced driver assistance system (ADAS) which will be able to provide enhanced navigation, collision avoidance, intersection support and lane keeping. These vision-based functions require an accurately calibrated camera. To achieve such differentiation in ADAS requires sophisticated sensors and efficient algorithms. This paper explores the different calibration methods applicable to vehicle-mounted fish-eye cameras with arbitrary fields of view and defines the first steps towards a self-calibration method that adequately addresses ADAS requirements. In particular, we present a self-calibration method after comparing different camera calibration algorithms in the context of ADAS requirements. Our method gathers data from unknown scenes while the car is moving, estimates the camera intrinsic and extrinsic parameters and corrects the wide-angle distortion. Our solution enables continuous and real-time detection of objects, pedestrians, road markings and other cars. In contrast, other camera calibration algorithms for ADAS need pre-calibration, while the presented method calibrates the camera without prior knowledge of the scene and in real-time. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=advanced%20driver%20assistance%20system%20%28ADAS%29" title="advanced driver assistance system (ADAS)">advanced driver assistance system (ADAS)</a>, <a href="https://publications.waset.org/abstracts/search?q=fish-eye" title=" fish-eye"> fish-eye</a>, <a href="https://publications.waset.org/abstracts/search?q=real-time" title=" real-time"> real-time</a>, <a href="https://publications.waset.org/abstracts/search?q=self-calibration" title=" self-calibration"> self-calibration</a> </p> <a href="https://publications.waset.org/abstracts/70853/self-calibration-of-fish-eye-camera-for-advanced-driver-assistance-systems" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/70853.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">252</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">24</span> Vehicle Risk Evaluation in Low Speed Accidents: Consequences for Relevant Test Scenarios</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Philip%20Feig">Philip Feig</a>, <a href="https://publications.waset.org/abstracts/search?q=Klaus%20Gschwendtner"> Klaus Gschwendtner</a>, <a href="https://publications.waset.org/abstracts/search?q=Julian%20Schatz"> Julian Schatz</a>, <a href="https://publications.waset.org/abstracts/search?q=Frank%20Diermeyer"> Frank Diermeyer</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Projects of accident research analysis are mostly focused on accidents involving personal damage. Property damage only has a high frequency of occurrence combined with high economic impact. This paper describes main influencing parameters for the extent of damage and presents a repair cost model. For a prospective evaluation method of the monetary effect of advanced driver assistance systems (ADAS), it is necessary to be aware of and quantify all influencing parameters. Furthermore, this method allows the evaluation of vehicle concepts in combination with an ADAS at an early point in time of the product development process. In combination with a property damage database and the introduced repair cost model relevant test scenarios for specific vehicle configurations and their individual property damage risk may be determined. Currently, equipment rates of ADAS are low and a purchase incentive for customers would be beneficial. The next ADAS generation will prevent property damage to a large extent or at least reduce damage severity. Both effects may be a purchasing incentive for the customer and furthermore contribute to increased traffic safety. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=accident%20research" title="accident research">accident research</a>, <a href="https://publications.waset.org/abstracts/search?q=accident%20scenarios" title=" accident scenarios"> accident scenarios</a>, <a href="https://publications.waset.org/abstracts/search?q=ADAS" title=" ADAS"> ADAS</a>, <a href="https://publications.waset.org/abstracts/search?q=effectiveness" title=" effectiveness"> effectiveness</a>, <a href="https://publications.waset.org/abstracts/search?q=property%20damage%20analysis" title=" property damage analysis"> property damage analysis</a> </p> <a href="https://publications.waset.org/abstracts/47288/vehicle-risk-evaluation-in-low-speed-accidents-consequences-for-relevant-test-scenarios" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/47288.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">340</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">23</span> Real-Time Lane Marking Detection Using Weighted Filter</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ayhan%20Kucukmanisa">Ayhan Kucukmanisa</a>, <a href="https://publications.waset.org/abstracts/search?q=Orhan%20Akbulut"> Orhan Akbulut</a>, <a href="https://publications.waset.org/abstracts/search?q=Oguzhan%20Urhan"> Oguzhan Urhan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Nowadays, advanced driver assistance systems (ADAS) have become popular, since they enable safe driving. Lane detection is a vital step for ADAS. The performance of the lane detection process is critical to obtain a high accuracy lane departure warning system (LDWS). Challenging factors such as road cracks, erosion of lane markings, weather conditions might affect the performance of a lane detection system. In this paper, 1-D weighted filter based on row filtering to detect lane marking is proposed. 2-D input image is filtered by 1-D weighted filter considering four-pixel values located symmetrically around the center of candidate pixel. Performance evaluation is carried out by two metrics which are true positive rate (TPR) and false positive rate (FPR). Experimental results demonstrate that the proposed approach provides better lane marking detection accuracy compared to the previous methods while providing real-time processing performance. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=lane%20marking%20filter" title="lane marking filter">lane marking filter</a>, <a href="https://publications.waset.org/abstracts/search?q=lane%20detection" title=" lane detection"> lane detection</a>, <a href="https://publications.waset.org/abstracts/search?q=ADAS" title=" ADAS"> ADAS</a>, <a href="https://publications.waset.org/abstracts/search?q=LDWS" title=" LDWS"> LDWS</a> </p> <a href="https://publications.waset.org/abstracts/90804/real-time-lane-marking-detection-using-weighted-filter" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/90804.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">194</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">22</span> Methodology to Affirm Driver Engagement in Dynamic Driving Task (DDT) for a Level 2 Adas Feature</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Praneeth%20Puvvula">Praneeth Puvvula</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Autonomy in has become increasingly common in modern automotive cars. There are 5 levels of autonomy as defined by SAE. This paper focuses on a SAE level 2 feature which, by definition, is able to control the vehicle longitudinally and laterally at the same time. The system keeps the vehicle centred with in the lane by detecting the lane boundaries while maintaining the vehicle speed. As with the features from SAE level 1 to level 3, the primary responsibility of dynamic driving task lies with the driver. This will need monitoring techniques to ensure the driver is always engaged even while the feature is active. This paper focuses on the these techniques, which would help the safe usage of the feature and provide appropriate warnings to the driver. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=autonomous%20driving" title="autonomous driving">autonomous driving</a>, <a href="https://publications.waset.org/abstracts/search?q=safety" title=" safety"> safety</a>, <a href="https://publications.waset.org/abstracts/search?q=adas" title=" adas"> adas</a>, <a href="https://publications.waset.org/abstracts/search?q=automotive%20technology" title=" automotive technology"> automotive technology</a> </p> <a href="https://publications.waset.org/abstracts/166616/methodology-to-affirm-driver-engagement-in-dynamic-driving-task-ddt-for-a-level-2-adas-feature" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/166616.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">89</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">21</span> Low-Cost Parking Lot Mapping and Localization for Home Zone Parking Pilot</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hongbo%20Zhang">Hongbo Zhang</a>, <a href="https://publications.waset.org/abstracts/search?q=Xinlu%20Tang"> Xinlu Tang</a>, <a href="https://publications.waset.org/abstracts/search?q=Jiangwei%20Li"> Jiangwei Li</a>, <a href="https://publications.waset.org/abstracts/search?q=Chi%20Yan"> Chi Yan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Home zone parking pilot (HPP) is a fast-growing segment in low-speed autonomous driving applications. It requires the car automatically cruise around a parking lot and park itself in a range of up to 100 meters inside a recurrent home/office parking lot, which requires precise parking lot mapping and localization solution. Although Lidar is ideal for SLAM, the car OEMs favor a low-cost fish-eye camera based visual SLAM approach. Recent approaches have employed segmentation models to extract semantic features and improve mapping accuracy, but these AI models are memory unfriendly and computationally expensive, making deploying on embedded ADAS systems difficult. To address this issue, we proposed a new method that utilizes object detection models to extract robust and accurate parking lot features. The proposed method could reduce computational costs while maintaining high accuracy. Once combined with vehicles’ wheel-pulse information, the system could construct maps and locate the vehicle in real-time. This article will discuss in detail (1) the fish-eye based Around View Monitoring (AVM) with transparent chassis images as the inputs, (2) an Object Detection (OD) based feature point extraction algorithm to generate point cloud, (3) a low computational parking lot mapping algorithm and (4) the real-time localization algorithm. At last, we will demonstrate the experiment results with an embedded ADAS system installed on a real car in the underground parking lot. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ADAS" title="ADAS">ADAS</a>, <a href="https://publications.waset.org/abstracts/search?q=home%20zone%20parking%20pilot" title=" home zone parking pilot"> home zone parking pilot</a>, <a href="https://publications.waset.org/abstracts/search?q=object%20detection" title=" object detection"> object detection</a>, <a href="https://publications.waset.org/abstracts/search?q=visual%20SLAM" title=" visual SLAM"> visual SLAM</a> </p> <a href="https://publications.waset.org/abstracts/162272/low-cost-parking-lot-mapping-and-localization-for-home-zone-parking-pilot" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/162272.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">67</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">20</span> Adjustable Aperture with Liquid Crystal for Real-Time Range Sensor</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yumee%20Kim">Yumee Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=Seung-Guk%20Hyeon"> Seung-Guk Hyeon</a>, <a href="https://publications.waset.org/abstracts/search?q=Kukjin%20Chun"> Kukjin Chun</a> </p> <p class="card-text"><strong>Abstract:</strong></p> An adjustable aperture using a liquid crystal is proposed for real-time range detection and obtaining images simultaneously. The adjustable aperture operates as two types of aperture stops which can create two different Depth of Field images. By analyzing these two images, the distance can be extracted from camera to object. Initially, the aperture stop has large size with zero voltage. When the input voltage is applied, the aperture stop transfer to smaller size by orientational transition of liquid crystal molecules in the device. The diameter of aperture stop is 1.94mm and 1.06mm. The proposed device has low driving voltage of 7.0V and fast response time of 6.22m. Compact size aperture of 6×6×1.1 mm3 is assembled in conventional camera which contain 1/3” HD image sensor and focal length of 3.3mm that can be used in autonomous. The measured range was up to 5m. The adjustable aperture has high stability due to no mechanically moving parts. This range sensor can be applied to the various field of 3D depth map application which is the Advanced Driving Assistance System (ADAS), drones and manufacturing machine. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=adjustable%20aperture" title="adjustable aperture">adjustable aperture</a>, <a href="https://publications.waset.org/abstracts/search?q=dual%20aperture" title=" dual aperture"> dual aperture</a>, <a href="https://publications.waset.org/abstracts/search?q=liquid%20crystal" title=" liquid crystal"> liquid crystal</a>, <a href="https://publications.waset.org/abstracts/search?q=ranging%20and%20imaging" title=" ranging and imaging"> ranging and imaging</a>, <a href="https://publications.waset.org/abstracts/search?q=ADAS" title=" ADAS"> ADAS</a>, <a href="https://publications.waset.org/abstracts/search?q=range%20sensor" title=" range sensor"> range sensor</a> </p> <a href="https://publications.waset.org/abstracts/68963/adjustable-aperture-with-liquid-crystal-for-real-time-range-sensor" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/68963.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">381</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">19</span> Moving Object Detection Using Histogram of Uniformly Oriented Gradient</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Wei-Jong%20Yang">Wei-Jong Yang</a>, <a href="https://publications.waset.org/abstracts/search?q=Yu-Siang%20Su"> Yu-Siang Su</a>, <a href="https://publications.waset.org/abstracts/search?q=Pau-Choo%20Chung"> Pau-Choo Chung</a>, <a href="https://publications.waset.org/abstracts/search?q=Jar-Ferr%20Yang"> Jar-Ferr Yang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Moving object detection (MOD) is an important issue in advanced driver assistance systems (ADAS). There are two important moving objects, pedestrians and scooters in ADAS. In real-world systems, there exist two important challenges for MOD, including the computational complexity and the detection accuracy. The histogram of oriented gradient (HOG) features can easily detect the edge of object without invariance to changes in illumination and shadowing. However, to reduce the execution time for real-time systems, the image size should be down sampled which would lead the outlier influence to increase. For this reason, we propose the histogram of uniformly-oriented gradient (HUG) features to get better accurate description of the contour of human body. In the testing phase, the support vector machine (SVM) with linear kernel function is involved. Experimental results show the correctness and effectiveness of the proposed method. With SVM classifiers, the real testing results show the proposed HUG features achieve better than classification performance than the HOG ones. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=moving%20object%20detection" title="moving object detection">moving object detection</a>, <a href="https://publications.waset.org/abstracts/search?q=histogram%20of%20oriented%20gradient" title=" histogram of oriented gradient"> histogram of oriented gradient</a>, <a href="https://publications.waset.org/abstracts/search?q=histogram%20of%20uniformly-oriented%20gradient" title=" histogram of uniformly-oriented gradient"> histogram of uniformly-oriented gradient</a>, <a href="https://publications.waset.org/abstracts/search?q=linear%20support%20vector%20machine" title=" linear support vector machine"> linear support vector machine</a> </p> <a href="https://publications.waset.org/abstracts/62854/moving-object-detection-using-histogram-of-uniformly-oriented-gradient" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/62854.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">594</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">18</span> Embedded Hardware and Software Design of Omnidirectional Autonomous Robotic Platform Suitable for Advanced Driver Assistance Systems Testing with Focus on Modularity and Safety</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ondrej%20Lufinka">Ondrej Lufinka</a>, <a href="https://publications.waset.org/abstracts/search?q=Jan%20Kaderabek"> Jan Kaderabek</a>, <a href="https://publications.waset.org/abstracts/search?q=Juraj%20Prstek"> Juraj Prstek</a>, <a href="https://publications.waset.org/abstracts/search?q=Jiri%20Skala"> Jiri Skala</a>, <a href="https://publications.waset.org/abstracts/search?q=Kamil%20Kosturik"> Kamil Kosturik</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper deals with the problem of using Autonomous Robotic Platforms (ARP) for the ADAS (Advanced Driver Assistance Systems) testing in automotive. There are different possibilities of the testing already in development, and lately, the autonomous robotic platforms are beginning to be used more and more widely. Autonomous Robotic Platform discussed in this paper explores the hardware and software design possibilities related to the field of embedded systems. The paper focuses on its chapters on the introduction of the problem in general; then, it describes the proposed prototype concept and its principles from the embedded HW and SW point of view. It talks about the key features that can be used for the innovation of these platforms (e.g., modularity, omnidirectional movement, common and non-traditional sensors used for localization, synchronization of more platforms and cars together, or safety mechanisms). In the end, the future possible development of the project is discussed as well. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=advanced%20driver%20assistance%20systems" title="advanced driver assistance systems">advanced driver assistance systems</a>, <a href="https://publications.waset.org/abstracts/search?q=ADAS" title=" ADAS"> ADAS</a>, <a href="https://publications.waset.org/abstracts/search?q=autonomous%20robotic%20platform" title=" autonomous robotic platform"> autonomous robotic platform</a>, <a href="https://publications.waset.org/abstracts/search?q=embedded%20systems" title=" embedded systems"> embedded systems</a>, <a href="https://publications.waset.org/abstracts/search?q=hardware" title=" hardware"> hardware</a>, <a href="https://publications.waset.org/abstracts/search?q=localization" title=" localization"> localization</a>, <a href="https://publications.waset.org/abstracts/search?q=modularity" title=" modularity"> modularity</a>, <a href="https://publications.waset.org/abstracts/search?q=multiple%20robots%20synchronization" title=" multiple robots synchronization"> multiple robots synchronization</a>, <a href="https://publications.waset.org/abstracts/search?q=omnidirectional%20movement" title=" omnidirectional movement"> omnidirectional movement</a>, <a href="https://publications.waset.org/abstracts/search?q=safety%20mechanisms" title=" safety mechanisms"> safety mechanisms</a>, <a href="https://publications.waset.org/abstracts/search?q=software" title=" software"> software</a> </p> <a href="https://publications.waset.org/abstracts/130591/embedded-hardware-and-software-design-of-omnidirectional-autonomous-robotic-platform-suitable-for-advanced-driver-assistance-systems-testing-with-focus-on-modularity-and-safety" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/130591.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">143</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">17</span> Efficacy and Safety of Computerized Cognitive Training Combined with SSRIs for Treating Cognitive Impairment Among Patients with Late-Life Depression: A 12-Week, Randomized Controlled Study</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Xiao%20Wang">Xiao Wang</a>, <a href="https://publications.waset.org/abstracts/search?q=Qinge%20Zhang"> Qinge Zhang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Background: This randomized, open-label study examined the therapeutic effects of computerized cognitive training (CCT) combined with selective serotonin reuptake inhibitors (SSRIs) on cognitive impairment among patients with late-life depression (LLD). Method: Study data were collected from May 5, 2021, to April 21, 2023. Outpatients who met diagnostic criteria for major depressive disorder according to the fifth revision of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) criteria (i.e., a total score on the 17-item Hamilton Depression Rating Scale (HAMD-17) ≥ 18 and a total score on the Montreal Cognitive Assessment scale (MOCA) <26) were randomly assigned to receive up to 12 weeks of CCT and SSRIs treatment (n=57) or SSRIs and Control treatment (n=61). 
The primary outcome was the change in Alzheimer’s Disease Assessment Scale-Cognitive Subscale (ADAS-Cog) scores from baseline to week 12 between the two groups. The secondary outcomes included changes in the HAMD-17 score, Hamilton Anxiety Scale (HAMA) score and Neuropsychiatric Inventory (NPI) score. Mixed model repeated measures (MMRM) analysis was performed on the modified intention-to-treat (mITT) and completer populations. Results: The full analysis set (FAS) included 118 patients (CCT and SSRIs group, n=57; SSRIs and control group, n=61). Over the 12-week study period, the reduction in the ADAS-Cog total score was significant (P < 0.001) in both groups, while MMRM analysis revealed a significantly greater reduction in ADAS-Cog total scores (i.e., greater cognitive improvement) from baseline to post-treatment in the CCT and SSRIs group than in the SSRIs and control group (F(1,115) = 13.65, least-squares mean difference [95% CI]: −2.77 [−3.73, −1.81], p < 0.001). There were significantly greater improvements in depression symptoms (measured by the HAMD-17) in the CCT and SSRIs group than in the control group (MMRM, estimated mean difference between groups −3.59 [−5.02, −2.15], p < 0.001). The least-squares mean changes in the HAMA and NPI scores between baseline and week 8 were greater in the CCT and SSRIs group than in the control group (all P < 0.05). There was no significant difference between groups in response and remission rates using the last-observation-carried-forward (LOCF) method (all P > 0.05). The most frequent adverse events (AEs) in both groups were dry mouth, somnolence, and constipation, and there was no significant difference in the incidence of adverse events between the two groups. Conclusions: CCT combined with SSRIs was efficacious and well tolerated in LLD patients with cognitive impairment.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=late-life%20depression" title="late-life depression">late-life depression</a>, <a href="https://publications.waset.org/abstracts/search?q=cognitive%20function" title=" cognitive function"> cognitive function</a>, <a href="https://publications.waset.org/abstracts/search?q=computerized%20cognitive%20training" title=" computerized cognitive training"> computerized cognitive training</a>, <a href="https://publications.waset.org/abstracts/search?q=SSRIs" title=" SSRIs"> SSRIs</a> </p> <a href="https://publications.waset.org/abstracts/187728/efficacy-and-safety-of-computerized-cognitive-training-combined-with-ssris-for-treating-cognitive-impairment-among-patients-with-late-life-depression-a-12-week-randomized-controlled-study" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/187728.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">53</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">16</span> Communication Infrastructure Required for a Driver Behaviour Monitoring System, ‘SiaMOTO’ IT Platform</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Dogaru-Ulieru%20Valentin">Dogaru-Ulieru Valentin</a>, <a href="https://publications.waset.org/abstracts/search?q=S%C4%83li%C8%99teanu%20Ioan%20Corneliu"> Sălișteanu Ioan Corneliu</a>, <a href="https://publications.waset.org/abstracts/search?q=Ardeleanu%20Mih%C4%83i%C8%9B%C4%83%20Nicolae"> Ardeleanu Mihăiță Nicolae</a>, <a href="https://publications.waset.org/abstracts/search?q=Brosc%C4%83reanu%20%C8%98tefan"> Broscăreanu Ștefan</a>, <a href="https://publications.waset.org/abstracts/search?q=S%C4%83li%C8%99teanu%20Bogdan"> Sălișteanu Bogdan</a>, <a href="https://publications.waset.org/abstracts/search?q=Mihai%20Mihail"> Mihai Mihail</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The SiaMOTO system is a communications and data processing platform for vehicle traffic. The human factor is the most important factor in the generation of this data, as the driver is the one who dictates the trajectory of the vehicle. Like any trajectory, specific parameters refer to position, speed and acceleration. Constant knowledge of these parameters allows complex analyses. Roadways allow many vehicles to travel through their confined space, and the overlapping trajectories of several vehicles increase the likelihood of collision events, known as road accidents. Any such event has causes that lead to its occurrence, so the conditions for its occurrence are known. The human factor is predominant in deciding the trajectory parameters of the vehicle on the road, so monitoring it by knowing the events reported by the DiaMOTO device over time, will generate a guide to target any potentially high-risk driving behavior and reward those who control the driving phenomenon well. In this paper, we have focused on detailing the communication infrastructure of the DiaMOTO device with the traffic data collection server, the infrastructure through which the database that will be used for complex AI/DLM analysis is built. 
The central element of this description is the CODEC-8 format data string sent by the DiaMOTO device to the SiaMOTO collection server database. The data presented are specific to a functional infrastructure implemented at the experimental-model stage: DiaMOTO devices with unique codes, integrating ADAS and GPS functions, were installed on 50 vehicles, allowing their trajectories to be monitored 24 hours a day.
Keywords: DiaMOTO, Codec-8, ADAS, GPS, driver monitoring
Procedia: https://publications.waset.org/abstracts/176531/communication-infrastructure-required-for-a-driver-behaviour-monitoring-system-siamoto-it-platform | PDF: https://publications.waset.org/abstracts/176531.pdf | Downloads: 78

15. Improved Distance Estimation in Dynamic Environments through Multi-Sensor Fusion with Extended Kalman Filter
Authors: Iffat Ara Ebu, Fahmida Islam, Mohammad Abdus Shahid Rafi, Mahfuzur Rahman, Umar Iqbal, John Ball
Abstract: The application of multi-sensor fusion for enhanced distance estimation accuracy in dynamic environments is crucial for advanced driver assistance systems (ADAS) and autonomous vehicles. Limitations of single sensors such as cameras or radar in adverse conditions motivate the use of combined camera and radar data to improve reliability, adaptability, and object recognition. A multi-sensor fusion approach using an extended Kalman filter (EKF) is proposed to combine sensor measurements with a dynamic system model, achieving robust and accurate distance estimation. The research utilizes the Mississippi State University Autonomous Vehicular Simulator (MAVS) to create a controlled environment for data collection, and data analysis is performed using MATLAB. Qualitative (visualization of fused data vs. ground truth) and quantitative (RMSE, MAE) metrics are employed for performance assessment. Initial results with simulated data demonstrate more accurate distance estimation than the individual sensors.
The optimal sensor measurement noise variance and plant noise variance parameters within the EKF are identified, and the algorithm is validated with real-world data from a Chevrolet Blazer. In summary, this research demonstrates that multi-sensor fusion with an EKF significantly improves distance estimation accuracy in dynamic environments. This is supported by comprehensive evaluation metrics, with validation transitioning from simulated to real-world data, paving the way for safer and more reliable autonomous vehicle control.
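A minimal sketch of the fusion idea follows: a constant-velocity range state predicted each time step and updated sequentially with camera and radar range measurements. With this linear state and measurement model the EKF update reduces to the standard Kalman form; all noise variances and the initial uncertainty are assumed values, not the parameters identified in the paper.

```python
import numpy as np

class DistanceEKF:
    """Constant-velocity filter fusing camera and radar range measurements (sketch)."""

    def __init__(self, q_var=0.5, r_cam=4.0, r_radar=0.25):
        self.x = np.zeros(2)                 # [distance (m), relative speed (m/s)]
        self.P = np.diag([25.0, 10.0])       # initial uncertainty (assumed)
        self.q_var = q_var                   # plant (process) noise variance (assumed)
        self.R = {"camera": r_cam, "radar": r_radar}   # measurement noise variances (assumed)
        self.H = np.array([[1.0, 0.0]])      # both sensors measure range directly

    def predict(self, dt):
        F = np.array([[1.0, dt], [0.0, 1.0]])
        G = np.array([[0.5 * dt ** 2], [dt]])
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + G @ G.T * self.q_var

    def update(self, z, sensor):
        S = self.H @ self.P @ self.H.T + self.R[sensor]
        K = self.P @ self.H.T / S            # Kalman gain
        self.x = self.x + (K * (z - self.H @ self.x)).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]                     # fused distance estimate (m)
```

A typical cycle calls `predict(dt)` once per time step and `update()` once per available reading, e.g. `ekf.predict(0.05); ekf.update(24.3, "radar"); d = ekf.update(25.1, "camera")`.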
Keywords: sensor fusion, EKF, MATLAB, MAVS, autonomous vehicle, ADAS
Procedia: https://publications.waset.org/abstracts/187474/improved-distance-estimation-in-dynamic-environments-through-multi-sensor-fusion-with-extended-kalman-filter | PDF: https://publications.waset.org/abstracts/187474.pdf | Downloads: 43

14. Improving Lane Detection for Autonomous Vehicles Using Deep Transfer Learning
Authors: Richard O’Riordan, Saritha Unnikrishnan
Abstract: Autonomous vehicles (AVs) are incorporating an increasing number of ADAS features, including automated lane-keeping systems. In recent years, many research papers on lane detection algorithms have been published, ranging from computer vision techniques to deep learning methods. The transition from the lower levels of autonomy defined in the SAE framework to higher autonomy levels requires increasingly complex models and algorithms that must be highly reliable in their operation and functionality, with no room for error at high levels of autonomy. The current research details existing computer vision and deep learning algorithms, their methodologies and individual results, as well as the challenges the algorithms face, the resources needed to operate them, and the shortcomings experienced when detecting lanes in certain weather and lighting conditions. This paper explores these shortcomings and attempts to implement a lane detection algorithm that could improve AV lane detection systems. The method uses a pre-trained LaneNet model to classify lane and non-lane pixels via binary segmentation as the base detection method, trained on the existing BDD100k dataset followed by a custom, locally generated dataset. The selected roads comprise modern, well-laid roads with up-to-date infrastructure and lane markings, and an older road network whose infrastructure and lane markings reflect its age. The performance of the proposed method is evaluated on the custom dataset and compared to its performance on the BDD100k dataset. In summary, this paper uses transfer learning to provide a fast and robust lane detection algorithm that can handle various road conditions and provide accurate lane detection.
Keywords: ADAS, autonomous vehicles, deep learning, LaneNet, lane detection
Procedia: https://publications.waset.org/abstracts/162354/improving-lane-detection-for-autonomous-vehicles-using-deep-transfer-learning | PDF: https://publications.waset.org/abstracts/162354.pdf | Downloads: 104

13. A Clinician’s Perspective on Electroencephalography Annotation and Analysis for Driver Drowsiness Estimation
Authors: Ruxandra Aursulesei, David O’Callaghan, Cian Ryan, Diarmaid O’Cualain, Viktor Varkarakis, Alina Sultana, Joseph Lemley
Abstract: Human errors caused by drowsiness are among the leading causes of road accidents. Neurobiological research gives information about the electrical signals emitted by neurons firing within the brain, and the frequencies of these electrical signals can be determined by attaching bio-sensors to the head surface. By observing the electrical impulses and the rhythmic interaction of neurons with each other, we can predict the mental state of a person. In this paper, we aim to better understand intersubject and intrasubject variability in the electrophysiological patterns that occur at the onset of drowsiness and their evolution as vigilance decreases.
The purpose is to lay the foundations for an algorithm that detects the onset of drowsiness before the physical signs become apparent.
Keywords: electroencephalography, drowsiness, ADAS, annotations, clinician
Procedia: https://publications.waset.org/abstracts/156014/a-clinicians-perspective-on-electroencephalography-annotation-and-analysis-for-driver-drowsiness-estimation | PDF: https://publications.waset.org/abstracts/156014.pdf | Downloads: 115

12. Real-world Characterization of Treatment Intensified (Add-on to Metformin) Adults with Type 2 Diabetes in Pakistan: A Multi-center Retrospective Study (Converge)
Authors: Muhammad Qamar Masood, Syed Abbas Raza, Umar Yousaf Raja, Imran Hassan, Bilal Afzal, Muhammad Aleem Zahir, Atika Shaheer
Abstract: Background: Cardiovascular disease (CVD) is a major burden among people with type 2 diabetes (T2D), with 1 in 3 reported to have CVD. Therefore, understanding real-world clinical characteristics and prescribing patterns could help improve care. Objective: The CONVERGE (Cardiovascular Outcomes and Value in the Real world with GLP-1RAs) study characterized demographics and medication usage patterns in the overall population of treatment-intensified (add-on to metformin) adults with T2D. The data were further divided into subgroups (dipeptidyl peptidase-4 inhibitors (DPP-4is), sulfonylureas (SUs), insulins, glucagon-like peptide-1 receptor agonists (GLP-1RAs) and sodium-glucose cotransporter-2 inhibitors (SGLT-2is)), according to the latest prescribed antidiabetic agent (ADA), in India, Pakistan and Thailand. Here, we report findings from Pakistan. Methods: A multi-center retrospective study utilized data from medical records between 13-Sep-2008 (post-market approval of GLP-1RAs) and 31-Dec-2017 in adults (≥18 years old). The data for this study were collected from five centers/institutes located in major cities of Pakistan, including Karachi, Lahore, Islamabad, and Multan.
These centers included National Hospital, Aga Khan University Hospital, Diabetes Endocrine Clinic Lahore, Shifa International Hospital, and Mukhtar A Sheikh Hospital Multan. Data were collected at the start of the medical record and at 6 or 12 months prior to baseline, depending on variable type, and analyzed descriptively. Results: Overall, 1,010 patients were eligible. At baseline, overall mean (SD) age was 51.6 (11.3) years, T2D duration was 2.4 (2.6) years, HbA1c was 8.3% (1.9), and 35% received ≥1 CVD medication in the year before baseline. The most frequently prescribed ADAs post-metformin were DPP-4is and SUs (~63%). Only 6.5% received GLP-1RAs, and SGLT-2is were not available in Pakistan during the study period. Overall, it took a mean of 4.4 years and 5 years to initiate GLP-1RAs and SGLT-2is, respectively. In comparison to other subgroups, more patients in the GLP-1RA subgroup received ≥3 types of ADA (58%) and ≥1 CVD medication (64%), and they had a higher body mass index (37 kg/m²). Conclusions: Utilization of GLP-1RAs and SGLT-2is was low, took longer to initiate, and occurred only after multiple ADAs had been tried. This may be due to the lack of evidence for CV benefits of these agents during the study period. The planned phase 2 of the CONVERGE study can provide more insights into utilization and barriers to prescribing GLP-1RAs and SGLT-2is post-2018 in Pakistan.
Keywords: type 2 diabetes, GLP-1RA, treatment intensification, cardiovascular disease
Procedia: https://publications.waset.org/abstracts/183406/real-world-characterization-of-treatment-intensified-add-on-to-metformin-adults-with-type-2-diabetes-in-pakistan-a-multi-center-retrospective-study-converge | PDF: https://publications.waset.org/abstracts/183406.pdf | Downloads: 60

11. A Review of In-Vehicle Network for Cloud Connected Vehicle
Authors: Hanbhin Ryu, Ilkwon Yun
Abstract: The automotive industry aims to improve safety and convenience by realizing fully autonomous vehicles. As partial steps towards fully automated driving, current vehicles already feature a variety of advanced driver assistance systems (ADAS) for safety and infotainment systems for the driver’s convenience. This paper presents the Cloud Connected Vehicle (CCV), which connects vehicles to a cloud data center via the access network to control the vehicle as a next form of autonomous driving, and describes its features.
This paper also describes the shortcomings of the existing in-vehicle network (IVN) as a next-generation IVN for the CCV, and organizes the research issues related to 802.3 Ethernet, the candidate next-generation IVN, to verify the feasibility of using Ethernet. Finally, the paper discusses additional considerations for adopting an Ethernet-based IVN for the CCV.
Keywords: autonomous vehicle, cloud connected vehicle, ethernet, in-vehicle network
Procedia: https://publications.waset.org/abstracts/21239/a-review-of-in-vehicle-network-for-cloud-connected-vehicle | PDF: https://publications.waset.org/abstracts/21239.pdf | Downloads: 479

10. Lane Detection Using Labeling Based RANSAC Algorithm
Authors: Yeongyu Choi, Ju H. Park, Ho-Youl Jung
Abstract: In this paper, we propose a labeling-based RANSAC algorithm for lane detection. Advanced driver assistance systems (ADAS) have been widely researched to avoid unexpected accidents, and lane detection is a necessary component for lane keeping and lane departure prevention. The proposed vision-based lane detection method applies Canny edge detection, inverse perspective mapping (IPM), the K-means algorithm, mathematical morphology operations and 8-connected component labeling. Next, random samples are selected from each labeled region for RANSAC; this sampling strategy selects lane points with high probability. Finally, lane parameters of straight-line or curve equations are estimated. Through simulations on video recorded in the daytime and at night, we show that the proposed method performs better than the existing RANSAC algorithm in various environments.
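The sketch below illustrates the labeling-based sampling idea for the straight-line case: RANSAC is run separately on the candidate pixels of each connected-component label, so both sample points always come from the same lane region. The iteration count and inlier tolerance are assumed values, and the preceding Canny/IPM/K-means/morphology steps are not shown.

```python
import numpy as np

def ransac_line_per_label(points, labels, iters=200, tol=2.0, rng=None):
    """Fit one straight line per connected-component label with RANSAC.

    `points` is an (N, 2) array of lane-candidate pixels in the IPM (bird's-eye)
    image and `labels` the 8-connected component id of each point.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    lines = {}
    for lab in np.unique(labels):
        if lab == 0:                          # background label from component labeling
            continue
        pts = points[labels == lab]
        if len(pts) < 2:
            continue
        best_inliers, best_line = 0, None
        for _ in range(iters):
            p1, p2 = pts[rng.choice(len(pts), 2, replace=False)]
            d = p2 - p1
            n = np.array([-d[1], d[0]], dtype=float)   # line normal
            norm = np.linalg.norm(n)
            if norm == 0:
                continue
            n /= norm
            dist = np.abs((pts - p1) @ n)              # point-to-line distances
            inliers = int((dist < tol).sum())
            if inliers > best_inliers:
                best_inliers = inliers
                best_line = (p1.astype(float), d / np.linalg.norm(d))
        lines[int(lab)] = best_line                    # (point on line, unit direction)
    return lines
```

Restricting each sample pair to one label is what raises the probability of drawing two points from the same physical lane marking, which is the advantage over plain RANSAC claimed in the abstract.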
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Canny%20edge%20detection" title="Canny edge detection">Canny edge detection</a>, <a href="https://publications.waset.org/abstracts/search?q=k-means%20algorithm" title=" k-means algorithm"> k-means algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=RANSAC" title=" RANSAC"> RANSAC</a>, <a href="https://publications.waset.org/abstracts/search?q=inverse%20perspective%20mapping" title=" inverse perspective mapping"> inverse perspective mapping</a> </p> <a href="https://publications.waset.org/abstracts/92894/lane-detection-using-labeling-based-ransac-algorithm" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/92894.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">244</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">9</span> Automated Driving Deep Neural Networks Model Accuracy and Performance Assessment in a Simulated Environment</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=David%20Tena-Gago">David Tena-Gago</a>, <a href="https://publications.waset.org/abstracts/search?q=Jose%20M.%20Alcaraz%20Calero"> Jose M. Alcaraz Calero</a>, <a href="https://publications.waset.org/abstracts/search?q=Qi%20Wang"> Qi Wang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The evolution and integration of automated vehicles have become more and more tangible in recent years. State-of-the-art technological advances in the field of camera-based Artificial Intelligence (AI) and computer vision greatly favor the performance and reliability of the Advanced Driver Assistance System (ADAS), leading to a greater knowledge of vehicular operation and resembling human behavior. However, the exclusive use of this technology still seems insufficient to control vehicular operation at 100%. To reveal the degree of accuracy of the current camera-based automated driving AI modules, this paper studies the structure and behavior of one of the main solutions in a controlled testing environment. The results obtained clearly outline the lack of reliability when using exclusively the AI model in the perception stage, thereby entailing using additional complementary sensors to improve its safety and performance. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=accuracy%20assessment" title="accuracy assessment">accuracy assessment</a>, <a href="https://publications.waset.org/abstracts/search?q=AI-driven%20mobility" title=" AI-driven mobility"> AI-driven mobility</a>, <a href="https://publications.waset.org/abstracts/search?q=artificial%20intelligence" title=" artificial intelligence"> artificial intelligence</a>, <a href="https://publications.waset.org/abstracts/search?q=automated%20vehicles" title=" automated vehicles"> automated vehicles</a> </p> <a href="https://publications.waset.org/abstracts/149282/automated-driving-deep-neural-networks-model-accuracy-and-performance-assessment-in-a-simulated-environment" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/149282.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">113</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">8</span> Enhancement Dynamic Cars Detection Based on Optimized HOG Descriptor</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mansouri%20Nabila">Mansouri Nabila</a>, <a href="https://publications.waset.org/abstracts/search?q=Ben%20Jemaa%20Yousra"> Ben Jemaa Yousra</a>, <a href="https://publications.waset.org/abstracts/search?q=Motamed%20Cina"> Motamed Cina</a>, <a href="https://publications.waset.org/abstracts/search?q=Watelain%20Eric"> Watelain Eric</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Research and development efforts in intelligent Advanced Driver Assistance Systems (ADAS) seek to save lives and reduce the number of on-road fatalities. For traffic and emergency monitoring, the essential but challenging task is vehicle detection and tracking in reasonably short time. This purpose needs first of all a powerful dynamic car detector model. In fact, this paper presents an optimized HOG process based on shape and motion parameters fusion. Our proposed approach mains to compute HOG by bloc feature from foreground blobs using configurable research window and pathway in order to overcome the shortcoming in term of computing time of HOG descriptor and improve their dynamic application performance. Indeed we prove in this paper that HOG by bloc descriptor combined with motion parameters is a very suitable car detector which reaches in record time a satisfactory recognition rate in dynamic outside area and bypasses several popular works without using sophisticated and expensive architectures such as GPU and FPGA. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=car-detector" title="car-detector">car-detector</a>, <a href="https://publications.waset.org/abstracts/search?q=HOG" title=" HOG"> HOG</a>, <a href="https://publications.waset.org/abstracts/search?q=motion" title=" motion"> motion</a>, <a href="https://publications.waset.org/abstracts/search?q=computing%20time" title=" computing time"> computing time</a> </p> <a href="https://publications.waset.org/abstracts/40704/enhancement-dynamic-cars-detection-based-on-optimized-hog-descriptor" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/40704.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">323</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7</span> The Need for a One Health and Welfare Approach to Industrial Animal Farming</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Clinton%20Adas">Clinton Adas</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Industrial animal farming contributes to numerous problems that humans face, and among these, antimicrobial resistance (AMR) has been identified by the World Health Organisation as a real possibility for the 21st Century. While numerous factors contribute to AMR, one of them is industrial animal farming and its effect on the food chain and environment. In 2017, livestock were given around 73% of all antibiotics worldwide to make them grow faster for profit purposes, to prevent illness caused by unhealthy living conditions, and to treat disease when it breaks out. Many of the antibiotics used provide little benefit to animals, and most are the same as those used by humans - including many deemed critical to human health that should be used sparingly. AMR contributes to millions of illnesses, and in 2019 was responsible for around 4.95 million deaths worldwide. It costs Europe around nine billion euros per year, while it costs the United States (US) around 20 billion dollars per year. While not a simple or quick solution, one way to begin to address the challenge of AMR and other harms from this type of farming is to focus on animal welfare as part of a One Health and Welfare approach, as better welfare requires less antibiotics usage, which may begin to break the cycle. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=animal%20and%20human%20welfare" title="animal and human welfare">animal and human welfare</a>, <a href="https://publications.waset.org/abstracts/search?q=industrial%20animal%20farming" title=" industrial animal farming"> industrial animal farming</a>, <a href="https://publications.waset.org/abstracts/search?q=antimicrobial%20resistance" title=" antimicrobial resistance"> antimicrobial resistance</a>, <a href="https://publications.waset.org/abstracts/search?q=one%20health%20and%20welfare" title=" one health and welfare"> one health and welfare</a>, <a href="https://publications.waset.org/abstracts/search?q=sustainable%20development%20goals" title=" sustainable development goals"> sustainable development goals</a> </p> <a href="https://publications.waset.org/abstracts/150842/the-need-for-a-one-health-and-welfare-approach-to-industrial-animal-farming" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/150842.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">101</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6</span> Atypical Familial Amyotrophic Lateral Sclerosis Secondary to Superoxide Dismutase 1 Gene Mutation With Coexistent Axonal Polyneuropathy: A Challenging Diagnosis</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Seraj%20Makkawi">Seraj Makkawi</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdulaziz%20A.%20Alqarni"> Abdulaziz A. Alqarni</a>, <a href="https://publications.waset.org/abstracts/search?q=Himyan%20Alghaythee"> Himyan Alghaythee</a>, <a href="https://publications.waset.org/abstracts/search?q=Suzan%20Y.%20Alharbi"> Suzan Y. Alharbi</a>, <a href="https://publications.waset.org/abstracts/search?q=Anmar%20Fatani"> Anmar Fatani</a>, <a href="https://publications.waset.org/abstracts/search?q=Reem%20Adas"> Reem Adas</a>, <a href="https://publications.waset.org/abstracts/search?q=Ahmad%20R.%20Abuzinadah"> Ahmad R. Abuzinadah</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease, is a neurodegenerative disease that involves both the upper and lower motor neurons. Familial ALS, including superoxide dismutase 1 (SOD1) mutation, accounts for 5-10% of all cases of ALS. Typically, the symptoms of ALS are purely motor, though coexistent sensory symptoms have been reported in rare cases. In this report, we describe the case of a 47- year-old man who presented with progressive bilateral lower limb weakness and numbness for the last four years. A nerve conduction study (NCS) showed evidence of coexistent axonal sensorimotor polyneuropathy in addition to the typical findings of ALS in needle electromyography. Genetic testing confirmed the diagnosis of familial ALS secondary to the SOD1 genetic mutation. This report highlights that the presence of sensory symptoms should not exclude the possibility of ALS in an appropriate clinical setting. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Saudi%20Arabia" title="Saudi Arabia">Saudi Arabia</a>, <a href="https://publications.waset.org/abstracts/search?q=polyneuropathy" title=" polyneuropathy"> polyneuropathy</a>, <a href="https://publications.waset.org/abstracts/search?q=SOD1%20gene%20mutation" title=" SOD1 gene mutation"> SOD1 gene mutation</a>, <a href="https://publications.waset.org/abstracts/search?q=familial%20amyotrophic%20lateral%20sclerosis" title=" familial amyotrophic lateral sclerosis"> familial amyotrophic lateral sclerosis</a>, <a href="https://publications.waset.org/abstracts/search?q=amyotrophic%20lateral%20sclerosis" title=" amyotrophic lateral sclerosis"> amyotrophic lateral sclerosis</a> </p> <a href="https://publications.waset.org/abstracts/148506/atypical-familial-amyotrophic-lateral-sclerosis-secondary-to-superoxide-dismutase-1-gene-mutation-with-coexistent-axonal-polyneuropathy-a-challenging-diagnosis" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/148506.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">148</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">5</span> Advanced Driver Assistance System: Veibra</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=C.%20Fernanda%20da%20S.%20Sampaio">C. Fernanda da S. Sampaio</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Gabriela%20Sadith%20Perez%20Paredes"> M. Gabriela Sadith Perez Paredes</a>, <a href="https://publications.waset.org/abstracts/search?q=V.%20Antonio%20de%20O.%20Martins"> V. Antonio de O. Martins</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Today the transport sector is undergoing a revolution, with the rise of Advanced Driver Assistance Systems (ADAS), industry and society itself will undergo a major transformation. However, the technological development of these applications is a challenge that requires new techniques and great machine learning and artificial intelligence. The study proposes to develop a vehicular perception system called Veibra, which consists of two front cameras for day/night viewing and an embedded device capable of working with Yolov2 image processing algorithms with low computational cost. The strategic version for the market is to assist the driver on the road with the detection of day/night objects, such as road signs, pedestrians, and animals that will be viewed through the screen of the phone or tablet through an application. The system has the ability to perform real-time driver detection and recognition to identify muscle movements and pupils to determine if the driver is tired or inattentive, analyzing the student's characteristic change and following the subtle movements of the whole face and issuing alerts through beta waves to ensure the concentration and attention of the driver. The system will also be able to perform tracking and monitoring through GSM (Global System for Mobile Communications) technology and the cameras installed in the vehicle. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=advanced%20driver%20assistance%20systems" title="advanced driver assistance systems">advanced driver assistance systems</a>, <a href="https://publications.waset.org/abstracts/search?q=tracking" title=" tracking"> tracking</a>, <a href="https://publications.waset.org/abstracts/search?q=traffic%20signal%20detection" title=" traffic signal detection"> traffic signal detection</a>, <a href="https://publications.waset.org/abstracts/search?q=vehicle%20perception%20system" title=" vehicle perception system"> vehicle perception system</a> </p> <a href="https://publications.waset.org/abstracts/99299/advanced-driver-assistance-system-veibra" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/99299.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">155</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4</span> ADA Tool for Satellite InSAR-Based Ground Displacement Analysis: The Granada Region</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=M.%20Cuevas-Gonz%C3%A1lez">M. Cuevas-González</a>, <a href="https://publications.waset.org/abstracts/search?q=O.%20Monserrat"> O. Monserrat</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Barra"> A. Barra</a>, <a href="https://publications.waset.org/abstracts/search?q=C.%20Reyes-Carmona"> C. Reyes-Carmona</a>, <a href="https://publications.waset.org/abstracts/search?q=R.M.%20Mateos"> R.M. Mateos</a>, <a href="https://publications.waset.org/abstracts/search?q=J.%20P.%20Galve"> J. P. Galve</a>, <a href="https://publications.waset.org/abstracts/search?q=R.%20Sarro"> R. Sarro</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Cantalejo"> M. Cantalejo</a>, <a href="https://publications.waset.org/abstracts/search?q=E.%20Pe%C3%B1a"> E. Peña</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Mart%C3%ADnez-Corbella"> M. Martínez-Corbella</a>, <a href="https://publications.waset.org/abstracts/search?q=J.%20A.%20Luque"> J. A. Luque</a>, <a href="https://publications.waset.org/abstracts/search?q=J.%20M.%20Aza%C3%B1%C3%B3n"> J. M. Azañón</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Millares"> A. Millares</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20B%C3%A9jar"> M. Béjar</a>, <a href="https://publications.waset.org/abstracts/search?q=J.%20A.%20Navarro"> J. A. Navarro</a>, <a href="https://publications.waset.org/abstracts/search?q=L.%20Solari"> L. Solari</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Geohazard prone areas require continuous monitoring to detect risks, understand the phenomena occurring in those regions and prevent disasters. Satellite interferometry (InSAR) has come to be a trustworthy technique for ground movement detection and monitoring in the last few years. InSAR based techniques allow to process large areas providing high number of displacement measurements at low cost. However, the results provided by such techniques are usually not easy to interpret by non-experienced users hampering its use for decision makers. 
This work presents a set of tools developed in the framework of different projects (Momit, Safety, U-Geohaz, Riskcoast), and an example of their use in the Granada coastal area (Spain) is shown. The ADA (Active Displacement Areas) tool has been developed with the aim of easing the management, use and interpretation of InSAR-based results. It provides semi-automatic extraction of the most significant ADAs through the ADAFinder application. This tool aims to support the exploitation of the European Ground Motion Service (EU-GMS), which will provide consistent, regular and reliable information regarding natural and anthropogenic ground motion phenomena all over Europe. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ground%20displacements" title="ground displacements">ground displacements</a>, <a href="https://publications.waset.org/abstracts/search?q=InSAR" title=" InSAR"> InSAR</a>, <a href="https://publications.waset.org/abstracts/search?q=natural%20hazards" title=" natural hazards"> natural hazards</a>, <a href="https://publications.waset.org/abstracts/search?q=satellite%20imagery" title=" satellite imagery"> satellite imagery</a> </p> <a href="https://publications.waset.org/abstracts/141505/ada-tool-for-satellite-insar-based-ground-displacement-analysis-the-granada-region" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/141505.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">219</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3</span> Visual Inspection of Road Conditions Using Deep Convolutional Neural Networks</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Christos%20Theoharatos">Christos Theoharatos</a>, <a href="https://publications.waset.org/abstracts/search?q=Dimitris%20Tsourounis"> Dimitris Tsourounis</a>, <a href="https://publications.waset.org/abstracts/search?q=Spiros%20Oikonomou"> Spiros Oikonomou</a>, <a href="https://publications.waset.org/abstracts/search?q=Andreas%20Makedonas"> Andreas Makedonas</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper focuses on the problem of visually inspecting and recognizing the road conditions in front of moving vehicles, targeting automotive scenarios. The goal of road inspection is to identify whether the road is slippery or not, as well as to detect possible anomalies on the road surface such as potholes or speed bumps/humps. Our work is based on an artificial intelligence methodology for real-time monitoring of road conditions in autonomous driving scenarios, using state-of-the-art deep convolutional neural network (CNN) techniques. Initially, the road and ego lane are segmented within the field of view of the camera that is integrated into the front part of the vehicle. A novel classification CNN is utilized to discriminate between plain and slippery road textures (e.g., wet, snow, etc.). Simultaneously, a robust detection CNN identifies severe surface anomalies within the ego lane, such as potholes and speed bumps/humps, within a distance of 5 to 25 meters. 
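<p class="card-text">A structural sketch of the per-frame pipeline just described (segment the ego lane, classify its texture, detect anomalies on it); the three models are passed in as callables and the label sets are illustrative assumptions rather than the networks used in this work.</p> <pre><code>
# Compose lane segmentation, texture classification and anomaly detection per frame.
import numpy as np

TEXTURES = ("plain", "wet", "snow")          # assumed texture classes
ANOMALIES = ("pothole", "speed_bump")        # assumed anomaly classes

def inspect_frame(frame, segment_lane, classify_texture, detect_anomalies):
    lane_mask = segment_lane(frame)                    # HxW boolean mask of the ego lane
    lane_only = frame * lane_mask[..., None]           # zero out everything off the lane
    texture = TEXTURES[int(classify_texture(lane_only))]
    anomalies = []
    for (x, y, w, h, cls_id, score) in detect_anomalies(frame):
        cx, cy = int(x + w / 2), int(y + h / 2)
        if lane_mask[cy, cx]:                          # keep detections on the ego lane
            anomalies.append((ANOMALIES[cls_id], (x, y, w, h), float(score)))
    return {"texture": texture, "anomalies": anomalies}
</code></pre>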
The overall methodology is demonstrated within the scope of an integrated application (or system), which can be incorporated into complete Advanced Driver-Assistance Systems (ADAS) that provide a full range of functionalities. The proposed techniques deliver state-of-the-art detection and classification results with real-time performance when running on AI accelerator devices such as Intel’s Myriad 2/X Vision Processing Unit (VPU). <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title="deep learning">deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=convolutional%20neural%20networks" title=" convolutional neural networks"> convolutional neural networks</a>, <a href="https://publications.waset.org/abstracts/search?q=road%20condition%20classification" title=" road condition classification"> road condition classification</a>, <a href="https://publications.waset.org/abstracts/search?q=embedded%20systems" title=" embedded systems"> embedded systems</a> </p> <a href="https://publications.waset.org/abstracts/116793/visual-inspection-of-road-conditions-using-deep-convolutional-neural-networks" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/116793.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">134</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2</span> The Need for a One Health and Welfare Approach to Animal Welfare in Industrial Animal Farming</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Clinton%20Adas">Clinton Adas</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Antibiotic resistance has been identified by the World Health Organisation as one of the major global health threats of the 21st Century. While many factors contribute to this, one of the more significant is industrial animal farming and its effect on the food chain and the environment. Livestock consume a significant portion of the antibiotics sold globally, and these are used to make animals grow faster for profit purposes, to prevent illness caused by inhumane living conditions, and to treat disease when it breaks out. Many of these antibiotics provide little benefit to animals, and most are the same as those used by humans - including those deemed critical to human health that should therefore be used sparingly. Antibiotic resistance contributes to growing numbers of illnesses and deaths in humans, and the excess usage of these medications results in waste that enters the environment and is harmful to many ecological processes. This combination of antimicrobial resistance and environmental degradation furthermore harms the economic well-being and prospects of many. Using an interdisciplinary approach including medical, environmental, economic, and legal studies, the paper evaluates the dynamic between animal welfare and commerce and argues that while animal welfare is not of great concern to many, this approach is ultimately harming human welfare too. It is, however, proposed that both could be addressed under a One Health and Welfare approach, as we cannot continue to ignore the linkages between animals, the environment, and people. 
The evaluation of industrial animal farming is therefore considered through three aspects – the environmental impact, which is measured by pollution that causes environmental degradation; the human impact, which is measured by the rise of illnesses from pollution and antibiotic resistance; and the economic impact, which is measured through costs to the health care system and the financial implications of industrial farming on the economic well-being of many. These three aspects are considered in light of the Sustainable Development Goals, which provide additional tangible metrics to evidence the negative impacts. While the research addresses the welfare of farmed animals, there is potential for these principles to be extrapolated into other contexts, including wildlife and habitat protection. It must be noted that while the question of animal rights in industrial animal farming is acknowledged and of importance, this is a separate matter that is not addressed here. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=animal%20and%20human%20welfare" title="animal and human welfare">animal and human welfare</a>, <a href="https://publications.waset.org/abstracts/search?q=industrial%20animal%20farming" title=" industrial animal farming"> industrial animal farming</a>, <a href="https://publications.waset.org/abstracts/search?q=one%20health%20and%20welfare" title=" one health and welfare"> one health and welfare</a>, <a href="https://publications.waset.org/abstracts/search?q=sustainable%20development%20goals" title=" sustainable development goals"> sustainable development goals</a> </p> <a href="https://publications.waset.org/abstracts/144212/the-need-for-a-one-health-and-welfare-approach-to-animal-welfare-in-industrial-animal-farming" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/144212.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">84</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1</span> Safety Tolerance Zone for Driver-Vehicle-Environment Interactions under Challenging Conditions</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Matja%C5%BE%20%C5%A0raml">Matjaž Šraml</a>, <a href="https://publications.waset.org/abstracts/search?q=Marko%20Ren%C4%8Delj"> Marko Renčelj</a>, <a href="https://publications.waset.org/abstracts/search?q=Toma%C5%BE%20Tollazzi"> Tomaž Tollazzi</a>, <a href="https://publications.waset.org/abstracts/search?q=Chiara%20Gruden"> Chiara Gruden</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Road safety is a worldwide issue with numerous and heterogeneous factors influencing it. On one side, the driver state – comprising distraction/inattention, fatigue, drowsiness, extreme emotions, and socio-cultural factors – highly affects road safety. On the other side, the vehicle state has an important role in mitigating (or not) the road risk. Finally, the road environment is still one of the main determinants of road safety, defining driving task complexity. 
At the same time, thanks to technological development, a lot of detailed data is easily available, creating opportunities for the detection of driver state, vehicle characteristics and road conditions and, consequently, for the design of ad hoc interventions aimed at improving driver performance, increasing awareness and mitigating road risks. This is the challenge faced by the i-DREAMS project. i-DREAMS, which stands for a smart Driver and Road Environment Assessment and Monitoring System, is a 3-year project funded by the European Union’s Horizon 2020 research and innovation program. It aims to set up a platform to define, develop, test and validate a ‘Safety Tolerance Zone’ to prevent drivers from getting too close to the boundaries of unsafe operation by mitigating risks in real time and after the trip. After the definition and development of the Safety Tolerance Zone concept and its concretization in an advanced driver-assistance system (ADAS) platform, the system was first tested for 2 months in a driving simulator environment in 5 different countries. After that, naturalistic driving studies started for a 10-month period (comprising a 1-month pilot study, a 3-month baseline study and a 6-month study implementing interventions). Currently, the project team has approved a common evaluation approach and is developing the assessment of the usage and outcomes of the i-DREAMS system, which is yielding positive insights. The i-DREAMS consortium consists of 13 partners: 7 engineering universities and research groups, 4 industry partners and 2 partners (European Transport Safety Council - ETSC - and POLIS cities and regions for transport innovation) closely linked to transport safety stakeholders, covering 8 different countries altogether. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=advanced%20driver%20assistant%20systems" title="advanced driver assistant systems">advanced driver assistant systems</a>, <a href="https://publications.waset.org/abstracts/search?q=driving%20simulator" title=" driving simulator"> driving simulator</a>, <a href="https://publications.waset.org/abstracts/search?q=safety%20tolerance%20zone" title=" safety tolerance zone"> safety tolerance zone</a>, <a href="https://publications.waset.org/abstracts/search?q=traffic%20safety" title=" traffic safety"> traffic safety</a> </p> <a href="https://publications.waset.org/abstracts/162481/safety-tolerance-zone-for-driver-vehicle-environment-interactions-under-challenging-conditions" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/162481.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">67</span> </span> </div> </div> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div 
class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">&copy; 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>
