
Search results for: odometry

href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="odometry"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 15</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: odometry</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">15</span> Two Wheels Differential Type Odometry for Robot</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Abhishek%20Jha">Abhishek Jha</a>, <a href="https://publications.waset.org/abstracts/search?q=Manoj%20Kumar"> Manoj Kumar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper proposes a new type of two wheels differential type odometry to estimate the next position and orientation of mobile robots. The proposed odometry is composed for two independent wheels with respective encoders. The two wheels rotate independently, and the change is determined by the difference in the velocity of the two wheels. Angular velocities of the two wheels are measured by rotary encoders. A mathematical model is proposed for the mobile robots to precisely move towards the goal. Using measured values of the two encoders, the current displacement vector of a mobile robot is calculated by kinematics of the mathematical model. Using the displacement vector, the next position and orientation of the mobile robot are estimated by proposed odometry. Result of simulator experiment by the developed odometry is shown. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=mobile%20robot" title="mobile robot">mobile robot</a>, <a href="https://publications.waset.org/abstracts/search?q=odometry" title=" odometry"> odometry</a>, <a href="https://publications.waset.org/abstracts/search?q=unicycle" title=" unicycle"> unicycle</a>, <a href="https://publications.waset.org/abstracts/search?q=differential%20type" title=" differential type"> differential type</a>, <a href="https://publications.waset.org/abstracts/search?q=encoders" title=" encoders"> encoders</a>, <a href="https://publications.waset.org/abstracts/search?q=infrared%20range%20sensors" title=" infrared range sensors"> infrared range sensors</a>, <a href="https://publications.waset.org/abstracts/search?q=kinematic%20model" title=" kinematic model"> kinematic model</a> </p> <a href="https://publications.waset.org/abstracts/12157/two-wheels-differential-type-odometry-for-robot" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/12157.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">451</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">14</span> Open Source, Open Hardware Ground Truth for Visual Odometry and Simultaneous Localization and Mapping Applications</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Janusz%20Bedkowski">Janusz Bedkowski</a>, <a href="https://publications.waset.org/abstracts/search?q=Grzegorz%20Kisala"> Grzegorz Kisala</a>, <a href="https://publications.waset.org/abstracts/search?q=Michal%20Wlasiuk"> Michal Wlasiuk</a>, <a href="https://publications.waset.org/abstracts/search?q=Piotr%20Pokorski"> Piotr Pokorski</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Ground-truth data is essential for VO (Visual Odometry) and SLAM (Simultaneous Localization and Mapping) quantitative evaluation using e.g. ATE (Absolute Trajectory Error) and RPE (Relative Pose Error). Many open-access data sets provide raw and ground-truth data for benchmark purposes. The issue appears when one would like to validate Visual Odometry and/or SLAM approaches on data captured using the device for which the algorithm is targeted for example mobile phone and disseminate data for other researchers. For this reason, we propose an open source, open hardware groundtruth system that provides an accurate and precise trajectory with a 3D point cloud. It is based on LiDAR Livox Mid-360 with a non-repetitive scanning pattern, on-board Raspberry Pi 4B computer, battery and software for off-line calculations (camera to LiDAR calibration, LiDAR odometry, SLAM, georeferencing). We show how this system can be used for the evaluation of various the state of the art algorithms (Stella SLAM, ORB SLAM3, DSO) in typical indoor monocular VO/SLAM. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=SLAM" title="SLAM">SLAM</a>, <a href="https://publications.waset.org/abstracts/search?q=ground%20truth" title=" ground truth"> ground truth</a>, <a href="https://publications.waset.org/abstracts/search?q=navigation" title=" navigation"> navigation</a>, <a href="https://publications.waset.org/abstracts/search?q=LiDAR" title=" LiDAR"> LiDAR</a>, <a href="https://publications.waset.org/abstracts/search?q=visual%20odometry" title=" visual odometry"> visual odometry</a>, <a href="https://publications.waset.org/abstracts/search?q=mapping" title=" mapping"> mapping</a> </p> <a href="https://publications.waset.org/abstracts/187389/open-source-open-hardware-ground-truth-for-visual-odometry-and-simultaneous-localization-and-mapping-applications" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/187389.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">69</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13</span> Monocular Visual Odometry for Three Different View Angles by Intel Realsense T265 with the Measurement of Remote</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Heru%20Syah%20Putra">Heru Syah Putra</a>, <a href="https://publications.waset.org/abstracts/search?q=Aji%20Tri%20Pamungkas%20Nurcahyo"> Aji Tri Pamungkas Nurcahyo</a>, <a href="https://publications.waset.org/abstracts/search?q=Chuang-Jan%20Chang"> Chuang-Jan Chang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> MOIL-SDK method refers to the spatial angle that forms a view with a different perspective from the Fisheye image. Visual Odometry forms a trusted application for extending projects by tracking using image sequences. A real-time, precise, and persistent approach that is able to contribute to the work when taking datasets and generate ground truth as a reference for the estimates of each image using the FAST Algorithm method in finding Keypoints that are evaluated during the tracking process with the 5-point Algorithm with RANSAC, as well as produce accurate estimates the camera trajectory for each rotational, translational movement on the X, Y, and Z axes. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=MOIL-SDK" title="MOIL-SDK">MOIL-SDK</a>, <a href="https://publications.waset.org/abstracts/search?q=intel%20realsense%20T265" title=" intel realsense T265"> intel realsense T265</a>, <a href="https://publications.waset.org/abstracts/search?q=Fisheye%20image" title=" Fisheye image"> Fisheye image</a>, <a href="https://publications.waset.org/abstracts/search?q=monocular%20visual%20odometry" title=" monocular visual odometry"> monocular visual odometry</a> </p> <a href="https://publications.waset.org/abstracts/147340/monocular-visual-odometry-for-three-different-view-angles-by-intel-realsense-t265-with-the-measurement-of-remote" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/147340.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">134</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">12</span> Studies on Affecting Factors of Wheel Slip and Odometry Error on Real-Time of Wheeled Mobile Robots: A Review</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=D.%20Vidhyaprakash">D. Vidhyaprakash</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Elango"> A. Elango</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In real-time applications, wheeled mobile robots are increasingly used and operated in extreme and diverse conditions traversing challenging surfaces such as a pitted, uneven terrain, natural flat, smooth terrain, as well as wet and dry surfaces. In order to accomplish such tasks, it is critical that the motion control functions without wheel slip and odometry error during the navigation of the two-wheeled mobile robot (WMR). Wheel slip and odometry error are disrupting factors on overall WMR performance in the form of deviation from desired trajectory, navigation, travel time and budgeted energy consumption. The wheeled mobile robot’s ability to operate at peak performance on various work surfaces without wheel slippage and odometry error is directly connected to four main parameters, which are the range of payload distribution, speed, wheel diameter, and wheel width. This paper analyses the effects of those parameters on overall performance and is concerned with determining the ideal range of parameters for optimum performance. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=wheeled%20mobile%20robot" title="wheeled mobile robot">wheeled mobile robot</a>, <a href="https://publications.waset.org/abstracts/search?q=terrain" title=" terrain"> terrain</a>, <a href="https://publications.waset.org/abstracts/search?q=wheel%20slippage" title=" wheel slippage"> wheel slippage</a>, <a href="https://publications.waset.org/abstracts/search?q=odometryerror" title=" odometryerror"> odometryerror</a>, <a href="https://publications.waset.org/abstracts/search?q=trajectory" title=" trajectory"> trajectory</a> </p> <a href="https://publications.waset.org/abstracts/38028/studies-on-affecting-factors-of-wheel-slip-and-odometry-error-on-real-time-of-wheeled-mobile-robots-a-review" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/38028.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">284</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11</span> Visual Odometry and Trajectory Reconstruction for UAVs</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sandro%20Bartolini">Sandro Bartolini</a>, <a href="https://publications.waset.org/abstracts/search?q=Alessandro%20Mecocci"> Alessandro Mecocci</a>, <a href="https://publications.waset.org/abstracts/search?q=Alessio%20Medaglini"> Alessio Medaglini</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The growing popularity of systems based on unmanned aerial vehicles (UAVs) is highlighting their vulnerability, particularly in relation to the positioning system used. Typically, UAV architectures use the civilian GPS, which is exposed to a number of different attacks, such as jamming or spoofing. This is why it is important to develop alternative methodologies to accurately estimate the actual UAV position without relying on GPS measurements only. In this paper, we propose a position estimate method for UAVs based on monocular visual odometry. We have developed a flight control system capable of keeping track of the entire trajectory travelled, with a reduced dependency on the availability of GPS signals. Moreover, the simplicity of the developed solution makes it applicable to a wide range of commercial drones. The final goal is to allow for safer flights in all conditions, even under cyber-attacks trying to deceive the drone. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=visual%20odometry" title="visual odometry">visual odometry</a>, <a href="https://publications.waset.org/abstracts/search?q=autonomous%20uav" title=" autonomous uav"> autonomous uav</a>, <a href="https://publications.waset.org/abstracts/search?q=position%20measurement" title=" position measurement"> position measurement</a>, <a href="https://publications.waset.org/abstracts/search?q=autonomous%20outdoor%20flight" title=" autonomous outdoor flight"> autonomous outdoor flight</a> </p> <a href="https://publications.waset.org/abstracts/139336/visual-odometry-and-trajectory-reconstruction-for-uavs" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/139336.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">217</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">10</span> Application of Adaptive Particle Filter for Localizing a Mobile Robot Using 3D Camera Data</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Maysam%20Shahsavari">Maysam Shahsavari</a>, <a href="https://publications.waset.org/abstracts/search?q=Seyed%20Jamalaldin%20Haddadi"> Seyed Jamalaldin Haddadi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> There are several methods to localize a mobile robot such as relative, absolute and probabilistic. In this paper, particle filter due to its simple implementation and the fact that it does not need to know to the starting position will be used. This method estimates the position of the mobile robot using a probabilistic distribution, relying on a known map of the environment instead of predicting it. Afterwards, it updates this estimation by reading input sensors and control commands. To receive information from the surrounding world, distance to obstacles, for example, a Kinect is used which is much cheaper than a laser range finder. Finally, after explaining the Adaptive Particle Filter method and its implementation in detail, we will compare this method with the dead reckoning method and show that this method is much more suitable for situations in which we have a map of the environment. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=particle%20filter" title="particle filter">particle filter</a>, <a href="https://publications.waset.org/abstracts/search?q=localization" title=" localization"> localization</a>, <a href="https://publications.waset.org/abstracts/search?q=methods" title=" methods"> methods</a>, <a href="https://publications.waset.org/abstracts/search?q=odometry" title=" odometry"> odometry</a>, <a href="https://publications.waset.org/abstracts/search?q=kinect" title=" kinect "> kinect </a> </p> <a href="https://publications.waset.org/abstracts/53041/application-of-adaptive-particle-filter-for-localizing-a-mobile-robot-using-3d-camera-data" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/53041.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">269</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">9</span> Map Matching Performance under Various Similarity Metrics for Heterogeneous Robot Teams</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=M.%20C.%20Akay">M. C. Akay</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Aybakan"> A. Aybakan</a>, <a href="https://publications.waset.org/abstracts/search?q=H.%20Temeltas"> H. Temeltas </a> </p> <p class="card-text"><strong>Abstract:</strong></p> Aerial and ground robots have various advantages of usage in different missions. Aerial robots can move quickly and get a different sight of view of the area, but those vehicles cannot carry heavy payloads. On the other hand, unmanned ground vehicles (UGVs) are slow moving vehicles, since those can carry heavier payloads than unmanned aerial vehicles (UAVs). In this context, we investigate the performances of various Similarity Metrics to provide a common map for Heterogeneous Robot Team (HRT) in complex environments. Within the usage of Lidar Odometry and Octree Mapping technique, the local 3D maps of the environment are gathered. &nbsp;In order to obtain a common map for HRT, informative theoretic similarity metrics are exploited. All types of these similarity metrics gave adequate as allowable simulation time and accurate results that can be used in different types of applications. For the heterogeneous multi robot team, those methods can be used to match different types of maps. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=common%20maps" title="common maps">common maps</a>, <a href="https://publications.waset.org/abstracts/search?q=heterogeneous%20robot%20team" title=" heterogeneous robot team"> heterogeneous robot team</a>, <a href="https://publications.waset.org/abstracts/search?q=map%20matching" title=" map matching"> map matching</a>, <a href="https://publications.waset.org/abstracts/search?q=informative%20theoretic%20similarity%20metrics" title=" informative theoretic similarity metrics"> informative theoretic similarity metrics</a> </p> <a href="https://publications.waset.org/abstracts/99098/map-matching-performance-under-various-similarity-metrics-for-heterogeneous-robot-teams" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/99098.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">167</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">8</span> Augmentation of Automatic Selective Door Operation systems with UWB positioning</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=John%20Chan">John Chan</a>, <a href="https://publications.waset.org/abstracts/search?q=Jake%20Linnenbank"> Jake Linnenbank</a>, <a href="https://publications.waset.org/abstracts/search?q=Gavin%20Caird"> Gavin Caird</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Automatic Selective Door Operation (ASDO) systems are increasingly used in railways to provide Correct Side Door Enable (CSDE) protection as well as to protect passenger doors opening off the platform where the train is longer than the platform, or in overshoot or undershoot scenarios. Such ASDO systems typically utilise trackside-installed RFID beacons, such as Eurobalises for odometry positioning purposes. Installing such trackside infrastructure may not be desirable or possible due to various factors such as conflict with existing infrastructure, potential damage from track tamping and jurisdiction constraints. Ultra-wideband (UWB) positioning technology could enable ASDO positioning requirements to be met without requiring installation of equipment directly on track since UWB technology can be installed on adjacent infrastructure such as on platforms. This paper will explore the feasibility of upgrading existing ASDO systems with UWB positioning technology, the feasibility of retrofitting UWB-enabled ASDO systems onto unfitted trains, and any other considerations relating to the use of UWB positioning for ASDO applications. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=UWB" title="UWB">UWB</a>, <a href="https://publications.waset.org/abstracts/search?q=ASDO" title=" ASDO"> ASDO</a>, <a href="https://publications.waset.org/abstracts/search?q=automatic%20selective%20door%20operations" title=" automatic selective door operations"> automatic selective door operations</a>, <a href="https://publications.waset.org/abstracts/search?q=CSDE" title=" CSDE"> CSDE</a>, <a href="https://publications.waset.org/abstracts/search?q=correct%20side%20door%20enable" title=" correct side door enable"> correct side door enable</a> </p> <a href="https://publications.waset.org/abstracts/166459/augmentation-of-automatic-selective-door-operation-systems-with-uwb-positioning" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/166459.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">77</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7</span> Autonomous Kuka Youbot Navigation Based on Machine Learning and Path Planning</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Carlos%20Gordon">Carlos Gordon</a>, <a href="https://publications.waset.org/abstracts/search?q=Patricio%20Encalada"> Patricio Encalada</a>, <a href="https://publications.waset.org/abstracts/search?q=Henry%20Lema"> Henry Lema</a>, <a href="https://publications.waset.org/abstracts/search?q=Diego%20Leon"> Diego Leon</a>, <a href="https://publications.waset.org/abstracts/search?q=Dennis%20Chicaiza"> Dennis Chicaiza</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The following work presents a proposal of autonomous navigation of mobile robots implemented in an omnidirectional robot Kuka Youbot. We have been able to perform the integration of robotic operative system (ROS) and machine learning algorithms. ROS mainly provides two distributions; ROS hydro and ROS Kinect. ROS hydro allows managing the nodes of odometry, kinematics, and path planning with statistical and probabilistic, global and local algorithms based on Adaptive Monte Carlo Localization (AMCL) and Dijkstra. Meanwhile, ROS Kinect is responsible for the detection block of dynamic objects which can be in the points of the planned trajectory obstructing the path of Kuka Youbot. The detection is managed by artificial vision module under a trained neural network based on the single shot multibox detector system (SSD), where the main dynamic objects for detection are human beings and domestic animals among other objects. When the objects are detected, the system modifies the trajectory or wait for the decision of the dynamic obstacle. Finally, the obstacles are skipped from the planned trajectory, and the Kuka Youbot can reach its goal thanks to the machine learning algorithms. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=autonomous%20navigation" title="autonomous navigation">autonomous navigation</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=path%20planning" title=" path planning"> path planning</a>, <a href="https://publications.waset.org/abstracts/search?q=robotic%20operative%20system" title=" robotic operative system"> robotic operative system</a>, <a href="https://publications.waset.org/abstracts/search?q=open%20source%20computer%20vision%20library" title=" open source computer vision library"> open source computer vision library</a> </p> <a href="https://publications.waset.org/abstracts/101726/autonomous-kuka-youbot-navigation-based-on-machine-learning-and-path-planning" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/101726.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">177</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6</span> A Diagnostic Comparative Analysis of on Simultaneous Localization and Mapping (SLAM) Models for Indoor and Outdoor Route Planning and Obstacle Avoidance </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Seyed%20Esmail%20Seyedi%20Bariran">Seyed Esmail Seyedi Bariran</a>, <a href="https://publications.waset.org/abstracts/search?q=Khairul%20Salleh%20Mohamed%20Sahari"> Khairul Salleh Mohamed Sahari</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In robotics literature, the simultaneous localization and mapping (SLAM) is commonly associated with a priori-posteriori problem. The autonomous vehicle needs a neutral map to spontaneously track its local position, i.e., “localization” while at the same time a precise path estimation of the environment state is required for effective route planning and obstacle avoidance. On the other hand, the environmental noise factors can significantly intensify the inherent uncertainties in using odometry information and measurements obtained from the robot’s exteroceptive sensor which in return directly affect the overall performance of the corresponding SLAM. Therefore, the current work is primarily dedicated to provide a diagnostic analysis of six SLAM algorithms including FastSLAM, L-SLAM, GraphSLAM, Grid SLAM and DP-SLAM. A SLAM simulated environment consisting of two sets of landmark locations and robot waypoints was set based on modified EKF and UKF in MATLAB using two separate maps for indoor and outdoor route planning subject to natural and artificial obstacles. The simulation results are expected to provide an unbiased platform to compare the estimation performances of the five SLAM models as well as on the reliability of each SLAM model for indoor and outdoor applications. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=route%20planning" title="route planning">route planning</a>, <a href="https://publications.waset.org/abstracts/search?q=obstacle" title=" obstacle"> obstacle</a>, <a href="https://publications.waset.org/abstracts/search?q=estimation%20performance" title=" estimation performance"> estimation performance</a>, <a href="https://publications.waset.org/abstracts/search?q=FastSLAM" title=" FastSLAM"> FastSLAM</a>, <a href="https://publications.waset.org/abstracts/search?q=L-SLAM" title=" L-SLAM"> L-SLAM</a>, <a href="https://publications.waset.org/abstracts/search?q=GraphSLAM" title=" GraphSLAM"> GraphSLAM</a>, <a href="https://publications.waset.org/abstracts/search?q=Grid%20SLAM" title=" Grid SLAM"> Grid SLAM</a>, <a href="https://publications.waset.org/abstracts/search?q=DP-SLAM" title=" DP-SLAM"> DP-SLAM</a> </p> <a href="https://publications.waset.org/abstracts/13160/a-diagnostic-comparative-analysis-of-on-simultaneous-localization-and-mapping-slam-models-for-indoor-and-outdoor-route-planning-and-obstacle-avoidance" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/13160.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">444</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">5</span> Robot Operating System-Based SLAM for a Gazebo-Simulated Turtlebot2 in 2d Indoor Environment with Cartographer Algorithm</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Wilayat%20Ali">Wilayat Ali</a>, <a href="https://publications.waset.org/abstracts/search?q=Li%20Sheng"> Li Sheng</a>, <a href="https://publications.waset.org/abstracts/search?q=Waleed%20Ahmed"> Waleed Ahmed</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The ability of the robot to make simultaneously map of the environment and localize itself with respect to that environment is the most important element of mobile robots. To solve SLAM many algorithms could be utilized to build up the SLAM process and SLAM is a developing area in Robotics research. Robot Operating System (ROS) is one of the frameworks which provide multiple algorithm nodes to work with and provide a transmission layer to robots. Manyof these algorithms extensively in use are Hector SLAM, Gmapping and Cartographer SLAM. This paper describes a ROS-based Simultaneous localization and mapping (SLAM) library Google Cartographer mapping, which is open-source algorithm. The algorithm was applied to create a map using laser and pose data from 2d Lidar that was placed on a mobile robot. The model robot uses the gazebo package and simulated in Rviz. Our research work&#39;s primary goal is to obtain mapping through Cartographer SLAM algorithm in a static indoor environment. From our research, it is shown that for indoor environments cartographer is an applicable algorithm to generate 2d maps with LIDAR placed on mobile robot because it uses both odometry and poses estimation. The algorithm has been evaluated and maps are constructed against the SLAM algorithms presented by Turtlebot2 in the static indoor environment. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=SLAM" title="SLAM">SLAM</a>, <a href="https://publications.waset.org/abstracts/search?q=ROS" title=" ROS"> ROS</a>, <a href="https://publications.waset.org/abstracts/search?q=navigation" title=" navigation"> navigation</a>, <a href="https://publications.waset.org/abstracts/search?q=localization%20and%20mapping" title=" localization and mapping"> localization and mapping</a>, <a href="https://publications.waset.org/abstracts/search?q=gazebo" title=" gazebo"> gazebo</a>, <a href="https://publications.waset.org/abstracts/search?q=Rviz" title=" Rviz"> Rviz</a>, <a href="https://publications.waset.org/abstracts/search?q=Turtlebot2" title=" Turtlebot2"> Turtlebot2</a>, <a href="https://publications.waset.org/abstracts/search?q=slam%20algorithms" title=" slam algorithms"> slam algorithms</a>, <a href="https://publications.waset.org/abstracts/search?q=2d%20indoor%20environment" title=" 2d indoor environment"> 2d indoor environment</a>, <a href="https://publications.waset.org/abstracts/search?q=cartographer" title=" cartographer"> cartographer</a> </p> <a href="https://publications.waset.org/abstracts/133435/robot-operating-system-based-slam-for-a-gazebo-simulated-turtlebot2-in-2d-indoor-environment-with-cartographer-algorithm" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/133435.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">145</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4</span> Autonomous Exploration, Navigation and Mapping Payload Integrated on a Quadruped Robot</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Julian%20Y.%20Raheema">Julian Y. Raheema</a>, <a href="https://publications.waset.org/abstracts/search?q=Michael%20R.%20Hess"> Michael R. Hess</a>, <a href="https://publications.waset.org/abstracts/search?q=Raymond%20C.%20Provost"> Raymond C. Provost</a>, <a href="https://publications.waset.org/abstracts/search?q=Mark%20Bilinski"> Mark Bilinski</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The world is rapidly moving towards advancing and utilizing artificial intelligence and autonomous robotics. The ground-breaking Boston Dynamics quadruped robot, SPOT, was designed for industrial and commercial tasks requiring limited autonomous navigation. Out of the box, SPOT has route memorization and playback – it can repeat a path that it has been manually piloted through, but it cannot autonomously navigate an area that has not been previously explored. The presented SPOT payload package is built on ROS framework to support autonomous navigation and mapping of an unexplored environment. The package is fully integrated with SPOT to take advantage of motor controls and collision avoidance that comes natively with the robot. The payload runs all computations onboard, takes advantage of visual odometry SLAM and uses an Intel RealSense depth camera and Velodyne LiDAR sensor to generate 2D and 3D maps while in autonomous navigation mode. These maps are fused into the navigation stack to generate a costmap to enable the robot to safely navigate the environment without causing damage to the surroundings or the robot. 
The operator defines the operational zone and start location, then sends the explore command to have SPOT explore, generate 2D and 3D maps of the environment, and return to the start location to await the operator's next command. The benefit of the presented package is that it is much lighter and less expensive than previous approaches and, importantly, operates in GPS-denied scenarios, which is ideal for indoor mapping. There are numerous applications hazardous to humans for a SPOT enhanced with the autonomy payload, including disaster response, nuclear inspection, and mine inspection. Other, less extreme use cases include autonomous 3D and 2D scanning of facilities for inspection, engineering, and construction purposes.

Keywords: autonomous, SLAM, quadruped, mapping, exploring, ROS, robotics, navigation

Procedia: https://publications.waset.org/abstracts/173559/autonomous-exploration-navigation-and-mapping-payload-integrated-on-a-quadruped-robot | PDF: https://publications.waset.org/abstracts/173559.pdf | Downloads: 90

3. Inelastic and Elastic Taping in Plantar Pressure of Runners Pronators: Clinical Trial

Authors: Liana Gomide, Juliana Rodrigues

Abstract: The morphology of the foot defines how it functions, and sound biomechanics are indispensable for a symmetrical distribution of plantar pressures so that no single component is overloaded in isolation. High plantar pressure at specific points of the foot may be a causal factor in several orthopedic disorders that affect the feet, such as pain and stress fractures. With digital baropodometry equipment, one can observe the intensity of pressures along the entire foot and quantify some of the movements involved in microtrauma, such as the subtalar pronation present in the midfoot region. In clinical practice, this excessive movement has been limited with different taping techniques applied to the plantar arch. The objective of the present study was therefore to analyze and compare the influence of inelastic and elastic taping on the distribution of plantar pressure in pronating runners.
This is a randomized, blind, crossover clinical trial. Twenty (20) male subjects were included in the study (mean age 33 ± 7 years, mean body mass 71 ± 7 kg, mean height 174 ± 6 cm). Data were collected by a single researcher using baropodometry equipment (Tekscan F-Scan Mobile). The tests were performed at three different times. First, a baseline baropodometric evaluation was performed without taping, running at a speed of 9.0 km/h. In the second and third sessions, inelastic or elastic taping was applied, in the order defined by the randomization. Both inelastic and elastic taping provided significant reductions in contact-pressure and peak-pressure values when compared with no taping. However, elastic taping was more effective in decreasing contact pressure (no taping = 714 ± 201, elastic taping = 690 ± 210, inelastic taping = 716 ± 180) and peak pressure in the midfoot region (no taping = 1490 ± 42, elastic taping = 1273 ± 323, inelastic taping = 1487 ± 437). It can be concluded that elastic taping reduced pressure in the midfoot region, thereby limiting subtalar pronation during running.

Keywords: elastic taping, inelastic taping, running, subtalar pronation

Procedia: https://publications.waset.org/abstracts/78087/inelastic-and-elastic-taping-in-plantar-pressure-of-runners-pronators-clinical-trial | PDF: https://publications.waset.org/abstracts/78087.pdf | Downloads: 156

2. High Speed Motion Tracking with Magnetometer in Nonuniform Magnetic Field

Authors: Jeronimo Cox, Tomonari Furukawa

Abstract: Magnetometers have become more popular in inertial measurement units (IMUs) for their ability to correct estimates using the earth's magnetic field, whereas accelerometer- and gyroscope-based packages fail as dead-reckoning errors accumulate over time. Localization with magnetometer-inclusive IMUs has become popular in robotics as a way to track the odometry of slower-speed robots. With high-speed motions, the error accumulates over smaller periods of time, making such motions difficult to track with an IMU; tracking is especially difficult with limited observability, since visual obstruction of the motion leaves motion-tracking cameras unusable. When motions are too dynamic for estimation techniques reliant on the observability of the gravity vector, the use of magnetometers is further justified. However, available magnetometer calibration methods are limited by the assumption that the background magnetic field is uniform, so estimation in nonuniform magnetic fields is problematic. Hard-iron distortion is a distortion of the magnetic field by other objects that produce magnetic fields; it is often observed as the offset of the centre of the data points from the origin when a magnetometer is rotated, and its magnitude depends on proximity to the distortion sources. Soft-iron distortion relates more to the scaling of the axes of the magnetometer sensors; hard-iron distortion is the larger contributor to attitude-estimation error with magnetometers. Indoor environments, or spaces inside ferrite-based structures such as building reinforcements or a vehicle, often exhibit distortions that vary with proximity. Since positions correlate with areas of distortion, magnetometer localization methods include producing spatial maps of the magnetic field and collecting distortion signatures to better aid location tracking. The goal of this paper is to compare magnetometer methods that do not need pre-produced magnetic-field maps, as mapping the magnetic field in some spaces can be costly and inefficient. Dynamic measurement fusion is used to track the motion of a multi-link system. Conventional calibration by collecting data while rotating about a static point, real-time estimation of the calibration parameters at each time step, and the use of two magnetometers to determine local hard-iron distortion are compared to confirm the robustness and accuracy of each technique. With opposite-facing magnetometers, hard-iron distortion can be accounted for regardless of position, rather than assumed constant under positional change. The measured motion is a repeatable planar motion of a two-link system connected by revolute joints, in which the links are translated on a moving base to induce rotation of the links. The joints are equipped with absolute encoders, and the motion is recorded with cameras, to enable ground-truth comparison for each of the magnetometer methods. While the two-magnetometer method accounts for local hard-iron distortion, it fails where the direction of the magnetic field in space is inconsistent.

Keywords: motion tracking, sensor fusion, magnetometer, state estimation

Procedia: https://publications.waset.org/abstracts/161291/high-speed-motion-tracking-with-magnetometer-in-nonuniform-magnetic-field | PDF: https://publications.waset.org/abstracts/161291.pdf | Downloads: 84
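The conventional static-rotation calibration mentioned above is commonly posed as a sphere fit: with only a hard-iron offset present, readings taken while rotating the sensor lie on a sphere centred at the offset. A minimal least-squares sketch of that step (handling soft-iron scaling as well would require a full ellipsoid fit):

    import numpy as np

    def hard_iron_offset(samples):
        """Estimate the hard-iron offset from (N, 3) magnetometer readings.

        Linearises |m - c|^2 = r^2 into 2 m.c + (r^2 - |c|^2) = |m|^2 and
        solves for the sphere centre c (the offset) and radius r.
        """
        m = np.asarray(samples, dtype=float)
        A = np.hstack([2.0 * m, np.ones((len(m), 1))])
        b = np.sum(m ** 2, axis=1)
        sol, *_ = np.linalg.lstsq(A, b, rcond=None)
        center = sol[:3]
        radius = float(np.sqrt(sol[3] + center @ center))
        return center, radius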
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=motion%20tracking" title="motion tracking">motion tracking</a>, <a href="https://publications.waset.org/abstracts/search?q=sensor%20fusion" title=" sensor fusion"> sensor fusion</a>, <a href="https://publications.waset.org/abstracts/search?q=magnetometer" title=" magnetometer"> magnetometer</a>, <a href="https://publications.waset.org/abstracts/search?q=state%20estimation" title=" state estimation"> state estimation</a> </p> <a href="https://publications.waset.org/abstracts/161291/high-speed-motion-tracking-with-magnetometer-in-nonuniform-magnetic-field" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/161291.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">84</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1</span> Calpoly Autonomous Transportation Experience: Software for Driverless Vehicle Operating on Campus</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=F.%20Tang">F. Tang</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20Boskovich"> S. Boskovich</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Raheja"> A. Raheja</a>, <a href="https://publications.waset.org/abstracts/search?q=Z.%20Aliyazicioglu"> Z. Aliyazicioglu</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20Bhandari"> S. Bhandari</a>, <a href="https://publications.waset.org/abstracts/search?q=N.%20Tsuchiya"> N. Tsuchiya</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Calpoly Autonomous Transportation Experience (CATE) is a driverless vehicle that we are developing to provide safe, accessible, and efficient transportation of passengers throughout the Cal Poly Pomona campus for events such as orientation tours. Unlike the other self-driving vehicles that are usually developed to operate with other vehicles and reside only on the road networks, CATE will operate exclusively on walk-paths of the campus (potentially narrow passages) with pedestrians traveling from multiple locations. Safety becomes paramount as CATE operates within the same environment as pedestrians. As driverless vehicles assume greater roles in today’s transportation, this project will contribute to autonomous driving with pedestrian traffic in a highly dynamic environment. The CATE project requires significant interdisciplinary work. Researchers from mechanical engineering, electrical engineering and computer science are working together to attack the problem from different perspectives (hardware, software and system). In this abstract, we describe the software aspects of the project, with a focus on the requirements and the major components. CATE shall provide a GUI interface for the average user to interact with the car and access its available functionalities, such as selecting a destination from any origin on campus. We have developed an interface that provides an aerial view of the campus map, the current car location, routes, and the goal location. Users can interact with CATE through audio or manual inputs. CATE shall plan routes from the origin to the selected destination for the vehicle to travel. 
We will use an existing aerial map of the campus and convert it to a spatial graph configuration in which the vertices represent landmarks and the edges represent paths that the car should follow with designated behaviors (such as staying on the right side of the lane or following an edge). Graph search algorithms such as A* will be implemented as the default path-planning algorithm, and D* Lite will be explored to efficiently recompute the path when there are changes to the map. CATE shall avoid static obstacles and walking pedestrians within a safe distance. Unlike traveling along traditional roadways, CATE's route directly coexists with pedestrians. To ensure their safety, we will use sensor fusion techniques that combine data from both lidar and stereo vision for obstacle avoidance while still allowing CATE to operate along its intended route, and we will build prediction models for pedestrian traffic patterns. CATE shall improve its localization and work in GPS-denied situations. CATE relies on GPS for its current location, which has a precision of only a few meters. We have implemented an Unscented Kalman Filter (UKF) that fuses data from multiple sensors (GPS, IMU, odometry) to increase the confidence of the localization. We also noticed that GPS signals can easily be degraded or blocked on campus by high-rise buildings or trees; the UKF helps here as well by producing a better state estimate. In summary, CATE will provide an on-campus transportation experience that coexists with dynamic pedestrian traffic. In future work, we will extend it to multi-vehicle scenarios.

Keywords: driverless vehicle, path planning, sensor fusion, state estimate

Procedia: https://publications.waset.org/abstracts/94814/calpoly-autonomous-transportation-experience-software-for-driverless-vehicle-operating-on-campus | PDF: https://publications.waset.org/abstracts/94814.pdf | Downloads: 144
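A sketch of A* over a spatial graph of the kind described above, where graph[u] holds (neighbor, edge-cost) pairs and coords holds each vertex's (x, y) position so that straight-line distance can serve as the admissible heuristic. The data layout is an illustrative assumption, not the project's actual map format.

    import heapq
    import math

    def a_star(graph, coords, start, goal):
        """Shortest path on a spatial graph; returns a vertex list or None."""
        def h(u):  # straight-line distance to the goal (admissible heuristic)
            (x0, y0), (x1, y1) = coords[u], coords[goal]
            return math.hypot(x1 - x0, y1 - y0)

        g = {start: 0.0}
        parent = {}
        open_heap = [(h(start), start)]
        closed = set()
        while open_heap:
            _, u = heapq.heappop(open_heap)
            if u == goal:  # walk back through parents to recover the path
                path = [u]
                while u in parent:
                    u = parent[u]
                    path.append(u)
                return path[::-1]
            if u in closed:
                continue
            closed.add(u)
            for v, cost in graph[u]:
                ng = g[u] + cost
                if ng < g.get(v, float("inf")):
                    g[v] = ng
                    parent[v] = u
                    heapq.heappush(open_heap, (ng + h(v), v))
        return None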
href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">&copy; 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>
