
Search results for: drone technology

<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: drone technology</title> <meta name="description" content="Search results for: drone technology"> <meta name="keywords" content="drone technology"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img src="https://cdn.waset.org/static/images/wasetc.png" alt="Open Science Research Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="drone technology" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" 
href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="drone technology"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 7848</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: drone technology</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7848</span> Stakeholder Analysis of Agricultural Drone Policy: A Case Study of the Agricultural Drone Ecosystem of Thailand</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Thanomsin%20Chakreeves">Thanomsin Chakreeves</a>, <a href="https://publications.waset.org/abstracts/search?q=Atichat%20Preittigun"> Atichat Preittigun</a>, <a href="https://publications.waset.org/abstracts/search?q=Ajchara%20Phu-ang"> Ajchara Phu-ang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents a stakeholder analysis of agricultural drone policies that meet the government&#39;s goal of building an agricultural drone ecosystem in Thailand. Firstly, case studies from other countries are reviewed. The stakeholder analysis method and qualitative data from the interviews are then presented including data from the Institute of Innovation and Management, the Office of National Higher Education Science Research and Innovation Policy Council, agricultural entrepreneurs and farmers. Study and interview data are then employed to describe the current ecosystem and to guide the implementation of agricultural drone policies that are suitable for the ecosystem of Thailand. Finally, policy recommendations are then made that the Thai government should adopt in the future. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone%20public%20policy" title="drone public policy">drone public policy</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20ecosystem" title=" drone ecosystem"> drone ecosystem</a>, <a href="https://publications.waset.org/abstracts/search?q=policy%20development" title=" policy development"> policy development</a>, <a href="https://publications.waset.org/abstracts/search?q=agricultural%20drone" title=" agricultural drone"> agricultural drone</a> </p> <a href="https://publications.waset.org/abstracts/132133/stakeholder-analysis-of-agricultural-drone-policy-a-case-study-of-the-agricultural-drone-ecosystem-of-thailand" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/132133.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">149</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7847</span> 3D Stereoscopic Measurements from AR Drone Squadron</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=R.%20Schurig">R. Schurig</a>, <a href="https://publications.waset.org/abstracts/search?q=T.%20D%C3%A9sesquelles"> T. Désesquelles</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Dumont"> A. Dumont</a>, <a href="https://publications.waset.org/abstracts/search?q=E.%20Lefranc"> E. Lefranc</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Lux"> A. Lux</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A cost-efficient alternative is proposed to the use of a single drone carrying multiple cameras in order to take stereoscopic images and videos during its flight. Such drone has to be particularly large enough to take off with its equipment, and stable enough in order to make valid measurements. Corresponding performance for a single aircraft usually comes with a large cost. Proposed solution consists in using multiple smaller and cheaper aircrafts carrying one camera each instead of a single expensive one. To give a proof of concept, AR drones, quad-rotor UAVs from Parrot Inc., are experimentally used. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone%20squadron" title="drone squadron">drone squadron</a>, <a href="https://publications.waset.org/abstracts/search?q=flight%20control" title=" flight control"> flight control</a>, <a href="https://publications.waset.org/abstracts/search?q=rotorcraft" title=" rotorcraft"> rotorcraft</a>, <a href="https://publications.waset.org/abstracts/search?q=Unmanned%20Aerial%20Vehicle%20%28UAV%29" title=" Unmanned Aerial Vehicle (UAV)"> Unmanned Aerial Vehicle (UAV)</a>, <a href="https://publications.waset.org/abstracts/search?q=AR%20drone" title=" AR drone"> AR drone</a>, <a href="https://publications.waset.org/abstracts/search?q=stereoscopic%20vision" title=" stereoscopic vision"> stereoscopic vision</a> </p> <a href="https://publications.waset.org/abstracts/17205/3d-stereoscopic-measurements-from-ar-drone-squadron" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/17205.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">473</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7846</span> Droning the Pedagogy: Future Prospect of Teaching and Learning </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Farha%20Sattar">Farha Sattar</a>, <a href="https://publications.waset.org/abstracts/search?q=Laurence%20Tamatea"> Laurence Tamatea</a>, <a href="https://publications.waset.org/abstracts/search?q=Muhammad%20Nawaz"> Muhammad Nawaz</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Drones, the Unmanned Aerial Vehicles are playing an important role in real-world problem-solving. With the new advancements in technology, drones are becoming available, affordable and user- friendly. Use of drones in education is opening new trends in teaching and learning practices in an innovative and engaging way. Drones vary in types and sizes and possess various characteristics and capabilities which enhance their potential to be used in education from basic to advanced and challenging learning activities which are suitable for primary, middle and high school level. This research aims to provide an insight to explore different types of drones and their compatibility to be used in teaching different subjects at various levels. Research focuses on integrating the drone technology along with Australian curriculum content knowledge to reinforce the understanding of the fundamental concepts and helps to develop the critical thinking and reasoning in the learning process. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=critical%20thinking" title="critical thinking">critical thinking</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20technology" title=" drone technology"> drone technology</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20types" title=" drone types"> drone types</a>, <a href="https://publications.waset.org/abstracts/search?q=innovative%20learning" title=" innovative learning"> innovative learning</a> </p> <a href="https://publications.waset.org/abstracts/69802/droning-the-pedagogy-future-prospect-of-teaching-and-learning" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/69802.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">309</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7845</span> A Research on the Benefits of Drone Usage in Industry by Determining Companies Using Drone in the World</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ahmet%20Akdemir">Ahmet Akdemir</a>, <a href="https://publications.waset.org/abstracts/search?q=G%C3%BCzide%20Karaku%C5%9F"> Güzide Karakuş</a>, <a href="https://publications.waset.org/abstracts/search?q=Leyla%20Polat"> Leyla Polat</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Aviation that has been arisen in accordance with flying request that is existing inside of people, has not only made life easier by making a great contribution to humanity; it has also accelerated globalization by reducing distances between countries. It is seen that the growth rate of aviation industry has reached the undreamed level when it is looked back on. Today, the last point in aviation is unmanned aerial vehicles that are self-ventilating and move in desired coordinates without any onboard pilot. For those vehicles, there are two different control systems are developed. In the first type of control, an unmanned aerial vehicle (UAV) moves according to instructions of a remote control. UAV that moves with a remote control is named as drone; it can be used personally. In the second one, there is a flight plan that is programmed and placed inside of UAV before flight. Recently, drones have started to be used in unimagined areas and utilize specific, important benefits for any industry. Within this framework, this study answers the question that is drone usage would be beneficial for businesses or not. To answer this question, applied basic methodologies are determining businesses using drone in the world, their purposes to use drone, and then, comparing their economy as before drone and after drone. In the end of this study, it is seen that many companies in different business areas use drone in logistics support, and it makes their work easier than before. This paper has contributed to academic literature about this subject, and it has introduced the benefits of drone usage for businesses. In addition, it has encouraged businesses that they keep pace with this technological age by following the developments about drones. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=aviation" title="aviation">aviation</a>, <a href="https://publications.waset.org/abstracts/search?q=drone" title=" drone"> drone</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20in%20business" title=" drone in business"> drone in business</a>, <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20vehicle" title=" unmanned aerial vehicle"> unmanned aerial vehicle</a> </p> <a href="https://publications.waset.org/abstracts/77049/a-research-on-the-benefits-of-drone-usage-in-industry-by-determining-companies-using-drone-in-the-world" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/77049.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">257</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7844</span> Safe Zone: A Framework for Detecting and Preventing Drones Misuse </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=AlHanoof%20A.%20Alharbi">AlHanoof A. Alharbi</a>, <a href="https://publications.waset.org/abstracts/search?q=Fatima%20M.%20Alamoudi"> Fatima M. Alamoudi</a>, <a href="https://publications.waset.org/abstracts/search?q=Razan%20A.%20Albrahim"> Razan A. Albrahim</a>, <a href="https://publications.waset.org/abstracts/search?q=Sarah%20F.%20Alharbi"> Sarah F. Alharbi</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdullah%20M%20Almuhaideb"> Abdullah M Almuhaideb</a>, <a href="https://publications.waset.org/abstracts/search?q=Norah%20A.%20Almubairik"> Norah A. Almubairik</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdulrahman%20Alharby"> Abdulrahman Alharby</a>, <a href="https://publications.waset.org/abstracts/search?q=Naya%20M.%20Nagy"> Naya M. Nagy</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Recently, drones received a rapid interest in different industries worldwide due to its powerful impact. However, limitations still exist in this emerging technology, especially privacy violation. These aircrafts consistently threaten the security of entities by entering restricted areas accidentally or deliberately. Therefore, this research project aims to develop drone detection and prevention mechanism to protect the restricted area. Until now, none of the solutions have met the optimal requirements of detection which are cost-effectiveness, high accuracy, long range, convenience, unaffected by noise and generalization. In terms of prevention, the existing methods are focusing on impractical solutions such as catching a drone by a larger drone, training an eagle or a gun. In addition, the practical solutions have limitations, such as the No-Fly Zone and PITBULL jammers. According to our study and analysis of previous related works, none of the solutions includes detection and prevention at the same time. The proposed solution is a combination of detection and prevention methods. To implement the detection system, a passive radar will be used to properly identify the drone against any possible flying objects. As for the prevention, jamming signals and forceful safe landing of the drone integrated together to stop the drone’s operation. 
We believe that applying this mechanism will limit the drone’s invasion of privacy incidents against highly restricted properties. Consequently, it effectively accelerates drones‘ usages at personal and governmental levels. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=detection" title="detection">detection</a>, <a href="https://publications.waset.org/abstracts/search?q=drone" title=" drone"> drone</a>, <a href="https://publications.waset.org/abstracts/search?q=jamming" title=" jamming"> jamming</a>, <a href="https://publications.waset.org/abstracts/search?q=prevention" title=" prevention"> prevention</a>, <a href="https://publications.waset.org/abstracts/search?q=privacy" title=" privacy"> privacy</a>, <a href="https://publications.waset.org/abstracts/search?q=RF" title=" RF"> RF</a>, <a href="https://publications.waset.org/abstracts/search?q=radar" title=" radar"> radar</a>, <a href="https://publications.waset.org/abstracts/search?q=UAV" title=" UAV"> UAV</a> </p> <a href="https://publications.waset.org/abstracts/106189/safe-zone-a-framework-for-detecting-and-preventing-drones-misuse" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/106189.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">213</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7843</span> Comparison of Direction of Arrival Estimation Method for Drone Based on Phased Microphone Array</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jiwon%20Lee">Jiwon Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=Yeong-Ju%20Go"> Yeong-Ju Go</a>, <a href="https://publications.waset.org/abstracts/search?q=Jong-Soo%20Choi"> Jong-Soo Choi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Drones were first developed for military use and were used in World War 1. But recently drones have been used in a variety of fields. Several companies actively utilize drone technology to strengthen their services, and in agriculture, drones are used for crop monitoring and sowing. Other people use drones for hobby activities such as photography. However, as the range of use of drones expands rapidly, problems caused by drones such as improperly flying, privacy and terrorism are also increasing. As the need for monitoring and tracking of drones increases, researches are progressing accordingly. The drone detection system estimates the position of the drone using the physical phenomena that occur when the drones fly. The drone detection system measures being developed utilize many approaches, such as radar, infrared camera, and acoustic detection systems. Among the various drone detection system, the acoustic detection system is advantageous in that the microphone array system is small, inexpensive, and easy to operate than other systems. In this paper, the acoustic signal is acquired by using minimum microphone when drone is flying, and direction of drone is estimated. When estimating the Direction of Arrival(DOA), there is a method of calculating the DOA based on the Time Difference of Arrival(TDOA) and a method of calculating the DOA based on the beamforming. 
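To make the TDOA approach concrete, the sketch below estimates a bearing from a single microphone pair by cross-correlating the two channels. This is an illustrative example only, not the authors' implementation; the sampling rate, microphone spacing, and synthetic broadband source are assumed values.

```python
import numpy as np

def tdoa_doa(sig_a, sig_b, fs, mic_distance, c=343.0):
    """Estimate a bearing from one microphone pair via cross-correlation (TDOA),
    assuming a far-field plane wave and free-field propagation."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    lag = np.argmax(corr) - (len(sig_a) - 1)   # positive lag: mic B hears the source later
    tau = lag / fs                              # time difference of arrival [s]
    sin_theta = np.clip(c * tau / mic_distance, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))    # angle from broadside, toward mic A

# Synthetic check: broadband source arriving ~30 degrees off broadside.
fs, d = 48_000, 0.1                             # sample rate [Hz], mic spacing [m]
rng = np.random.default_rng(0)
src = rng.standard_normal(4800)                 # 0.1 s of broadband source noise
delay = 7                                       # samples, roughly d*sin(30 deg)/343 * fs
mic_a = src
mic_b = np.concatenate([np.zeros(delay), src[:-delay]])
print(tdoa_doa(mic_a, mic_b, fs, d))            # ~30 degrees
```
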
7842. Drone Classification Using Classification Methods Using Conventional Model With Embedded Audio-Visual Features
Authors: Hrishi Rakshit, Pooneh Bagheri Zadeh
Abstract: This paper investigates the performance of drone classification methods using conventional DCNNs with different hyperparameters when additional drone audio data is embedded in the dataset for training and classification. First, a custom dataset is created using drone images from University of Southern California (USC) datasets and Leeds Beckett University datasets, with embedded drone audio signals. Three well-known DCNN architectures, namely ResNet50, Darknet53, and ShuffleNet, are trained on the created dataset, tuning hyperparameters such as learning rate, maximum epochs, and mini-batch size with different optimizers. Precision-recall curves and F1-score-versus-threshold curves are used to evaluate the performance of the classification algorithms. Experimental results show that ResNet50 achieves the highest efficiency compared with the other DCNN methods.
Keywords: drone classifications, deep convolutional neural network, hyperparameters, drone audio signal
Procedia: https://publications.waset.org/abstracts/172929/drone-classification-using-classification-methods-using-conventional-model-with-embedded-audio-visual-features | PDF: https://publications.waset.org/abstracts/172929.pdf | Downloads: 104

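The kind of hyperparameter tuning described can be sketched with a transfer-learning setup in PyTorch/torchvision. The grid values, class count, and training loop below are illustrative assumptions, not the authors' actual configuration.

```python
import torch
import torch.nn as nn
from torchvision import models

# Illustrative hyperparameter grid of the kind tuned in such a study
# (values are assumptions, not the authors' settings).
learning_rates = [1e-3, 1e-4]
mini_batch_sizes = [16, 32]
max_epochs = 20
num_classes = 2                                   # e.g. "drone" vs. "not drone"

def build_model(num_classes: int) -> nn.Module:
    """ResNet50 backbone with a new classification head."""
    model = models.resnet50(weights=None)         # torchvision >= 0.13; older versions use pretrained=False
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    return model

def train_one_config(loader, lr: float, epochs: int, device="cpu"):
    """Train one (learning rate, batch size) configuration; loader yields (images, labels)."""
    model = build_model(num_classes).to(device)
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for images, labels in loader:
            optimizer.zero_grad()
            loss = criterion(model(images.to(device)), labels.to(device))
            loss.backward()
            optimizer.step()
    return model
```
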
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone%20classifications" title="drone classifications">drone classifications</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20convolutional%20neural%20network" title=" deep convolutional neural network"> deep convolutional neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=hyperparameters" title=" hyperparameters"> hyperparameters</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal" title=" drone audio signal"> drone audio signal</a> </p> <a href="https://publications.waset.org/abstracts/172929/drone-classification-using-classification-methods-using-conventional-model-with-embedded-audio-visual-features" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/172929.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">104</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7841</span> Design of a Surveillance Drone with Computer Aided Durability</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Maram%20Shahad%20Dana%20Anfal">Maram Shahad Dana Anfal</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This research paper presents the design of a surveillance drone with computer-aided durability and model analyses that provides a cost-effective and efficient solution for various applications. The quadcopter's design is based on a lightweight and strong structure made of materials such as aluminum and titanium, which provide a durable structure for the quadcopter. The structure of this product and the computer-aided durability system are both designed to ensure frequent repairs or replacements, which will save time and money in the long run. Moreover, the study discusses the drone's ability to track, investigate, and deliver objects more quickly than traditional methods, makes it a highly efficient and cost-effective technology. In this paper, a comprehensive analysis of the quadcopter's operation dynamics and limitations is presented. In both simulation and experimental data, the computer-aided durability system and the drone's design demonstrate their effectiveness, highlighting the potential for a variety of applications, such as search and rescue missions, infrastructure monitoring, and agricultural operations. Also, the findings provide insights into possible areas for improvement in the design and operation of the drone. Ultimately, this paper presents a reliable and cost-effective solution for surveillance applications by designing a drone with computer-aided durability and modeling. With its potential to save time and money, increase reliability, and enhance safety, it is a promising technology for the future of surveillance drones. operation dynamic equations have been evaluated successfully for different flight conditions of a quadcopter. Also, CAE modeling techniques have been applied for the modal risk assessment at operating conditions.Stress analysis have been performed under the loadings of the worst-case combined motion flight conditions. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone" title="drone">drone</a>, <a href="https://publications.waset.org/abstracts/search?q=material" title=" material"> material</a>, <a href="https://publications.waset.org/abstracts/search?q=solidwork" title=" solidwork"> solidwork</a>, <a href="https://publications.waset.org/abstracts/search?q=hypermesh" title=" hypermesh"> hypermesh</a> </p> <a href="https://publications.waset.org/abstracts/167463/design-of-a-surveillance-drone-with-computer-aided-durability" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/167463.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">146</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7840</span> Designing Agricultural Irrigation Systems Using Drone Technology and Geospatial Analysis</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yongqin%20Zhang">Yongqin Zhang</a>, <a href="https://publications.waset.org/abstracts/search?q=John%20Lett"> John Lett</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Geospatial technologies have been increasingly used in agriculture for various applications and purposes in recent years. Unmanned aerial vehicles (drones) fit the needs of farmers in farming operations, from field spraying to grow cycles and crop health. In this research, we conducted a practical research project that used drone technology to design and map optimal locations and layouts of irrigation systems for agriculture farms. We flew a DJI Mavic 2 Pro drone to acquire aerial remote sensing images over two agriculture fields in Forest, Mississippi, in 2022. Flight plans were first designed to capture multiple high-resolution images via a 20-megapixel RGB camera mounted on the drone over the agriculture fields. The Drone Deploy web application was then utilized to develop flight plans and subsequent image processing and measurements. The images were orthorectified and processed to estimate the area of the area and measure the locations of the water line and sprinkle heads. Field measurements were conducted to measure the ground targets and validate the aerial measurements. Geospatial analysis and photogrammetric measurements were performed for the study area to determine optimal layout and quantitative estimates for irrigation systems. We created maps and tabular estimates to demonstrate the locations, spacing, amount, and layout of sprinkler heads and water lines to cover the agricultural fields. This research project provides scientific guidance to Mississippi farmers for a precision agricultural irrigation practice. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone%20images" title="drone images">drone images</a>, <a href="https://publications.waset.org/abstracts/search?q=agriculture" title=" agriculture"> agriculture</a>, <a href="https://publications.waset.org/abstracts/search?q=irrigation" title=" irrigation"> irrigation</a>, <a href="https://publications.waset.org/abstracts/search?q=geospatial%20analysis" title=" geospatial analysis"> geospatial analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=photogrammetric%20measurements" title=" photogrammetric measurements"> photogrammetric measurements</a> </p> <a href="https://publications.waset.org/abstracts/162153/designing-agricultural-irrigation-systems-using-drone-technology-and-geospatial-analysis" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/162153.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">77</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7839</span> Cognitive Theory and the Design of Integrate Curriculum</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Bijan%20Gillani">Bijan Gillani</a>, <a href="https://publications.waset.org/abstracts/search?q=Roya%20Gillani"> Roya Gillani</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The purpose of this paper is to propose a pedagogical model where engineering provides the interconnection to integrate the other topics of science, technology, engineering, and mathematics. The author(s) will first present a brief discussion of cognitive theory and then derive an integrated pedagogy to use engineering and technology, such as drones, sensors, camera, iPhone, radio waves as the nexus to an integrated curriculum development for the other topics of STEM. Based on this pedagogy, one example developed by the author(s) called “Drones and Environmental Science,” will be presented that uses a drone and related technology as an appropriate instructional delivery medium to apply Piaget’s cognitive theory to create environments that promote the integration of different STEM subjects that relate to environmental science. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=cogntive%20theories" title="cogntive theories">cogntive theories</a>, <a href="https://publications.waset.org/abstracts/search?q=drone" title=" drone"> drone</a>, <a href="https://publications.waset.org/abstracts/search?q=environmental%20science" title=" environmental science"> environmental science</a>, <a href="https://publications.waset.org/abstracts/search?q=pedagogy" title=" pedagogy"> pedagogy</a> </p> <a href="https://publications.waset.org/abstracts/30940/cognitive-theory-and-the-design-of-integrate-curriculum" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/30940.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">576</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7838</span> Agile Real-Time Field Programmable Gate Array-Based Image Processing System for Drone Imagery in Digital Agriculture</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sabiha%20Shahid%20Antora">Sabiha Shahid Antora</a>, <a href="https://publications.waset.org/abstracts/search?q=Young%20Ki%20Chang"> Young Ki Chang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Along with various farm management technologies, imagery is an important tool that facilitates crop assessment, monitoring, and management. As a consequence, drone imaging technology is playing a vital role to capture the state of the entire field for yield mapping, crop scouting, weed detection, and so on. Although it is essential to inspect the cultivable lands in real-time for making rapid decisions regarding field variable inputs to combat stresses and diseases, drone imagery is still evolving in this area of interest. Cost margin and post-processing complexions of the image stream are the main challenges of imaging technology. Therefore, this proposed project involves the cost-effective field programmable gate array (FPGA) based image processing device that would process the image stream in real-time as well as providing the processed output to support on-the-spot decisions in the crop field. As a result, the real-time FPGA-based image processing system would reduce operating costs while minimizing a few intermediate steps to deliver scalable field decisions. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=real-time" title="real-time">real-time</a>, <a href="https://publications.waset.org/abstracts/search?q=FPGA" title=" FPGA"> FPGA</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20imagery" title=" drone imagery"> drone imagery</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20processing" title=" image processing"> image processing</a>, <a href="https://publications.waset.org/abstracts/search?q=crop%20monitoring" title=" crop monitoring"> crop monitoring</a> </p> <a href="https://publications.waset.org/abstracts/132611/agile-real-time-field-programmable-gate-array-based-image-processing-system-for-drone-imagery-in-digital-agriculture" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/132611.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">114</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7837</span> An Approach to Autonomous Drones Using Deep Reinforcement Learning and Object Detection</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=K.%20R.%20Roopesh%20Bharatwaj">K. R. Roopesh Bharatwaj</a>, <a href="https://publications.waset.org/abstracts/search?q=Avinash%20Maharana"> Avinash Maharana</a>, <a href="https://publications.waset.org/abstracts/search?q=Favour%20Tobi%20Aborisade"> Favour Tobi Aborisade</a>, <a href="https://publications.waset.org/abstracts/search?q=Roger%20Young"> Roger Young</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Presently, there are few cases of complete automation of drones and its allied intelligence capabilities. In essence, the potential of the drone has not yet been fully utilized. This paper presents feasible methods to build an intelligent drone with smart capabilities such as self-driving, and obstacle avoidance. It does this through advanced Reinforcement Learning Techniques and performs object detection using latest advanced algorithms, which are capable of processing light weight models with fast training in real time instances. For the scope of this paper, after researching on the various algorithms and comparing them, we finally implemented the Deep-Q-Networks (DQN) algorithm in the AirSim Simulator. In future works, we plan to implement further advanced self-driving and object detection algorithms, we also plan to implement voice-based speech recognition for the entire drone operation which would provide an option of speech communication between users (People) and the drone in the time of unavoidable circumstances. Thus, making drones an interactive intelligent Robotic Voice Enabled Service Assistant. This proposed drone has a wide scope of usability and is applicable in scenarios such as Disaster management, Air Transport of essentials, Agriculture, Manufacturing, Monitoring people movements in public area, and Defense. Also discussed, is the entire drone communication based on the satellite broadband Internet technology for faster computation and seamless communication service for uninterrupted network during disasters and remote location operations. 
This paper will explain the feasible algorithms required to go about achieving this goal and is more of a reference paper for future researchers going down this path. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=convolution%20neural%20network" title="convolution neural network">convolution neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=natural%20language%20processing" title=" natural language processing"> natural language processing</a>, <a href="https://publications.waset.org/abstracts/search?q=obstacle%20avoidance" title=" obstacle avoidance"> obstacle avoidance</a>, <a href="https://publications.waset.org/abstracts/search?q=satellite%20broadband%20technology" title=" satellite broadband technology"> satellite broadband technology</a>, <a href="https://publications.waset.org/abstracts/search?q=self-driving" title=" self-driving"> self-driving</a> </p> <a href="https://publications.waset.org/abstracts/137145/an-approach-to-autonomous-drones-using-deep-reinforcement-learning-and-object-detection" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/137145.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">252</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7836</span> A Power Management System for Indoor Micro-Drones in GPS-Denied Environments</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yendo%20Hu">Yendo Hu</a>, <a href="https://publications.waset.org/abstracts/search?q=Xu-Yu%20Wu"> Xu-Yu Wu</a>, <a href="https://publications.waset.org/abstracts/search?q=Dylan%20Oh"> Dylan Oh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> GPS-Denied drones open the possibility of indoor applications, including dynamic arial surveillance, inspection, safety enforcement, and discovery. Indoor swarming further enhances these applications in accuracy, robustness, operational time, and coverage. For micro-drones, power management becomes a critical issue, given the battery payload restriction. This paper proposes an application enabling battery replacement solution that extends the micro-drone active phase without human intervention. First, a framework to quantify the effectiveness of a power management solution for a drone fleet is proposed. The operation-to-non-operation ratio, ONR, gives one a quantitative benchmark to measure the effectiveness of a power management solution. Second, a survey was carried out to evaluate the ONR performance for the various solutions. Third, through analysis, this paper proposes a solution tailored to the indoor micro-drone, suitable for swarming applications. The proposed automated battery replacement solution, along with a modified micro-drone architecture, was implemented along with the associated micro-drone. Fourth, the system was tested and compared with the various solutions within the industry. Results show that the proposed solution achieves an ONR value of 31, which is a 1-fold improvement of the best alternative option. The cost analysis shows a manufacturing cost of $25, which makes this approach viable for cost-sensitive markets (e.g., consumer). 
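Assuming the ONR is computed as operational (flight) time divided by non-operational (turnaround) time per cycle, a back-of-the-envelope sketch shows how automated battery swapping could reach the regime reported above. All numbers are illustrative assumptions, not measurements from the paper.

```python
def onr(flight_time_s: float, turnaround_time_s: float) -> float:
    """Operation-to-non-operation ratio: time spent flying per unit of downtime."""
    return flight_time_s / turnaround_time_s

# Assumed numbers: a 15-minute flight with a ~29 s automated battery swap,
# versus a 45-minute recharge on a conventional charging pad.
print(onr(900, 29))     # ~31, the regime reported in the abstract
print(onr(900, 2700))   # ~0.33 for recharge-based operation
```
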
7835. The Effects of Cross-Border Use of Drones in Nigerian National Security
Authors: H. P. Kerry
Abstract: Drone technology has become a significant element of national security discourse. While this technology can pose a danger to national security, it is also used in developed and developing countries for border security and, in some cases, for the protection of security agents and migrants. In the case of Nigeria, drones are used by the military to monitor and tighten security around the borders. However, terrorist groups have devised means to utilize the technology to their advantage, so the potential danger posed by the widespread proliferation of this technology has become a myriad of risks. This research on the effects of cross-border use of drones in Nigerian national security examines both the negative and positive consequences of using drone technology. The study employs interviews and relevant documents to obtain data, and applies just war theory to explain why countries use force, further buttressing these points with the realist perspective on the use of force. In conclusion, the paper recommends that the Nigerian government, through the National Assembly, pass a bill establishing a law to guide the use of armed and unarmed drones in Nigeria, enforced by the Nigeria Civil Aviation Authority and the office of the National Security Adviser.
Keywords: armed drones, drones, cross-border, national security
Procedia: https://publications.waset.org/abstracts/127425/the-effects-of-cross-border-use-of-drones-in-nigerian-national-security | PDF: https://publications.waset.org/abstracts/127425.pdf | Downloads: 158

<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=armed%20drones" title="armed drones">armed drones</a>, <a href="https://publications.waset.org/abstracts/search?q=drones" title=" drones"> drones</a>, <a href="https://publications.waset.org/abstracts/search?q=cross-border" title=" cross-border"> cross-border</a>, <a href="https://publications.waset.org/abstracts/search?q=national%20security" title=" national security"> national security</a> </p> <a href="https://publications.waset.org/abstracts/127425/the-effects-of-cross-border-use-of-drones-in-nigerian-national-security" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/127425.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">158</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7834</span> Should the U.S. Rely on Drone Strikes to Combat the Islamic State? Why Deploying a Drone Campaign against ISIS Will Do Nothing to Address the Causes of the Insurgency or Prevent Its Resurgence?</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Danielle%20Jablanski">Danielle Jablanski</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This article addresses the use of drone strikes under international law and the intersection between Islamic law and current terrorist trends worldwide. It breaks down the legality of drone strikes under international law and dissects certain aspects of their usage in modern warfare; i.e. concepts of directly participating in hostilities and the role of CIA operators. The article then looks at international paradigms of law enforcement versus the use of military force in relation to terrorism. Lastly, it describes traditional aspects of Islamic law and several interpretations of the law today as applied to widespread campaigns of terrorism, namely that of the recent group ISIS or ISIL operating between the battlegrounds of Iraq and Syria. The piece concludes with appraisals for moving forward on the basis of honing in on reasons for terrorism and negative opinions of solely military campaigns to dismantle or disrupt terror organizations and breeding grounds. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=international%20law" title="international law">international law</a>, <a href="https://publications.waset.org/abstracts/search?q=terrorism" title=" terrorism"> terrorism</a>, <a href="https://publications.waset.org/abstracts/search?q=ISIS" title=" ISIS"> ISIS</a>, <a href="https://publications.waset.org/abstracts/search?q=islamic%20law" title=" islamic law"> islamic law</a> </p> <a href="https://publications.waset.org/abstracts/24847/should-the-us-rely-on-drone-strikes-to-combat-the-islamic-state-why-deploying-a-drone-campaign-against-isis-will-do-nothing-to-address-the-causes-of-the-insurgency-or-prevent-its-resurgence" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/24847.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">476</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7833</span> Proposal of Non-Destructive Inspection Function Based on Internet of Things Technology Using Drone</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Byoungjoon%20Yu">Byoungjoon Yu</a>, <a href="https://publications.waset.org/abstracts/search?q=Jihwan%20Park"> Jihwan Park</a>, <a href="https://publications.waset.org/abstracts/search?q=Sujung%20Sin"> Sujung Sin</a>, <a href="https://publications.waset.org/abstracts/search?q=Junghyun%20Im"> Junghyun Im</a>, <a href="https://publications.waset.org/abstracts/search?q=Minsoo%20Park"> Minsoo Park</a>, <a href="https://publications.waset.org/abstracts/search?q=Sehwan%20Park"> Sehwan Park</a>, <a href="https://publications.waset.org/abstracts/search?q=Seunghee%20Park"> Seunghee Park</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, we propose a technology to monitor the soundness of an Internet-based bridge using a non-conductive inspection function. There has been a collapse accident due to the aging of the bridge structure, and it is necessary to prepare for the deterioration of the bridge. The NDT/SHM system for maintenance of existing bridge structures requires a large number of inspection personnel and expensive inspection costs, and access of expensive and large equipment to measurement points is required. Because current drone inspection equipment can only be inspected through camera, it is difficult to inspect inside damage accurately, and the results of an internal damage evaluation are subjective, and it is difficult for non-specialists to recognize the evaluation results. Therefore, it is necessary to develop NDT/SHM techniques for maintenance of new-concept bridge structures that allow for free movement and real-time evaluation of measurement results. This work is financially supported by Korea Ministry of Land, Infrastructure, and Transport (MOLIT) as 'Smart City Master and Doctor Course Grant Program' and a grant (14SCIP-B088624-01) from Construction Technology Research Program funded by Ministry of Land, Infrastructure and Transport of Korean government. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Structural%20Health%20Monitoring" title="Structural Health Monitoring">Structural Health Monitoring</a>, <a href="https://publications.waset.org/abstracts/search?q=SHM" title=" SHM"> SHM</a>, <a href="https://publications.waset.org/abstracts/search?q=non-contact%20sensing" title=" non-contact sensing"> non-contact sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=nondestructive%20testing" title=" nondestructive testing"> nondestructive testing</a>, <a href="https://publications.waset.org/abstracts/search?q=NDT" title=" NDT"> NDT</a>, <a href="https://publications.waset.org/abstracts/search?q=Internet%20of%20Things" title=" Internet of Things"> Internet of Things</a>, <a href="https://publications.waset.org/abstracts/search?q=autonomous%20self-driving%20drone" title=" autonomous self-driving drone"> autonomous self-driving drone</a> </p> <a href="https://publications.waset.org/abstracts/92770/proposal-of-non-destructive-inspection-function-based-on-internet-of-things-technology-using-drone" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/92770.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">268</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7832</span> Drone Swarm Routing and Scheduling for Off-shore Wind Turbine Blades Inspection</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mohanad%20Al-Behadili">Mohanad Al-Behadili</a>, <a href="https://publications.waset.org/abstracts/search?q=Xiang%20Song"> Xiang Song</a>, <a href="https://publications.waset.org/abstracts/search?q=Djamila%20Ouelhadj"> Djamila Ouelhadj</a>, <a href="https://publications.waset.org/abstracts/search?q=Alex%20Fraess-Ehrfeld"> Alex Fraess-Ehrfeld</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In off-shore wind farms, turbine blade inspection accessibility under various sea states is very challenging and greatly affects the downtime of wind turbines. Maintenance of any offshore system is not an easy task due to the restricted logistics and accessibility. The multirotor unmanned helicopter is of increasing interest in inspection applications due to its manoeuvrability and payload capacity. These advantages increase when many of them are deployed simultaneously in a swarm. Hence this paper proposes a drone swarm framework for inspecting offshore wind turbine blades and nacelles so as to reduce downtime. One of the big challenges of this task is that when operating a drone swarm, an individual drone may not have enough power to fly and communicate during missions and it has no capability of refueling due to its small size. Once the drone power is drained, there are no signals transmitted and the links become intermittent. Vessels equipped with 5G masts and small power units are utilised as platforms for drones to recharge/swap batteries. The research work aims at designing a smart energy management system, which provides automated vessel and drone routing and recharging plans. 
7831. Development of a Fire Analysis Drone for Smoke Toxicity Measurement for Fire Prediction and Management
Authors: Gabrielle Peck, Ryan Hayes
Abstract: This research presents the design and creation of a drone gas analyser aimed at addressing the need for independent data collection and analysis of gas emissions during large-scale fires, particularly wasteland fires. The analyser drone, comprising a lightweight gas analysis system attached to a remote-controlled drone, enables real-time assessment of smoke toxicity and of the gases released into the atmosphere during such incidents. The key components of the analyser unit include two gas line inlets connected to glass wool filters, a pump whose flow is regulated by a mass flow controller, and electrochemical cells for detecting nitrogen oxides, hydrogen cyanide, and oxygen levels. In addition, a non-dispersive infrared (NDIR) analyser is employed to monitor carbon monoxide (CO), carbon dioxide (CO₂), and hydrocarbon concentrations. Thermocouples can be attached to the analyser to monitor temperature, along with McCaffrey probes combined with pressure transducers to monitor air velocity and wind direction; these additions allow the larger fire to be monitored and can be used for predictions of fire spread. The system not only provides crucial data for assessing smoke toxicity but also contributes to fire prediction and management. The remote-controlled drone's mobility allows for safe and efficient data collection in proximity to the fire source, reducing the need for human exposure to hazardous conditions. The data obtained from the gas analyser unit facilitate informed decision-making by emergency responders, aiding the protection of both human health and the environment. This abstract highlights the successful development of a drone gas analyser and illustrates its potential for enhancing smoke toxicity analysis and fire prediction capabilities. The integration of this technology into fire management strategies offers a promising solution to the challenges associated with wildfires and other large-scale fire incidents. The project's methodology and results contribute to the growing body of knowledge in the field of environmental monitoring and safety, emphasizing the practical utility of drones for critical applications.
Keywords: fire prediction, drone, smoke toxicity, analyser, fire management
Procedia: https://publications.waset.org/abstracts/174836/development-of-a-fire-analysis-drone-for-smoke-toxicity-measurement-for-fire-prediction-and-management | PDF: https://publications.waset.org/abstracts/174836.pdf | Downloads: 90

Thermocouples can be attached to the analyser to monitor temperature, as well as McCaffrey probes combined with pressure transducers to monitor air velocity and wind direction. These additions allow for monitoring of the large fire and can be used for predictions of fire spread. The innovative system not only provides crucial data for assessing smoke toxicity but also contributes to fire prediction and management. The remote-controlled drone's mobility allows for safe and efficient data collection in proximity to the fire source, reducing the need for human exposure to hazardous conditions. The data obtained from the gas analyser unit facilitates informed decision-making by emergency responders, aiding in the protection of both human health and the environment. This abstract highlights the successful development of a drone gas analyser, illustrating its potential for enhancing smoke toxicity analysis and fire prediction capabilities. The integration of this technology into fire management strategies offers a promising solution for addressing the challenges associated with wildfires and other large-scale fire incidents. The project's methodology and results contribute to the growing body of knowledge in the field of environmental monitoring and safety, emphasizing the practical utility of drones for critical applications. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=fire%20prediction" title="fire prediction">fire prediction</a>, <a href="https://publications.waset.org/abstracts/search?q=drone" title=" drone"> drone</a>, <a href="https://publications.waset.org/abstracts/search?q=smoke%20toxicity" title=" smoke toxicity"> smoke toxicity</a>, <a href="https://publications.waset.org/abstracts/search?q=analyser" title=" analyser"> analyser</a>, <a href="https://publications.waset.org/abstracts/search?q=fire%20management" title=" fire management"> fire management</a> </p> <a href="https://publications.waset.org/abstracts/174836/development-of-a-fire-analysis-drone-for-smoke-toxicity-measurement-for-fire-prediction-and-management" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/174836.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">90</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7830</span> Study on Acoustic Source Detection Performance Improvement of Microphone Array Installed on Drones Using Blind Source Separation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Youngsun%20Moon">Youngsun Moon</a>, <a href="https://publications.waset.org/abstracts/search?q=Yeong-Ju%20Go"> Yeong-Ju Go</a>, <a href="https://publications.waset.org/abstracts/search?q=Jong-Soo%20Choi"> Jong-Soo Choi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Most drones that currently have surveillance/reconnaissance missions are basically equipped with optical equipment, but we also need to use a microphone array to estimate the location of the acoustic source. This can provide additional information in the absence of optical equipment. The purpose of this study is to estimate Direction of Arrival (DOA) based on Time Difference of Arrival (TDOA) estimation of the acoustic source in the drone. 
The problem is that it is impossible to measure the clear target acoustic source because of the drone noise. To overcome this problem is to separate the drone noise and the target acoustic source using Blind Source Separation(BSS) based on Independent Component Analysis(ICA). ICA can be performed assuming that the drone noise and target acoustic source are independent and each signal has non-gaussianity. For maximized non-gaussianity each signal, we use Negentropy and Kurtosis based on probability theory. As a result, we can improve TDOA estimation and DOA estimation of the target source in the noisy environment. We simulated the performance of the DOA algorithm applying BSS algorithm, and demonstrated the simulation through experiment at the anechoic wind tunnel. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=aeroacoustics" title="aeroacoustics">aeroacoustics</a>, <a href="https://publications.waset.org/abstracts/search?q=acoustic%20source%20detection" title=" acoustic source detection"> acoustic source detection</a>, <a href="https://publications.waset.org/abstracts/search?q=time%20difference%20of%20arrival" title=" time difference of arrival"> time difference of arrival</a>, <a href="https://publications.waset.org/abstracts/search?q=direction%20of%20arrival" title=" direction of arrival"> direction of arrival</a>, <a href="https://publications.waset.org/abstracts/search?q=blind%20source%20separation" title=" blind source separation"> blind source separation</a>, <a href="https://publications.waset.org/abstracts/search?q=independent%20component%20analysis" title=" independent component analysis"> independent component analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=drone" title=" drone"> drone</a> </p> <a href="https://publications.waset.org/abstracts/94236/study-on-acoustic-source-detection-performance-improvement-of-microphone-array-installed-on-drones-using-blind-source-separation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/94236.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">164</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7829</span> Embedded Electrochemistry with Miniaturized, Drone-Based, Potentiostat System for Remote Detection Chemical Warfare Agents</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Amer%20Dawoud">Amer Dawoud</a>, <a href="https://publications.waset.org/abstracts/search?q=Jesy%20Motchaalangaram"> Jesy Motchaalangaram</a>, <a href="https://publications.waset.org/abstracts/search?q=Arati%20Biswakarma"> Arati Biswakarma</a>, <a href="https://publications.waset.org/abstracts/search?q=Wujan%20Mio"> Wujan Mio</a>, <a href="https://publications.waset.org/abstracts/search?q=Karl%20Wallace"> Karl Wallace</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The development of an embedded miniaturized drone-based system for remote detection of Chemical Warfare Agents (CWA) is proposed. The paper focuses on the software/hardware system design of the electrochemical Cyclic Voltammetry (CV) and Differential Pulse Voltammetry (DPV) signal processing for future deployment on drones. 
The paper summarizes the progress made towards hardware and electrochemical signal processing for signature detection of CWA. Also, the miniature potentiostat signal is validated by comparing it with the high-end lab potentiostat signal. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone-based" title="drone-based">drone-based</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20detection%20chemical%20warfare%20agents" title=" remote detection chemical warfare agents"> remote detection chemical warfare agents</a>, <a href="https://publications.waset.org/abstracts/search?q=miniaturized" title=" miniaturized"> miniaturized</a>, <a href="https://publications.waset.org/abstracts/search?q=potentiostat" title=" potentiostat"> potentiostat</a> </p> <a href="https://publications.waset.org/abstracts/145007/embedded-electrochemistry-with-miniaturized-drone-based-potentiostat-system-for-remote-detection-chemical-warfare-agents" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/145007.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">136</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7828</span> Feasibility of Using Bike Lanes in Conjunctions with Sidewalks for Ground Drone Applications in Last Mile Delivery for Dense Urban Areas</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=N.%20Bazyar%20Shourabi">N. Bazyar Shourabi</a>, <a href="https://publications.waset.org/abstracts/search?q=K.%20Nyarko"> K. Nyarko</a>, <a href="https://publications.waset.org/abstracts/search?q=C.%20Scott"> C. Scott</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Jeihnai"> M. Jeihnai</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Ground drones have the potential to reduce the cost and time of making last-mile deliveries. They also have the potential to make a huge impact on human life. Despite this potential, little work has gone into developing a suitable feasibility model for ground drone delivery in dense urban areas. Today, most of the experimental ground delivery drones utilize sidewalks only, with just a few of them starting to use bike lanes, which a significant portion of some urban areas have. This study works on the feasibility of using bike lanes in conjunction with sidewalks for ground drone applications in last-mile delivery for dense urban areas. This work begins with surveying bike lanes and sidewalks within the city of Boston using Geographic Information System (GIS) software to determine the percentage of coverage currently available within the city. Then six scenarios are examined. Based on this research, a mathematical model is developed. The daily cost of delivering packages using each scenario is calculated by the mathematical model. Comparing the drone delivery scenarios with the traditional method of package delivery using trucks will provide essential information concerning the feasibility of implementing routing protocols that combine the use of sidewalks and bike lanes. The preliminary results of the model show that ground drones that can travel via sidewalks or bike lanes have the potential to significantly reduce delivery cost. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ground%20drone" title="ground drone">ground drone</a>, <a href="https://publications.waset.org/abstracts/search?q=intelligent%20transportation%20system" title=" intelligent transportation system"> intelligent transportation system</a>, <a href="https://publications.waset.org/abstracts/search?q=last-mile%20delivery" title=" last-mile delivery"> last-mile delivery</a>, <a href="https://publications.waset.org/abstracts/search?q=sidewalk%20robot" title=" sidewalk robot"> sidewalk robot</a> </p> <a href="https://publications.waset.org/abstracts/116913/feasibility-of-using-bike-lanes-in-conjunctions-with-sidewalks-for-ground-drone-applications-in-last-mile-delivery-for-dense-urban-areas" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/116913.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">147</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7827</span> A System Architecture for Hand Gesture Control of Robotic Technology: A Case Study Using a Myo™ Arm Band, DJI Spark™ Drone, and a Staubli™ Robotic Manipulator</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sebastian%20van%20Delden">Sebastian van Delden</a>, <a href="https://publications.waset.org/abstracts/search?q=Matthew%20Anuszkiewicz"> Matthew Anuszkiewicz</a>, <a href="https://publications.waset.org/abstracts/search?q=Jayse%20White"> Jayse White</a>, <a href="https://publications.waset.org/abstracts/search?q=Scott%20Stolarski"> Scott Stolarski</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Industrial robotic manipulators have been commonplace in the manufacturing world since the early 1960s, and unmanned aerial vehicles (drones) have only begun to realize their full potential in the service industry and the military. The omnipresence of these technologies in their respective fields will only become more potent in coming years. While these technologies have greatly evolved over the years, the typical approach to human interaction with these robots has not. In the industrial robotics realm, a manipulator is typically jogged around using a teach pendant and programmed using a networked computer or the teach pendant itself via a proprietary software development platform. Drones are typically controlled using a two-handed controller equipped with throttles, buttons, and sticks, an app that can be downloaded to one’s mobile device, or a combination of both. This application-oriented work offers a novel approach to human interaction with both unmanned aerial vehicles and industrial robotic manipulators via hand gestures and movements. Two systems have been implemented, both of which use a Myo™ armband to control either a drone (DJI Spark™) or a robotic arm (Stäubli™ TX40). The methodologies developed by this work present a mapping of armband gestures (fist, finger spread, swing hand in, swing hand out, swing arm left/up/down/right, etc.) to either drone or robot arm movements. The findings of this study present the efficacy and limitations (precision and ergonomic) of hand gesture control of two distinct types of robotic technology. 
All source code associated with this project will be open sourced and placed on GitHub. In conclusion, this study offers a framework that maps hand and arm gestures to drone and robot arm control. The system has been implemented using current ubiquitous technologies, and these software artifacts will be open sourced for future researchers or practitioners to use in their work. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human%20robot%20interaction" title="human robot interaction">human robot interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=drones" title=" drones"> drones</a>, <a href="https://publications.waset.org/abstracts/search?q=gestures" title=" gestures"> gestures</a>, <a href="https://publications.waset.org/abstracts/search?q=robotics" title=" robotics"> robotics</a> </p> <a href="https://publications.waset.org/abstracts/93270/a-system-architecture-for-hand-gesture-control-of-robotic-technology-a-case-study-using-a-myo-arm-band-dji-spark-drone-and-a-staubli-robotic-manipulator" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/93270.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">161</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7826</span> Non-intrusive Hand Control of Drone Using an Inexpensive and Streamlined Convolutional Neural Network Approach</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Evan%20Lowhorn">Evan Lowhorn</a>, <a href="https://publications.waset.org/abstracts/search?q=Rocio%20Alba-Flores"> Rocio Alba-Flores</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The purpose of this work is to develop a method for classifying hand signals and using the output in a drone control algorithm. To achieve this, methods based on Convolutional Neural Networks (CNN) were applied. CNN's are a subset of deep learning, which allows grid-like inputs to be processed and passed through a neural network to be trained for classification. This type of neural network allows for classification via imaging, which is less intrusive than previous methods using biosensors, such as EMG sensors. Classification CNN's operate purely from the pixel values in an image; therefore they can be used without additional exteroceptive sensors. A development bench was constructed using a desktop computer connected to a high-definition webcam mounted on a scissor arm. This allowed the camera to be pointed downwards at the desk to provide a constant solid background for the dataset and a clear detection area for the user. A MATLAB script was created to automate dataset image capture at the development bench and save the images to the desktop. This allowed the user to create their own dataset of 12,000 images within three hours. These images were evenly distributed among seven classes. The defined classes include forward, backward, left, right, idle, and land. The drone has a popular flip function which was also included as an additional class. To simplify control, the corresponding hand signals chosen were the numerical hand signs for one through five for movements, a fist for land, and the universal “ok” sign for the flip command. 
Transfer learning with PyTorch (Python) was performed using a pre-trained 18-layer residual learning network (ResNet-18) to retrain the network for custom classification. An algorithm was created to interpret the classification and send encoded messages to a Ryze Tello drone over its 2.4 GHz Wi-Fi connection. The drone’s movements were performed in half-meter distance increments at a constant speed. When combined with the drone control algorithm, the classification performed as desired with negligible latency when compared to the delay in the drone’s movement commands. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=classification" title="classification">classification</a>, <a href="https://publications.waset.org/abstracts/search?q=computer%20vision" title=" computer vision"> computer vision</a>, <a href="https://publications.waset.org/abstracts/search?q=convolutional%20neural%20networks" title=" convolutional neural networks"> convolutional neural networks</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20control" title=" drone control"> drone control</a> </p> <a href="https://publications.waset.org/abstracts/139743/non-intrusive-hand-control-of-drone-using-an-inexpensive-and-streamlined-convolutional-neural-network-approach" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/139743.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">212</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7825</span> Multiperson Drone Control with Seamless Pilot Switching Using Onboard Camera and Openpose Real-Time Keypoint Detection</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Evan%20Lowhorn">Evan Lowhorn</a>, <a href="https://publications.waset.org/abstracts/search?q=Rocio%20Alba-Flores"> Rocio Alba-Flores</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Traditional classification Convolutional Neural Networks (CNN) attempt to classify an image in its entirety. This becomes problematic when trying to perform classification with a drone’s camera in real-time due to unpredictable backgrounds. Object detectors with bounding boxes can be used to isolate individuals and other items, but the original backgrounds remain within these boxes. These basic detectors have been regularly used to determine what type of object an item is, such as “person” or “dog.” Recent advancement in computer vision, particularly with human imaging, is keypoint detection. Human keypoint detection goes beyond bounding boxes to fully isolate humans and plot points, or Regions of Interest (ROI), on their bodies within an image. ROIs can include shoulders, elbows, knees, heads, etc. These points can then be related to each other and used in deep learning methods such as pose estimation. For drone control based on human motions, poses, or signals using the onboard camera, it is important to have a simple method for pilot identification among multiple individuals while also giving the pilot fine control options for the drone. To achieve this, the OpenPose keypoint detection network was used with body and hand keypoint detection enabled. 
OpenPose supports the ability to combine multiple keypoint detection methods in real-time with a single network. Body keypoint detection allows simple poses to act as the pilot identifier. The hand keypoint detection with ROIs for each finger can then offer a greater variety of signal options for the pilot once identified. For this work, the individual must raise their non-control arm to be identified as the operator and send commands with the hand on their other arm. The drone ignores all other individuals in the onboard camera feed until the current operator lowers their non-control arm. When another individual wish to operate the drone, they simply raise their arm once the current operator relinquishes control, and then they can begin controlling the drone with their other hand. This is all performed mid-flight with no landing or script editing required. When using a desktop with a discrete NVIDIA GPU, the drone’s 2.4 GHz Wi-Fi connection combined with OpenPose restrictions to only body and hand allows this control method to perform as intended while maintaining the responsiveness required for practical use. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=computer%20vision" title="computer vision">computer vision</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20control" title=" drone control"> drone control</a>, <a href="https://publications.waset.org/abstracts/search?q=keypoint%20detection" title=" keypoint detection"> keypoint detection</a>, <a href="https://publications.waset.org/abstracts/search?q=openpose" title=" openpose"> openpose</a> </p> <a href="https://publications.waset.org/abstracts/139752/multiperson-drone-control-with-seamless-pilot-switching-using-onboard-camera-and-openpose-real-time-keypoint-detection" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/139752.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">185</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7824</span> Seawater Changes&#039; Estimation at Tidal Flat in Korean Peninsula Using Drone Stereo Images</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hyoseong%20Lee">Hyoseong Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=Duk-jin%20Kim"> Duk-jin Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=Jaehong%20Oh"> Jaehong Oh</a>, <a href="https://publications.waset.org/abstracts/search?q=Jungil%20Shin"> Jungil Shin</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Tidal flat in Korean peninsula is one of the largest biodiversity tidal flats in the world. Therefore, digital elevation models (DEM) is continuously demanded to monitor of the tidal flat. In this study, DEM of tidal flat, according to different times, was produced by means of the Drone and commercial software in order to measure seawater change during high tide at water-channel in tidal flat. To correct the produced DEMs of the tidal flat where is inaccessible to collect control points, the DEM matching method was applied by using the reference DEM instead of the survey. After the ortho-image was made from the corrected DEM, the land cover classified image was produced. 
The changes of seawater amount according to the times were analyzed by using the classified images and DEMs. As a result, it was confirmed that the amount of water rapidly increased as the time passed during high tide. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=tidal%20flat" title="tidal flat">tidal flat</a>, <a href="https://publications.waset.org/abstracts/search?q=drone" title=" drone"> drone</a>, <a href="https://publications.waset.org/abstracts/search?q=DEM" title=" DEM"> DEM</a>, <a href="https://publications.waset.org/abstracts/search?q=seawater%20change" title=" seawater change"> seawater change</a> </p> <a href="https://publications.waset.org/abstracts/83545/seawater-changes-estimation-at-tidal-flat-in-korean-peninsula-using-drone-stereo-images" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/83545.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">204</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7823</span> Delivery of Contraceptive and Maternal Health Commodities with Drones in the Most Remote Areas of Madagascar</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Josiane%20Yaguibou">Josiane Yaguibou</a>, <a href="https://publications.waset.org/abstracts/search?q=Ngoy%20Kishimba"> Ngoy Kishimba</a>, <a href="https://publications.waset.org/abstracts/search?q=Issiaka%20V.%20Coulibaly"> Issiaka V. Coulibaly</a>, <a href="https://publications.waset.org/abstracts/search?q=Sabrina%20Pestilli"> Sabrina Pestilli</a>, <a href="https://publications.waset.org/abstracts/search?q=Falinirina%20Razanalison"> Falinirina Razanalison</a>, <a href="https://publications.waset.org/abstracts/search?q=Hantanirina%20Andremanisa"> Hantanirina Andremanisa</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Background: Madagascar has one of the least developed road networks in the world with a majority of its national and local roads being earth roads and in poor condition. In addition, the country is affected by frequent natural disasters that further affect the road conditions limiting the accessibility to some parts of the country. In 2021 and 2022, 2.21 million people were affected by drought in the Grand Sud region, and by cyclones and floods in the coastal regions, with disruptions of the health system including last mile distribution of lifesaving maternal health commodities and reproductive health commodities in the health facilities. Program intervention: The intervention uses drone technology to deliver maternal health and family planning commodities in hard-to-reach health facilities in the Grand Sud and Sud-Est of Madagascar, the regions more affected by natural disasters. Methodology The intervention was developed in two phases. A first phase, conducted in the Grand Sud, used drones leased from a private company to deliver commodities in isolated health facilities. Based on the lesson learnt and encouraging results of the first phase, in the second phase (2023) the intervention has been extended to the Sud Est regions with the purchase of drones and the recruitment of pilots to reduce costs and ensure sustainability. 
Key findings: The drones ensure deliveries of lifesaving commodities in the Grand Sud of Madagascar. In 2023, 297 deliveries in commodities in forty hard-to-reach health facilities have been carried out. Drone technology reduced delivery times from the usual 3 - 7 days necessary by road or boat to only a few hours. Program Implications: The use of innovative drone technology demonstrated to be successful in the Madagascar context to reduce dramatically the distribution time of commodities in hard-to-reach health facilities and avoid stockouts of life-saving medicines. When the intervention reaches full scale with the completion of the second phase and the extension in the Sud-Est, 150 hard-to-reach facilities will receive drone deliveries, avoiding stockouts and improving the quality of maternal health and family planning services offered to 1,4 million people in targeted areas. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=commodities" title="commodities">commodities</a>, <a href="https://publications.waset.org/abstracts/search?q=drones" title=" drones"> drones</a>, <a href="https://publications.waset.org/abstracts/search?q=last-mile%20distribution" title=" last-mile distribution"> last-mile distribution</a>, <a href="https://publications.waset.org/abstracts/search?q=lifesaving%20supplies" title=" lifesaving supplies"> lifesaving supplies</a> </p> <a href="https://publications.waset.org/abstracts/174846/delivery-of-contraceptive-and-maternal-health-commodities-with-drones-in-the-most-remote-areas-of-madagascar" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/174846.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">66</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7822</span> Controlling Drone Flight Missions through Natural Language Processors Using Artificial Intelligence</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sylvester%20Akpah">Sylvester Akpah</a>, <a href="https://publications.waset.org/abstracts/search?q=Selasi%20Vondee"> Selasi Vondee</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Unmanned Aerial Vehicles (UAV) as they are also known, drones have attracted increasing attention in recent years due to their ubiquitous nature and boundless applications in the areas of communication, surveying, aerial photography, weather forecasting, medical delivery, surveillance amongst others. Operated remotely in real-time or pre-programmed, drones can fly autonomously or on pre-defined routes. The application of these aerial vehicles has successfully penetrated the world due to technological evolution, thus a lot more businesses are utilizing their capabilities. Unfortunately, while drones are replete with the benefits stated supra, they are riddled with some problems, mainly attributed to the complexities in learning how to master drone flights, collision avoidance and enterprise security. Additional challenges, such as the analysis of flight data recorded by sensors attached to the drone may take time and require expert help to analyse and understand. This paper presents an autonomous drone control system using a chatbot. 
The system allows for easy control of drones using conversations with the aid of Natural Language Processing, thus to reduce the workload needed to set up, deploy, control, and monitor drone flight missions. The results obtained at the end of the study revealed that the drone connected to the chatbot was able to initiate flight missions with just text and voice commands, enable conversation and give real-time feedback from data and requests made to the chatbot. The results further revealed that the system was able to process natural language and produced human-like conversational abilities using Artificial Intelligence (Natural Language Understanding). It is recommended that radio signal adapters be used instead of wireless connections thus to increase the range of communication with the aerial vehicle. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=artificial%20ntelligence" title="artificial ntelligence">artificial ntelligence</a>, <a href="https://publications.waset.org/abstracts/search?q=chatbot" title=" chatbot"> chatbot</a>, <a href="https://publications.waset.org/abstracts/search?q=natural%20language%20processing" title=" natural language processing"> natural language processing</a>, <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20vehicle" title=" unmanned aerial vehicle"> unmanned aerial vehicle</a> </p> <a href="https://publications.waset.org/abstracts/116870/controlling-drone-flight-missions-through-natural-language-processors-using-artificial-intelligence" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/116870.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">143</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7821</span> Development of an Indoor Drone Designed for the Needs of the Creative Industries</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=V.%20Santamarina%20Campos">V. Santamarina Campos</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20de%20Miguel%20Molina"> M. de Miguel Molina</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20Kr%C3%B6ner"> S. Kröner</a>, <a href="https://publications.waset.org/abstracts/search?q=B.%20de%20Miguel%20Molina"> B. de Miguel Molina</a> </p> <p class="card-text"><strong>Abstract:</strong></p> With this contribution, we want to show how the AiRT system could change the future way of working of a part of the creative industry and what new economic opportunities could arise for them. Remotely Piloted Aircraft Systems (RPAS), also more commonly known as drones, are now essential tools used by many different companies for their creative outdoor work. However, using this very flexible applicable tool indoor is almost impossible, since safe navigation cannot be guaranteed by the operator due to the lack of a reliable and affordable indoor positioning system which ensures a stable flight, among other issues. Here we present our first results of a European project, which consists of developing an indoor drone for professional footage especially designed for the creative industries. 
One of the main achievements of this project is the successful implication of the end-users in the overall design process from the very beginning. To ensure safe flight in confined spaces, our drone incorporates a positioning system based on ultra-wide band technology, an RGB-D (depth) camera for 3D environment reconstruction and the possibility to fully pre-program automatic flights. Since we also want to offer this tool for inexperienced pilots, we have always focused on user-friendly handling of the whole system throughout the entire process. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=virtual%20reality" title="virtual reality">virtual reality</a>, <a href="https://publications.waset.org/abstracts/search?q=3D%20reconstruction" title=" 3D reconstruction"> 3D reconstruction</a>, <a href="https://publications.waset.org/abstracts/search?q=indoor%20positioning%20system" title=" indoor positioning system"> indoor positioning system</a>, <a href="https://publications.waset.org/abstracts/search?q=RPAS" title=" RPAS"> RPAS</a>, <a href="https://publications.waset.org/abstracts/search?q=remotely%20piloted%20aircraft%20systems" title=" remotely piloted aircraft systems"> remotely piloted aircraft systems</a>, <a href="https://publications.waset.org/abstracts/search?q=aerial%20film" title=" aerial film"> aerial film</a>, <a href="https://publications.waset.org/abstracts/search?q=intelligent%20navigation" title=" intelligent navigation"> intelligent navigation</a>, <a href="https://publications.waset.org/abstracts/search?q=advanced%20safety%20measures" title=" advanced safety measures"> advanced safety measures</a>, <a href="https://publications.waset.org/abstracts/search?q=creative%20industries" title=" creative industries"> creative industries</a> </p> <a href="https://publications.waset.org/abstracts/90557/development-of-an-indoor-drone-designed-for-the-needs-of-the-creative-industries" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/90557.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">198</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7820</span> DQN for Navigation in Gazebo Simulator</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Xabier%20Olaz%20Moratinos">Xabier Olaz Moratinos</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Drone navigation is critical, particularly during the initial phases, such as the initial ascension, where pilots may fail due to strong external interferences that could potentially lead to a crash. In this ongoing work, a drone has been successfully trained to perform an ascent of up to 6 meters at speeds with external disturbances pushing it up to 24 mph, with the DQN algorithm managing external forces affecting the system. It has been demonstrated that the system can control its height, position, and stability in all three axes (roll, pitch, and yaw) throughout the process. The learning process is carried out in the Gazebo simulator, which emulates interferences, while ROS is used to communicate with the agent. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title="machine learning">machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=DQN" title=" DQN"> DQN</a>, <a href="https://publications.waset.org/abstracts/search?q=gazebo" title=" gazebo"> gazebo</a>, <a href="https://publications.waset.org/abstracts/search?q=navigation" title=" navigation"> navigation</a> </p> <a href="https://publications.waset.org/abstracts/165698/dqn-for-navigation-in-gazebo-simulator" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/165698.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">114</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7819</span> Deep Q-Network for Navigation in Gazebo Simulator</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Xabier%20Olaz%20Moratinos">Xabier Olaz Moratinos</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Drone navigation is critical, particularly during the initial phases, such as the initial ascension, where pilots may fail due to strong external interferences that could potentially lead to a crash. In this ongoing work, a drone has been successfully trained to perform an ascent of up to 6 meters at speeds with external disturbances pushing it up to 24 mph, with the DQN algorithm managing external forces affecting the system. It has been demonstrated that the system can control its height, position, and stability in all three axes (roll, pitch, and yaw) throughout the process. The learning process is carried out in the Gazebo simulator, which emulates interferences, while ROS is used to communicate with the agent. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title="machine learning">machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=DQN" title=" DQN"> DQN</a>, <a href="https://publications.waset.org/abstracts/search?q=Gazebo" title=" Gazebo"> Gazebo</a>, <a href="https://publications.waset.org/abstracts/search?q=navigation" title=" navigation"> navigation</a> </p> <a href="https://publications.waset.org/abstracts/165568/deep-q-network-for-navigation-in-gazebo-simulator" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/165568.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">80</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">&lsaquo;</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20technology&amp;page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20technology&amp;page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20technology&amp;page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20technology&amp;page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20technology&amp;page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20technology&amp;page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20technology&amp;page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20technology&amp;page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20technology&amp;page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20technology&amp;page=261">261</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20technology&amp;page=262">262</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20technology&amp;page=2" rel="next">&rsaquo;</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul 
class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">&copy; 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>

Pages: 1 2 3 4 5 6 7 8 9 10