CINXE.COM

Search results for: drone in business

<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: drone in business</title> <meta name="description" content="Search results for: drone in business"> <meta name="keywords" content="drone in business"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img src="https://cdn.waset.org/static/images/wasetc.png" alt="Open Science Research Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="drone in business" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="drone in business"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 3184</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: drone in business</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3184</span> A Research on the Benefits of Drone Usage in Industry by Determining Companies Using Drone in the World</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ahmet%20Akdemir">Ahmet Akdemir</a>, <a href="https://publications.waset.org/abstracts/search?q=G%C3%BCzide%20Karaku%C5%9F"> Güzide Karakuş</a>, <a href="https://publications.waset.org/abstracts/search?q=Leyla%20Polat"> Leyla Polat</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Aviation that has been arisen in accordance with flying request that is existing inside of people, has not only made life easier by making a great contribution to humanity; it has also accelerated globalization by reducing distances between countries. It is seen that the growth rate of aviation industry has reached the undreamed level when it is looked back on. Today, the last point in aviation is unmanned aerial vehicles that are self-ventilating and move in desired coordinates without any onboard pilot. For those vehicles, there are two different control systems are developed. In the first type of control, an unmanned aerial vehicle (UAV) moves according to instructions of a remote control. UAV that moves with a remote control is named as drone; it can be used personally. In the second one, there is a flight plan that is programmed and placed inside of UAV before flight. Recently, drones have started to be used in unimagined areas and utilize specific, important benefits for any industry. Within this framework, this study answers the question that is drone usage would be beneficial for businesses or not. To answer this question, applied basic methodologies are determining businesses using drone in the world, their purposes to use drone, and then, comparing their economy as before drone and after drone. In the end of this study, it is seen that many companies in different business areas use drone in logistics support, and it makes their work easier than before. This paper has contributed to academic literature about this subject, and it has introduced the benefits of drone usage for businesses. In addition, it has encouraged businesses that they keep pace with this technological age by following the developments about drones. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=aviation" title="aviation">aviation</a>, <a href="https://publications.waset.org/abstracts/search?q=drone" title=" drone"> drone</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20in%20business" title=" drone in business"> drone in business</a>, <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20vehicle" title=" unmanned aerial vehicle"> unmanned aerial vehicle</a> </p> <a href="https://publications.waset.org/abstracts/77049/a-research-on-the-benefits-of-drone-usage-in-industry-by-determining-companies-using-drone-in-the-world" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/77049.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">257</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3183</span> Stakeholder Analysis of Agricultural Drone Policy: A Case Study of the Agricultural Drone Ecosystem of Thailand</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Thanomsin%20Chakreeves">Thanomsin Chakreeves</a>, <a href="https://publications.waset.org/abstracts/search?q=Atichat%20Preittigun"> Atichat Preittigun</a>, <a href="https://publications.waset.org/abstracts/search?q=Ajchara%20Phu-ang"> Ajchara Phu-ang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents a stakeholder analysis of agricultural drone policies that meet the government&#39;s goal of building an agricultural drone ecosystem in Thailand. Firstly, case studies from other countries are reviewed. The stakeholder analysis method and qualitative data from the interviews are then presented including data from the Institute of Innovation and Management, the Office of National Higher Education Science Research and Innovation Policy Council, agricultural entrepreneurs and farmers. Study and interview data are then employed to describe the current ecosystem and to guide the implementation of agricultural drone policies that are suitable for the ecosystem of Thailand. Finally, policy recommendations are then made that the Thai government should adopt in the future. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone%20public%20policy" title="drone public policy">drone public policy</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20ecosystem" title=" drone ecosystem"> drone ecosystem</a>, <a href="https://publications.waset.org/abstracts/search?q=policy%20development" title=" policy development"> policy development</a>, <a href="https://publications.waset.org/abstracts/search?q=agricultural%20drone" title=" agricultural drone"> agricultural drone</a> </p> <a href="https://publications.waset.org/abstracts/132133/stakeholder-analysis-of-agricultural-drone-policy-a-case-study-of-the-agricultural-drone-ecosystem-of-thailand" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/132133.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">149</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3182</span> 3D Stereoscopic Measurements from AR Drone Squadron</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=R.%20Schurig">R. Schurig</a>, <a href="https://publications.waset.org/abstracts/search?q=T.%20D%C3%A9sesquelles"> T. Désesquelles</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Dumont"> A. Dumont</a>, <a href="https://publications.waset.org/abstracts/search?q=E.%20Lefranc"> E. Lefranc</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Lux"> A. Lux</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A cost-efficient alternative is proposed to the use of a single drone carrying multiple cameras in order to take stereoscopic images and videos during its flight. Such drone has to be particularly large enough to take off with its equipment, and stable enough in order to make valid measurements. Corresponding performance for a single aircraft usually comes with a large cost. Proposed solution consists in using multiple smaller and cheaper aircrafts carrying one camera each instead of a single expensive one. To give a proof of concept, AR drones, quad-rotor UAVs from Parrot Inc., are experimentally used. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone%20squadron" title="drone squadron">drone squadron</a>, <a href="https://publications.waset.org/abstracts/search?q=flight%20control" title=" flight control"> flight control</a>, <a href="https://publications.waset.org/abstracts/search?q=rotorcraft" title=" rotorcraft"> rotorcraft</a>, <a href="https://publications.waset.org/abstracts/search?q=Unmanned%20Aerial%20Vehicle%20%28UAV%29" title=" Unmanned Aerial Vehicle (UAV)"> Unmanned Aerial Vehicle (UAV)</a>, <a href="https://publications.waset.org/abstracts/search?q=AR%20drone" title=" AR drone"> AR drone</a>, <a href="https://publications.waset.org/abstracts/search?q=stereoscopic%20vision" title=" stereoscopic vision"> stereoscopic vision</a> </p> <a href="https://publications.waset.org/abstracts/17205/3d-stereoscopic-measurements-from-ar-drone-squadron" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/17205.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">473</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3181</span> Drone Classification Using Classification Methods Using Conventional Model With Embedded Audio-Visual Features</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hrishi%20Rakshit">Hrishi Rakshit</a>, <a href="https://publications.waset.org/abstracts/search?q=Pooneh%20Bagheri%20Zadeh"> Pooneh Bagheri Zadeh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper investigates the performance of drone classification methods using conventional DCNN with different hyperparameters, when additional drone audio data is embedded in the dataset for training and further classification. In this paper, first a custom dataset is created using different images of drones from University of South California (USC) datasets and Leeds Beckett university datasets with embedded drone audio signal. The three well-known DCNN architectures namely, Resnet50, Darknet53 and Shufflenet are employed over the created dataset tuning their hyperparameters such as, learning rates, maximum epochs, Mini Batch size with different optimizers. Precision-Recall curves and F1 Scores-Threshold curves are used to evaluate the performance of the named classification algorithms. Experimental results show that Resnet50 has the highest efficiency compared to other DCNN methods. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone%20classifications" title="drone classifications">drone classifications</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20convolutional%20neural%20network" title=" deep convolutional neural network"> deep convolutional neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=hyperparameters" title=" hyperparameters"> hyperparameters</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal" title=" drone audio signal"> drone audio signal</a> </p> <a href="https://publications.waset.org/abstracts/172929/drone-classification-using-classification-methods-using-conventional-model-with-embedded-audio-visual-features" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/172929.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">104</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3180</span> Comparison of Direction of Arrival Estimation Method for Drone Based on Phased Microphone Array</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jiwon%20Lee">Jiwon Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=Yeong-Ju%20Go"> Yeong-Ju Go</a>, <a href="https://publications.waset.org/abstracts/search?q=Jong-Soo%20Choi"> Jong-Soo Choi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Drones were first developed for military use and were used in World War 1. But recently drones have been used in a variety of fields. Several companies actively utilize drone technology to strengthen their services, and in agriculture, drones are used for crop monitoring and sowing. Other people use drones for hobby activities such as photography. However, as the range of use of drones expands rapidly, problems caused by drones such as improperly flying, privacy and terrorism are also increasing. As the need for monitoring and tracking of drones increases, researches are progressing accordingly. The drone detection system estimates the position of the drone using the physical phenomena that occur when the drones fly. The drone detection system measures being developed utilize many approaches, such as radar, infrared camera, and acoustic detection systems. Among the various drone detection system, the acoustic detection system is advantageous in that the microphone array system is small, inexpensive, and easy to operate than other systems. In this paper, the acoustic signal is acquired by using minimum microphone when drone is flying, and direction of drone is estimated. When estimating the Direction of Arrival(DOA), there is a method of calculating the DOA based on the Time Difference of Arrival(TDOA) and a method of calculating the DOA based on the beamforming. The TDOA technique requires less number of microphones than the beamforming technique, but is weak in noisy environments and can only estimate the DOA of a single source. The beamforming technique requires more microphones than the TDOA technique. However, it is strong against the noisy environment and it is possible to simultaneously estimate the DOA of several drones. When estimating the DOA using acoustic signals emitted from the drone, it is impossible to measure the position of the drone, and only the direction can be estimated. To overcome this problem, in this work we show how to estimate the position of drones by arranging multiple microphone arrays. The microphone array used in the experiments was four tetrahedral microphones. We simulated the performance of each DOA algorithm and demonstrated the simulation results through experiments. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=acoustic%20sensing" title="acoustic sensing">acoustic sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=direction%20of%20arrival" title=" direction of arrival"> direction of arrival</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20detection" title=" drone detection"> drone detection</a>, <a href="https://publications.waset.org/abstracts/search?q=microphone%20array" title=" microphone array"> microphone array</a> </p> <a href="https://publications.waset.org/abstracts/94230/comparison-of-direction-of-arrival-estimation-method-for-drone-based-on-phased-microphone-array" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/94230.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">160</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3179</span> Safe Zone: A Framework for Detecting and Preventing Drones Misuse </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=AlHanoof%20A.%20Alharbi">AlHanoof A. Alharbi</a>, <a href="https://publications.waset.org/abstracts/search?q=Fatima%20M.%20Alamoudi"> Fatima M. Alamoudi</a>, <a href="https://publications.waset.org/abstracts/search?q=Razan%20A.%20Albrahim"> Razan A. Albrahim</a>, <a href="https://publications.waset.org/abstracts/search?q=Sarah%20F.%20Alharbi"> Sarah F. Alharbi</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdullah%20M%20Almuhaideb"> Abdullah M Almuhaideb</a>, <a href="https://publications.waset.org/abstracts/search?q=Norah%20A.%20Almubairik"> Norah A. Almubairik</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdulrahman%20Alharby"> Abdulrahman Alharby</a>, <a href="https://publications.waset.org/abstracts/search?q=Naya%20M.%20Nagy"> Naya M. Nagy</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Recently, drones received a rapid interest in different industries worldwide due to its powerful impact. However, limitations still exist in this emerging technology, especially privacy violation. These aircrafts consistently threaten the security of entities by entering restricted areas accidentally or deliberately. Therefore, this research project aims to develop drone detection and prevention mechanism to protect the restricted area. Until now, none of the solutions have met the optimal requirements of detection which are cost-effectiveness, high accuracy, long range, convenience, unaffected by noise and generalization. In terms of prevention, the existing methods are focusing on impractical solutions such as catching a drone by a larger drone, training an eagle or a gun. In addition, the practical solutions have limitations, such as the No-Fly Zone and PITBULL jammers. According to our study and analysis of previous related works, none of the solutions includes detection and prevention at the same time. The proposed solution is a combination of detection and prevention methods. To implement the detection system, a passive radar will be used to properly identify the drone against any possible flying objects. As for the prevention, jamming signals and forceful safe landing of the drone integrated together to stop the drone’s operation. We believe that applying this mechanism will limit the drone’s invasion of privacy incidents against highly restricted properties. Consequently, it effectively accelerates drones‘ usages at personal and governmental levels. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=detection" title="detection">detection</a>, <a href="https://publications.waset.org/abstracts/search?q=drone" title=" drone"> drone</a>, <a href="https://publications.waset.org/abstracts/search?q=jamming" title=" jamming"> jamming</a>, <a href="https://publications.waset.org/abstracts/search?q=prevention" title=" prevention"> prevention</a>, <a href="https://publications.waset.org/abstracts/search?q=privacy" title=" privacy"> privacy</a>, <a href="https://publications.waset.org/abstracts/search?q=RF" title=" RF"> RF</a>, <a href="https://publications.waset.org/abstracts/search?q=radar" title=" radar"> radar</a>, <a href="https://publications.waset.org/abstracts/search?q=UAV" title=" UAV"> UAV</a> </p> <a href="https://publications.waset.org/abstracts/106189/safe-zone-a-framework-for-detecting-and-preventing-drones-misuse" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/106189.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">213</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3178</span> A Power Management System for Indoor Micro-Drones in GPS-Denied Environments</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yendo%20Hu">Yendo Hu</a>, <a href="https://publications.waset.org/abstracts/search?q=Xu-Yu%20Wu"> Xu-Yu Wu</a>, <a href="https://publications.waset.org/abstracts/search?q=Dylan%20Oh"> Dylan Oh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> GPS-Denied drones open the possibility of indoor applications, including dynamic arial surveillance, inspection, safety enforcement, and discovery. Indoor swarming further enhances these applications in accuracy, robustness, operational time, and coverage. For micro-drones, power management becomes a critical issue, given the battery payload restriction. This paper proposes an application enabling battery replacement solution that extends the micro-drone active phase without human intervention. First, a framework to quantify the effectiveness of a power management solution for a drone fleet is proposed. The operation-to-non-operation ratio, ONR, gives one a quantitative benchmark to measure the effectiveness of a power management solution. Second, a survey was carried out to evaluate the ONR performance for the various solutions. Third, through analysis, this paper proposes a solution tailored to the indoor micro-drone, suitable for swarming applications. The proposed automated battery replacement solution, along with a modified micro-drone architecture, was implemented along with the associated micro-drone. Fourth, the system was tested and compared with the various solutions within the industry. Results show that the proposed solution achieves an ONR value of 31, which is a 1-fold improvement of the best alternative option. The cost analysis shows a manufacturing cost of $25, which makes this approach viable for cost-sensitive markets (e.g., consumer). Further challenges remain in the area of drone design for automated battery replacement, landing pad/drone production, high-precision landing control, and ONR improvements. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=micro-drone" title="micro-drone">micro-drone</a>, <a href="https://publications.waset.org/abstracts/search?q=battery%20swap" title=" battery swap"> battery swap</a>, <a href="https://publications.waset.org/abstracts/search?q=battery%20replacement" title=" battery replacement"> battery replacement</a>, <a href="https://publications.waset.org/abstracts/search?q=battery%20recharge" title=" battery recharge"> battery recharge</a>, <a href="https://publications.waset.org/abstracts/search?q=landing%20pad" title=" landing pad"> landing pad</a>, <a href="https://publications.waset.org/abstracts/search?q=power%20management" title=" power management"> power management</a> </p> <a href="https://publications.waset.org/abstracts/171391/a-power-management-system-for-indoor-micro-drones-in-gps-denied-environments" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/171391.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">122</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3177</span> Should the U.S. Rely on Drone Strikes to Combat the Islamic State? Why Deploying a Drone Campaign against ISIS Will Do Nothing to Address the Causes of the Insurgency or Prevent Its Resurgence?</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Danielle%20Jablanski">Danielle Jablanski</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This article addresses the use of drone strikes under international law and the intersection between Islamic law and current terrorist trends worldwide. It breaks down the legality of drone strikes under international law and dissects certain aspects of their usage in modern warfare; i.e. concepts of directly participating in hostilities and the role of CIA operators. The article then looks at international paradigms of law enforcement versus the use of military force in relation to terrorism. Lastly, it describes traditional aspects of Islamic law and several interpretations of the law today as applied to widespread campaigns of terrorism, namely that of the recent group ISIS or ISIL operating between the battlegrounds of Iraq and Syria. The piece concludes with appraisals for moving forward on the basis of honing in on reasons for terrorism and negative opinions of solely military campaigns to dismantle or disrupt terror organizations and breeding grounds. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=international%20law" title="international law">international law</a>, <a href="https://publications.waset.org/abstracts/search?q=terrorism" title=" terrorism"> terrorism</a>, <a href="https://publications.waset.org/abstracts/search?q=ISIS" title=" ISIS"> ISIS</a>, <a href="https://publications.waset.org/abstracts/search?q=islamic%20law" title=" islamic law"> islamic law</a> </p> <a href="https://publications.waset.org/abstracts/24847/should-the-us-rely-on-drone-strikes-to-combat-the-islamic-state-why-deploying-a-drone-campaign-against-isis-will-do-nothing-to-address-the-causes-of-the-insurgency-or-prevent-its-resurgence" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/24847.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">476</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3176</span> Drone Swarm Routing and Scheduling for Off-shore Wind Turbine Blades Inspection</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mohanad%20Al-Behadili">Mohanad Al-Behadili</a>, <a href="https://publications.waset.org/abstracts/search?q=Xiang%20Song"> Xiang Song</a>, <a href="https://publications.waset.org/abstracts/search?q=Djamila%20Ouelhadj"> Djamila Ouelhadj</a>, <a href="https://publications.waset.org/abstracts/search?q=Alex%20Fraess-Ehrfeld"> Alex Fraess-Ehrfeld</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In off-shore wind farms, turbine blade inspection accessibility under various sea states is very challenging and greatly affects the downtime of wind turbines. Maintenance of any offshore system is not an easy task due to the restricted logistics and accessibility. The multirotor unmanned helicopter is of increasing interest in inspection applications due to its manoeuvrability and payload capacity. These advantages increase when many of them are deployed simultaneously in a swarm. Hence this paper proposes a drone swarm framework for inspecting offshore wind turbine blades and nacelles so as to reduce downtime. One of the big challenges of this task is that when operating a drone swarm, an individual drone may not have enough power to fly and communicate during missions and it has no capability of refueling due to its small size. Once the drone power is drained, there are no signals transmitted and the links become intermittent. Vessels equipped with 5G masts and small power units are utilised as platforms for drones to recharge/swap batteries. The research work aims at designing a smart energy management system, which provides automated vessel and drone routing and recharging plans. To achieve this goal, a novel mathematical optimisation model is developed with the main objective of minimising the number of drones and vessels, which carry the charging stations, and the downtime of the wind turbines. There are a number of constraints to be considered, such as each wind turbine must be inspected once and only once by one drone; each drone can inspect at most one wind turbine after recharging, then fly back to the charging station; collision should be avoided during the drone flying; all wind turbines in the wind farm should be inspected within the given time window. We have developed a real-time Ant Colony Optimisation (ACO) algorithm to generate real-time and near-optimal solutions to the drone swarm routing problem. The schedule will generate efficient and real-time solutions to indicate the inspection tasks, time windows, and the optimal routes of the drones to access the turbines. Experiments are conducted to evaluate the quality of the solutions generated by ACO. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone%20swarm" title="drone swarm">drone swarm</a>, <a href="https://publications.waset.org/abstracts/search?q=routing" title=" routing"> routing</a>, <a href="https://publications.waset.org/abstracts/search?q=scheduling" title=" scheduling"> scheduling</a>, <a href="https://publications.waset.org/abstracts/search?q=optimisation%20model" title=" optimisation model"> optimisation model</a>, <a href="https://publications.waset.org/abstracts/search?q=ant%20colony%20optimisation" title=" ant colony optimisation"> ant colony optimisation</a> </p> <a href="https://publications.waset.org/abstracts/141935/drone-swarm-routing-and-scheduling-for-off-shore-wind-turbine-blades-inspection" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/141935.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">269</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3175</span> Design of a Surveillance Drone with Computer Aided Durability</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Maram%20Shahad%20Dana%20Anfal">Maram Shahad Dana Anfal</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This research paper presents the design of a surveillance drone with computer-aided durability and model analyses that provides a cost-effective and efficient solution for various applications. The quadcopter's design is based on a lightweight and strong structure made of materials such as aluminum and titanium, which provide a durable structure for the quadcopter. The structure of this product and the computer-aided durability system are both designed to ensure frequent repairs or replacements, which will save time and money in the long run. Moreover, the study discusses the drone's ability to track, investigate, and deliver objects more quickly than traditional methods, makes it a highly efficient and cost-effective technology. In this paper, a comprehensive analysis of the quadcopter's operation dynamics and limitations is presented. In both simulation and experimental data, the computer-aided durability system and the drone's design demonstrate their effectiveness, highlighting the potential for a variety of applications, such as search and rescue missions, infrastructure monitoring, and agricultural operations. Also, the findings provide insights into possible areas for improvement in the design and operation of the drone. Ultimately, this paper presents a reliable and cost-effective solution for surveillance applications by designing a drone with computer-aided durability and modeling. With its potential to save time and money, increase reliability, and enhance safety, it is a promising technology for the future of surveillance drones. operation dynamic equations have been evaluated successfully for different flight conditions of a quadcopter. Also, CAE modeling techniques have been applied for the modal risk assessment at operating conditions.Stress analysis have been performed under the loadings of the worst-case combined motion flight conditions. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone" title="drone">drone</a>, <a href="https://publications.waset.org/abstracts/search?q=material" title=" material"> material</a>, <a href="https://publications.waset.org/abstracts/search?q=solidwork" title=" solidwork"> solidwork</a>, <a href="https://publications.waset.org/abstracts/search?q=hypermesh" title=" hypermesh"> hypermesh</a> </p> <a href="https://publications.waset.org/abstracts/167463/design-of-a-surveillance-drone-with-computer-aided-durability" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/167463.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">146</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3174</span> Study on Acoustic Source Detection Performance Improvement of Microphone Array Installed on Drones Using Blind Source Separation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Youngsun%20Moon">Youngsun Moon</a>, <a href="https://publications.waset.org/abstracts/search?q=Yeong-Ju%20Go"> Yeong-Ju Go</a>, <a href="https://publications.waset.org/abstracts/search?q=Jong-Soo%20Choi"> Jong-Soo Choi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Most drones that currently have surveillance/reconnaissance missions are basically equipped with optical equipment, but we also need to use a microphone array to estimate the location of the acoustic source. This can provide additional information in the absence of optical equipment. The purpose of this study is to estimate Direction of Arrival (DOA) based on Time Difference of Arrival (TDOA) estimation of the acoustic source in the drone. The problem is that it is impossible to measure the clear target acoustic source because of the drone noise. To overcome this problem is to separate the drone noise and the target acoustic source using Blind Source Separation(BSS) based on Independent Component Analysis(ICA). ICA can be performed assuming that the drone noise and target acoustic source are independent and each signal has non-gaussianity. For maximized non-gaussianity each signal, we use Negentropy and Kurtosis based on probability theory. As a result, we can improve TDOA estimation and DOA estimation of the target source in the noisy environment. We simulated the performance of the DOA algorithm applying BSS algorithm, and demonstrated the simulation through experiment at the anechoic wind tunnel. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=aeroacoustics" title="aeroacoustics">aeroacoustics</a>, <a href="https://publications.waset.org/abstracts/search?q=acoustic%20source%20detection" title=" acoustic source detection"> acoustic source detection</a>, <a href="https://publications.waset.org/abstracts/search?q=time%20difference%20of%20arrival" title=" time difference of arrival"> time difference of arrival</a>, <a href="https://publications.waset.org/abstracts/search?q=direction%20of%20arrival" title=" direction of arrival"> direction of arrival</a>, <a href="https://publications.waset.org/abstracts/search?q=blind%20source%20separation" title=" blind source separation"> blind source separation</a>, <a href="https://publications.waset.org/abstracts/search?q=independent%20component%20analysis" title=" independent component analysis"> independent component analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=drone" title=" drone"> drone</a> </p> <a href="https://publications.waset.org/abstracts/94236/study-on-acoustic-source-detection-performance-improvement-of-microphone-array-installed-on-drones-using-blind-source-separation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/94236.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">164</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3173</span> Designing Agricultural Irrigation Systems Using Drone Technology and Geospatial Analysis</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yongqin%20Zhang">Yongqin Zhang</a>, <a href="https://publications.waset.org/abstracts/search?q=John%20Lett"> John Lett</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Geospatial technologies have been increasingly used in agriculture for various applications and purposes in recent years. Unmanned aerial vehicles (drones) fit the needs of farmers in farming operations, from field spraying to grow cycles and crop health. In this research, we conducted a practical research project that used drone technology to design and map optimal locations and layouts of irrigation systems for agriculture farms. We flew a DJI Mavic 2 Pro drone to acquire aerial remote sensing images over two agriculture fields in Forest, Mississippi, in 2022. Flight plans were first designed to capture multiple high-resolution images via a 20-megapixel RGB camera mounted on the drone over the agriculture fields. The Drone Deploy web application was then utilized to develop flight plans and subsequent image processing and measurements. The images were orthorectified and processed to estimate the area of the area and measure the locations of the water line and sprinkle heads. Field measurements were conducted to measure the ground targets and validate the aerial measurements. Geospatial analysis and photogrammetric measurements were performed for the study area to determine optimal layout and quantitative estimates for irrigation systems. We created maps and tabular estimates to demonstrate the locations, spacing, amount, and layout of sprinkler heads and water lines to cover the agricultural fields. This research project provides scientific guidance to Mississippi farmers for a precision agricultural irrigation practice. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone%20images" title="drone images">drone images</a>, <a href="https://publications.waset.org/abstracts/search?q=agriculture" title=" agriculture"> agriculture</a>, <a href="https://publications.waset.org/abstracts/search?q=irrigation" title=" irrigation"> irrigation</a>, <a href="https://publications.waset.org/abstracts/search?q=geospatial%20analysis" title=" geospatial analysis"> geospatial analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=photogrammetric%20measurements" title=" photogrammetric measurements"> photogrammetric measurements</a> </p> <a href="https://publications.waset.org/abstracts/162153/designing-agricultural-irrigation-systems-using-drone-technology-and-geospatial-analysis" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/162153.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">77</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3172</span> Embedded Electrochemistry with Miniaturized, Drone-Based, Potentiostat System for Remote Detection Chemical Warfare Agents</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Amer%20Dawoud">Amer Dawoud</a>, <a href="https://publications.waset.org/abstracts/search?q=Jesy%20Motchaalangaram"> Jesy Motchaalangaram</a>, <a href="https://publications.waset.org/abstracts/search?q=Arati%20Biswakarma"> Arati Biswakarma</a>, <a href="https://publications.waset.org/abstracts/search?q=Wujan%20Mio"> Wujan Mio</a>, <a href="https://publications.waset.org/abstracts/search?q=Karl%20Wallace"> Karl Wallace</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The development of an embedded miniaturized drone-based system for remote detection of Chemical Warfare Agents (CWA) is proposed. The paper focuses on the software/hardware system design of the electrochemical Cyclic Voltammetry (CV) and Differential Pulse Voltammetry (DPV) signal processing for future deployment on drones. The paper summarizes the progress made towards hardware and electrochemical signal processing for signature detection of CWA. Also, the miniature potentiostat signal is validated by comparing it with the high-end lab potentiostat signal. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone-based" title="drone-based">drone-based</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20detection%20chemical%20warfare%20agents" title=" remote detection chemical warfare agents"> remote detection chemical warfare agents</a>, <a href="https://publications.waset.org/abstracts/search?q=miniaturized" title=" miniaturized"> miniaturized</a>, <a href="https://publications.waset.org/abstracts/search?q=potentiostat" title=" potentiostat"> potentiostat</a> </p> <a href="https://publications.waset.org/abstracts/145007/embedded-electrochemistry-with-miniaturized-drone-based-potentiostat-system-for-remote-detection-chemical-warfare-agents" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/145007.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">136</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3171</span> Feasibility of Using Bike Lanes in Conjunctions with Sidewalks for Ground Drone Applications in Last Mile Delivery for Dense Urban Areas</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=N.%20Bazyar%20Shourabi">N. Bazyar Shourabi</a>, <a href="https://publications.waset.org/abstracts/search?q=K.%20Nyarko"> K. Nyarko</a>, <a href="https://publications.waset.org/abstracts/search?q=C.%20Scott"> C. Scott</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Jeihnai"> M. Jeihnai</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Ground drones have the potential to reduce the cost and time of making last-mile deliveries. They also have the potential to make a huge impact on human life. Despite this potential, little work has gone into developing a suitable feasibility model for ground drone delivery in dense urban areas. Today, most of the experimental ground delivery drones utilize sidewalks only, with just a few of them starting to use bike lanes, which a significant portion of some urban areas have. This study works on the feasibility of using bike lanes in conjunction with sidewalks for ground drone applications in last-mile delivery for dense urban areas. This work begins with surveying bike lanes and sidewalks within the city of Boston using Geographic Information System (GIS) software to determine the percentage of coverage currently available within the city. Then six scenarios are examined. Based on this research, a mathematical model is developed. The daily cost of delivering packages using each scenario is calculated by the mathematical model. Comparing the drone delivery scenarios with the traditional method of package delivery using trucks will provide essential information concerning the feasibility of implementing routing protocols that combine the use of sidewalks and bike lanes. The preliminary results of the model show that ground drones that can travel via sidewalks or bike lanes have the potential to significantly reduce delivery cost. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ground%20drone" title="ground drone">ground drone</a>, <a href="https://publications.waset.org/abstracts/search?q=intelligent%20transportation%20system" title=" intelligent transportation system"> intelligent transportation system</a>, <a href="https://publications.waset.org/abstracts/search?q=last-mile%20delivery" title=" last-mile delivery"> last-mile delivery</a>, <a href="https://publications.waset.org/abstracts/search?q=sidewalk%20robot" title=" sidewalk robot"> sidewalk robot</a> </p> <a href="https://publications.waset.org/abstracts/116913/feasibility-of-using-bike-lanes-in-conjunctions-with-sidewalks-for-ground-drone-applications-in-last-mile-delivery-for-dense-urban-areas" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/116913.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">147</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3170</span> Non-intrusive Hand Control of Drone Using an Inexpensive and Streamlined Convolutional Neural Network Approach</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Evan%20Lowhorn">Evan Lowhorn</a>, <a href="https://publications.waset.org/abstracts/search?q=Rocio%20Alba-Flores"> Rocio Alba-Flores</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The purpose of this work is to develop a method for classifying hand signals and using the output in a drone control algorithm. To achieve this, methods based on Convolutional Neural Networks (CNN) were applied. CNN's are a subset of deep learning, which allows grid-like inputs to be processed and passed through a neural network to be trained for classification. This type of neural network allows for classification via imaging, which is less intrusive than previous methods using biosensors, such as EMG sensors. Classification CNN's operate purely from the pixel values in an image; therefore they can be used without additional exteroceptive sensors. A development bench was constructed using a desktop computer connected to a high-definition webcam mounted on a scissor arm. This allowed the camera to be pointed downwards at the desk to provide a constant solid background for the dataset and a clear detection area for the user. A MATLAB script was created to automate dataset image capture at the development bench and save the images to the desktop. This allowed the user to create their own dataset of 12,000 images within three hours. These images were evenly distributed among seven classes. The defined classes include forward, backward, left, right, idle, and land. The drone has a popular flip function which was also included as an additional class. To simplify control, the corresponding hand signals chosen were the numerical hand signs for one through five for movements, a fist for land, and the universal “ok” sign for the flip command. Transfer learning with PyTorch (Python) was performed using a pre-trained 18-layer residual learning network (ResNet-18) to retrain the network for custom classification. An algorithm was created to interpret the classification and send encoded messages to a Ryze Tello drone over its 2.4 GHz Wi-Fi connection. The drone’s movements were performed in half-meter distance increments at a constant speed. When combined with the drone control algorithm, the classification performed as desired with negligible latency when compared to the delay in the drone’s movement commands. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=classification" title="classification">classification</a>, <a href="https://publications.waset.org/abstracts/search?q=computer%20vision" title=" computer vision"> computer vision</a>, <a href="https://publications.waset.org/abstracts/search?q=convolutional%20neural%20networks" title=" convolutional neural networks"> convolutional neural networks</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20control" title=" drone control"> drone control</a> </p> <a href="https://publications.waset.org/abstracts/139743/non-intrusive-hand-control-of-drone-using-an-inexpensive-and-streamlined-convolutional-neural-network-approach" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/139743.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">212</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3169</span> Multiperson Drone Control with Seamless Pilot Switching Using Onboard Camera and Openpose Real-Time Keypoint Detection</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Evan%20Lowhorn">Evan Lowhorn</a>, <a href="https://publications.waset.org/abstracts/search?q=Rocio%20Alba-Flores"> Rocio Alba-Flores</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Traditional classification Convolutional Neural Networks (CNN) attempt to classify an image in its entirety. This becomes problematic when trying to perform classification with a drone’s camera in real-time due to unpredictable backgrounds. Object detectors with bounding boxes can be used to isolate individuals and other items, but the original backgrounds remain within these boxes. These basic detectors have been regularly used to determine what type of object an item is, such as “person” or “dog.” Recent advancement in computer vision, particularly with human imaging, is keypoint detection. Human keypoint detection goes beyond bounding boxes to fully isolate humans and plot points, or Regions of Interest (ROI), on their bodies within an image. ROIs can include shoulders, elbows, knees, heads, etc. These points can then be related to each other and used in deep learning methods such as pose estimation. For drone control based on human motions, poses, or signals using the onboard camera, it is important to have a simple method for pilot identification among multiple individuals while also giving the pilot fine control options for the drone. To achieve this, the OpenPose keypoint detection network was used with body and hand keypoint detection enabled. OpenPose supports the ability to combine multiple keypoint detection methods in real-time with a single network. Body keypoint detection allows simple poses to act as the pilot identifier. The hand keypoint detection with ROIs for each finger can then offer a greater variety of signal options for the pilot once identified. For this work, the individual must raise their non-control arm to be identified as the operator and send commands with the hand on their other arm. The drone ignores all other individuals in the onboard camera feed until the current operator lowers their non-control arm. When another individual wish to operate the drone, they simply raise their arm once the current operator relinquishes control, and then they can begin controlling the drone with their other hand. This is all performed mid-flight with no landing or script editing required. When using a desktop with a discrete NVIDIA GPU, the drone’s 2.4 GHz Wi-Fi connection combined with OpenPose restrictions to only body and hand allows this control method to perform as intended while maintaining the responsiveness required for practical use. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=computer%20vision" title="computer vision">computer vision</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20control" title=" drone control"> drone control</a>, <a href="https://publications.waset.org/abstracts/search?q=keypoint%20detection" title=" keypoint detection"> keypoint detection</a>, <a href="https://publications.waset.org/abstracts/search?q=openpose" title=" openpose"> openpose</a> </p> <a href="https://publications.waset.org/abstracts/139752/multiperson-drone-control-with-seamless-pilot-switching-using-onboard-camera-and-openpose-real-time-keypoint-detection" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/139752.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">185</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3168</span> Seawater Changes&#039; Estimation at Tidal Flat in Korean Peninsula Using Drone Stereo Images</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hyoseong%20Lee">Hyoseong Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=Duk-jin%20Kim"> Duk-jin Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=Jaehong%20Oh"> Jaehong Oh</a>, <a href="https://publications.waset.org/abstracts/search?q=Jungil%20Shin"> Jungil Shin</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Tidal flat in Korean peninsula is one of the largest biodiversity tidal flats in the world. Therefore, digital elevation models (DEM) is continuously demanded to monitor of the tidal flat. In this study, DEM of tidal flat, according to different times, was produced by means of the Drone and commercial software in order to measure seawater change during high tide at water-channel in tidal flat. To correct the produced DEMs of the tidal flat where is inaccessible to collect control points, the DEM matching method was applied by using the reference DEM instead of the survey. After the ortho-image was made from the corrected DEM, the land cover classified image was produced. The changes of seawater amount according to the times were analyzed by using the classified images and DEMs. As a result, it was confirmed that the amount of water rapidly increased as the time passed during high tide. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=tidal%20flat" title="tidal flat">tidal flat</a>, <a href="https://publications.waset.org/abstracts/search?q=drone" title=" drone"> drone</a>, <a href="https://publications.waset.org/abstracts/search?q=DEM" title=" DEM"> DEM</a>, <a href="https://publications.waset.org/abstracts/search?q=seawater%20change" title=" seawater change"> seawater change</a> </p> <a href="https://publications.waset.org/abstracts/83545/seawater-changes-estimation-at-tidal-flat-in-korean-peninsula-using-drone-stereo-images" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/83545.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">204</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3167</span> Droning the Pedagogy: Future Prospect of Teaching and Learning </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Farha%20Sattar">Farha Sattar</a>, <a href="https://publications.waset.org/abstracts/search?q=Laurence%20Tamatea"> Laurence Tamatea</a>, <a href="https://publications.waset.org/abstracts/search?q=Muhammad%20Nawaz"> Muhammad Nawaz</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Drones, the Unmanned Aerial Vehicles are playing an important role in real-world problem-solving. With the new advancements in technology, drones are becoming available, affordable and user- friendly. Use of drones in education is opening new trends in teaching and learning practices in an innovative and engaging way. Drones vary in types and sizes and possess various characteristics and capabilities which enhance their potential to be used in education from basic to advanced and challenging learning activities which are suitable for primary, middle and high school level. This research aims to provide an insight to explore different types of drones and their compatibility to be used in teaching different subjects at various levels. Research focuses on integrating the drone technology along with Australian curriculum content knowledge to reinforce the understanding of the fundamental concepts and helps to develop the critical thinking and reasoning in the learning process. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=critical%20thinking" title="critical thinking">critical thinking</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20technology" title=" drone technology"> drone technology</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20types" title=" drone types"> drone types</a>, <a href="https://publications.waset.org/abstracts/search?q=innovative%20learning" title=" innovative learning"> innovative learning</a> </p> <a href="https://publications.waset.org/abstracts/69802/droning-the-pedagogy-future-prospect-of-teaching-and-learning" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/69802.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">309</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3166</span> Controlling Drone Flight Missions through Natural Language Processors Using Artificial Intelligence</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sylvester%20Akpah">Sylvester Akpah</a>, <a href="https://publications.waset.org/abstracts/search?q=Selasi%20Vondee"> Selasi Vondee</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Unmanned Aerial Vehicles (UAV) as they are also known, drones have attracted increasing attention in recent years due to their ubiquitous nature and boundless applications in the areas of communication, surveying, aerial photography, weather forecasting, medical delivery, surveillance amongst others. Operated remotely in real-time or pre-programmed, drones can fly autonomously or on pre-defined routes. The application of these aerial vehicles has successfully penetrated the world due to technological evolution, thus a lot more businesses are utilizing their capabilities. Unfortunately, while drones are replete with the benefits stated supra, they are riddled with some problems, mainly attributed to the complexities in learning how to master drone flights, collision avoidance and enterprise security. Additional challenges, such as the analysis of flight data recorded by sensors attached to the drone may take time and require expert help to analyse and understand. This paper presents an autonomous drone control system using a chatbot. The system allows for easy control of drones using conversations with the aid of Natural Language Processing, thus to reduce the workload needed to set up, deploy, control, and monitor drone flight missions. The results obtained at the end of the study revealed that the drone connected to the chatbot was able to initiate flight missions with just text and voice commands, enable conversation and give real-time feedback from data and requests made to the chatbot. The results further revealed that the system was able to process natural language and produced human-like conversational abilities using Artificial Intelligence (Natural Language Understanding). It is recommended that radio signal adapters be used instead of wireless connections thus to increase the range of communication with the aerial vehicle. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=artificial%20ntelligence" title="artificial ntelligence">artificial ntelligence</a>, <a href="https://publications.waset.org/abstracts/search?q=chatbot" title=" chatbot"> chatbot</a>, <a href="https://publications.waset.org/abstracts/search?q=natural%20language%20processing" title=" natural language processing"> natural language processing</a>, <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20vehicle" title=" unmanned aerial vehicle"> unmanned aerial vehicle</a> </p> <a href="https://publications.waset.org/abstracts/116870/controlling-drone-flight-missions-through-natural-language-processors-using-artificial-intelligence" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/116870.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">143</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3165</span> Agile Real-Time Field Programmable Gate Array-Based Image Processing System for Drone Imagery in Digital Agriculture</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sabiha%20Shahid%20Antora">Sabiha Shahid Antora</a>, <a href="https://publications.waset.org/abstracts/search?q=Young%20Ki%20Chang"> Young Ki Chang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Along with various farm management technologies, imagery is an important tool that facilitates crop assessment, monitoring, and management. As a consequence, drone imaging technology is playing a vital role to capture the state of the entire field for yield mapping, crop scouting, weed detection, and so on. Although it is essential to inspect the cultivable lands in real-time for making rapid decisions regarding field variable inputs to combat stresses and diseases, drone imagery is still evolving in this area of interest. Cost margin and post-processing complexions of the image stream are the main challenges of imaging technology. Therefore, this proposed project involves the cost-effective field programmable gate array (FPGA) based image processing device that would process the image stream in real-time as well as providing the processed output to support on-the-spot decisions in the crop field. As a result, the real-time FPGA-based image processing system would reduce operating costs while minimizing a few intermediate steps to deliver scalable field decisions. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=real-time" title="real-time">real-time</a>, <a href="https://publications.waset.org/abstracts/search?q=FPGA" title=" FPGA"> FPGA</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20imagery" title=" drone imagery"> drone imagery</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20processing" title=" image processing"> image processing</a>, <a href="https://publications.waset.org/abstracts/search?q=crop%20monitoring" title=" crop monitoring"> crop monitoring</a> </p> <a href="https://publications.waset.org/abstracts/132611/agile-real-time-field-programmable-gate-array-based-image-processing-system-for-drone-imagery-in-digital-agriculture" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/132611.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">114</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3164</span> An Approach to Autonomous Drones Using Deep Reinforcement Learning and Object Detection</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=K.%20R.%20Roopesh%20Bharatwaj">K. R. Roopesh Bharatwaj</a>, <a href="https://publications.waset.org/abstracts/search?q=Avinash%20Maharana"> Avinash Maharana</a>, <a href="https://publications.waset.org/abstracts/search?q=Favour%20Tobi%20Aborisade"> Favour Tobi Aborisade</a>, <a href="https://publications.waset.org/abstracts/search?q=Roger%20Young"> Roger Young</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Presently, there are few cases of complete automation of drones and its allied intelligence capabilities. In essence, the potential of the drone has not yet been fully utilized. This paper presents feasible methods to build an intelligent drone with smart capabilities such as self-driving, and obstacle avoidance. It does this through advanced Reinforcement Learning Techniques and performs object detection using latest advanced algorithms, which are capable of processing light weight models with fast training in real time instances. For the scope of this paper, after researching on the various algorithms and comparing them, we finally implemented the Deep-Q-Networks (DQN) algorithm in the AirSim Simulator. In future works, we plan to implement further advanced self-driving and object detection algorithms, we also plan to implement voice-based speech recognition for the entire drone operation which would provide an option of speech communication between users (People) and the drone in the time of unavoidable circumstances. Thus, making drones an interactive intelligent Robotic Voice Enabled Service Assistant. This proposed drone has a wide scope of usability and is applicable in scenarios such as Disaster management, Air Transport of essentials, Agriculture, Manufacturing, Monitoring people movements in public area, and Defense. Also discussed, is the entire drone communication based on the satellite broadband Internet technology for faster computation and seamless communication service for uninterrupted network during disasters and remote location operations. This paper will explain the feasible algorithms required to go about achieving this goal and is more of a reference paper for future researchers going down this path. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=convolution%20neural%20network" title="convolution neural network">convolution neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=natural%20language%20processing" title=" natural language processing"> natural language processing</a>, <a href="https://publications.waset.org/abstracts/search?q=obstacle%20avoidance" title=" obstacle avoidance"> obstacle avoidance</a>, <a href="https://publications.waset.org/abstracts/search?q=satellite%20broadband%20technology" title=" satellite broadband technology"> satellite broadband technology</a>, <a href="https://publications.waset.org/abstracts/search?q=self-driving" title=" self-driving"> self-driving</a> </p> <a href="https://publications.waset.org/abstracts/137145/an-approach-to-autonomous-drones-using-deep-reinforcement-learning-and-object-detection" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/137145.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">252</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3163</span> DQN for Navigation in Gazebo Simulator</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Xabier%20Olaz%20Moratinos">Xabier Olaz Moratinos</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Drone navigation is critical, particularly during the initial phases, such as the initial ascension, where pilots may fail due to strong external interferences that could potentially lead to a crash. In this ongoing work, a drone has been successfully trained to perform an ascent of up to 6 meters at speeds with external disturbances pushing it up to 24 mph, with the DQN algorithm managing external forces affecting the system. It has been demonstrated that the system can control its height, position, and stability in all three axes (roll, pitch, and yaw) throughout the process. The learning process is carried out in the Gazebo simulator, which emulates interferences, while ROS is used to communicate with the agent. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title="machine learning">machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=DQN" title=" DQN"> DQN</a>, <a href="https://publications.waset.org/abstracts/search?q=gazebo" title=" gazebo"> gazebo</a>, <a href="https://publications.waset.org/abstracts/search?q=navigation" title=" navigation"> navigation</a> </p> <a href="https://publications.waset.org/abstracts/165698/dqn-for-navigation-in-gazebo-simulator" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/165698.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">114</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3162</span> Deep Q-Network for Navigation in Gazebo Simulator</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Xabier%20Olaz%20Moratinos">Xabier Olaz Moratinos</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Drone navigation is critical, particularly during the initial phases, such as the initial ascension, where pilots may fail due to strong external interferences that could potentially lead to a crash. In this ongoing work, a drone has been successfully trained to perform an ascent of up to 6 meters at speeds with external disturbances pushing it up to 24 mph, with the DQN algorithm managing external forces affecting the system. It has been demonstrated that the system can control its height, position, and stability in all three axes (roll, pitch, and yaw) throughout the process. The learning process is carried out in the Gazebo simulator, which emulates interferences, while ROS is used to communicate with the agent. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title="machine learning">machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=DQN" title=" DQN"> DQN</a>, <a href="https://publications.waset.org/abstracts/search?q=Gazebo" title=" Gazebo"> Gazebo</a>, <a href="https://publications.waset.org/abstracts/search?q=navigation" title=" navigation"> navigation</a> </p> <a href="https://publications.waset.org/abstracts/165568/deep-q-network-for-navigation-in-gazebo-simulator" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/165568.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">80</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3161</span> Development of a Fire Analysis Drone for Smoke Toxicity Measurement for Fire Prediction and Management</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Gabrielle%20Peck">Gabrielle Peck</a>, <a href="https://publications.waset.org/abstracts/search?q=Ryan%20Hayes"> Ryan Hayes</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This research presents the design and creation of a drone gas analyser, aimed at addressing the need for independent data collection and analysis of gas emissions during large-scale fires, particularly wasteland fires. The analyser drone, comprising a lightweight gas analysis system attached to a remote-controlled drone, enables the real-time assessment of smoke toxicity and the monitoring of gases released into the atmosphere during such incidents. The key components of the analyser unit included two gas line inlets connected to glass wool filters, a pump with regulated flow controlled by a mass flow controller, and electrochemical cells for detecting nitrogen oxides, hydrogen cyanide, and oxygen levels. Additionally, a non-dispersive infrared (NDIR) analyser is employed to monitor carbon monoxide (CO), carbon dioxide (CO₂), and hydrocarbon concentrations. Thermocouples can be attached to the analyser to monitor temperature, as well as McCaffrey probes combined with pressure transducers to monitor air velocity and wind direction. These additions allow for monitoring of the large fire and can be used for predictions of fire spread. The innovative system not only provides crucial data for assessing smoke toxicity but also contributes to fire prediction and management. The remote-controlled drone's mobility allows for safe and efficient data collection in proximity to the fire source, reducing the need for human exposure to hazardous conditions. The data obtained from the gas analyser unit facilitates informed decision-making by emergency responders, aiding in the protection of both human health and the environment. This abstract highlights the successful development of a drone gas analyser, illustrating its potential for enhancing smoke toxicity analysis and fire prediction capabilities. The integration of this technology into fire management strategies offers a promising solution for addressing the challenges associated with wildfires and other large-scale fire incidents. The project's methodology and results contribute to the growing body of knowledge in the field of environmental monitoring and safety, emphasizing the practical utility of drones for critical applications. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=fire%20prediction" title="fire prediction">fire prediction</a>, <a href="https://publications.waset.org/abstracts/search?q=drone" title=" drone"> drone</a>, <a href="https://publications.waset.org/abstracts/search?q=smoke%20toxicity" title=" smoke toxicity"> smoke toxicity</a>, <a href="https://publications.waset.org/abstracts/search?q=analyser" title=" analyser"> analyser</a>, <a href="https://publications.waset.org/abstracts/search?q=fire%20management" title=" fire management"> fire management</a> </p> <a href="https://publications.waset.org/abstracts/174836/development-of-a-fire-analysis-drone-for-smoke-toxicity-measurement-for-fire-prediction-and-management" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/174836.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">90</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3160</span> Making Creative Ethnography through Droned Mode of Engagements</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Elin%20Linder">Elin Linder</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Ethnographic endeavors feature a long history of creative modes of engagements, and anthropology an equally long critique of its disciplinary attention to worded representations of beyond worded experiences. Curious and critical as our research comes about, takes place, unfolds, and develops, processes of documenting, exploring, experiencing, and producing knowledge commonly evolve as intrinsic parts of our situated wishes to make sense of the worlds we study. We may imagine to do one thing and to use a specific mode of fieldnoting, only to end up doing something else, such as to capture dynamics and dimensions otherwise not attentively engaged or even lost. This paper builds on such an experience, and it acts window to open the conversation for doing and representing ethnographic work as creatively as it was undertaken. Expressively and actively undertaken by means of sensuous scholarship, fieldworking in the world of olivicoltura in Apulia intriguingly advanced into resourcefully embodied research using a drone. While the drone first and foremost allowed perspectives that one as a human is largely and physically incapable of exploring, it rapidly emerged into a mode of engagement that probed critical question how one comes to learn how to see that which one watches, listen to that which one hears, smell that which one scents, feel that which one touch, and gather that which one experience. This paper develops how the drone incorporated a transition of a particularly situated ethnographic sense of attention, all while visualizing how imaginative conceptualizations enable unexpected modes of multimodal knowing in much multisensorial worlds of being. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone" title="drone">drone</a>, <a href="https://publications.waset.org/abstracts/search?q=multimodality" title=" multimodality"> multimodality</a>, <a href="https://publications.waset.org/abstracts/search?q=sensuous%20scholarship" title=" sensuous scholarship"> sensuous scholarship</a>, <a href="https://publications.waset.org/abstracts/search?q=critical%20creativity" title=" critical creativity"> critical creativity</a>, <a href="https://publications.waset.org/abstracts/search?q=ethnographic%20practice" title=" ethnographic practice"> ethnographic practice</a> </p> <a href="https://publications.waset.org/abstracts/164969/making-creative-ethnography-through-droned-mode-of-engagements" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/164969.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">74</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3159</span> Risk Assessment for Aerial Package Delivery</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Haluk%20Eren">Haluk Eren</a>, <a href="https://publications.waset.org/abstracts/search?q=%C3%9Cmit%20%C3%87elik"> Ümit Çelik</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Recent developments in unmanned aerial vehicles (UAVs) have begun to attract intense interest. UAVs started to use for many different applications from military to civilian use. Some online retailer and logistics companies are testing the UAV delivery. UAVs have great potentials to reduce cost and time of deliveries and responding to emergencies in a short time. Despite these great positive sides, just a few works have been done for routing of UAVs for package deliveries. As known, transportation of goods from one place to another may have many hazards on delivery route due to falling hazards that can be exemplified as ground objects or air obstacles. This situation refers to wide-range insurance concept. For this reason, deliveries that are made with drones get into the scope of shipping insurance. On the other hand, air traffic was taken into account in the absence of unmanned aerial vehicle. But now, it has been a reality for aerial fields. In this study, the main goal is to conduct risk analysis of package delivery services using drone, based on delivery routes. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=aerial%20package%20delivery" title="aerial package delivery">aerial package delivery</a>, <a href="https://publications.waset.org/abstracts/search?q=insurance%20estimation" title=" insurance estimation"> insurance estimation</a>, <a href="https://publications.waset.org/abstracts/search?q=territory%20risk%20map" title=" territory risk map"> territory risk map</a>, <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20vehicle" title=" unmanned aerial vehicle"> unmanned aerial vehicle</a>, <a href="https://publications.waset.org/abstracts/search?q=route%20risk%20estimation" title=" route risk estimation"> route risk estimation</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20risk%20assessment" title=" drone risk assessment"> drone risk assessment</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20package%20delivery" title=" drone package delivery"> drone package delivery</a> </p> <a href="https://publications.waset.org/abstracts/75960/risk-assessment-for-aerial-package-delivery" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/75960.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">344</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3158</span> Swarm Optimization of Unmanned Vehicles and Object Localization</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Venkataramana%20Sovenahalli%20Badigar">Venkataramana Sovenahalli Badigar</a>, <a href="https://publications.waset.org/abstracts/search?q=B.%20M.%20Suryakanth"> B. M. Suryakanth</a>, <a href="https://publications.waset.org/abstracts/search?q=Akshar%20Prasanna"> Akshar Prasanna</a>, <a href="https://publications.waset.org/abstracts/search?q=Karthik%20Veeramalai"> Karthik Veeramalai</a>, <a href="https://publications.waset.org/abstracts/search?q=Vishwak%20Ram%20Vishwak%20Ram"> Vishwak Ram Vishwak Ram</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Technological advances have led to widespread autonomy in vehicles. Empowering these autonomous with the intelligence to cooperate amongst themselves leads to a more efficient use of the resources available to them. This paper proposes a demonstration of a swarm algorithm implemented on a group of autonomous vehicles. The demonstration involves two ground bots and an aerial drone which cooperate amongst them to locate an object of interest. The object of interest is modelled using a high-intensity light source which acts as a beacon. The ground bots are light sensitive and move towards the beacon. The ground bots and the drone traverse in random paths and jointly locate the beacon. This finds application in various scenarios in where human interference is difficult such as search and rescue during natural disasters, delivering crucial packages in perilous situations, etc. Experimental results show that the modified swarm algorithm implemented in this system has better performance compared to fully random based moving algorithm for object localization and tracking. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=swarm%20algorithm" title="swarm algorithm">swarm algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=object%20localization" title=" object localization"> object localization</a>, <a href="https://publications.waset.org/abstracts/search?q=ground%20bots" title=" ground bots"> ground bots</a>, <a href="https://publications.waset.org/abstracts/search?q=drone" title=" drone"> drone</a>, <a href="https://publications.waset.org/abstracts/search?q=beacon" title=" beacon"> beacon</a> </p> <a href="https://publications.waset.org/abstracts/52839/swarm-optimization-of-unmanned-vehicles-and-object-localization" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/52839.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">257</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3157</span> Structural and Modal Analyses of an s1223 High-Lift Airfoil Wing for Drone Design</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Johnson%20Okoduwa%20Imumbhon">Johnson Okoduwa Imumbhon</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohammad%20Didarul%20Alam"> Mohammad Didarul Alam</a>, <a href="https://publications.waset.org/abstracts/search?q=Yiding%20Cao"> Yiding Cao</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Structural analyses are commonly employed to test the integrity of aircraft component systems in the design stage to demonstrate the capability of the structural components to withstand what it was designed for, as well as to predict potential failure of the components. The analyses are also essential for weight minimization and selecting the most resilient materials that will provide optimal outcomes. This research focuses on testing the structural nature of a high-lift low Reynolds number airfoil profile design, the Selig S1223, under certain loading conditions for a drone model application. The wing (ribs, spars, and skin) of the drone model was made of carbon fiber-reinforced polymer and designed in SolidWorks, while the finite element analysis was carried out in ANSYS mechanical in conjunction with the lift and drag forces that were derived from the aerodynamic airfoil analysis. Additionally, modal analysis was performed to calculate the natural frequencies and the mode shapes of the wing structure. The structural strain and stress determined the minimal deformations under the wing loading conditions, and the modal analysis showed the prominent modes that were excited by the given forces. The research findings from the structural analysis of the S1223 high-lift airfoil indicated that it is applicable for use in an unmanned aerial vehicle as well as a novel reciprocating-airfoil-driven vertical take-off and landing (VTOL) drone model. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=CFRP" title="CFRP">CFRP</a>, <a href="https://publications.waset.org/abstracts/search?q=finite%20element%20analysis" title=" finite element analysis"> finite element analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=high-lift" title=" high-lift"> high-lift</a>, <a href="https://publications.waset.org/abstracts/search?q=S1223" title=" S1223"> S1223</a>, <a href="https://publications.waset.org/abstracts/search?q=strain" title=" strain"> strain</a>, <a href="https://publications.waset.org/abstracts/search?q=stress" title=" stress"> stress</a>, <a href="https://publications.waset.org/abstracts/search?q=VTOL" title=" VTOL"> VTOL</a> </p> <a href="https://publications.waset.org/abstracts/134309/structural-and-modal-analyses-of-an-s1223-high-lift-airfoil-wing-for-drone-design" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/134309.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">230</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3156</span> Unmanned Aerial Vehicle Use for Emergency Purpose</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Shah%20S.%20M.%20A.">Shah S. M. A.</a>, <a href="https://publications.waset.org/abstracts/search?q=Aftab%20U."> Aftab U.</a> </p> <p class="card-text"><strong>Abstract:</strong></p> It is imperative in today’s world to get a real time information about different emergency situation occurred in the environment. Helicopters are mostly used to access places which are hard to access in emergencies like earthquake, floods, bridge failure or in any other disasters conditions. Use of helicopters are considered more costly to properly collect the data. Therefore a new technique has been introduced in this research to promptly collect data using drones. The drone designed in this research is based on trial and error experimental work with objective to construct an economical drone. Locally available material have been used for this purpose. And a mobile camera were also attached to prepare video during the flight. It was found that within very limited resources the result were quite successful. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=UAV" title="UAV">UAV</a>, <a href="https://publications.waset.org/abstracts/search?q=real%20time" title=" real time"> real time</a>, <a href="https://publications.waset.org/abstracts/search?q=camera" title=" camera"> camera</a>, <a href="https://publications.waset.org/abstracts/search?q=disasters" title=" disasters"> disasters</a> </p> <a href="https://publications.waset.org/abstracts/79652/unmanned-aerial-vehicle-use-for-emergency-purpose" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/79652.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">239</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3155</span> Occupational Safety and Health in the Wake of Drones</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hoda%20Rahmani">Hoda Rahmani</a>, <a href="https://publications.waset.org/abstracts/search?q=Gary%20Weckman"> Gary Weckman</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The body of research examining the integration of drones into various industries is expanding rapidly. Despite progress made in addressing the cybersecurity concerns for commercial drones, knowledge deficits remain in determining potential occupational hazards and risks of drone use to employees’ well-being and health in the workplace. This creates difficulty in identifying key approaches to risk mitigation strategies and thus reflects the need for raising awareness among employers, safety professionals, and policymakers about workplace drone-related accidents. The purpose of this study is to investigate the prevalence of and possible risk factors for drone-related mishaps by comparing the application of drones in construction with manufacturing industries. The chief reason for considering these specific sectors is to ascertain whether there exists any significant difference between indoor and outdoor flights since most construction sites use drones outside and vice versa. Therefore, the current research seeks to examine the causes and patterns of workplace drone-related mishaps and suggest possible ergonomic interventions through data collection. Potential ergonomic practices to mitigate hazards associated with flying drones could include providing operators with professional pieces of training, conducting a risk analysis, and promoting the use of personal protective equipment. For the purpose of data analysis, two data mining techniques, the random forest and association rule mining algorithms, will be performed to find meaningful associations and trends in data as well as influential features that have an impact on the occurrence of drone-related accidents in construction and manufacturing sectors. In addition, Spearman’s correlation and chi-square tests will be used to measure the possible correlation between different variables. Indeed, by recognizing risks and hazards, occupational safety stakeholders will be able to pursue data-driven and evidence-based policy change with the aim of reducing drone mishaps, increasing productivity, creating a safer work environment, and extending human performance in safe and fulfilling ways. This research study was supported by the National Institute for Occupational Safety and Health through the Pilot Research Project Training Program of the University of Cincinnati Education and Research Center Grant #T42OH008432. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=commercial%20drones" title="commercial drones">commercial drones</a>, <a href="https://publications.waset.org/abstracts/search?q=ergonomic%20interventions" title=" ergonomic interventions"> ergonomic interventions</a>, <a href="https://publications.waset.org/abstracts/search?q=occupational%20safety" title=" occupational safety"> occupational safety</a>, <a href="https://publications.waset.org/abstracts/search?q=pattern%20recognition" title=" pattern recognition"> pattern recognition</a> </p> <a href="https://publications.waset.org/abstracts/142998/occupational-safety-and-health-in-the-wake-of-drones" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/142998.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">212</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">&lsaquo;</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20in%20business&amp;page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20in%20business&amp;page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20in%20business&amp;page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20in%20business&amp;page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20in%20business&amp;page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20in%20business&amp;page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20in%20business&amp;page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20in%20business&amp;page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20in%20business&amp;page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20in%20business&amp;page=106">106</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20in%20business&amp;page=107">107</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20in%20business&amp;page=2" rel="next">&rsaquo;</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">&copy; 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>

Pages: 1 2 3 4 5 6 7 8 9 10