
Search results for: drone audio signal

class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="drone audio signal"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 2123</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: drone audio signal</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2123</span> Drone Classification Using Classification Methods Using Conventional Model With Embedded Audio-Visual Features</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hrishi%20Rakshit">Hrishi Rakshit</a>, <a href="https://publications.waset.org/abstracts/search?q=Pooneh%20Bagheri%20Zadeh"> Pooneh Bagheri Zadeh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper investigates the performance of drone classification methods using conventional DCNN with different hyperparameters, when additional drone audio data is embedded in the dataset for training and further classification. In this paper, first a custom dataset is created using different images of drones from University of South California (USC) datasets and Leeds Beckett university datasets with embedded drone audio signal. The three well-known DCNN architectures namely, Resnet50, Darknet53 and Shufflenet are employed over the created dataset tuning their hyperparameters such as, learning rates, maximum epochs, Mini Batch size with different optimizers. Precision-Recall curves and F1 Scores-Threshold curves are used to evaluate the performance of the named classification algorithms. Experimental results show that Resnet50 has the highest efficiency compared to other DCNN methods. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone%20classifications" title="drone classifications">drone classifications</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20convolutional%20neural%20network" title=" deep convolutional neural network"> deep convolutional neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=hyperparameters" title=" hyperparameters"> hyperparameters</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal" title=" drone audio signal"> drone audio signal</a> </p> <a href="https://publications.waset.org/abstracts/172929/drone-classification-using-classification-methods-using-conventional-model-with-embedded-audio-visual-features" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/172929.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">104</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2122</span> Musical Tesla Coil Controlled by an Audio Signal Processed in Matlab</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sandra%20Cuenca">Sandra Cuenca</a>, <a href="https://publications.waset.org/abstracts/search?q=Danilo%20Santana"> Danilo Santana</a>, <a href="https://publications.waset.org/abstracts/search?q=Anderson%20Reyes"> Anderson Reyes</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The following project is based on the manipulation of audio signals through the Matlab software, which has an audio signal that is modified, and its resultant obtained through the auxiliary port of the computer is passed through a signal amplifier whose amplified signal is connected to a tesla coil which has a behavior like a vumeter, the flashes at the output of the tesla coil increase and decrease its intensity depending on the audio signal in the computer and also the voltage source from which it is sent. The amplified signal then passes to the tesla coil being shown in the plasma sphere with the respective flashes; this activation is given through the specified parameters that we want to give in the MATLAB algorithm that contains the digital filters for the manipulation of our audio signal sent to the tesla coil to be displayed in a plasma sphere with flashes of the combination of colors commonly pink and purple that varies according to the tone of the song. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=auxiliary%20port" title="auxiliary port">auxiliary port</a>, <a href="https://publications.waset.org/abstracts/search?q=tesla%20coil" title=" tesla coil"> tesla coil</a>, <a href="https://publications.waset.org/abstracts/search?q=vumeter" title=" vumeter"> vumeter</a>, <a href="https://publications.waset.org/abstracts/search?q=plasma%20sphere" title=" plasma sphere"> plasma sphere</a> </p> <a href="https://publications.waset.org/abstracts/170874/musical-tesla-coil-controlled-by-an-audio-signal-processed-in-matlab" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/170874.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">90</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2121</span> Carrier Communication through Power Lines</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Pavuluri%20Gopikrishna">Pavuluri Gopikrishna</a>, <a href="https://publications.waset.org/abstracts/search?q=B.%20Neelima"> B. Neelima</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Power line carrier communication means audio power transmission via power line and reception of the amplified audio power at the receiver as in the form of speaker output signal using power line as the channel medium. The main objective of this suggested work is to transmit our message signal after frequency modulation by the help of FM modulator IC LM565 which gives output proportional to the input voltage of the input message signal. And this audio power is received from the power line by the help of isolation circuit and demodulated from IC LM565 which uses the concept of the PLL and produces FM demodulated signal to the listener. Message signal will be transmitted over the carrier signal that will be generated from the FM modulator IC LM565. Using this message signal will not damage because of no direct contact of message signal from the power line, but noise can disturb our information. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=amplification" title="amplification">amplification</a>, <a href="https://publications.waset.org/abstracts/search?q=fm%20demodulator%20ic%20565" title=" fm demodulator ic 565"> fm demodulator ic 565</a>, <a href="https://publications.waset.org/abstracts/search?q=fm%20modulator%20ic%20565" title=" fm modulator ic 565"> fm modulator ic 565</a>, <a href="https://publications.waset.org/abstracts/search?q=phase%20locked%20loop" title=" phase locked loop"> phase locked loop</a>, <a href="https://publications.waset.org/abstracts/search?q=power%20isolation" title=" power isolation"> power isolation</a> </p> <a href="https://publications.waset.org/abstracts/31017/carrier-communication-through-power-lines" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/31017.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">552</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2120</span> Embedded Electrochemistry with Miniaturized, Drone-Based, Potentiostat System for Remote Detection Chemical Warfare Agents</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Amer%20Dawoud">Amer Dawoud</a>, <a href="https://publications.waset.org/abstracts/search?q=Jesy%20Motchaalangaram"> Jesy Motchaalangaram</a>, <a href="https://publications.waset.org/abstracts/search?q=Arati%20Biswakarma"> Arati Biswakarma</a>, <a href="https://publications.waset.org/abstracts/search?q=Wujan%20Mio"> Wujan Mio</a>, <a href="https://publications.waset.org/abstracts/search?q=Karl%20Wallace"> Karl Wallace</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The development of an embedded miniaturized drone-based system for remote detection of Chemical Warfare Agents (CWA) is proposed. The paper focuses on the software/hardware system design of the electrochemical Cyclic Voltammetry (CV) and Differential Pulse Voltammetry (DPV) signal processing for future deployment on drones. The paper summarizes the progress made towards hardware and electrochemical signal processing for signature detection of CWA. Also, the miniature potentiostat signal is validated by comparing it with the high-end lab potentiostat signal. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone-based" title="drone-based">drone-based</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20detection%20chemical%20warfare%20agents" title=" remote detection chemical warfare agents"> remote detection chemical warfare agents</a>, <a href="https://publications.waset.org/abstracts/search?q=miniaturized" title=" miniaturized"> miniaturized</a>, <a href="https://publications.waset.org/abstracts/search?q=potentiostat" title=" potentiostat"> potentiostat</a> </p> <a href="https://publications.waset.org/abstracts/145007/embedded-electrochemistry-with-miniaturized-drone-based-potentiostat-system-for-remote-detection-chemical-warfare-agents" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/145007.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">136</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2119</span> Musical Tesla Coil with Faraday Box Controlled by a GNU Radio</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jairo%20Vega">Jairo Vega</a>, <a href="https://publications.waset.org/abstracts/search?q=Fabian%20Chamba"> Fabian Chamba</a>, <a href="https://publications.waset.org/abstracts/search?q=Jordy%20Urgiles"> Jordy Urgiles</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this work, the implementation of a Matlabcontrolled Musical Tesla Coil and external audio signals was presented. First, the audio signal was obtained from a mobile device and processed in Matlab to modify it, adding noise or other desired effects. Then, the processed signal was passed through a preamplifier to increase its amplitude to a level suitable for further amplification through a power amplifier, which was part of the current driver circuit of the Tesla coil. To get the Tesla coil to generate music, a circuit capable of modulating and generating the audio signal by manipulating electrical discharges was used. To visualize and listen to these discharges, a small Faraday cage was built to attenuate the external electric fields. Finally, the implementation of the musical Tesla coil was concluded. However, it was observed that the audio signal volume was very low, and the components used heated up quickly. Due to these limitations, it was determined that the project could not be connected to power for long periods of time. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Tesla%20coil" title="Tesla coil">Tesla coil</a>, <a href="https://publications.waset.org/abstracts/search?q=plasma" title=" plasma"> plasma</a>, <a href="https://publications.waset.org/abstracts/search?q=electrical%20signals" title=" electrical signals"> electrical signals</a>, <a href="https://publications.waset.org/abstracts/search?q=GNU%20Radio" title=" GNU Radio"> GNU Radio</a> </p> <a href="https://publications.waset.org/abstracts/170861/musical-tesla-coil-with-faraday-box-controlled-by-a-gnu-radio" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/170861.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">97</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2118</span> Stakeholder Analysis of Agricultural Drone Policy: A Case Study of the Agricultural Drone Ecosystem of Thailand</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Thanomsin%20Chakreeves">Thanomsin Chakreeves</a>, <a href="https://publications.waset.org/abstracts/search?q=Atichat%20Preittigun"> Atichat Preittigun</a>, <a href="https://publications.waset.org/abstracts/search?q=Ajchara%20Phu-ang"> Ajchara Phu-ang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents a stakeholder analysis of agricultural drone policies that meet the government&#39;s goal of building an agricultural drone ecosystem in Thailand. Firstly, case studies from other countries are reviewed. The stakeholder analysis method and qualitative data from the interviews are then presented including data from the Institute of Innovation and Management, the Office of National Higher Education Science Research and Innovation Policy Council, agricultural entrepreneurs and farmers. Study and interview data are then employed to describe the current ecosystem and to guide the implementation of agricultural drone policies that are suitable for the ecosystem of Thailand. Finally, policy recommendations are then made that the Thai government should adopt in the future. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone%20public%20policy" title="drone public policy">drone public policy</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20ecosystem" title=" drone ecosystem"> drone ecosystem</a>, <a href="https://publications.waset.org/abstracts/search?q=policy%20development" title=" policy development"> policy development</a>, <a href="https://publications.waset.org/abstracts/search?q=agricultural%20drone" title=" agricultural drone"> agricultural drone</a> </p> <a href="https://publications.waset.org/abstracts/132133/stakeholder-analysis-of-agricultural-drone-policy-a-case-study-of-the-agricultural-drone-ecosystem-of-thailand" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/132133.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">147</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2117</span> Study on Acoustic Source Detection Performance Improvement of Microphone Array Installed on Drones Using Blind Source Separation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Youngsun%20Moon">Youngsun Moon</a>, <a href="https://publications.waset.org/abstracts/search?q=Yeong-Ju%20Go"> Yeong-Ju Go</a>, <a href="https://publications.waset.org/abstracts/search?q=Jong-Soo%20Choi"> Jong-Soo Choi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Most drones that currently have surveillance/reconnaissance missions are basically equipped with optical equipment, but we also need to use a microphone array to estimate the location of the acoustic source. This can provide additional information in the absence of optical equipment. The purpose of this study is to estimate Direction of Arrival (DOA) based on Time Difference of Arrival (TDOA) estimation of the acoustic source in the drone. The problem is that it is impossible to measure the clear target acoustic source because of the drone noise. To overcome this problem is to separate the drone noise and the target acoustic source using Blind Source Separation(BSS) based on Independent Component Analysis(ICA). ICA can be performed assuming that the drone noise and target acoustic source are independent and each signal has non-gaussianity. For maximized non-gaussianity each signal, we use Negentropy and Kurtosis based on probability theory. As a result, we can improve TDOA estimation and DOA estimation of the target source in the noisy environment. We simulated the performance of the DOA algorithm applying BSS algorithm, and demonstrated the simulation through experiment at the anechoic wind tunnel. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=aeroacoustics" title="aeroacoustics">aeroacoustics</a>, <a href="https://publications.waset.org/abstracts/search?q=acoustic%20source%20detection" title=" acoustic source detection"> acoustic source detection</a>, <a href="https://publications.waset.org/abstracts/search?q=time%20difference%20of%20arrival" title=" time difference of arrival"> time difference of arrival</a>, <a href="https://publications.waset.org/abstracts/search?q=direction%20of%20arrival" title=" direction of arrival"> direction of arrival</a>, <a href="https://publications.waset.org/abstracts/search?q=blind%20source%20separation" title=" blind source separation"> blind source separation</a>, <a href="https://publications.waset.org/abstracts/search?q=independent%20component%20analysis" title=" independent component analysis"> independent component analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=drone" title=" drone"> drone</a> </p> <a href="https://publications.waset.org/abstracts/94236/study-on-acoustic-source-detection-performance-improvement-of-microphone-array-installed-on-drones-using-blind-source-separation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/94236.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">162</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2116</span> 3D Stereoscopic Measurements from AR Drone Squadron</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=R.%20Schurig">R. Schurig</a>, <a href="https://publications.waset.org/abstracts/search?q=T.%20D%C3%A9sesquelles"> T. Désesquelles</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Dumont"> A. Dumont</a>, <a href="https://publications.waset.org/abstracts/search?q=E.%20Lefranc"> E. Lefranc</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Lux"> A. Lux</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A cost-efficient alternative is proposed to the use of a single drone carrying multiple cameras in order to take stereoscopic images and videos during its flight. Such drone has to be particularly large enough to take off with its equipment, and stable enough in order to make valid measurements. Corresponding performance for a single aircraft usually comes with a large cost. Proposed solution consists in using multiple smaller and cheaper aircrafts carrying one camera each instead of a single expensive one. To give a proof of concept, AR drones, quad-rotor UAVs from Parrot Inc., are experimentally used. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone%20squadron" title="drone squadron">drone squadron</a>, <a href="https://publications.waset.org/abstracts/search?q=flight%20control" title=" flight control"> flight control</a>, <a href="https://publications.waset.org/abstracts/search?q=rotorcraft" title=" rotorcraft"> rotorcraft</a>, <a href="https://publications.waset.org/abstracts/search?q=Unmanned%20Aerial%20Vehicle%20%28UAV%29" title=" Unmanned Aerial Vehicle (UAV)"> Unmanned Aerial Vehicle (UAV)</a>, <a href="https://publications.waset.org/abstracts/search?q=AR%20drone" title=" AR drone"> AR drone</a>, <a href="https://publications.waset.org/abstracts/search?q=stereoscopic%20vision" title=" stereoscopic vision"> stereoscopic vision</a> </p> <a href="https://publications.waset.org/abstracts/17205/3d-stereoscopic-measurements-from-ar-drone-squadron" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/17205.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">473</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2115</span> A Study on the Improvement of Mobile Device Call Buzz Noise Caused by Audio Frequency Ground Bounce</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jangje%20Park">Jangje Park</a>, <a href="https://publications.waset.org/abstracts/search?q=So%20Young%20Kim"> So Young Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The market demand for audio quality in mobile devices continues to increase, and audible buzz noise generated in time division communication is a chronic problem that goes against the market demand. In the case of time division type communication, the RF Power Amplifier (RF PA) is driven at the audio frequency cycle, and it makes various influences on the audio signal. In this paper, we measured the ground bounce noise generated by the peak current flowing through the ground network in the RF PA with the audio frequency; it was confirmed that the noise is the cause of the audible buzz noise during a call. In addition, a grounding method of the microphone device that can improve the buzzing noise was proposed. Considering that the level of the audio signal generated by the microphone device is -38dBV based on 94dB Sound Pressure Level (SPL), even ground bounce noise of several hundred uV will fall within the range of audible noise if it is induced by the audio amplifier. Through the grounding method of the microphone device proposed in this paper, it was confirmed that the audible buzz noise power density at the RF PA driving frequency was improved by more than 5dB under the conditions of the Printed Circuit Board (PCB) used in the experiment. A fundamental improvement method was presented regarding the buzzing noise during a mobile phone call. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=audio%20frequency" title="audio frequency">audio frequency</a>, <a href="https://publications.waset.org/abstracts/search?q=buzz%20noise" title=" buzz noise"> buzz noise</a>, <a href="https://publications.waset.org/abstracts/search?q=ground%20bounce" title=" ground bounce"> ground bounce</a>, <a href="https://publications.waset.org/abstracts/search?q=microphone%20grounding" title=" microphone grounding"> microphone grounding</a> </p> <a href="https://publications.waset.org/abstracts/150713/a-study-on-the-improvement-of-mobile-device-call-buzz-noise-caused-by-audio-frequency-ground-bounce" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/150713.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">136</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2114</span> Comparison of Direction of Arrival Estimation Method for Drone Based on Phased Microphone Array</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jiwon%20Lee">Jiwon Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=Yeong-Ju%20Go"> Yeong-Ju Go</a>, <a href="https://publications.waset.org/abstracts/search?q=Jong-Soo%20Choi"> Jong-Soo Choi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Drones were first developed for military use and were used in World War 1. But recently drones have been used in a variety of fields. Several companies actively utilize drone technology to strengthen their services, and in agriculture, drones are used for crop monitoring and sowing. Other people use drones for hobby activities such as photography. However, as the range of use of drones expands rapidly, problems caused by drones such as improperly flying, privacy and terrorism are also increasing. As the need for monitoring and tracking of drones increases, researches are progressing accordingly. The drone detection system estimates the position of the drone using the physical phenomena that occur when the drones fly. The drone detection system measures being developed utilize many approaches, such as radar, infrared camera, and acoustic detection systems. Among the various drone detection system, the acoustic detection system is advantageous in that the microphone array system is small, inexpensive, and easy to operate than other systems. In this paper, the acoustic signal is acquired by using minimum microphone when drone is flying, and direction of drone is estimated. When estimating the Direction of Arrival(DOA), there is a method of calculating the DOA based on the Time Difference of Arrival(TDOA) and a method of calculating the DOA based on the beamforming. The TDOA technique requires less number of microphones than the beamforming technique, but is weak in noisy environments and can only estimate the DOA of a single source. The beamforming technique requires more microphones than the TDOA technique. However, it is strong against the noisy environment and it is possible to simultaneously estimate the DOA of several drones. 
When estimating the DOA using acoustic signals emitted from the drone, the position of the drone cannot be measured directly; only the direction can be estimated. To overcome this limitation, this work shows how to estimate the position of drones by arranging multiple microphone arrays. The arrays used in the experiments were four tetrahedral microphone arrays. We simulated the performance of each DOA algorithm and demonstrated the simulation results through experiments.
Keywords: acoustic sensing, direction of arrival, drone detection, microphone array
Procedia: https://publications.waset.org/abstracts/94230/comparison-of-direction-of-arrival-estimation-method-for-drone-based-on-phased-microphone-array | PDF: https://publications.waset.org/abstracts/94230.pdf | Downloads: 160

2113. A Research on the Benefits of Drone Usage in Industry by Determining Companies Using Drone in the World
Authors: Ahmet Akdemir, Güzide Karakuş, Leyla Polat
Abstract: Aviation, born of the human desire to fly, has not only made life easier through its great contribution to humanity; it has also accelerated globalization by reducing the distances between countries. Looking back, the growth rate of the aviation industry has reached a level once undreamed of. Today, the latest step in aviation is the unmanned aerial vehicle, which flies to desired coordinates without an onboard pilot. For these vehicles, two different control systems have been developed. In the first type of control, an unmanned aerial vehicle (UAV) moves according to the instructions of a remote controller; a UAV flown with a remote controller is called a drone and can be used personally. In the second, a flight plan is programmed and loaded into the UAV before flight. Recently, drones have started to be used in previously unimagined areas and offer specific, important benefits for many industries. Within this framework, this study asks whether drone usage is beneficial for businesses. To answer this question, the basic methodology is to identify businesses around the world that use drones, determine their purposes for using them, and compare their economic performance before and after drone adoption.
At the end of this study, it is seen that many companies in different business areas use drones for logistics support, and that doing so makes their work easier than before. This paper contributes to the academic literature on the subject and introduces the benefits of drone usage for businesses. In addition, it encourages businesses to keep pace with this technological age by following developments in drone technology.
Keywords: aviation, drone, drone in business, unmanned aerial vehicle
Procedia: https://publications.waset.org/abstracts/77049/a-research-on-the-benefits-of-drone-usage-in-industry-by-determining-companies-using-drone-in-the-world | PDF: https://publications.waset.org/abstracts/77049.pdf | Downloads: 255

2112. A Combined Feature Extraction and Thresholding Technique for Silence Removal in Percussive Sounds
Authors: B. Kishore Kumar, Pogula Rakesh, T. Kishore Kumar
Abstract: Music analysis is a part of audio content analysis in which music is analyzed using different features of the audio signal. The first step in music analysis is to divide the music signal into sections based on its feature profiles. In this paper, we present a music segmentation technique that effectively segments the signal, together with a thresholding technique that removes silence from the sounds produced by percussive instruments, using two features of music: signal energy and spectral centroid. The proposed method imposes thresholds on both features, and these thresholds vary with the music signal. Based on the thresholds, the silent parts are removed and the segmentation is performed. The effectiveness of the proposed method is analyzed using MATLAB.
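A minimal sketch of the two features named above, short-time signal energy and spectral centroid, with simple signal-dependent thresholds for flagging silent frames (the frame length and threshold factors are assumptions; the paper's MATLAB implementation may differ):

```python
# Sketch: per-frame energy and spectral centroid with signal-dependent thresholds.
import numpy as np

def silence_mask(x, fs, frame_len=1024):
    n_frames = len(x) // frame_len
    frames = x[: n_frames * frame_len].reshape(n_frames, frame_len).astype(np.float64)

    energy = np.mean(frames ** 2, axis=1)
    spectrum = np.abs(np.fft.rfft(frames, axis=1))
    freqs = np.fft.rfftfreq(frame_len, d=1 / fs)
    centroid = (spectrum * freqs).sum(axis=1) / (spectrum.sum(axis=1) + 1e-12)

    # Signal-dependent thresholds (assumed): fractions of the median feature values.
    energy_thr = 0.1 * np.median(energy)
    centroid_thr = 0.5 * np.median(centroid)
    return (energy < energy_thr) & (centroid < centroid_thr)   # True = silent frame
```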
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=percussive%20sounds" title="percussive sounds">percussive sounds</a>, <a href="https://publications.waset.org/abstracts/search?q=spectral%20centroid" title=" spectral centroid"> spectral centroid</a>, <a href="https://publications.waset.org/abstracts/search?q=spectral%20energy" title=" spectral energy"> spectral energy</a>, <a href="https://publications.waset.org/abstracts/search?q=silence%20removal" title=" silence removal"> silence removal</a>, <a href="https://publications.waset.org/abstracts/search?q=feature%20extraction" title=" feature extraction"> feature extraction</a> </p> <a href="https://publications.waset.org/abstracts/25510/a-combined-feature-extraction-and-thresholding-technique-for-silence-removal-in-percussive-sounds" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/25510.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">593</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2111</span> Voice Signal Processing and Coding in MATLAB Generating a Plasma Signal in a Tesla Coil for a Security System</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Juan%20Jimenez">Juan Jimenez</a>, <a href="https://publications.waset.org/abstracts/search?q=Erika%20Yambay"> Erika Yambay</a>, <a href="https://publications.waset.org/abstracts/search?q=Dayana%20Pilco"> Dayana Pilco</a>, <a href="https://publications.waset.org/abstracts/search?q=Brayan%20Parra"> Brayan Parra</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents an investigation of voice signal processing and coding using MATLAB, with the objective of generating a plasma signal on a Tesla coil within a security system. The approach focuses on using advanced voice signal processing techniques to encode and modulate the audio signal, which is then amplified and applied to a Tesla coil. The result is the creation of a striking visual effect of voice-controlled plasma with specific applications in security systems. The article explores the technical aspects of voice signal processing, the generation of the plasma signal, and its relationship to security. The implications and creative potential of this technology are discussed, highlighting its relevance at the forefront of research in signal processing and visual effect generation in the field of security systems. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=voice%20signal%20processing" title="voice signal processing">voice signal processing</a>, <a href="https://publications.waset.org/abstracts/search?q=voice%20signal%20coding" title=" voice signal coding"> voice signal coding</a>, <a href="https://publications.waset.org/abstracts/search?q=MATLAB" title=" MATLAB"> MATLAB</a>, <a href="https://publications.waset.org/abstracts/search?q=plasma%20signal" title=" plasma signal"> plasma signal</a>, <a href="https://publications.waset.org/abstracts/search?q=Tesla%20coil" title=" Tesla coil"> Tesla coil</a>, <a href="https://publications.waset.org/abstracts/search?q=security%20system" title=" security system"> security system</a>, <a href="https://publications.waset.org/abstracts/search?q=visual%20effects" title=" visual effects"> visual effects</a>, <a href="https://publications.waset.org/abstracts/search?q=audiovisual%20interaction" title=" audiovisual interaction"> audiovisual interaction</a> </p> <a href="https://publications.waset.org/abstracts/170828/voice-signal-processing-and-coding-in-matlab-generating-a-plasma-signal-in-a-tesla-coil-for-a-security-system" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/170828.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">93</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2110</span> Robust and Transparent Spread Spectrum Audio Watermarking</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ali%20Akbar%20Attari">Ali Akbar Attari</a>, <a href="https://publications.waset.org/abstracts/search?q=Ali%20Asghar%20Beheshti%20Shirazi"> Ali Asghar Beheshti Shirazi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, we propose a blind and robust audio watermarking scheme based on spread spectrum in Discrete Wavelet Transform (DWT) domain. Watermarks are embedded in the low-frequency coefficients, which is less audible. The key idea is dividing the audio signal into small frames, and magnitude of the 6<sup>th</sup> level of DWT approximation coefficients is modifying based upon the Direct Sequence Spread Spectrum (DSSS) technique. Also, the psychoacoustic model for enhancing in imperceptibility, as well as Savitsky-Golay filter for increasing accuracy in extraction, is used. The experimental results illustrate high robustness against most common attacks, i.e. Gaussian noise addition, Low pass filter, Resampling, Requantizing, MP3 compression, without significant perceptual distortion (ODG is higher than -1). The proposed scheme has about 83 bps data payload. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=audio%20watermarking" title="audio watermarking">audio watermarking</a>, <a href="https://publications.waset.org/abstracts/search?q=spread%20spectrum" title=" spread spectrum"> spread spectrum</a>, <a href="https://publications.waset.org/abstracts/search?q=discrete%20wavelet%20transform" title=" discrete wavelet transform"> discrete wavelet transform</a>, <a href="https://publications.waset.org/abstracts/search?q=psychoacoustic" title=" psychoacoustic"> psychoacoustic</a>, <a href="https://publications.waset.org/abstracts/search?q=Savitsky-Golay%20filter" title=" Savitsky-Golay filter"> Savitsky-Golay filter</a> </p> <a href="https://publications.waset.org/abstracts/86040/robust-and-transparent-spread-spectrum-audio-watermarking" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/86040.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">200</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2109</span> Controlling Drone Flight Missions through Natural Language Processors Using Artificial Intelligence</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sylvester%20Akpah">Sylvester Akpah</a>, <a href="https://publications.waset.org/abstracts/search?q=Selasi%20Vondee"> Selasi Vondee</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Unmanned Aerial Vehicles (UAV) as they are also known, drones have attracted increasing attention in recent years due to their ubiquitous nature and boundless applications in the areas of communication, surveying, aerial photography, weather forecasting, medical delivery, surveillance amongst others. Operated remotely in real-time or pre-programmed, drones can fly autonomously or on pre-defined routes. The application of these aerial vehicles has successfully penetrated the world due to technological evolution, thus a lot more businesses are utilizing their capabilities. Unfortunately, while drones are replete with the benefits stated supra, they are riddled with some problems, mainly attributed to the complexities in learning how to master drone flights, collision avoidance and enterprise security. Additional challenges, such as the analysis of flight data recorded by sensors attached to the drone may take time and require expert help to analyse and understand. This paper presents an autonomous drone control system using a chatbot. The system allows for easy control of drones using conversations with the aid of Natural Language Processing, thus to reduce the workload needed to set up, deploy, control, and monitor drone flight missions. The results obtained at the end of the study revealed that the drone connected to the chatbot was able to initiate flight missions with just text and voice commands, enable conversation and give real-time feedback from data and requests made to the chatbot. The results further revealed that the system was able to process natural language and produced human-like conversational abilities using Artificial Intelligence (Natural Language Understanding). 
It is recommended that radio signal adapters be used instead of wireless connections in order to increase the range of communication with the aerial vehicle.
Keywords: artificial intelligence, chatbot, natural language processing, unmanned aerial vehicle
Procedia: https://publications.waset.org/abstracts/116870/controlling-drone-flight-missions-through-natural-language-processors-using-artificial-intelligence | PDF: https://publications.waset.org/abstracts/116870.pdf | Downloads: 142

2108. Spatial Audio Player Using Musical Genre Classification
Authors: Jun-Yong Lee, Hyoung-Gook Kim
Abstract: In this paper, we propose a smart music player that combines musical genre classification and spatial audio processing. The musical genre is classified based on content analysis of the musical segment detected in the audio stream. In parallel with the classification, spatial audio quality is achieved by adding artificial reverberation in a virtual acoustic space to the input mono sound. The spatial sound is then boosted with genre-dependent frequency gains when played back. Experiments measured the accuracy of detecting the musical segment in the audio stream and of its genre classification. A listening test was performed on the spatial audio processing based on the virtual acoustic space.
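A minimal sketch of the playback chain described above: artificial reverberation from a synthetic decaying impulse response, followed by a genre-dependent gain. The reverberation time, wet/dry mix, and genre-to-gain table are invented for illustration.

```python
# Sketch: synthetic-reverb "virtual acoustic space" plus a genre-dependent boost.
import numpy as np
from scipy.signal import fftconvolve, butter, sosfilt

def add_reverb(mono, fs, rt60=0.6, wet=0.3):
    t = np.arange(0, rt60, 1 / fs)
    ir = np.random.randn(len(t)) * np.exp(-6.91 * t / rt60)   # ~60 dB decay over rt60
    wet_sig = fftconvolve(mono, ir)[: len(mono)]
    return (1 - wet) * mono + wet * wet_sig / (np.max(np.abs(wet_sig)) + 1e-12)

def genre_boost(x, fs, genre):
    gains = {"rock": (1.4, "low"), "classical": (1.2, "high")}  # illustrative table
    gain, band = gains.get(genre, (1.0, "low"))
    sos = butter(2, 250, btype="low" if band == "low" else "high", fs=fs, output="sos")
    return x + (gain - 1.0) * sosfilt(sos, x)
```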
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=automatic%20equalization" title="automatic equalization">automatic equalization</a>, <a href="https://publications.waset.org/abstracts/search?q=genre%20classification" title=" genre classification"> genre classification</a>, <a href="https://publications.waset.org/abstracts/search?q=music%20segment%20detection" title=" music segment detection"> music segment detection</a>, <a href="https://publications.waset.org/abstracts/search?q=spatial%20audio%20processing" title=" spatial audio processing"> spatial audio processing</a> </p> <a href="https://publications.waset.org/abstracts/7561/spatial-audio-player-using-musical-genre-classification" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/7561.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">429</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2107</span> Multiperson Drone Control with Seamless Pilot Switching Using Onboard Camera and Openpose Real-Time Keypoint Detection</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Evan%20Lowhorn">Evan Lowhorn</a>, <a href="https://publications.waset.org/abstracts/search?q=Rocio%20Alba-Flores"> Rocio Alba-Flores</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Traditional classification Convolutional Neural Networks (CNN) attempt to classify an image in its entirety. This becomes problematic when trying to perform classification with a drone’s camera in real-time due to unpredictable backgrounds. Object detectors with bounding boxes can be used to isolate individuals and other items, but the original backgrounds remain within these boxes. These basic detectors have been regularly used to determine what type of object an item is, such as “person” or “dog.” Recent advancement in computer vision, particularly with human imaging, is keypoint detection. Human keypoint detection goes beyond bounding boxes to fully isolate humans and plot points, or Regions of Interest (ROI), on their bodies within an image. ROIs can include shoulders, elbows, knees, heads, etc. These points can then be related to each other and used in deep learning methods such as pose estimation. For drone control based on human motions, poses, or signals using the onboard camera, it is important to have a simple method for pilot identification among multiple individuals while also giving the pilot fine control options for the drone. To achieve this, the OpenPose keypoint detection network was used with body and hand keypoint detection enabled. OpenPose supports the ability to combine multiple keypoint detection methods in real-time with a single network. Body keypoint detection allows simple poses to act as the pilot identifier. The hand keypoint detection with ROIs for each finger can then offer a greater variety of signal options for the pilot once identified. For this work, the individual must raise their non-control arm to be identified as the operator and send commands with the hand on their other arm. The drone ignores all other individuals in the onboard camera feed until the current operator lowers their non-control arm. 
When another individual wishes to operate the drone, they simply raise their arm once the current operator relinquishes control, and they can then begin controlling the drone with their other hand. This is all performed mid-flight, with no landing or script editing required. When using a desktop with a discrete NVIDIA GPU, the drone's 2.4 GHz Wi-Fi connection, combined with restricting OpenPose to body and hand detection only, allows this control method to perform as intended while maintaining the responsiveness required for practical use.
Keywords: computer vision, drone control, keypoint detection, openpose
Procedia: https://publications.waset.org/abstracts/139752/multiperson-drone-control-with-seamless-pilot-switching-using-onboard-camera-and-openpose-real-time-keypoint-detection | PDF: https://publications.waset.org/abstracts/139752.pdf | Downloads: 184

2106. Safe Zone: A Framework for Detecting and Preventing Drones Misuse
Authors: AlHanoof A. Alharbi, Fatima M. Alamoudi, Razan A. Albrahim, Sarah F. Alharbi, Abdullah M Almuhaideb, Norah A. Almubairik, Abdulrahman Alharby, Naya M. Nagy
Abstract: Recently, drones have attracted rapid interest in different industries worldwide due to their powerful impact. However, limitations still exist in this emerging technology, especially with respect to privacy violations. These aircraft consistently threaten the security of entities by entering restricted areas accidentally or deliberately. Therefore, this research project aims to develop a drone detection and prevention mechanism to protect restricted areas. Until now, none of the existing solutions has met the optimal requirements of detection, which are cost-effectiveness, high accuracy, long range, convenience, immunity to noise, and generalization. In terms of prevention, existing methods focus on impractical solutions such as catching a drone with a larger drone, training an eagle, or using a gun.
In addition, the practical solutions have limitations, such as No-Fly Zone restrictions and PITBULL jammers. According to our study and analysis of previous related works, none of the existing solutions includes detection and prevention at the same time. The proposed solution is a combination of detection and prevention methods: for detection, a passive radar is used to properly identify the drone against any other possible flying objects, and for prevention, jamming signals and a forced safe landing of the drone are integrated to stop its operation. We believe that applying this mechanism will limit drone privacy-invasion incidents against highly restricted properties. Consequently, it will effectively accelerate drone usage at personal and governmental levels.
Keywords: detection, drone, jamming, prevention, privacy, RF, radar, UAV
Procedia: https://publications.waset.org/abstracts/106189/safe-zone-a-framework-for-detecting-and-preventing-drones-misuse | PDF: https://publications.waset.org/abstracts/106189.pdf | Downloads: 211

2105. A Power Management System for Indoor Micro-Drones in GPS-Denied Environments
Authors: Yendo Hu, Xu-Yu Wu, Dylan Oh
Abstract: GPS-denied drones open the possibility of indoor applications, including dynamic aerial surveillance, inspection, safety enforcement, and discovery. Indoor swarming further enhances these applications in accuracy, robustness, operational time, and coverage. For micro-drones, power management becomes a critical issue, given the battery payload restriction. This paper proposes an application-enabling battery replacement solution that extends the micro-drone's active phase without human intervention. First, a framework to quantify the effectiveness of a power management solution for a drone fleet is proposed: the operation-to-non-operation ratio (ONR) gives a quantitative benchmark for measuring the effectiveness of a power management solution. Second, a survey was carried out to evaluate the ONR performance of the various existing solutions.
Third, through analysis, this paper proposes a solution tailored to the indoor micro-drone and suitable for swarming applications. The proposed automated battery-replacement solution, together with a modified micro-drone architecture, was implemented on the associated micro-drone. Fourth, the system was tested and compared with the various solutions within the industry. Results show that the proposed solution achieves an ONR value of 31, a one-fold improvement over the best alternative option. The cost analysis shows a manufacturing cost of $25, which makes this approach viable for cost-sensitive markets (e.g., consumer). Further challenges remain in the areas of drone design for automated battery replacement, landing pad/drone production, high-precision landing control, and ONR improvement. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=micro-drone" title="micro-drone">micro-drone</a>, <a href="https://publications.waset.org/abstracts/search?q=battery%20swap" title=" battery swap"> battery swap</a>, <a href="https://publications.waset.org/abstracts/search?q=battery%20replacement" title=" battery replacement"> battery replacement</a>, <a href="https://publications.waset.org/abstracts/search?q=battery%20recharge" title=" battery recharge"> battery recharge</a>, <a href="https://publications.waset.org/abstracts/search?q=landing%20pad" title=" landing pad"> landing pad</a>, <a href="https://publications.waset.org/abstracts/search?q=power%20management" title=" power management"> power management</a> </p> <a href="https://publications.waset.org/abstracts/171391/a-power-management-system-for-indoor-micro-drones-in-gps-denied-environments" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/171391.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">119</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2104</span> Atomic Decomposition Audio Data Compression and Denoising Using Sparse Dictionary Feature Learning</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=T.%20Bryan">T. Bryan </a>, <a href="https://publications.waset.org/abstracts/search?q=V.%20Kepuska"> V. Kepuska</a>, <a href="https://publications.waset.org/abstracts/search?q=I.%20Kostnaic"> I. Kostnaic</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A method of data compression and denoising is introduced that is based on atomic decomposition of audio data using “basis vectors” that are learned from the audio data itself. The basis vectors are shown to provide higher data compression and better signal-to-noise enhancement than the Gabor and gammatone “seed atoms” that were used to generate them. The basis vectors are the input weights of a Sparse AutoEncoder (SAE) that is trained using “envelope samples” of windowed segments of the audio data. The envelope samples are extracted from the audio data by performing atomic decomposition with Gabor or gammatone seed atoms via matching pursuit, which identifies segments of audio data that are locally coherent with the seed atoms. 
The envelope samples are formed by taking the Kronecker products of the atomic envelopes with the locally coherent data segments. Oracle signal-to-noise ratio (SNR) versus data compression curves are generated for the seed atoms as well as for the basis vectors learned from Gabor and gammatone seed atoms. SNR versus data compression curves are generated for speech signals as well as for early American music recordings. The basis vectors are shown to have higher denoising capability at data compression rates ranging from 90% to 99.84% for both speech and music. Envelope samples are displayed as images by folding the time series into column vectors. This display method is used to compare the output of the SAE with the envelope samples that produced it. The basis vectors are also displayed as images. Sparsity is shown to play an important role in producing the basis vectors with the highest denoising capability. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=sparse%20dictionary%20learning" title="sparse dictionary learning">sparse dictionary learning</a>, <a href="https://publications.waset.org/abstracts/search?q=autoencoder" title=" autoencoder"> autoencoder</a>, <a href="https://publications.waset.org/abstracts/search?q=sparse%20autoencoder" title=" sparse autoencoder"> sparse autoencoder</a>, <a href="https://publications.waset.org/abstracts/search?q=basis%20vectors" title=" basis vectors"> basis vectors</a>, <a href="https://publications.waset.org/abstracts/search?q=atomic%20decomposition" title=" atomic decomposition"> atomic decomposition</a>, <a href="https://publications.waset.org/abstracts/search?q=envelope%20sampling" title=" envelope sampling"> envelope sampling</a>, <a href="https://publications.waset.org/abstracts/search?q=envelope%20samples" title=" envelope samples"> envelope samples</a>, <a href="https://publications.waset.org/abstracts/search?q=Gabor" title=" Gabor"> Gabor</a>, <a href="https://publications.waset.org/abstracts/search?q=gammatone" title=" gammatone"> gammatone</a>, <a href="https://publications.waset.org/abstracts/search?q=matching%20pursuit" title=" matching pursuit"> matching pursuit</a> </p> <a href="https://publications.waset.org/abstracts/42586/atomic-decomposition-audio-data-compression-and-denoising-using-sparse-dictionary-feature-learning" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/42586.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">253</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2103</span> Should the U.S. Rely on Drone Strikes to Combat the Islamic State? Why Deploying a Drone Campaign against ISIS Will Do Nothing to Address the Causes of the Insurgency or Prevent Its Resurgence?</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Danielle%20Jablanski">Danielle Jablanski</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This article addresses the use of drone strikes under international law and the intersection between Islamic law and current terrorist trends worldwide. It breaks down the legality of drone strikes under international law and dissects certain aspects of their usage in modern warfare, i.e., 
the concepts of direct participation in hostilities and the role of CIA operators. The article then looks at international paradigms of law enforcement versus the use of military force in relation to terrorism. Lastly, it describes traditional aspects of Islamic law and several present-day interpretations of that law as applied to widespread campaigns of terrorism, namely that of the group ISIS (ISIL) operating across the battlegrounds of Iraq and Syria. The piece concludes with appraisals for moving forward that focus on homing in on the causes of terrorism and on critiques of solely military campaigns to dismantle or disrupt terror organizations and their breeding grounds. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=international%20law" title="international law">international law</a>, <a href="https://publications.waset.org/abstracts/search?q=terrorism" title=" terrorism"> terrorism</a>, <a href="https://publications.waset.org/abstracts/search?q=ISIS" title=" ISIS"> ISIS</a>, <a href="https://publications.waset.org/abstracts/search?q=islamic%20law" title=" islamic law"> islamic law</a> </p> <a href="https://publications.waset.org/abstracts/24847/should-the-us-rely-on-drone-strikes-to-combat-the-islamic-state-why-deploying-a-drone-campaign-against-isis-will-do-nothing-to-address-the-causes-of-the-insurgency-or-prevent-its-resurgence" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/24847.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">475</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2102</span> Mathematical Model That Using Scrambling and Message Integrity Methods in Audio Steganography</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mohammed%20Salem%20Atoum">Mohammed Salem Atoum</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The success of audio steganography lies in ensuring the imperceptibility of the embedded message in the stego file and in withstanding any form of intentional or unintentional degradation of the message (robustness). Audio steganographic techniques that utilize the LSB of the audio stream to embed the message have gained a lot of popularity over the years for meeting perceptual transparency, robustness, and capacity requirements. This research proposes an XLSB technique in order to circumvent the weaknesses observed in the LSB technique. A scrambling technique is introduced in two steps: partitioning the message into blocks, followed by permuting each block in order to confuse the contents of the message. The message is embedded in an MP3 audio sample. After extracting the message, the permutation codebook is used to re-order it into its original form. MD5 and SHA-256 digests are used to verify whether the message was altered during transmission. Experimental results show that XLSB performs better than LSB. 
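<p class="card-text">For the scrambling and integrity-check steps described above, a minimal sketch, assuming a fixed block size and a randomly generated permutation codebook (illustrative choices, not the paper's parameters); the XLSB embedding into the MP3 carrier is omitted:</p> <pre><code class="language-python">import hashlib
import random

def scramble(message: bytes, block_size: int, seed: int):
    """Partition the message into blocks, then permute each block."""
    rng = random.Random(seed)
    blocks = [message[i:i + block_size] for i in range(0, len(message), block_size)]
    codebook = []                       # one permutation per block, kept for descrambling
    scrambled = bytearray()
    for block in blocks:
        perm = list(range(len(block)))
        rng.shuffle(perm)
        codebook.append(perm)
        scrambled.extend(block[p] for p in perm)
    return bytes(scrambled), codebook

def descramble(scrambled: bytes, block_size: int, codebook):
    """Invert each block permutation using the stored codebook."""
    out = bytearray()
    for idx, perm in enumerate(codebook):
        block = scrambled[idx * block_size:(idx + 1) * block_size]
        restored = bytearray(len(block))
        for dst, src in enumerate(perm):
            restored[src] = block[dst]
        out.extend(restored)
    return bytes(out)

message = b"meet at dawn"
digest_sent = hashlib.sha256(message).hexdigest()   # integrity tag (MD5 could be used as well)
payload, codebook = scramble(message, block_size=4, seed=42)

# ... the scrambled payload would be embedded into / extracted from the MP3 cover here ...

recovered = descramble(payload, block_size=4, codebook=codebook)
assert hashlib.sha256(recovered).hexdigest() == digest_sent   # message arrived unaltered
</code></pre>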
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=XLSB" title="XLSB">XLSB</a>, <a href="https://publications.waset.org/abstracts/search?q=scrambling" title=" scrambling"> scrambling</a>, <a href="https://publications.waset.org/abstracts/search?q=audio%20steganography" title=" audio steganography"> audio steganography</a>, <a href="https://publications.waset.org/abstracts/search?q=security" title=" security"> security</a> </p> <a href="https://publications.waset.org/abstracts/42449/mathematical-model-that-using-scrambling-and-message-integrity-methods-in-audio-steganography" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/42449.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">363</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2101</span> Freedom of Expression and Its Restriction in Audiovisual Media</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sevil%20Yildiz">Sevil Yildiz</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Audio visual communication is a type of collective expression. Collective expression activity informs the masses, gives direction to opinions and establishes public opinion. Due to these characteristics, audio visual communication must be subjected to special restrictions. This has been stipulated in both the Constitution and the European Human Rights Agreement. This paper aims to review freedom of expression and its restriction in audio visual media. For this purpose, the authorisation of the Radio and Television Supreme Council to impose sanctions as an independent administrative authority empowered to regulate the field of audio visual communication has been reviewed with regard to freedom of expression and its limits. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=audio%20visual%20media" title="audio visual media">audio visual media</a>, <a href="https://publications.waset.org/abstracts/search?q=freedom%20of%20expression" title=" freedom of expression"> freedom of expression</a>, <a href="https://publications.waset.org/abstracts/search?q=its%20limits" title=" its limits"> its limits</a>, <a href="https://publications.waset.org/abstracts/search?q=radio%20and%20television%20supreme%20council" title=" radio and television supreme council"> radio and television supreme council</a> </p> <a href="https://publications.waset.org/abstracts/39325/freedom-of-expression-and-its-restriction-in-audiovisual-media" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/39325.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">326</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2100</span> Subband Coding and Glottal Closure Instant (GCI) Using SEDREAMS Algorithm</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Harisudha%20Kuresan">Harisudha Kuresan</a>, <a href="https://publications.waset.org/abstracts/search?q=Dhanalakshmi%20Samiappan"> Dhanalakshmi Samiappan</a>, <a href="https://publications.waset.org/abstracts/search?q=T.%20Rama%20Rao"> T. Rama Rao</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In modern telecommunication applications, Glottal Closure Instants location finding is important and is directly evaluated from the speech waveform. Here, we study the GCI using Speech Event Detection using Residual Excitation and the Mean Based Signal (SEDREAMS) algorithm. Speech coding uses parameter estimation using audio signal processing techniques to model the speech signal combined with generic data compression algorithms to represent the resulting modeled in a compact bit stream. This paper proposes a sub-band coder SBC, which is a type of transform coding and its performance for GCI detection using SEDREAMS are evaluated. In SBCs code in the speech signal is divided into two or more frequency bands and each of these sub-band signal is coded individually. The sub-bands after being processed are recombined to form the output signal, whose bandwidth covers the whole frequency spectrum. Then the signal is decomposed into low and high-frequency components and decimation and interpolation in frequency domain are performed. The proposed structure significantly reduces error, and precise locations of Glottal Closure Instants (GCIs) are found using SEDREAMS algorithm. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=SEDREAMS" title="SEDREAMS">SEDREAMS</a>, <a href="https://publications.waset.org/abstracts/search?q=GCI" title=" GCI"> GCI</a>, <a href="https://publications.waset.org/abstracts/search?q=SBC" title=" SBC"> SBC</a>, <a href="https://publications.waset.org/abstracts/search?q=GOI" title=" GOI"> GOI</a> </p> <a href="https://publications.waset.org/abstracts/56336/subband-coding-and-glottal-closure-instant-gci-using-sedreams-algorithm" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/56336.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">356</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2099</span> Drone Swarm Routing and Scheduling for Off-shore Wind Turbine Blades Inspection</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mohanad%20Al-Behadili">Mohanad Al-Behadili</a>, <a href="https://publications.waset.org/abstracts/search?q=Xiang%20Song"> Xiang Song</a>, <a href="https://publications.waset.org/abstracts/search?q=Djamila%20Ouelhadj"> Djamila Ouelhadj</a>, <a href="https://publications.waset.org/abstracts/search?q=Alex%20Fraess-Ehrfeld"> Alex Fraess-Ehrfeld</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In off-shore wind farms, turbine blade inspection accessibility under various sea states is very challenging and greatly affects the downtime of wind turbines. Maintenance of any offshore system is not an easy task due to the restricted logistics and accessibility. The multirotor unmanned helicopter is of increasing interest in inspection applications due to its manoeuvrability and payload capacity. These advantages increase when many of them are deployed simultaneously in a swarm. Hence this paper proposes a drone swarm framework for inspecting offshore wind turbine blades and nacelles so as to reduce downtime. One of the big challenges of this task is that when operating a drone swarm, an individual drone may not have enough power to fly and communicate during missions and it has no capability of refueling due to its small size. Once the drone power is drained, there are no signals transmitted and the links become intermittent. Vessels equipped with 5G masts and small power units are utilised as platforms for drones to recharge/swap batteries. The research work aims at designing a smart energy management system, which provides automated vessel and drone routing and recharging plans. To achieve this goal, a novel mathematical optimisation model is developed with the main objective of minimising the number of drones and vessels, which carry the charging stations, and the downtime of the wind turbines. There are a number of constraints to be considered, such as each wind turbine must be inspected once and only once by one drone; each drone can inspect at most one wind turbine after recharging, then fly back to the charging station; collision should be avoided during the drone flying; all wind turbines in the wind farm should be inspected within the given time window. We have developed a real-time Ant Colony Optimisation (ACO) algorithm to generate real-time and near-optimal solutions to the drone swarm routing problem. 
The scheduler generates efficient, real-time solutions that indicate the inspection tasks, the time windows, and the optimal routes for the drones to access the turbines. Experiments are conducted to evaluate the quality of the solutions generated by the ACO algorithm. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone%20swarm" title="drone swarm">drone swarm</a>, <a href="https://publications.waset.org/abstracts/search?q=routing" title=" routing"> routing</a>, <a href="https://publications.waset.org/abstracts/search?q=scheduling" title=" scheduling"> scheduling</a>, <a href="https://publications.waset.org/abstracts/search?q=optimisation%20model" title=" optimisation model"> optimisation model</a>, <a href="https://publications.waset.org/abstracts/search?q=ant%20colony%20optimisation" title=" ant colony optimisation"> ant colony optimisation</a> </p> <a href="https://publications.waset.org/abstracts/141935/drone-swarm-routing-and-scheduling-for-off-shore-wind-turbine-blades-inspection" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/141935.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">265</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2098</span> The Influence of Audio on Perceived Quality of Segmentation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Silvio%20Ricardo%20Rodrigues%20Sanches">Silvio Ricardo Rodrigues Sanches</a>, <a href="https://publications.waset.org/abstracts/search?q=Bianca%20Cogo%20Barbosa"> Bianca Cogo Barbosa</a>, <a href="https://publications.waset.org/abstracts/search?q=Beatriz%20Regina%20Brum"> Beatriz Regina Brum</a>, <a href="https://publications.waset.org/abstracts/search?q=Cl%C3%A9ber%20Gimenez%20Corr%C3%AAa"> Cléber Gimenez Corrêa</a> </p> <p class="card-text"><strong>Abstract:</strong></p> To evaluate the quality of a segmentation algorithm, the authors use subjective or objective metrics. Although subjective metrics are more accurate than objective ones, objective metrics do not require user feedback to test an algorithm; they require subjective experiments only during their development. Subjective experiments typically display to users some videos (generated from frames with segmentation errors) that simulate the environment of an application domain, and this user feedback is crucial information for metric definition. In the subjective experiments applied to develop some state-of-the-art metrics used to test segmentation algorithms, the videos displayed during the experiments did not contain audio. Audio is an essential component in applications such as videoconferencing and augmented reality. If audio influences the user's perception, using only videos without audio in subjective experiments can compromise the efficiency of an objective metric generated using data from these experiments. This work aims to identify whether audio influences the user's perception of segmentation quality in background-substitution applications with audio. The proposed approach used a subjective method based on formal video quality assessment methods. The results showed that audio influences the segmentation quality perceived by a user. 
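<p class="card-text">One way the with-audio versus without-audio comparison described above could be analysed, assuming paired mean opinion scores on a 1-5 scale from the same participants (hypothetical numbers, not the study's data):</p> <pre><code class="language-python">import numpy as np
from scipy.stats import ttest_rel

# Hypothetical mean opinion scores (1-5) from 10 participants who rated the same
# background-substitution clips once with audio and once without audio.
scores_with_audio = np.array([3.8, 4.1, 3.5, 4.0, 3.9, 4.2, 3.7, 4.0, 3.6, 4.1])
scores_without_audio = np.array([3.2, 3.6, 3.1, 3.5, 3.4, 3.8, 3.0, 3.6, 3.3, 3.5])

# Paired test: the same participants rated both conditions, so compare per-person differences.
stat, p_value = ttest_rel(scores_with_audio, scores_without_audio)
print("mean difference:", round(float(np.mean(scores_with_audio - scores_without_audio)), 2))
print("paired t:", round(float(stat), 2), "p:", round(float(p_value), 4))  # small p suggests audio shifts perceived quality
</code></pre>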
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=background%20substitution" title="background substitution">background substitution</a>, <a href="https://publications.waset.org/abstracts/search?q=influence%20of%20audio" title=" influence of audio"> influence of audio</a>, <a href="https://publications.waset.org/abstracts/search?q=segmentation%20evaluation" title=" segmentation evaluation"> segmentation evaluation</a>, <a href="https://publications.waset.org/abstracts/search?q=segmentation%20quality" title=" segmentation quality"> segmentation quality</a> </p> <a href="https://publications.waset.org/abstracts/148456/the-influence-of-audio-on-perceived-quality-of-segmentation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/148456.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">117</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2097</span> Audio Information Retrieval in Mobile Environment with Fast Audio Classifier</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Bruno%20T.%20Gomes">Bruno T. Gomes</a>, <a href="https://publications.waset.org/abstracts/search?q=Jos%C3%A9%20A.%20Menezes"> José A. Menezes</a>, <a href="https://publications.waset.org/abstracts/search?q=Giordano%20Cabral"> Giordano Cabral</a> </p> <p class="card-text"><strong>Abstract:</strong></p> With the popularity of smartphones, mobile apps emerge to meet the diverse needs, however the resources at the disposal are limited, either by the hardware, due to the low computing power, or the software, that does not have the same robustness of desktop environment. For example, in automatic audio classification (AC) tasks, musical information retrieval (MIR) subarea, is required a fast processing and a good success rate. However the mobile platform has limited computing power and the best AC tools are only available for desktop. To solve these problems the fast classifier suits, to mobile environments, the most widespread MIR technologies, seeking a balance in terms of speed and robustness. At the end we found that it is possible to enjoy the best of MIR for mobile environments. This paper presents the results obtained and the difficulties encountered. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=audio%20classification" title="audio classification">audio classification</a>, <a href="https://publications.waset.org/abstracts/search?q=audio%20extraction" title=" audio extraction"> audio extraction</a>, <a href="https://publications.waset.org/abstracts/search?q=environment%20mobile" title=" environment mobile"> environment mobile</a>, <a href="https://publications.waset.org/abstracts/search?q=musical%20information%20retrieval" title=" musical information retrieval"> musical information retrieval</a> </p> <a href="https://publications.waset.org/abstracts/36642/audio-information-retrieval-in-mobile-environment-with-fast-audio-classifier" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/36642.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">545</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2096</span> Genetic Algorithms for Feature Generation in the Context of Audio Classification</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jos%C3%A9%20A.%20Menezes">José A. Menezes</a>, <a href="https://publications.waset.org/abstracts/search?q=Giordano%20Cabral"> Giordano Cabral</a>, <a href="https://publications.waset.org/abstracts/search?q=Bruno%20T.%20Gomes"> Bruno T. Gomes</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Choosing good features is an essential part of machine learning. Recent techniques aim to automate this process. For instance, feature learning intends to learn the transformation of raw data into a useful representation to machine learning tasks. In automatic audio classification tasks, this is interesting since the audio, usually complex information, needs to be transformed into a computationally convenient input to process. Another technique tries to generate features by searching a feature space. Genetic algorithms, for instance, have being used to generate audio features by combining or modifying them. We find this approach particularly interesting and, despite the undeniable advances of feature learning approaches, we wanted to take a step forward in the use of genetic algorithms to find audio features, combining them with more conventional methods, like PCA, and inserting search control mechanisms, such as constraints over a confusion matrix. This work presents the results obtained on particular audio classification problems. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=feature%20generation" title="feature generation">feature generation</a>, <a href="https://publications.waset.org/abstracts/search?q=feature%20learning" title=" feature learning"> feature learning</a>, <a href="https://publications.waset.org/abstracts/search?q=genetic%20algorithm" title=" genetic algorithm"> genetic algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=music%20information%20retrieval" title=" music information retrieval"> music information retrieval</a> </p> <a href="https://publications.waset.org/abstracts/36638/genetic-algorithms-for-feature-generation-in-the-context-of-audio-classification" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/36638.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">435</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2095</span> Mood Recognition Using Indian Music</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Vishwa%20Joshi">Vishwa Joshi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The study of mood recognition in the field of music has gained a lot of momentum in the recent years with machine learning and data mining techniques and many audio features contributing considerably to analyze and identify the relation of mood plus music. In this paper we consider the same idea forward and come up with making an effort to build a system for automatic recognition of mood underlying the audio song’s clips by mining their audio features and have evaluated several data classification algorithms in order to learn, train and test the model describing the moods of these audio songs and developed an open source framework. Before classification, Preprocessing and Feature Extraction phase is necessary for removing noise and gathering features respectively. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=music" title="music">music</a>, <a href="https://publications.waset.org/abstracts/search?q=mood" title=" mood"> mood</a>, <a href="https://publications.waset.org/abstracts/search?q=features" title=" features"> features</a>, <a href="https://publications.waset.org/abstracts/search?q=classification" title=" classification"> classification</a> </p> <a href="https://publications.waset.org/abstracts/24275/mood-recognition-using-indian-music" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/24275.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">498</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2094</span> Design of a Surveillance Drone with Computer Aided Durability</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Maram%20Shahad%20Dana%20Anfal">Maram Shahad Dana Anfal</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This research paper presents the design of a surveillance drone with computer-aided durability and model analyses that provides a cost-effective and efficient solution for various applications. The quadcopter's design is based on a lightweight and strong structure made of materials such as aluminum and titanium, which provide a durable structure for the quadcopter. The structure of this product and the computer-aided durability system are both designed to ensure frequent repairs or replacements, which will save time and money in the long run. Moreover, the study discusses the drone's ability to track, investigate, and deliver objects more quickly than traditional methods, makes it a highly efficient and cost-effective technology. In this paper, a comprehensive analysis of the quadcopter's operation dynamics and limitations is presented. In both simulation and experimental data, the computer-aided durability system and the drone's design demonstrate their effectiveness, highlighting the potential for a variety of applications, such as search and rescue missions, infrastructure monitoring, and agricultural operations. Also, the findings provide insights into possible areas for improvement in the design and operation of the drone. Ultimately, this paper presents a reliable and cost-effective solution for surveillance applications by designing a drone with computer-aided durability and modeling. With its potential to save time and money, increase reliability, and enhance safety, it is a promising technology for the future of surveillance drones. operation dynamic equations have been evaluated successfully for different flight conditions of a quadcopter. Also, CAE modeling techniques have been applied for the modal risk assessment at operating conditions.Stress analysis have been performed under the loadings of the worst-case combined motion flight conditions. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=drone" title="drone">drone</a>, <a href="https://publications.waset.org/abstracts/search?q=material" title=" material"> material</a>, <a href="https://publications.waset.org/abstracts/search?q=solidwork" title=" solidwork"> solidwork</a>, <a href="https://publications.waset.org/abstracts/search?q=hypermesh" title=" hypermesh"> hypermesh</a> </p> <a href="https://publications.waset.org/abstracts/167463/design-of-a-surveillance-drone-with-computer-aided-durability" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/167463.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">144</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">&lsaquo;</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal&amp;page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal&amp;page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal&amp;page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal&amp;page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal&amp;page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal&amp;page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal&amp;page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal&amp;page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal&amp;page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal&amp;page=70">70</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal&amp;page=71">71</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=drone%20audio%20signal&amp;page=2" rel="next">&rsaquo;</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My 
