<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: digital aerial camera</title> <meta name="description" content="Search results for: digital aerial camera"> <meta name="keywords" content="digital aerial camera"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img 
src="https://cdn.waset.org/static/images/wasetc.png" alt="Open Science Research Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="digital aerial camera" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> 
</header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="digital aerial camera"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 3665</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: digital aerial camera</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3665</span> An Investigation of Direct and Indirect Geo-Referencing Techniques on the Accuracy of Points in Photogrammetry</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=F.%20Yildiz">F. Yildiz</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20Y.%20Oturanc"> S. Y. Oturanc</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Advancing technology in the field of photogrammetry has replaced analog cameras with digital aerial cameras integrated with onboard GPS/IMU systems. In such a system, the GPS determines the position of the camera while the IMU determines the camera rotations.
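As a rough illustration of the direct geo-referencing just described, the sketch below builds an exterior-orientation rotation matrix from IMU angles and applies a hypothetical GPS-antenna lever arm to recover the camera's perspective centre. The angle convention, lever arm, and all values are illustrative assumptions, not taken from the paper:

```python
import math

def rotation_matrix(omega, phi, kappa):
    """Exterior-orientation rotation R = Rz(kappa) @ Ry(phi) @ Rx(omega),
    angles in radians (one common photogrammetric convention; others exist)."""
    so, co = math.sin(omega), math.cos(omega)
    sp, cp = math.sin(phi), math.cos(phi)
    sk, ck = math.sin(kappa), math.cos(kappa)
    rx = [[1, 0, 0], [0, co, -so], [0, so, co]]
    ry = [[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]]
    rz = [[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]]
    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]
    return matmul(rz, matmul(ry, rx))

def camera_position(gps_antenna, lever_arm, R):
    """Camera perspective centre = GPS antenna position minus the
    body-frame lever arm rotated into the mapping frame."""
    return [gps_antenna[i] - sum(R[i][j] * lever_arm[j] for j in range(3))
            for i in range(3)]
```

With zero rotations and a 0.3 m vertical lever arm, an antenna at height 50 m gives a camera centre at 49.7 m, which is the kind of correction a direct geo-referencing workflow applies to every exposure.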
All around the world, digital aerial cameras have been used for photogrammetry applications over the last ten years. They allow photogrammetric work to be completed quickly and accurately while time is used effectively and costs are reduced to a minimum. Geo-referencing techniques, the cornerstone of GPS/INS systems, bring flexibility to the photogrammetric triangulation of images required for orientation (interior and exterior). The geo-referencing process also helps to reduce the number of ground control points needed in photogrammetric applications. In this study, the use of direct and indirect geo-referencing techniques on the accuracy of the points was investigated in the production of photogrammetric mapping. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=photogrammetry" title="photogrammetry">photogrammetry</a>, <a href="https://publications.waset.org/abstracts/search?q=GPS%2FIMU%20systems" title=" GPS/IMU systems"> GPS/IMU systems</a>, <a href="https://publications.waset.org/abstracts/search?q=geo-referecing" title=" geo-referencing"> geo-referencing</a>, <a href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20camera" title=" digital aerial camera"> digital aerial camera</a> </p> <a href="https://publications.waset.org/abstracts/13852/an-investigation-of-direct-and-indirect-geo-referencing-techniques-on-the-accuracy-of-points-in-photogrammetry" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/13852.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">411</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3664</span> Study on Construction
of 3D Topography by UAV-Based Images</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yun-Yao%20Chi">Yun-Yao Chi</a>, <a href="https://publications.waset.org/abstracts/search?q=Chieh-Kai%20Tsai"> Chieh-Kai Tsai</a>, <a href="https://publications.waset.org/abstracts/search?q=Dai-Ling%20Li"> Dai-Ling Li</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, a method of fast 3D topography modeling using high-resolution camera images is studied, based on the characteristics of an Unmanned Aerial Vehicle (UAV) system for low-altitude aerial photogrammetry and the need for three-dimensional (3D) urban landscape modeling. Firstly, a special overlapping-image scheme for the existing high-resolution digital camera is designed by reconstructing and analyzing the auto-flying paths of the UAV; this improves the self-calibration function to achieve high-precision imaging in software and further increases the resolution of the imaging system. Secondly, multi-angle images, including vertical and oblique images captured by the UAV system, are used for detailed measurement of urban land surfaces and for texture extraction. Finally, aerial photography and 3D topography construction are carried out both on the campus of Chang-Jung University and in the Guerin district area of Tainan, Taiwan, providing validation models for the construction of 3D topography from combined UAV-based camera images. The results demonstrated that the UAV system for low-altitude aerial photogrammetry can be used for 3D topography production, and the technical solution in this paper offers a new and fast plan for 3D expression of the city landscape, fine modeling, and visualization. 
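The overlapping-image flight-path design described above rests on two standard planning relations: the ground sample distance of the pinhole model and the exposure spacing implied by a chosen forward overlap. A minimal sketch with made-up camera parameters (not the paper's actual configuration):

```python
def ground_sample_distance(pixel_size_m, focal_length_m, altitude_m):
    """GSD in metres/pixel from the standard pinhole relation:
    one pixel of size p at focal length f, flown at height H, spans p*H/f."""
    return pixel_size_m * altitude_m / focal_length_m

def exposure_base(footprint_along_track_m, forward_overlap):
    """Distance between successive exposures for a given forward overlap
    (overlap as a fraction 0-1): only the non-overlapping strip advances."""
    return footprint_along_track_m * (1.0 - forward_overlap)
```

For example, a hypothetical 2.4 µm pixel behind a 16 mm lens at 100 m altitude gives a 1.5 cm GSD, and a 150 m along-track footprint with 80% forward overlap requires an exposure every 30 m.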
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=3D" title="3D">3D</a>, <a href="https://publications.waset.org/abstracts/search?q=topography" title=" topography"> topography</a>, <a href="https://publications.waset.org/abstracts/search?q=UAV" title=" UAV"> UAV</a>, <a href="https://publications.waset.org/abstracts/search?q=images" title=" images"> images</a> </p> <a href="https://publications.waset.org/abstracts/82548/study-on-construction-of-3d-topography-by-uav-based-images" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/82548.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">303</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3663</span> A Low-Cost Vision-Based Unmanned Aerial System for Extremely Low-Light GPS-Denied Navigation and Thermal Imaging</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Chang%20Liu">Chang Liu</a>, <a href="https://publications.waset.org/abstracts/search?q=John%20Nash"> John Nash</a>, <a href="https://publications.waset.org/abstracts/search?q=Stephen%20D.%20Prior"> Stephen D. Prior</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents the design and implementation details of a complete unmanned aerial system (UAS) based on commercial-off-the-shelf (COTS) components, focusing on safety, security, search and rescue scenarios in GPS-denied environments. In particular, the aerial platform is capable of semi-autonomously navigating through extremely low-light, GPS-denied indoor environments based on onboard sensors only, including a downward-facing optical flow camera. 
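A downward-facing optical-flow camera of the kind described yields pixel displacements that become metric motion once the height above the floor is known. The sketch below is a minimal illustration with assumed parameters (focal length in pixels, flat-floor assumption), not the paper's implementation:

```python
def flow_to_velocity(flow_px_per_frame, height_m, focal_length_px, frame_rate_hz):
    """Ground-relative speed (m/s) from downward optical flow: at height h,
    one pixel corresponds to h/f metres on a flat floor."""
    return flow_px_per_frame * (height_m / focal_length_px) * frame_rate_hz

def dead_reckon(start_xy, displacements_px, height_m, focal_length_px):
    """Integrate per-frame pixel displacements into a position estimate,
    as a GPS-denied navigator might between other corrections."""
    scale = height_m / focal_length_px
    x, y = start_xy
    for dx, dy in displacements_px:
        x += dx * scale
        y += dy * scale
    return x, y
```

At an assumed 2 m height with a 400 px focal length, 10 px/frame of flow at 50 Hz corresponds to 2.5 m/s over the floor.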
In addition, a low-cost payload camera system is developed to stream both infrared video and visible-light video to a ground station in real time, for the purpose of detecting signs of life and hidden humans. The total cost of the complete system is estimated to be $1150, and the effectiveness of the system has been tested and validated in practical scenarios. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20system" title="unmanned aerial system">unmanned aerial system</a>, <a href="https://publications.waset.org/abstracts/search?q=commercial-off-the-shelf" title=" commercial-off-the-shelf"> commercial-off-the-shelf</a>, <a href="https://publications.waset.org/abstracts/search?q=extremely%20low-light" title=" extremely low-light"> extremely low-light</a>, <a href="https://publications.waset.org/abstracts/search?q=GPS-denied" title=" GPS-denied"> GPS-denied</a>, <a href="https://publications.waset.org/abstracts/search?q=optical%20flow" title=" optical flow"> optical flow</a>, <a href="https://publications.waset.org/abstracts/search?q=infrared%20video" title=" infrared video"> infrared video</a> </p> <a href="https://publications.waset.org/abstracts/37927/a-low-cost-vision-based-unmanned-aerial-system-for-extremely-low-light-gps-denied-navigation-and-thermal-imaging" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/37927.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">327</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3662</span> Application of Deep Learning in Colorization of LiDAR-Derived Intensity Images</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a 
href="https://publications.waset.org/abstracts/search?q=Edgardo%20V.%20Gubatanga%20Jr.">Edgardo V. Gubatanga Jr.</a>, <a href="https://publications.waset.org/abstracts/search?q=Mark%20Joshua%20Salvacion"> Mark Joshua Salvacion</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Most aerial LiDAR systems have accompanying aerial cameras in order to capture not only the terrain of the surveyed area but also its true-color appearance. However, the presence of atmospheric clouds, poor lighting conditions, or aerial camera problems during an aerial survey may cause the absence of aerial photographs, leaving areas that have terrain information but lack aerial photographs. Intensity images can be derived from LiDAR data, but they are only grayscale images. A deep learning model is developed to create a complex function, in the form of a deep neural network, relating the pixel values of LiDAR-derived intensity images and true-color images. This function can then be used to predict the true-color images of a certain area using intensity images from LiDAR data. The predicted true-color images do not necessarily need to be accurate representations of the real world; they are only intended to look realistic enough to be used as base maps. 
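As a drastically simplified stand-in for the deep network described above, the sketch below learns a mean RGB colour per intensity value from co-registered training pixels and applies it to new intensity pixels. It is illustrative of the intensity-to-colour mapping idea only; the paper's actual model is a deep neural network:

```python
from collections import defaultdict

def fit_intensity_palette(intensity_px, color_px):
    """Learn a mean RGB per intensity value from co-registered training
    pixels -- a per-pixel toy version of the intensity-to-colour mapping."""
    sums = defaultdict(lambda: [0, 0, 0, 0])  # r, g, b, count
    for i, (r, g, b) in zip(intensity_px, color_px):
        s = sums[i]
        s[0] += r; s[1] += g; s[2] += b; s[3] += 1
    return {i: (s[0] / s[3], s[1] / s[3], s[2] / s[3]) for i, s in sums.items()}

def colorize(intensity_px, palette, fallback=(128, 128, 128)):
    """Predict a colour for each intensity pixel; unseen values get a
    neutral grey fallback."""
    return [palette.get(i, fallback) for i in intensity_px]
```

A real colorization network would additionally exploit spatial context, which is exactly what this per-pixel lookup cannot do.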
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=aerial%20LiDAR" title="aerial LiDAR">aerial LiDAR</a>, <a href="https://publications.waset.org/abstracts/search?q=colorization" title=" colorization"> colorization</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=intensity%20images" title=" intensity images"> intensity images</a> </p> <a href="https://publications.waset.org/abstracts/94116/application-of-deep-learning-in-colorization-of-lidar-derived-intensity-images" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/94116.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">166</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3661</span> Detecting and Disabling Digital Cameras Using D3CIP Algorithm Based on Image Processing</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=S.%20Vignesh">S. Vignesh</a>, <a href="https://publications.waset.org/abstracts/search?q=K.%20S.%20Rangasamy"> K. S. Rangasamy</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The paper deals with the device capable of detecting and disabling digital cameras. The system locates the camera and then neutralizes it. Every digital camera has an image sensor known as a CCD, which is retro-reflective and sends light back directly to its original source at the same angle. The device shines infrared LED light, which is invisible to the human eye, at a distance of about 20 feet. It then collects video of these reflections with a camcorder. 
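The detection step described here amounts to finding pixels that are much brighter with the infrared illuminator on than off, since a retro-reflective CCD returns the LED light toward the source. A minimal sketch with toy frames and an assumed threshold (the real system works on camcorder video, not 2x2 arrays):

```python
def detect_retroreflections(frame_on, frame_off, threshold=80):
    """Flag pixels markedly brighter with the IR LEDs on than off --
    candidate retro-reflective returns from a camera sensor.
    Frames are 2-D lists of 0-255 intensities of equal size."""
    hits = []
    for y, (row_on, row_off) in enumerate(zip(frame_on, frame_off)):
        for x, (a, b) in enumerate(zip(row_on, row_off)):
            if a - b > threshold:
                hits.append((x, y))
    return hits
```

In practice the threshold would be tuned against ambient IR sources, which also appear bright in both frames and are therefore suppressed by the on/off difference.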
Then the video of the reflections is transferred to a computer connected to the device, where it is passed through image processing algorithms that pick out the infrared light bouncing back. Once a camera is detected, the device projects an invisible infrared laser into the camera's lens, thereby overexposing the photo and rendering it useless. Low levels of infrared laser neutralize digital cameras but pose neither a health danger to humans nor physical damage to cameras. We also discuss a simplified design of the above device that can be used in theatres to prevent piracy. The domains covered here are optics and image processing. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=CCD" title="CCD">CCD</a>, <a href="https://publications.waset.org/abstracts/search?q=optics" title=" optics"> optics</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20processing" title=" image processing"> image processing</a>, <a href="https://publications.waset.org/abstracts/search?q=D3CIP" title=" D3CIP"> D3CIP</a> </p> <a href="https://publications.waset.org/abstracts/1736/detecting-and-disabling-digital-cameras-using-d3cip-algorithm-based-on-image-processing" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/1736.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">357</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3660</span> Real-Time Aerial Marine Surveillance System for Safe Navigation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Vinesh%20Thiruchelvam">Vinesh Thiruchelvam</a>, <a 
href="https://publications.waset.org/abstracts/search?q=Umar%20Mumtaz%20Chowdry"> Umar Mumtaz Chowdry</a>, <a href="https://publications.waset.org/abstracts/search?q=Sathish%20Kumar%20Selvaperumal"> Sathish Kumar Selvaperumal</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The prime purpose of the project is to provide a sophisticated surveillance system specialized for the port authorities in the maritime industry. Current aerial surveillance does not provide a wide viewing range, and the channels of communication are shared rather than exclusive, allowing for communication errors or disturbance, mainly due to traffic. The scope is to analyze the various aspects of real-time aerial and marine surveillance, which is one of the most important methods to ensure the domain security of sailors. The system will improve the real-time data obtained at the controller base station. The key implementation will be based on camera speed, angle, and adherence to a sustainable power utilization module. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=SMS" title="SMS">SMS</a>, <a href="https://publications.waset.org/abstracts/search?q=real%20time" title=" real time"> real time</a>, <a href="https://publications.waset.org/abstracts/search?q=GUI" title=" GUI"> GUI</a>, <a href="https://publications.waset.org/abstracts/search?q=maritime%20industry" title=" maritime industry"> maritime industry</a> </p> <a href="https://publications.waset.org/abstracts/6760/real-time-aerial-marine-surveillance-system-for-safe-navigation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/6760.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">498</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3659</span> Comparison between Photogrammetric and Structure from Motion Techniques in Processing Unmanned Aerial Vehicles Imageries</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ahmed%20Elaksher">Ahmed Elaksher</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Over the last few years, significant progress has been made and new approaches have been proposed for the efficient collection of 3D spatial data from unmanned aerial vehicles (UAVs), at reduced cost compared to imagery from satellites or manned aircraft. In these systems, a low-cost GPS unit provides the position and velocity of the vehicle, a low-quality inertial measurement unit (IMU) determines its orientation, and off-the-shelf cameras capture the images. Structure from Motion (SfM) and photogrammetry are the main tools for 3D surface reconstruction from images collected by these systems. 
Unlike traditional techniques, SfM allows the computation of calibration parameters using point correspondences across images, without performing a rigorous laboratory or field calibration process, and it is more flexible in that it does not require consistent image overlap or the same rotation angles between successive photos. These benefits make SfM ideal for UAV aerial mapping. In this paper, a direct comparison between SfM Digital Elevation Models (DEMs) and those generated through traditional photogrammetric techniques was performed. Data was collected by a 3DR IRIS+ Quadcopter with a Canon PowerShot S100 digital camera. Twenty ground control points were randomly distributed on the ground and surveyed with a total station in a local coordinate system. Images were collected from an altitude of 30 meters with a ground resolution of nine mm/pixel. Data was processed with PhotoScan, VisualSFM, Imagine Photogrammetry, and a photogrammetric algorithm developed by the author. The algorithm starts by performing a laboratory camera calibration; the acquired imagery then undergoes an orientation procedure to determine the cameras’ positions and orientations. After the orientation is attained, correlation-based image matching is conducted to automatically generate three-dimensional surface models, followed by a refining step that uses sub-pixel image information for high matching accuracy. Tests with different numbers and configurations of the control points were conducted. Camera calibration parameters estimated from the commercial software and those obtained with laboratory procedures were comparable. Exposure station positions agreed to within a few centimeters, and the differences among orientation angles, at less than three seconds, were insignificant. DEM differencing was performed between the generated DEMs, and vertical shifts of a few centimeters were found. 
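The DEM differencing reported above can be sketched as a simple grid comparison. The helper below (function name and nodata handling are assumptions, not the authors' code) computes the mean and RMS vertical difference between two co-registered DEM grids:

```python
import math

def dem_difference_stats(dem_a, dem_b, nodata=None):
    """Mean and RMS of vertical differences between two co-registered DEM
    grids (2-D lists of elevations); cells flagged as nodata are skipped."""
    diffs = [a - b
             for row_a, row_b in zip(dem_a, dem_b)
             for a, b in zip(row_a, row_b)
             if a != nodata and b != nodata]
    mean = sum(diffs) / len(diffs)
    rms = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return mean, rms
```

A nonzero mean indicates a systematic vertical shift between the SfM and photogrammetric surfaces, while the RMS also captures random disagreement.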
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=UAV" title="UAV">UAV</a>, <a href="https://publications.waset.org/abstracts/search?q=photogrammetry" title=" photogrammetry"> photogrammetry</a>, <a href="https://publications.waset.org/abstracts/search?q=SfM" title=" SfM"> SfM</a>, <a href="https://publications.waset.org/abstracts/search?q=DEM" title=" DEM"> DEM</a> </p> <a href="https://publications.waset.org/abstracts/71553/comparison-between-photogrammetric-and-structure-from-motion-techniques-in-processing-unmanned-aerial-vehicles-imageries" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/71553.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">294</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3658</span> Multiplayer RC-car Driving System in a Collaborative Augmented Reality Environment</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Kikuo%20Asai">Kikuo Asai</a>, <a href="https://publications.waset.org/abstracts/search?q=Yuji%20Sugimoto"> Yuji Sugimoto</a> </p> <p class="card-text"><strong>Abstract:</strong></p> We developed a prototype system for multiplayer RC-car driving in a collaborative Augmented Reality (AR) environment. The tele-existence environment is constructed by superimposing digital data onto images captured by a camera on an RC-car, enabling players to experience an augmented coexistence of the digital content and the real world. Marker-based tracking was used for estimating the position and orientation of the camera. Multiple RC-cars can be operated in a field where square markers are arranged. 
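Marker-based tracking of the kind used here ultimately yields a camera pose (a rotation R and translation t); the sketch below shows how a world point could then be projected into the video frame so that overlay graphics land in the geometrically correct place. The pinhole model and the intrinsics in the example are illustrative assumptions, not the paper's calibration:

```python
def project_point(point_3d, R, t, f, cx, cy):
    """Pinhole projection of a world point into a tracked camera:
    transform into camera coordinates (x_cam = R @ X + t), then apply the
    perspective divide and shift by the principal point (cx, cy)."""
    xc = [sum(R[i][j] * point_3d[j] for j in range(3)) + t[i] for i in range(3)]
    return (f * xc[0] / xc[2] + cx, f * xc[1] / xc[2] + cy)
```

Drawing overlay content at the returned pixel coordinates, frame by frame as the pose updates, is what keeps the digital content registered to the marker field.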
The video images captured by the camera are transmitted to a PC for visual tracking. The RC-cars are also tracked by an infrared camera attached to the ceiling, which reduces instability in the visual tracking. Multimedia data such as text and graphics are visualized and overlaid onto the video images in a geometrically correct manner. The prototype system allows a tele-existence sensation to be augmented in a collaborative AR environment. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=multiplayer" title="multiplayer">multiplayer</a>, <a href="https://publications.waset.org/abstracts/search?q=RC-car" title=" RC-car"> RC-car</a>, <a href="https://publications.waset.org/abstracts/search?q=collaborative%20environment" title=" collaborative environment"> collaborative environment</a>, <a href="https://publications.waset.org/abstracts/search?q=augmented%20reality" title=" augmented reality"> augmented reality</a> </p> <a href="https://publications.waset.org/abstracts/4359/multiplayer-rc-car-driving-system-in-a-collaborative-augmented-reality-environment" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/4359.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">289</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3657</span> A Four-Step Ortho-Rectification Procedure for Geo-Referencing Video Streams from a Low-Cost UAV</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=B.%20O.%20Olawale">B. O. Olawale</a>, <a href="https://publications.waset.org/abstracts/search?q=C.%20R.%20Chatwin"> C. R. 
Chatwin</a>, <a href="https://publications.waset.org/abstracts/search?q=R.%20C.%20D.%20Young"> R. C. D. Young</a>, <a href="https://publications.waset.org/abstracts/search?q=P.%20M.%20Birch"> P. M. Birch</a>, <a href="https://publications.waset.org/abstracts/search?q=F.%20O.%20Faithpraise"> F. O. Faithpraise</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20O.%20Olukiran"> A. O. Olukiran</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Ortho-rectification is the process of geometrically correcting an aerial image so that its scale is uniform. The ortho-image formed by this process is corrected for lens distortion, topographic relief, and camera tilt, and can be used to measure true distances because it is an accurate representation of the Earth’s surface. Ortho-rectification and geo-referencing are essential to pinpoint the exact location of targets in video imagery acquired at the UAV platform. This can only be achieved by comparing such video imagery with an existing digital map. However, such a comparison is possible only when the image is ortho-rectified into the same co-ordinate system as the existing map. The video image sequences from the UAV platform must be geo-registered; that is, each video frame must carry the necessary camera information before the ortho-rectification process is performed. Each rectified image frame can then be mosaicked together to form a seamless image map covering the selected area, which can be compared with an existing map for geo-referencing. In this paper, we present a four-step ortho-rectification procedure for real-time geo-referencing of video data from a low-cost UAV equipped with a multi-sensor system. 
The basic procedures for the real-time ortho-rectification are: (1) decompilation of the video stream into individual frames; (2) finding the interior camera orientation parameters; (3) finding the relative exterior orientation parameters for each video frame with respect to the others; (4) finding the absolute exterior orientation parameters, using self-calibration adjustment with the aid of a mathematical model. Each ortho-rectified video frame is then mosaicked together to produce a 2-D planimetric map, which can be compared with a well-referenced existing digital map for the purposes of geo-referencing and aerial surveillance. A test field located in Abuja, Nigeria was used for testing our method. Fifteen minutes of video and telemetry data were collected using the UAV, and the data were processed using the four-step ortho-rectification procedure. The results demonstrated that geometric measurements of the control field from ortho-images are more reliable than those from the original perspective photographs when used to pinpoint the exact location of targets in the video imagery acquired by the UAV. The 2-D planimetric accuracy, when compared with the 6 control points measured by a GPS receiver, is between 3 and 5 meters. 
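The 3-to-5-meter planimetric accuracy reported above is the kind of figure obtained by comparing ortho-image check points against surveyed control. A minimal horizontal-RMSE helper, offered as an assumed formulation rather than the authors' exact metric:

```python
import math

def planimetric_rmse(measured_xy, reference_xy):
    """Horizontal RMSE of ortho-image check points against surveyed truth:
    root mean of squared 2-D point-to-point distances."""
    sq = [(mx - rx) ** 2 + (my - ry) ** 2
          for (mx, my), (rx, ry) in zip(measured_xy, reference_xy)]
    return math.sqrt(sum(sq) / len(sq))
```

A single check point displaced by the classic 3-4-5 triangle, for instance, contributes a 5 m error.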
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=geo-referencing" title="geo-referencing">geo-referencing</a>, <a href="https://publications.waset.org/abstracts/search?q=ortho-rectification" title=" ortho-rectification"> ortho-rectification</a>, <a href="https://publications.waset.org/abstracts/search?q=video%20frame" title=" video frame"> video frame</a>, <a href="https://publications.waset.org/abstracts/search?q=self-calibration" title=" self-calibration"> self-calibration</a> </p> <a href="https://publications.waset.org/abstracts/33730/a-four-step-ortho-rectification-procedure-for-geo-referencing-video-streams-from-a-low-cost-uav" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/33730.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">478</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3656</span> Digital Image Forensics: Discovering the History of Digital Images</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Gurinder%20Singh">Gurinder Singh</a>, <a href="https://publications.waset.org/abstracts/search?q=Kulbir%20Singh"> Kulbir Singh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Digital multimedia content such as images, video, and audio can be tampered with easily owing to the availability of powerful editing software. Multimedia forensics is devoted to analyzing such content using various digital forensic techniques in order to validate its authenticity. 
Digital image forensics is dedicated to investigating the reliability of digital images by analyzing the integrity of the data and by reconstructing the historical information of an image related to its acquisition phase. In this paper, a survey of forgery detection is carried out, considering the most recent and promising digital image forensic techniques. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Computer%20Forensics" title="Computer Forensics">Computer Forensics</a>, <a href="https://publications.waset.org/abstracts/search?q=Multimedia%20Forensics" title=" Multimedia Forensics"> Multimedia Forensics</a>, <a href="https://publications.waset.org/abstracts/search?q=Image%20Ballistics" title=" Image Ballistics"> Image Ballistics</a>, <a href="https://publications.waset.org/abstracts/search?q=Camera%20Source%20Identification" title=" Camera Source Identification"> Camera Source Identification</a>, <a href="https://publications.waset.org/abstracts/search?q=Forgery%20Detection" title=" Forgery Detection"> Forgery Detection</a> </p> <a href="https://publications.waset.org/abstracts/76669/digital-image-forensics-discovering-the-history-of-digital-images" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/76669.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">247</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3655</span> GIS-Based Automatic Flight Planning of Camera-Equipped UAVs for Fire Emergency Response</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mohammed%20Sulaiman">Mohammed Sulaiman</a>, <a 
href="https://publications.waset.org/abstracts/search?q=Hexu%20Liu"> Hexu Liu</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohamed%20Binalhaj"> Mohamed Binalhaj</a>, <a href="https://publications.waset.org/abstracts/search?q=William%20W.%20Liou"> William W. Liou</a>, <a href="https://publications.waset.org/abstracts/search?q=Osama%20Abudayyeh"> Osama Abudayyeh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Emerging technologies such as camera-equipped unmanned aerial vehicles (UAVs) are increasingly being applied in building fire rescue to provide real-time visualization and 3D reconstruction of the entire fireground. However, flight planning of camera-equipped UAVs is usually a manual process, which is not sufficient to fulfill the needs of emergency management. This research proposes a Geographic Information System (GIS)-based approach to automatic flight planning of camera-equipped UAVs for building fire emergency response. In this research, the Haversine formula and lawn-mowing patterns are employed to automate flight planning based on geometrical and spatial information from GIS. The resulting flight mission satisfies the requirements of 3D reconstruction of the fireground, in consideration of flight execution safety and visibility of camera frames. The proposed approach is implemented within a GIS environment through an application programming interface. A case study is used to demonstrate the effectiveness of the proposed approach. The results show that a flight mission can be generated in a timely manner for application to fire emergency response. 
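A minimal sketch of the two ingredients named above, the Haversine distance and a lawn-mowing (boustrophedon) sweep over a bounding box; the function names and the fixed strip spacing are illustrative assumptions, not the authors' implementation:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def lawnmower_waypoints(south, west, north, east, spacing_m):
    """Back-and-forth sweep of a lat/lon bounding box with a fixed strip spacing."""
    lat_step = math.degrees(spacing_m / EARTH_RADIUS_M)  # metres -> degrees of latitude
    waypoints, lat, eastbound = [], south, True
    while lat <= north:
        row = [(lat, west), (lat, east)]
        waypoints.extend(row if eastbound else row[::-1])  # alternate sweep direction
        eastbound = not eastbound
        lat += lat_step
    return waypoints
```

In practice the strip spacing would be derived from the camera footprint and the image overlap needed for 3D reconstruction.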
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=GIS" title="GIS">GIS</a>, <a href="https://publications.waset.org/abstracts/search?q=camera-equipped%20UAVs" title=" camera-equipped UAVs"> camera-equipped UAVs</a>, <a href="https://publications.waset.org/abstracts/search?q=automatic%20flight%20planning" title=" automatic flight planning"> automatic flight planning</a>, <a href="https://publications.waset.org/abstracts/search?q=fire%20emergency%20response" title=" fire emergency response"> fire emergency response</a> </p> <a href="https://publications.waset.org/abstracts/125166/gis-based-automatic-flight-planning-of-camera-equipped-uavs-for-fire-emergency-response" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/125166.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">125</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3654</span> H.263 Based Video Transceiver for Wireless Camera System</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Won-Ho%20Kim">Won-Ho Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, the design of an H.263-based wireless video transceiver is presented for a wireless camera system. It uses a standard Wi-Fi transceiver, and the coverage area is up to 100 m. Furthermore, the standard H.263 video encoding technique is used for video compression, since a wireless video transmitter is unable to transmit high-capacity raw data in real time; the implemented system is capable of streaming at less than 1 Mbps using NTSC 720x480 video. 
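The need for compression falls out of simple arithmetic. Assuming YUV 4:2:0 sampling (12 bits per pixel) and roughly 30 frames per second for NTSC (both assumptions, not stated in the abstract), the raw bitrate dwarfs the 1 Mbps channel:

```python
# Back-of-envelope raw bitrate for NTSC 720x480 video.
# Assumptions: YUV 4:2:0 sampling (12 bits/pixel) and ~30 fps.
width, height = 720, 480
bits_per_pixel = 12
fps = 30

raw_bps = width * height * bits_per_pixel * fps  # raw stream, bits per second
target_bps = 1_000_000                           # < 1 Mbps stream from the abstract
ratio = raw_bps / target_bps                     # compression factor required
print(f"raw = {raw_bps / 1e6:.1f} Mbps, compression needed > {ratio:.0f}:1")
```

Under these assumptions the raw stream is roughly 124 Mbps, so the encoder must achieve better than a 100:1 compression factor.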
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=wireless%20video%20transceiver" title="wireless video transceiver">wireless video transceiver</a>, <a href="https://publications.waset.org/abstracts/search?q=video%20surveillance%20camera" title=" video surveillance camera"> video surveillance camera</a>, <a href="https://publications.waset.org/abstracts/search?q=H.263%20video%20encoding%20digital%20signal%20processing" title=" H.263 video encoding digital signal processing"> H.263 video encoding digital signal processing</a> </p> <a href="https://publications.waset.org/abstracts/12951/h263-based-video-transceiver-for-wireless-camera-system" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/12951.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">364</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3653</span> Mechanism of Changing a Product Concept</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Kiyohiro%20Yamazaki">Kiyohiro Yamazaki</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The purpose of this paper is to examine a hypothesis explaining the mechanism by which a product's fundamental function is deleted or reduced through product concept changes in the digital camera industry. This paper points out that not owning the fundamental technology might cause the product concept to change. Casio was able to create a new competitive factor, so this paper discusses a possible mechanism of changing the product concept. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=firm%20without%20fundamental%20technology" title="firm without fundamental technology">firm without fundamental technology</a>, <a href="https://publications.waset.org/abstracts/search?q=product%20development" title=" product development"> product development</a>, <a href="https://publications.waset.org/abstracts/search?q=product%20concept" title=" product concept"> product concept</a>, <a href="https://publications.waset.org/abstracts/search?q=digital%20camera%20industry" title=" digital camera industry"> digital camera industry</a>, <a href="https://publications.waset.org/abstracts/search?q=Casio" title=" Casio"> Casio</a> </p> <a href="https://publications.waset.org/abstracts/16190/mechanism-of-changing-a-product-concept" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/16190.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">562</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3652</span> Video Sharing System Based On Wi-fi Camera</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Qidi%20Lin">Qidi Lin</a>, <a href="https://publications.waset.org/abstracts/search?q=Jinbin%20Huang"> Jinbin Huang</a>, <a href="https://publications.waset.org/abstracts/search?q=Weile%20Liang"> Weile Liang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper introduces a video sharing platform based on WiFi, which consists of a camera, a mobile phone, and a PC server. This platform can receive the wireless signal from the camera and show the live video captured by the camera on the mobile phone. 
In addition, it is able to send commands to the camera and control the camera’s holder to rotate. The platform can be applied to interactive teaching, monitoring of dangerous areas, and so on. Testing results show that the platform can share live video to the mobile phone. Furthermore, if the system’s PC server, the camera, and many mobile phones are connected together, it can transfer photos concurrently. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Wifi%20Camera" title="Wifi Camera">Wifi Camera</a>, <a href="https://publications.waset.org/abstracts/search?q=socket%20mobile" title=" socket mobile"> socket mobile</a>, <a href="https://publications.waset.org/abstracts/search?q=platform%20video%20monitoring" title=" platform video monitoring"> platform video monitoring</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20control" title=" remote control"> remote control</a> </p> <a href="https://publications.waset.org/abstracts/31912/video-sharing-system-based-on-wi-fi-camera" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/31912.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">337</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3651</span> Automatic Battery Charging for Rotor Wings Type Unmanned Aerial Vehicle</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jeyeon%20Kim">Jeyeon Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper describes the development of an automatic battery charging device for the rotor wings type unmanned aerial vehicle (UAV) and the positioning method that allows the UAV to be accurately landed on the 
charging device when landing. The developed automatic battery charging device is designed with consideration for simple maintenance, durability, cost, and positioning error when landing. In order for the UAV to accurately land on the charging device, two kinds of markers (a color marker and a light marker) installed on the charging device are detected by the camera mounted on the UAV. The UAV is then controlled so that the detected marker becomes the center of the image, and it is landed on the device. We conduct a performance evaluation of the proposed positioning method through outdoor experiments by day and night, and show the effectiveness of the system. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20vehicle" title="unmanned aerial vehicle">unmanned aerial vehicle</a>, <a href="https://publications.waset.org/abstracts/search?q=automatic%20battery%20charging" title=" automatic battery charging"> automatic battery charging</a>, <a href="https://publications.waset.org/abstracts/search?q=positioning" title=" positioning"> positioning</a> </p> <a href="https://publications.waset.org/abstracts/71183/automatic-battery-charging-for-rotor-wings-type-unmanned-aerial-vehicle" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/71183.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">363</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3650</span> Remote Vital Signs Monitoring in Neonatal Intensive Care Unit Using a Digital Camera</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Fatema-Tuz-Zohra%20Khanam">Fatema-Tuz-Zohra Khanam</a>, <a 
href="https://publications.waset.org/abstracts/search?q=Ali%20Al-Naji"> Ali Al-Naji</a>, <a href="https://publications.waset.org/abstracts/search?q=Asanka%20G.%20Perera"> Asanka G. Perera</a>, <a href="https://publications.waset.org/abstracts/search?q=Kim%20Gibson"> Kim Gibson</a>, <a href="https://publications.waset.org/abstracts/search?q=Javaan%20Chahl"> Javaan Chahl</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Conventional contact-based vital signs monitoring sensors such as pulse oximeters or electrocardiogram (ECG) sensors may cause discomfort, skin damage, and infections, particularly in neonates with fragile, sensitive skin. Therefore, remote monitoring of vital signs is desired in both clinical and non-clinical settings to overcome these issues. Camera-based vital signs monitoring is a recent technology for these applications with many positive attributes. However, there are still limited camera-based studies on neonates in a clinical setting. In this study, the heart rate (HR) and respiratory rate (RR) of eight infants at the Neonatal Intensive Care Unit (NICU) in Flinders Medical Centre were remotely monitored using a digital camera applying color and motion-based computational methods. The region-of-interest (ROI) was efficiently selected by incorporating an image decomposition method. Furthermore, spatial averaging, spectral analysis, band-pass filtering, and peak detection were also used to extract both HR and RR. The experimental results were validated against ground truth data obtained from an ECG monitor and showed a strong correlation, with Pearson correlation coefficients (PCC) of 0.9794 and 0.9412 for HR and RR, respectively. The RMSE between the camera-based data and the ECG data for HR and RR were 2.84 beats/min and 2.91 breaths/min, respectively. 
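As a rough illustration of the peak-detection stage described above, the sketch below estimates a rate from a spatially averaged intensity trace by counting dominant peaks after removing the DC component. This is a simplified stand-in, not the authors' pipeline: a real implementation would add the band-pass filtering and spectral analysis they mention, and the signal here is synthetic.

```python
import math

def estimate_rate_bpm(signal, fs, min_gap_s=0.3):
    """Estimate a periodic rate (beats/min) by simple peak counting.
    signal: spatially averaged ROI intensity samples; fs: sampling rate (Hz)."""
    mean = sum(signal) / len(signal)
    x = [s - mean for s in signal]        # remove the DC component
    min_gap = int(min_gap_s * fs)         # refractory period between peaks (samples)
    peaks, last = [], -min_gap
    for i in range(1, len(x) - 1):
        if x[i] > 0 and x[i] >= x[i - 1] and x[i] > x[i + 1] and i - last >= min_gap:
            peaks.append(i)
            last = i
    if len(peaks) < 2:
        return 0.0
    duration_s = (peaks[-1] - peaks[0]) / fs
    return (len(peaks) - 1) * 60.0 / duration_s

# Synthetic 2 Hz pulse waveform (120 beats/min) sampled at a 30 fps camera rate.
fs = 30.0
sig = [math.sin(2 * math.pi * 2.0 * n / fs) for n in range(300)]
hr_bpm = estimate_rate_bpm(sig, fs)
```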
A Bland-Altman analysis of the data also showed close agreement between the two data sets, with a mean bias of 0.60 beats/min and 1 breath/min, and lower and upper limits of agreement of -4.9 to +6.1 beats/min and -4.4 to +6.4 breaths/min for HR and RR, respectively. Therefore, video camera imaging may replace conventional contact-based monitoring in the NICU and has potential applications in other contexts such as home health monitoring. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=neonates" title="neonates">neonates</a>, <a href="https://publications.waset.org/abstracts/search?q=NICU" title=" NICU"> NICU</a>, <a href="https://publications.waset.org/abstracts/search?q=digital%20camera" title=" digital camera"> digital camera</a>, <a href="https://publications.waset.org/abstracts/search?q=heart%20rate" title=" heart rate"> heart rate</a>, <a href="https://publications.waset.org/abstracts/search?q=respiratory%20rate" title=" respiratory rate"> respiratory rate</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20decomposition" title=" image decomposition"> image decomposition</a> </p> <a href="https://publications.waset.org/abstracts/147786/remote-vital-signs-monitoring-in-neonatal-intensive-care-unit-using-a-digital-camera" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/147786.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">104</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3649</span> A Process of Forming a Single Competitive Factor in the Digital Camera Industry</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a 
href="https://publications.waset.org/abstracts/search?q=Kiyohiro%20Yamazaki">Kiyohiro Yamazaki</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper considers the formation process of a single competitive factor in the digital camera industry from the viewpoint of the product platform. To make product development easier and to increase product introduction ratios, companies concentrate their development efforts on improving and strengthening certain product attributes, and in this process the product platform is formed continuously. It is pointed out that the formation of this product platform raises the product development efficiency of individual companies but, as a trade-off, causes the unification of competitive factors in the whole industry. This research analyzes product specification data collected from the web pages of digital camera companies. Specifically, it covers all product specifications released in Japan from 1995 to 2003 and analyzes the composition of the image sensor and optical lens; it identifies product platforms shared by multiple products and discusses their application. As a result, this research found that product platforms were born in the development of standard products for the major market segments. Every major company has made product platforms of image sensors and optical lenses, and as a result, competitive factors were unified across the entire industry through this product platformation. In other words, product platformation brought product development efficiency to individual firms; however, it also caused the industry's competitive factors to become unified. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=digital%20camera%20industry" title="digital camera industry">digital camera industry</a>, <a href="https://publications.waset.org/abstracts/search?q=product%20evolution%20trajectory" title=" product evolution trajectory"> product evolution trajectory</a>, <a href="https://publications.waset.org/abstracts/search?q=product%20platform" title=" product platform"> product platform</a>, <a href="https://publications.waset.org/abstracts/search?q=unification%20of%20competitive%20factors" title=" unification of competitive factors"> unification of competitive factors</a> </p> <a href="https://publications.waset.org/abstracts/95172/a-process-of-forming-a-single-competitive-factor-in-the-digital-camera-industry" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/95172.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">158</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3648</span> Design and Optimization of a Mini High Altitude Long Endurance (HALE) Multi-Role Unmanned Aerial Vehicle</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Vishaal%20Subramanian">Vishaal Subramanian</a>, <a href="https://publications.waset.org/abstracts/search?q=Annuatha%20Vinod%20Kumar"> Annuatha Vinod Kumar</a>, <a href="https://publications.waset.org/abstracts/search?q=Santosh%20Kumar%20Budankayala"> Santosh Kumar Budankayala</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Senthil%20Kumar"> M. 
Senthil Kumar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper discusses the aerodynamic and structural design, simulation, and optimization of a mini High Altitude Long Endurance (HALE) UAV. The applications of this mini HALE UAV range from aerial topographical surveys, quick first-aid supply, emergency medical blood transport, and search and relief activities to border patrol, surveillance, and estimation of forest fire progression. Although classified as a mini UAV according to UVS International, our design is an amalgamation of the features of the ‘mini’ and ‘HALE’ categories, combining the light weight of the ‘mini’ with the high altitude ceiling and endurance of the HALE. Designed with the idea of implementation in India, it is in strict compliance with the UAS rules proposed by the office of the Director General of Civil Aviation. The plane can be completely automated or have partial override control and is equipped with an infrared camera and a multi-coloured camera with on-board storage or live telemetry, a GPS system with geo-fencing, and fail-safe measures. An additional 1.5 kg payload can be attached to three major hard points on the aircraft and can comprise delicate equipment or releasable payloads. The paper details the design, optimization process, and the simulations performed using various software such as Design Foil, XFLR5, Solidworks and Ansys. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=aircraft" title="aircraft">aircraft</a>, <a href="https://publications.waset.org/abstracts/search?q=endurance" title=" endurance"> endurance</a>, <a href="https://publications.waset.org/abstracts/search?q=HALE" title=" HALE"> HALE</a>, <a href="https://publications.waset.org/abstracts/search?q=high%20altitude" title=" high altitude"> high altitude</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20range" title=" long range"> long range</a>, <a href="https://publications.waset.org/abstracts/search?q=UAV" title=" UAV"> UAV</a>, <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20vehicle" title=" unmanned aerial vehicle"> unmanned aerial vehicle</a> </p> <a href="https://publications.waset.org/abstracts/57692/design-and-optimization-of-a-mini-high-altitude-long-endurance-hale-multi-role-unmanned-aerial-vehicle" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/57692.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">397</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3647</span> Application of Optical Method Based on Laser Devise as Non-Destructive Testing for Calculus of Mechanical Deformation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=R.%20Da%C3%AFra">R. Daïra</a>, <a href="https://publications.waset.org/abstracts/search?q=V.%20Chalvidan"> V. Chalvidan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> We present the speckle interferometry method to determine the deformation of a piece. 
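At the core of such a two-exposure interferometric measurement is the phase difference between the recorded reference and deformed states. A minimal, hypothetical sketch of that comparison (the 4-pi sensitivity factor assumes out-of-plane displacement with near-normal illumination and observation, and all field values are synthetic):

```python
import cmath
import math

def wrapped_phase_difference(field_ref, field_def):
    """Pixel-wise wrapped phase difference, in (-pi, pi], between two complex
    wavefields: the reference state and the deformed state."""
    return [cmath.phase(d * r.conjugate()) for r, d in zip(field_ref, field_def)]

def out_of_plane_displacement(dphi, wavelength):
    """Phase change -> displacement, d = wavelength * dphi / (4 * pi)."""
    return [wavelength * p / (4 * math.pi) for p in dphi]

# Hypothetical 1-D slice: a uniform quarter-cycle phase shift between recordings.
lam = 633e-9  # He-Ne laser wavelength (assumed)
reference = [cmath.exp(1j * 0.0)] * 4
deformed = [cmath.exp(1j * math.pi / 2)] * 4
dphi = wrapped_phase_difference(reference, deformed)
displacement = out_of_plane_displacement(dphi, lam)
```

Real fringe-analysis software works on full 2-D phase maps and must unwrap the phase before converting it to displacement.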
This holographic imaging method uses a CCD camera for the simultaneous digital recording of the two states, object and reference. The reconstruction is obtained numerically. This method has the advantage of being simpler than the methods currently available, and it does not suffer from the faults of online holographic configurations. Furthermore, it is entirely digital and avoids heavy analysis after recording the hologram. This work was carried out in the HOLO 3 laboratory (an optical metrology laboratory in Saint Louis, France) and consists of qualitatively and quantitatively controlling the deformation of an object by using a CCD camera connected to a computer equipped with fringe analysis software. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=speckle" title="speckle">speckle</a>, <a href="https://publications.waset.org/abstracts/search?q=nondestructive%20testing" title=" nondestructive testing"> nondestructive testing</a>, <a href="https://publications.waset.org/abstracts/search?q=interferometry" title=" interferometry"> interferometry</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20processing" title=" image processing"> image processing</a> </p> <a href="https://publications.waset.org/abstracts/26022/application-of-optical-method-based-on-laser-devise-as-non-destructive-testing-for-calculus-of-mechanical-deformation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/26022.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">497</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3646</span> Gnss Aided Photogrammetry for Digital Mapping</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a 
href="https://publications.waset.org/abstracts/search?q=Muhammad%20Usman%20Akram">Muhammad Usman Akram</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This research work is based on GNSS-aided photogrammetry for digital mapping. It focuses on the topographic survey of an area or site to be used in future planning and development (P&D), or for further examination, exploration, research, and inspection. Surveying and mapping in hard-to-access and hazardous areas are very difficult using traditional techniques and methodologies; they are also time-consuming and labor-intensive, and offer less precision with limited data. In comparison, the advanced techniques save manpower and provide more precise output with a wide variety of data sets. In this experimentation, the aerial photogrammetry technique is used: a UAV flies over an area, captures geocoded images, and produces a three-dimensional model (3-D model). The UAV operates on a user-specified path or area with various parameters: flight altitude, ground sampling distance (GSD), image overlap, camera angle, etc. For ground control, a network of points on the ground is observed as ground control points (GCPs) using a Differential Global Positioning System (DGPS) in PPK or RTK mode. The raw data collected by the UAV and DGPS are then processed in various digital image processing programs and computer-aided design software, from which we obtain a dense point cloud, a digital elevation model (DEM), and an ortho-photo as output. The imagery is converted into geospatial data by digitizing over the ortho-photo; the DEM is further converted into a digital terrain model (DTM) for contour generation or a digital surface. As a result, we get a digital map of the area to be surveyed. In conclusion, we compared the processed data with exact measurements taken on site. The error is accepted if it does not breach the survey accuracy limits set by the concerned institutions. 
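Of the flight parameters listed above, the ground sampling distance has a simple closed form for a nadir-pointing camera: GSD = pixel size x flying height / focal length. A small sketch with hypothetical camera values (not from the abstract):

```python
def ground_sampling_distance(pixel_size_m, focal_length_m, altitude_m):
    """Ground footprint of one pixel (metres) for a nadir-pointing camera:
    GSD = pixel size * flying height / focal length."""
    return pixel_size_m * altitude_m / focal_length_m

# Hypothetical camera: 2.4 um pixels, 8.8 mm focal length, 120 m above ground.
gsd = ground_sampling_distance(2.4e-6, 8.8e-3, 120.0)
print(f"GSD = {gsd * 100:.1f} cm/pixel")
```

A smaller GSD (lower altitude or longer focal length) gives finer detail at the cost of ground coverage per image, which in turn drives the flight-line spacing and image overlap.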
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=photogrammetry" title="photogrammetry">photogrammetry</a>, <a href="https://publications.waset.org/abstracts/search?q=post%20processing%20kinematics" title=" post processing kinematics"> post processing kinematics</a>, <a href="https://publications.waset.org/abstracts/search?q=real%20time%20kinematics" title=" real time kinematics"> real time kinematics</a>, <a href="https://publications.waset.org/abstracts/search?q=manual%20data%20inquiry" title=" manual data inquiry"> manual data inquiry</a> </p> <a href="https://publications.waset.org/abstracts/189835/gnss-aided-photogrammetry-for-digital-mapping" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/189835.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">30</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3645</span> System Response of a Variable-Rate Aerial Application System</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Daniel%20E.%20Martin">Daniel E. Martin</a>, <a href="https://publications.waset.org/abstracts/search?q=Chenghai%20Yang"> Chenghai Yang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Variable-rate aerial application systems are becoming more readily available; however, aerial applicators typically only use the systems for constant-rate application of materials, allowing the systems to compensate for upwind and downwind ground speed variations. Much of the resistance to variable-rate aerial application system adoption in the U.S. pertains to applicator’s trust in the systems to turn on and off automatically as desired. 
The objectives of this study were to evaluate a commercially available variable-rate aerial application system under field conditions to demonstrate both the response and accuracy of the system to desired application rate inputs. This study involved planting oats in a 35-acre fallow field during the winter months to establish a uniform green backdrop in early spring. A binary (on/off) prescription application map was generated and a variable-rate aerial application of glyphosate was made to the field. Airborne multispectral imagery taken before and two weeks after the application documented actual field deposition and efficacy of the glyphosate. When compared to the prescription application map, these data provided application system response and accuracy information. The results of this study will be useful for quantifying and documenting the response and accuracy of a commercially available variable-rate aerial application system so that aerial applicators can be more confident in their capabilities and the use of these systems can increase, taking advantage of all that aerial variable-rate technologies have to offer. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=variable-rate" title="variable-rate">variable-rate</a>, <a href="https://publications.waset.org/abstracts/search?q=aerial%20application" title=" aerial application"> aerial application</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title=" remote sensing"> remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=precision%20application" title=" precision application"> precision application</a> </p> <a href="https://publications.waset.org/abstracts/24198/system-response-of-a-variable-rate-aerial-application-system" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/24198.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">474</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3644</span> Unmanned Aerial Vehicle Use for Emergency Purpose</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Shah%20S.%20M.%20A.">Shah S. M. A.</a>, <a href="https://publications.waset.org/abstracts/search?q=Aftab%20U."> Aftab U.</a> </p> <p class="card-text"><strong>Abstract:</strong></p> It is imperative in today’s world to get real-time information about different emergency situations occurring in the environment. Helicopters are mostly used to access places which are hard to reach in emergencies such as earthquakes, floods, bridge failures, or other disaster conditions. However, the use of helicopters is considered too costly for properly collecting the data. Therefore, a new technique has been introduced in this research to promptly collect data using drones. 
The drone designed in this research is based on trial-and-error experimental work with the objective of constructing an economical drone. Locally available materials were used for this purpose, and a mobile camera was attached to record video during the flight. It was found that, within very limited resources, the results were quite successful. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=UAV" title="UAV">UAV</a>, <a href="https://publications.waset.org/abstracts/search?q=real%20time" title=" real time"> real time</a>, <a href="https://publications.waset.org/abstracts/search?q=camera" title=" camera"> camera</a>, <a href="https://publications.waset.org/abstracts/search?q=disasters" title=" disasters"> disasters</a> </p> <a href="https://publications.waset.org/abstracts/79652/unmanned-aerial-vehicle-use-for-emergency-purpose" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/79652.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">237</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3643</span> Mapping of Geological Structures Using Aerial Photography</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ankit%20Sharma">Ankit Sharma</a>, <a href="https://publications.waset.org/abstracts/search?q=Mudit%20Sachan"> Mudit Sachan</a>, <a href="https://publications.waset.org/abstracts/search?q=Anurag%20Prakash"> Anurag Prakash</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Rapid growth in data acquisition technologies through drones has led to advances and interest in collecting high-resolution images of geological fields. 
While drones are advantageous in capturing high volumes of data in short flights, a number of challenges have to be overcome for efficient analysis of these data, especially during data acquisition, image interpretation, and processing. We introduce a method that allows effective mapping of geological fields using photogrammetric data of surfaces, drainage areas, water bodies, etc., captured by airborne vehicles such as UAVs. Satellite images were not used because of their inadequate resolution, limited availability, possible age (an image may have been captured a year or more earlier), and the difficulty of capturing the exact scene, for example at night. Our method combines advanced automated image interpretation technology with human data interaction to model structures. First, geological structures are detected from the primary photographic dataset, and the equivalent three-dimensional structures are then identified from a digital elevation model; dip and dip direction can be calculated from this information. The structural map is generated by following a specified methodology: choosing the appropriate camera and camera mounting system, designing the UAV for the area and application, addressing the challenges of airborne systems (errors in image orientation, payload limits, mosaicking, and georeferencing and registering of different images), and applying the DEM. The paper shows the potential of using our method for accurate and efficient modeling of geological structures, captured in particular from remote, inaccessible, and hazardous sites. 
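The dip calculation mentioned above can be illustrated with a small sketch: given three (x, y, z) points picked from the DEM along an exposed geological surface, the plane's dip angle and dip direction follow from its normal vector. This is a generic photogrammetric calculation, not code from the paper:

```python
import math

def dip_from_points(p1, p2, p3):
    """Dip angle (degrees) and dip direction (azimuth, degrees) of the
    plane through three (x, y, z) points, with x east, y north, z up."""
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    # Plane normal n = u x v
    nx = u[1] * v[2] - u[2] * v[1]
    ny = u[2] * v[0] - u[0] * v[2]
    nz = u[0] * v[1] - u[1] * v[0]
    if nz < 0:  # keep the upward-pointing normal
        nx, ny, nz = -nx, -ny, -nz
    dip = math.degrees(math.atan2(math.hypot(nx, ny), nz))
    # Azimuth of steepest descent, measured clockwise from north
    dip_dir = math.degrees(math.atan2(nx, ny)) % 360.0
    return dip, dip_dir

# A plane dropping 1 m per metre toward the east: 45 deg dip, dip direction 090
print(dip_from_points((0, 0, 0), (0, 1, 0), (1, 0, -1)))  # (45.0, 90.0)
```

In practice the three points would come from georeferenced DEM cells on the same bedding surface; picking well-spread points reduces the effect of DEM noise.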
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=digital%20elevation%20model" title="digital elevation model">digital elevation model</a>, <a href="https://publications.waset.org/abstracts/search?q=mapping" title=" mapping"> mapping</a>, <a href="https://publications.waset.org/abstracts/search?q=photogrammetric%20data%20analysis" title=" photogrammetric data analysis"> photogrammetric data analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=geological%20structures" title=" geological structures "> geological structures </a> </p> <a href="https://publications.waset.org/abstracts/26316/mapping-of-geological-structures-using-aerial-photography" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/26316.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">686</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3642</span> Real-Time Web Map Service Based on Solar-Powered Unmanned Aerial Vehicle</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sunghun%20Jung">Sunghun Jung</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Existing web map service providers contract with satellite operators and pay astronomical amounts of money to update their maps, but the cost could be minimized by operating a cheap, small UAV. In contrast to satellites, UAVs only require their aged battery packs to be replaced from time to time. 
Utilizing both a regular camera and an infrared camera mounted on a small, solar-powered, long-endurance, and hoverable UAV, daytime ground surface photographs and nighttime infrared photographs will be continuously and repeatedly uploaded to the web map server and overlapped with the existing ground surface photographs in real time. The real-time web map service using such a UAV can also be applied to surveillance missions, in particular to detect border-area intruders. An improved real-time image stitching algorithm is developed for overlapping the graphic map data, and a small home server will be developed to manage the huge volume of incoming map data. Map photographs taken by a UAV at tens or hundreds of kilometers would improve the map graphic resolution compared to map photographs taken by satellites at thousands of kilometers, since satellite photographs are also limited by weather conditions. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=long-endurance" title="long-endurance">long-endurance</a>, <a href="https://publications.waset.org/abstracts/search?q=real-time%20web%20map%20service%20%28RWMS%29" title=" real-time web map service (RWMS)"> real-time web map service (RWMS)</a>, <a href="https://publications.waset.org/abstracts/search?q=solar-powered" title=" solar-powered"> solar-powered</a>, <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20vehicle%20%28UAV%29" title=" unmanned aerial vehicle (UAV)"> unmanned aerial vehicle (UAV)</a> </p> <a href="https://publications.waset.org/abstracts/80443/real-time-web-map-service-based-on-solar-powered-unmanned-aerial-vehicle" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/80443.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span
class="badge badge-light">274</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3641</span> Conceptual Design of Unmanned Aerial Targets</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=M.%20Adamski">M. Adamski</a>, <a href="https://publications.waset.org/abstracts/search?q=J.%20Cwiklak"> J. Cwiklak</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The contemporary battlefield creates a demand for ever more costly and highly advanced munitions. Training the personnel responsible for operations, as well as the immediate execution of combat tasks, with real assets is unrealistic and economically infeasible. Costs can be reduced by using a wide array of simulators and various types of imitators. One effective element of training, applicable to all service branches, is the imitator of aerial targets. This research serves as an introduction to the design analysis of a real aerial target imitator. Within the project, basic aerodynamic calculations were made, which enabled us to determine its geometry, design layout, and performance, as well as the mass balance of individual components. The calculated flight characteristics closely approximate the real performance of such unmanned aerial vehicles. 
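As a rough illustration of the kind of basic aerodynamic calculation involved in sizing such an imitator, the straight-and-level lift balance L = W fixes the required wing area. All numbers below are hypothetical, not taken from the paper:

```python
def required_wing_area(mass_kg, v_ms, cl, rho=1.225, g=9.81):
    """Wing area S for straight and level flight, from L = W:
    W = 0.5 * rho * V^2 * S * CL  =>  S = 2 W / (rho * V^2 * CL)."""
    return 2.0 * mass_kg * g / (rho * v_ms ** 2 * cl)

# Hypothetical target-imitator numbers: a 40 kg vehicle cruising at
# 50 m/s with a cruise lift coefficient of 0.4, at sea-level density.
S = required_wing_area(40.0, 50.0, 0.4)
print(round(S, 3))  # 0.641 (m^2)
```

The same relation, solved for CL instead of S, is what ties the chosen geometry back to the achievable flight envelope.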
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=aerial%20target" title="aerial target">aerial target</a>, <a href="https://publications.waset.org/abstracts/search?q=aerodynamics" title=" aerodynamics"> aerodynamics</a>, <a href="https://publications.waset.org/abstracts/search?q=imitator" title=" imitator"> imitator</a>, <a href="https://publications.waset.org/abstracts/search?q=performance" title=" performance"> performance</a> </p> <a href="https://publications.waset.org/abstracts/32414/conceptual-design-of-unmanned-aerial-targets" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/32414.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">398</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3640</span> Applying Semi-Automatic Digital Aerial Survey Technology and Canopy Characters Classification for Surface Vegetation Interpretation of Archaeological Sites</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yung-Chung%20Chuang">Yung-Chung Chuang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The cultural layers of archaeological sites are mainly affected by surface land use, land cover, and the root systems of surface vegetation. For this reason, continuous monitoring of land use and land cover change is important for the protection and management of archaeological sites. However, in actual operation, on-site investigation and orthophoto interpretation require a great deal of time and manpower, so a good automated or semi-automated alternative for surface vegetation surveys is needed. 
In this study, we applied semi-automatic digital aerial survey technology and canopy character classification with very-high-resolution aerial photographs for surface vegetation interpretation of archaeological sites. The main idea is that different landscape or forest types can easily be distinguished by canopy characters (e.g., specific texture distributions, shadow effects, and gap characters) extracted by semi-automatic image classification. A novel methodology to classify the shape of canopy characters using landscape indices and multivariate statistics was also proposed. Non-hierarchical cluster analysis was used to assess the optimal number of canopy character clusters, and canonical discriminant analysis was used to generate the discriminant functions for canopy character classification (seven categories). The forest type and vegetation land cover can therefore be easily predicted from the corresponding canopy character category. The results showed that the semi-automatic classification could effectively extract the canopy characters of forest and vegetation land cover. For forest type and vegetation type prediction, the average prediction accuracy reached 80.3%–91.7%, depending on the size of the test frame. This demonstrates that the technology is useful for archaeological site surveys and can improve classification efficiency and data update rates. 
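The clustering step can be sketched generically: feature vectors describing canopy characters are grouped with a non-hierarchical (k-means) clustering, and a new image tile is then assigned to the nearest cluster. This is a minimal stand-in for the paper's cluster analysis plus canonical discriminant analysis, using synthetic features:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain k-means, a stand-in for the paper's non-hierarchical
    cluster analysis of canopy-character feature vectors."""
    # Simple deterministic init: spread the seed points across the data
    centroids = X[:: max(1, len(X) // k)][:k].astype(float).copy()
    for _ in range(iters):
        d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
        labels = d2.argmin(axis=1)
        centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                              else centroids[j] for j in range(k)])
    return labels, centroids

# Synthetic "canopy character" features (e.g. texture, shadow, gap metrics);
# three well-separated groups standing in for canopy categories.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.3, size=(30, 3)) for m in (0.0, 2.0, 4.0)])
labels, centroids = kmeans(X, 3)

# A new image tile is assigned the canopy-character category of the
# nearest cluster centroid (the discriminant step, much simplified).
tile = np.array([1.9, 2.1, 2.0])
cluster = int(((centroids - tile) ** 2).sum(axis=1).argmin())
```

The real workflow classifies into seven categories and derives its features from the aerial photographs themselves; the three-group toy data here only show the mechanics.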
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20survey" title="digital aerial survey">digital aerial survey</a>, <a href="https://publications.waset.org/abstracts/search?q=canopy%20characters%20classification" title=" canopy characters classification"> canopy characters classification</a>, <a href="https://publications.waset.org/abstracts/search?q=archaeological%20sites" title=" archaeological sites"> archaeological sites</a>, <a href="https://publications.waset.org/abstracts/search?q=multivariate%20statistics" title=" multivariate statistics"> multivariate statistics</a> </p> <a href="https://publications.waset.org/abstracts/81393/applying-semi-automatic-digital-aerial-survey-technology-and-canopy-characters-classification-for-surface-vegetation-interpretation-of-archaeological-sites" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/81393.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">141</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3639</span> Underwater Remotely Operated Vehicle (ROV) Exploration</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=M.%20S.%20Sukumar">M. S. Sukumar </a> </p> <p class="card-text"><strong>Abstract:</strong></p> Our objective is to develop a full-fledged system for exploring and studying the nature of fossils and to extend this to underwater archaeology and mineral mapping. This includes aerial surveying, imaging techniques, artefact extraction, and spectrum analysis techniques. These techniques help in the regular monitoring of fossils and also support the sensing system. 
The ROV was designed to complete several tasks that simulate collecting data and samples. Given the time constraints, the ROV was engineered for efficiency and speed in performing tasks. Its other major design consideration was modularity, allowing the team to distribute the building process, easily test systems as they were completed, and troubleshoot and replace systems as necessary. Our design faced several challenges: waterproof mounting of on-board sensors, waterproofing of the motors, meeting ROV stability criteria, camera mounting, and hydrophone sound acquisition. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=remotely%20operated%20vehicle%20%28ROV%29%20dragonair" title="remotely operated vehicle (ROV) dragonair">remotely operated vehicle (ROV) dragonair</a>, <a href="https://publications.waset.org/abstracts/search?q=underwater%20archaeology" title=" underwater archaeology"> underwater archaeology</a>, <a href="https://publications.waset.org/abstracts/search?q=full-fledged%20system" title=" full-fledged system"> full-fledged system</a>, <a href="https://publications.waset.org/abstracts/search?q=aerial%20imaging%20and%20detection" title=" aerial imaging and detection"> aerial imaging and detection</a> </p> <a href="https://publications.waset.org/abstracts/7945/underwater-remotely-operated-vehicle-rov-exploration" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/7945.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">237</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3638</span> Aerial Survey and 3D Scanning Technology Applied to the Survey of Cultural Heritage of Su-Paiwan, an Aboriginal Settlement, Taiwan</h5> <div class="card-body"> <p
class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=April%20Hueimin%20Lu">April Hueimin Lu</a>, <a href="https://publications.waset.org/abstracts/search?q=Liangj-Ju%20Yao"> Liangj-Ju Yao</a>, <a href="https://publications.waset.org/abstracts/search?q=Jun-Tin%20Lin"> Jun-Tin Lin</a>, <a href="https://publications.waset.org/abstracts/search?q=Susan%20Siru%20Liu"> Susan Siru Liu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper discusses the application of aerial survey technology and 3D laser scanning technology in the surveying and mapping of the settlements and slate houses of the old Taiwanese aborigines. The relics of the old Taiwanese aborigines, with thousands of years of history, are widely distributed in the deep mountains of Taiwan, over a vast area with inconvenient transportation. When constructing basic data on cultural assets, it is necessary to apply new technology to carry out efficient and accurate settlement mapping. In this paper, taking the old Paiwan settlement as an example, an aerial survey of the roughly five-hectare settlement and 3D laser scanning of a slate house were carried out. The resulting orthophoto image was used as an important basis for drawing the settlement map. The 3D landscape data of topography and buildings derived from the aerial survey are important for subsequent preservation planning, while the 3D building scan provides a more detailed record of architectural forms and materials. The 3D settlement data from the aerial survey can be further applied to a 3D virtual model and animation of the settlement for virtual presentation. The information from the 3D scanning of the slate house can also be used for digital archives and data queries through network resources. 
The results of this study show that, in large-scale settlement surveys, aerial survey technology can construct the topography of settlements, together with buildings and spatial information of the landscape, while 3D scanning provides small-scale records of individual buildings. This application of 3D technology greatly increases the efficiency and accuracy of the survey and mapping of aboriginal settlements and is very helpful for further preservation planning and the rejuvenation of aboriginal cultural heritage. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=aerial%20survey" title="aerial survey">aerial survey</a>, <a href="https://publications.waset.org/abstracts/search?q=3D%20scanning" title=" 3D scanning"> 3D scanning</a>, <a href="https://publications.waset.org/abstracts/search?q=aboriginal%20settlement" title=" aboriginal settlement"> aboriginal settlement</a>, <a href="https://publications.waset.org/abstracts/search?q=settlement%20architecture%20cluster" title=" settlement architecture cluster"> settlement architecture cluster</a>, <a href="https://publications.waset.org/abstracts/search?q=ecological%20landscape%20area" title=" ecological landscape area"> ecological landscape area</a>, <a href="https://publications.waset.org/abstracts/search?q=old%20Paiwan%20settlements" title=" old Paiwan settlements"> old Paiwan settlements</a>, <a href="https://publications.waset.org/abstracts/search?q=slat%20house" title=" slate house"> slate house</a>, <a href="https://publications.waset.org/abstracts/search?q=photogrammetry" title=" photogrammetry"> photogrammetry</a>, <a href="https://publications.waset.org/abstracts/search?q=SfM" title=" SfM"> SfM</a>, <a href="https://publications.waset.org/abstracts/search?q=MVS%29" title=" MVS"> MVS</a>, <a href="https://publications.waset.org/abstracts/search?q=Point%20cloud" title=" Point cloud"> Point cloud</a>, <a 
href="https://publications.waset.org/abstracts/search?q=SIFT" title=" SIFT"> SIFT</a>, <a href="https://publications.waset.org/abstracts/search?q=DSM" title=" DSM"> DSM</a>, <a href="https://publications.waset.org/abstracts/search?q=3D%20model" title=" 3D model"> 3D model</a> </p> <a href="https://publications.waset.org/abstracts/155681/aerial-survey-and-3d-scanning-technology-applied-to-the-survey-of-cultural-heritage-of-su-paiwan-an-aboriginal-settlement-taiwan" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/155681.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">168</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3637</span> Flicker Detection with Motion Tolerance for Embedded Camera</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jianrong%20Wu">Jianrong Wu</a>, <a href="https://publications.waset.org/abstracts/search?q=Xuan%20Fu"> Xuan Fu</a>, <a href="https://publications.waset.org/abstracts/search?q=Akihiro%20Higashi"> Akihiro Higashi</a>, <a href="https://publications.waset.org/abstracts/search?q=Zhiming%20Tan"> Zhiming Tan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> CMOS image sensors with a rolling shutter are widely used in the digital cameras embedded in mobile devices. The rolling shutter suffers from flicker artifacts under fluorescent lamps, and these can be observed easily. In this paper, the characteristics of illumination flicker in the motion case were analyzed, and two efficient detection methods based on matching fragment selection were proposed. According to the experimental results, our methods could achieve as high as 100% accuracy in static scenes, and at least 97% in motion scenes. 
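Rolling-shutter flicker appears as horizontal banding, i.e., a periodic component in the per-row mean intensity of a frame, because each sensor row is exposed at a slightly different time under a 100/120 Hz fluorescent lamp. A minimal sketch that flags such banding with an FFT (not the paper's matching-fragment method) might look like:

```python
import numpy as np

def detect_banding(frame, band_period_rows):
    """Flag rolling-shutter flicker banding in a single frame by looking
    for a periodic component in the per-row mean intensity.

    band_period_rows is the expected banding period in rows; it depends
    on the sensor's row readout time and the 100/120 Hz lamp flicker."""
    rows = frame.mean(axis=1)
    rows = rows - rows.mean()                  # remove the DC component
    spectrum = np.abs(np.fft.rfft(rows))
    freqs = np.fft.rfftfreq(len(rows), d=1.0)  # cycles per row
    k = int(np.argmin(np.abs(freqs - 1.0 / band_period_rows)))
    # Flicker is flagged when the target bin holds most of the energy
    return bool(spectrum[k] ** 2 > 0.5 * np.sum(spectrum[1:] ** 2))

# Synthetic 240x320 frames: a flat scene, with and without 30-row bands
h, w = 240, 320
r = np.arange(h)[:, None]
flickered = 100 + 20 * np.sin(2 * np.pi * r / 30.0) + np.zeros((h, w))
clean = np.full((h, w), 100.0)
print(detect_banding(flickered, 30), detect_banding(clean, 30))  # True False
```

A single-frame test like this breaks down once the scene itself has strong horizontal structure or motion, which is exactly the case the paper's motion-tolerant methods address.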
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=illumination%20flicker" title="illumination flicker">illumination flicker</a>, <a href="https://publications.waset.org/abstracts/search?q=embedded%20camera" title=" embedded camera"> embedded camera</a>, <a href="https://publications.waset.org/abstracts/search?q=rolling%20shutter" title=" rolling shutter"> rolling shutter</a>, <a href="https://publications.waset.org/abstracts/search?q=detection" title=" detection"> detection</a> </p> <a href="https://publications.waset.org/abstracts/14449/flicker-detection-with-motion-tolerance-for-embedded-camera" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/14449.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">420</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3636</span> Topographic Characteristics Derived from UAV Images to Detect Ephemeral Gully Channels</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Recep%20Gundogan">Recep Gundogan</a>, <a href="https://publications.waset.org/abstracts/search?q=Turgay%20Dindaroglu"> Turgay Dindaroglu</a>, <a href="https://publications.waset.org/abstracts/search?q=Hikmet%20Gunal"> Hikmet Gunal</a>, <a href="https://publications.waset.org/abstracts/search?q=Mustafa%20Ulukavak"> Mustafa Ulukavak</a>, <a href="https://publications.waset.org/abstracts/search?q=Ron%20Bingner"> Ron Bingner</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A majority of total soil losses in agricultural areas could be attributed to ephemeral gullies caused by heavy rains in conventionally tilled fields; however, ephemeral gully erosion is often 
ignored in conventional soil erosion assessments. Ephemeral gullies are often easily filled in by normal soil tillage operations, which makes capturing existing ephemeral gullies in croplands difficult. This study was carried out to determine topographic features, including slope, aspect, and compound topographic index (CTI), as well as the initiation points of gully channels, using images obtained from an unmanned aerial vehicle (UAV). The study area was located in the Topcu stream watershed in the eastern Mediterranean Region, where intense rainfall events occur over very short periods. The slope varied between 0.7% and 99.5%, and the average slope was 24.7%. A multi-propeller hexacopter UAV was used as the carrier platform, and images were obtained with the RGB camera mounted on it. The digital terrain models (DTMs) of the Topcu stream micro-catchment produced using UAV images were compared with manual field Global Positioning System (GPS) measurements to assess the accuracy of the UAV-based measurements. Eighty-one gully channels were detected in the study area. The mean slope and CTI values in the micro-catchment obtained from the DTMs generated using UAV images were 19.2% and 3.64, respectively, and both were lower than the values obtained using GPS measurements. The total length and volume of the gully channels were 868.2 m and 5.52 m³, respectively. Topographic characteristics and information on the ephemeral gully channels (location of initiation point, volume, and length) were estimated with high accuracy from the UAV images. The results reveal that UAV-based measuring techniques can be used in lieu of existing GPS and total station techniques, using images obtained with high-resolution UAVs. 
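The slope and aspect layers used above are standard derivatives of a DEM grid. A minimal NumPy sketch (the tilted-plane DEM is synthetic, and the CTI step is omitted):

```python
import numpy as np

def slope_aspect(dem, cell_size):
    """Per-cell slope (%) and aspect (degrees clockwise from north) from
    a DEM grid; rows run north -> south, columns run west -> east."""
    dz_drow, dz_dx = np.gradient(dem, cell_size)
    dz_dy = -dz_drow                 # row index increases southward
    slope_pct = 100.0 * np.hypot(dz_dx, dz_dy)
    # Downslope azimuth: flip the uphill gradient direction by 180 deg
    aspect = (np.degrees(np.arctan2(dz_dx, dz_dy)) + 180.0) % 360.0
    return slope_pct, aspect

# Synthetic DEM: a plane dropping 1 m per 10 m cell toward the east,
# i.e. a uniform 10% slope facing due east (aspect 090).
dem = np.tile(np.arange(5, 0, -1, dtype=float), (5, 1))
s, a = slope_aspect(dem, 10.0)
print(round(float(s[2, 2]), 6), round(float(a[2, 2]), 6))  # 10.0 90.0
```

A real workflow would run this over the UAV-derived DTM and combine the slope grid with contributing area to form the CTI used to flag likely gully initiation points.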
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=aspect" title="aspect">aspect</a>, <a href="https://publications.waset.org/abstracts/search?q=compound%20topographic%20index" title=" compound topographic index"> compound topographic index</a>, <a href="https://publications.waset.org/abstracts/search?q=digital%20terrain%20model" title=" digital terrain model"> digital terrain model</a>, <a href="https://publications.waset.org/abstracts/search?q=initial%20gully%20point" title=" initial gully point"> initial gully point</a>, <a href="https://publications.waset.org/abstracts/search?q=slope" title=" slope"> slope</a>, <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20vehicle" title=" unmanned aerial vehicle"> unmanned aerial vehicle</a> </p> <a href="https://publications.waset.org/abstracts/152233/topographic-characteristics-derived-from-uav-images-to-detect-ephemeral-gully-channels" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/152233.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">114</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">‹</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20camera&page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20camera&page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20camera&page=4">4</a></li> <li class="page-item"><a class="page-link" 
href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20camera&page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20camera&page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20camera&page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20camera&page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20camera&page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20camera&page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20camera&page=122">122</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20camera&page=123">123</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=digital%20aerial%20camera&page=2" rel="next">›</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a 
href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr 
style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">© 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">×</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>