<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: infrared camera</title> <meta name="description" content="Search results for: infrared camera"> <meta name="keywords" content="infrared camera"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img src="https://cdn.waset.org/static/images/wasetc.png" 
alt="Open Science Research Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="infrared camera" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> 
<div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="infrared camera"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 1702</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: infrared camera</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1702</span> Detecting and Disabling Digital Cameras Using D3CIP Algorithm Based on Image Processing</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=S.%20Vignesh">S. Vignesh</a>, <a href="https://publications.waset.org/abstracts/search?q=K.%20S.%20Rangasamy"> K. S. Rangasamy</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The paper deals with the device capable of detecting and disabling digital cameras. The system locates the camera and then neutralizes it. Every digital camera has an image sensor known as a CCD, which is retro-reflective and sends light back directly to its original source at the same angle. The device shines infrared LED light, which is invisible to the human eye, at a distance of about 20 feet. 
It then collects video of these reflections with a camcorder. The video is then transferred to a computer connected to the device, where image processing algorithms pick out the infrared light bouncing back. Once a camera is detected, the device projects an invisible infrared laser into the camera's lens, thereby overexposing the photo and rendering it useless. Low levels of infrared laser neutralize digital cameras but pose neither a health risk to humans nor physical damage to the cameras. We also discuss a simplified design of the device that can be used in theatres to prevent piracy. The domains covered here are optics and image processing. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=CCD" title="CCD">CCD</a>, <a href="https://publications.waset.org/abstracts/search?q=optics" title=" optics"> optics</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20processing" title=" image processing"> image processing</a>, <a href="https://publications.waset.org/abstracts/search?q=D3CIP" title=" D3CIP"> D3CIP</a> </p> <a href="https://publications.waset.org/abstracts/1736/detecting-and-disabling-digital-cameras-using-d3cip-algorithm-based-on-image-processing" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/1736.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">357</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1701</span> Infrared Lightbox and iPhone App for Improving Detection Limit of Phosphate Detecting Dip Strips</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=H.%20Heidari-Bafroui">H.
Heidari-Bafroui</a>, <a href="https://publications.waset.org/abstracts/search?q=B.%20Ribeiro"> B. Ribeiro</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Charbaji"> A. Charbaji</a>, <a href="https://publications.waset.org/abstracts/search?q=C.%20Anagnostopoulos"> C. Anagnostopoulos</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Faghri"> M. Faghri</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, we report the development of a portable and inexpensive infrared lightbox for improving the detection limits of paper-based phosphate devices. Commercial paper-based devices utilize the molybdenum blue protocol to detect phosphate in the environment. Although these devices are easy to use and have a long shelf life, their main deficiency is their low sensitivity based on the qualitative results obtained via a color chart. To improve the results, we constructed a compact infrared lightbox that communicates wirelessly with a smartphone. The system measures the absorbance of radiation for the molybdenum blue reaction in the infrared region of the spectrum. It consists of a lightbox illuminated by four infrared light-emitting diodes, an infrared digital camera, a Raspberry Pi microcontroller, a mini-router, and an iPhone to control the microcontroller. An iPhone application was also developed to analyze images captured by the infrared camera in order to quantify phosphate concentrations. Additionally, the app connects to an online data center to present a highly scalable worldwide system for tracking and analyzing field measurements. In this study, the detection limits for two popular commercial devices were improved by a factor of 4 for the Quantofix devices (from 1.3 ppm using visible light to 300 ppb using infrared illumination) and a factor of 6 for the Indigo units (from 9.2 ppm to 1.4 ppm) with repeatability of less than or equal to 1.2% relative standard deviation (RSD). 
The system also provides more granular concentration information compared to the discrete color chart used by commercial devices and it can be easily adapted for use in other applications. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=infrared%20lightbox" title="infrared lightbox">infrared lightbox</a>, <a href="https://publications.waset.org/abstracts/search?q=paper-based%20device" title=" paper-based device"> paper-based device</a>, <a href="https://publications.waset.org/abstracts/search?q=phosphate%20detection" title=" phosphate detection"> phosphate detection</a>, <a href="https://publications.waset.org/abstracts/search?q=smartphone%20colorimetric%20analyzer" title=" smartphone colorimetric analyzer"> smartphone colorimetric analyzer</a> </p> <a href="https://publications.waset.org/abstracts/127794/infrared-lightbox-and-iphone-app-for-improving-detection-limit-of-phosphate-detecting-dip-strips" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/127794.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">123</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1700</span> A Study of Effective Stereo Matching Method for Long-Wave Infrared Camera Module</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hyun-Koo%20Kim">Hyun-Koo Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=Yonghun%20Kim"> Yonghun Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=Yong-Hoon%20Kim"> Yong-Hoon Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=Ju%20Hee%20Lee"> Ju Hee Lee</a>, <a 
href="https://publications.waset.org/abstracts/search?q=Myungho%20Song"> Myungho Song</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, we describe an efficient stereo matching method and a pedestrian detection method using a stereo long-wave infrared (LWIR) camera. We compared three stereo matching algorithms: block matching, ELAS, and SGM. For pedestrian detection with the stereo LWIR camera, we used the SGM stereo matching method, a free-space detection method based on the u/v-disparity, and HOG-feature-based pedestrian detection. According to the test results, the SGM method performs better than the block matching and ELAS algorithms. The combination of SGM, free-space detection, and pedestrian detection using HOG features with SVM classification can detect pedestrians at a distance of 30 m with a distance error of about 30 cm. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=advanced%20driver%20assistance%20system" title="advanced driver assistance system">advanced driver assistance system</a>, <a href="https://publications.waset.org/abstracts/search?q=pedestrian%20detection" title=" pedestrian detection"> pedestrian detection</a>, <a href="https://publications.waset.org/abstracts/search?q=stereo%20matching%20method" title=" stereo matching method"> stereo matching method</a>, <a href="https://publications.waset.org/abstracts/search?q=stereo%20long-wave%20IR%20camera" title=" stereo long-wave IR camera"> stereo long-wave IR camera</a> </p> <a href="https://publications.waset.org/abstracts/58413/a-study-of-effective-stereo-matching-method-for-long-wave-infrared-camera-module" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/58413.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">414</span> </span> </div> </div> <div class="card paper-listing mb-3
mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1699</span> Visual Search Based Indoor Localization in Low Light via RGB-D Camera</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yali%20Zheng">Yali Zheng</a>, <a href="https://publications.waset.org/abstracts/search?q=Peipei%20Luo"> Peipei Luo</a>, <a href="https://publications.waset.org/abstracts/search?q=Shinan%20Chen"> Shinan Chen</a>, <a href="https://publications.waset.org/abstracts/search?q=Jiasheng%20Hao"> Jiasheng Hao</a>, <a href="https://publications.waset.org/abstracts/search?q=Hong%20Cheng"> Hong Cheng</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Most traditional visual indoor navigation algorithms consider only localization in ordinary daylight; in this paper, we focus on indoor re-localization in low light. Because RGB images are degraded in low light, the less discriminative infrared and depth image pairs captured by RGB-D cameras are taken as the input, and the most similar candidates are retrieved as the output from a database built in the bag-of-words framework. Epipolar constraints can then be used to re-localize the query infrared and depth image sequence. We evaluate our method on two datasets captured by Kinect2. The results demonstrate very promising re-localization performance for indoor navigation systems in low-light environments.
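The bag-of-words retrieval step described in this abstract can be sketched as follows. The visual-word histograms below are hypothetical stand-ins for quantized infrared/depth features, and the place names are invented; a real system would also verify the returned candidates with epipolar constraints.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two visual-word histograms."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def best_candidates(query_hist, database, top_k=3):
    """Return the top_k database entries most similar to the query."""
    scored = sorted(database.items(),
                    key=lambda kv: cosine_similarity(query_hist, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:top_k]]

# Hypothetical database of word histograms built offline.
database = {
    "corridor_a": [5, 0, 2, 1],
    "corridor_b": [0, 4, 0, 3],
    "lobby":      [5, 1, 2, 0],
}
query = [4, 0, 2, 1]
print(best_candidates(query, database, top_k=2))  # -> ['corridor_a', 'lobby']
```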
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=indoor%20navigation" title="indoor navigation">indoor navigation</a>, <a href="https://publications.waset.org/abstracts/search?q=low%20light" title=" low light"> low light</a>, <a href="https://publications.waset.org/abstracts/search?q=RGB-D%20camera" title=" RGB-D camera"> RGB-D camera</a>, <a href="https://publications.waset.org/abstracts/search?q=vision%20based" title=" vision based"> vision based</a> </p> <a href="https://publications.waset.org/abstracts/66057/visual-search-based-indoor-localization-in-low-light-via-rgb-d-camera" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/66057.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">461</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1698</span> A Study on the Comparison of Mechanical and Thermal Properties According to Laminated Orientation of CFRP through Bending Test</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hee%20Jae%20Shin">Hee Jae Shin</a>, <a href="https://publications.waset.org/abstracts/search?q=Lee%20Ku%20Kwac"> Lee Ku Kwac</a>, <a href="https://publications.waset.org/abstracts/search?q=In%20Pyo%20Cha"> In Pyo Cha</a>, <a href="https://publications.waset.org/abstracts/search?q=Min%20Sang%20Lee"> Min Sang Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=Hyun%20Kyung%20Yoon"> Hyun Kyung Yoon</a>, <a href="https://publications.waset.org/abstracts/search?q=Hong%20Gun%20Kim"> Hong Gun Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Rapid industrial development has increased the demand for high-strength and
lightweight materials. Thus, various carbon fiber reinforced plastic (CFRP) composite materials are being used. The design variables of CFRP are its lamination direction, order, and thickness, and the hardness and strength of CFRP depend largely on these variables. In this paper, the lamination direction of CFRP was used to produce a symmetrical ply [0°/0°, -15°/+15°, -30°/+30°, -45°/+45°, -60°/+60°, -75°/+75°, and 90°/90°] and an asymmetrical ply [0°/15°, 0°/30°, 0°/45°, 0°/60°, 0°/75°, and 0°/90°]. The bending flexure stress of the CFRP specimens was evaluated through a bending test, and their thermal properties were measured using an infrared camera. The symmetrical and asymmetrical specimens were analyzed. The results showed that the bending load of the asymmetrical specimens increased with the orientation angle, while from 0° the symmetrical specimens showed the opposite tendency, because the tensile force of the fibers differs in the direction perpendicular to the load. Also, the infrared camera showed that the thermal properties followed a trend similar to that of the mechanical properties.
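For reference, the bending flexure stress evaluated in such tests is typically computed from the standard three-point bending formula σ = 3FL/(2bd²). The sketch below assumes that configuration; the specimen dimensions and load are hypothetical, not values from the paper.

```python
def flexural_stress(load_n, span_mm, width_mm, thickness_mm):
    """Three-point bending flexural stress, sigma = 3FL / (2*b*d^2), in MPa
    when the load is in N and all lengths are in mm."""
    return 3 * load_n * span_mm / (2 * width_mm * thickness_mm ** 2)

# Hypothetical specimen: 500 N peak load, 80 mm support span,
# 15 mm width, 2 mm thickness.
sigma = flexural_stress(500, 80, 15, 2)
print(round(sigma, 1))  # -> 1000.0 (MPa)
```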
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Carbon%20Fiber%20Reinforced%20Plastic%20%28CFRP%29" title="Carbon Fiber Reinforced Plastic (CFRP)">Carbon Fiber Reinforced Plastic (CFRP)</a>, <a href="https://publications.waset.org/abstracts/search?q=bending%20test" title=" bending test"> bending test</a>, <a href="https://publications.waset.org/abstracts/search?q=infrared%20camera" title=" infrared camera"> infrared camera</a>, <a href="https://publications.waset.org/abstracts/search?q=composite" title=" composite"> composite</a> </p> <a href="https://publications.waset.org/abstracts/21385/a-study-on-the-comparatison-of-mechanical-and-thermal-properties-according-to-laminated-orientation-of-cfrp-through-bending-test" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/21385.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">398</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1697</span> Hand Gesture Recognition Interface Based on IR Camera</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yang-Keun%20Ahn">Yang-Keun Ahn</a>, <a href="https://publications.waset.org/abstracts/search?q=Kwang-Soon%20Choi"> Kwang-Soon Choi</a>, <a href="https://publications.waset.org/abstracts/search?q=Young-Choong%20Park"> Young-Choong Park</a>, <a href="https://publications.waset.org/abstracts/search?q=Kwang-Mo%20Jung"> Kwang-Mo Jung</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Vision based user interfaces to control TVs and PCs have the advantage of being able to perform natural control without being limited to a specific device. 
Accordingly, various studies on hand gesture recognition using RGB cameras or depth cameras have been conducted. However, such cameras either lack accuracy or are costly to construct. The proposed method uses a low-cost IR camera to accurately differentiate between the hand and the background. In addition, no complicated learning or template matching methodologies are used; instead, the correlation between the fingertips extracted through curvatures is utilized to recognize Click and Move gestures. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=recognition" title="recognition">recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=hand%20gestures" title=" hand gestures"> hand gestures</a>, <a href="https://publications.waset.org/abstracts/search?q=infrared%20camera" title=" infrared camera"> infrared camera</a>, <a href="https://publications.waset.org/abstracts/search?q=RGB%20cameras" title=" RGB cameras"> RGB cameras</a> </p> <a href="https://publications.waset.org/abstracts/13373/hand-gesture-recognition-interface-based-on-ir-camera" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/13373.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">406</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1696</span> Video Sharing System Based on Wi-Fi Camera</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Qidi%20Lin">Qidi Lin</a>, <a href="https://publications.waset.org/abstracts/search?q=Jinbin%20Huang"> Jinbin Huang</a>, <a href="https://publications.waset.org/abstracts/search?q=Weile%20Liang"> Weile Liang</a> </p> <p
class="card-text"><strong>Abstract:</strong></p> This paper introduces a video sharing platform based on Wi-Fi, which consists of a camera, a mobile phone, and a PC server. The platform receives the wireless signal from the camera and shows on the mobile phone the live video captured by the camera. In addition, it can send commands to the camera and rotate the camera&rsquo;s holder. The platform can be applied to interactive teaching, the monitoring of dangerous areas, and so on. Testing results show that the platform can share the live video with the mobile phone. Furthermore, if the system&rsquo;s PC server, the camera, and many mobile phones are connected together, it can transfer photos concurrently. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Wifi%20Camera" title="Wifi Camera">Wi-Fi camera</a>, <a href="https://publications.waset.org/abstracts/search?q=socket%20mobile" title=" socket mobile"> socket mobile</a>, <a href="https://publications.waset.org/abstracts/search?q=platform%20video%20monitoring" title=" platform video monitoring"> platform video monitoring</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20control" title=" remote control"> remote control</a> </p> <a href="https://publications.waset.org/abstracts/31912/video-sharing-system-based-on-wi-fi-camera" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/31912.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">337</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1695</span> Characterization of Thermal Images Due to Aging of H.V Glass Insulators Using Thermographic Scanning</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a
href="https://publications.waset.org/abstracts/search?q=Nasir%20A.%20Al-Geelani">Nasir A. Al-Geelani</a>, <a href="https://publications.waset.org/abstracts/search?q=Zulkurnain%20Abdul-Malek"> Zulkurnain Abdul-Malek</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Afendi%20M.%20Piah"> M. Afendi M. Piah</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This investigation was carried out in the laboratory on single units of transmission-line glass insulators characterized by different thermal images, with the aim of determining the age of the insulators. The tests were carried out on virgin and aged insulators using thermographic scanning. Samples with different aging periods of 20, 15, and 5 years were taken from a 132 kV transmission line and exhibited different degrees of corrosion. The second group of insulator samples consisted of relatively mildly aged insulators, the third group of lightly aged insulators, and the fourth group of brand-new insulators. The results revealed a strong correlation between the aging and the thermal images captured by the infrared camera. This technique can be used to monitor the aging of high-voltage insulators as a precaution to avoid disaster.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=glass%20insulator" title="glass insulator">glass insulator</a>, <a href="https://publications.waset.org/abstracts/search?q=infrared%20camera" title=" infrared camera"> infrared camera</a>, <a href="https://publications.waset.org/abstracts/search?q=corona%20diacharge" title=" corona discharge"> corona discharge</a>, <a href="https://publications.waset.org/abstracts/search?q=transmission%20lines" title=" transmission lines"> transmission lines</a>, <a href="https://publications.waset.org/abstracts/search?q=thermograpy" title=" thermography"> thermography</a>, <a href="https://publications.waset.org/abstracts/search?q=surface%20discharge" title=" surface discharge"> surface discharge</a> </p> <a href="https://publications.waset.org/abstracts/98959/characterization-of-thermal-images-due-to-aging-of-hv-glass-insulators-using-thermographic-scanning" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/98959.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">160</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1694</span> Multiplayer RC-car Driving System in a Collaborative Augmented Reality Environment</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Kikuo%20Asai">Kikuo Asai</a>, <a href="https://publications.waset.org/abstracts/search?q=Yuji%20Sugimoto"> Yuji Sugimoto</a> </p> <p class="card-text"><strong>Abstract:</strong></p> We developed a prototype system for multiplayer RC-car driving in a collaborative Augmented Reality (AR) environment.
The tele-existence environment is constructed by superimposing digital data onto images captured by a camera on an RC-car, enabling players to experience an augmented coexistence of the digital content and the real world. Marker-based tracking was used for estimating the position and orientation of the camera. Multiple RC-cars can be operated in a field where square markers are arranged. The video images captured by the camera are transmitted to a PC for visual tracking. The RC-cars are also tracked by an infrared camera attached to the ceiling, so that instability in the visual tracking is reduced. Multimedia data such as text and graphics are visualized and overlaid onto the video images in a geometrically correct manner. The prototype system allows a tele-existence sensation to be augmented in a collaborative AR environment. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=multiplayer" title="multiplayer">multiplayer</a>, <a href="https://publications.waset.org/abstracts/search?q=RC-car" title=" RC-car"> RC-car</a>, <a href="https://publications.waset.org/abstracts/search?q=collaborative%20environment" title=" collaborative environment"> collaborative environment</a>, <a href="https://publications.waset.org/abstracts/search?q=augmented%20reality" title=" augmented reality"> augmented reality</a> </p> <a href="https://publications.waset.org/abstracts/4359/multiplayer-rc-car-driving-system-in-a-collaborative-augmented-reality-environment" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/4359.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">289</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1693</span> Hyperspectral Band Selection for
Oil Spill Detection Using Deep Neural Network</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Asmau%20Mukhtar%20Ahmed">Asmau Mukhtar Ahmed</a>, <a href="https://publications.waset.org/abstracts/search?q=Olga%20Duran"> Olga Duran</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Hydrocarbon (HC) spills constitute a significant problem and a great concern for the environment. With the latest technology (hyperspectral imaging) and state-of-the-art techniques (image processing tools), hydrocarbon spills can easily be detected at an early stage to mitigate their effects. In this study, a controlled laboratory experiment was used: clay soil was mixed and homogenized with different hydrocarbon types (diesel, bio-diesel, and petrol). The mixtures were scanned with a HYSPEX hyperspectral camera under constant illumination to generate the hyperspectral datasets used for this experiment. So far, the short-wave infrared region (SWIR) has been exploited in detecting HC spills with excellent accuracy. However, the near-infrared region (NIR) is somewhat unexplored with regard to HC contamination and how it affects the spectrum of soils. In this study, a deep neural network (DNN) was applied to the controlled datasets to detect and quantify the amount of HC spilled in soils in the near-infrared region. The initial results are extremely encouraging because they indicate that the DNN was able to identify features of HC in the near-infrared region with a good level of accuracy.
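The per-pixel decision pipeline can be illustrated with a deliberately simplified stand-in: the DNN itself is beyond a short sketch, so a nearest-mean classifier over a few NIR band values takes its place here. All spectra and class names below are invented for illustration, not data from the study.

```python
def nearest_class(spectrum, class_means):
    """Assign a pixel spectrum to the class whose mean spectrum is closest
    in squared Euclidean distance (a crude stand-in for the trained DNN)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(class_means, key=lambda c: dist2(spectrum, class_means[c]))

# Hypothetical mean reflectance in four NIR bands per class.
class_means = {
    "clean_soil": [0.42, 0.45, 0.47, 0.48],
    "diesel":     [0.30, 0.28, 0.27, 0.25],
    "petrol":     [0.36, 0.34, 0.33, 0.31],
}
pixel = [0.31, 0.29, 0.27, 0.26]
print(nearest_class(pixel, class_means))  # -> diesel
```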
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=hydrocarbon" title="hydrocarbon">hydrocarbon</a>, <a href="https://publications.waset.org/abstracts/search?q=Deep%20Neural%20Network" title=" Deep Neural Network"> Deep Neural Network</a>, <a href="https://publications.waset.org/abstracts/search?q=short%20wave%20infrared%20region" title="short wave infrared region">short wave infrared region</a>, <a href="https://publications.waset.org/abstracts/search?q=near-infrared%20region" title=" near-infrared region"> near-infrared region</a>, <a href="https://publications.waset.org/abstracts/search?q=hyperspectral%20image" title=" hyperspectral image"> hyperspectral image</a> </p> <a href="https://publications.waset.org/abstracts/153072/hyperspectral-band-selection-for-oil-spill-detection-using-deep-neural-network" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/153072.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">113</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1692</span> Multi-Sensor Image Fusion for Visible and Infrared Thermal Images</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Amit%20Kumar%20Happy">Amit Kumar Happy</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper is motivated by the importance of multi-sensor image fusion with a specific focus on infrared (IR) and visual image (VI) fusion for various applications, including military reconnaissance. 
Image fusion can be defined as the process of combining two or more source images into a single composite image with extended information content that improves visual perception or feature extraction. These images can come from different modalities, such as a visible camera and an IR thermal imager. While visible images are captured from reflected radiation in the visible spectrum, thermal images are formed from thermal (infrared) radiation that may be reflected or self-emitted. A digital color camera captures the visible source image, and a thermal infrared camera acquires the thermal source image. In this paper, image fusion algorithms based upon the multi-scale transform (MST) and a region-based selection rule with consistency verification are proposed and presented. This research includes the implementation of the proposed image fusion algorithm in MATLAB, along with a comparative analysis to decide the optimum number of levels for the MST and the coefficient fusion rule. The results are presented, and several commonly used evaluation metrics are applied to assess the suggested method's validity. Experiments show that the proposed approach is capable of producing good fusion results. While deploying our image fusion approaches, we observed several challenges with popular image fusion methods: although their high computational cost and complex processing steps produce accurate fused results, they are hard to deploy in systems and applications that require real-time operation, high flexibility, and low computational capacity. The methods presented in this paper therefore offer good results with minimal time complexity. 
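As a minimal illustration of an MST-style fusion rule (a two-scale sketch, not the paper's MATLAB implementation; the region-based selection and consistency-verification steps are omitted), base layers can be averaged while detail coefficients are chosen by maximum absolute value:

```python
import numpy as np

def box_blur(img, size=9):
    """Separable box filter used as the low-pass of a two-scale decomposition."""
    k = np.ones(size) / size
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def fuse_two_scale(vis, ir, size=9):
    """Fuse base layers by averaging and detail layers by the max-absolute rule."""
    base_v, base_i = box_blur(vis, size), box_blur(ir, size)
    det_v, det_i = vis - base_v, ir - base_i
    base_f = 0.5 * (base_v + base_i)                                # base: mean rule
    det_f = np.where(np.abs(det_v) >= np.abs(det_i), det_v, det_i)  # detail: max-abs
    return base_f + det_f

# Toy sources: each modality sees a different bright feature
vis = np.zeros((48, 48)); vis[10:20, 10:20] = 1.0
ir = np.zeros((48, 48)); ir[30:40, 30:40] = 1.0
fused = fuse_two_scale(vis, ir)
```

The fused image retains the edge detail of both sources while the smooth background is averaged; a full MST pipeline would replace the box blur with a Laplacian or wavelet pyramid over several levels.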
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=image%20fusion" title="image fusion">image fusion</a>, <a href="https://publications.waset.org/abstracts/search?q=IR%20thermal%20imager" title=" IR thermal imager"> IR thermal imager</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-sensor" title=" multi-sensor"> multi-sensor</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-scale%20transform" title=" multi-scale transform"> multi-scale transform</a> </p> <a href="https://publications.waset.org/abstracts/138086/multi-sensor-image-fusion-for-visible-and-infrared-thermal-images" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/138086.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">115</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1691</span> Spatially Encoded Hyperspectral Compressive Microscope for Broadband VIS/NIR Imaging</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Luk%C3%A1%C5%A1%20Klein">Lukáš Klein</a>, <a href="https://publications.waset.org/abstracts/search?q=Karel%20%C5%BD%C3%ADdek"> Karel Žídek</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Hyperspectral imaging counts among the most frequently used multidimensional sensing methods. While there are many approaches to capturing a hyperspectral data cube, optical compression is emerging as a valuable tool to reduce the setup complexity and the amount of data storage needed. Hyperspectral compressive imagers have been created in the past; however, they have primarily focused on relatively narrow sections of the electromagnetic spectrum. 
A broader spectral study of samples can provide helpful information, especially for applications involving the harmonic generation and advanced material characterizations. We demonstrate a broadband hyperspectral microscope based on the single-pixel camera principle. Captured spatially encoded data are processed to reconstruct a hyperspectral cube in a combined visible and near-infrared spectrum (from 400 to 2500 nm). Hyperspectral cubes can be reconstructed with a spectral resolution of up to 3 nm and spatial resolution of up to 7 µm (subject to diffraction) with a high compressive ratio. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=compressive%20imaging" title="compressive imaging">compressive imaging</a>, <a href="https://publications.waset.org/abstracts/search?q=hyperspectral%20imaging" title=" hyperspectral imaging"> hyperspectral imaging</a>, <a href="https://publications.waset.org/abstracts/search?q=near-infrared%20spectrum" title=" near-infrared spectrum"> near-infrared spectrum</a>, <a href="https://publications.waset.org/abstracts/search?q=single-pixel%20camera" title=" single-pixel camera"> single-pixel camera</a>, <a href="https://publications.waset.org/abstracts/search?q=visible%20spectrum" title=" visible spectrum"> visible spectrum</a> </p> <a href="https://publications.waset.org/abstracts/155053/spatially-encoded-hyperspectral-compressive-microscope-for-broadband-visnir-imaging" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/155053.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">89</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1690</span> A Survey and Analysis on Inflammatory Pain Detection and Standard Protocol Selection 
Using Medical Infrared Thermography from Image Processing View Point</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mrinal%20Kanti%20Bhowmik">Mrinal Kanti Bhowmik</a>, <a href="https://publications.waset.org/abstracts/search?q=Shawli%20Bardhan%20Jr."> Shawli Bardhan Jr.</a>, <a href="https://publications.waset.org/abstracts/search?q=Debotosh%20Bhattacharjee"> Debotosh Bhattacharjee</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Human skin, having a temperature above absolute zero, emits infrared radiation related to body temperature. Differences in infrared radiation from the skin surface reflect abnormalities present in the human body. Accordingly, detecting and forecasting the temperature variation of the skin surface is the main objective of using Medical Infrared Thermography (MIT) as a diagnostic tool for pain detection. Medical Infrared Thermography (MIT) is a non-invasive imaging technique that records and monitors the temperature distribution of the body by receiving the infrared radiation emitted from the skin and representing it as a thermogram. The intensity of the thermogram measures inflammation at the skin surface related to pain in the human body. Analysis of thermograms provides automated anomaly detection associated with suspicious pain regions by following several image processing steps. The paper presents a rigorous survey of the processing and analysis of thermograms, based on previous works published in the area of infrared thermal imaging for detecting inflammatory pain diseases like arthritis, spondylosis, shoulder impingement, etc. The study also explores the performance analysis of thermogram processing, together with thermogram acquisition protocols, thermography camera specifications, and the types of pain detected by thermography, in a summarized tabular format. 
The tabular format provides a clear structural view of the past works. As its major contribution, the paper introduces a new thermogram acquisition standard for inflammatory pain detection in the human body to enhance performance. The FLIR T650sc infrared camera, with high sensitivity and resolution, is adopted to increase the accuracy of thermogram acquisition and analysis. The survey of previous research highlights that intensity-distribution-based comparison of symmetric regions of interest, together with their statistical analysis, yields adequate results in identifying and detecting physiological disorders related to inflammatory diseases. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=acquisition%20protocol" title="acquisition protocol">acquisition protocol</a>, <a href="https://publications.waset.org/abstracts/search?q=inflammatory%20pain%20detection" title=" inflammatory pain detection"> inflammatory pain detection</a>, <a href="https://publications.waset.org/abstracts/search?q=medical%20infrared%20thermography%20%28MIT%29" title=" medical infrared thermography (MIT)"> medical infrared thermography (MIT)</a>, <a href="https://publications.waset.org/abstracts/search?q=statistical%20analysis" title=" statistical analysis"> statistical analysis</a> </p> <a href="https://publications.waset.org/abstracts/22181/a-survey-and-analysis-on-inflammatory-pain-detection-and-standard-protocol-selection-using-medical-infrared-thermography-from-image-processing-view-point" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/22181.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">342</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge 
badge-info">1689</span> Automatic Diagnosis of Electrical Equipment Using Infrared Thermography </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Y.%20Laib%20Dit%20Leksir">Y. Laib Dit Leksir</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20Bouhouche"> S. Bouhouche </a> </p> <p class="card-text"><strong>Abstract:</strong></p> Analysis and processing of databases obtained from infrared thermal measurements made on electrical installations require the development of new tools in order to obtain correct information additional to that of visual inspections. Consequently, methods based on the capture of infrared digital images show great potential and are increasingly employed in various fields. However, there is an enormous need for effective techniques to analyse these databases in order to extract relevant information on the state of the equipment. Our goal is to introduce recent modeling techniques based on new methods of image and signal processing to develop mathematical models in this field. The aim of this work is to capture the anomalies existing in electrical equipment during the inspection of some machines using a FLIR A40 camera. We then use binarization techniques to select the region of interest, and we compare these binarization methods on the thermal images obtained to choose the best one. 
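A common binarization choice for such region-of-interest selection is Otsu's method; the sketch below (illustrative only, not necessarily the variant compared in the paper) implements it in NumPy on a toy thermogram with a simulated hot spot:

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Otsu's method: choose the threshold that maximizes between-class variance."""
    hist, edges = np.histogram(img, bins=nbins)
    p = hist / hist.sum()
    omega = np.cumsum(p)                      # probability of the 'cold' class
    mu = np.cumsum(p * np.arange(nbins))      # cumulative mean (in bin units)
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return edges[np.nanargmax(sigma_b) + 1]   # upper edge of the best split bin

# Toy thermogram (degrees C): a hot spot on a cooler background
rng = np.random.default_rng(1)
thermogram = rng.normal(25.0, 0.5, (64, 64))
thermogram[20:30, 20:30] += 15.0              # simulated overheating component
roi = thermogram > otsu_threshold(thermogram)
```

The resulting boolean mask isolates the overheated component, which can then be measured or tracked across inspection images.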
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=infrared%20thermography" title="infrared thermography">infrared thermography</a>, <a href="https://publications.waset.org/abstracts/search?q=defect%20detection" title=" defect detection"> defect detection</a>, <a href="https://publications.waset.org/abstracts/search?q=troubleshooting" title=" troubleshooting"> troubleshooting</a>, <a href="https://publications.waset.org/abstracts/search?q=electrical%20equipment" title=" electrical equipment "> electrical equipment </a> </p> <a href="https://publications.waset.org/abstracts/21224/automatic-diagnosis-of-electrical-equipment-using-infrared-thermography" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/21224.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">476</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1688</span> A Low-Cost Vision-Based Unmanned Aerial System for Extremely Low-Light GPS-Denied Navigation and Thermal Imaging</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Chang%20Liu">Chang Liu</a>, <a href="https://publications.waset.org/abstracts/search?q=John%20Nash"> John Nash</a>, <a href="https://publications.waset.org/abstracts/search?q=Stephen%20D.%20Prior"> Stephen D. Prior</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents the design and implementation details of a complete unmanned aerial system (UAS) based on commercial-off-the-shelf (COTS) components, focusing on safety, security, search and rescue scenarios in GPS-denied environments. 
In particular, the aerial platform is capable of semi-autonomously navigating through extremely low-light, GPS-denied indoor environments based on onboard sensors only, including a downward-facing optical flow camera. In addition, a low-cost payload camera system was developed to stream both infrared video and visible light video to a ground station in real time, for the purpose of detecting signs of life and hidden humans. The total cost of the complete system is estimated to be $1150, and the effectiveness of the system has been tested and validated in practical scenarios. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20system" title="unmanned aerial system">unmanned aerial system</a>, <a href="https://publications.waset.org/abstracts/search?q=commercial-off-the-shelf" title=" commercial-off-the-shelf"> commercial-off-the-shelf</a>, <a href="https://publications.waset.org/abstracts/search?q=extremely%20low-light" title=" extremely low-light"> extremely low-light</a>, <a href="https://publications.waset.org/abstracts/search?q=GPS-denied" title=" GPS-denied"> GPS-denied</a>, <a href="https://publications.waset.org/abstracts/search?q=optical%20flow" title=" optical flow"> optical flow</a>, <a href="https://publications.waset.org/abstracts/search?q=infrared%20video" title=" infrared video"> infrared video</a> </p> <a href="https://publications.waset.org/abstracts/37927/a-low-cost-vision-based-unmanned-aerial-system-for-extremely-low-light-gps-denied-navigation-and-thermal-imaging" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/37927.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">327</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge 
badge-info">1687</span> Concealed Objects Detection in Visible, Infrared and Terahertz Ranges</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=M.%20Kowalski">M. Kowalski</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Kastek"> M. Kastek</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Szustakowski"> M. Szustakowski</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Multispectral screening systems are becoming more popular because of their interesting properties and applications. One of the most significant applications of multispectral screening systems is the prevention of terrorist attacks. There are many kinds of threats and many methods of detection. Visual detection of objects hidden under a person's clothing is one of the most challenging problems of threat detection. There are various solutions to the problem; however, the most effective utilize multispectral surveillance imagers. The development of imaging devices and the exploration of new spectral bands is a chance to introduce new equipment for assuring public safety. We investigate the possibility of long-lasting detection of potentially dangerous objects covered with various types of clothing. 
In the article, we present the results of comparative studies of passive imaging in three spectral ranges: visible, infrared, and terahertz. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=terahertz" title="terahertz">terahertz</a>, <a href="https://publications.waset.org/abstracts/search?q=infrared" title=" infrared"> infrared</a>, <a href="https://publications.waset.org/abstracts/search?q=object%20detection" title=" object detection"> object detection</a>, <a href="https://publications.waset.org/abstracts/search?q=screening%20camera" title=" screening camera"> screening camera</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20processing" title=" image processing"> image processing</a> </p> <a href="https://publications.waset.org/abstracts/6914/concealed-objects-detection-in-visible-infrared-and-terahertz-ranges" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/6914.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">357</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1686</span> Improvement of Ground Truth Data for Eye Location on Infrared Driver Recordings</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sorin%20Valcan">Sorin Valcan</a>, <a href="https://publications.waset.org/abstracts/search?q=Mihail%20Gaianu"> Mihail Gaianu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Labeling is a very costly and time-consuming process which aims to generate datasets for training neural networks for several functionalities and projects. 
For driver monitoring system projects, the need for labeled images has a significant impact on the budget and the distribution of effort. This paper presents the modifications made to an algorithm used to generate ground truth data for 2D eye location on infrared images of drivers, in order to improve the quality of the data and the performance of the trained neural networks. The algorithm's restrictions become stricter, which makes it more accurate but also less consistent. The resulting dataset becomes smaller and shall not be altered by any kind of manual label adjustment before being used in the neural network training process. These changes resulted in a much better performance of the trained neural networks. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=labeling%20automation" title="labeling automation">labeling automation</a>, <a href="https://publications.waset.org/abstracts/search?q=infrared%20camera" title=" infrared camera"> infrared camera</a>, <a href="https://publications.waset.org/abstracts/search?q=driver%20monitoring" title=" driver monitoring"> driver monitoring</a>, <a href="https://publications.waset.org/abstracts/search?q=eye%20detection" title=" eye detection"> eye detection</a>, <a href="https://publications.waset.org/abstracts/search?q=convolutional%20neural%20networks" title=" convolutional neural networks"> convolutional neural networks</a> </p> <a href="https://publications.waset.org/abstracts/148969/improvement-of-ground-truth-data-for-eye-location-on-infrared-driver-recordings" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/148969.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">117</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span 
class="badge badge-info">1685</span> Advantages of Multispectral Imaging for Accurate Gas Temperature Profile Retrieval from Fire Combustion Reactions</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jean-Philippe%20Gagnon">Jean-Philippe Gagnon</a>, <a href="https://publications.waset.org/abstracts/search?q=Benjamin%20Saute"> Benjamin Saute</a>, <a href="https://publications.waset.org/abstracts/search?q=St%C3%A9phane%20Boubanga-Tombet"> Stéphane Boubanga-Tombet</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Infrared thermal imaging is used for a wide range of applications, especially in the combustion domain. However, it is well known that most combustion gases such as carbon dioxide (CO₂), water vapor (H₂O), and carbon monoxide (CO) selectively absorb/emit infrared radiation at discrete energies, i.e., over a very narrow spectral range. Therefore, temperature profiles of most combustion processes derived from conventional broadband imaging are inaccurate without prior knowledge or assumptions about the spectral emissivity properties of the combustion gases. Using spectral filters allows estimating these critical emissivity parameters in addition to providing selectivity regarding the chemical nature of the combustion gases. However, due to the turbulent nature of most flames, it is crucial that such information be obtained without sacrificing temporal resolution. For this reason, Telops has developed a time-resolved multispectral imaging system which combines a high-performance broadband camera synchronized with a rotating spectral filter wheel. In order to illustrate the benefits of using this system to characterize combustion experiments, measurements were carried out using a Telops MS-IR MW on a very simple combustion system: a wood fire. 
The temperature profiles calculated using the spectral information from the different channels were compared with corresponding temperature profiles obtained with conventional broadband imaging. The results illustrate the benefits of the Telops MS-IR cameras for the characterization of laminar and turbulent combustion systems at a high temporal resolution. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=infrared" title="infrared">infrared</a>, <a href="https://publications.waset.org/abstracts/search?q=multispectral" title=" multispectral"> multispectral</a>, <a href="https://publications.waset.org/abstracts/search?q=fire" title=" fire"> fire</a>, <a href="https://publications.waset.org/abstracts/search?q=broadband" title=" broadband"> broadband</a>, <a href="https://publications.waset.org/abstracts/search?q=gas%20temperature" title=" gas temperature"> gas temperature</a>, <a href="https://publications.waset.org/abstracts/search?q=IR%20camera" title=" IR camera"> IR camera</a> </p> <a href="https://publications.waset.org/abstracts/146725/advantages-of-multispectral-imaging-for-accurate-gas-temperature-profile-retrieval-from-fire-combustion-reactions" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/146725.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">143</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1684</span> Infrared Thermography Applications for Building Investigation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hamid%20Yazdani">Hamid Yazdani</a>, <a href="https://publications.waset.org/abstracts/search?q=Raheleh%20Akbar"> Raheleh Akbar</a> </p> <p 
class="card-text"><strong>Abstract:</strong></p> Infrared thermography is a modern non-destructive measuring method for the examination of redeveloped and non-renovated buildings. Infrared cameras provide a means of temperature measurement in building constructions from the inside as well as from the outside; thus, thermal bridges can be detected. It has been shown that infrared thermography is applicable for insulation inspection, identifying air leakage and sources of heat loss, finding the exact position of heating tubes, and discovering the reasons why mold or moisture is growing in a particular area; it is also used in the conservation field to detect hidden characteristics and degradations of building structures. The paper gives a brief description of the theoretical background of infrared thermography. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=infrared%20thermography" title="infrared thermography">infrared thermography</a>, <a href="https://publications.waset.org/abstracts/search?q=examination%20of%20buildings" title=" examination of buildings"> examination of buildings</a>, <a href="https://publications.waset.org/abstracts/search?q=emissivity" title=" emissivity"> emissivity</a>, <a href="https://publications.waset.org/abstracts/search?q=heat%20losses%20sources" title=" heat losses sources"> heat losses sources</a> </p> <a href="https://publications.waset.org/abstracts/15901/infrared-thermography-applications-for-building-investigation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/15901.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">520</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1683</span> Evaluation of Heterogeneity of Paint 
Coating on Metal Substrate Using Laser Infrared Thermography and Eddy Current</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=S.%20Mezghani">S. Mezghani</a>, <a href="https://publications.waset.org/abstracts/search?q=E.%20Perrin"> E. Perrin</a>, <a href="https://publications.waset.org/abstracts/search?q=J.%20L.%20Bodnar"> J. L. Bodnar</a>, <a href="https://publications.waset.org/abstracts/search?q=J.%20Marthe"> J. Marthe</a>, <a href="https://publications.waset.org/abstracts/search?q=B.%20Cauwe"> B. Cauwe</a>, <a href="https://publications.waset.org/abstracts/search?q=V.%20Vrabie"> V. Vrabie</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Non-contact evaluation of the thickness of paint coatings can be attempted by different destructive and nondestructive methods, such as cross-section microscopy, gravimetric mass measurement, magnetic gauges, Eddy current, ultrasound, or terahertz. Infrared thermography is a nondestructive and non-invasive method that can be envisaged as a useful tool to measure surface thickness variations by analyzing the temperature response. In this paper, the thermal quadrupole method for two-layered samples heated by a pulsed excitation is first used. By analyzing the thermal responses as a function of the thermal properties and thicknesses of both layers, optimal parameters for the excitation source can be identified. Simulations show that a pulsed excitation with a duration of ten milliseconds allows obtaining a substrate-independent thermal response. Based on this result, an experimental setup consisting of a near-infrared laser diode and an infrared camera was then used to evaluate the variation of paint coating thickness between 60 µm and 130 µm on two samples. Results show that the parameters extracted from the thermal images are correlated with the thicknesses estimated by the Eddy current method. 
Pulsed laser thermography is thus an interesting alternative nondestructive method that can, moreover, be used for non-conductive substrates. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=non%20destructive" title="non destructive">non destructive</a>, <a href="https://publications.waset.org/abstracts/search?q=paint%20coating" title=" paint coating"> paint coating</a>, <a href="https://publications.waset.org/abstracts/search?q=thickness" title=" thickness"> thickness</a>, <a href="https://publications.waset.org/abstracts/search?q=infrared%20thermography" title=" infrared thermography"> infrared thermography</a>, <a href="https://publications.waset.org/abstracts/search?q=laser" title=" laser"> laser</a>, <a href="https://publications.waset.org/abstracts/search?q=heterogeneity" title=" heterogeneity"> heterogeneity</a> </p> <a href="https://publications.waset.org/abstracts/20665/evaluation-of-heterogeneity-of-paint-coating-on-metal-substrate-using-laser-infrared-thermography-and-eddy-current" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/20665.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">639</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1682</span> Real-Time Web Map Service Based on Solar-Powered Unmanned Aerial Vehicle</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sunghun%20Jung">Sunghun Jung</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The existing web map service providers contract with satellite operators to update their maps by paying an astronomical amount of money, but the cost could be minimized by operating a cheap and 
small UAV. In contrast to satellites, UAVs only require replacing aged battery packs from time to time. Utilizing both a regular camera and an infrared camera mounted on a small, solar-powered, long-endurance, and hoverable UAV, daytime ground surface photographs and nighttime infrared photographs will be continuously and repeatedly uploaded to the web map server and overlaid on the existing ground surface photographs in real time. The real-time web map service using a small, solar-powered, long-endurance, and hoverable UAV can also be applied to surveillance missions, in particular, to detect border area intruders. An improved real-time image stitching algorithm is developed for overlapping the graphic map data. Also, a small home server will be developed to manage the huge volume of incoming map data. Map photographs taken from tens or hundreds of meters of altitude by a UAV would improve the map graphic resolution compared to photographs taken from hundreds of kilometers by satellites, whose photographs are moreover limited by weather conditions. 
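The paper's improved stitching algorithm is not reproduced here, but a basic building block of map stitching, estimating the translation between two overlapping frames, can be sketched with phase correlation (an illustrative sketch, not the authors' method):

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the integer (dy, dx) translation taking frame a onto frame b."""
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = np.conj(Fa) * Fb
    cross /= np.abs(cross) + 1e-12            # normalized cross-power spectrum
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > a.shape[0] // 2:                  # map wrap-around peaks to negative shifts
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)

# Simulated overlapping frames: the second is a cyclic shift of the first
rng = np.random.default_rng(2)
frame_a = rng.random((64, 64))
frame_b = np.roll(frame_a, (7, -3), axis=(0, 1))
shift = phase_correlation_shift(frame_a, frame_b)
```

A production stitcher would add feature matching, rotation/scale handling, and blending; phase correlation alone only recovers pure translation between frames.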
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=long-endurance" title="long-endurance">long-endurance</a>, <a href="https://publications.waset.org/abstracts/search?q=real-time%20web%20map%20service%20%28RWMS%29" title=" real-time web map service (RWMS)"> real-time web map service (RWMS)</a>, <a href="https://publications.waset.org/abstracts/search?q=solar-powered" title=" solar-powered"> solar-powered</a>, <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20vehicle%20%28UAV%29" title=" unmanned aerial vehicle (UAV)"> unmanned aerial vehicle (UAV)</a> </p> <a href="https://publications.waset.org/abstracts/80443/real-time-web-map-service-based-on-solar-powered-unmanned-aerial-vehicle" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/80443.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">274</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1681</span> Experimental Investigation of the Out-of-Plane Dynamic Behavior of Adhesively Bonded Composite Joints at High Strain Rates</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sonia%20Sassi">Sonia Sassi</a>, <a href="https://publications.waset.org/abstracts/search?q=Mostapha%20Tarfaoui"> Mostapha Tarfaoui</a>, <a href="https://publications.waset.org/abstracts/search?q=Hamza%20Ben%20Yahia"> Hamza Ben Yahia</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this investigation, we present an experimental technique in which the dynamic response, damage kinetics, and heat dissipation of adhesively bonded joint materials are measured simultaneously at high strain rates. 
The material used in this study is widely used in the design of structures for military applications. It was composed of a 45° bi-axial fiber-glass mat of 0.286 mm thickness in a polyester resin matrix. For the adhesive bonding, a NORPOL polyvinylester layer of 1 mm thickness was used to assemble the composite substrates. The experimental setup consists of a compression split Hopkinson pressure bar (SHPB), a high-speed infrared camera, and a high-speed Fastcam rapid camera. For the dynamic compression tests, 13 mm x 13 mm x 9 mm out-of-plane samples were tested at strain rates from 372 to 1030 s-1. The specimen surface is controlled and monitored in situ and in real time using the high-speed camera, which captures the damage progression in the specimens, and the infrared camera, which provides thermal images in time sequence. Preliminary compressive stress-strain data obtained at different strain rates show that the dynamic material strength increases with increasing strain rate. Damage investigations have revealed that failure mainly occurred at the adhesive/adherend interface because of the brittle nature of the polymeric adhesive. Results have shown the dependency of the dynamic parameters on strain rate. A significant temperature rise was observed in the dynamic compression tests. Experimental results show that the temperature change depends on the strain rate and the damage mode, and its maximum exceeds 100 °C. The dependence of these results on strain rate indicates a strong correlation between damage rate sensitivity and heat dissipation, which might be useful when developing damage models under dynamic loading that take into account the energy balance of adhesively bonded joints. 
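The stress, strain-rate, and strain histories in a compression SHPB test are conventionally reduced from the bar strain signals with the classical Kolsky (one-wave) equations. The sketch below illustrates that standard reduction; the bar constants are illustrative placeholders, not the authors' apparatus.

```python
import numpy as np

# Classical one-wave SHPB (Kolsky) reduction. The bar properties below are
# illustrative placeholders, not the setup used in the abstract.
E_BAR = 200e9        # bar Young's modulus [Pa]
C0 = 5000.0          # elastic wave speed in the bar [m/s]
A_RATIO = 2.0        # bar/specimen cross-section area ratio (assumed)
L_SPEC = 9e-3        # specimen gauge length [m] (the 9 mm out-of-plane dimension)

def shpb_reduce(eps_r, eps_t, dt):
    """Return (stress, strain_rate, strain) histories of the specimen from
    the reflected (eps_r) and transmitted (eps_t) bar strain signals."""
    eps_r = np.asarray(eps_r, dtype=float)
    eps_t = np.asarray(eps_t, dtype=float)
    stress = E_BAR * A_RATIO * eps_t            # one-wave specimen stress
    strain_rate = -2.0 * C0 / L_SPEC * eps_r    # specimen strain rate [1/s]
    strain = np.cumsum(strain_rate) * dt        # time-integrated strain
    return stress, strain_rate, strain
```

With a constant reflected strain of -1e-3, this gives a strain rate of about 1.1e3 s-1, the same order as the 372-1030 s-1 range reported.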
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=adhesive%20bonded%20joints" title="adhesive bonded joints">adhesive bonded joints</a>, <a href="https://publications.waset.org/abstracts/search?q=Hopkinson%20bars" title=" Hopkinson bars"> Hopkinson bars</a>, <a href="https://publications.waset.org/abstracts/search?q=out-of-plane%20tests" title=" out-of-plane tests"> out-of-plane tests</a>, <a href="https://publications.waset.org/abstracts/search?q=dynamic%20compression%20properties" title=" dynamic compression properties"> dynamic compression properties</a>, <a href="https://publications.waset.org/abstracts/search?q=damage%20mechanisms" title=" damage mechanisms"> damage mechanisms</a>, <a href="https://publications.waset.org/abstracts/search?q=heat%20dissipation" title=" heat dissipation"> heat dissipation</a> </p> <a href="https://publications.waset.org/abstracts/90187/experimental-investigation-of-the-out-of-plane-dynamic-behavior-of-adhesively-bonded-composite-joints-at-high-strain-rates" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/90187.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">212</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1680</span> Camera Trapping Coupled With Field Sign Survey Reveal the Mammalian Diversity and Abundance at Murree-Kotli Sattian-Kahuta National Park, Pakistan</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Shehnila%20Kanwal">Shehnila Kanwal</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Murree-Kotli Sattian-Kahuta National Park (MKKNP) was declared in 2009. 
However, not much is known about the diversity and relative abundance of the mammalian fauna of this park. In the current study, we used a field sign survey and infrared camera trapping to gain insight into the diversity of mammal species and their relative abundance. We conducted field surveys in different areas of the park, at various elevations, from April 2023 to March 2024 to record the field signs (scats, pug marks, etc.) of mammal species; in addition, we deployed a total of 22 infrared trail camera traps in different areas of the park for 116 nights. We obtained a total of 5201 photographs from camera trapping. The camera trapping results, coupled with the field sign surveys, confirmed the presence of a total of twenty-one different mammalian species (large, meso, and small mammals) in the study area. The common leopard was recorded at four different sites in the park, with an altitudinal range between 648 m and 1533 m. The Asiatic jackal and the red fox were recorded at all surveyed sites in the park, with altitudinal ranges between 498 m-1287 m and 433 m-2049 m, respectively. Leopard cats were recorded at two different sites within an altitudinal range between 498 m and 894 m. The jungle cat was recorded at three sites within an altitudinal range between 498 m and 846 m. Asian palm civets and small Indian civets were both recorded at three sites. The grey mongoose and small Indian mongoose were recorded at four and three sites, respectively. We also collected a total of 75 scats of different mammal species in the park to further confirm their occurrence. For the Indian pangolin, we recorded three field burrows at two different sites. A diversity index (H’=2.369960) and species evenness (E=0.81995) were calculated. Analysis of the data revealed that the wild boar (Sus scrofa) was the most abundant species in the park. Most of the mammal species were found to be nocturnal, remaining active from dusk throughout the night, and some also remain active at dawn. 
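Diversity figures like the H' and E values reported above are standard community-ecology indices: the Shannon index H' = -&Sigma; p&#7522; ln p&#7522; over species proportions, and Pielou's evenness E = H'/ln S for S observed species. A minimal computation from species counts (a standard formulation, not the authors' script):

```python
import math

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over species proportions."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def pielou_evenness(counts):
    """Pielou's evenness E = H' / ln(S), where S is the number of
    species actually observed (nonzero counts)."""
    s = sum(1 for c in counts if c > 0)
    return shannon_diversity(counts) / math.log(s)
```

Perfectly even communities give E = 1; communities dominated by one species (like the wild boar here) push E toward 0.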
The common leopard and the Asian palm civet showed strongly overlapping temporal activity in the study area; their activity patterns overlapped by 61%. The barking deer and the Indian crested porcupine were also found to be nocturnal, remaining active throughout the night. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=MKKNP" title="MKKNP">MKKNP</a>, <a href="https://publications.waset.org/abstracts/search?q=diversity" title=" diversity"> diversity</a>, <a href="https://publications.waset.org/abstracts/search?q=abundance" title=" abundance"> abundance</a>, <a href="https://publications.waset.org/abstracts/search?q=evenness" title=" evenness"> evenness</a>, <a href="https://publications.waset.org/abstracts/search?q=distribution" title=" distribution"> distribution</a>, <a href="https://publications.waset.org/abstracts/search?q=mammals" title=" mammals"> mammals</a>, <a href="https://publications.waset.org/abstracts/search?q=overlapped" title=" overlapped"> overlapped</a> </p> <a href="https://publications.waset.org/abstracts/191725/camera-trapping-coupled-with-field-sign-survey-reveal-the-mammalian-diversity-and-abundance-at-murree-kotli-sattian-kahuta-national-park-pakistan" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/191725.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">19</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1679</span> Infrared Thermography as an Informative Tool in Energy Audit and Software Modelling of Historic Buildings: A Case Study of the Sheffield Cathedral</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ademuyiwa%20Agbonyin">Ademuyiwa Agbonyin</a>, <a 
href="https://publications.waset.org/abstracts/search?q=Stamatis%20Zoras"> Stamatis Zoras</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohammad%20Zandi"> Mohammad Zandi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper investigates the extent to which building energy modelling can be informed by preliminary information provided by infrared thermography, using a thermal imaging camera in a walkthrough audit. The case-study building is the Sheffield Cathedral, built in the early 1400s. Based on a qualitative report generated from the thermal images taken at the site, the regions showing significant heat loss are input into a computer model of the cathedral within the Integrated Environmental Solutions (IES) Virtual Environment software, which performs an energy simulation to determine quantitative heat losses through the building envelope. Building data such as material thermal properties and building plans are provided by the architects, Thomas Ford and Partners Ltd. The results of the modelling revealed the portions of the building with the highest heat loss, and these aligned with those suggested by the thermal camera. Retrofit options for the building are also considered; however, they may not be implemented, owing to a desire to conserve the architectural heritage of the building. Results show that thermal imaging in a walk-through audit serves as a useful guide for the energy modelling process. Hand calculations were also performed as a 'control' to estimate losses, providing a second set of data points for comparison. 
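The 'hand calculation' cross-check mentioned above is typically a steady-state fabric-loss sum, Q = U &middot; A &middot; &Delta;T per envelope element. The sketch below is purely illustrative: the element names, U-values, and areas are invented placeholders, not the cathedral's surveyed values.

```python
# Steady-state fabric heat loss, Q = U * A * dT, the usual hand-calculation
# cross-check for an envelope model. All values below are illustrative.
ELEMENTS = [
    # (name, U-value [W/m^2K], area [m^2]) -- placeholder data
    ("solid stone wall", 2.1, 850.0),
    ("single-glazed window", 4.8, 120.0),
    ("roof", 2.3, 600.0),
]

def fabric_loss_w(elements, delta_t):
    """Total conductive loss through the envelope [W] for a temperature
    difference delta_t (inside minus outside, in K)."""
    return sum(u * a * delta_t for _, u, a in elements)
```

Summing per-element losses this way gives the second set of data points against which a dynamic simulation's envelope losses can be sanity-checked.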
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=historic%20buildings" title="historic buildings">historic buildings</a>, <a href="https://publications.waset.org/abstracts/search?q=energy%20retrofit" title=" energy retrofit"> energy retrofit</a>, <a href="https://publications.waset.org/abstracts/search?q=thermal%20comfort" title=" thermal comfort"> thermal comfort</a>, <a href="https://publications.waset.org/abstracts/search?q=software%20modelling" title=" software modelling"> software modelling</a>, <a href="https://publications.waset.org/abstracts/search?q=energy%20modelling" title=" energy modelling"> energy modelling</a> </p> <a href="https://publications.waset.org/abstracts/103567/infrared-thermography-as-an-informative-tool-in-energy-audit-and-software-modelling-of-historic-buildings-a-case-study-of-the-sheffield-cathedral" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/103567.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">170</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1678</span> Using Infrared Thermography, Photogrammetry and a Remotely Piloted Aircraft System to Create 3D Thermal Models</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=C.%20C.%20Kruger">C. C. Kruger</a>, <a href="https://publications.waset.org/abstracts/search?q=P.%20Van%20Tonder"> P. Van Tonder</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Concrete deteriorates over time, and the deterioration can be accelerated by multiple factors. When deterioration occurs beneath the concrete&rsquo;s surface, it can go undetected, even more so when it is located at a high elevation. 
Establishing the severity of such defects can prove difficult, so the need for efficient, safe, and economical methods of finding them becomes ever more important. Current methods that use thermography to find defects require equipment such as scaffolding to reach higher elevations, which can become time-consuming and costly, and the risks involved when personnel scaffold or abseil at such heights are high. Accordingly, combining a thermal camera with a remotely piloted aircraft system could provide a better diagnostic method. The data could then be constructed into a 3D thermal model for easy representation of the results. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=concrete" title="concrete">concrete</a>, <a href="https://publications.waset.org/abstracts/search?q=infrared%20thermography" title=" infrared thermography"> infrared thermography</a>, <a href="https://publications.waset.org/abstracts/search?q=3D%20thermal%20models" title="3D thermal models">3D thermal models</a>, <a href="https://publications.waset.org/abstracts/search?q=diagnostic" title=" diagnostic"> diagnostic</a> </p> <a href="https://publications.waset.org/abstracts/142250/using-infrared-thermography-photogrammetry-and-a-remotely-piloted-aircraft-system-to-create-3d-thermal-models" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/142250.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">173</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1677</span> Image Features Comparison-Based Position Estimation Method Using a Camera Sensor</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a 
href="https://publications.waset.org/abstracts/search?q=Jinseon%20Song">Jinseon Song</a>, <a href="https://publications.waset.org/abstracts/search?q=Yongwan%20Park"> Yongwan Park</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, we propose a method that estimates a user&rsquo;s position based on an image database built from a single camera. Previous positioning systems calculate distance from signal arrival times, as in GPS (Global Positioning System) and RF (Radio Frequency) approaches. However, these methods have a weakness: signal interference gives them a large error range. Our method instead estimates position with a camera sensor. A single camera, however, makes it difficult to obtain relative position data directly, and a stereo camera struggles to provide real-time position data because of the large amount of image data involved. First, we build an image database of the space in which the positioning service is to be provided, using a single camera. Next, we judge similarity by matching the image transmitted by the user against the database images. Finally, we take the user&rsquo;s position to be that of the most similar database image. To verify the proposed method, we experimented in real indoor and outdoor environments. The proposed method has a wide positioning range and can determine not only the user&rsquo;s position but also their direction. 
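The database-matching step described above can be sketched with a Lowe-style ratio test over feature descriptors. The abstract's keywords name SURF descriptors; the toy code below operates on plain NumPy descriptor arrays so it is library-agnostic, and the names `match_count` and `locate` are illustrative assumptions, not the authors' API.

```python
import numpy as np

def match_count(desc_q, desc_db, ratio=0.75):
    """Count query descriptors whose best database match is clearly better
    than the second best (Lowe's ratio test). Rows are descriptors."""
    d = np.linalg.norm(desc_q[:, None, :] - desc_db[None, :, :], axis=2)
    ranked = np.sort(d, axis=1)  # per query: best, second best, ...
    return int(np.sum(ranked[:, 0] < ratio * ranked[:, 1]))

def locate(desc_q, database):
    """Return the position tag of the database image with the most matches.
    `database` maps position tags to descriptor arrays."""
    return max(database, key=lambda pos: match_count(desc_q, database[pos]))
```

In a real system the descriptors would come from a detector such as SURF or ORB, and a vocabulary tree or approximate nearest-neighbour index would replace the brute-force distance matrix.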
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=positioning" title="positioning">positioning</a>, <a href="https://publications.waset.org/abstracts/search?q=distance" title=" distance"> distance</a>, <a href="https://publications.waset.org/abstracts/search?q=camera" title=" camera"> camera</a>, <a href="https://publications.waset.org/abstracts/search?q=features" title=" features"> features</a>, <a href="https://publications.waset.org/abstracts/search?q=SURF%28Speed-Up%20Robust%20Features%29" title=" SURF(Speed-Up Robust Features)"> SURF(Speed-Up Robust Features)</a>, <a href="https://publications.waset.org/abstracts/search?q=database" title=" database"> database</a>, <a href="https://publications.waset.org/abstracts/search?q=estimation" title=" estimation"> estimation</a> </p> <a href="https://publications.waset.org/abstracts/11844/image-features-comparison-based-position-estimation-method-using-a-camera-sensor" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/11844.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">349</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1676</span> Real Time Detection, Prediction and Reconstitution of Rain Drops</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=R.%20Burahee">R. Burahee</a>, <a href="https://publications.waset.org/abstracts/search?q=B.%20Chassinat"> B. Chassinat</a>, <a href="https://publications.waset.org/abstracts/search?q=T.%20de%20Laclos"> T. de Laclos</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20D%C3%A9p%C3%A9e"> A. 
Dépée</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Sastim"> A. Sastim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The purpose of this paper is to propose a solution for detecting, predicting, and reconstituting rain drops in real time, at night, using embedded hardware with an infrared camera. To keep the hardware requirements modest, simple models are used in an efficient OpenCV image-processing algorithm that considerably reduces computation time. Drops are matched across two consecutive pictures to implement a tracking system; each drop&rsquo;s computed trajectory then provides information for predicting its future location, which allows the processing load to be reduced further. The hardware system, built around a Raspberry Pi, is optimized to run this code efficiently in real time. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=reconstitution" title="reconstitution">reconstitution</a>, <a href="https://publications.waset.org/abstracts/search?q=prediction" title=" prediction"> prediction</a>, <a href="https://publications.waset.org/abstracts/search?q=detection" title=" detection"> detection</a>, <a href="https://publications.waset.org/abstracts/search?q=rain%20drop" title=" rain drop"> rain drop</a>, <a href="https://publications.waset.org/abstracts/search?q=real%20time" title=" real time"> real time</a>, <a href="https://publications.waset.org/abstracts/search?q=raspberry" title=" raspberry"> raspberry</a>, <a href="https://publications.waset.org/abstracts/search?q=infrared" title=" infrared"> infrared</a> </p> <a href="https://publications.waset.org/abstracts/12821/real-time-detection-prediction-and-reconstitution-of-rain-drops" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/12821.pdf" 
target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">419</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1675</span> Temperature-Based Detection of Initial Yielding Point in Loading of Tensile Specimens Made of Structural Steel</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Aqsa%20Jamil">Aqsa Jamil</a>, <a href="https://publications.waset.org/abstracts/search?q=Tamura%20Hiroshi"> Tamura Hiroshi</a>, <a href="https://publications.waset.org/abstracts/search?q=Katsuchi%20Hiroshi"> Katsuchi Hiroshi</a>, <a href="https://publications.waset.org/abstracts/search?q=Wang%20Jiaqi"> Wang Jiaqi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The yield point represents the upper limit of forces which can be applied to a specimen without causing any permanent deformation. After yielding, the behavior of the specimen suddenly changes, including the possibility of cracking or buckling, so the accumulation of damage and the type of fracture change depending on this condition. Because it is difficult to accurately detect the yield points of the several stress concentration points in structural steel specimens, an effort has been made in this research to develop a convenient technique using thermography (temperature-based detection) during tensile tests for the precise detection of yield point initiation. To verify the applicability of the thermography camera, tests were conducted under different loading conditions, with deformation measured by strain gauges installed at various points and the surface temperature monitored with the thermography camera. 
The yield point of the specimens was estimated with the help of a temperature dip, which occurs due to the thermoelastic effect at the onset of plastic deformation. The scatter of the data was checked by performing a repeatability analysis. The effects of ambient temperature variation and of light sources were checked by carrying out tests in the daytime as well as at midnight; from the signal-to-noise ratio (SNR) calculated for the noisy data from the infrared thermography camera, it can be concluded that the camera is independent of the testing time and of the presence of a visible light source. Furthermore, a fully coupled thermal-stress analysis has been performed using the exact implementation technique in Abaqus/Standard to validate the temperature profiles obtained from the thermography camera and to check the feasibility of numerical simulation for predicting the results extracted with the thermographic technique. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=signal%20to%20noise%20ratio" title="signal to noise ratio">signal to noise ratio</a>, <a href="https://publications.waset.org/abstracts/search?q=thermoelastic%20effect" title=" thermoelastic effect"> thermoelastic effect</a>, <a href="https://publications.waset.org/abstracts/search?q=thermography" title=" thermography"> thermography</a>, <a href="https://publications.waset.org/abstracts/search?q=yield%20point" title=" yield point"> yield point</a> </p> <a href="https://publications.waset.org/abstracts/151454/temperature-based-detection-of-initial-yielding-point-in-loading-of-tensile-specimens-made-of-structural-steel" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/151454.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">107</span> </span> </div> </div> <div class="card paper-listing 
mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1674</span> Subpixel Corner Detection for Monocular Camera Linear Model Research</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Guorong%20Sui">Guorong Sui</a>, <a href="https://publications.waset.org/abstracts/search?q=Xingwei%20Jia"> Xingwei Jia</a>, <a href="https://publications.waset.org/abstracts/search?q=Fei%20Tong"> Fei Tong</a>, <a href="https://publications.waset.org/abstracts/search?q=Xiumin%20Gao"> Xiumin Gao</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Camera calibration is a fundamental issue in high-precision noncontact measurement, and it is necessary to analyze and study the reliability and range of application of the linear model that is often used in camera calibration. According to the imaging features of monocular cameras, a camera model based on image pixel coordinates and three-dimensional space coordinates is built. Using our own customized template, the image pixel coordinates are obtained by the subpixel corner detection method. Without considering the aberration of the optical system, feature extraction and linearity analysis of the line segments in the template are performed. Moreover, the experiment is repeated 11 times while varying the measuring distance. Finally, the linearity of the camera is characterized by fitting the 11 groups of data. The camera model measurement results show that the relative error does not exceed 1% and the repeated measurement error is no more than 0.1 mm in magnitude. Meanwhile, it is found that the model shows some measurement differences across regions and object distances. The experimental results show that this linear model is simple and practical and has good linearity within a certain object distance. 
These experimental results provide a solid basis for establishing the linear camera model, and this work will be of practical value for actual engineering measurement. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=camera%20linear%20model" title="camera linear model">camera linear model</a>, <a href="https://publications.waset.org/abstracts/search?q=geometric%20imaging%20relationship" title=" geometric imaging relationship"> geometric imaging relationship</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20pixel%20coordinates" title=" image pixel coordinates"> image pixel coordinates</a>, <a href="https://publications.waset.org/abstracts/search?q=three%20dimensional%20space%20coordinates" title=" three dimensional space coordinates"> three dimensional space coordinates</a>, <a href="https://publications.waset.org/abstracts/search?q=sub-pixel%20corner%20detection" title=" sub-pixel corner detection"> sub-pixel corner detection</a> </p> <a href="https://publications.waset.org/abstracts/77747/subpixel-corner-detection-for-monocular-camera-linear-model-research" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/77747.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">277</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1673</span> X-Corner Detection for Camera Calibration Using Saddle Points</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Abdulrahman%20S.%20Alturki">Abdulrahman S. Alturki</a>, <a href="https://publications.waset.org/abstracts/search?q=John%20S.%20Loomis"> John S. 
Loomis</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper discusses a corner detection algorithm for camera calibration. Calibration is a necessary step in many computer vision and image processing applications. Robust corner detection for an image of a checkerboard is required to determine intrinsic and extrinsic parameters. In this paper, an algorithm for fully automatic and robust X-corner detection is presented. Checkerboard corner points are automatically found in each image without user interaction or any prior information regarding the number of rows or columns. The approach represents each X-corner with a quadratic fitting function. Using the fact that the X-corners are saddle points, the coefficients in the fitting function are used to identify each corner location. The automation of this process greatly simplifies calibration. Our method is robust against noise and different camera orientations. Experimental analysis shows the accuracy of our method using actual images acquired at different camera locations and orientations. 
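The quadratic-fit saddle test described above can be sketched as follows: fit z = a + bx + cy + dx&sup2; + exy + fy&sup2; to an intensity patch, declare an X-corner where the Hessian determinant is negative (a saddle point), and solve the zero-gradient system for the subpixel offset. This is a minimal reconstruction of the stated approach, not the authors' code.

```python
import numpy as np

def saddle_subpixel(patch):
    """Fit z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2 to an intensity patch
    (coordinates centred on the patch). If the stationary point is a saddle,
    return its subpixel (x, y) offset from the patch centre; else None."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    x = (xs - (w - 1) / 2.0).ravel()
    y = (ys - (h - 1) / 2.0).ravel()
    A = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    a, b, c, d, e, f = np.linalg.lstsq(A, patch.ravel().astype(float), rcond=None)[0]
    hess = np.array([[2 * d, e], [e, 2 * f]])   # Hessian of the fitted surface
    if np.linalg.det(hess) >= 0:                # not a saddle point
        return None
    # stationary point: hess @ [dx, dy] = -[b, c]
    return np.linalg.solve(hess, -np.array([b, c]))
```

Because the saddle condition is invariant to affine intensity changes, this test is robust to the lighting and orientation variations the abstract reports.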
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=camera%20calibration" title="camera calibration">camera calibration</a>, <a href="https://publications.waset.org/abstracts/search?q=corner%20detector" title=" corner detector"> corner detector</a>, <a href="https://publications.waset.org/abstracts/search?q=edge%20detector" title=" edge detector"> edge detector</a>, <a href="https://publications.waset.org/abstracts/search?q=saddle%20points" title=" saddle points"> saddle points</a> </p> <a href="https://publications.waset.org/abstracts/40538/x-corner-detection-for-camera-calibration-using-saddle-points" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/40538.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">406</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">‹</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=infrared%20camera&page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=infrared%20camera&page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=infrared%20camera&page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=infrared%20camera&page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=infrared%20camera&page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=infrared%20camera&page=7">7</a></li> <li class="page-item"><a class="page-link" 
href="https://publications.waset.org/abstracts/search?q=infrared%20camera&page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=infrared%20camera&page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=infrared%20camera&page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=infrared%20camera&page=56">56</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=infrared%20camera&page=57">57</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=infrared%20camera&page=2" rel="next">›</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul 
class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">© 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button 
type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">×</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>